Power Automate is an easy and powerful tool within the Microsoft 365 suite for creating Flows that automate tasks, and for citizen developers working directly in their production tenant, the process is very straightforward. However, if you are a traditional developer who wants some level of software development lifecycle support, where you develop in one tenant and deploy to another, things can get messy quickly. Microsoft has a good walkthrough on exporting and importing flows across environments, so I will not cover those steps. Instead, I want to share some techniques we have come up with at ThreeWill that have worked well for us.

Never pick a SharePoint list or library from the drop-down

Even though “Batch Updates” shows as a choice in the screenshot below, I still choose to enter a custom value and type in “Batch Updates”. The reason is that when you pick a list from the drop-down, Flow internally stores the unique ID of that list. However, if you enter a custom value, it stores the list title you provided instead. In Microsoft’s walkthrough, the inability to re-configure data sources like SharePoint lists is listed as a limitation, and this technique is one way to avoid that pitfall.

[Screenshot: the list drop-down shows “Batch Updates” as a choice, but “Enter custom value” is used to type the list title instead]

With list titles instead of list IDs, your import process becomes much easier: you do not have to go find and fix all your SharePoint list-related actions after import.

You can see the difference in the definition JSON examples below. The first has a “table” ID that is most definitely not going to exist in any target environment, whereas the second has a list title that will exist in the target environment.

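The screenshots are reconstructed below as trimmed, illustrative fragments (the site URL and GUID are placeholders, not from a real export). Picking the list from the drop-down stores its ID:

```json
"parameters": {
  "dataset": "https://contoso-dev.sharepoint.com/sites/Dev",
  "table": "2f4c3e9a-8d7b-4a1e-b6c5-0d9e8f7a6b5c"
}
```

Entering a custom value stores the list title instead:

```json
"parameters": {
  "dataset": "https://contoso-dev.sharepoint.com/sites/Dev",
  "table": "Batch Updates"
}
```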

Use variables as much as possible

To some, this may be obvious, but if you have worked in Flow you know that creating and using variables is just not as easy as it is in code. You may be tempted, when creating an expression, to put some string value right in the expression and call it a day. However, I have found that putting anything and everything that might change in a variable right at the beginning of your Flow will pay off later.

Unfortunately, you cannot wrap them in a control action like a scope, so you will likely end up with a lot of “Initialize variable” actions at the top, but in my opinion this is still preferable to having values nested as strings down in some function. The key is using good variable naming conventions, as we do in source code, so the purpose of each variable is clear. Then, when the business wants to change who gets emailed for an article review, you can change the “articleReviewerEmails” variable rather than hunting down the Send Mail action deep inside your flow.
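Under the covers, each of those actions is an InitializeVariable entry in the definition JSON. A minimal sketch, with a made-up name and value, looks like this:

```json
"Initialize_articleReviewerEmails": {
  "type": "InitializeVariable",
  "runAfter": {},
  "inputs": {
    "variables": [
      {
        "name": "articleReviewerEmails",
        "type": "string",
        "value": "reviewers@contoso.com"
      }
    ]
  }
}
```

Downstream actions can then reference the value with the expression variables('articleReviewerEmails') instead of a hard-coded string.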

Source control your JSON files (not your exported package)

This can sometimes feel odd given that the Flow editor is web-based and you can easily export and import packages with a little massaging of the Flow during import. However, in my experience, having all the JSON files that make up your Flow in source control lets you monitor changes between versions much more easily.

My process is to export the flow to a zip file, extract that zip file, and put the contents in a folder named for the flow. Typically, it will look something like the listing below, with a lot of JSON files in folders.

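The exact file names vary by export, but the extracted package generally looks something like this (the flow folder name and GUID here are placeholders):

```
BatchUpdatesFlow/
├── manifest.json
└── Microsoft.Flow/
    └── flows/
        └── 2f4c3e9a-8d7b-4a1e-b6c5-0d9e8f7a6b5c/
            ├── definition.json
            ├── apisMap.json
            └── connectionsMap.json
```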

Then I use a code editor like Visual Studio Code to format the JSON files, so they go from being on a single line, like this illustrative fragment (continuing the hypothetical example from earlier):

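```json
{"definition":{"$schema":"https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#","actions":{"Get_items":{"type":"OpenApiConnection","inputs":{"parameters":{"dataset":"https://contoso-dev.sharepoint.com/sites/Dev","table":"Batch Updates"}}}}}}
```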

To being neatly formatted like this:

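```json
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Get_items": {
        "type": "OpenApiConnection",
        "inputs": {
          "parameters": {
            "dataset": "https://contoso-dev.sharepoint.com/sites/Dev",
            "table": "Batch Updates"
          }
        }
      }
    }
  }
}
```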

Having the JSON neatly formatted means you can easily diff two files when you need to research changes between versions. It is also much easier to read, and it helps with our next tip.

Use PowerShell to prime and package based on your target environment

Now that our data sources have been partially genericized (our list titles, at least) and our JSON files have been formatted, it is easy to use PowerShell to replace values like the site collection URL, tenant, or other settings based on which environment we are deploying to. Admittedly, I did not invent this process but have been building on the work of others at ThreeWill to keep our processes streamlined and standardized.

Below is an example of a typical script for creating a package based on a target site collection. This one only replaces site collection URLs, but really any source value can be replaced with a target one; for example, you may want to update email addresses or the variables you initialize. The key steps are the find-and-replace in the definition JSON file and the creation of a zip file whose name includes the tenant and site collection, so you can tell from the package name alone where it should be deployed.

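The original script screenshot is not reproduced here, so the sketch below reconstructs the idea; the parameter names, paths, and URLs are placeholders, not ThreeWill’s actual script.

```powershell
param(
    [Parameter(Mandatory = $true)][string]$TargetTenant,    # e.g. "contoso"
    [Parameter(Mandatory = $true)][string]$TargetSiteName,  # e.g. "Support"
    [string]$FlowFolder = ".\BatchUpdatesFlow"              # extracted package folder
)

# Source and target site collection URLs (placeholders)
$sourceUrl = "https://contoso-dev.sharepoint.com/sites/Dev"
$targetUrl = "https://$TargetTenant.sharepoint.com/sites/$TargetSiteName"

# Copy the extracted package to a staging folder so the source files stay untouched
$staging = Join-Path $env:TEMP "flow-package"
if (Test-Path $staging) { Remove-Item $staging -Recurse -Force }
Copy-Item $FlowFolder $staging -Recurse

# Find and replace the source site collection URL in every definition.json
Get-ChildItem $staging -Recurse -Filter "definition.json" | ForEach-Object {
    (Get-Content $_.FullName -Raw).Replace($sourceUrl, $targetUrl) |
        Set-Content $_.FullName
}

# Zip the staged files into a package named for the target tenant and site,
# so the package name tells you where it should be deployed
$packageName = ".\BatchUpdatesFlow-$TargetTenant-$TargetSiteName.zip"
Compress-Archive -Path (Join-Path $staging "*") -DestinationPath $packageName -Force
```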

Hopefully, you find some of these tips and tricks helpful. I know I do, as a traditional developer living in a playground intended for no-code/low-code solutions and citizen developers. If you have any tips you’d like to share, feel free to comment below.

CONTACT THREEWILL FOR SUPPORT WHEN CONSOLIDATING OR MIGRATING CONTENT ACROSS TENANTS
