Although Microsoft didn't really plan for it (and probably didn't want you to), most organisations understood the benefits of application lifecycle management and instigated a Dev / Test / Prod environment strategy.
With all of those benefits unfortunately came a big negative: the overhead of deploying solutions.
This led to the launch of Power Platform Pipelines, but, going back to the first sentence, this wasn't implemented quite right. That's because a big part of lifecycle management is segregation of duties, so the developer should not have access to test or prod. The standard pipeline requires the developer to have full maker access to test and prod, which in the real world leads to developers editing in prod.
Microsoft again listened and released delegated deployments, using either an application account (SPN) or a normal but non-human account (Service Account). But again there were issues:
SPN Issues
- Requires the developer who creates the pipeline to own the SPN in Azure, which is not good security practice
- Uses the developer's connections, so although they don't have access to the test/prod environments, they do have access to test/prod data
Service Account Issues
- Can't deploy connections (yep, you heard me, only solutions with no connections!)
Luckily there is a workaround to these issues that gets Power Platform Pipelines fit for use:
- Design
- Implementation
- Flows
1. Design
The workaround relies on 3 key facts:
- Delegated Service Accounts can create their own connections
- SPNs can have connections shared with them
- Solution deployments include a config JSON (the deployment settings)
So the workaround is a little bit of hot potato, with the connections being passed around.
- First we update the config JSON with the Service Account's connections (a sketch of that JSON follows this list)
- The pipeline then deploys using the Service Account's connections
- After deployment the SPN swaps the connections (and the components) over to the new owner
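For reference, the deployment settings JSON being passed around looks roughly like this (the logical names and values below are made up purely for illustration):

    {
      "EnvironmentVariables": [
        {
          "SchemaName": "ia_SharePointSiteUrl",
          "Value": "https://contoso.sharepoint.com/sites/test"
        }
      ],
      "ConnectionReferences": [
        {
          "LogicalName": "ia_sharedsharepointonline_example",
          "ConnectionId": "<the Service Account's connection id in the target environment>",
          "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline"
        }
      ]
    }

It's the ConnectionId values in that second array that get rewritten before the deployment runs.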
2. Implementation
Unfortunately the implementation is a little more complex, as we need some prior setup and extra admin.
Extra Setup
What's great about the pipeline is that it allows you to create the connection in the target environment when you trigger the run. But that can't be done if we aren't using the developer's connections, so we have to set them up beforehand.
In every target environment we need to create all possible connections that the Service Account could use.
Additionally, we have to set up the solution's new owner's connections and share them with the SPN.
Extra Admin
As we have added extra complexity we need a way to administer it, and the key data we need is:
- Solution
- New Owning Service Account
as without that data we won't know which account to change to after import.
The easy way is to create a Dataverse table and Model-Driven app (I call it the Pipeline Register), and while we are doing all that work we might as well add in some useful data like:
- Approval
- Documentation Link
- Change Number
- Developer team (so we can share read-only access)
And once all that is done we end up with something like this:

Better resolution version here
3. Flows
Power Platform Pipelines have a very important feature: Gated Extensions.
So we're going to leverage these gates to run flows that action our required updates.
OnApprovalStarted
This flow builds the solution config JSON, updating the connection references to use the delegated deployment Service Account.
- First we get the deployment stage run details
- Next we parse the JSON config from the table: @{outputs('Get_a_row_by_ID')?['body/deploymentsettingsjson']}
- Then we need to find the connections for the Service Account in the target environment, using Get Connections as Admin and filtering (a sketch of the filter source follows)
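The From input of that filter is just the value array returned by the admin action, so something along these lines (the action name is whatever yours is called):

    @{outputs('Get_Connections_as_Admin')?['body/value']}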
- Then, from the parsed config, we loop over each connection reference and find the exact connection in the target environment:
and(
  contains(
    item()?['id'],
    split(items('For_each')?['ConnectorId'], '/apis/')[1]
  ),
  equals(
    item()?['properties/displayName'],
    parameters('Pipeline-DelegatedAccount (ia_PipelineDelegatedAccount)')
  )
)
- Next we set the ConnectionId property of the current connection reference to the matched connection, adding the result to a holding array variable:
setProperty(
  items('For_each'),
  'ConnectionId',
  body('Filter_array_connections')[0]?['name']
)
- Finally we update the deployment stage run detail item we found at the beginning.
We get the environment variables from the parsed JSON so as not to wipe them.
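A minimal sketch of that update, assuming the config was parsed in an action I've called Parse_JSON and the rewritten references were collected in a variable I've called UpdatedConnectionReferences (both names are mine):

    string(
      setProperty(
        body('Parse_JSON'),
        'ConnectionReferences',
        variables('UpdatedConnectionReferences')
      )
    )

That keeps the original environment variables and only swaps in the new connection references before writing the value back to deploymentsettingsjson.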
As this is also prior to import, this is where I would add additional checks and approvals (a rough example follows the list):
- Is it in the Pipeline Register
- If it's going to prod, is it approved
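As a rough example, assuming a List rows action (I've called it List_Pipeline_Register_rows) filtered to the solution being deployed, the Pipeline Register check is just:

    greater(
      length(outputs('List_Pipeline_Register_rows')?['body/value']),
      0
    )

The prod approval check can follow the same pattern against the Approval column.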
OnDeploymentCompleted
After the solution is imported we need to change everything over from the delegated service account to the owning account.
The 2 key steps are:
- Changing connections - I've done a deep dive on how to do this here: Power Automate - How to Change Connection Owners
- Changing component owner - again, deep dive already done here: Automating Changing Solution Owner in the Power Platform
You must do it in the above order, as the new owner must have access to the connection used.
And like the pre-approval stage, there are other actions you can take:
- Share any flows with dev team
- Back up solution to external repository
- Share app URLs
As you can see there is definitely added complexity, especially around pre-creating the connections. I tried the Power Automate Management action Create Connection but always got errors. The other option is our trusty Power Automate Desktop to do it through the UI, but I really wish there was some sort of credential bank for admins to manage them.
But if your org follows good practices like true separation of duties and application lifecycle management, then you can at least now use the inbuilt Pipelines.



