Extracting Dynamics 365 / Power Apps Solutions Using YAML (Azure DevOps)

Continuous integration and build automation remain the core tenets of a successful DevOps and Application Lifecycle Management (ALM) process. Regardless of the type of software system you are working with, you should always make reasonable endeavours within both of these areas, so that you can meet the following objectives:

  • Store all software artefacts within a shared code repository that provides a full history of all changes and the ability to determine the last known good configuration or version.
  • Provide the mechanism to store and then quickly deploy compiled code artefacts into any target environment.
  • Automate all aspects of the previous steps, and more besides, to reduce the amount of human intervention required.

Thankfully, although a lot of this may sound tricky to implement on the face of it, we have a plethora of tools at our disposal to help speed us along. Azure Pipelines is a competent tool in this regard and, via the use of YAML build definitions, we can achieve the above objectives and more. For example, using YAML, we can very quickly put together a library of deployment templates that are suitable for use across multiple projects, and subject to versioning/change control too. In today’s post, we’ll see how we can use YAML to automate the extraction of Dynamics 365 Online / Power Apps solution files.
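As a brief illustration of the template idea, a pipeline in one project can consume a YAML template held in a central repository; the repository and template file names below are hypothetical:

resources:
  repositories:
  - repository: templates
    type: git
    name: MyProject/PipelineTemplates  # assumed shared template repository

jobs:
# Pulls the job definition from the shared repository (assumed file path)
- template: jobs/extract-solution.yml@templates
  parameters:
    solutionName: 'MySolution'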

For those with longstanding experience working in the Microsoft Business Applications space, Solutions have been the mainstay mechanism for defining, controlling and migrating bespoke customisations to our Common Data Service environments. Consisting of a .ZIP file containing a multitude of different components (such as XML, DLLs, JavaScript, image files and more), we can best think of them as a complete “bundle” of all the changes we wish to apply to a given environment. However, in most cases, the extraction and deployment of these solution files has traditionally required manual intervention, unless you were an experienced coder with a few hours to spare. Also, the solution .ZIP file in its base form is impractical to store from a repository standpoint. Although we’ll be able to track whenever our pipeline generates a new solution file, we have no visibility over what has changed under the hood. Again, to do this, we’d need to look at some bespoke mechanism to extract all of the raw components into a logical, accessible folder/file structure. Altogether, then, it has previously been rather tricky to set up the kind of DevOps/ALM solution that I indicated at the start of this post.
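For context, when a solution file is unpacked into its raw components, the resulting folder structure looks something like the following (illustrative, not exhaustive):

MySolution
  Other
    Solution.xml
    Customizations.xml
  Entities
    Account
      Entity.xml
  WebResources
  Workflows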

We can be thankful, therefore, that we live in more enlightened times these days; and, in particular, that we have some effective tools at our disposal to help us build something quickly, without having to resort to writing much custom code. Using the example YAML definition below and, courtesy of the Power Platform Build Tools, we can extract a solution file at 8 PM (UTC) each day, expand out its contents and then push all changes into a chosen Git branch:

name: $(TeamProject)_$(BuildDefinitionName)_$(SourceBranchName)_$(Date:yyyyMMdd)$(Rev:.r)

trigger: none
schedules:
- cron: "0 20 * * *"
  displayName: Daily Build
  branches:
    include:
    - MyArea/MyBranch
  always: true

jobs:
- job: ExtractMySolution
  pool:
    vmImage: 'windows-latest'
  steps:
  - task: PowerPlatformToolInstaller@0
    inputs:
      DefaultVersion: true
  - task: PowerPlatformSetSolutionVersion@0
    inputs:
      authenticationType: 'PowerPlatformEnvironment'
      PowerPlatformEnvironment: 'My Environment'
      SolutionName: 'MySolution'
      SolutionVersionNumber: '1.0.0.$(Build.BuildID)'
  - task: PowerPlatformExportSolution@0
    inputs:
      authenticationType: 'PowerPlatformEnvironment'
      PowerPlatformEnvironment: 'My Environment'
      SolutionName: 'MySolution'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\MySolution.zip'
      AsyncOperation: true
      MaxAsyncWaitTime: '60'
  - task: PowerPlatformUnpackSolution@0
    inputs:
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)\MySolution.zip'
      SolutionTargetFolder: '$(Build.SourcesDirectory)\JJG.MyProject\MySolution'
  - task: CmdLine@2
    inputs:
      script: |
        echo commit all changes
        git config user.email "[email protected]"
        git config user.name "Automatic Build"
        git checkout MyArea/MyBranch
        git add --all
        git commit -m "Latest solution changes."
        echo push code to new repo
        git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin MyArea/MyBranch

Let’s talk through exactly what this YAML file is doing:

  1. To begin, we must install all of the prerequisite components needed by the Power Platform Build Tools. This is a mandatory step and will prevent any nasty errors further down the line.
  2. We then update the version of our solution, using a combination of a fixed version number and the unique build ID from Azure DevOps; for example, build 4521 would produce solution version 1.0.0.4521.
  3. Next, we perform an export of the unmanaged solution from the tenant, using an asynchronous operation to process this. The pipeline then stores the resulting .zip file within a local directory on the build agent.
  4. Then, the pipeline unpacks the entire contents of the solution file into a new directory within the sources directory, which, in this instance, will be a direct copy of the contents of our MyArea/MyBranch branch.
  5. Finally, we run a series of Git commands via a command prompt to push all of the extracted solution contents back into our remote repository. As part of this, we define a custom user name and commit message for these changes.
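One caveat with that final step is worth flagging: on a day where nothing in the solution has changed, git commit will exit with a non-zero code and fail the job. A minimal guard (my own addition, not part of the pipeline above) is to only commit when the index actually contains changes:

git add --all
git diff --cached --quiet || git commit -m "Latest solution changes."

Here, git diff --cached --quiet exits successfully when nothing is staged, so the commit only runs when there is something to record.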

So in just five steps, we’ve been able to extract every single component of our solution, and automate the entire process of getting these changes back into our repository. And all without requiring a single manual step - nice!

While this YAML file is relatively self-contained in terms of its functionality, there are also a few things you’ll need to set up around it to get it working as intended.

  • The branch in question (in this case, MyArea/MyBranch) will need to exist in your target repository before running the pipeline for the first time (see the sketch after this list).
  • Make sure the Project Collection Build Service account has been granted the Contribute privilege over the repository you are working with. You can verify this by navigating to Project Settings -> Repositories, selecting the repository you are working with and ensuring that the Contribute privilege is set to Allow. This account will typically contain the name of your DevOps organisation in the title.
  • Ditto above, but this time for the Build Service account for your project. This account will typically have the naming format of <Project Name> Build Service (<Organisation Name>).
  • Create a generic service connection, which will store the details of the Dynamics 365 / Common Data Service environment hosting your solution file. For this, you will need the URL of your instance and the username and password of an account with sufficient privileges to extract solutions from the environment.
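On the first of these points, a minimal way to create and publish the branch, assuming you already have the repository cloned locally:

git checkout -b MyArea/MyBranch
git push -u origin MyArea/MyBranch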

Altogether then, by using the solution outlined in this post, or a variant thereof, developers no longer need to worry about manually extracting and checking in their solution changes each day. The MyArea/MyBranch branch can then remain open for all incoming changes, which we can then push where they need to go as part of a Pull Request further down the line. And, finally, we can assure the business that we are meeting the three objectives outlined at the start of this post…and, hopefully, preventing some poor individual from completing repetitive, manual tasks at the end of each day too. 🙂
