
Exam PL-600 Revision Notes: Designing Integrations for the Power Platform

Welcome to the sixth post in my series focused on providing revision notes for the PL-600: Microsoft Power Platform Solution Architect exam. Last time around, we dove into data modelling and the potential routes open to us as solution architects when working with the Power Platform. This week, we move on to the next topic, which concerns integrations and how we can best approach designing them:

Design integrations

  • design collaboration integrations
  • design integrations between Microsoft Power Platform solutions and Dynamics 365 apps
  • design integrations with an organization’s existing systems
  • design third-party integrations
  • design an authentication strategy
  • design a business continuity strategy
  • identify opportunities to integrate and extend Microsoft Power Platform solutions by using Microsoft Azure

Although the Power Platform is billed primarily as a citizen developer platform, we will, at times, need to leverage pro-code extensibility or other “fusion” development approaches to help build out our solution. And as part of this, we will likely need to factor in one or several different integrations involving our existing business systems. Let’s unwind this in further detail, and look at some of the routes available to us to help meet the challenges that may arise.

The aim of this post, and the entire series, is to provide a broad outline of the core areas to keep in mind when tackling the exam, linked to appropriate resources for more focused study. Ideally, your revision should involve a high degree of hands-on testing and familiarity with the platform if you want to do well. And, given the nature of this exam, it’s expected that you already have the necessary skills as a Power Platform Developer or Functional Consultant, with the certification to match.

Fostering Collaboration within the Power Platform

When building any form of business application, the central aim is to ensure that the organisation can better understand the critical information points relating to our customers and core business. As part of this, we can look to foster better collaboration by equipping staff members with the correct information at the right time. From there, we can have the appropriate discussions to celebrate a particularly successful quarter, or to start figuring out why sales dropped during February. These “side-benefits” will naturally materialise as we begin to deploy our Power Apps, Power BI reports, and other components; a solution architect does not necessarily need to do anything specific to help realise them. However, it is helpful to keep in mind the key features across the Power Platform that we could guide the organisation towards using to improve collaboration further.

As we’ve established previously in the series, Power Platform solution architects are very much subject matter experts (SMEs), with a natural expectation that we can advise on and encourage adoption of these features whenever a suitable usage case presents itself. We should keep them in the back of our minds whenever we want to promote further collaboration within our Power Platform solutions.

Integrating Dynamics 365 Customer Engagement with the Power Platform

We’ve discussed the core features within each Dynamics 365 Customer Engagement app previously, and how we can factor these apps in as part of our overall solution, so there’s no point retreading old ground. Suffice it to say, our journey and considerations when it comes to integrating with any of the Dynamics 365 Customer Engagement apps are simplified because they leverage many of the base components within the Power Platform. It becomes effortless to integrate both toolsets and, for example, trigger a Power Automate cloud flow when a new Dynamics 365 Sales Lead row is created (the sketch below illustrates the shared plumbing involved). A solution architect may need to consider and plan contingency actions relating to how we use Dynamics 365 components, and whether it may be appropriate to set up different environments to “contain” this functionality within a specific business area. Overall, these challenges shouldn’t be difficult to overcome.
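
Because the Dynamics 365 apps sit on top of Dataverse, the same Web API serves both sides of the integration. Here’s a minimal sketch - assuming a hypothetical environment URL and a pre-acquired AAD bearer token - showing that creating a Lead row is all it takes for a Dataverse-triggered cloud flow to pick up the event:

```python
import requests

# Hypothetical environment URL and pre-acquired AAD bearer token
DATAVERSE_URL = "https://contoso.crm.dynamics.com"
ACCESS_TOKEN = "<bearer-token-from-aad>"

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Content-Type": "application/json",
}

# Create a row in the "lead" table used by Dynamics 365 Sales. A cloud flow
# with a Dataverse "When a row is added" trigger on this table fires
# automatically once the create succeeds - no extra integration plumbing needed.
lead = {
    "subject": "New enquiry from website",
    "firstname": "Jane",
    "lastname": "Doe",
    "emailaddress1": "jane.doe@example.com",
}

response = requests.post(
    f"{DATAVERSE_URL}/api/data/v9.2/leads", headers=headers, json=lead
)
response.raise_for_status()

# Dataverse returns the URL of the newly created row in the OData-EntityId header
print("Lead created:", response.headers.get("OData-EntityId"))
```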

To get started with any Dynamics 365 Customer Engagement app, we need to fully appreciate the capabilities on offer as part of the Power Platform Admin Center. This portal provides us with a central location to manage each of our Dataverse environments, and, through here, we can look to install one or all of the different Dynamics 365 apps available to us. The only real constraint is to ensure we have the appropriate licenses provisioned on the tenant for the app(s) we want to install.

Demo: Power Platform Admin Center Overview

To help better understand how we can link together one (or several) Dynamics 365 Customer Engagement applications within our Power Platform solution, take a look at the video below, which provides a detailed walkthrough of what we can do within the Power Platform Admin Center:

Outline Approaches to Integrations

Anything involving some form of integration, from an IT standpoint, will typically elicit a collective moan. 😫 This reaction is understandable. If you’ve worked on projects similar to mine, you’ll know that technical integrations are very often easier said than done and can end up being incredibly tedious to work with. The Power Platform is no different in this regard. Although we have tools at our disposal to ease the journey, we will often find ourselves grappling with particularly awkward integrations. As part of this, we need to determine which approach to take - do we rely exclusively on “out of the box” components, or do we invest in building a bespoke integration using custom code?

Under the first approach, we can work in a relatively brisk fashion and leverage all aspects of the platform to their fullest extent. In this regard, Power Automate becomes a natural tool to turn to, thanks to its wide array of connectors and its visual designer. In addition, we can look to empower our functional consultants or citizen developers to build out these integrations themselves, significantly reducing our reliance on custom code. This lowers the technical cost of our solution and makes it easier to maintain in the future, both by ourselves and by Microsoft. Despite this, integrations like these can become verbose and push out-of-the-box components beyond what they can comfortably cope with. They can also render our solution borderline indecipherable. I’ve seen my fair share of “spaghetti flows” that take a whole day to digest and understand fully, so I feel qualified to make this assertion. 😉

Alternatively, we can invest heavily in pro-code or “fusion” development efforts to implement our particular integration. Indeed, this may be our only course of action if we are working with a particularly troublesome legacy application system, or where we envision a significant amount of transformation effort for our data. Integrations of this nature tend to be a lot more formalised from an IT standpoint, making them easier to include as part of any testing or Application Lifecycle Management (ALM) processes that we follow. The challenge is that they can sometimes involve a helluva lot of custom code and bespoke development effort, making our overall solution more difficult to maintain. It becomes a tricky balancing act, and we could accidentally introduce further cost into what we’ve built as time goes on.

Regardless of the different approaches available to us, the solution architect’s primary concern is to maintain a careful balancing act. By this, I mean ensuring that we don’t fall too far down one pathway and that we cleverly leverage the platform’s specific capabilities based on the scenario we are faced with. For example, by using a custom connector, our pro-code developers can build a connector that works with our legacy application system and becomes something our citizen developers can then leverage as part of the apps and cloud flows they create. By thinking carefully and planning our integrations correctly, we can hopefully end up in a position where we minimise bespoke development effort and exploit the capabilities of the Power Platform to their fullest extent.

Authentication Overview

Let’s suppose our organisation does not currently leverage any other Microsoft 365 service or Azure Active Directory (AAD). In this situation, it could be that our journey into the Power Platform provides our first opportunity to work with this identity platform. Consequently, we may need to consider and advise the business on the available capabilities and how they can be used to meet (and exceed) our expectations from an information security standpoint. The topic of AAD could fill an entire blog series but, to summarise, this identity provider gives us the following benefits:

  • OAuth 2.0 Support: As a thoroughly modern identity provider, AAD supports the latest version of the Open Authorization (OAuth) standard, providing a consistent and familiar standard for developers working with multiple cloud providers. Specifically for AAD, we also have support for flexible authentication flows that can leverage standard user accounts, service accounts, or service principals (see the sketch after this list).
  • Multi-Factor Authentication: To help provide an additional security layer for all login attempts, administrators can require end-users to set up multi-factor authentication for their accounts. This will mandate that the user provides a second piece of information during their logon attempt - typically a one-time passcode sent to a mobile device. Users can look to install the Microsoft Authenticator app onto their mobile devices, which will allow them to approve or deny a particular login attempt.
  • Conditional Access Policies: To provide an additional security layer, AAD administrators can configure different access policies for the various cloud services we implement. As part of this, we can look to block access to certain services based on a device’s location or, if it is outside the corporate network, mandate the use of MFA in these scenarios. There are several flexible options at our disposal.
  • Federation / Single Sign-On (SSO): Most applicable for scenarios where we have an existing, on-premise Active Directory forest, implementing this will help to ensure that users are not continually prompted to sign in to the Power Platform; instead, simply logging into a domain-joined device will be all that’s needed.
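
To make the service principal flow concrete, here’s a minimal sketch using the MSAL library for Python. The tenant ID, client ID, secret, and environment URL are all hypothetical placeholders, and a real implementation would keep the secret somewhere like Azure Key Vault rather than in code:

```python
import msal

# Hypothetical tenant, app registration, and environment values
TENANT_ID = "<aad-tenant-id>"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<client-secret>"  # in practice, store in Azure Key Vault
DATAVERSE_URL = "https://contoso.crm.dynamics.com"

# A confidential client application represents a service principal - no
# interactive user sign-in is involved, which suits unattended integrations
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)

# The ".default" scope requests whatever Dataverse permissions have been
# granted to the app registration
result = app.acquire_token_for_client(scopes=[f"{DATAVERSE_URL}/.default"])

if "access_token" in result:
    print("Token acquired; send it as a Bearer header to the Dataverse Web API")
else:
    print("Token request failed:", result.get("error_description"))
```

The client credentials flow shown here suits unattended, server-to-server integrations; interactive user sign-ins would use one of the delegated flows instead.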

A solution architect will need to advise on implementing one, all, or additional features on offer as part of AAD to ensure that our Power Platform users can access the applications built for them securely and straightforwardly. At this stage, it may be appropriate to bring in the expertise of dedicated AAD consultants or an Azure solution architect to advise further; because let’s face it, we can’t expect to be domain experts on this topic as well. 😉 Therefore, don’t worry too much about grasping AAD in depth.

Ensuring Business Continuity with the Power Platform

Because the Power Platform is a fully managed, software-as-a-service (SaaS) offering, Microsoft handles many aspects of it for us and provides a level of guarantee for business continuity in a severe outage or disaster recovery scenario. Notwithstanding these guarantees, there may still be steps that we, as solution architects, need to implement to ensure our solution continues to work if the unforeseeable occurs. There are several considerations that we should take into account here:

  • Is it sufficient to rely on the system Dataverse backups that Microsoft performs on our behalf, or do we need to create our own manual backups on top of this? Using backups to restore our environment, potentially to a different region, will be something we need to plan for in a disaster recovery scenario. It may also be prudent to consider and implement backups to other locations, such as an on-premise location or another public cloud provider, although there may be technical challenges to overcome here, given that we can’t just download a backup of our Dataverse environment (a rough export sketch follows this list).
  • Ensuring we have all aspects of our solution stored in source control, leveraging the approaches we’ve spoken about previously in the series, will allow us to perform “clean slate” restores, should the need arise. As part of this, we’ll need to consider restoring any configuration and business data.
  • Typically, as part of disaster recovery planning, we need to work towards a specific recovery time objective (RTO), which provides the organisation with an indication of the amount of time it will take to get us back to normal. Ensuring that we communicate an accurate timescale here and, most crucially, that we’ve performed a full test to verify any assumptions will be vital.
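
On that first point about copying data to a location we control: since Dataverse backups themselves can’t be downloaded, one workable fallback is to page through each table via the Web API and persist the rows ourselves. A minimal sketch follows - again assuming a hypothetical environment URL and token, and ignoring concerns like file attachments, relationships, and throttling that a real backup job would need to handle:

```python
import json
import requests

# Hypothetical environment URL and pre-acquired token (see the earlier MSAL sketch)
DATAVERSE_URL = "https://contoso.crm.dynamics.com"
ACCESS_TOKEN = "<bearer-token-from-aad>"

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}", "OData-Version": "4.0"}


def export_table(entity_set: str, path: str) -> None:
    """Page through a Dataverse table via the Web API and save the rows locally."""
    rows = []
    url = f"{DATAVERSE_URL}/api/data/v9.2/{entity_set}"
    while url:
        response = requests.get(url, headers=headers)
        response.raise_for_status()
        payload = response.json()
        rows.extend(payload["value"])
        # Dataverse returns a nextLink while further pages remain
        url = payload.get("@odata.nextLink")
    with open(path, "w", encoding="utf-8") as f:
        json.dump(rows, f, indent=2)


# Example: copy account rows to a file in a location we control
export_table("accounts", "accounts-backup.json")
```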

Many of the requirements here will be largely dictated by the organisation in question, its size and industry type, and no two projects or businesses are the same in this regard. Therefore, the solution architect may have to spend considerable time planning and implementing bespoke technical solutions to ensure we can maintain business continuity at all times.

Azure Integration Options

Microsoft Azure provides us with the most straightforward mechanism to extend the capabilities of the Power Platform, and it should always be the first point of consideration for any solution architect. There are innumerable ways we can leverage Azure in this fashion but, most typically, the following types of integrations are the ones we would recommend as solution architects:

  • Azure Synapse Link: Previously, if we wanted to continuously export our Dataverse data into Azure, we would have to look at options such as the Data Export Service. Now that this service has been deprecated, however, we can instead leverage Azure Synapse Link to export our data into an Azure Data Lake Storage Gen2 location and then surface it within an Azure Synapse Analytics workspace. This allows us to consume our Dataverse environment data as part of any “big data” processing solutions, or simply feed it into the organisation’s existing data warehouses.
  • Connectors: We have well over 350 connectors available as part of our canvas Power Apps and Power Automate flows and, among these, access to a variety of Azure-based services, such as Azure Blob Storage, Azure Key Vault, Azure Data Factory, and more. It therefore becomes relatively trivial for citizen developers to extend their solutions into these services as and when the need arises.
  • Dataverse Extensibility: Developers of Dataverse plug-ins can extend their execution into Microsoft Azure by exporting the complete transaction details to Azure Service Bus, or by calling an HTTP endpoint residing on an Azure Function or similar (see the sketch below). This can be useful for circumventing some of the sandbox limitations relating to plug-in execution, and as a mechanism to “feed” Dataverse data out into other external systems.
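
On the receiving side, the endpoint can be as simple as an HTTP-triggered Azure Function. The sketch below uses the Python programming model and assumes a webhook step has been registered against Dataverse (for example, via the Plug-in Registration Tool) so that the serialised execution context arrives as the request body, with the HTTP trigger binding already configured; the property names shown are those of the standard RemoteExecutionContext payload:

```python
import json
import logging

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    """HTTP-triggered Azure Function registered as a Dataverse webhook endpoint.

    Dataverse posts the serialised execution context - message name, target
    table, attribute values - whenever the registered step fires.
    """
    context = req.get_json()

    message = context.get("MessageName")        # e.g. "Create"
    entity = context.get("PrimaryEntityName")   # e.g. "lead"
    logging.info("Received %s on %s", message, entity)

    # From here, the payload could be transformed and relayed onwards to
    # another external system
    return func.HttpResponse(json.dumps({"status": "processed"}), status_code=200)
```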

Other topics also stray into this area, such as the preview capabilities that allow us to link our app/Dataverse consumption to an Azure subscription, to account for any overages or unpredictable usage patterns. Typically, preview features will not appear on the exam. Still, it remains beneficial for a solution architect to be apprised of such capabilities, so we’re adequately prepared for when they move into general availability.

Demos: Reviewing Microsoft Azure Extensibility Options

To better understand what options are available to us, take a look at the videos below, which demonstrate how to work with Azure Synapse Link and how we can export Dataverse transactions into an Azure Function or to Azure Service Bus:

If, as solution architects, we design our integrations well, it will ensure that we get the most value out of our Power Platform solution and increase its utility across the organisation. Next time in the series, we will look at how to design the most effective security model within the Power Platform.
