Exam PL-400 Revision Notes: Designing a Technical Architecture for the Power Platform

This one crept under the radar for me, but the PL-400: Microsoft Power Platform Developer exam recently came out of beta and is now - one would hope - in a good state for a broader audience to sit and, all being well, attain the brand new certification aligned to it. I sat the exam while it was in beta and was rather chuffed and surprised when I got news of my result a few days ago…

With a passing grade secured for this, I now feel (somewhat) more confident to start another revision notes series, to assist those taking the exam in future. So I’m pleased to welcome you to the first post in this series! As always, we focus our attention on the Skills Measured area of each exam, which is freely available for study by all and sundry. Top of the agenda is the Create a technical design area of the exam, which has a total weighting of 10-15% and the first section of which concerns the following:

Validate requirements and design technical architecture

  • design and validate the technical architecture for a solution
  • design authentication and authorization strategy
  • determine whether you can meet requirements with out-of-the-box functionality
  • determine when to use Logic Apps versus Power Automate flows
  • determine when to use serverless computing, plug-ins, or Power Automate
  • determine when to build a virtual entity data source provider and when to use connectors

Before we start, some of the more eagle-eyed readers of the blog may start getting a feeling of déjà vu as you read through this post and the ones that follow on. That’s because I’ve adapted content from my previous series on the now legacy MB-400 exam. In actual fact, a lot of the core content is broadly similar between both exams. There are some crucial differences, particularly around terminology, core focus areas and the introduction of new functionality that wasn’t present in late 2019. Rather than fixing the previous posts, I thought it would be better to keep things separate, re-utilise existing content, and refresh it accordingly. I should also highlight that this post and the series aim to provide a broad outline of the core areas to keep in mind when tackling the exam, linked to appropriate resources for more focused study. Ideally, your revision should involve a high degree of hands-on testing and familiarity working with the platform. But that’s enough jabbering - let’s dive into the first topic!

Power Platform Technical Architecture

The Power Platform is Microsoft’s low-code, rapid business application development platform, which can help inspire organisations to do more with less, and often forgo the need to develop a new software solution from scratch. Within the Power Platform, we have several independent, yet closely related, products that we can leverage:

  • Power Apps: These come in two flavours. Model-driven apps are designed for desktop-based scenarios, where your application needs to sit within the confines of a strict data model. In this respect, you may often hear these types of apps referred to as data-driven applications. If you’ve come from a Dynamics CRM / Dynamics 365 background, then you may recognise a lot of the functionality available within model-driven apps. In comparison, Canvas apps are geared towards mobile-first scenarios, providing app developers with a high degree of freedom in designing their apps and deploying them to a wide variety of different devices or alongside other applications within the Power Platform. Canvas apps also have the benefit of being interoperable with a wide variety of data sources. Whether you wish to connect to an on-premises SQL Server instance, other Microsoft solutions such as SharePoint or third-party apps, such as Twitter, connectors are available to perform common Create, Read, Update and Delete (CRUD) operations and more.
  • Power BI: A next-generation Business Intelligence (BI) tool, Power BI provides impressive data modelling, visualisation and deployment capabilities that enable organisations to better understand data from their various business systems. Although Power BI has its own set of tools and languages, traditional Excel power users should have little difficulty getting to grips with it, thereby allowing them to migrate existing Excel-based reports across with ease.
  • Power Automate: As a tool designed to automate various business processes, Power Automate flows can trigger specific activities based on events from almost any application system. It is a modern and flexible tool that you can use to address various integration requirements.
  • Power Virtual Agents: Many of us will be familiar with the various live chat solutions that we see across different websites, which are often operated by one or more individuals and help answer commonly asked queries. Power Virtual Agents takes this a step further, by allowing for an automated, always-on bot to reside within your website or Microsoft Teams site, which individuals can then engage with. Developers construct a chatbot using an interactive editor, and can straightforwardly incorporate external integrations without writing any code.
  • Microsoft Dataverse (formerly known as the Common Data Service): The Dataverse provides a “no-code” environment to create tables, relationships and business logic, to name but a few of its capabilities. Within the Dataverse, Microsoft has standardised the various tables to align with the Common Data Model. This open-source initiative seeks to provide a standard definition of commonly used business data constructs, such as Account or Contact.

The diagram below, lovingly recycled from Microsoft, illustrates all of these various applications and how they work together with other Microsoft services you may be familiar with:

You may often hear questions about how Dynamics 365 fits within the Power Platform, the answer to which sometimes raises more questions than it answers. For this exam, you don’t need to worry too much about this dimension. However, to briefly summarise, solutions such as Dynamics 365 Sales or Dynamics 365 Service leverage aspects of the Power Platform underneath the hood. For example, both of the previously mentioned solutions use Microsoft Dataverse and model-driven Power Apps.
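
To make this a little more concrete, every Dataverse table, including the standardised Account and Contact tables mentioned above, is exposed through an OData-based Web API. The snippet below is a minimal sketch of querying the accounts table from TypeScript; the environment URL and access token are hypothetical placeholders, and error handling is kept deliberately thin:

```typescript
// Minimal sketch: query the top five Account rows from the Dataverse Web API.
// The environment URL and access token are hypothetical placeholders.
const environmentUrl = "https://yourorg.crm11.dynamics.com";
const accessToken = "<AAD access token acquired beforehand>";

async function getTopAccounts(): Promise<void> {
  const response = await fetch(
    `${environmentUrl}/api/data/v9.2/accounts?$select=name&$top=5`,
    {
      headers: {
        Authorization: `Bearer ${accessToken}`,
        Accept: "application/json",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
      },
    }
  );

  if (!response.ok) {
    throw new Error(`Dataverse request failed with status ${response.status}`);
  }

  // The Web API returns an OData payload; the rows sit in the "value" array.
  const payload = (await response.json()) as { value: { name: string }[] };
  payload.value.forEach((account) => console.log(account.name));
}

getTopAccounts().catch(console.error);
```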

Understanding how each separate Power Platform application can work in tandem is critical when building an all-encompassing business application. The examples below provide a flavour of how these applications can work together, but the full list would likely cover several pages:

  • Including a Power Automate flow as part of a Dataverse solution, allowing you to then deploy it out to multiple environments with ease.
  • Embedding a Power BI tile or dashboard within a personal dashboard in a model-driven Power App.
  • Embedding a canvas Power App into Power BI, allowing users to update data in real-time.
  • Calling a Power Automate flow from a Power Virtual Agent, to return information from an on-premises Oracle database.

Microsoft expects us, as developers on the platform, to know the detailed scenarios that the Power Platform can unlock for organisations and, where possible, identify the most efficient solution to adopt - one that may often negate the need to write custom code.

Handling Security & Authentication

Ensuring that critical business data is subject to reasonable and, where appropriate, elevated access privileges is typically an essential requirement as part of any central business system. The key benefit the Power Platform brings to the table here is that it uses one of the best identity management platforms available today - Azure Active Directory (AAD). Some of the benefits of AAD include:

  • Providing a true single sign-on (SSO) experience across multiple 1st/3rd party applications, backed up by robust administrator controls and auditing capabilities.
  • Allowing full support for user principal or security group level controls, via role-based access controls (RBAC).
  • Access to a wide range of security-minded features, such as Multi-Factor Authentication (MFA), risky sign-in controls and self-service password reset capabilities, should a user account or its associated password be detected as a potential risk.

When it comes to managing security or access within the Power Platform, the approach differs based on which application you are working with.

Typically, a developer will want to design any application to use the Dataverse as the underlying data source for the solution. The security and record restriction features afforded here will more than likely be suitable for most situations.
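
For custom code that needs to call into the Dataverse from outside the platform (a console app, an Azure Function and so on), AAD also handles the authentication side. The sketch below shows one common approach, the server-to-server client credentials flow via the MSAL library; the app registration details and environment URL are hypothetical placeholders, and the registration would typically be paired with an application user and security role inside the target environment:

```typescript
import { ConfidentialClientApplication } from "@azure/msal-node";

// Hypothetical app registration and environment details - replace with your own.
const tenantId = "<tenant-id>";
const clientId = "<app-registration-client-id>";
const clientSecret = "<client-secret>";
const environmentUrl = "https://yourorg.crm11.dynamics.com";

const msalClient = new ConfidentialClientApplication({
  auth: {
    clientId,
    authority: `https://login.microsoftonline.com/${tenantId}`,
    clientSecret,
  },
});

// Acquire an access token for the Dataverse environment using the
// client credentials (server-to-server) flow.
export async function getDataverseToken(): Promise<string> {
  const result = await msalClient.acquireTokenByClientCredential({
    scopes: [`${environmentUrl}/.default`],
  });

  if (!result?.accessToken) {
    throw new Error("Failed to acquire a Dataverse access token.");
  }

  return result.accessToken;
}
```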

Comparing Logic Apps to Microsoft Power Automate Flows

Confusion can arise when figuring out what Azure Logic Apps are and how they relate to Power Automate. That’s because they are almost precisely the same; Power Automate uses Azure Logic Apps underneath the hood and, as such, contains most of the same functionality. Determining the best situation to use one over the other can be a bit of a challenge. The list below summarises the pros and cons of each tool:

  • Azure Logic Apps
    • Enterprise-grade authoring, integration and development capabilities.
    • Full support for Azure DevOps or Git source control integration.
    • “Pay-as-you-go” - you only pay if and when your Logic App executes.
    • Cannot be included in solutions.
    • Must be managed separately in Microsoft Azure.
    • Does not support Office 365 data loss prevention (DLP) policies.
    • Target Audience: Developers who are familiar with dissecting structured JSON definitions.
  • Power Automate
    • Easy-to-use development experience
    • Can be included within solutions and trigger based on specific events within the Dataverse
    • Supports the same connectors provided within Azure Logic Apps
    • Difficult to configure alongside complex Application Lifecycle Management (ALM) processes.
    • Fixed monthly subscription, with quotas/limits - may be more expensive than Logic Apps.
    • Must be developed using the browser/mobile app, with no option to modify the underlying code definition.
    • Target Audience: Office 365 power users or low/no-code developers.

In short, you should start with Power Automate flows in the first instance. Consider migrating across to Logic Apps if your solution grows in complexity, your flow executes hundreds of times per hour, or you need to look at implementing more stringent ALM processes as part of your development cycles. Fortunately, Microsoft makes it easy to migrate your Power Automate flows to Logic Apps.

Comparing Serverless Computing to Plug-ins

Serverless is one of those buzzwords that gets thrown around a lot these days 😀 But it is something worth considering, particularly in the context of the Power Platform. The recent changes around API limits also make serverless computing - via the Azure Service Bus, for example - a potentially desirable option for reducing the number of API calls made into the Dataverse. The list below summarises the pros and cons of each route, with a short serverless sketch following the list to make things a little more tangible:

  • Serverless Compute
    • Allows developers to build solutions using familiar tools, while leveraging the benefits of Azure.
    • Not subject to any sandbox limitations for code execution.
    • Not ideal when working with non-Azure based services/endpoints.
    • Additional setup and code modifications required to implement.
    • No guarantee of the order of execution for asynchronous plug-ins.
  • Plug-ins
    • Traditional, well-tested functionality, with excellent samples available.
    • Reduces the technical complexity of any solution, by ensuring it remains solely within the Dataverse.
    • Full exposure to the underlying database transaction.
    • Impractical for long-running transactions/code.
    • Not scalable and subject to any platform performance/API restrictions.
    • Restricts your ability to integrate with separate, line-of-business (LOB) applications.

Comparing Virtual Tables to Connectors

The core idea of adopting the Power Platform is to ultimately reduce the number of separate systems within an organisation and, therefore, any complex integration requirements. Unfortunately, this endeavour usually fails in practice and, as system developers, we must contemplate two routes for bringing data into the Dataverse:

  • Virtual Tables: Available for several years now, this feature allows developers to “load” external data sources in their entirety and work with them as standard tables. Provided that the external data source is accessible via an OData v4 endpoint, it can be hooked up straightforwardly (a quick way of sanity-checking this is sketched at the end of this section); for more complex needs, developers can build a custom data provider, thereby allowing operability with any conceivable data source. Historically, the critical restriction around virtual tables was that all data was read-only once retrieved, with no ability to create or update records; however, as a consequence of recent updates from Microsoft, Virtual Tables now support full CRUD operations. You can refer to the following article for details on how to get started on this. Thanks to WB below in the comments for providing the link.
  • Connectors (AKA Data Flows): A newer experience, available from within the Power Apps maker portal, this feature leverages the full capabilities provided by Power Query (M) to allow you to bring in data from a multitude of different sources. As part of this, developers can choose to create tables automatically, map data to an existing table and specify whether to import the data once or continually refresh it in the background. Because any rows loaded are stored within a proper table, there are no restrictions on working with the data. However, this route does require additional setup and familiarity with Power Query. It’s also not bi-directional (i.e. any changes to records imported from SQL Server will not synchronise back across).

Ultimately, the question you should ask yourself when determining which option to use is this: do I need the ability to create, update or delete records from my external system? If the answer is no, then consider using Virtual Tables.
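
If Virtual Tables are in the running, one quick way of sanity-checking the OData v4 requirement mentioned above is to query the candidate endpoint directly and confirm it responds with well-formed OData. The sketch below assumes a hypothetical external service URL:

```typescript
// Minimal sketch: confirm a hypothetical external endpoint speaks OData v4
// before considering it as an out-of-the-box virtual table data source.
const externalServiceUrl = "https://external.example.com/odata";

async function checkODataEndpoint(): Promise<void> {
  // A compliant OData v4 service exposes a $metadata document describing
  // its entity sets, and echoes its version in the OData-Version header.
  const response = await fetch(`${externalServiceUrl}/$metadata`, {
    headers: { "OData-MaxVersion": "4.0", Accept: "application/xml" },
  });

  console.log(`Status: ${response.status}`);
  console.log(`OData-Version header: ${response.headers.get("OData-Version")}`);
}

checkODataEndpoint().catch(console.error);
```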

Hopefully, this first post has familiarised you with some of the core concepts around extending the Power Platform. In the next post, we’ll be looking at how we can design various components within our Power Platform solution, using the tools on offer from Microsoft.
