Welcome to the second post in my series focused on providing a set of revision notes for the PL-400: Microsoft Power Platform Developer exam. In last week’s post, we discussed the high-level concepts you need to grasp to build a technical architecture within the Power Platform. This topic sits within the Create a specialised design area of the exam, which has a 10-15% weighting and also comprises the following second topic, the focus for today’s post:
Design solution components
- design a data model
- design Power Apps reusable components
- design custom connectors
- design server-side components
Here, we start to look at some of the more technical aspects of the Power Platform, which can often form the foundation of a more complex solution and limit the amount of actual code we need to write. However, it can be tricky to determine which particular technical features Microsoft is referring to in the blurb above; so let’s dive in and try to make some sense of it all. 😀
As with all posts in this series, the aim is to provide a broad outline of the core areas to keep in mind when tackling the exam, linked to appropriate resources for more focused study. Ideally, your revision should involve a high degree of hands-on testing and familiarity working with the platform if you want to do well in this exam.
Data Model Design Fundamentals
A common pitfall for any developer working with applications like the Power Platform is to miss the bleeding obvious; namely, ignoring features, such as Business Rules or Power Automate flows, that reduce the need for custom code. Related to this, we can often overlook many of the capabilities built into Microsoft Dataverse that are well suited to modelling business data effectively. When designing and implementing any data model using Dataverse, you should:
- Carefully review the entire list of tables within the Common Data Model. Determine, as part of this, whether an existing table captures all of the information you need, whether it can be customised with minor additions, or whether a brand new table is necessary instead.
- Consider the different table ownership options and how they relate to your security model. For example, if you need to restrict rows to specific users or business units, ensure that you configure the table for User or Team ownership. Microsoft’s documentation covers these options in further detail.
- Review the differences between a standard and an Activity table, and choose the correct option based on your requirements. For example, when setting up a table to record individual WhatsApp messages to customers, use the Activity table type.
- Digest and fully understand the list of different field types available for creation. Ensure that you select the most appropriate data type for any new field, and factor in any potential reporting requirements as you do so.
- Understand the fundamental concepts around table relationships. For the exam, Microsoft expects you to be able to distinguish between 1:N and N:N relationships, including the differences between native and manual N:N relationships.
- Be familiar with using Microsoft Visio tools and, particularly, crow’s foot notation diagrams, to help plan and visualise your proposed data model.
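To make the relationship types above concrete, here is a simple crow’s-foot-style sketch of a small, entirely hypothetical data model, contrasting a native N:N relationship with a manual one:

```
Account ──1:N──< Contact
    (one account, many contacts)

Contact >──N:N──< Marketing Event
    (native N:N – Dataverse creates and manages the intersect
     table for you, but you cannot add columns to it)

Contact ──1:N──< Event Registration >──N:1── Marketing Event
    (manual N:N – an explicit intersect table you create yourself,
     which can carry extra columns such as Registration Date)
```

As a rule of thumb, a manual N:N is generally preferable when you need to store data about the relationship itself.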
These tips provide just a flavour of some of the things to consider when designing your Dataverse data model. Future posts will dive into technical topics relating to this, which should ultimately factor back into your thinking when architecting a solution.
Designing Reusable Components
Arguably, one of the benefits of leveraging a platform like the Power Platform is the ability to build solutions that can be adapted quickly to multiple scenarios. This introduces several advantages for developers and ultimately allows the fruits of our labour to be adapted to suit different organisations, industries or business units. Specifically, from a developer’s standpoint, we can look to leverage the following features in support of this objective:
- Canvas Power App Components: As we build out our canvas apps, we often fuse various individual controls into common, repeatable groups that we then wish to use multiple times across our app. For example, we could generate a custom ribbon for our application, composed of a shape, a label and a picture control. However, merely grouping these controls doesn’t make it possible to export them for use within different apps. This is where components come to the fore, by allowing us to build low-code, extensible components that we create once and deploy multiple times. Components also support a particular formula - OnReset - that does what it says on the tin; namely, it returns the component to its default state. OnReset is particularly useful if you wish to perform a calculation based on input from the main app itself. Components should be your first port of call when you are developing multiple apps and want to streamline your development process. However, they are limited in scope - they don’t, for example, support model-driven Power Apps. Note also that Microsoft still (at the time of writing this post) lists this feature as being in public preview, meaning you’re unlikely to receive a question or scenario relating to it. However, this will likely change as Microsoft refreshes the exam content and as this feature moves into general availability.
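As a hedged sketch of how the OnReset pattern above fits together in Power Fx (all control, component and property names here are hypothetical):

```
// Inside the component: a text input control (txtAmount) takes its
// Default value from a custom input property defined on the component.
// txtAmount.Default:
Parent.DefaultValue

// The component's OnReset property: return the control to its default,
// which re-reads the input property supplied by the hosting app.
// OnReset:
Reset(txtAmount)

// In the hosting app, e.g. a button's OnSelect: trigger the component's
// OnReset from outside, passing it back to its default state.
Reset(cmpAmountEntry_1)
```

The key idea is that the component owns its reset logic, while the hosting app simply asks it to reset.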
Getting into the mindset of reducing the amount of time it takes to deploy a solution, by first ensuring that your solution is ultimately reusable, is an essential concept for any Power Platform developer to grasp and should always be the key objective of your daily work. Use the above tools wisely to support this objective and don’t always resort to PCF controls, if components or another solution will do the job.
Custom Connectors Overview
As we touched upon in last week’s post, an arguable benefit of adopting canvas Power Apps is that they are ultimately agnostic when it comes to the data sources you wish to connect to. Indeed, as we saw, it’s possible to connect various cloud or on-premises applications, covering both Microsoft and third-party vendors. However, consider the following two scenarios:
- Your organisation has a legacy, on-premises API that you need to communicate with from the Power Platform. This API is complex, using technology such as SOAP. You need to find a way of securely exposing it to the Power Platform, allowing others in the organisation to authenticate with and access the API’s core functionality straightforwardly.
- You are an ISV Developer with a bespoke API you’ve built out. You want to allow your customers to interact quickly with your solution without needing detailed knowledge of how to construct Web API queries or an understanding of concepts such as OData.
In both of these situations, we may struggle to identify a suitable default connector that allows us to meet the core objectives - namely, providing a straightforward and familiar way for Power Platform users to work with the API within their apps and flows. Enter stage right custom connectors. Using these, developers can import the definition of their API from a custom wizard, a Postman collection or an OpenAPI definition, define the API’s authentication and capabilities, and then share it for users to start working with. ISV developers can go a step further by getting their connector certified by Microsoft, allowing them to publish it on AppSource for anyone worldwide to use. Custom connectors are ideal if you anticipate multiple users needing to interact with a single API across the Power Platform, as they reduce the complexity involved in interacting with APIs. When used appropriately, custom connectors can speed up delivering multiple solutions and ensure our work is highly reusable within an organisation. Again, custom connectors will be a crucial focus later in the blog series, so let’s not get too lost in the woods with them right now.
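To give a flavour of what an imported definition looks like, here is a minimal, entirely hypothetical OpenAPI 2.0 (Swagger) fragment of the kind you might bring into a custom connector - the host, paths and operation names are invented for illustration:

```yaml
swagger: "2.0"
info:
  title: Contoso Orders API   # hypothetical legacy API
  version: "1.0"
host: api.contoso.example
basePath: /v1
schemes:
  - https
paths:
  /orders/{id}:
    get:
      operationId: GetOrder
      summary: Retrieve a single order
      parameters:
        - name: id
          in: path
          required: true
          type: string
      responses:
        "200":
          description: The requested order
```

Each operation (such as GetOrder above) surfaces in Power Apps and Power Automate as a named action, hiding the underlying HTTP plumbing from makers.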
What are Server-Side Components?
As a developer, it may be desirable to ensure we are building components that reside and execute on the server, as opposed to the client, side. This is particularly true in the context of Microsoft Dataverse, as there will be situations where we need to validate, reject, override or approve specific user actions automatically, to ensure that our desired business processes are always followed. Typically, components of this nature will execute without the user necessarily being aware of what’s going on, but this is not always the case (e.g. we may want to return a custom error message when a user violates a business condition). When building our solutions on top of the Power Platform and, specifically, Microsoft Dataverse, it’s prudent to familiarise ourselves with the following components:
- Business Rules: Although more traditionally targeted towards client-side validation, we can also configure Business Rules to execute at the table (i.e. server) level. This will allow us to, for example, set the value of a field if a condition is met, without needing to refer to a Power Automate flow, classic workflow or another tool. Our logic will always be obeyed, regardless of whether we’re using a model-driven or canvas app. Business rules are most useful when you have to model this very simplistic kind of logic, but may start to fall over if you need to, for example, perform calculations or complex logic evaluation targeting multiple tables.
- Plug-ins: When we find our business logic impossible to map without resorting to code of some kind, plug-ins come to the rescue in allowing us to implement C# or VB.NET based class assemblies, that can interact directly with the platform, either synchronously (i.e. as part of the database transaction) or asynchronously (i.e. as a separate process, after the core database transaction has completed). As we can incorporate any bespoke logic we want, plug-ins offer a high degree of flexibility, within the confines of specific limitations and with the natural expectation of needing experience in the appropriate programming language to implement.
- Business Process Flows (BPFs): Similar to Business Rules, BPFs will most often be utilised within the context of the user interface and, specifically, a model-driven app. However, it is worth highlighting that the core information relating to a BPF is managed server-side. Details regarding the current process stage, the duration spent within a stage, and even custom attributes are ultimately stored within the Dataverse. Developers are free to interact with this at any time. They can even alter the progress of a BPF based on any pre-requisite conditions, thereby ensuring that rows proceed to the appropriate resolution, business area or individual.
- Real-Time Classic Workflows: As a final consideration (for reasons I will highlight shortly), developers can also implement simple or complex workflow automation steps using a guided interface. This experience is virtually identical to the workflow creation experience within Dynamics CRM / 365 Customer Engagement. It also affords the same benefits - namely, allowing us to trigger synchronous business logic without using a plug-in. We can also have the best of both the plug-in and real-time workflow worlds by implementing a custom workflow assembly to perform more complex operations and - again - achieve a high degree of reusability within a solution. So why do I say we should consider real-time workflows a last resort? Microsoft has very clearly indicated that we should avoid creating classic, background workflows and use Power Automate flows instead to achieve the same functionality. There has been no official word (yet) regarding the status of real-time workflows. Still, my recommendation is to avoid using them unless they are necessary, and you do not have sufficient C#/VB.NET knowledge to create a plug-in instead.
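To illustrate the plug-in model described above, here is a minimal, hedged C# sketch of a synchronous plug-in - the class name and the business condition are hypothetical - which rejects a create operation and surfaces a custom error message to the user when a condition is violated:

```csharp
using System;
using Microsoft.Xrm.Sdk;

// Hypothetical example: registered synchronously on the Create message
// of the account table, so it runs inside the database transaction.
public class PreAccountCreateValidation : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        // Obtain the execution context, which describes the event
        // (message, table, input parameters) that triggered the plug-in.
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        if (context.InputParameters.Contains("Target") &&
            context.InputParameters["Target"] is Entity target)
        {
            // Hypothetical business condition: every account must
            // have a name. Throwing here rolls back the transaction
            // and shows the message to the user.
            if (!target.Attributes.Contains("name"))
            {
                throw new InvalidPluginExecutionException(
                    "An account name is required.");
            }
        }
    }
}
```

Because the plug-in executes server-side, this validation is enforced regardless of whether the row is created from a model-driven app, a canvas app or the Web API.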
Once more, please don’t concern yourself with these complex topics for now, as we’ll be returning to them later on. Instead, focus your attention on the potential use cases, benefits and disadvantages of each component.
Today, we’ve summarised several concepts at a high level, all of which are useful to consider at the design stage of a Power Platform solution. We’ll be deep-diving into many of these in the weeks and months ahead. In the next post in this series, we’ll review the extensibility points within the various Power Platform applications, rounding off our discussion of the first exam area in the process.