Welcome to the sixth post in my series focused on providing revision notes for the PL-600: Microsoft Power Platform Solution Architect exam. After a
pleasantly short hiatus from when we last discussed designing integrations targeting the Power Platform, I’m pleased to be resuming this series by looking at the final topic within the Architect a solution area of the exam, titled Design the security model. For this area, candidates must demonstrate knowledge relating to the following:
Design the security model
- design the business unit and team structure
- design security roles
- design field security
- design security models to address complex sets of requirements
- determine security model management policies and processes
- identify Azure Active Directory groups and app registrations required to support a solution
- identify data loss prevention (DLP) policies for a solution
- determine how external users will access a solution
Security remains a paramount concern for any IT solution that an organisation adopts. Naturally, the solution architect of any Power Platform solution will need to be able to recommend and advise on the best approach to adopt. For the most robust security in any application we build, adopting Microsoft Dataverse over other data sources, such as SharePoint Online, may also become necessary. Let’s unpack all of this and dive deeper into each of the aforementioned features.
The aim of this post, and the entire series, is to provide a broad outline of the core areas to keep in mind when tackling the exam, linked to appropriate resources for more focused study. Ideally, your revision should involve a high degree of hands-on testing and familiarity with the platform if you want to do well. And, given the nature of this exam, it’s expected that you already have the necessary skills as a Power Platform Developer or Functional Consultant, with the certification to match.
Dataverse Security Recap
You should have already passed either the PL-200 or PL-400 exam at this stage of your Microsoft certification journey. As such, you should be able to readily recall the various details about Microsoft Dataverse’s security capabilities. Nevertheless, let’s spend a few moments to recap the core elements:
- Business Units: As our highest-level mechanism for defining security boundaries within Dataverse, we can use business units to logically separate users, teams and rows from others, and as a means of modelling complex hierarchies within our organisation.
- Teams: Most useful when working with multiple users, don’t forget that we have several different types of teams available to us. Owner Teams are the most common type we use, supporting security role assignment and row ownership, while Access Teams are most valuable in the context of Access Team Templates and complex row-sharing requirements in the platform.
- Security Roles: Using this feature, we can not only control the precise permissions a user has access to, but also the most appropriate boundary that this should apply to - either globally across the entire Dataverse environment, right through to business unit level and lower.
- Column (Field) Security Profiles: Best for those situations where we are working with highly sensitive categories of data, our profiles provide a means of restricting Read, Update or Create privilege on a specific column within your Dataverse environment.
Further discussion of the technical capabilities of each of these is beyond the scope of this exam, so I’d recommend going back a few steps to the earlier exams if you have any significant knowledge gaps.
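To make the interplay between these building blocks more concrete, here’s a minimal Python sketch of how row-level Read access resolves against a role’s privilege depth. This is purely an illustrative model of my own - the class and function names are not the Dataverse API - but it mirrors the User / Business Unit / Parent:Child Business Unit / Organization access levels that security roles expose:

```python
from dataclasses import dataclass

@dataclass
class SecurityRole:
    name: str
    privileges: dict  # e.g. {"account": {"Read": "BusinessUnit"}}

@dataclass
class User:
    name: str
    business_unit: str
    roles: list

def can_read(user, table, row_owner, row_bu, child_bus):
    """Simplified check of whether any of the user's roles grant Read
    access to a row, given the row's owner and owning business unit.
    child_bus maps a business unit to its child business units."""
    for role in user.roles:
        depth = role.privileges.get(table, {}).get("Read")
        if depth == "Organization":
            return True
        if depth == "ParentChild" and (
            row_bu == user.business_unit
            or row_bu in child_bus.get(user.business_unit, [])
        ):
            return True
        if depth == "BusinessUnit" and row_bu == user.business_unit:
            return True
        if depth == "User" and row_owner == user.name:
            return True
    return False
```

In the real platform, privileges merge across all of a user’s assigned roles, with the least restrictive depth winning - which is exactly what the loop above approximates.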
How To Design the Best Dataverse Security Model
With such a wide array of features available, it can sometimes be tricky to figure out what will (and what won’t) be necessary for us to architect within Dataverse to reach that ultimate sweet spot - protecting the organisation without frustrating users in the process.
The basic rule of thumb I always adhere to here, and I would advise any solution architect, is to KISS - Keep It Simple, Stupid. This adage is particularly true when it comes to Business Units. The temptation may be to model out your organisational structure 1:1 via Business Units, thereby resulting in many tens or hundreds of new Business Units being created. This is not only going to make it incredibly difficult to maintain your solution in the long term but also cause you potential performance problems too. Far better instead to think carefully about the minimum number of required security boundaries within your organisation and only implement these as Business Units.
Another thing I would advise is to use what’s already there, particularly when it comes to Security Roles. You’ll notice that every environment contains the same set of Roles, such as Basic User. These roles contain many of the base privileges users need to open a model-driven app and interact with most (if not all) of the different tables within the Common Data Model. Attempting to create new security roles from scratch and re-populating them with privileges can become tedious (I speak from experience here 😉), so it’s far better to take a copy of an existing Role and remove the privileges you don’t need.
Finally, I’d always recommend “mixing-and-matching” between the various features discussed. You’ll rarely be able to achieve your overall requirements by using just, for example, Security Roles. Like the Power Platform itself, each feature is powerful in isolation but more effective when we combine them together. Often, the most effective security model is the one that judiciously and prudently uses most, if not all, of the capabilities within Dataverse to achieve your overall requirements.
Demo: Overview of Microsoft Dataverse Security Features
In this video, we’ll dive into Microsoft Dataverse and perform a quick recap to see how we can work with business units, teams, security roles, and column (field) security profiles:
Azure Active Directory Groups in the Power Platform
For anyone with at least some basic experience of Active Directory, you’ll be familiar with the concept of Security Groups. These are typically used in the on-premises world as a means of applying group policies, sharing access and applying other uniform sets of privileges to teams, departments or the organisation as a whole. In the cloud world, from an Azure Active Directory (AAD) perspective, the concept is still very much alive; partly as a backward-compatibility feature, but also as a means of achieving the same kinds of things we would do on-premises for our users. We can primarily use AAD groups in two ways within the Power Platform:
- To control access to specific Dataverse environment(s). New or existing Dataverse environments can be linked to any group set up on the tenant, and only members of the group will be provisioned as Users. All other Users will be ignored, or deactivated if they already exist. Solution architects should always advocate for and utilise this functionality, as it typically avoids situations where every licensed user on the tenant gets created in an environment.
- To link as part of a Team row we create in Dataverse. By doing this, membership of the team automatically updates based on the membership list defined in AAD, thereby reducing the administrative effort involved in maintaining this within Dataverse. This functionality also supports Microsoft 365 (Office 365) groups as well.
It can be rare that an organisation is not already using AAD groups in some capacity, so it behoves a solution architect to investigate, identify, and promote the usage of anything existing in the tenant that could benefit the project.
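The first behaviour described above - restricting an environment to members of a security group - can be sketched with a short, hypothetical Python function. The function name and data shapes are my own invention; the real filtering happens inside the Power Platform Admin Center when a group is associated with an environment:

```python
def provision_environment_users(tenant_users, group_members, existing_users):
    """Simulate associating an AAD security group with a Dataverse
    environment: only group members are provisioned as enabled users,
    and previously provisioned users outside the group are disabled."""
    enabled = [u for u in tenant_users if u in group_members]
    disabled = [u for u in existing_users if u not in group_members]
    return enabled, disabled
```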
Demo: Working with Azure Active Directory Groups in the Power Platform
In this video, we’ll see how we can create security groups within the Azure Portal, and we can then use these within the Power Platform:
Azure App Registrations: Who, What, Why
Application Registrations come into play when we plan to do any form of integration or pro-code/fusion extensibility targeting the Dataverse platform. Given that various legacy authentication mechanisms, such as Basic, are now no longer supported, we must instead turn to the capabilities on offer as part of the OAuth 2.0 protocol within Azure Active Directory. As well as offering multiple potential authentication patterns, it’s also incredibly easy to use, regardless of the tech stack involved.
One way in which we can use them, from a Power Platform and Dataverse standpoint, is to authenticate into the platform’s Web API as our user account. As noted in the documentation, there are some specific setup steps required (such as enabling Implicit Flow), but the general pattern results in an access token being generated, which we then use to gain entry into our Dataverse environment. In the implicit pattern, we are impersonating our existing user account, utilising the Application Registration and its permissions to achieve this.
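As a rough illustration of the implicit pattern, the sketch below builds the Azure AD authorize URL that a client redirects the user to. The tenant, client ID and environment URL are placeholders, and a real implementation would typically use a library such as MSAL rather than hand-crafting the request:

```python
from urllib.parse import urlencode

def build_implicit_auth_url(tenant_id, client_id, environment_url, redirect_uri):
    """Construct the OAuth 2.0 authorize URL for an implicit-flow token
    request against a Dataverse environment. The access token comes back
    in the redirect fragment and acts as the signed-in user."""
    base = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/authorize"
    params = {
        "client_id": client_id,
        "response_type": "token",  # implicit flow returns the token directly
        "redirect_uri": redirect_uri,
        "scope": f"{environment_url}/user_impersonation",
    }
    return f"{base}?{urlencode(params)}"
```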
To go a step further, Application Registrations can also be used to create an Application User in a Dataverse environment. This approach is most suitable when we plan a server-to-server integration and don’t want to rely on a specific individual’s user account. The user account that results from this process behaves the same way as any other user, meaning we can assign it security roles and rows (records), and place it within our chosen business unit. I would typically recommend using Application Users for scenarios such as connecting to Dataverse from Azure DevOps, or running plug-ins/cloud flows with an account that has scoped privileges.
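For the server-to-server (Application User) pattern, the OAuth 2.0 client credentials grant is used instead - no interactive sign-in, just the Application Registration’s ID and secret (or certificate). Here’s a sketch of the token request shape, with placeholder values and no actual HTTP call made:

```python
def client_credentials_request(tenant_id, client_id, client_secret, environment_url):
    """Return the token endpoint and form body for an OAuth 2.0 client
    credentials grant. The resulting token authenticates as the
    Application User, carrying whatever security roles it was assigned."""
    token_endpoint = (
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    )
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # .default requests all app permissions granted to the registration
        "scope": f"{environment_url}/.default",
    }
    return token_endpoint, body
```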
Demo: Working with Azure Application Registrations in the Power Platform
In this next video, we’ll walk through the process of creating an Application Registration in the Azure Portal and how we can then use this to create an Application User within a Dataverse environment:
Data Loss Prevention (DLP) Policies Overview
The primary benefit of a solution such as the Power Platform is the flexibility it gives you when easily connecting to different applications and services. However, this can also be the platform’s Achilles heel if we don’t manage things correctly. Suppose an unscrupulous user in our organisation, who is permanently leaving the next day, decides to create a cloud flow that exports all of our prospect data from our Dynamics 365 Sales system and transfers it into a personal Dropbox. Not only has this soon-to-be-erstwhile colleague got the inside track on what our sales team are doing, but we’ve also - almost certainly - caused a data breach. Not a great situation to be in, or an optimal return on investment, I’m sure you’ll agree.
To help mitigate against these common scenarios and provide administrators with control over their environments, we can configure DLP policies to dictate precisely which connectors can be used and in what combinations. Now, we shouldn’t be aiming to use this capability to turn our environments into the Power Platform’s equivalent of Fort Knox. Instead, we should consider and categorise the connectors we want to allow, based on the users involved, their department and what they are trying to achieve with the Power Platform. Once our DLP policies are configured, they take effect immediately, and our users will start seeing the appropriate notifications if they breach a policy. DLP policies work primarily in the context of our cloud & desktop flows and our canvas apps. Solution architects should be aware of this and be familiar with the experience of setting them up within the Power Platform Admin Center. Typically, we’ll also need to instruct and guide the organisation towards using them appropriately.
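Under the hood, a DLP policy essentially classifies connectors into business, non-business and blocked groups, and refuses any app or flow that mixes groups or touches a blocked connector. A simplified Python sketch of that evaluation logic (the connector names and grouping here are illustrative, not a real policy):

```python
BUSINESS = {"SharePoint", "Microsoft Dataverse", "Office 365 Outlook"}
BLOCKED = {"Dropbox"}  # e.g. personal cloud storage

def evaluate_flow(connectors):
    """Return (allowed, reason) for a flow's connectors under a simple
    DLP policy: blocked connectors are refused outright, and business
    connectors cannot be combined with non-business ones. Connectors
    not explicitly classified fall into the non-business group."""
    groups = set()
    for connector in connectors:
        if connector in BLOCKED:
            return False, f"{connector} is blocked by policy"
        groups.add("business" if connector in BUSINESS else "non-business")
    if len(groups) > 1:
        return False, "mixes business and non-business connectors"
    return True, "ok"
```

This is why the Dropbox scenario above fails safely once a sensible policy is in place: the exporting flow would combine a business data source with a blocked (or non-business) destination.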
Demo: Working with Data Loss Prevention (DLP) Policies
In this final video, we’ll go through the steps involved in creating policies to control what connectors can be used across your different Power Platform environments:
External Users and the Power Platform
The majority of apps, automations, and reporting tools we build out within the Power Platform will be for internal consumption within the organisation only. This will be particularly true if we work with sensitive, personally identifiable information (PII) or other commercially sensitive categories of data. Having it so that all and sundry can access this information freely could cause us some difficulty. 😅 Notwithstanding this, there are occasional situations where giving external users access is necessary. The main scenarios where we see this come into play include:
- Power Apps Portals: This is perhaps the most obvious and, arguably, the most secure mechanism at our disposal, provided, of course, our portal has implemented user authentication. Any type of information existing within our Dataverse environment can be exposed onto the portal, and we have fine-grained control over precisely what information we show. We can even go a step further and assign different roles to different categories of users, using a set of security permissions that align strongly with the various Dataverse security model principles we’ve reviewed already.
- Power BI: In most scenarios, we’ll be consuming Power BI internally, either personally or via Workspaces (provided our users are licensed somehow). However, there are opportunities to embed Power BI report content in a website, publish our report to any web location or even to go a step further and leverage Power BI Embedded analytics instead. Although these options may be tempting, the solution architect needs to review the benefits and disadvantages and ensure that the organisation doesn’t accidentally share data that it shouldn’t.
- Connectors: As we should already be well aware, the sheer number of connectors available to us makes it possible to expose information to external users. For example, we could use the Twitter connector to post to social media about the latest updates relating to our organisation for everyone to see and interact with.
The solution architect should carefully review any potential licensing implications for other scenarios. If not planned carefully, we could find ourselves falling foul of multiplexing, which the latest version of the licensing guide defines as follows:
Multiplexing refers to the use of hardware or software that a customer uses to pool connections, reroute information, or reduce the number of users that directly access or use the Power Apps, Power Automate and Power Virtual Agents service. Multiplexing does NOT reduce the number of subscription licenses of any type required to access the Power Apps, Power Automate and Power Virtual Agents apps. Any user or device that accesses Power Apps, Power Automate and Power Virtual Agents apps, directly or indirectly must be properly licensed.
While the patterns above could open a technical route to providing external users access to our Power Platform solution, we would nonetheless find ourselves in breach of our licensing agreement. We could then expect an unwelcome knock on the door from Microsoft in the future if we are discovered. 😉 An old saying stands true here; just because we can do something doesn’t mean we should.
We’re on the home stretch now as we near the end of this series. In the final post next time around, we’ll tackle the last exam area, concerning how we implement a Power Platform solution.