For the past 13 weeks on the blog, I have delivered a series of posts on Microsoft Exam 70-778, providing a set of detailed revision notes covering the broad array of Power BI features assessed in the exam. To round things off, today’s blog brings together everything I have discussed so far in the series, with the hope that this post can serve as a single reference point for those who have not been following along to date.

Microsoft Exam 70-778 Overview

The exam, with its full title Analyzing and Visualizing Data with Microsoft Power BI, is targeted towards Business Intelligence (BI) and data professionals who are looking to validate their skills in working with Power BI. The exam, alongside Exam 70-779: Analyzing and Visualizing Data with Microsoft Excel, is a required component in attaining the Microsoft Certified Solutions Associate (MCSA) certification in BI Reporting. Successful candidates can then optionally pass an additional “elective” exam to gain the Microsoft Certified Solutions Expert (MCSE) certification in Data Management and Analytics.

Skills Measured in the Exam

The skills measured are outlined below, alongside links to the relevant posts from the series and the list of essential points to remember:

Consuming and Transforming Data By Using Power BI Desktop

Connect to data sources.
Skills Measured

May include: Connect to databases, files, folders; import from Excel; connect to SQL Azure, Big Data, SQL Server Analysis Services (SSAS)

Revision Notes

Exam 70-778 Revision Notes: Importing from Data Sources

Key Takeaways
  • Power BI supports a broad range of database systems, flat file, folder, application and custom data sources. While it is impossible to memorise each data source, you should at least broadly familiarise yourself with the different types at our disposal.
  • A crucial decision for many data sources relates to the choice of either Importing a data source in its entirety or in taking advantage of DirectQuery functionality instead (if available). Both routes have their own defined set of benefits and disadvantages. DirectQuery is worth consideration if there is a need to keep data regularly refreshed and you have no requirement to work with multiple data sources as part of your solution.
  • Live Connection is a specific data connectivity option available for SQL Server Analysis Services. It behaves similarly to DirectQuery.
  • It is possible to import an existing Excel BI solution into Power BI with minimal effort, alongside the ability to import standard worksheet data in the same manner as other flat file types.
Perform transformations
Skills Measured

May include: Design and implement basic and advanced transformations; apply business rules; change data format to support visualization

Revision Notes

Exam 70-778 Revision Notes: Performing Data Transformations

Key Takeaways
  • The Power Query M formula language is used to perform transformations on data once it is loaded into Power BI. Although it is possible to write M code directly, Power BI allows us to define all of our required data changes from within the interface, without writing a single line of code.
  • Each data source connected to represents itself as a Query within Power BI. There are many options at your disposal when working with Queries, such as renaming, merging, duplication and the ability to disable or reference as part of other Queries.
  • There is a wide range of column transformations that can be applied, too numerous to list in full. The Transform tab provides the best means of seeing what is available, with options ranging from formatting through to grouping and pivoting/unpivoting.
  • New columns can be added via the Add Column tab. You can base new columns on calculations, conditional logic, other column values or a defined list of ascending numbers, which may be useful for indexing purposes.
  • It is possible to merge or append queries together to suit your specific requirements. Merging involves the horizontal combination of Queries, whereas appending represents a vertical combination.
  • Parameters can be used to help optimise any complex filtering requirements.
  • Where possible, Power Query will attempt to use the most optimal query for your data source, based on the transformation steps you define. This action is known as Query Folding and, in most cases, SQL-derived data sources will support this option by default.
Cleanse data
Skills Measured

May include: Manage incomplete data; meet data quality requirements

Revision Notes

Exam 70-778 Revision Notes: Cleansing Data

Key Takeaways
  • Data can be filtered directly within Power Query, using Excel-like functionality to assist you in only returning the most relevant data in your queries. The data type of each field plays a particularly important part in this, as only certain filter options will be at your disposal if, for example, you are working with numeric data.
  • From a data quality perspective, you typically will need to handle column values that contain one of two possible value types:
    • Errors: This will usually occur as a result of a calculated column field not working correctly. The best solution will always be to address any issues with your calculated column, such as by using a conditional statement to return a default value.
    • Blanks/NULLs: A common symptom when working with SQL-derived data sources, blank values really start to cause problems when you attempt to implement DAX custom columns/Measures outside of the Power Query Editor. It is, therefore, recommended that these are dealt with via a Replace action appropriate to the field's data type. For example, blank/NULL values in a number field should be replaced with 0.
  • The Remove Rows option(s) can act as a quick way of getting rid of any Error or Blank/NULL rows and can also be utilised further to remove duplicates or a range of rows. In most cases, you will have similar options available to you with Keep Rows instead.
  • There are a variety of formatting options available to us when working with text/string data types. These range from fixing capitalisation issues in data, through to removing whitespace/non-printable character sets and even the ability to prepend/append a new value.

Modeling and Visualizing Data

Create and optimize data models.
Skills Measured

May include: Manage relationships; optimize models for reporting; manually type in data; use Power Query

Revision Notes

Exam 70-778 Revision Notes: Create and Optimise Data Models

Key Takeaways
  • Relationships form the cornerstone of ensuring the long-term viability and scalability of a large data model. Assuming you are working with well-built existing data sources, Power BI will automatically detect and create Relationships for you. In situations where more granular control is required, these Relationships can be specified manually if needed. It is worth keeping in mind the following important features of Relationships:
    • They support one-to-one (1:1), one-to-many (1:N) and many-to-one (N:1) cardinality, with many-to-many (N:N) currently in preview.
    • Filter directions can be specified either one way or bi-directionally.
    • Only one relationship between any two tables can be active at a given time.
  • It is possible to sort columns using more highly tailored custom logic via the Sort By Column feature. The most common requirement for this generally involves the sorting of Month Names in date order but can be extended to cover other scenarios if required. To implement, you should ensure that your data has a numbered column to indicate the preferred sort order.
  • Moving outside of the Power Query Editor gives us more flexibility when it comes to formatting data to suit particular styling or locale requirements. While the majority of this functionality concerns date/time and currency formatting, it is also possible to categorise data as a Location, a type of URL or a Barcode value; these options assist Power BI when rendering certain types of visualizations.
  • There may be ad-hoc requirements to add manually defined data into Power BI – for example, a list of values that need linking to a Slicer control. The Enter Data button is the “no-code” route to achieving this and supports the ability to copy & paste data from external sources. For more advanced scenarios, you also have at your disposal a range of M code functionality to create Lists, Records and Tables, which can be extended further as required.
Create calculated columns, calculated tables, and measures
Skills Measured

May include: Create DAX formulas for calculated columns, calculated tables, and measures; Use What If parameters

Revision Notes

Exam 70-778 Revision Notes: Using DAX for Calculated Columns

Key Takeaways
  • DAX is the primary formula language when working with datasets outside of Power Query. It includes, to date, more than 200 functions that can assist in all manner of data modelling.
  • An important concept to grasp within DAX is context and, specifically, row context (formulas that calculate a result for each row in a dataset) and filter context (formulas that automatically apply any filtering carried out at report level).
  • The sheer amount of DAX functions available makes it impossible to master and remember all of them, particularly when it comes to the exam. Your learning should, therefore, focus on learning the general syntax of DAX and the general types of functions available (aggregation, date/time etc.)
  • There are three principal means of utilising DAX with Power BI:
    • As Measures: These typically present a scalar value of some description, often an aggregation or a result of a complex formula. Using them in association with a Card visualization type is recommended, but this is not a strict requirement.
    • As Calculated Columns: Similar to the options available within Power Query, Calculated Columns provide a dynamic and versatile means of adding new columns onto your datasets. Compared with the options available within Power Query and the complexity of the M language, DAX Calculated Columns might represent a more straightforward means of adding custom columns onto your datasets.
    • As Calculated Tables: A powerful feature, mainly when used in conjunction with Calculated Columns, you have the ability here to create entirely new datasets within the model. These will typically derive from any existing datasets you have brought in from Power Query, but you also have functionality here to create Date tables, sequence numbers and manually defined datasets as well.
  • What-if Parameters provide a means of testing DAX formulas, as well as allowing report users to perform predictive adjustments that can affect multiple visualizations on a report.
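To make the Measure, Calculated Column and Calculated Table patterns above concrete, here is a short DAX sketch. The Sales table and its Amount/Cost/Product columns are hypothetical names used for illustration only, and the last two lines are a hand-rolled equivalent of what a What-if Parameter generates for you:

```dax
-- A Measure: returns a scalar value, evaluated in the current filter context
Total Sales = SUM ( Sales[Amount] )

-- A Calculated Column: evaluated row by row (row context) and stored in the table
Margin = Sales[Amount] - Sales[Cost]

-- A Calculated Table: an entirely new table derived from existing model data
Top Products = TOPN ( 10, VALUES ( Sales[Product] ), [Total Sales] )

-- A What-if Parameter is backed by a series table plus a SELECTEDVALUE measure
Discount Parameter = GENERATESERIES ( 0, 0.5, 0.05 )
Discount Value = SELECTEDVALUE ( 'Discount Parameter'[Value], 0 )
```

Note how the Measure and Calculated Column use the same column references but behave differently: the Measure aggregates whatever rows are visible in the report, while the Calculated Column is computed once per row when the model refreshes.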
Measure performance by using KPIs, gauges and cards.
Skills Measured

May include: calculate the actual; calculate the target; calculate actual to target; configure values for gauges; use the format settings to manually set values

Revision Notes

Exam 70-778 Revision Notes: Utilising KPIs with Gauge Visualisations

Key Takeaways
  • There are two principal visualization types available within Power BI to help track actual-to-target progress – KPIs and Gauges.
  • KPIs provide a more visually unique means of a binary success/fail determination when tracking towards a target. It is also possible to use KPIs to track variance over time via the Trend axis. The Indicator will typically be the result of some form of aggregation or Measure.
  • Gauges provide a less visually distinctive, but non-binary, mechanism of viewing progress towards a target. Gauges support more potential field well values when compared with KPIs, nearly all of which are optional in some way. You can also manually define some of these values, for situations where your data model does not contain the required information.
  • All visualizations within Power BI are modifiable from a display or formatting perspective. The same basic options will generally be supported – such as changing a font type or background colour – with more specific configuration properties available per unique visualization type. For example, a KPI visualization can be customised to hide the background Trend Axis entirely. All of these options are designed to give developers greater control over the look and feel of their reports and to mirror them as closely as possible to any potential branding requirement.
  • When building out a solution designed to monitor progress within Power BI, the steps involved will typically be more in-depth than merely creating a new visualization. In most cases, there will be a requirement to bring together a lot of the other skills that have been discussed previously within this series – such as creating DAX formulas, modifying data within Power Query or bringing together different data sources into a single model. It is essential, therefore, not to underestimate the amount of time and effort involved in creating a practical solution that takes advantage of KPIs or Gauges.
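For illustration, the actual, target and actual-to-target values that feed a KPI or Gauge are typically simple Measures along the following lines; the Sales and Targets tables here are assumed names, not part of any particular dataset:

```dax
Actual Revenue = SUM ( Sales[Amount] )
Target Revenue = SUM ( Targets[Amount] )

-- DIVIDE guards against a blank or zero target, returning BLANK instead of an error
Variance % = DIVIDE ( [Actual Revenue] - [Target Revenue], [Target Revenue] )
```

The first two Measures would be assigned to the Indicator/Value and Target Goals/Target value field wells respectively, with the variance Measure available for supporting visuals such as Cards.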
Create hierarchies
Skills Measured

May include: Create date hierarchies; create hierarchies based on business needs; add columns to tables to support desired hierarchy

Revision Notes

Exam 70-778 Revision Notes: Creating Hierarchies

Key Takeaways
  • Hierarchies within Power BI provide a means of logically categorising data into an order of preference or precedence, providing greater flexibility to Power BI report users when they interact with visualizations.
  • Date Hierarchies are created and managed automatically by Power BI for each Date or Date/Time field defined within your model. These automatically create fields that contain the Year, Quarter, Month & Day values from the respective date fields. These fields can then be utilised as part of a Table visualization or within a DAX formula. Date Hierarchies can also be completely disabled if required.
  • Custom (or User-Defined) Hierarchies need to be created manually and provide additional customisation options when compared to Date Hierarchies, such as the number of fields they contain, their order and the hierarchy's name. A Custom Hierarchy will typically make use of one of several Parent/Child DAX functions, such as PATH or PATHITEM.
  • Including a hierarchy as part of a chart visualization, such as a Pie chart or Donut chart, opens up additional drill-down capabilities around your data. Indicated by the extra arrow icons at the top of the visualization, these provide a straightforward means for users to interrogate the data points that interest them the most.
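As a sketch of the Parent/Child pattern mentioned above, the following Calculated Columns flatten a hypothetical Employee table (EmployeeId, ManagerId and Name are assumed column names) so that the resulting level columns can be combined into a Custom Hierarchy:

```dax
-- Builds a delimited string of ancestor IDs for each row, e.g. "1|4|12"
Path = PATH ( Employee[EmployeeId], Employee[ManagerId] )

-- Resolves the first level of the path back to a display name
Level 1 =
LOOKUPVALUE (
    Employee[Name],
    Employee[EmployeeId], PATHITEM ( Employee[Path], 1, INTEGER )
)
```

Further Level 2, Level 3 (and so on) columns follow the same pattern with an increasing position argument to PATHITEM.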
Create and format interactive visualizations.
Skills Measured

May include: Select a visualization type; configure page layout and formatting; configure interactions between visuals; configure duplicate pages; handle categories that have no data; configure default summarization and data category of columns; position, align, and sort visuals; enable and integrate R visuals; format measures; Use bookmarks and themes for reports

Revision Notes

Exam 70-778 Revision Notes: Create and Format Interactive Visualizations

Key Takeaways
  • Power BI delivers, out of the box, a range of different visualizations that cater towards most (if not all) reporting requirements. Should you find yourself in need of additional visualizations, then Microsoft AppSource is your go-to destination for finding visualizations developed by others. If you have experience working with either Node.js or R, then these can be used to build bespoke visualizations also.
  • When first developing a report, you should be able to match a requirement to a specific visualization type, to ensure that you are delivering a solution that is both meaningful and useful. From an exam perspective, this becomes a more critical consideration, and you should be prepared to suggest the most optimal visualization to use when given a specific scenario.
  • After adding visualizations to your report, you have additional options available to customise them further. For example, you can specify a preferred sorting order for your data, override any summarizations used and move/align your visual on the report page.
  • By default, visualizations in Power BI are designed to change automatically, based on how users interact with the report. All of these options are controllable via the Edit interactions button, allowing you to specify your preferred cross-filtering and cross-highlighting conditions.
  • There is a range of report page customisation options available to developers. It is possible to resize a page to any possible height/width, allowing you to optimise your report for specific devices. Also, you can modify the colour of a page (or its wallpaper) or add an image instead. Pages can also be renamed, reordered or duplicated.
  • Measures can be formatted in the same way as calculated columns, meaning you can specify a data type or, for numerics, modify the number of decimal places.
  • Bookmarks allow developers to set up “checkpoints” within a report, based on how a report page has been filtered. These can then be used to automatically navigate the user through a report, applying these filtering steps automatically. This feature can help transform your report into an interactive story.
  • Visualizations will automatically inherit their various colour properties from the currently selected report theme. Although these can be modified granularly, the fastest and most consistent way of making these changes en masse is to change the Theme. Power BI includes some Themes out of the box, but you also have the option of building your own using a custom JSON file; this can then be imported into different reports, providing a portable means of enforcing a particular branding requirement.
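As an illustration of the last point, a custom Theme is simply a JSON file; a minimal, hypothetical example might look like the following, with dataColors controlling the palette that visuals draw from:

```json
{
  "name": "Contoso Brand Theme",
  "dataColors": [ "#0B6FA4", "#94D0C0", "#F2C80F", "#FD625E" ],
  "background": "#FFFFFF",
  "foreground": "#252423",
  "tableAccent": "#0B6FA4"
}
```

Saving this as a .json file and importing it via the report's theme options applies the palette consistently across every visualization on the report.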
Manage custom reporting solutions
Skills Measured

May include: Configure and access Microsoft Power BI Embedded; enable developers to create and edit reports through custom applications; enable developers to embed reports in applications; use the Power BI API to push data into a Power BI dataset; enable developers to create custom visuals

Revision Notes

Exam 70-778 Revision Notes: Managing Custom Reporting Solutions

Key Takeaways
  • Power BI Embedded is an Azure-hosted offering that allows you to add Power BI Report content into bespoke applications. This deployment option can be incredibly useful if you wish to make your Power BI solution available to users outside of your organisation or if you have an existing, bespoke application that can benefit from utilising Power BI content. An Azure subscription is required to begin working with Power BI Embedded, and you are billed based on node size, not individual user licenses. All Power BI content requires publishing to the online service before it becomes available for Power BI Embedded to access. Report developers will, therefore, need to be granted a Power BI Professional license to carry out these activities.
  • The Power BI API allows developers to perform automation or administrative actions programmatically against the Power BI Online service. As a REST API, it lets developers choose their preferred programming language to interact with it, allowing them to streamline the deployment of Reports or Dashboards to the Power BI service or leverage additional functionality when utilising Power BI Embedded. The API can also cater to specific data load requirements, although more complex needs in this area would require addressing via alternate means (SSIS, Azure Data Factory etc.)
  • Developers can add their own bespoke visualizations to a Power BI Report by developing them using either Node.js or the R language. The first of these options facilitates a more streamlined deployment mechanism and allows developers to add their visualizations to AppSource, whereas the second may be more useful for complex visualization types with an analytical or statistical function.

Configure Dashboards, Reports and Apps in the Power BI Service

Access on-premises data
Skills Measured

May include: Connect to a data source by using a data gateway; publish reports to the Power BI service from Power BI Desktop; edit Power BI Service reports by using Power BI desktop

Revision Notes

Exam 70-778 Revision Notes: Report Publishing, On-Premise Gateway & Creating Dashboards

Key Takeaways
  • The Power BI On-Premise Gateway provides a streamlined route to working with non-cloud data sources within Power BI, Microsoft Flow and PowerApps. As a lightweight and easy-to-configure client application, it supports a wide variety of data sources, making them accessible as if they were in the cloud. Once set up, corresponding Data Sources are then made available for configuration and for leveraging as part of any Power BI Dataset.
  • Reports can be published to Power BI Online, meaning that they become accessible to a broader group of users, without requiring access to Power BI Desktop. Reports must be deployed into a Workspace, which can be created manually or derived from an Office 365 Group. Each Report contains a corresponding Dataset, where all queries defined within Power BI Desktop exist.
  • Reports that already exist on Power BI Online can be updated by just publishing a new version of the Report from Power BI Desktop. It is also possible to modify Reports from directly within the browser and by downloading a copy of the .pbix Report file as well, which can then be altered and re-published.
Configure a dashboard
Skills Measured

May include: Add text and images; filter dashboards; dashboard settings; customize the URL and title; enable natural language queries

Revision Notes

Exam 70-778 Revision Notes: Report Publishing, On-Premise Gateway & Creating Dashboards

Key Takeaways
  • Dashboards provide a means of grouping together various content as tiles, designed for at-a-glance analysis and optimal end-user experience.
  • The list of content that can be pinned to a Dashboard includes:
    • Visualizations
    • Web content
    • Images
    • Text boxes
    • Videos
    • Custom streaming data
  • Pinned content can be re-arranged on a Dashboard via drag-and-drop functionality. It is also possible to resize tiles to any given height/width.
  • Within the settings of a Dashboard, it is possible to enable/disable features such as natural language queries (Q&A) and Notes.
  • Some features of a Dashboard are only available if you have a Power BI Professional subscription, such as sharing and email subscriptions.

Publish and embed reports
Skills Measured

May include: Publish to web; publish to Microsoft SharePoint; publish reports to a Power BI Report Server

Revision Notes

Exam 70-778 Revision Notes: Publish and Embed Reports

Key Takeaways
  • The Publish to web option allows non-licensed, external users to view a Power BI Report in its entirety. A URL and IFrame embed code can be generated for this at any time within the portal and then dropped into virtually any website. Although you will lose some functionality when deploying a Report in this manner, you can expect that users will be able to perform most types of interactions with visualizations, Report pages and other components, as if they were accessing the Report through Power BI Online. In some cases, you may be unable to use the Publish to web option if your Report uses certain kinds of features, such as R Visuals or row-level security. You must also take into account any privacy or data protection concerns, as Reports deployed in this manner will be publicly accessible; where this is an issue, the Embed option is available as a secure alternative.
  • There are three steps involved if you wish to add a Report to SharePoint. First, you must generate the unique SharePoint embed URL within Power BI. Secondly, you then need to add on the dedicated control for this feature on your target SharePoint page and configure the relevant display options. Finally, you then need to ensure that all SharePoint users have been granted access to the Report, either at a Workspace level (recommended option) or by having the Report shared with them. By implication, in this scenario, all SharePoint users would have to have at least a Power BI Professional license to take full advantage of this functionality.
  • Publishing a Report to Power BI Report Server is mostly the same as publishing to the online version of the product. Instead of selecting a Workspace to add the Report to, you specify the name of the Report Server folder where the Report will reside. From a development standpoint, the dedicated Power BI Desktop for Power BI Report Server must be used, which may differ in functionality from the “normal” version of the tool. There is also no option to edit a report from within Power BI Report Server like you can through the online version.
Configure security for dashboards, reports and apps.
Skills Measured

May include: Create a security group by using the Admin Portal; configure access to dashboards and app workspaces; configure the export and sharing setting of the tenant; configure Row-Level Security

Revision Notes

Exam 70-778 Revision Notes: Securing Power BI Dashboards, Reports and Apps

Key Takeaways
  • Workspaces act as a container for the various components that form a Power BI Reporting solution. Within a Workspace, you will find all of the Dashboards, Reports, Workbooks and Datasets that developers have published. Each User has a Workspace created for them in Power BI when they first access the service. Additional Workspaces can be added through Office 365 Groups or by installing a Power BI App from AppSource. Dashboards and Reports created within a User's Workspace can be shared with other Users, provided that your account has a Power BI Professional license assigned to it.
  • To help manage permissions to Dashboards/Reports in a more efficient manner, Administrators can create Security Groups on the same Office 365 Tenant where Power BI Online resides. These can contain multiple groups of Users, allowing administrators to minimise the amount of effort involved in managing Dashboard/Report access. Most crucially, this will also enable Users that do not have an Exchange Online mailbox to access Dashboards/Reports when they are shared out in this manner.
  • Administrators have a whole host of options available to them within the Tenant settings area of the Admin Portal. These include, but are not limited to:
    • Export and Sharing Settings
    • Enable/Disable Content Sharing
    • Enable/Disable Publish To Web
    • Enable/Disable Export Reports as PowerPoint Presentations
    • Enable/Disable Print Dashboards and Reports
    • Content Pack and App Settings
    • Integration Settings
    • Custom Visuals Settings
    • R Visuals Settings
    • Audit and Usage Settings
    • Dashboard Settings
    • Developer Settings
  • Each of these settings can be enabled for the entire organisation, enabled only for specific security groups, or enabled for the entire organisation except particular security groups.
  • Row-Level Security (RLS) allows report developers to restrict data based on Roles. Row-level DAX evaluation formulas are used to achieve this, filtering the data returned depending on a TRUE/FALSE logic test. To utilise the feature, you must define both the Roles and DAX formulas for each query within your data model. Then, after deploying your Report to Power BI Online, you assign Users or Security Groups to the Role(s) created within Power BI Desktop. It is possible to view the effect of a Role at any time, within Power BI Desktop or Online, via the View As Role functionality. With the wide array of DAX formulas available, including specific ones that return the details of the current user accessing a Report, it is possible to define very granular filtering within a Power BI report, to suit particular security or access models.
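To illustrate, an RLS Role is defined by a table filter expression that must evaluate to TRUE for every row a Role member is allowed to see. Both examples below are hypothetical sketches; the Sales and UserRegion tables and their columns are assumed names:

```dax
-- Static Role filter on a Sales table: members only ever see UK rows
[Country] = "United Kingdom"

-- Dynamic Role filter: match each row's region to the signed-in user's entry
-- in a (hypothetical) UserRegion mapping table of Email -> Region
[Region]
    = LOOKUPVALUE (
        UserRegion[Region],
        UserRegion[Email], USERPRINCIPALNAME ()
    )
```

The dynamic pattern is the one that uses the current-user DAX functions mentioned above, allowing a single Role to serve every user with their own slice of the data.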
Configure apps and apps workspaces.
Skills Measured

May include: Create and configure an app workspace; publish an app; update a published app; package dashboards and reports as apps

Revision Notes

Exam 70-778 Revision Notes: Working with Apps and App Workspaces

Key Takeaways
  • Workspaces act as a container for the various components that form a Power BI Reporting solution. Within a Workspace, you will find all of the Dashboards, Reports, Workbooks and Datasets that developers have published. Each User has a Workspace created for them in Power BI when they first access the service. It is also possible to create additional Workspaces, either through the Power BI Online interface or by creating an Office 365 Group. A new experience for creating Workspaces is currently in preview which, once released, will negate the need for each Workspace to have an associated Office 365 Group.
  • When creating a Workspace, you can define various settings such as the type of access each user has (read-only or ability to modify its content), its members and whether it requires assignment to a Power BI Premium node. It is not possible to change the access type for a Workspace after creation, but you can freely change its name or modify its membership at any time.
  • The contents of a Workspace can be published as an App, enabling you to expose your solution to a broader audience within or outside your organisation. Once published, users navigate to the Power BI AppSource store for their tenant, which lists all Apps available for installation. Once installed, they will then become visible from within the Apps area of the application. You can update content within an App at any time by republishing its corresponding Workspace. It is also possible to define individual properties within an App, such as its description, access rights and landing page. To install and use Apps, the user in question must have a Power BI Professional license.

Additional Preparation Resources

The official Microsoft exam reference book is a helpful learning aid for this exam, particularly given that it includes numerous exercises that you can work through to familiarise yourself with different Power BI functionality. There is also an online course available on the edX website which, despite not covering the whole exam syllabus, does provide a useful visual aid and includes a lot of the features you are expected to know for the exam. Finally, nothing beats actually working with the product itself and trying out the various features yourself. Power BI Desktop is a free download and, with access to one of the sample databases provided by Microsoft, you can very quickly provision an environment on your own home computer to enable you to experience everything that Power BI has to offer.

Exams are always a nightmarish experience, both when preparing for them and when you are sat there in the test centre. I hope that this post, and this whole series, proves to be useful in helping with your exam preparation and getting you ready to pass the exam with flying colours 🙂

Welcome to the twelfth and penultimate post in my blog series concerning Microsoft Exam 70-778, where I hope to provide a revision tool for those planning to take the exam or a learning aid for those looking to increase their Power BI knowledge. We’re on the home stretch now and, after last week’s review of the various options available to publish Power BI Reports both online and on-premise, we now take a deep dive into some vital security concepts as part of the Configure security for dashboards, reports and apps theme, which covers the following skill areas:

Create a security group by using the Admin Portal; configure access to dashboards and app workspaces; configure the export and sharing setting of the tenant; configure Row-Level Security

Before exploring these topics further, however, it is essential to outline a concept that this series has continually skated around – Power BI Workspaces.

Workspace Overview

We’ve seen so far in the series how it is possible to deploy Power BI Desktop Reports into Power BI Online. As part of this process, you must define a Workspace where your Reports and Datasets will appear after publishing. There are three types of Workspaces:

  • My Workspace – Each user, by default, will have a personal Workspace, which cannot be deleted or changed.
  • Office 365 Group Workspace
  • App Workspace

Workspaces are, for the most part, a logical grouping of all the components that you need to start building out a BI solution. They are worked with from within Power BI Online only (meaning that they do not exist as part of Power BI Report Server) and can be interacted with from the left-hand pane within Power BI Online:

As indicated above, each user’s Workspace can contain:

  • Dashboards – These are created within the Power BI service, as we saw a fortnight ago.
  • Reports – These are built out in Power BI Desktop or uploaded directly into Power BI Online from a .pbix file.
  • Workbooks – These will show a list of Excel workbooks that have been uploaded into Power BI, allowing you to leverage an existing solution built using Excel PivotTables/PivotCharts almost immediately through Power BI. It is not something you necessarily need to worry about for this exam, but be aware that the topic does crop up within Exam 70-779.
  • Datasets – These list all of the data models uploaded or created for Power BI Reports and Workbooks.

It is possible to share out Dashboards and Reports to other Users/Security Groups, and we will see how this can be done with the example later on in this post. One consideration to bear in mind is that sharing a Report does not share out any Dashboards that reference it. Content shared to you will become visible within the Shared with me tab on Power BI Online:

Next week’s post will go into further detail on how to create and manage Workspaces, and how to handle access to App Workspaces.

Office 365 Security Groups

Those who have experience administering on-premise Active Directory (AD) domains will be fully familiar with the concept of Security Groups – logical containers of Users that can then be assigned file/folder privileges, group policy objects and access to other principals on the domain. Given that Power BI uses Azure Active Directory (AAD) as its identity provider, the same kinds of concepts come into play when you start to determine how to manage access to Power BI Dashboards, Reports and Datasets for specific groups of Users. Office 365 Security Groups are virtually identical to their on-premise AD equivalent; the primary difference being that Administrators must create them from within the Office 365 Admin Center. It is also possible to add them through Microsoft Azure, so your choice here really comes down to preference. In either case, you must ensure that you have the relevant administrator privileges on your AAD tenant to create and manage them. Once created and defined with your required list of Users, they then become available as a shareable object within Power BI Online.

In the example towards the end of this post, we will walk through how to create a Security Group and how this can then be used to share out a Dashboard.

Managing Export and Sharing Settings

With the introduction of GDPR last year, data privacy remains a paramount concern for organisations. These concerns can often come into conflict with new functionality that technology solutions offer us – for example, the ability to export a Power BI Report as a PowerPoint presentation. To help with these considerations, and in line with Microsoft’s overall GDPR commitments, Power BI Online provides several options that allow you to granularly define the actions that Users can and cannot perform, such as using custom visuals in reports, accessing audit/usage information and using preview features, such as dataflows. The list of settings most relevant to data sharing can be found under the Export and sharing settings section on the Admin Portal -> Tenant settings area of Power BI Online:

Each of the listed features can be enabled or disabled globally on the Office 365 tenant. Alternatively, by utilising Security Groups, you can grant or curtail specific functionality to a group/department within an organisation. You have no option to specify individual User access as part of this, so it becomes a requirement to have your required Security Groups defined within Office 365 before you can start working with this feature.
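Conceptually, each of these scoping options boils down to a membership check against the listed Security Groups. The following Python sketch is purely illustrative – this is not how Power BI implements the check, and the mode names are invented for the example:

```python
def feature_allowed(user_groups, mode, listed_groups):
    """Decide whether a user may use a tenant feature.

    mode is one of (hypothetical names):
      'everyone'        - enabled for the entire organisation
      'specific_groups' - enabled only for the listed Security Groups
      'except_groups'   - enabled for everyone apart from the listed groups
    """
    in_listed_group = bool(set(user_groups) & set(listed_groups))
    if mode == "everyone":
        return True
    if mode == "specific_groups":
        return in_listed_group
    if mode == "except_groups":
        return not in_listed_group
    raise ValueError(f"Unknown mode: {mode}")

# A user in the 'Finance' group, with a feature enabled only for 'Sales':
print(feature_allowed(["Finance"], "specific_groups", ["Sales"]))  # False
```

This also makes plain why the Security Groups must exist in Office 365 first: without them, only the all-or-nothing 'everyone' scope is expressible.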

Row-Level Security

Granting global allow/deny privileges at a Report level may not be sufficient for specific business requirements. It could be, for example, that a single Sales report serves both junior and senior sales professionals, and there is a need to present only the data most relevant to each role. Or there may be a need to show data that has the most relevance to an individual’s particular geographic region. In these situations, and to avoid having to define separate queries to segregate this data appropriately, Row-Level Security (or RLS) becomes a significant asset. It allows you to set up Roles linked to DAX expressions, which tell Power BI which data to show to a particular group of Users.

There are two steps involved as part of implementing RLS. First, you must create a Role that defines the list of privileges for one or multiple Users. This step can be achieved by navigating to the Modeling tab within Power BI Desktop and selecting the Manage Roles button, which will open the appropriate dialog window:

Next, you must define a DAX Expression for each table that requires filtering as part of the Role. These can be set up for as many Tables as you like, but the critical thing to remember is that each DAX Expression must evaluate to TRUE or FALSE. The example below – [TotalSales] >= 500, which returns TRUE or FALSE for each row depending on whether its TotalSales value is greater than or equal to 500 – meets this requirement:

With the Role defined, it is then possible to test it locally from within Power BI Desktop by using the View as Roles button to select your corresponding Role:

With everything built out and working with Power BI Desktop, the second step is to publish your Report to Power BI Online and then assign the Role to Users or a Security Group by navigating to the Dataset in question:

It is also possible to use the Test as role feature within Power BI Online, which behaves identically to its Desktop client equivalent:

To help leverage additional functionality out of RLS, Microsoft provides the following two DAX functions:

  • USERNAME() – Returns the domain name of the User accessing the Report. For Desktop Reports, this will typically be in the format <Domain>\<User Name>; when viewing the Report online, the value rendered instead will either be the user’s email address or onmicrosoft.com account name.
  • USERPRINCIPALNAME() – Returns the User Principal Name (UPN) of the User accessing the Report. The UPN will almost always be the user’s email address or, in some cases for Power BI Online, their onmicrosoft.com user account name.

By using these functions in tandem with IF DAX constructs, you have the additional capability to restrict access to specific data based on user account names; for example, a filter expression such as [Salesperson] = USERPRINCIPALNAME() would limit each User to their own rows. All in all, RLS is a powerful feature to have at your disposal but, as highlighted in last week’s post, you should be aware of its limitations. RLS is incompatible with the Publish to web feature, and it is also not available if you are querying a SQL Server Analysis Services data source via a live connection.
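Conceptually, an RLS Role filter acts as a per-row predicate over the table it is attached to. The Python sketch below is illustrative only – Power BI evaluates the DAX expression internally, and the table and column names here are invented – but it mimics both a static TotalSales filter and a user-based filter:

```python
def apply_rls(rows, predicate):
    """Return only the rows for which the Role's filter evaluates to TRUE."""
    return [row for row in rows if predicate(row)]

sales = [
    {"Salesperson": "jane@contoso.com", "TotalSales": 750},
    {"Salesperson": "john@contoso.com", "TotalSales": 320},
    {"Salesperson": "jane@contoso.com", "TotalSales": 500},
]

# Mimics a static DAX filter: [TotalSales] >= 500
high_value = apply_rls(sales, lambda row: row["TotalSales"] >= 500)

# Mimics a user-based filter: [Salesperson] = USERPRINCIPALNAME()
current_user = "jane@contoso.com"
own_rows = apply_rls(sales, lambda row: row["Salesperson"] == current_user)

print(len(high_value), len(own_rows))  # 2 2
```

The TRUE/FALSE requirement described above corresponds to the predicate returning a boolean for every row.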

Example: Sharing a Power BI Dashboard with a Security Group

The steps that follow will show how to create an Office 365 Security Group and then share out a Dashboard to it from within Power BI. To complete the steps outlined, you should ensure that you are assigned either the Global administrator or User management administrator role in Office 365 and that your user account has a Power BI Professional license:

  1. Navigate to the Admin Center within Office 365, expand the Groups tab on the left-hand pane and select Groups:
  2. The main window should refresh, displaying the list of Groups set up on the AAD tenant. Select the Add a group button:
  3. Define the settings as indicated in the below screenshot, making sure that the Type selected is Security, and press Add:
  4. You will then receive confirmation that the Security Group was added successfully to your tenant:
  5. Within the main Groups window, select the new Security Group and then click the Edit button on the right-hand details pane:
  6. Then, select the Add members button to add in the required list of Users:
  7. Press Save to commit any changes:
  8. Navigate back to Power BI Online and to the Dashboard that needs sharing. Select the Share button at the top right of the screen:
  9. Within the Share dashboard pane, begin typing the name of the Security Group created in the previous steps; Power BI will detect and auto-complete it for you. Before pressing the Share button, you can include a custom message that will be sent to recipients via email and toggle whether they are able to share the dashboard themselves. A URL link is also generated at this point, allowing you to copy/paste it into an email, IM message etc.:
  10. Once the Dashboard is shared, you can navigate to the Access tab to review the list of Users/Security Groups that have access to your Dashboard. It is also possible to modify their access levels or remove access entirely by clicking the ellipses button next to each name:

Key Takeaways

  • Workspaces act as a container for the various components that form a Power BI Reporting solution. Within a Workspace, you will find all of the Dashboards, Reports, Workbooks and Datasets that developers have published. Each User has a Workspace created for them in Power BI when they first access the service. Additional Workspaces can be added through Office 365 Groups or by installing a Power BI App from AppSource. Dashboards and Reports created within a User’s Workspace are shareable to other Users, provided that your account has a Power BI Professional license assigned to it.
  • To help manage permissions to Dashboards/Reports in a more efficient manner, Administrators can create Security Groups on the same Office 365 Tenant where Power BI Online resides. These can contain multiple groups of Users, allowing administrators to minimise the amount of effort involved in managing Dashboard/Report access. Most crucially, this will also enable Users that do not have an Exchange Online mailbox to access Dashboards/Reports when they are shared out in this manner.
  • Administrators have a whole host of options available to them within the Tenant settings area of the Admin Portal. These include, but are not limited to:
    • Export and Sharing Settings
    • Enable/Disable Content Sharing
    • Enable/Disable Publish To Web
    • Enable/Disable Export Reports as PowerPoint Presentations
    • Enable/Disable Print Dashboards and Reports
    • Content Pack and App Settings
    • Integration Settings
    • Custom Visuals Settings
    • R Visuals Settings
    • Audit and Usage Settings
    • Dashboard Settings
    • Developer Settings
  • All of these settings can be enabled for the entire organisation, enabled only for particular security groups, or enabled for everyone in the organisation except specified security groups.
  • Row-Level Security (RLS) allows report developers to restrict data based on Roles. Row-level DAX evaluation formulas are used to achieve this, filtering the data that is returned depending on a TRUE/FALSE logic test. To utilise the feature, you must define both the Roles and DAX formulas for each query within your data model. Then, after deploying your Report to Power BI Online, you assign Users or Security Groups to the Role(s) created within Power BI Desktop. It is possible to view the effect of a Role at any time, within Power BI Desktop or Online, via the View As Role functionality. With the wide array of DAX formulas available, including specific ones that return the details of the current user accessing a Report, it is possible to define very granular filtering within a Power BI report to suit particular security or access models.

Putting some thought into Power BI’s security and access components early on when developing your solution will allow you to take best advantage of features such as RLS, which deliver further benefit when utilised alongside the other functionality described this week. The final post in the series next week will provide a more detailed description of Workspaces and how these can be used to create Apps for both internal and external consumption.

Welcome to post number nine in my series designed to provide a revision tool for Microsoft Exam 70-778, and for those looking to increase their expertise in Power BI. The topics we have covered so far in the series have all involved Power BI Desktop primarily, and we now move away from this as we evaluate how to Manage Custom Reporting Solutions with Power BI. This focus area for the exam measures the following skill areas:

Configure and access Microsoft Power BI Embedded; enable developers to create and edit reports through custom applications; enable developers to embed reports in applications; use the Power BI API to push data into a Power BI dataset; enable developers to create custom visuals.

Despite being, at first glance, a very technically focused area, the exam does not necessarily require you to know how to work with these features in-depth. What this post will try to do, however, is fully explain what Power BI Embedded is (and how it can be tailored accordingly), the capabilities and benefits of the Power BI API and, finally, what options you have available to build custom visualizations that are then available for use across any Power BI Report.

Power BI Embedded

A potential limitation of using Power BI as your Business Intelligence solution is that you must access your reporting solution through one of the two interfaces, depending on how you have licensed the product:

For strictly organisational-only access, this is all fine and dandy; but if you desire to grant external users access to your reports, it would be necessary to open up a door into a critical component of your IT infrastructure, often in tandem with any other systems your solution may contain. For example, if you have developed a support portal for your customers to submit cases with and wish to provide them with a range of Power BI visualizations, you would need to grant and deploy access to two separate application instances – your support portal and Power BI Online/Report Server. This can lead to a jarring end-user experience and severely limit your capabilities in providing a unified, bespoke and branded solution.

Power BI Embedded seeks to address this scenario by providing application developers the means to embed Power BI reports and visualizations directly within their existing applications. The experience this offers is seamless, to the extent that end-users will not even need to know that you are using Power BI at all. Consequently, this potentially allows you to look exceptionally gifted when you begin to deploy engaging and highly interactive visualizations into your application quickly. As an Azure-based service with a pricing mechanism to suit, you only need to suitably estimate your potential compute capacities, deploy your capacity and any corresponding reports and then build out the appropriate link to your Power BI content within your application.
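A central piece of that wiring-up is the embed token, which the application’s backend requests from the Power BI REST API on behalf of the end user. The sketch below builds (but deliberately does not send) such a GenerateToken request; the workspace and report IDs are placeholders, and the endpoint shape reflects my reading of the v1.0 REST API rather than a definitive implementation:

```python
import json
import urllib.request

# Placeholder IDs -- substitute the target workspace (group) and report
GROUP_ID = "00000000-0000-0000-0000-000000000000"
REPORT_ID = "11111111-1111-1111-1111-111111111111"

def build_embed_token_request(access_token):
    """Build (but do not send) the GenerateToken call used to obtain
    an embed token for a single report."""
    uri = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
           f"/reports/{REPORT_ID}/GenerateToken")
    body = json.dumps({"accessLevel": "View"}).encode("utf-8")
    return urllib.request.Request(
        uri,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
    )

# To actually call the service:
# response = urllib.request.urlopen(build_embed_token_request(token))
```

The embed token that comes back is what the client-side embedding code hands to Power BI, so the end user never needs a Power BI identity of their own.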

To get started with using Power BI Embedded, you need to make sure you have the following at your disposal:

To get a feel for the capabilities on offer as part of this offering, you can go to the Power BI Embedded Playground, made available courtesy of Microsoft. This tool allows you to test how the various Power BI Embedded components render themselves, tweak around with their appearance and generate working code samples that are usable within your application. The screenshot below shows an example of how a single Report visual would look when embedding it into a bespoke application:

As the screenshot indicates, there is no loss in core functionality when consuming Power BI in this manner. You can hover over individual areas to gain insights into your data; you can drill down through the data; data is sortable in the conventional manner; and, finally, you can view the underlying data in the visualization or even export it out into an Excel/CSV document. Also, you have extensive options available to modify how a visual, report etc. is rendered on your application page, allowing you to ensure everything renders optimally for your application.

All in all, Power BI Embedded represents a significant boon for application developers, enabling them to very quickly leverage the extensive reporting capabilities Power BI provides, all of which is cloud-based, highly scalable and minutely tailorable. It is important to highlight that all of this goodness comes with a potentially high cost, namely, that of requiring a sufficiently proficient application developer (preferably .NET focused) to join all of the various bits together. But, if you are already in the position where you have developed an extensive range of Power BI reports for consumption by your customer base, Power BI Embedded is the natural progression point in turning your solution into a real piece of intellectual property.

The Power BI API

If you are finding your feet with Power BI Embedded and need to carry out more complex actions against Power BI content that is pulling through from a workspace, then the API is an area you will need to gain familiarity with too. Microsoft exposes a whole range of functionality as part of the Power BI API that can assist in a wide variety of tasks – such as automation, deployment and allowing any bespoke application to leverage further benefits from its Power BI Embedded solution. Some examples of the types of things you can do with the API include:

The API requires that you authenticate against the Power BI service, using a corresponding Application Registration on Azure Active Directory, which defines the list of privileges that can be carried out. This component can be straightforwardly created using the wizard provided by Microsoft, and a full tutorial is also available on how to generate an access token from your newly created Application Registration. The key thing as part of all of this is to ensure that your Application Registration is scoped for only the permissions you require (these can be changed in future if needed) and not to grant all permissions needlessly.
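As a rough illustration of what that token acquisition involves, the sketch below builds (but does not send) a resource-owner password grant against the AAD v1 token endpoint, which is the flow the Microsoft tutorial of the time walked through. The client ID and credentials are placeholders, and this is a sketch of the protocol shape rather than production-ready authentication code:

```python
import urllib.parse
import urllib.request

TOKEN_URL = "https://login.microsoftonline.com/common/oauth2/token"
POWER_BI_RESOURCE = "https://analysis.windows.net/powerbi/api"

def build_token_request(client_id, username, password):
    """Build (but do not send) the OAuth2 password-grant request that
    exchanges user credentials for a Power BI access token."""
    form = urllib.parse.urlencode({
        "grant_type": "password",
        "resource": POWER_BI_RESOURCE,  # identifies the Power BI service
        "client_id": client_id,         # the Application Registration's ID
        "username": username,
        "password": password,
    }).encode("utf-8")
    return urllib.request.Request(TOKEN_URL, data=form, method="POST")
```

The access token in the JSON response is then supplied as a Bearer token on every subsequent API call.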

Because the API is a REST endpoint, there are a variety of programming languages or tools that you can use from an interaction standpoint. PowerShell is an especially good candidate for this and, in the snippet below, you can see how this can be used to modify the User Name and Password for a SQL Server data source deployed onto Power BI Online:

#Credentials to apply to the data source (illustrative values only)
$sqlUser = "MyUserName"
$sqlPass = "P@ssw0rd!"

#The gateway and data source IDs below are specific to the target tenant
$patchURI = "https://api.powerbi.com/v1.0/myorg/gateways/cfafbeb1-8037-4d0c-896e-a46fb27ff229/datasources/1e8176ec-b01c-4a34-afad-e001ce1a28dd/"

#Build the request body containing the updated Basic credentials
$person = @{
    credentialType = 'Basic'
    basicCredentials = @{
        username = $sqlUser
        password = $sqlPass
    }
}
$personJSON = $person | ConvertTo-Json

#Make the request to patch the data source with the updated credentials;
#$authHeader is assumed to already hold a valid bearer token from an earlier authentication step
$request3 = Invoke-RestMethod -Uri $patchURI -Headers $authHeader -Method Patch -Verbose -Body $personJSON

This example deliberately excludes some of the previous steps needed to authenticate with Power BI and is, therefore, provided for strictly illustrative purposes only.
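Because the endpoint is plain REST, the same call can be issued from any language that can construct an HTTP request. For comparison, here is a Python sketch that builds (but does not send) the identical PATCH, using the same illustrative gateway and data source IDs; obtaining the bearer token is again omitted:

```python
import json
import urllib.request

# Same illustrative IDs as the PowerShell snippet above
GATEWAY_ID = "cfafbeb1-8037-4d0c-896e-a46fb27ff229"
DATASOURCE_ID = "1e8176ec-b01c-4a34-afad-e001ce1a28dd"

def build_patch_request(access_token, username, password):
    """Build (but do not send) the PATCH that updates a gateway
    data source's Basic credentials."""
    uri = (f"https://api.powerbi.com/v1.0/myorg/gateways/{GATEWAY_ID}"
           f"/datasources/{DATASOURCE_ID}/")
    body = json.dumps({
        "credentialType": "Basic",
        "basicCredentials": {"username": username, "password": password},
    }).encode("utf-8")
    return urllib.request.Request(
        uri,
        data=body,
        method="PATCH",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
    )
```

The request body mirrors the hash table the PowerShell snippet serialises with ConvertTo-Json, which underlines the point: the language is interchangeable, the REST payload is what matters.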

Creating Custom Visuals

Developers have primarily two options when it comes to building out bespoke visualizations, which are then includable in a Power BI Online, Embedded and Report Server report:

Last week’s post discussed this topic in more detail from an exam standpoint which, in a nutshell, only requires you to have a general awareness of the options available here; no need to start extensively learning a new programming language, unless you really want to 🙂

Key Takeaways

  • Power BI Embedded is an Azure-hosted offering that allows you to add Power BI Report content into bespoke applications. This deployment option can be incredibly useful if you wish to make your Power BI solution available to users outside of your organisation or if you have an existing, bespoke application that can benefit from utilising Power BI content. An Azure subscription is required to begin working with Power BI Embedded, and you are billed based on node size, not individual user licenses. All Power BI content requires publishing to the online service before it becomes available for Power BI Embedded to access. Report developers will, therefore, need to be granted a Power BI Professional license to carry out these activities.
  • The Power BI API grants access to developers to perform automation or administrative actions programmatically against the Power BI Online service. Utilising a REST API, developers can determine the optimal programming language of choice to interact with the API, allowing them to streamline the deployment of Reports or Dashboards to the Power BI service or leverage additional functionality when utilising Power BI Embedded. The API can also cater to specific data load requirements, although more complex needs in this area would require addressing via alternate means (SSIS, Azure Data Factory etc.)
  • Developers can add their own bespoke visualizations to a Power BI Report by either developing them using Node.js or using the R language. The first of these options facilitates a more streamlined deployment mechanism and allows developers to add their visualizations to AppSource, whereas the second option may be more useful for complex visualization types with an analytical or statistical function.

Compared to the other exam topics, a general awareness of these concepts is more than likely sufficient from a learning perspective and is (arguably) useful knowledge in any event, as it allows you to understand how developers can further extend a Power BI solution to suit a particular business need. In next week’s post, we will move into the final subject area for the exam, as the focus shifts towards how to work with Power BI outside of the Desktop application and the various tools available to integrate on-premise data sources into Power BI Online.

Towards the back end of last year, I discovered the joys and simplicity of Visual Studio Team Services (VSTS)/Azure DevOps. Regardless of what type of development workload you face, the service provides a whole range of features that can speed up development, automate important build/release tasks and also assist with any testing workloads that you may have. Microsoft has devoted a lot of attention to ensuring that their traditional development tools/languages and new ones, such as Azure Data Factory V2, are fully integrated with VSTS/Azure DevOps. And, even if you do find yourself fitting firmly outside the Microsoft ecosystem, there are a whole range of different connectors to enable you to, for example, leverage Jenkins automation tasks for an Amazon Web Services (AWS) deployment. As with a lot of things to do with Microsoft today, you could not have predicted such extensive third-party support for a core Microsoft application 10 years ago. 🙂

Automated release deployments are perhaps the most useful feature of VSTS/Azure DevOps. They address several business concerns that apply to organisations of any size:

  • Removes human intervention as part of repeatable business processes, reducing the risk of errors and streamlining internal processes.
  • Allows for clearly defined, auditable approval cycles for release approvals.
  • Provides developers with the flexibility for structuring deployments based on any predefined number of release environments.

You may be questioning at this stage just how complicated implementing such processes is, versus the expected benefits they can deliver. Fortunately, when it comes to Azure App Service deployments at least, there are predefined templates provided that should be suitable for most basic deployment scenarios:

This template will implement a single task to deploy to an Azure App Service resource. All you need to do is populate the required details for your subscription, App Service name etc. and you are ready to go! Optionally, if you are working with a Standard App Service Plan or above, you can take advantage of the slot functionality to stage your deployments before impacting any live instance. This option can be set up by specifying the name of the slot you wish to deploy to as part of the Azure App Service Deploy task and then adding an Azure App Service Manage task to carry out the slot swap – with no coding required at any stage:

There may also be some additional options that need configuring as part of the Azure App Service Deploy task:

  • Publish using Web Deploy: I would recommend always leaving this enabled when deploying to Azure Web Apps, given the functionality afforded to us via the Kudu Web API.
  • Remove additional files at destination: Potentially not a safe option should you have a common requirement to interact with your App Service folder manually. In most cases, however, leaving this enabled ensures that your target environment remains consistent with your source project.
  • Exclude files from the App_Data folder: Depending on what your web application is doing, it’s possible that some required data assisting your website will exist in this folder. Enabling this setting prevents those files from being removed when the previous setting is switched on.
  • Take App Offline: Slightly misleading in that, instead of stopping your App Service, it places a temporary file (app_offline.htm) in the root directory that tells all website visitors that the application is offline. This temporary page will use one of the default templates that Azure provides for App Service error messages.

This last setting, in particular, can cause problems when it comes to working with .NET Core web applications:

 

Note also that the problem does not occur when working with a standard .NET Framework MVC application. Bearing this fact in mind, therefore, suggests that the problem is a specific symptom relating to .NET Core. It also occurs when utilising slot deployments, as indicated above.

To get around this problem, we must address the issue that the Error Code points us toward – namely, the fact that web application files within the target location cannot be appropriately accessed/overwritten, due to being actively used by the application in question. The cleanest way of fixing this is to take your application entirely offline by stopping the App Service instance, with the apparent trade-off being downtime for your application. This scenario is where the slot deployment functionality goes from being an optional, yet useful, requirement to a wholly mandatory one. With this enabled (and the matching credit card limit to accommodate), it is possible to implement the following deployment process:

  • Create a staging slot on Azure for your App Service, with a name of your choosing
  • Structure a deployment task that carries out the following steps, in order:
    • Stop the staging slot on your chosen App Service instance.
    • Deploy your web application to the staging slot.
    • Start the staging slot on your chosen App Service instance.
    • (Optionally) Execute any required steps/logic that is necessary for the application (e.g. compile MVC application views, execute a WebJob etc.)
    • Perform a Swap Slot, promoting the staging slot to your live, production instance.

The below screenshot, derived from VSTS/Azure DevOps, shows how this task list should be structured:

Even with this in place, I would still recommend that you look at taking your web application “offline” in the minimal sense of the word. The Take App Offline option is by far the easiest way of achieving this; for more tailored options, you would need to look at a specific landing page that redirects users during any update/maintenance cycle, which provides the necessary notification to any users.

Setting up your first deployment pipeline(s) can throw up all sorts of issues that lead you down the rabbit hole. While this can be discouraging and lead to wasted time, the end result, after some perseverance, is a highly scalable solution that can avoid many sleepless nights when it comes to managing application updates. And, as this example so neatly demonstrates, solutions to these problems often do not require a detailed coding route to implement – merely some drag & drop wizardry and some fiddling about with deployment task settings. I don’t know about you, but I am pretty happy with that state of affairs. 🙂

The introduction of Azure Data Factory V2 represents the most opportune moment for data integration specialists to start investing their time in the product. Version 1 (V1) of the tool, which I started taking a look at last year, missed a lot of critical functionality that – in most typical circumstances – I could implement in a matter of minutes via a SQL Server Integration Services (SSIS) DTSX package. The product had, to quote some specific examples:

  • No support for control flow logic (foreach loops, if statements etc.)
  • Support for only “basic” data movement activities, with a minimal capability to perform or call data transformation activities from within SQL Server.
  • Some support for the deployment of DTSX packages, but with incredibly complex deployment options.
  • Little or no support for Integrated Development Environments (IDEs), such as Visual Studio, or other typical DevOps scenarios.

In what seems like a short space of time, the product has come on leaps and bounds to address these limitations:

Supported now are Filter and Until conditions, alongside the expected ForEach and If conditionals.

When connecting to SQL Data destinations, we now have the ability to execute pre-copy scripts.

SSIS Integration Runtimes can now be set up from within the Azure Data Factory V2 GUI – no need to revert to PowerShell.

And finally, there is full support for storing all created resources within GitHub or Visual Studio Team Services/Azure DevOps.

The final one is a particularly nice touch, and means that you can straightforwardly incorporate Azure Data Factory V2 as part of your DevOps strategy with minimal effort – an ARM Resource Template deployment, containing all of your data factory components, will get your resources deployed out to new environments with ease. What’s even better is that this deployment template is intelligent enough not to recreate existing resources and only update Data Factory resources that have changed. Very nice.

Although a lot is provided for by Azure Data Factory V2 to assist with a typical DevOps cycle, there is one thing that the tool does not account for satisfactorily.

A critical aspect as part of any Azure Data Factory V2 deployment is the implementation of Triggers. These define the circumstances under which your pipelines will execute, typically either via an external event or based on a pre-defined schedule. Once activated, they effectively enter a “read-only” state, meaning that any changes made to them via a Resource Group Deployment will be blocked and the deployment will fail – as we can see below when running the New-AzureRmResourceGroupDeployment cmdlet directly from PowerShell:

It’s nice that the error is provided in JSON, as this can help to facilitate any error handling within your scripts.

The solution is simple – stop the Trigger programmatically as part of your DevOps execution cycle via the handy Stop-AzureRmDataFactoryV2Trigger cmdlet – a single line of PowerShell that is callable from an Azure PowerShell task. But what happens if you are deploying your Azure Data Factory V2 template for the first time?

I’m sorry, but your Trigger is in another castle.

The best (and only) resolution to this little niggle is to construct a script that checks whether the Trigger exists and skips over the stop step if it doesn’t exist yet. The following parameterised PowerShell script file achieves this by attempting to stop the Trigger called ‘MyDataFactoryTrigger’:

param($rgName, $dfName)

Try
{
    Write-Host "Attempting to stop MyDataFactoryTrigger Data Factory Trigger..."
    # Confirm the Trigger exists before attempting to stop it; -ErrorAction Stop
    # ensures a missing Trigger drops us into the Catch block below
    Get-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName 'MyDataFactoryTrigger' -ErrorAction Stop | Out-Null
    Stop-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName 'MyDataFactoryTrigger' -Force
    Write-Host -ForegroundColor Green "Trigger stopped successfully!"
}
Catch
{
    $errorMessage = $_.Exception.Message
    if ($errorMessage -like '*NotFound*')
    {
        Write-Host -ForegroundColor Yellow "Data Factory Trigger does not exist, probably because the script is being executed for the first time. Skipping..."
    }
    else
    {
        throw "An error occurred whilst retrieving the MyDataFactoryTrigger trigger."
    }
}

Write-Host "Script has finished executing."

To use successfully within Azure DevOps, be sure to provide values for the parameters in the Script Arguments field:

You can use pipeline Variables within arguments, which is useful if you reference the same value multiple times across your tasks.
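For example, assuming you have defined pipeline Variables named ResourceGroupName and DataFactoryName (illustrative names – substitute your own), the Script Arguments field would contain something along these lines:

```
-rgName $(ResourceGroupName) -dfName $(DataFactoryName)
```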

With some nifty copy + paste action, you can accommodate the stopping of multiple Triggers as well – although if you have more than 3-4, it may be more sensible to iterate over an array containing all of your Triggers, passed at runtime.
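As a rough sketch of that iterative approach (the Trigger names below are hypothetical), the same Try/Catch logic can simply be wrapped in a loop over an array of Trigger names:

```powershell
param($rgName, $dfName)

# Hypothetical list of Triggers to stop; this could equally be passed in
# as a comma-separated script argument and split at runtime
$triggerNames = @('MyDataFactoryTrigger', 'MyOtherTrigger', 'MyNightlyTrigger')

foreach ($triggerName in $triggerNames)
{
    Try
    {
        # Check the Trigger exists before attempting to stop it
        Get-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName $triggerName -ErrorAction Stop | Out-Null
        Stop-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName $triggerName -Force
        Write-Host -ForegroundColor Green "$triggerName stopped successfully!"
    }
    Catch
    {
        if ($_.Exception.Message -like '*NotFound*')
        {
            Write-Host -ForegroundColor Yellow "$triggerName does not exist. Skipping..."
        }
        else
        {
            throw "An error occurred whilst stopping the $triggerName trigger."
        }
    }
}
```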

For completeness, you will also want to ensure that you restart the affected Triggers after any ARM Template deployment. The following PowerShell script will achieve this outcome:

param($rgName, $dfName)

Try
{
   Write-Host "Attempting to start MyDataFactoryTrigger Data Factory Trigger..."
   Start-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName 'MyDataFactoryTrigger' -Force -ErrorAction Stop
   Write-Host -ForegroundColor Green "Trigger started successfully!"
}

Catch
{
    throw "An error occurred whilst starting the MyDataFactoryTrigger trigger."
}

Write-Host "Script has finished executing."

The Azure Data Factory V2 offering has no doubt come on in leaps and bounds in a short space of time…

…but you can’t shake the feeling that there is a lot that still needs to be done. The current release, granted, feels very stable and production-ready, but I think there is a whole range of enhancements that could be introduced to allow better feature parity when compared with SSIS DTSX packages. With this in place, and taking into account the very significant cost differences between the two offerings, I think Azure Data Factory V2 would become a no-brainer option for almost every data integration scenario. The future looks very bright indeed 🙂