For the past 13 weeks on the blog, I have delivered a series of posts on Microsoft Exam 70-778, focused on providing a set of detailed revision notes that cover the broad array of Power BI features assessed as part of the exam. To round things off, today’s blog brings together everything I have discussed so far in the series, with the hope that this post can serve as a single reference point for those who have not been following the series to date.

Microsoft Exam 70-778 Overview

The exam, with its full title Analyzing and Visualizing Data with Microsoft Power BI, is targeted towards Business Intelligence (BI) and data professionals who are looking to validate their skills in working with Power BI. The exam is a necessary component, alongside Exam 70-779: Analyzing and Visualizing Data with Microsoft Excel, in attaining the Microsoft Certified Solutions Associate (MCSA) certification in BI Reporting. Successful candidates can then (optionally) pass an additional “elective” exam to gain the Microsoft Certified Solutions Expert (MCSE) certification in Data Management and Analytics.

Skills Measured in the Exam

The skills measured are outlined below, alongside links to the relevant posts from the series and the list of essential points to remember:

Consuming and Transforming Data By Using Power BI Desktop

Connect to data sources.
Skills Measured

May include: Connect to databases, files, folders; import from Excel; connect to SQL Azure, Big Data, SQL Server Analysis Services (SSAS)

Revision Notes

Exam 70-778 Revision Notes: Importing from Data Sources

Key Takeaways
  • Power BI supports a broad range of database systems, flat file, folder, application and custom data sources. While it is impossible to memorise each data source, you should at least broadly familiarise yourself with the different types at your disposal.
  • A crucial decision for many data sources relates to the choice of either Importing a data source in its entirety or in taking advantage of DirectQuery functionality instead (if available). Both routes have their own defined set of benefits and disadvantages. DirectQuery is worth consideration if there is a need to keep data regularly refreshed and you have no requirement to work with multiple data sources as part of your solution.
  • Live Connection is a specific data connectivity option available for SQL Server Analysis Services. It behaves similarly to DirectQuery.
  • It is possible to import an existing Excel BI solution into Power BI with minimal effort, alongside the ability to import standard worksheet data in the same manner as other flat file types.
Perform transformations
Skills Measured

May include: Design and implement basic and advanced transformations; apply business rules; change data format to support visualization

Revision Notes

Exam 70-778 Revision Notes: Performing Data Transformations

Key Takeaways
  • The Power Query M formula language is used to perform transformations on data once loaded into Power BI. Although it is possible to do this via code, Power BI allows us to define all of our required data changes from within the interface, without the need to write a single line of code.
  • Each data source connected to represents itself as a Query within Power BI. There are many options at your disposal when working with Queries, such as renaming, merging, duplication and the ability to disable or reference them as part of other Queries.
  • There is a wide range of column transformations that can be applied, too numerous to mention individually. The Transform tab provides the best means of seeing what is available, with options ranging from formatting through to grouping and pivoting/unpivoting.
  • New columns can be added via the Add Column tab. You can choose to base new columns on calculations, conditional logic, other column values or a defined list of ascending numbers, which may be useful for indexing purposes.
  • It is possible to merge or append queries together to suit your specific requirements. Merging involves the horizontal combination of Queries, whereas appending represents a vertical combination.
  • Parameters can be used to help optimise any complex filtering requirements.
  • Where possible, Power Query will attempt to use the most optimal query for your data source, based on the transformation steps you define. This action is known as Query Folding and, in most cases, SQL-derived data sources will support this option by default.
Cleanse data
Skills Measured

May include: Manage incomplete data; meet data quality requirements

Revision Notes

Exam 70-778 Revision Notes: Cleansing Data

Key Takeaways
  • Data can be filtered directly within Power Query, using Excel-like functionality to assist you in only returning the most relevant data in your queries. The data type of each field plays a particularly important part of this, as only specific filter options will be at your disposal if, for example, you are working with numeric data.
  • From a data quality perspective, you typically will need to handle column values that contain one of two possible value types:
    • Errors: This will usually occur as a result of a calculated column field not working correctly. The best solution will always be to address any issues with your calculated column, such as by using a conditional statement to return a default value.
    • Blanks/NULLs: A common symptom when working with SQL-derived data sources, your real problems with blank values start to appear when you attempt to implement DAX custom columns/Measures outside of the Power Query Editor. It is, therefore, recommended that these are dealt with via a Replace action, appropriate to your field’s data type. For example, blank/NULL values in a number field should be replaced with 0.
  • The Remove Rows option(s) can act as a quick way of getting rid of any Error or Blank/NULL rows and can also be utilised further to remove duplicates or a range of rows. In most cases, you will have similar options available to you with Keep Rows instead.
  • There are a variety of formatting options available to us when working with text/string data types. These range from fixing capitalisation issues in data, through to removing whitespace/non-printable character sets and even the ability to prepend/append a new value.

Modeling and Visualizing Data

Create and optimize data models.
Skills Measured

May include: Manage relationships; optimize models for reporting; manually type in data; use Power Query

Revision Notes

Exam 70-778 Revision Notes: Create and Optimise Data Models

Key Takeaways
  • Relationships form the cornerstone of ensuring the long-term viability and scalability of a large data model. Assuming you are working with well-built, existing data sources, Power BI will automatically detect and create Relationships for you. In situations where more granular control is required, these Relationships can be specified manually. It is worth keeping in mind the following important features of Relationships:
    • They support one-to-one (1:1), one-to-many (1:N) and many-to-one (N:1) cardinality, with many-to-many (N:N) currently in preview.
    • Filter directions can be specified either one way or bi-directionally.
    • Only one relationship can be active between any two tables at any given time.
  • It is possible to sort columns using more tailored custom logic via the Sort By Column feature. The most common requirement for this involves the sorting of Month Names in date order but can be extended to cover other scenarios if required. To implement, you should ensure that your data has a numbered column to indicate the preferred sort order.
  • Moving outside of the Power Query Editor presents us with more flexibility when it comes to formatting data to suit particular styling or locale requirements. The majority of this functionality concerns date/time and currency formatting options, but it is also possible to categorise data based on Location, the type of URL it is or whether or not it represents a Barcode value; these options can assist Power BI when rendering certain types of visualizations.
  • There may be ad-hoc requirements to add manually defined data into Power BI – for example, a list of values that need linking to a Slicer control. The Enter Data button is the “no-code” route to achieving this and supports the ability to copy & paste data from external sources. For more advanced scenarios, you also have at your disposal a range of M code functionality to create Lists, Records and Tables, which can be extended further as required.
Create calculated columns, calculated tables, and measures
Skills Measured

May include: Create DAX formulas for calculated columns, calculated tables, and measures; Use What If parameters

Revision Notes

Exam 70-778 Revision Notes: Using DAX for Calculated Columns

Key Takeaways
  • DAX is the primary formula language when working with datasets outside of Power Query. It includes, to date, more than 200 functions that can assist with all sorts of data modelling requirements.
  • An important concept to grasp within DAX is context and, specifically, row context (formulas that calculate a result for each row in a dataset) and filter context (formulas that automatically apply any filtering carried out at report level).
  • The sheer number of DAX functions available makes it impossible to master and remember all of them, particularly when it comes to the exam. Your learning should, therefore, focus on the general syntax of DAX and the broad types of functions available (aggregation, date/time etc.)
  • There are three principal means of utilising DAX with Power BI:
    • As Measures: These typically present a scalar value of some description, often an aggregation or a result of a complex formula. Using them in association with a Card visualization type is recommended, but this is not a strict requirement.
    • As Calculated Columns: Similar to the options available within Power Query, Calculated Columns provide a dynamic and versatile means of adding new columns to your datasets. Given the complexity of the M language, DAX Calculated Columns might represent a more straightforward route to achieving this.
    • As Calculated Tables: A powerful feature, mainly when used in conjunction with Calculated Columns, you have the ability here to create entirely new datasets within the model. These will typically derive from any existing datasets you have brought in from Power Query, but you also have functionality here to create Date tables, sequence numbers and manually defined datasets as well.
  • What-if Parameters provide a means of testing DAX formulas, as well as allowing report users to perform predictive adjustments that can affect multiple visualizations on a report.
Measure performance by using KPIs, gauges and cards.
Skills Measured

May include: calculate the actual; calculate the target; calculate actual to target; configure values for gauges; use the format settings to manually set values

Revision Notes

Exam 70-778 Revision Notes: Utilising KPIs with Gauge Visualisations

Key Takeaways
  • There are two principal visualization types available within Power BI to help track actual-to-target progress – KPIs and Gauges.
  • KPIs provide a visually distinctive means of making a binary success/fail determination when tracking towards a target. It is also possible to use KPIs to track variance over time via the Trend axis. The Indicator will typically be the result of some form of aggregation or Measure.
  • Gauges provide a less visually distinctive, but non-binary, mechanism of viewing progress towards a target. Gauges support more potential field well values when compared with KPIs, nearly all of which are optional in some way. You can also manually define some of these values, for situations where your data model does not contain the required information.
  • All visualizations within Power BI are modifiable from a display or formatting perspective. The same basic options will generally be supported – such as changing a font type or background colour – with more specific configuration properties available per unique visualization type. For example, a KPI visualization can be customised to hide the background Trend Axis entirely. All of these options are designed to give developers greater control over the look and feel of their reports and to mirror them as closely as possible to any potential branding requirement.
  • When building out a solution designed to monitor progress within Power BI, the steps involved will typically be more in-depth than merely creating a new visualization. In most cases, there will be a requirement to bring together a lot of the other skills that have been discussed previously within this series – such as creating DAX formulas, modifying data within Power Query or bringing together different data sources into a single model. It is essential, therefore, not to underestimate the amount of time and effort involved in creating a practical solution that takes advantage of KPIs or Gauges.
Create hierarchies
Skills Measured

May include: Create date hierarchies; create hierarchies based on business needs; add columns to tables to support desired hierarchy

Revision Notes

Exam 70-778 Revision Notes: Creating Hierarchies

Key Takeaways
  • Hierarchies within Power BI provide a means of logically categorising data into an order of preference or precedence, providing greater flexibility to Power BI report users when they interact with visualizations.
  • Date Hierarchies are created and managed automatically by Power BI for each Date or Date/Time field defined within your model. These automatically create fields that contain the Year, Quarter, Month & Day values from the respective date fields. These fields can then be utilised as part of a Table visualization or within a DAX formula. Date Hierarchies can also be completely disabled if required.
  • Custom (or User-Defined) Hierarchies need to be created manually and provide additional customisation options when compared to Date Hierarchies, such as the number of fields they contain, their ordering and the hierarchy’s name. A Custom Hierarchy will typically make use of one of several Parent/Child DAX functions, such as PATH or PATHITEM.
  • Including a hierarchy as part of a chart visualization, such as a Pie chart or Donut chart, opens up additional drill-down capabilities around your data. Indicated by the extra arrow icons at the top of the visualization, these provide a straightforward means for users to interrogate the data points that interest them most.
Create and format interactive visualizations.
Skills Measured

May include: Select a visualization type; configure page layout and formatting; configure interactions between visuals; configure duplicate pages; handle categories that have no data; configure default summarization and data category of columns; position, align, and sort visuals; enable and integrate R visuals; format measures; Use bookmarks and themes for reports

Revision Notes

Exam 70-778 Revision Notes: Create and Format Interactive Visualizations

Key Takeaways
  • Power BI delivers, out of the box, a range of different visualizations that cater towards most (if not all) reporting requirements. Should you find yourself in need of additional visualizations, then Microsoft AppSource is your go-to destination for finding visualizations developed by others. If you have experience working with either Node.js or R, then these can be used to build bespoke visualizations also.
  • When first developing a report, you should be able to match each requirement to a specific visualization type, to ensure that you are delivering a solution that is both meaningful and useful. From an exam perspective, this becomes a more critical consideration, and you should be prepared to suggest the most optimal visualization to use when given a specific scenario.
  • After adding visualizations to your report, you have additional options available to customise them further. For example, you can specify a preferred sorting order for your data, override any summarizations used and move/align your visual on the report page.
  • By default, visualizations in Power BI are designed to change automatically, based on how users interact with the report. All of these options are controllable via the Edit interactions button, allowing you to specify your preferred cross-filtering and cross-highlighting conditions.
  • There is a range of report page customisation options available to developers. It is possible to resize a page to any possible height/width, allowing you to optimise your report for specific devices. Also, you can modify the colour of a page (or its wallpaper) or add an image instead. Pages can also be renamed, reordered or duplicated.
  • Measures can be formatted in the same way as calculated columns, meaning you can specify a data type or, for numerics, modify the number of decimal places.
  • Bookmarks allow developers to set up “checkpoints” within a report, based on how a report page has been filtered. These can then be used to automatically navigate the user through a report, applying these filtering steps automatically. This feature can help transform your report into an interactive story.
  • Visualizations will automatically inherit their various colour properties from the currently selected report theme. Although these can be modified granularly, the fastest and most consistent way of making these changes en masse is to change the Theme. Power BI includes some Themes out of the box, but you also have the option of building your own using a custom JSON file; this can then be imported into different reports, providing a portable means of enforcing a particular branding requirement.
Manage custom reporting solutions
Skills Measured

May include: Configure and access Microsoft Power BI Embedded; enable developers to create and edit reports through custom applications; enable developers to embed reports in applications; use the Power BI API to push data into a Power BI dataset; enable developers to create custom visuals

Revision Notes

Exam 70-778 Revision Notes: Managing Custom Reporting Solutions

Key Takeaways
  • Power BI Embedded is an Azure-hosted offering that allows you to add Power BI Report content into bespoke applications. This deployment option can be incredibly useful if you wish to make your Power BI solution available to users outside of your organisation or if you have an existing, bespoke application system that can benefit from utilising Power BI content. An Azure subscription is required to begin working with Power BI Embedded, and you are billed based on node size, not individual user licenses. All Power BI content requires publishing to the online service before it becomes available for Power BI Embedded to access. Report developers will, therefore, need to be granted a Power BI Professional license to carry out these activities.
  • The Power BI API grants developers the ability to perform automation or administrative actions programmatically against the Power BI Online service. As a REST API, it allows developers to choose their preferred programming language to interact with it, streamlining the deployment of Reports or Dashboards to the Power BI service or leveraging additional functionality when utilising Power BI Embedded. The API can also cater to specific data load requirements, although more complex needs in this area would require addressing via alternative means (SSIS, Azure Data Factory etc.)
  • Developers can add their own bespoke visualizations to a Power BI Report by developing them using either Node.js or the R language. The first of these options facilitates a more streamlined deployment mechanism and allows developers to add their visualizations to AppSource, whereas the second option may be more useful for complex visualization types with an analytical or statistical function.

Configure Dashboards, Reports and Apps in the Power BI Service

Access on-premises data
Skills Measured

May include: Connect to a data source by using a data gateway; publish reports to the Power BI service from Power BI Desktop; edit Power BI Service reports by using Power BI desktop

Revision Notes

Exam 70-778 Revision Notes: Report Publishing, On-Premise Gateway & Creating Dashboards

Key Takeaways
  • The Power BI On-Premise Gateway provides a streamlined route to working with non-cloud data sources within Power BI, Microsoft Flow and PowerApps. As a lightweight and easy-to-configure client application, it supports a wide variety of data sources, making them accessible as if they were in the cloud. Once set up, corresponding Data Sources are then made available for configuration and for leveraging as part of any Power BI Dataset.
  • Reports can be published into Power BI Online, meaning that they become accessible online and to a broader group of users, without requiring access to Power BI Desktop. Reports need deploying into a Workspace, which can be created manually or derived from an Office 365 Group. Each Report contains a corresponding Dataset, where all queries defined within Power BI Desktop exist.
  • Reports that already exist on Power BI Online can be updated by just publishing a new version of the Report from Power BI Desktop. It is also possible to modify Reports from directly within the browser and by downloading a copy of the .pbix Report file as well, which can then be altered and re-published.
Configure a dashboard
Skills Measured

May include: Add text and images; filter dashboards; dashboard settings; customize the URL and title; enable natural language queries

Revision Notes

Exam 70-778 Revision Notes: Report Publishing, On-Premise Gateway & Creating Dashboards

Key Takeaways
  • Dashboards provide a means of grouping together various content as tiles, designed for at-a-glance analysis and optimal end-user experience.
  • The list of content that can be pinned to a Dashboard includes:
    • Visualizations
    • Web content
    • Images
    • Text boxes
    • Videos
    • Custom streaming data
  • Pinned content can be re-arranged on a Dashboard via drag-and-drop functionality. It is also possible to resize tiles to any given height/width.
  • Within the settings of a Dashboard, it is possible to enable/disable features such as natural language queries (Q&A) and Notes.
  • Some features of a Dashboard are only available if you have a Power BI Professional subscription, such as sharing and email subscriptions.

Publish and embed reports
Skills Measured

May include: Publish to web; publish to Microsoft SharePoint; publish reports to a Power BI Report Server

Revision Notes

Exam 70-778 Revision Notes: Publish and Embed Reports

Key Takeaways
  • The Publish to web option allows for non-licensed, external users to view a Power BI Report in its entirety. A URL and IFrame embed code can be generated for this at any time within the portal and then dropped into virtually any website. Although you will lose some functionality when deploying a Report out in this manner, you can expect that users will be able to perform most types of interactions with visualizations, Report pages and other components, as if they were accessing the Report through Power BI Online. In some cases, you may be unable to use the Publish to web option if your Report uses certain kinds of features, such as R Visuals or row-level security. You must also take into account any privacy or data protection concerns, as Reports deployed out in this manner will be publicly accessible; where this is an issue, the Embed option is available as a secure alternative.
  • There are three steps involved if you wish to add a Report to SharePoint. First, you must generate the unique SharePoint embed URL within Power BI. Second, you need to add the dedicated control for this feature to your target SharePoint page and configure the relevant display options. Finally, you need to ensure that all SharePoint users have been granted access to the Report, either at a Workspace level (the recommended option) or by having the Report shared with them. By implication, in this scenario, all SharePoint users would need at least a Power BI Professional license to take full advantage of this functionality.
  • Publishing a Report to Power BI Report Server is mostly the same as publishing to the online version of the product. Instead of selecting a Workspace to add the Report to, you specify the name of the Report Server folder where the Report will reside. From a development standpoint, the dedicated Power BI Desktop for Power BI Report Server must be used and may differ in functionality from the “normal” version of the tool. There is also no option to edit a report from within Power BI Report Server like you can through the online version.
Configure security for dashboards, reports and apps.
Skills Measured

May include: Create a security group by using the Admin Portal; configure access to dashboards and app workspaces; configure the export and sharing setting of the tenant; configure Row-Level Security

Revision Notes

Exam 70-778 Revision Notes: Securing Power BI Dashboards, Reports and Apps

Key Takeaways
  • Workspaces act as a container for the various components that form a Power BI Reporting solution. Within a Workspace, you will find all of the Dashboards, Reports, Workbooks and Datasets that developers have published. Each User has a Workspace created for them in Power BI when they first access the service. Additional Workspaces can be added through Office 365 Groups or by installing a Power BI App from AppSource. Dashboards and Reports created within a User’s Workspace can be shared with other Users, provided that your account has a Power BI Professional license assigned to it.
  • To help manage permissions to Dashboards/Reports more efficiently, Administrators can create Security Groups on the same Office 365 Tenant where Power BI Online resides. These can contain multiple groups of Users, allowing administrators to minimise the amount of effort involved in managing Dashboard/Report access. Most crucially, this will also enable Users that do not have an Exchange Online mailbox to access Dashboards/Reports when they are shared out in this manner (a sketch of creating such a group via PowerShell follows this list).
  • Administrators have a whole host of options available to them within the Tenant settings area of the Admin Portal. These include, but are not limited to:
    • Export and Sharing Settings
    • Enable/Disable Content Sharing
    • Enable/Disable Publish To Web
    • Enable/Disable Export Reports as PowerPoint Presentations
    • Enable/Disable Print Dashboards and Reports
    • Content Pack and App Settings
    • Integration Settings
    • Custom Visuals Settings
    • R Visuals Settings
    • Audit and Usage Settings
    • Dashboard Settings
    • Developer Settings
  • Each of these settings can be enabled for the entire organisation, for the entire organisation except specific security groups, or for particular security groups only.
  • Row-Level Security (RLS) allows report developers to restrict data, based on Roles. Row-level DAX evaluation formulas are used to achieve this, filtering the data that is returned based on a TRUE/FALSE logic test. To utilise the feature, you must define both the Roles and DAX formulas for each query within your data model. Then, after deploying your Report to Power BI Online, you assign Users or Security Groups to the Role(s) created within Power BI Desktop. It is possible to view the effect of a Role at any time, within Power BI Desktop or Online, via the View As Role functionality. With the wide array of DAX formulas available, including specific ones that return the details of the current user accessing a Report, it is possible to define very granular filtering within a Power BI report, to suit particular security or access models.
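
For illustration, a minimal sketch of creating such a Security Group via the MSOnline PowerShell module is below; this assumes the MSOnline module is installed, and the group name and description used are hypothetical values:

#Create an Office 365/Azure AD Security Group for managing Report access
#The DisplayName and Description below are illustrative values

Import-Module MSOnline
Connect-MsolService
New-MsolGroup -DisplayName 'Power BI Report Viewers' -Description 'Users who can access the Sales Dashboards/Reports'
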
Configure apps and app workspaces.
Skills Measured

May include: Create and configure an app workspace; publish an app; update a published app; package dashboards and reports as apps

Revision Notes

Exam 70-778 Revision Notes: Working with Apps and App Workspaces

Key Takeaways
  • Workspaces act as a container for the various components that form a Power BI Reporting solution. Within a Workspace, you will find all of the Dashboards, Reports, Workbooks and Datasets that developers have published content to. Each User has a Workspace created for them in Power BI when they first access the service. It is also possible to create additional Workspaces, either through the Power BI Online interface or by creating an Office 365 Group. A new experience for creating Workspaces is currently in preview which, once released, would negate the need for each Workspace to have an associated Office 365 Group.
  • When creating a Workspace, you can define various settings such as the type of access each user has (read-only or ability to modify its content), its members and whether it requires assignment to a Power BI Premium node. It is not possible to change the access type for a Workspace after creation, but you can freely change its name or modify its membership at any time.
  • The contents of a Workspace can be published as an App, enabling you to expose your solution to a broader audience within or outside your organisation. Once published, users navigate to the Power BI AppSource store for their tenant, which lists all Apps available for installation. Once installed, they will then become visible from within the Apps area of the application. You can update content within an App at any time by republishing its corresponding Workspace. It is also possible to define individual properties within an App, such as its description, access rights and landing page. To install and use Apps, the user in question must have a Power BI Professional license.

Additional Preparation Resources

The official Microsoft exam reference book is a helpful learning aid for this exam, particularly given that it includes numerous exercises that you can work through to familiarise yourself with different Power BI functionality. There is also an online course available on the edX website which, despite not covering the whole exam syllabus, does provide a useful visual aid and includes a lot of the features you are expected to know for the exam. Finally, nothing beats actually working with the product itself and trying out the various features yourself. Power BI Desktop is a free download and, with access to one of the sample databases provided by Microsoft, you can very quickly provision an environment on your own home computer to enable you to experience everything that Power BI has to offer.

Exams are always a nightmarish experience, both when preparing for them and when you are sat there in the test centre. I hope that this post, and this whole series, proves to be useful in helping with your exam preparation and getting you ready to pass the exam with flying colours 🙂

Welcome to post number nine in my series designed to provide a revision tool for Microsoft Exam 70-778, and for those looking to increase their expertise in Power BI. The topics we have covered so far in the series have all involved Power BI Desktop primarily, and we now move away from this as we evaluate how to Manage Custom Reporting Solutions with Power BI. This focus area for the exam measures the following skill areas:

Configure and access Microsoft Power BI Embedded; enable developers to create and edit reports through custom applications; enable developers to embed reports in applications; use the Power BI API to push data into a Power BI dataset; enable developers to create custom visuals.

Despite being, at first glance, a very technically focused area, it is not necessarily a requirement for the exam to know how to work with these features in-depth. However, what this post will try to do is fully explain what Power BI Embedded is (and how it can be tailored accordingly), the capabilities and benefits of the Power BI API and, finally, what options you have available to build custom visualizations, that are then available for use across any Power BI Report.

Power BI Embedded

A potential limitation of using Power BI as your Business Intelligence solution is that you must access your reporting solution through one of two interfaces, depending on how you have licensed the product:

  • Power BI Online, the cloud-hosted service accessed through the browser.
  • Power BI Report Server, hosted on-premise within your own infrastructure.

For strictly organisational-only access, this is all fine and dandy; but if you desire to grant external users access to your reports, it would be necessary to open up a door into a critical component of your IT infrastructure, often in tandem with any other systems your solution may contain. For example, if you have developed a support portal for your customers to submit cases with and wish to provide them with a range of Power BI visualizations, you would need to grant and deploy access to two separate application instances – your support portal and Power BI Online/Report Server. This can lead to a jarring end-user experience and severely limit your capabilities in providing a unified, bespoke and branded solution.

Power BI Embedded seeks to address this scenario by providing application developers the means to embed Power BI reports and visualizations directly within their existing applications. The experience this offers is seamless, to the extent that end-users will not even need to know that you are using Power BI at all. Consequently, this potentially allows you to look exceptionally gifted when you begin to deploy engaging and highly interactive visualizations into your application quickly. As an Azure-based service with a pricing mechanism to suit, you only need to suitably estimate your potential compute capacities, deploy your capacity and any corresponding reports and then build out the appropriate link to your Power BI content within your application.
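
As an illustration of the capacity deployment step, the short sketch below provisions an A1-sized capacity via the AzureRM PowerShell module; it assumes the AzureRM.PowerBIEmbedded module is installed, and the resource group, capacity name and administrator account are hypothetical values:

#Provision an A1-sized Power BI Embedded capacity - all names below are illustrative

Login-AzureRmAccount
New-AzureRmPowerBIEmbeddedCapacity -ResourceGroupName 'MyResourceGroup' -Name 'mypbiecapacity' `
                                   -Location 'UK South' -Administrator 'admin@mytenant.onmicrosoft.com' -Sku A1

#Capacities can be paused when not in use, to help keep billing under control
Suspend-AzureRmPowerBIEmbeddedCapacity -ResourceGroupName 'MyResourceGroup' -Name 'mypbiecapacity'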

To get started with using Power BI Embedded, you need to make sure you have the following at your disposal:

  • An Azure subscription, within which the appropriate Power BI Embedded capacity can be provisioned (billing is based on node size, not individual user licenses).
  • A Power BI Professional license for any report developers, so that content can be published to the Power BI Online service.
  • Your Power BI content, published to a Workspace on the online service, ready for Power BI Embedded to access.

To get a feel for the capabilities on offer, you can go to the Power BI Embedded Playground, made available courtesy of Microsoft. This tool allows you to test how the various Power BI Embedded components render themselves, tweak their appearance and generate working code samples that are usable within your application. The screenshot below shows an example of how a single Report visual would look when embedded into a bespoke application:

As the screenshot indicates, there is no loss in core functionality when consuming Power BI in this manner. You can hover over individual areas to gain insights into your data; you can drill down through the data; data is sortable in the conventional manner; and, finally, you can view the underlying data in the visualization or even export it out into an Excel/CSV document. You also have extensive options available to modify how a visual, report etc. is rendered on your application page, allowing you to ensure that everything renders optimally for your application.

All in all, Power BI Embedded represents a significant boon for application developers, enabling them to very quickly leverage the extensive reporting capabilities Power BI provides, all of which is cloud-based, highly scalable and minutely tailorable. It is important to highlight that all of this goodness comes with a potentially high cost, namely, that of requiring a sufficiently proficient application developer (preferably .NET focused) to join all of the various bits together. But, if you are already in the position where you have developed an extensive range of Power BI reports for consumption by your customer base, Power BI Embedded is the natural progression point in turning your solution into a real piece of intellectual property.

The Power BI API

If you are finding your feet with Power BI Embedded and need to carry out more complex actions against Power BI content that is pulling through from a workspace, then the API is an area that you will need to gain familiarity with too. Microsoft exposes a whole range of functionality as part of the Power BI API that can assist in a wide variety of tasks – such as automation, deployment and allowing a bespoke application to derive further benefit from a Power BI Embedded solution. Some examples of the types of things you can do with the API include:

  • Deploying Reports, Dashboards or other content to the Power BI service programmatically.
  • Pushing data into a Power BI dataset from an external application.
  • Managing deployed Datasets – for example, updating the credentials of an underlying data source, as demonstrated later in this post.

The API requires that you authenticate against the Power BI service, using a corresponding Application Registration on Azure Active Directory, which defines the list of privileges that can be carried out. This component can be straightforwardly created using the wizard provided by Microsoft, and a full tutorial is also available on how to generate an access token from your newly created Application Registration. The key thing is to ensure that your Application Registration is scoped to only the permissions you require (these can be changed in future if needed) and that you do not grant all permissions needlessly.
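
As a rough, hedged sketch of that token step, the snippet below requests an access token for the Power BI resource from the Azure AD OAuth2 endpoint and builds the $authHeader object consumed by the example that follows; the Application (client) ID and account credentials shown are placeholder values:

#Illustrative only - replace the placeholder client ID and credentials with your own

$clientId = "00000000-0000-0000-0000-000000000000"
$pbiUser = "serviceaccount@mytenant.onmicrosoft.com"
$pbiPass = "P@ssw0rd!"

#Request a token for the Power BI resource from the Azure AD OAuth2 endpoint
$tokenBody = @{
    grant_type = 'password'
    resource = 'https://analysis.windows.net/powerbi/api'
    client_id = $clientId
    username = $pbiUser
    password = $pbiPass
}
$tokenResponse = Invoke-RestMethod -Uri 'https://login.microsoftonline.com/common/oauth2/token' -Method POST -Body $tokenBody

#Build the Authorization header used by subsequent API requests
$authHeader = @{'Authorization' = "Bearer $($tokenResponse.access_token)"}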

Because the API is a REST endpoint, there are a variety of programming languages or tools that you can use from an interaction standpoint. PowerShell is an especially good candidate for this and, in the snippet below, you can see how this can be used to modify the User Name and Password for a SQL Server data source deployed onto Power BI Online:

#Make the request to patch the Data Source with the updated credentials

$sqlUser = "MyUserName"
$sqlPass = "P@ssw0rd!"

#The gateway and data source IDs form part of the endpoint URI
$patchURI = "https://api.powerbi.com/v1.0/myorg/gateways/cfafbeb1-8037-4d0c-896e-a46fb27ff229/datasources/1e8176ec-b01c-4a34-afad-e001ce1a28dd/"

#Build the request body as a hashtable and convert it to JSON
$credentials = @{
    credentialType = 'Basic'
    basicCredentials = @{
        username = $sqlUser
        password = $sqlPass
    }
}
$credentialsJSON = $credentials | ConvertTo-Json

#$authHeader contains the bearer token Authorization header built during authentication (see above)
$patchRequest = Invoke-RestMethod -Uri $patchURI -Headers $authHeader -Method PATCH -Verbose -Body $credentialsJSON

These examples are deliberately simplified and exclude some of the steps needed to authenticate with Power BI in production; they are, therefore, provided for strictly illustrative purposes only.
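
The API’s data load capabilities follow the same pattern. As a further hedged sketch, and assuming the $authHeader from above plus a hypothetical push dataset containing a ‘Sales’ table, rows can be added via the documented Rows endpoint:

#Push two sample rows into the 'Sales' table of a push dataset
#The dataset ID below is a placeholder value

$datasetId = "00000000-0000-0000-0000-000000000000"
$rowsURI = "https://api.powerbi.com/v1.0/myorg/datasets/$datasetId/tables/Sales/rows"

$rows = @{
    rows = @(
        @{Product = 'Widget'; Amount = 150},
        @{Product = 'Gadget'; Amount = 230}
    )
} | ConvertTo-Json

Invoke-RestMethod -Uri $rowsURI -Headers $authHeader -Method POST -Body $rows -ContentType 'application/json'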

Creating Custom Visuals

Developers have two principal options when it comes to building out bespoke visualizations, which can then be included in a Power BI Online, Embedded or Report Server report:

  • Custom visuals developed using Node.js, which benefit from a streamlined deployment mechanism and can be published to AppSource.
  • R-powered visuals, which may be more useful for complex visualization types with an analytical or statistical function.

Last week’s post discussed this topic in more detail from an exam standpoint which, in a nutshell, only requires you to have a general awareness of the options available here; no need to start extensively learning a new programming language, unless you really want to 🙂

Key Takeaways

  • Power BI Embedded is an Azure-hosted offering that allows you to add Power BI Report content into bespoke applications. This deployment option can be incredibly useful if you wish to make your Power BI solution available to users outside of your organisation or if you have an existing, bespoke application system that can benefit from utilising Power BI content. An Azure subscription is required to begin working with Power BI Embedded, and you are billed based on node size, not individual user licenses. All Power BI content requires publishing to the online service before it becomes available for Power BI Embedded to access. Report developers will, therefore, need to be granted a Power BI Professional license to carry out these activities.
  • The Power BI API grants developers the ability to perform automation or administrative actions programmatically against the Power BI Online service. As a REST API, it allows developers to choose their preferred programming language to interact with it, streamlining the deployment of Reports or Dashboards to the Power BI service or leveraging additional functionality when utilising Power BI Embedded. The API can also cater to specific data load requirements, although more complex needs in this area would require addressing via alternative means (SSIS, Azure Data Factory etc.)
  • Developers can add their own bespoke visualizations to a Power BI Report by developing them using either Node.js or the R language. The first of these options facilitates a more streamlined deployment mechanism and allows developers to add their visualizations to AppSource, whereas the second option may be more useful for complex visualization types with an analytical or statistical function.

Compared to the other exam topics, a general awareness of these concepts is more than likely sufficient from a learning perspective and is (arguably) useful knowledge in any event, as it allows you to understand how developers can further extend a Power BI solution to suit a particular business need. In next week’s post, we will move into the final subject area for the exam, as the focus shifts towards how to work with Power BI outside of the Desktop application and the various tools available to integrate on-premise data sources into Power BI Online.

The whole concept of audio conferencing – the ability for a diverse range of participants to dial into a central location for a meeting – is hardly new to the 21st century. Its prevalence, however, has undoubtedly grown sharply in the last 15-20 years; to the point now where, to use an analogy, it very much feels like a DVD when compared to full video conferencing, à la Blu-Ray. When you also consider the widening proliferation of remote workers, globalisation and the meteoric rise of cloud computing, businesses suddenly find themselves having to find answers to the following questions:

  • How can I enable my colleagues to straightforwardly and dynamically collaborate across vast distances?
  • How do I address the challenges of implementing a scalable solution that meets any regulatory or contractual requirements for my organisation?
  • What is the most cost-effective route to ensuring I can have a genuinely international audio-conferencing experience?
  • How do I identify a solution that users can easily adopt, without introducing complex training requirements?

These questions are just a flavour of the sorts of things that organisations should be thinking about when identifying a suitable audio conferencing solution. And there are a lot of great products on the market that address these needs – GoToMeeting or join.me represent natural choices for specific scenarios. But to provide a genuinely unified experience for existing IT deployments that have a reliance on Skype for Business/Microsoft Teams, the audio conferencing add-on for Office 365 (also referred to as Skype for Business Online Audio Conferencing) may represent a more prudent choice. It ticks all of the boxes for the questions above, ensuring that users can continue utilising other tools they know and use every day – such as Microsoft Outlook and Office 365. Admittedly, though, the solution is undoubtedly more geared up for Enterprise deployments as opposed to utilisation by SMBs. It may, therefore, become too unwieldy a solution in the hands of smaller organisations.

I was recently involved in implementing an audio conferencing solution for Skype for Business Online, to satisfy the requirement for international dialling alluded to earlier. Having attended many audio conferences that utilise the service previously, I was familiar with the following experience when dialling in and – perhaps naively – assumed this to be the default configuration:

  1. User dials in and enters the meeting ID number.
  2. An audio cue is played for the conference leader, asking them to enter their leader code (this is optional).
  3. The user is asked to record their name and then press *.
  4. The meeting then starts automatically.

For most internal audio conferences (and even external ones), this process works well, mainly when, for example, the person organising the meeting is doing so on behalf of someone else, and is unlikely to be dialling in themselves. However, I was surprised to learn that the actual, default experience is a little different:

  1. User dials in and enters the meeting ID number.
  2. An audio cue is played for the conference leader, asking them to enter their leader code (this is optional).
  3. The user is asked to record their name and then press *.
  4. If the leader has not yet dialled in, all other attendees sit in the lobby. The call does not start until the leader joins.

The issue does not relate to how the meeting organiser has configured their meeting in Outlook; the lobby behaviour occurs regardless of which setting is chosen in the These people don’t have to wait in the lobby drop-down box.

After some fruitless searching online, I eventually came across the following article, which clarified things for me:

Start an Audio Conference over the phone without a PIN in Skype for Business Online

As a tenant-level configuration, therefore, there is a two-stage process involved to get this reconfigured for existing Skype for Business Online Audio Conferencing deployments:

  • Set the AllowPSTNOnlyMeetingsByDefault setting to true on the tenant via a PowerShell cmdlet.
  • Configure the AllowPSTNOnlyMeetings setting to true for every user set up for Audio Conferencing, either within the Skype for Business Administration Centre or via PowerShell.

The second process could be incredibly long-winded to achieve via the Administration Centre route, as you have to go into each user’s audio conferencing settings and toggle the appropriate control, as indicated below:

For a significantly larger deployment, this could easily result in carpal tunnel syndrome and loss of sanity 😧. Fortunately, PowerShell can take away some of the woes involved. By logging into Skype for Business Online via PowerShell, it is possible to enable the AllowPSTNOnlyMeetingsByDefault setting at the tenant level and also to enable the equivalent per-user setting for all users who currently have it disabled. The complete script to carry out these steps is below:

#Standard login for S4B Online

Import-Module SkypeOnlineConnector
$userCredential = Get-Credential
$sfbSession = New-CsOnlineSession -Credential $userCredential
Import-PSSession $sfbSession

#Login script for MFA enabled accounts

Import-Module SkypeOnlineConnector
$sfbSession = New-CsOnlineSession
Import-PSSession $sfbSession

#Enable dial in without leader at the tenant level - will only apply to new users moving forward

Set-CsOnlineDialInConferencingTenantSettings -AllowPSTNOnlyMeetingsByDefault $true

#Get all current PSTN users

$userIDs = Get-CsOnlineDialInConferencingUserInfo | Where Provider -eq "Microsoft" | Select DisplayName, Identity

$userIDs | ForEach-Object -Process {
        
    #Then, check whether the AllowPstnOnlyMeetings is false

    $identity = $_.Identity

    $user = Get-CsOnlineDialInConferencingUser -Identity $identity.RawIdentity
    Write-Host $_.DisplayName"AllowPstnOnlyMeetings value equals"$user.AllowPstnOnlyMeetings
    if ($user.AllowPstnOnlyMeetings -eq $false) {
        
        #If so, then enable

        Set-CsOnlineDialInConferencingUser -Identity $identity.RawIdentity -AllowPSTNOnlyMeetings $true
        Write-Host $_.DisplayName"AllowPstnOnlyMeetings value changed to true!"
    }

    else {
        Write-Host "No action required for user account"$_.DisplayName 
    }
}

Some notes/comments before you execute it in your environment:

  • You should comment out whichever authentication snippet is not appropriate for your situation, depending on whether you have enabled Multi-Factor Authentication for your user account.
  • Somewhat annoyingly, there is no way (that I found) to extract a unique enough identifier for use with the Set-CsOnlineDialInConferencingUser cmdlet when obtaining the details of the user via the Get-CsOnlineDialInConferencingUser cmdlet. This is why the script first retrieves the complete LDAP identity string using the Get-CsOnlineDialInConferencingUserInfo cmdlet. Convoluted, I know, but it ensures that the script works correctly and avoids any issues that may arise from duplicate Display Names on the Office 365 tenant.

All being well, with very little modification, the above code can be utilised to enforce the setting across the board or for a specific subset of users, if required. It does seem strange that the option is turned off by default, but there are understandable reasons why it may be desirable to curate the whole meeting experience for attendees. If you are considering rolling out Skype for Business Online Audio Conferencing in your Office 365 tenant in the near future, then this represents one of many decisions that you will have to make as part of the deployment. You should, therefore, think carefully and consult with your end-users to determine what their preferred setting is; you can then choose to enable/disable the AllowPSTNOnlyMeetingsByDefault setting accordingly.

The introduction of Azure Data Factory V2 represents the most opportune moment for data integration specialists to start investing their time in the product. Version 1 (V1) of the tool, which I started taking a look at last year, missed a lot of critical functionality that – in most typical circumstances – I could implement in a matter of minutes via a SQL Server Integration Services (SSIS) DTSX package. The product had, to quote some specific examples:

  • No support for control flow logic (foreach loops, if statements etc.)
  • Support for only “basic” data movement activities, with a minimal capability to perform or call data transformation activities from within SQL Server.
  • Some support for the deployment of DTSX packages, but with incredibly complex deployment options.
  • Little or no support for Integrated Development Environments (IDEs), such as Visual Studio, or for other typical DevOps scenarios.

In what seems like a short space of time, the product has come on leaps and bounds to address these limitations:

  • Filter and Until conditions are now supported, alongside the expected ForEach and If conditionals.
  • When connecting to SQL data destinations, we now have the ability to execute pre-copy scripts.
  • SSIS Integration Runtimes can now be set up from within the Azure Data Factory V2 GUI – no need to revert to PowerShell.
  • And finally, there is full support for storing all created resources within GitHub or Azure DevOps (formerly Visual Studio Team Services).

The final one is a particularly nice touch, and means that you can straightforwardly incorporate Azure Data Factory V2 as part of your DevOps strategy with minimal effort – an ARM Resource Template deployment, containing all of your data factory components, will get your resources deployed out to new environments with ease. What’s even better is that this deployment template is intelligent enough not to recreate existing resources and only update Data Factory resources that have changed. Very nice.
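
As a rough sketch of what this looks like in practice, and assuming a template and parameters file exported from the Data Factory GUI (the file names here are illustrative), such a deployment boils down to a single cmdlet call:

#Deploy the exported Data Factory ARM template incrementally - existing,
#unchanged resources within the Resource Group are left untouched

New-AzureRmResourceGroupDeployment -ResourceGroupName 'MyResourceGroup' `
                                   -Mode Incremental `
                                   -TemplateFile '.\arm_template.json' `
                                   -TemplateParameterFile '.\arm_template_parameters.json'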

Although a lot is provided for by Azure Data Factory V2 to assist with a typical DevOps cycle, there is one thing that the tool does not account for satisfactorily.

A critical aspect as part of any Azure Data Factory V2 deployment is the implementation of Triggers. These define the circumstances under which your pipelines will execute, typically either via an external event or based on a pre-defined schedule. Once activated, they effectively enter a “read-only” state, meaning that any changes made to them via a Resource Group Deployment will be blocked and the deployment will fail – as we can see below when running the New-AzureRmResourceGroupDeployment cmdlet directly from PowerShell:

It’s nice that the error is provided in JSON, as this can help to facilitate any error handling within your scripts.

The solution is simple – stop the Trigger programmatically as part of your DevOps execution cycle via the handy Stop-AzureRmDataFactoryV2Trigger cmdlet. This step involves just a single line of PowerShell, callable from an Azure PowerShell task. But what happens if you are deploying your Azure Data Factory V2 template for the first time?

I’m sorry, but your Trigger is another castle.

The best (and only) resolution to get around this little niggle is to construct a script that checks whether the Trigger exists and skips the stop step if it does not. The following parameterised PowerShell script file achieves this by attempting to stop the Trigger called ‘MyDataFactoryTrigger’:

param($rgName, $dfName)

Try
{
   Write-Host "Attempting to stop MyDataFactoryTrigger Data Factory Trigger..."
   Get-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName 'MyDataFactoryTrigger' -ErrorAction Stop
   Stop-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName 'MyDataFactoryTrigger' -Force
   Write-Host -ForegroundColor Green "Trigger stopped successfully!"
}

Catch

{ 
    $errorMessage = $_.Exception.Message
    if($errorMessage -like '*NotFound*')
    {       
        Write-Host -ForegroundColor Yellow "Data Factory Trigger does not exist, probably because the script is being executed for the first time. Skipping..."
    }

    else
    {

        throw "An error occured whilst retrieving the MyDataFactoryTrigger trigger."
    } 
}

Write-Host "Script has finished executing."

To use successfully within Azure DevOps, be sure to provide values for the parameters in the Script Arguments field:

You can use pipeline Variables within arguments, which is useful if you reference the same value multiple times across your tasks.
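
For example, assuming two hypothetical pipeline Variables named ResourceGroupName and DataFactoryName, the Script Arguments field might contain:

-rgName "$(ResourceGroupName)" -dfName "$(DataFactoryName)"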

With some nifty copy + paste action, you can accommodate the stopping of multiple Triggers as well – although if you have more than 3-4, then it may be more sensible to perform some iteration involving an array containing all of your Triggers, passed at runtime.
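
A minimal sketch of that iterative approach is below; it reuses the same existence check as the earlier script, with the Trigger names supplied as an additional script argument (e.g. -triggerNames 'Trigger1','Trigger2'):

param($rgName, $dfName, [string[]]$triggerNames)

#Attempt to stop each named Trigger in turn, skipping any that do not yet exist

foreach ($triggerName in $triggerNames)
{
    Try
    {
        Get-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName $triggerName -ErrorAction Stop
        Stop-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName $triggerName -Force
        Write-Host -ForegroundColor Green "$triggerName stopped successfully!"
    }
    Catch
    {
        if ($_.Exception.Message -like '*NotFound*')
        {
            Write-Host -ForegroundColor Yellow "$triggerName does not exist yet. Skipping..."
        }
        else
        {
            throw "An error occurred whilst stopping the $triggerName trigger."
        }
    }
}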

For completeness, you will also want to ensure that you restart the affected Triggers after any ARM Template deployment. The following PowerShell script will achieve this outcome:

param($rgName, $dfName)

Try
{
   Write-Host "Attempting to start MyDataFactoryTrigger Data Factory Trigger..."
   Start-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName 'MyDataFactoryTrigger' -Force -ErrorAction Stop
   Write-Host -ForegroundColor Green "Trigger started successfully!"
}

Catch

{ 
    throw "An error occured whilst starting the MyDataFactoryTrigger trigger."
}

Write-Host "Script has finished executing."

The Azure Data Factory V2 offering has no doubt come on leaps and bounds in a short space of time…

…but you can’t shake the feeling that there is a lot that still needs to be done. The current release, granted, feels very stable and production-ready, but I think there is a whole range of enhancements that could be introduced to allow better feature parity when compared with SSIS DTSX packages. With this in place, and when taking into account the very significant cost differences between both offerings, I think it would make Azure Data Factory V2 a no-brainer option for almost every data integration scenario. The future looks very bright indeed 🙂

Once upon a time, there was a new cloud service known as Windows Azure. Over time, this cloud service developed with new features, became known more generally as just Azure, embraced the unthinkable from a technology standpoint and also went through a complete platform overhaul. Longstanding Azure users will remember the “classic” portal, with its very…distinctive…user interface:

Image courtesy of Microsoft

As the range of different services offered on Azure increased and the call for more efficient management tools became almost deafening, Microsoft announced the introduction of a new portal experience and Resource Group Management for Azure resources, both of which are now the de facto means of interacting with Azure today. The old-style portal indicated above was officially discontinued earlier this year. In line with these changes, Microsoft introduced new, Resource Manager compatible versions of pretty much every major service available on the “classic” portal…with some notable exceptions. The following “classic” resources – the ones covered in the remainder of this post – can still be created and worked with today using the new Azure portal:

  • Cloud Services (classic)
  • Disks (classic)
  • Network Security groups (classic)

This provides accommodation for those who are still operating compute resources dating back to the days of yore, allowing you to create and manage resources that may be needed to ensure the continued success of your existing application deployment. In most cases, you will not want to create these “classic” resources as part of new project work, as the equivalent Resource Manager options should be more than sufficient for your needs. The only question mark around this concerns Cloud Services. There is no equivalent Resource Manager resource available currently, with the recommended option for new deployments being Azure Service Fabric instead. Based on my research online, there appears to be quite a feature gap between the two offerings, with Azure Service Fabric arguably being overkill for more simplistic requirements. There also appears to be some uncertainty over whether Cloud Services are technically considered deprecated or not. I would highly recommend reading Andreas Helland’s blog post on the subject and forming your own opinion from there.

For both experiences, Microsoft provided a full set of automation tools in PowerShell to help developers carry out common tasks against the Azure Portal. These are split out into the standard Azure cmdlets for the “classic” experience and a set of AzureRM cmdlets for the new Resource Manager approach. Although the “classic” Azure cmdlets are still available and supported, they very much operate in isolation – if you have a requirement to interchangeably create “classic” and Resource Manager resources within the same script file, you are going to encounter some major difficulties and errors. For example, switching to a subscription that you have access to, but do not own, becomes nigh on impossible. For this reason, I would recommend relying solely on the AzureRM cmdlets, even when you have a requirement to create “classic” resources to maintain an existing deployment.

To help accommodate this scenario, the New-AzureRmResource cmdlet really becomes your best friend. In a nutshell, it lets you create any Azure resource of your choosing when executed. The catch is that the exact syntax to use as part of the -ResourceType parameter can take some time to discover, particularly when working with “classic” resources. What follows are some code snippets that, hopefully, provide you with a working set of cmdlets to create the “classic” resources listed above.
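As an aside, one way (though certainly not the only one) of tracking down valid -ResourceType values is to interrogate the relevant resource provider directly. Purely as an illustration, the snippet below lists every resource type exposed by the Microsoft.ClassicNetwork provider, alongside its supported API versions:

#List all resource types registered under the Microsoft.ClassicNetwork provider
(Get-AzureRmResourceProvider -ProviderNamespace 'Microsoft.ClassicNetwork').ResourceTypes |
    Select-Object ResourceTypeName, ApiVersions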

Before you begin…

To use any of the cmdlets that follow, make sure you have connected to Azure, selected your target subscription and created a Resource Group to store your resources. The snippet below takes care of all three steps; you can obtain your Subscription ID by navigating to its properties within the Azure portal:

#Replace the parameter values below to suit your requirements

$subscriptionID = '36ef0d35-2775-40f7-b3a1-970a4c23eca2'
$rgName = 'MyResourceGroup'
$location = 'UK South'

#Allow scripts to run in this session, log in to Azure and select the target subscription
Set-ExecutionPolicy Unrestricted
Login-AzureRmAccount
Set-AzureRmContext -SubscriptionId $subscriptionID

#Create the Resource Group that will contain the resources
New-AzureRmResourceGroup -Name $rgName -Location $location
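If you would prefer to avoid the portal, then you can also retrieve your Subscription ID via PowerShell once logged in. Note that the exact property names can differ between AzureRM module versions (older releases expose SubscriptionId and SubscriptionName instead):

#List the subscriptions available to the logged-in account
Get-AzureRmSubscription | Select-Object Name, Id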

With this done, you should hopefully encounter no problems executing the cmdlets that follow.

Cloud Services (classic)

#Create an empty Cloud Service (classic) resource in MyResourceGroup in the UK South region

New-AzureRmResource -ResourceName 'MyClassicCloudService' -ResourceGroupName $rgName `
                    -ResourceType 'Microsoft.ClassicCompute/domainNames' -Location $location -Force
                    

Disks (classic)

#Create a Disk (classic) resource using a Linux operating system in MyResourceGroup in the UK South region.
#Needs a valid VHD in a compatible storage account to work correctly

New-AzureRmResource -ResourceName 'MyClassicDisk' -ResourceGroupName $rgName `
                    -ResourceType 'Microsoft.ClassicStorage/storageaccounts/disks' `
                    -Location $location `
                    -PropertyObject @{'DiskName' = 'MyClassicDisk'
                                      'Label' = 'My Classic Disk'
                                      'VhdUri' = 'https://mystorageaccount.blob.core.windows.net/mycontainer/myvhd.vhd'
                                      'OperatingSystem' = 'Linux'
                                     } -Force


Network Security groups (classic)

#Create a Network Security Group (classic) resource in MyResourceGroup in the UK South region.

New-AzureRmResource -ResourceName 'MyNSG' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/networkSecurityGroups' `
                    -Location $location -Force
                    

Reserved IP Addresses (classic)

#Create a Reserved IP (classic) resource in MyResourceGroup in the UK South region.

New-AzureRmResource -ResourceName 'MyReservedIP' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/reservedIps' `
                    -Location $location -Force
                    

Storage Accounts (classic)

#Create a Storage Account (classic) resource in MyResourceGroup in the UK South region.
#The storage account will use Standard Locally Redundant Storage (LRS)

New-AzureRmResource -ResourceName 'MyStorageAccount' -ResourceGroupName $rgName `
                    -ResourceType 'Microsoft.ClassicStorage/StorageAccounts' `
                    -Location $location -PropertyObject @{'AccountType' = 'Standard-LRS'} -Force


Virtual Networks (classic)

#Create a Virtual Network (classic) resource in MyResourceGroup in the UK South region

New-AzureRmResource -ResourceName 'MyVNET' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/virtualNetworks' `
                    -Location $location `
                    -PropertyObject @{'AddressSpace' = @{'AddressPrefixes' = @('10.0.0.0/16')}
                                      'Subnets' = @(@{'name' = 'MySubnet'
                                                      'AddressPrefix' = '10.0.0.0/24'
                                                     })
                                     } -Force


VM Images (classic)

#Create a VM image (classic) resource in MyResourceGroup in the UK South region.
#Needs a valid VHD in a compatible storage account to work correctly

New-AzureRmResource -ResourceName 'MyVMImage' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicStorage/storageAccounts/vmImages' `
                    -Location $location `
                    -PropertyObject @{'Label' = 'MyVMImage Label'
                                      'Description' = 'MyVMImage Description'
                                      'OperatingSystemDisk' = @{'OsState' = 'Specialized'
                                                                'Caching' = 'ReadOnly'
                                                                'OperatingSystem' = 'Windows'
                                                                'VhdUri' = 'https://mystorageaccount.blob.core.windows.net/mycontainer/myvhd.vhd'}
                                     } -Force
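As a final sanity check – and assuming you are running a reasonably recent version of the AzureRM.Resources module, where Get-AzureRmResource supports filtering by Resource Group – you can verify that everything above has been created successfully:

#List every resource, including "classic" ones, residing in the Resource Group
Get-AzureRmResource -ResourceGroupName $rgName | Select-Object Name, ResourceType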

Conclusions or Wot I Think

The requirement to work with the cmdlets shown in this post should only really concern those who are maintaining “classic” resources as part of an ongoing deployment; it is therefore important to emphasise that you should not use these cmdlets to create resources for new projects. Alongside the additional complexity involved in constructing a New-AzureRmResource call, there is an abundance of new, updated AzureRM cmdlets at your disposal that enable you to create the correct types of resources more intuitively. The key benefit these examples provide is the ability to use a single Azure PowerShell module for the management of your entire Azure estate, as opposed to having to switch back and forth between different modules. It is perhaps a testament to how flexible Azure is that cmdlets like New-AzureRmResource exist in the first place, ultimately enabling anybody to fine-tune deployment and maintenance scripts to suit any conceivable situation.