The life of a Dynamics CRM/Dynamics 365 for Customer Engagement (CRM/D365CE) professional is one of continual learning across the many technology areas that make up the core “stack” of the Business Applications platform. Microsoft has made this clear in no uncertain terms recently via the launch of the Power Platform offering: cross-skilling across the various services associated with the Common Data Service is no longer optional if you wish to build out a comprehensive business solution. I would not be surprised in the slightest if the standard SSRS, Chart and Dashboarding options available within CRM/D365CE were deprecated soon, with Power BI becoming the preferred option for any reporting requirement involving the application. With this in mind, knowledge of Power BI becomes a critical requirement when developing and managing these applications, even more so when you consider that it is undoubtedly a core part of Microsoft’s product lineup; epitomised most clearly by the release of the Microsoft Certified Solutions Associate (MCSA) certification in BI Reporting earlier this year.

I have been doing a lot of hands-on and strategic work with Power BI this past year, a product for which I have a lot of affection and which has numerous business uses. As a consequence, I am in the process of going through the motions to attain the BI Reporting MCSA, having recently passed Exam 70-779: Analyzing and Visualizing Data with Microsoft Excel. As part of this week’s post, I wanted to share some general, non-NDA-breaching advice for those who are contemplating going for the exam. I hope you find it useful 🙂

Power BI Experience is Relevant

For an exam focused purely on the Excel side of things, there are a lot of areas tested that have a significant amount of crossover with Power BI, such as:

  • Connecting to data sources via Power Query in Excel, an experience which is an almost carbon copy of working with Power Query within Power BI.
  • Although working with the Excel Data Model represented, for me at least, a significant learning curve when revising, it has a lot in common with Power BI, specifically when it comes to how DAX fits into the whole equation.
  • Power BI is even a tested component for this exam. You should, therefore, expect to know how to upload Excel workbooks/Data Models into Power BI and be thoroughly familiar with the Power BI Publisher for Excel.

Any previous knowledge of working with Power BI is going to give you a jet boost when it comes to tackling this exam, but do not take this for granted. There are some significant differences between the two products (epitomised by the fact that Excel and Power BI, in theory, address two distinctly different business requirements), and this is something that you will need to understand, and potentially identify, during the exam. That said, specific, detailed knowledge of some of the inner workings of Power BI is not going to be a disadvantage to you.

Learn *a lot* of DAX

DAX, or Data Analysis Expressions, is hugely important for this exam, and for 70-778 as well. While you will not necessarily need to memorise every single DAX expression available to pass the exam (although you are welcome to try!), you should be in a position to recognise the structure of the more common DAX functions available. Your ideal DAX study areas before the exam may include, but are not limited to:

Focus, in particular, on the syntax of these functions, to the extent that you can recall example usage scenarios involving them.

Get the exam book

As with all exams, Microsoft has released an accompanying book that serves as a handy revision guide and reference point for self-study. On balance, I feel this is one of the better exam reference books that I have come across, but beware of the occasional errata and, given the frequency of changes these days thanks to the regular Office 365 release cycle, be sure to supplement your learning with proper online cross-checking.

Set up a dedicated lab environment

This task can be accomplished alongside working through the exercises in the exam book referenced above but, as with any exam, hands-on experience using the product is the best way of getting a passing grade. Download a copy of SQL Server Developer Edition, restore one of the sample databases made available by Microsoft, get a copy of Excel 2016 and – hey presto! – you now have a working lab environment & dataset that you can interact with to your heart’s content.
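
If you have not restored a sample database via T-SQL before, the snippet below is a minimal sketch of the process, assuming you have downloaded the AdventureWorks2012 .bak backup from Microsoft; the file paths and logical file names are placeholders, so run RESTORE FILELISTONLY first to confirm what your backup actually contains:

-- Check the logical file names contained within the backup (placeholder path shown).
RESTORE FILELISTONLY
FROM DISK = N'C:\Backups\AdventureWorks2012.bak';

-- Restore the database, moving the data and log files to local paths of your choosing.
RESTORE DATABASE AdventureWorks2012
FROM DISK = N'C:\Backups\AdventureWorks2012.bak'
WITH MOVE 'AdventureWorks2012_Data' TO N'C:\Data\AdventureWorks2012_Data.mdf',
	 MOVE 'AdventureWorks2012_Log' TO N'C:\Data\AdventureWorks2012_Log.ldf',
	 RECOVERY, STATS = 10;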

Pivot yourself toward greater Excel knowledge

Almost a quarter of the exam tests candidates on the broad range of PivotTable/PivotChart visualisations made available within Excel. With this in mind, some very detailed, specific knowledge is required in each of the following areas to stand a good chance of a passing grade:

  • PivotTables: How they are structured, how to modify the display of Totals/Subtotals, changing their layout configuration, filtering (Slicers, Timelines etc.), auto-refresh options, aggregating/summarising data and the difference between Implicit and Explicit Measures.
  • PivotCharts: What chart types are available in previous and newer versions of Excel (and which aren’t), understanding the ideal usage scenario for each chart type, understanding the different variants available for each chart type, understanding the structure of a chart (Legend, Axis etc.), and the chart filtering and formatting options available for each chart.

Check out the relevant edX course

As a revision tool, I found the following edX course to be of great assistance, and it is free of charge to work through:

Analyzing and Visualizing Data with Excel

The course syllabus maps closely to the skills measured for the exam and represents a useful visual tool for self-study or as a means of quickly filling any knowledge gaps.

Conclusions or Wot I Think

It is always essential, within the IT space, to keep one eye over the garden fence to see what is happening in related technology areas. This simple act of “keeping up with the Joneses” ensures there are no surprises down the road and that you can keep your skills relevant for the here and now. In case you didn’t realise already, Power BI is very much one of those things that traditional reporting analysts and CRM/D365CE professionals should be contemplating working with, now or in the immediate future. As well as being a dream to work with, it affords you the best opportunity to implement a reporting solution that will both excite end users and encourage innovation. For me, it has allowed client relationships to develop further once the solution is in place, as users increasingly ask us to bring other data sources into the solution. Whereas this may typically have resulted in a protracted and costly development cycle to implement, Power BI takes all the hassle out of this and lets us instead focus on creating the most engaging range of visualisations possible for the data in question. I would strongly urge any CRM/D365CE professional to start learning about Power BI when they can and, as the next logical step, look to go for the BI Reporting MCSA.

With two major Microsoft events taking place back to back over the last fortnight – Microsoft Inspire & the Business Applications Summit – there is, understandably, a plethora of major new announcements that concern those of us working in the Business Applications space today. The critical announcement from my perspective is the October 2018 Business Applications Release Notes, which gives us all a nice and early look at what is going to be released soon for Dynamics 365, Microsoft Flow, PowerApps, Power BI and other related services. Compared with previous Spring or Fall releases, the sheer breadth of features that now sit within the Business Applications space makes it all the more important to consider each new announcement carefully and to ensure it is adequately factored into any architectural decisions in the months ahead. If you are having trouble wading through all 239 pages of the document, then I have been through the notes and picked out what I feel are the most relevant highlights from a Dynamics CRM/Dynamics 365 Customer Engagement (D365CE) perspective, along with their potential impact or applicability to business scenarios.

SharePoint Integration with Portals

This is a biggie and a feature that no doubt many portal administrators have been clamouring for, with the only other options being a complicated SDK solution or a third-party vendor approach. Document management directly within CRM/D365CE has always been a sketchy idea at best when you consider the database size limitations of the application and the cost of additional database storage. That’s why SharePoint has always represented the optimal choice for storing any documents related to a record, facilitating a much more inexpensive route and affording opportunities to take advantage of the vast array of SharePoint features. When you start adding portals into the mix – for example, to enable customers to upload documents relating to a loan application – the whole thing currently falls flat on its face, as documents (to the best of my knowledge) can only be uploaded and stored directly within CRM/D365CE. With this limitation removed, a significant adoption barrier for CRM Portals will be eliminated, and I am pleased to also see an obligatory Power BI reference included as part of this announcement 🙂

In addition, we are providing the ability to embed Power BI charts within a portal, allowing users to benefit from the interactive visualizations of Power BI.

Portal Configuration Migration

Another process that can regularly feel disjointed and laborious is deploying Portal changes from Dev -> UAT/Test -> Production environments, with no straightforward means of packaging up changes via a Solution or similar for easy transportation. This torment promises to change as part of the release in October, thanks to the following:

To reduce the time and effort required to manage portal configuration across environments, we are publishing schema for configuration migration that works with the Configuration Migration SDK tool.

If you are not aware of the Configuration Migration tool, then you owe it to yourself to find out more about what it can accomplish, as I am sure it will take a lot of the headache out of migrating everyday business settings, product catalogue data or other non-solution customisation between multiple environments. The neat thing about this particular announcement is that an existing, well-established tool can be used to achieve these new requirements, as opposed to an entirely new, unfamiliar mechanism. Integration with the current Configuration Migration tool will surely help this solution be adopted more quickly and enable deployment profiles to be put together that contain nearly all the configuration data required for migration.

Portal Access Restrictions

In Portal terms, this is a relatively minor one, but a welcome addition nonetheless. When testing and developing any software application, it is always prudent to restrict access to only the users or organisations who require it. This option has not been available for Portals to date, but that changes thanks to the following announcement:

This feature would allow administrators to define a list of IP addresses that are allowed to access your portal. The allow list can include individual IP addresses or a range of IP addresses defined by a subnet mask. When a request to the portal is generated from any user, their IP address is evaluated against the allow list. If the IP address is not in the list, the portal replies with an HTTP 403 status code

The capabilities exposed here demonstrate a lot of parity with Azure Web Apps, which is, as I understand it, the service used to host portals. I would hope that we see more Azure Web App configuration features exposed to portal administrators in the years ahead.

Multi-resource Scheduling

There has been a real drive in recent years to make the Resource Scheduling experience within D365CE as visually appealing and feature-rich as possible. There is a specific reason for this – the introduction of Project Service Automation and Field Service capability requires it as an almost mandatory prerequisite. There is a wide array of new features relating to resource scheduling as part of this update, but the announcement that caught my eye, in particular, was the ability to group related resources on the Resource Scheduler as predefined “crews”. This new feature is hugely welcome for many reasons:

  • Different types of jobs/work may require resources with a specific set of skills in combination to complete.
  • It may be prudent to group specific resources if, for example, previous experience tells you that they work well together.
  • Location may be a factor as part of all this, meaning that by scheduling a “crew” of resources together within the same locale, you can reduce the unnecessary effort involved in travelling and ensure your resources are utilising their time more effectively.

The release notes give us a teaser of how this will look, and I am eager to see how it works in practice:

Leave and absence management in Dynamics 365 Talent

I have been watching with casual, distant interest how the Dynamics 365 Talent product has developed since its release, billed as one of the first applications built on top of the new Unified Interface/Common Data Service experience. Its primary utility to date has been geared more towards the Human Resources hiring and onboarding process, with a significant feature gap that other HR systems on the market today would more than happily fill by providing central hubs for policy documents, managing personal information and handling leave requests. I think there may be a recognition of this fact within Microsoft, which explains the range of new features contained within Dynamics 365 Talent as part of the October 2018 release. The new feature that best epitomises the application’s maturity is the ability to manage leave and absences, noted as follows:

Organizations can configure rules and policies related to their leave and absence plans. They can choose how employees accrue their time off, whether it’s by years of service or by hours worked. They also can configure when this time off can be taken and if certain types of time off must be taken before others. If they allow employees to get a pay-out of their time off, this can be configured as well.

Managers can see an all-up calendar view of their team members’ time off as well as company holidays and closures. This view shows them where they may have overlap as well as time-off trends for their team and enables them to drill down to gain a better understanding of an individual’s time off.

This immediately positions the system as a possible challenger to other HR systems and represents a natural, and much needed, coming-of-age development. I would certainly say that Dynamics 365 Talent is starting to become something that warrants much closer attention in the future.

Develop Microsoft Flows Using Visio

Microsoft Flow is great. This fact should be self-evident to regular followers of the blog. As a rapidly developing, relatively young product, though, it is understandable that some aspects of it require further work. An excellent example of this is the ability to manage the deployment of Flows between different environments or stages. While Flow’s big brother, Microsoft Logic Apps, has this pretty well covered, the ability to deploy development or concept Flows repeatedly often ends up being a case of manually creating each Flow again from scratch, which isn’t exactly fun.

The October release promises to change this with the introduction of a specific piece of integration with Microsoft Visio:

Microsoft Visio enables enterprises to capture their business processes using its rich modeling capabilities. Anyone who creates flowcharts or SharePoint workflows can now use Visio to design Microsoft Flow workflows. You can use Visio’s sharing and commenting capabilities to collaborate with multiple stakeholders and arrive at a complete workflow in little time. As requested here, you can publish the workflow to Microsoft Flow, then supply parameters to activate it.

This feature will be available to Visio Online Plan 2 subscription users. Office Insiders can expect early access in July 2018. In the future, you’ll also be able to export existing Flows and modify them in Visio.

Now, it’s worth noting, in particular, the requirement for Visio Online Plan 2 to accommodate this neat piece of functionality. But, assuming this is not an issue for your organisation, the potential here to define Flows locally, share them quickly for approval, and deploy them en masse is enormous, bringing a much-needed degree of automation to a product that currently does not support this. I’m looking forward to getting my hands on this in due course.

Custom Fonts in Power BI

Continuing the theme of obligatory Power BI references, my final pick has to be the introduction of Custom Fonts into Power BI, which will be in Public Preview as part of October’s release:

Corporate themes often include specific fonts that are distributed and used throughout the company. You can use those fonts in your Power BI reports.

For any font property, Power BI Desktop will show a complete list of all the fonts installed on your computer. You can choose from these to use in your report. When distributing the report, anyone with the font installed will see it reflected in the report. If the end user doesn’t have it installed, it falls back to the default font.

For those who have particular branding requirements that need accommodating within their Power BI Reports, this new feature completes the puzzle and takes you a step further in transforming your reports so that they are almost unrecognisable from a default Power BI Report. Hopefully, the preview period for this new feature will be relatively short, with a prompt rollout to general availability thereafter.

Conclusions or Wot I Think

The list above is just a flavour of my “choice cuts” – the most exciting features that will be in our hands within the next few months – and I really would urge you to read through the entire document if you have even a passing interest in any of the technologies included in these release notes. As you can tell, my list is ever so skewed towards Portals above everything else. This is for a good reason – ever since Microsoft’s acquisition of ADXStudio a few years back, we have seen some progress in the development of CRM Portals from Microsoft, mainly in the context of integrating the product more tightly for Online users. In my view, this has been the only significant effort we have seen in taking the product forward, with a relatively extensive list of backlog feature requests that looked to have been consigned to the recycling bin. The October release very much seems to flip this on its head, and I am pleased to discover a whole range of new, much clamoured-for features being made available on Portals, which take the product forward in strides and enable organisations to more easily contemplate their introduction.

As you will probably expect based on where things are going in the D365CE space at the moment, the announcements for Flow, PowerApps and the Common Data Service are all very much framed towards the end goal of integrating these and the “old” CRM/D365CE experience together as tightly as possible, a change that should be welcomed. The release notes are also crucial in highlighting the importance of anyone working in this space being as multi-skilled as possible from a technology standpoint. Microsoft is (quite rightly) encouraging all technology professionals to be fast and reactive to change, and expecting us to have a diverse range of skills to help the organisations/businesses we work with every day. There is no point in fighting this; the best way for you to succeed in this climate is to identify the relevant opportunities that you can drive forward from these product announcements and proactively implement them as part of the work you are doing each day. In a nutshell, you should know how to deploy a Power BI Dashboard, have familiarity with the types of services that Flow connects to, understand the difference between a Canvas and a Model-driven PowerApp and – amongst all of this – understand how D365CE solutions operate. Be a Swiss Army Knife as much as possible and deliver as much value and benefit in your role as you possibly can.

Slight change of pace with this week’s blog post, which will be a fairly condensed and self-indulgent affair – due to personal circumstances, I have been waylaid somewhat when it comes to producing content for the blog, and I have also been unable to make any further progress with my new YouTube video series. I am hoping that normal service will resume shortly, meaning additional videos and more content-rich blog posts, so stay tuned.

I’ve been running the CRM Chap blog for just over 2 years now. Over this time, I have been humbled and proud to have received numerous visitors to the site, some of whom have been kind enough to provide feedback or to share some of their Dynamics CRM/365 predicaments with me. Having reached such a landmark, now seems as good a time as any to take a look back on the posts that have received the most attention and, potentially, to give those who missed them the opportunity to read them. In descending order, here is the list of the most viewed posts to date on the crmchap.co.uk website:

  1. Utilising SQL Server Stored Procedures with Power BI
  2. Installing Dynamics CRM 2016 SP1 On-Premise
  3. Power BI Deep Dive: Using the Web API to Query Dynamics CRM/365 for Enterprise
  4. Utilising Pre/Post Entity Images in a Dynamics CRM Plugin
  5. Modifying System/Custom Views FetchXML Query in Dynamics CRM
  6. Grant Send on Behalf Permissions for Shared Mailbox (Exchange Online)
  7. Getting Started with Portal Theming (ADXStudio/CRM Portals)
  8. Microsoft Dynamics 365 Data Export Service Review
  9. What’s New in the Dynamics 365 Developer Toolkit
  10. Implementing Tracing in your CRM Plug-ins

I suppose it is a testament to the blog’s stated purpose that posts covering areas not exclusive to Dynamics CRM/365 rank so highly on the list; indeed, it represents how deeply this application is intertwined with other technology areas within the Microsoft “stack”.

To all new and long-standing followers of the blog, thank you for your continued support and appreciation for the content 🙂

The very recent Microsoft Data Amp event provided an excellent forum for the SQL Server 2017 announcement, which is due to be released at some point this year. Perhaps the most touted feature of the new version is that it will be available to be installed on Linux; an entirely inconceivable premise 10 years ago, which just goes to show how far Microsoft has changed in its approach to supporting non-Windows platforms as standard. Past the big headline announcements, there is a lot to look forward to under the hood with SQL Server 2017 that may act as encouragement for organisations looking to upgrade in the near future.

In this week’s post, I’ll be taking a closer look at the 3 new features I am most looking forward to, which are present within the SQL Server 2017 Community Technology Preview (CTP) 2.0 and will form part of the full release later this year.

Power BI in SSRS: A Match Made in Heaven

This is by far the feature I am most looking forward to seeing in action. I have been working more and more with Power BI this year, often diving into the deep end with respect to what can be achieved with the product, and I have been impressed with how it can be used to address reporting scenarios that SSRS may struggle with natively. The announcement earlier this year that Power BI would be included as part of SSRS in the next major release of the product was, therefore, incredibly welcome, and its arrival in SQL Server 2017 is confirmed by the presence of Power BI reports in the CTP 2.0 release.

For those who are already familiar with Power BI, there is thankfully not much that you need to learn to get up and running with Power BI in SSRS. One thing to point out is that you will need to download a completely separate version of the Power BI Desktop App to allow you to deploy your Power BI reports to SSRS. I would hope that this is mitigated once SQL Server 2017 is released so that we can deploy from just a single application for either Online or SSRS 2017. Users who are experienced with the existing Power BI Desktop application should have no trouble using the equivalent product for SSRS, as they are virtually identical.

The actual process of deploying a Power BI report is relatively straightforward. After making sure that you have installed the SSRS Power BI Desktop Application, you can then navigate to your SSRS homepage and select + New -> Power BI Report:

You will be greeted with a prompt similar to the below and the Power BI Desktop application will open automatically:

Now it’s time to build your report 🙂 As an example, I have used the WideWorldImporters sample database to build a simple Power BI report:

If you were working with Power BI online, then this would be the stage where you would click the Publish button to get the report onto your online Power BI tenant. The option to deploy to your SSRS instance is currently missing from the Power BI in SSRS application; instead, you will need to manually upload your .pbix file into Reporting Services via the Upload button. Once uploaded, your report will be visible on the home page and can be navigated to in the usual manner:

Simplified CSV Importing

Anyone who has at least some experience working with databases and application systems should have a good overview of the nuances of delimited flat file types – in particular, Comma Separated Value (.csv) files. This file type is generally the de facto format when working with data exported from systems and, more often than not, will be the most common file type that you will regularly need to import into a SQL Server database. Previously, if you didn’t opt to use the Import Wizard/.dtsx package to straightforwardly get your .csv file imported, you would have to rely on a script like the following example:

BULK INSERT dbo.TestTable
FROM 'C:\Test.csv'
WITH
	(
		FIELDTERMINATOR = ',',
		ROWTERMINATOR = '\n'
	)

Now, with SQL Server 2017, you can simplify your query by replacing FIELDTERMINATOR and ROWTERMINATOR with a new FORMAT parameter that specifies the file format we are concerned with:

BULK INSERT dbo.TestTable
FROM 'C:\Test.csv'
WITH (FORMAT = 'CSV');

Whilst the overall impact on your query length is somewhat negligible, it is nice that a much simpler means of accomplishing a common database task has been introduced, and that we now also have the option of accessing Azure Blob Storage locations for import files.
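
To illustrate the Azure Blob Storage option, the sketch below shows roughly how it fits together; the storage account, container and SAS token are placeholders, and you will need a database master key in place before creating the scoped credential:

-- Credential and external data source pointing at the blob container (illustrative names only).
CREATE DATABASE SCOPED CREDENTIAL BlobImportCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
	SECRET = '<SAS token, without the leading ?>';

CREATE EXTERNAL DATA SOURCE BlobImportSource
WITH
	(
		TYPE = BLOB_STORAGE,
		LOCATION = 'https://mystorageaccount.blob.core.windows.net/imports',
		CREDENTIAL = BlobImportCredential
	);

-- The bulk import then references the file relative to the container.
BULK INSERT dbo.TestTable
FROM 'Test.csv'
WITH (DATA_SOURCE = 'BlobImportSource', FORMAT = 'CSV');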

Updated Icons for SSMS

Typically, as part of any major update to an application, the “under the hood” visual side of things is generally not changed much. A good example of this can be found within CRM Online/Dynamics 365 for Enterprise in the Customizations area of the application, which has not seen much of a facelift since CRM 2011. As a result, a lot of the icons can look inconsistent with the application as a whole. As these are generally the areas of the application that we use the most day in, day out, it can be a little discouraging not to see them get any love or attention as part of a major update… 🙁

With this in mind, it is pleasing to see that the updated SSMS client for SQL Server 2017 has been given refreshed icons that bring the application more in line with how Visual Studio and other Microsoft products are looking these days. Below is a screenshot comparing SSMS 2014 with SSMS 2017:

Conclusions or Wot I Think

Whilst there is a lot more to look forward to with the new release that is not covered in this post (for example, the enhancements to R Server and deeper integration with AI tools), I believe that the most exciting and important announcement for those with their Business Intelligence/Reporting hats on is the introduction of Power BI into SSRS. Previously, each tool was well suited to a specific reporting purpose – SSRS was great for designing reports that require a lot of visual tailoring and support widely used export formats, whereas Power BI is more geared towards real-time, dashboard views that marry together disparate data sources in a straightforward way. By being able to leverage SSRS to fully utilise Power BI reports, the application suddenly becomes a lot more versatile and the potential for combining functionality becomes a lot more apparent. So, for example, having the ability to drill down to an SSRS report from a Power BI report would be an excellent way of providing reporting capabilities that satisfy end-user consumption in 2 different, but widely applicable, scenarios.

In summary, the SQL Server 2017 release looks to be very much focused on bringing the product up to date with the new state of play at Microsoft, successfully managing to achieve cross-platform support alongside bringing exciting functionality (that was previously cloud-only) into the hands of organisations who still have a requirement to run their database systems on their own on-premises infrastructure. I’m eagerly looking forward to the release later this year and to seeing the product perform in action. 🙂

Those who have experience working with an RDBMS like SQL Server will become accustomed to a certain way of going about things. This can often involve a mixture of “lazy” query writing (e.g. using SELECT *… as opposed to SELECT Column1, Column2…), the manner in which you write your query (ALL CAPS or lower case) and best practice approaches. One arguable example of a best practice approach is the use of Stored Procedures. An illustration of how to use a Stored Procedure can most readily demonstrate their benefits. Take a look at the T-SQL query below, which should execute fine against the AdventureWorks2012 sample database:

SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]
FROM [HumanResources].[Employee] AS E
 INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH
  ON E.[BusinessEntityID] = DH.[BusinessEntityID]
 INNER JOIN [HumanResources].[Department] AS D
  ON DH.[DepartmentID] = D.[DepartmentID]
 INNER JOIN [Person].[Person] AS P
  ON E.[BusinessEntityID] = P.[BusinessEntityID]
WHERE DH.[EndDate] IS NULL
AND E.[JobTitle] = 'Production Technician - WC50'

The query returns the data we need, but not in the most optimal or secure manner. Consider the following:

  • Executing a query like the above, in-code, as part of an end-user application could expose your database to the risk of an SQL Injection attack or similar.
  • The query exposes a lot of information regarding our underlying database structure, information which any client executing the query neither cares about nor should have to worry about.
  • The example is a very precise query, with a specific function – i.e. get me all the current employees who have the Job Title of Production Technician – WC50. If we wanted to modify it to instead obtain all Senior Tool Designers, we would have to write a completely separate query to accommodate this.

Implementing a Stored Procedure to encapsulate our query logic immediately addresses the above concerns, providing us with a single-line query into the database, giving us just the data we need and enabling us to utilise the query for other scenarios as well. Setting one up is very straightforward via the CREATE PROCEDURE command – the rest is pretty much what we have put together already:

CREATE PROCEDURE dbo.uspGetEmployeesByJobTitle 
	@JobTitle NVARCHAR(50)
AS
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

	SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]
	FROM [HumanResources].[Employee] AS E
	 INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH
	  ON E.[BusinessEntityID] = DH.[BusinessEntityID]
	 INNER JOIN [HumanResources].[Department] AS D
	  ON DH.[DepartmentID] = D.[DepartmentID]
	 INNER JOIN [Person].[Person] AS P
	  ON E.[BusinessEntityID] = P.[BusinessEntityID]
	WHERE DH.[EndDate] IS NULL
	AND E.[JobTitle] = @JobTitle
END
GO

By utilising a parameter for our WHERE clause filter on the Job Title, we can pass any valid value to our stored procedure, immediately making our initial query more versatile across our reporting/business application. And, as an added bonus, we can now safely take a 10-line query down to 1:

EXECUTE dbo.uspGetEmployeesByJobTitle @JobTitle = 'Senior Tool Designer'

So we have established that Stored Procedures are wicked cool awesome – but what does this have to do with PowerBI?!?

Having worked with SQL Server Reporting Services (SSRS) extensively in the past, I have become accustomed to using Stored Procedures as a mechanism for storing underlying query logic within the database and as a straightforward means of referencing this from my .rdl file. I can only assume from this that it is the “norm” and the preferred method of querying SQL data, as opposed to a direct SELECT statement.

When recently doing some work within PowerBI involving Azure SQL Databases, I was, therefore, surprised that there was no option to return data via a stored procedure by default. Instead, PowerBI would prefer me to directly query the underlying table/view objects:

Thankfully, when you inspect the underlying PowerQuery used to return an example table from the above, it doesn’t use any kind of SELECT query to get the data:

let
    Source = Sql.Databases("mydatabaseinstance"),
    AdventureWorks2012 = Source{[Name="AdventureWorks2012"]}[Data],
    Production_ProductModel = AdventureWorks2012{[Schema="Production",Item="ProductModel"]}[Data]
in
    Production_ProductModel

Unfortunately, the same cannot be said if you select the Advanced options area and input your own SQL query directly:

let
    Source = Sql.Database("mydatabaseinstance", "AdventureWorks2012", [Query="SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]#(lf)FROM [HumanResources].[Employee] AS E#(lf) INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH#(lf)  ON E.[BusinessEntityID] = DH.[BusinessEntityID]#(lf) INNER JOIN [HumanResources].[Department] AS D#(lf)  ON DH.[DepartmentID] = D.[DepartmentID]#(lf) INNER JOIN [Person].[Person] AS P#(lf)  ON E.[BusinessEntityID] = P.[BusinessEntityID]#(lf)WHERE DH.[EndDate] IS NULL#(lf)AND E.[JobTitle] = 'Senior Tool Designer'"])
in
    Source

I do NOT recommend you use the above method to query your SQL Server data!

I have spoken previously on the blog about conventions for working with datasets, i.e. only get what you need, and nothing else. As I work more and more with PowerBI, the tool very much seems to be geared towards flipping this mentality on its head. PowerBI has a number of built-in tools that seem to scream out “Just get ALL your data in here, we’ll worry about the rest!”. I realise that the difference between MB and GB these days, from a storage/cost point of view, is minimal; nevertheless, I still believe it is prudent not to put all your eggs in one basket and to ensure that your business data is not being stored in cavalier fashion across a multitude of different cloud services.

With this in mind, it is good to know that you can utilise stored procedures in PowerBI. You basically have two ways in which this can be achieved:

  • Going back to the Advanced options screen above on the SQL Server database wizard, you can EXECUTE your stored procedure directly using the following SQL statement:
DECLARE @SP VARCHAR(100) = 'dbo.uspGetEmployeesByJobTitle @JobTitle = ''Senior Tool Designer'''
EXEC (@SP)

Be sure to specify your database and don’t forget to double up the single quotes around the parameter value!

  • If you prefer to use PowerQuery as opposed to the wizard above, then the following code will also work:
let
    Source = Sql.Database("mydatabaseinstance", "AdventureWorks2012", [Query="DECLARE @SP VARCHAR(100) = 'dbo.uspGetEmployeesByJobTitle @JobTitle = ''Senior Tool Designer'''#(lf)EXEC (@SP)"])
in
    Source

In both cases, you will be required to authenticate with the database and your result set should return as follows if using the AdventureWorks2012 example database/code:

Finally, as a best-practice security step, you should ensure that the account connecting to your SQL Server instance is restricted to EXECUTE permissions on only the procedures you have specified. This can be achieved via the following T-SQL snippet, executed against your database instance:

GRANT EXECUTE ON OBJECT::dbo.uspMyStoredProcedure  
    TO MySQLServerLogin 
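
The snippet above assumes the login and database user already exist; if they do not, a minimal sketch for provisioning a dedicated, least-privilege account (the names and password are purely illustrative) might look like the following:

-- Server-level login (SQL authentication must be enabled on the instance).
CREATE LOGIN MySQLServerLogin WITH PASSWORD = 'Use-A-Strong-Password-Here-1';

-- Database user mapped to the login; run this within the target database.
CREATE USER MySQLServerLogin FOR LOGIN MySQLServerLogin;

-- Grant EXECUTE on the specific procedure only - no table-level SELECT permissions are required.
GRANT EXECUTE ON OBJECT::dbo.uspGetEmployeesByJobTitle TO MySQLServerLogin;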

Conclusions or Wot I Think

PowerBI is becoming an increasingly relevant tool for professionals experienced in traditional Business Intelligence/Reporting Services. The bellwether for this can surely be seen in the current Technical Preview of SQL Server Reporting Services, which includes PowerBI reports built into the application. Although we have no timescales at this stage as to when the next major version of SQL Server will be released, it is reasonable to assume it will arrive by the end of this year at the earliest, bringing PowerBI reports with it as a new feature. I am really excited about the introduction of PowerBI into SSRS, as it would appear to be a match made in heaven – giving those with experience in both products the flexibility to develop a unified, best-of-breed solution, using traditional .rdl reporting capability and/or PowerQuery/DAX functionality.

With the above on the horizon, the ability to integrate seamlessly with SQL Server and support for traditional, well-proven practices become crucial indicators of whether this match will be over before the honeymoon. And so, I would hope to see the option to access SQL Server data via Stored Procedures become standard when using the built-in data connector within PowerBI. Based on the feedback I have seen online, I would wager that this feature would be most welcome, and an excellent way of reducing the need for direct coding to achieve a common requirement within PowerBI.