The recent Microsoft Data Amp event provided an excellent forum for the SQL Server 2017 announcement, which is due for release at some point this year. Perhaps the most touted feature of the new version is that it will be available to install on Linux – an entirely inconceivable premise 10 years ago, which just goes to show how far Microsoft have moved towards supporting non-Windows platforms as standard. Past the big headline announcements, there is a lot to look forward to under the hood with SQL Server 2017 that may act as encouragement for organisations looking to upgrade in the near future.

In this week’s post, I’ll be taking a closer look at the 3 new features I am most looking forward to, which are present within the SQL Server Community Technology Preview (CTP) 2.0 release and will form part of the SQL Server 2017 release later this year.

Power BI in SSRS: A Match Made in Heaven

This is by far the feature I am most looking forward to seeing in action. I have been working more and more with Power BI this year, often diving into the deep end with respect to what can be achieved with the product, and I have been impressed with how it can be used to address reporting scenarios that SSRS may struggle with natively. The announcement earlier this year that Power BI would be included as part of SSRS in the next major release of the product was, therefore, incredibly welcome, and its place in SQL Server 2017 is confirmed by the inclusion of Power BI reports in the CTP 2.0 release.

For those who are already familiar with Power BI, there is thankfully not much that you need to learn to get up and running with Power BI in SSRS. One thing to point out is that you will need to download a completely separate version of the Power BI Desktop App to allow you to deploy your Power BI reports to SSRS. I would hope that this is addressed once SQL Server 2017 is released, so that we can deploy to either Power BI Online or SSRS 2017 from a single application. Users who are experienced with the existing Power BI Desktop application should have no trouble using the equivalent product for SSRS, as they are virtually identical.

The actual process of deploying a Power BI report is relatively straightforward. After making sure that you have installed the SSRS Power BI Desktop Application, you can then navigate to your SSRS homepage and select + New -> Power BI Report:

You will be greeted with a prompt similar to the below and the Power BI Desktop application will open automatically:

Now it’s time to build your report 🙂 As an example, I have used the WideWorldImporters sample database to build a simple Power BI report:

If you were working with Power BI Online, then this would be the stage where you would click the Publish button to get the report onto your online Power BI tenant. The option to deploy to your SSRS instance is currently missing from the Power BI in SSRS application; instead, you will need to manually upload your .pbix file into Reporting Services via the Upload button. Once uploaded, your report will be visible on the home page and can be navigated to in the usual manner:

Simplified CSV Importing

Anyone who has at least some experience working with databases and application systems should have a good overview of the nuances of delimited flat file types – in particular, Comma Separated Value (.csv) files. This file type is generally the de-facto format when working with data exported from systems and, more often than not, will be the most common file type that you will regularly need to import into a SQL Server database. Previously, if you didn’t opt to use the Import Wizard/.dtsx package to straightforwardly get your .csv file imported, you would have to rely on the following example script:

BULK INSERT dbo.TestTable
FROM 'C:\Test.csv'
WITH
	(
		FIELDTERMINATOR = ',',
		ROWTERMINATOR = '\n'
	)

Now, with SQL Server 2017, you can simplify your query by replacing FIELDTERMINATOR and ROWTERMINATOR with a new FORMAT parameter that specifies the file format we are concerned with:

BULK INSERT dbo.TestTable
FROM 'C:\Test.csv'
WITH (FORMAT = 'CSV');

Whilst the overall impact on your query length is somewhat negligible, it is nice that a much simpler means of accomplishing a common database task has been introduced, and that we now also have the option of accessing Azure Blob Storage locations for import files.
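As a rough sketch of the latter (the storage account, container, credential and data source names below are all placeholders, and the Shared Access Signature secret has been elided), you would register the Blob Storage location as an external data source and then point BULK INSERT at it:

--Store the SAS token so that SQL Server can authenticate against Blob Storage
--(this requires a database master key to already exist)
CREATE DATABASE SCOPED CREDENTIAL BlobStorageCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '...';

--Register the Blob Storage container as an external data source
CREATE EXTERNAL DATA SOURCE BlobStorageImports
WITH
	(
		TYPE = BLOB_STORAGE,
		LOCATION = 'https://mystorageaccount.blob.core.windows.net/mycsvfiles',
		CREDENTIAL = BlobStorageCredential
	);

--Import the file straight from Azure, reusing the new FORMAT parameter
BULK INSERT dbo.TestTable
FROM 'Test.csv'
WITH (DATA_SOURCE = 'BlobStorageImports', FORMAT = 'CSV');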

Updated Icons for SSMS

Typically, as part of any major update to an application, the visual side of its “under the hood” areas does not change much. A good example of this can be found within CRM Online/Dynamics 365 for Enterprise in the Customizations area of the application, which has not seen much of a facelift since CRM 2011; as a result, a lot of the icons there can look inconsistent with the application as a whole. As these are generally the areas of the application that we use the most day in, day out, it can be a little discouraging not to see them get any love or attention as part of a major update… 🙁

With this in mind, it is pleasing to see that the updated SSMS client for SQL Server 2017 has been given refreshed icons that bring the application more in line with how Visual Studio and other Microsoft products look these days. Below is a screenshot comparing SSMS 2014 with SSMS 2017:

Conclusions or Wot I Think

Whilst there is a lot more to look forward to with the new release that is not covered in this post (for example, the enhancements to R Server and deeper integration with AI tools), I believe that the most exciting and important announcement for those with their Business Intelligence/Reporting hats on is the introduction of Power BI into SSRS. Previously, each tool was well suited to a specific reporting purpose – SSRS was great for designing reports that require a lot of visual tailoring and widely used export formats, whereas Power BI is more geared towards real-time, dashboard views that marry together disparate data sources in a straightforward way. By being able to leverage SSRS to fully utilise Power BI reports, the application suddenly becomes a lot more versatile and the potential for combining their functionality becomes much easier to recognise. For example, having the ability to drill down to an SSRS report from a Power BI report would be an excellent way of providing reporting capabilities that satisfy end-user consumption in 2 different, but widely applicable, scenarios.

In summary, the SQL Server 2017 release looks to be very much focused on bringing the product up to date with the new state of play at Microsoft, successfully marrying cross-platform support with exciting functionality (previously cloud-only) for organisations that still have a requirement to run their database systems on on-premise infrastructure. I’m eagerly looking forward to the release later this year and to seeing the product perform in action. 🙂

When you have spent any length of time working with Dynamics CRM Online/Dynamics 365 for Enterprise (D365E) data programmatically, you become accustomed to how Option Set, State and Status Reason values are presented to you in code. To explain: the application does not store your Option Set value display names within the SQL Server entity tables; rather, the Option Set Value that has been specified alongside your Label is what is stored, as an integer value. That is why you are always prompted to provide both values within the application:

The following benefits are realised as a result of how this is set up:

That being said, when working with these field types in code, you do always have to have the application window open or a list of all Labels/Values to hand so that you don’t get too confused… 🙂

I have previously extolled the virtues of the Data Export Service on the blog, and why you should consider it if you have basic integration requirements for your CRM/D365E deployment. One area in which it differs from other products on the market is how it handles the field types discussed above. For example, when exporting data to a SQL database via Scribe Online, new columns are created alongside the originals that contain the “Display Name” (i.e. label value) corresponding to each Option Set, Status and Status Reason value. So, by running the following query against a Scribe export database:

SELECT DISTINCT statecode, statecode_displayname
FROM dbo.account

We get the best of both worlds – our underlying statecode value and its display name – all in 2 lines of code:

This is a big help, particularly when you are then using the data as part of a report, as no additional transformation steps are required and your underlying SQL query can be kept as compact as possible.

The Data Export Service differs from the above in an understandable way, as display name values for Status, Status Reason and Option Set column values are instead segregated out into their own separate table objects in your Azure SQL database:

  • OptionSetMetadata
  • GlobalOptionSetMetadata
  • StateMetadata
  • StatusMetadata

Why understandable? If you consider how the application can support multiple languages, then you realise that this also applies to metadata objects across the application – such as field names, view names and, wouldn’t you know it, Labels too. So when we inspect the OptionSetMetadata table, we can see that the table structure accommodates the storing of labels in multiple languages via the LocalizedLabelLanguageCode field:

Unlike the Scribe Online route above (which I assume only retrieves the Labels that correspond to the user account that authenticates with CRM/D365E), the Data Export Service becomes instantly more desirable if you are required to build multi-language reports referencing CRM/D365E application data.
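As a quick way of seeing this in action, the following query (using the same columns that the function examples later in this post rely on) returns every Option Set label stored for the account entity, alongside its language code:

SELECT OptionSetName, [Option], LocalizedLabel, LocalizedLabelLanguageCode
FROM dbo.OptionSetMetadata
WHERE EntityName = 'account'
ORDER BY OptionSetName, [Option], LocalizedLabelLanguageCode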

The issue that you have to reconcile yourself with is that your SQL queries, if being expressed as natively as possible, instantly become a whole lot more complex. For example, to achieve the same results as the query above, it would have to be adapted as follows for the Data Export Service:

SELECT DISTINCT statecode, LocalizedLabel
FROM dbo.account
 LEFT JOIN dbo.StateMetadata
  ON 'account' = EntityName
  AND statecode = [State]
  AND '1033' = LocalizedLabelLanguageCode

The above is a very basic example, but if your query is complex – and involves multiple Option Set values – then you would have to resort to using Common Table Expressions (CTEs) to accommodate each potential JOIN required to get the information you want.
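For instance, a sketch of the earlier query that also returns the Status Reason label already needs a second JOIN (and table aliases, to keep the two LocalizedLabel columns apart), and the pattern repeats for every additional Option Set field you want to label:

SELECT DISTINCT statecode, SM.LocalizedLabel AS statecode_displayname,
	statuscode, STM.LocalizedLabel AS statuscode_displayname
FROM dbo.account
 LEFT JOIN dbo.StateMetadata AS SM
  ON 'account' = SM.EntityName
  AND statecode = SM.[State]
  AND '1033' = SM.LocalizedLabelLanguageCode
 LEFT JOIN dbo.StatusMetadata AS STM
  ON 'account' = STM.EntityName
  AND statuscode = STM.[Status]
  AND '1033' = STM.LocalizedLabelLanguageCode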

In these moments, we can look at some of the wider functionality provided as part of SQL Server to develop a solution that will keep things as simple as possible and, in this particular instance, a user-defined function is an excellent candidate to consider. These enable you to perform complex operations against the database platform and encapsulate them within very simply expressed objects that can also accept parameters. The good thing about functions is that they can be used to return table objects and scalar (i.e. single) objects.

Using a scalar function, we can, therefore, remove some of the complexity behind returning Option Set, Status and Status Reason labels by creating a function that returns the correct label, based on input parameters received by the function. You could look at creating a “master” function that, based on the input parameters, queries the correct Metadata table for the information you need; but in this example, we are going to look at creating a function for each type of field – Status, Status Reason, Option Set and Global Option Set.

To do this, connect up to your Data Export Service database and open up a new query window, ensuring that the context is set to the correct database. Paste the following code in the window and then hit Execute:

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

--Create Function to return Global Option Set Labels

CREATE FUNCTION [dbo].[fnGetGlobalOptionSetLabel]
(
	@GlobalOptionSetName NVARCHAR(64), --The logical name of the Global Option Set
	@Option INT, --The option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.GlobalOptionSetMetadata WHERE OptionSetName = @GlobalOptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.GlobalOptionSetMetadata WHERE OptionSetName = @GlobalOptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		SET @Label = CAST('An error has occurred. Could not obtain label for Global Option Set field ' + @GlobalOptionSetName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Option Set Labels

CREATE FUNCTION [dbo].[fnGetOptionSetLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Option Set field
	@OptionSetName NVARCHAR(64), --The logical name of the Option Set field
	@Option INT, --The option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.OptionSetMetadata WHERE EntityName = @EntityName AND OptionSetName = @OptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.OptionSetMetadata WHERE EntityName = @EntityName AND OptionSetName = @OptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		SET @Label = CAST('An error has occurred. Could not obtain label for Option Set field ' + @OptionSetName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Status Labels

CREATE FUNCTION [dbo].[fnGetStateLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Status field
	@State INT, --The Status option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.StateMetadata WHERE EntityName = @EntityName AND [State] = @State AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.StateMetadata WHERE EntityName = @EntityName AND [State] = @State AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		SET @Label = CAST('An error has occurred. Could not obtain State label for entity ' + @EntityName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Status Reason Labels

CREATE FUNCTION [dbo].[fnGetStatusLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Status Reason field
	@Status INT, --The Status Reason option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.StatusMetadata WHERE EntityName = @EntityName AND [Status] = @Status AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.StatusMetadata WHERE EntityName = @EntityName AND [Status] = @Status AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		SET @Label = CAST('An error has occurred. Could not obtain Status label for entity ' + @EntityName AS INT);
	RETURN @Label;

END

GO

This will go off and create the functions listed in the code, which should then show up under the Programmability folder on your SQL database:

For those who are unsure what the SQL code is doing, it first attempts to determine whether exactly 1 Label can be found for the appropriate field type, based on the parameters provided. If it is successful, then the label is returned; otherwise, the CAST function is designed to force an error back to the caller, indicating that zero or more than 1 value was found. In most cases, this would indicate a typo in the parameters you have specified.
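If you have not come across this CAST trick before, you can see the behaviour for yourself with a one-line query – the message is smuggled out to the caller inside the conversion error text:

--Fails with: Conversion failed when converting the varchar value 'Oops!' to data type int.
SELECT CAST('Oops!' AS INT)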

As with anything, the best way to see how something works is in practice! So, returning to the earlier example in this post, we would utilise the dbo.fnGetStateLabel function as follows to return the correct label in English:

SELECT DISTINCT statecode, dbo.fnGetStateLabel('account', statecode, 1033) AS statecode_displayname
FROM dbo.account

With our results returning as follows:

Now we can expose this through our reports without having to worry about any kind of transformation/lookup table to get around the issue 🙂
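And because each function follows the same signature pattern, they can be freely mixed within a single query. Here is a quick sketch, assuming that the account entity and its industrycode Option Set field have both been included in your export profile:

SELECT name,
	dbo.fnGetStateLabel('account', statecode, 1033) AS statecode_displayname,
	dbo.fnGetStatusLabel('account', statuscode, 1033) AS statuscode_displayname,
	dbo.fnGetOptionSetLabel('account', 'industrycode', industrycode, 1033) AS industrycode_displayname
FROM dbo.account

One thing to bear in mind is that scalar functions execute once per row, so for very large result sets the JOIN-based approach may still perform better.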

Encapsulating complex functionality into simply and clearly expressed functions is an excellent way of keeping your code as streamlined as possible, and also ensures that other colleagues can accomplish complex tasks, even if they do not have in-depth knowledge of Transact-SQL.

Although CRM Online/Dynamics 365 for Enterprise (D365E) provides a plethora of different tools aimed at satisfying reporting requirements for users of the application, you are restricted in how data can be queried within the application. For example, you cannot just connect straight up to the application’s SQL database and start writing stored procedures that perform complex data transformations or joins. Traditionally, to achieve this, you would need to look at one of the several tools in the marketplace that enable you to export your data into a format of your choosing; or even take the plunge and get a developer to write a custom application that satisfies your integration requirements.

With the recent D365E release, and in line with Microsoft’s longstanding approach to customer data within their applications (i.e. “It’s yours! So just do what you want with it!”), the parallel introduction of the Data Export Service last year further consolidates this philosophy and adds an arguably game-changing tool to the product’s arsenal. By using the service, relatively straightforward integration requirements can be satisfied in a pinch, and a lot of the headache involved in setting up a backup of your organisation’s databases/LOB reporting application can be eliminated. Perhaps the most surprising and crucial aspect of all of this is that using the tool is not going to break the bank either.

In this week’s blog post, I’m going to take a closer look at just what the Data Export Service is, the setup involved and the overall experience of using the service from end-to-end.

What is the Data Export Service?

The Data Export Service is a new, free*, add-on for your CRM/D365E subscription, designed to accomplish basic integration requirements. Microsoft perhaps provides the best summary of what the tool is and what it can achieve via TechNet:

The Data Export Service intelligently synchronizes the entire Dynamics 365 data initially and thereafter synchronizes on a continuous basis as changes occur (delta changes) in the Microsoft Dynamics 365 (online) system. This helps enable several analytics and reporting scenarios on top of Dynamics 365 data with Azure data and analytics services and opens up new possibilities for customers and partners to build custom solutions.

The tool is compatible with versions 8.0, 8.1 and 8.2 of the application, which correspond to the following releases of CRM Online/D365E:

  • Dynamics CRM Online 2016
  • Dynamics CRM Online 2016 Update 1
  • Dynamics 365 December Update

*You will still need to pay for all required services in Azure, but the add-on itself is free to download.

The Installation Process

Getting everything configured for the Data Export Service can prove to be the most challenging – and potentially alienating – part of the entire process. For this, you will need the following at your disposal:

  • An active Azure Subscription.
  • An Azure SQL Server configured with a single database, or an Azure VM running SQL Server. Microsoft recommends a Premium P1 database or better if you are using an Azure SQL database, but I have been able to get the service working without any issue on S0 tier databases. This is an important point to make, given that the cost difference per month can amount to hundreds of pounds.
  • An Azure Key Vault. This is what will securely store the credentials for your DB.
  • PowerShell and access to the Azure Resource Manager (AzureRM) cmdlets. PowerShell can be installed as an OS feature on Windows-based platforms, and can now be downloaded onto OS X/Linux as well. PowerShell is required to create an Azure Key Vault, although you can also use it to create your Azure SQL Server instance/Windows VM with SQL Server.

It is therefore recommended that you have at least some experience in how to use Azure – such as creating Resource Groups, deploying individual resources, how the interface works etc. – before you start setting up the Data Export Service. Failing this, you will have to kindly ask your nearest Azure whizz for assistance 🙂 Fortunately, if you know what you’re doing, you can get all of the above set up very quickly; in some cases, in less than 10 minutes if you opt to script out the entire deployment via PowerShell.

For your setup with D365E, all that is required is the installation of the approved solution via the Dynamics 365 Administration Centre. Highlight the instance that you wish to deploy to and click on the pen icon next to Solutions:

Then click on the Solution with the name Data Export Service for Dynamics 365 and click the Install button. The installation process will take a couple of minutes, so keep refreshing the screen until the Status is updated to Installed. Then, within the Settings area of the application, you can access the service via the Data Export icon:

Because the Data Export Service is required to automatically sign into an external provider, you may also need to verify that your web browser pop-up settings/firewall are configured to allow the https://discovery.crmreplication.azure.net/ URL. Otherwise, you are likely to encounter a blank screen when attempting to access the Data Export Service for the first time. You will know everything is working correctly when you are greeted with a screen similar to the below:

Setting up an Export Profile

After accepting the disclaimer and clicking on the New icon, you will be greeted with a wizard-like form, enabling you to specify the following:

  • Mandatory settings required, such as the Export Profile Name and the URL to your Key Vault credentials.
  • Optional settings, such as which database schema to use, any object prefix that you would like to use, retry settings and whether you want to log when records are deleted.
  • The Entities you wish to use with the Export Service. Note that, although most system entities will be pre-enabled to use this service, you will likely need to go into Customizations and enable any additional entities you wish to utilise with the service via the Change Tracking option:

  • Any Relationships that you want to include as part of the sync: to clarify, this is basically asking if you wish to include any default many-to-many (N:N) intersect tables as part of your export profile. The list of available options for this will depend on which entities you have chosen to sync. For example, if you select the Account, Lead and Product entities, then the following intersect tables will be available for synchronisation:

Once you have configured your profile and saved it, the service will then attempt to start the import process.

The Syncing Experience A.K.A Why Delta Syncing is Awesome

When the service first starts to sync, one thing to point out is that it may initially return a result of Partial Success and show that it has failed for multiple entities. In most cases, this will be because certain entities’ dependent records have not yet been synced across (for example, any Opportunity record that references the Account Test Company ABC Ltd. will not sync until that Account record has been exported successfully). So, rather than attempting to interrogate the error logs straightaway, I would suggest holding off a while. As you may also expect, the first sync will take some time to complete, depending on the number of records involved. My experience, however, suggests it is reasonably quick – for example, just under 1 million records took around 3 hours to sync. I anticipate that the fact that the service is essentially an Azure-to-Azure export no doubt helps in ensuring smooth data transit.

Following on from the above, syncs will then take place as and when entity data is modified within the application. The delay between this appears to be very small indeed – often tens of minutes, if not minutes itself. This, therefore, makes the Data Export Service an excellent candidate for a backup/primary reporting database to satisfy any requirements that cannot be achieved via FetchXML alone.

One small bug I have observed is with how the application deals with the listmember intersect entity. You may get errors thrown back indicating that records failed to sync across successfully, which proves not to be the case upon closer inspection. Hopefully this is something that will get ironed out; it appears to be due to the rather strange way that the listmember entity behaves when interacting with it via the SDK.

Conclusions or Wot I Think

For a free add-on service, I have been incredibly impressed by the Data Export Service and what it can do. For those who have previously had to fork out big bucks for services such as Scribe Online or KingswaySoft to achieve very basic replication/reporting requirements within CRM/D365E, the Data Export Service offers an inexpensive way of replacing them. That’s not to say that the service should be your first destination if your integration requirements are complex – for example, integrating Dynamics 365 with SAP/Oracle ERP systems. In such cases, the names mentioned above will no doubt remain the best services for achieving your requirements in a straightforward way.

I also have a few concerns that the setup involved with the Data Export Service could be a barrier towards its adoption. As mentioned above, experience with Azure is a mandatory requirement to even begin contemplating getting set up with the tool, and your organisation may also need to reconcile itself with utilising Azure SQL databases or SQL Server instances on Azure VMs. Hopefully, as time goes on, we may start to see the setup process simplified – for example, having the Export Profile wizard go off and create all the required resources in Azure by simply entering your Azure login credentials.

The D365E release has brought a lot of great new functionality and features to the table that have been oft requested and add real benefit to organisations who already use, or plan to use, the application in the future. The Data Export Service is perhaps one of the great underdog features of the release, and one that you should definitely consider using if you want a relatively smooth-sailing data export experience.

Those who have experience working with an RDBMS like SQL Server will become accustomed to a certain way of going about things. These often involve a mixture of “lazy” query writing (e.g. using SELECT *… as opposed to SELECT Column1, Column2…), stylistic habits in how you write your queries (ALL CAPS or lower case) and best practice approaches. One arguable example of a best practice approach is the use of Stored Procedures, and an illustration can most readily demonstrate their benefits. Take a look at the T-SQL query below, which should execute fine against the AdventureWorks2012 sample database:

SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]
FROM [HumanResources].[Employee] AS E
 INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH
  ON E.[BusinessEntityID] = DH.[BusinessEntityID]
 INNER JOIN [HumanResources].[Department] AS D
  ON DH.[DepartmentID] = D.[DepartmentID]
 INNER JOIN [Person].[Person] AS P
  ON E.[BusinessEntityID] = P.[BusinessEntityID]
WHERE DH.[EndDate] IS NULL
AND E.[JobTitle] = 'Production Technician - WC50'

The query returns the data we need, but not in an ideal manner. Consider the following:

  • Executing a query like the above, in-code, as part of an end-user application could expose your database to the risk of an SQL Injection attack or similar.
  • The query divulges a lot of information regarding our underlying database structure – information which any client executing the query neither cares about nor should have to worry about.
  • The example is a very precise query, with a specific function – i.e. get me all the current employees who have the Job Title of Production Technician – WC50. If we wanted to modify it to instead obtain all Senior Tool Designers, we would have to write a completely separate query to accommodate this.

Implementing a Stored Procedure to encapsulate our query logic immediately addresses the above concerns, providing us with a single-line query into the database, giving us just the data we need and enabling us to reuse the query for other scenarios as well. Setting one up is very straightforward via the CREATE PROCEDURE command – the rest is pretty much what we have put together already:

CREATE PROCEDURE dbo.uspGetEmployeesByJobTitle 
	@JobTitle NVARCHAR(50)
AS
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

	SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]
	FROM [HumanResources].[Employee] AS E
	 INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH
	  ON E.[BusinessEntityID] = DH.[BusinessEntityID]
	 INNER JOIN [HumanResources].[Department] AS D
	  ON DH.[DepartmentID] = D.[DepartmentID]
	 INNER JOIN [Person].[Person] AS P
	  ON E.[BusinessEntityID] = P.[BusinessEntityID]
	WHERE DH.[EndDate] IS NULL
	AND E.[JobTitle] = @JobTitle
END
GO

By utilising a parameter for our WHERE clause filter on the Job Title, we can pass any valid value back to our stored procedure, immediately making our initial query more versatile across our reporting/business application. And, as a primary bonus, we can now safely take a 10 line query down to 1:

EXECUTE dbo.uspGetEmployeesByJobTitle @JobTitle = 'Senior Tool Designer'

So we have established that Stored Procedures are wicked cool awesome – but what does this have to do with PowerBI?!? 

Having worked with SQL Server Reporting Services (SSRS) extensively in the past, I have become accustomed to using Stored Procedures as a mechanism for storing underlying query logic within the database, with a straightforward means of referencing it from my .rdl file. I can only assume that this is the “norm” and the preferred method of querying SQL data, as opposed to a direct SELECT statement.

When recently doing some work within PowerBI involving Azure SQL Databases, I was therefore surprised that there was no option to return data via a stored procedure by default. Instead, PowerBI would prefer me to query the underlying table/view objects directly:

Thankfully, when inspecting the underlying PowerQuery used to return an example table from the above, it doesn’t use any kind of SELECT query to get the data:

let
    Source = Sql.Databases("mydatabaseinstance"),
    AdventureWorks2012 = Source{[Name="AdventureWorks2012"]}[Data],
    Production_ProductModel = AdventureWorks2012{[Schema="Production",Item="ProductModel"]}[Data]
in
    Production_ProductModel

Unfortunately, the same cannot be said for if you select the Advanced options area and input your own SQL query directly:

let
    Source = Sql.Database("mydatabaseinstance", "AdventureWorks2012", [Query="SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]#(lf)FROM [HumanResources].[Employee] AS E#(lf) INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH#(lf)  ON E.[BusinessEntityID] = DH.[BusinessEntityID]#(lf) INNER JOIN [HumanResources].[Department] AS D#(lf)  ON DH.[DepartmentID] = D.[DepartmentID]#(lf) INNER JOIN [Person].[Person] AS P#(lf)  ON E.[BusinessEntityID] = P.[BusinessEntityID]#(lf)WHERE DH.[EndDate] IS NULL#(lf)AND E.[JobTitle] = 'Senior Tool Designer'"])
in
    Source

I do NOT recommend you use the above method to query your SQL Server data!

I have spoken previously on the blog in respect to conventions around working with datasets i.e. only get what you need, and nothing else. As I work more and more with PowerBI, the tool very much seems to be geared towards flipping this mentality on its head. PowerBI has a number of built-in tools that seem to scream out “Just get ALL your data in here, we’ll worry about the rest!”. I realise that the difference between MB and GB these days, from a storage/cost point of view, is minimal; nevertheless, I still believe it is prudent not to put all your eggs in one basket and ensure that your business data is not being stored cavalier-esque within a multitude of different cloud services.

With this in mind, it is good to know that you can utilise stored procedures in PowerBI. You basically have two ways in which this can be achieved:

  • Going back to the Advanced options screen above on the SQL Server database wizard, you can EXECUTE your stored procedure directly using the following SQL Statement:
DECLARE @SP VARCHAR(100) = 'dbo.uspGetEmployeesByJobTitle @JobTitle = ''Senior Tool Designer'''
EXEC (@SP)

Be sure to specify your database and don’t forget to double up the single quotes around the parameter value!

  • If you prefer to use PowerQuery as opposed to the wizard above, then the following code will also work:
let
    Source = Sql.Database("mydatabaseinstance", "AdventureWorks2012", [Query="DECLARE @SP VARCHAR(100) = 'dbo.uspGetEmployeesByJobTitle @JobTitle = ''Senior Tool Designer'''#(lf)EXEC (@SP)"])
in
    Source

In both cases, you will be required to authenticate with the database and your result set should return as follows if using the AdventureWorks2012 example database/code:

Finally, as a best-practice security step, you should ensure that the account connecting to your SQL Server instance is restricted to only EXECUTE the procedures you have specified. This can be achieved via the following T-SQL snippet, executed against your database instance:

GRANT EXECUTE ON OBJECT::dbo.uspMyStoredProcedure  
    TO MySQLServerLogin 
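If you are starting from scratch on a full SQL Server instance, a minimal sketch of the whole setup might look like the below – the login name and password are, of course, placeholders, and the key point is that the account receives EXECUTE rights on the procedure and nothing else:

--Create a SQL login and a matching database user for the reporting client
CREATE LOGIN MySQLServerLogin WITH PASSWORD = 'UseAStrongPasswordHere!1';
GO
CREATE USER MySQLServerLogin FOR LOGIN MySQLServerLogin;
GO
--Grant EXECUTE on the single procedure only; no SELECT rights on the underlying tables
GRANT EXECUTE ON OBJECT::dbo.uspGetEmployeesByJobTitle
    TO MySQLServerLogin;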

Conclusions or Wot I Think

PowerBI is increasingly becoming a more relevant tool for professionals experienced with traditional Business Intelligence/Reporting Services. The bellwether for this can surely be seen in the current Technical Preview of SQL Server Reporting Services, which includes PowerBI reports built into the application. Although we have no timescales at this stage as to when the next major version of SQL Server will be released, it is reasonable to assume it will arrive by the end of this year at the earliest, bringing PowerBI reports with it as a new feature. I am really excited about the introduction of PowerBI into SSRS, as it would appear to be a match made in heaven – giving those with experience in both products the flexibility to develop a unified, best-of-breed solution, using traditional .rdl reporting capability and/or PowerQuery/DAX functionality.

With the above on the horizon, the ability to integrate seamlessly with SQL Server and to support traditional, well-proven practices becomes a crucial indicator of whether this match will last beyond the honeymoon. And so, I would hope to see the option to access SQL Server data via Stored Procedures become standard when using the built-in data connector within PowerBI. Based on the feedback I have seen online, I’d wager that such a feature would be most welcome, and an excellent way of reducing the need for direct coding to achieve a common requirement within PowerBI.

Getting to grips with how to use Dynamics CRM/365 for Enterprise (D365E) is no easy feat. You can imagine just how difficult it is for an end user to learn how the application works and functions, with more detailed knowledge around customisation and development being an entirely different ball game altogether. Compounding this problem further is the fact that the product has evolved at an increasingly rapid pace over recent years, to the point where it is practically impossible to become a master of everything that you can do within CRM/D365E.

Those venturing into the product for the first time may find their learning journey significantly simplified if they already have good general knowledge of some of the underlying technology that powers CRM/D365E. This was certainly true in my case; I had a good background already in managing Office 365, writing SQL queries/reports and some experience with C#. This is all incredibly useful knowledge to have in your arsenal, and all of it is directly applicable to CRM/D365E in some way. For those who are getting to grips with the product for the first time, either without this previous experience or as part of an apprentice/graduate type role, your journey may not be as swift and issue-free.

With this in mind, here’s my list of essential technologies that you can add to your own “swiss army knife” of personal knowledge. Experience and good knowledge of these will not only help you greatly in working with CRM/D365E, but also present an excellent learning opportunity for Microsoft technologies more generally – and something that you can add to your CV with pride:

SQL Server

What is it? : SQL Server is Microsoft’s proprietary relational database technology, based on the ANSI SQL standard. SQL stands for Structured Query Language and is one of the most widely used database programming languages on the planet.

Why Knowing It Is Useful: The underlying database technology that CRM/D365E uses is SQL Server, so having a general awareness of how relational database systems work, and how SQL Server differs from the standard, goes a long way towards understanding what is possible from a customisation/development viewpoint. For example, you can very quickly grasp which data types the application supports, as they will all ultimately be based on a supported SQL Server column data type. If you are running on-premise versions of CRM/D365E, then knowledge of SQL Server immediately moves from being a nice bonus to an essential requirement. This is because administrators will need good knowledge of how to manage their Dynamics CRM database instance, perform backups and also, potentially, write Transact-SQL (T-SQL) queries against the database for reporting or diagnostic work.
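As a very simple example of the latter, on-premise deployments expose “filtered” views for each entity, which respect the application’s security model and are the recommended target for report queries – something along these lines:

--Filtered views run under the calling user's CRM security privileges,
--so the query only returns records that the user could see in the application
SELECT accountid, name, statecodename
FROM dbo.FilteredAccount
WHERE statecode = 0 --Active accounts only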

Recommended Area of Study: Focusing your attention on SQL Server Reporting Services (SSRS) report writing will benefit you the most. Through this, you can begin to establish good knowledge of how SQL Server databases work generally, and be in a position to write FetchXML reports for Online/On-Premise deployments of the application, or Transact-SQL (T-SQL) based reports for On-Premise only. Having a good awareness of what is possible via a standard SQL query will also stand you in good stead when working with FetchXML, as you can immediately make a number of assumptions about what is achievable – for example, filtering results using an IN block containing multiple values and performing grouping/aggregate actions on datasets, as sketched below.
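Both of the constructs just mentioned have direct FetchXML equivalents (the condition operator="in" and the fetch aggregate functionality respectively), so being comfortable with a T-SQL query shaped like the sketch below – again using the on-premise filtered view from earlier – translates almost one-to-one:

--Filter on multiple values and aggregate the results – both expressible in FetchXML
SELECT statecodename, COUNT(*) AS AccountCount
FROM dbo.FilteredAccount
WHERE address1_city IN ('London', 'Manchester', 'Birmingham')
GROUP BY statecodename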

Office 365

What is it? : Office 365 is Microsoft’s primary – and perhaps most popular – cloud offering for businesses, individuals and home users. Through a wide array of different subscription offers, home and business users can “pick ‘n’ mix” the range of solutions they require – from Exchange-based email accounts to licenses for Microsoft Visio/Project, through to PowerBI.

Why Knowing It Is Useful: Although it is arguable that knowledge of Office 365 is not essential if you anticipate working only with on-premise versions of the application, you may be doing yourself a disservice in the long term. Microsoft is increasingly incentivising organisations to move towards the cloud equivalents of their on-premise applications, meaning that as much knowledge as possible of how CRM/D365E Online works in the context of Office 365 is only going to become more essential. If you are looking to secure a career change in the near future and have not had much experience with Office 365, then this is definitely an area to focus your learning on. From a management point of view for the Online version of the product, a basic awareness of how to navigate around and use Office 365 is pretty much mandatory if you are going to succeed working with the product on a day-to-day basis.

Recommended Area of Study: Spin up a D365E trial and you can very quickly start getting to grips with how the product sits within the Office 365 “ecosystem”. Practice licensing users, configuring security group level access to your D365E trial tenant, and modifying details on Office 365 user accounts to see how they are synced through into D365E. The Microsoft Virtual Academy also has a number of general courses related to Office 365; however, due to the frequency of updates, they may not always be in line with the current version. The official curriculum/certification paths for Office 365 may suffer from the same issue, but are worthwhile for demonstrating your experience and ability to integrate D365E with the various related Office 365 services.

Active Directory

What is it? : For rookie, intermediate and experienced IT admins alike, Active Directory needs no introduction. It is essentially Microsoft’s implementation of the Lightweight Directory Access Protocol (LDAP), having first been introduced in Windows Server 2000, and provides a means of managing users, security and access rights for domain objects. There are now two distinct versions of Active Directory available – the traditional, Windows Server based, on-premise Active Directory, and Azure Active Directory, which is utilised primarily by Office 365.

Why Knowing It Is Useful: User account records for both On-Premise and Online versions of CRM/D365E are tied to Active Directory objects, with a number of important pieces of information synchronised between an Active Directory user and the equivalent User entity record. For example, as indicated in this MSDN article, the only way in which you can synchronise a user’s Job Title through to CRM/D365E is by updating the equivalent field in Azure Active Directory. Active Directory objects are also the only means of authenticating with the application, whether via the Web Interface or otherwise, with no option to create a database user or any other kind of authenticated user type.

Recommended Area of Study: It’s free to set up your own Azure Active Directory, so this is an excellent starting point for getting to grips with the technology. There’s also nothing preventing you from downloading a trial of Windows Server and installing the Active Directory Domain Services role on that machine. Once configured, you can then start to create users, update attributes, configure permissions and set up roles that contain collections of privileges. If you already have an Office 365 tenant with CRM/D365E Online, then you can use the Office 365 portal to manage your user accounts and test the synchronisation of attribute values from Active Directory through to the application.

PowerShell

What is it? : A good way to remember what PowerShell is: it is essentially a blue command prompt window 🙂 . Traditionally only relevant to those working extensively with Windows Server or Exchange, PowerShell is now increasingly important as part of administering on-premise CRM/D365E, Office 365 and Azure, to name a few. Indeed, one of the major shock announcements this year was that PowerShell has become open source and can be installed on Linux, reflecting the increasing demand for, and importance of, Linux-based resources within the Microsoft cloud.

Why Knowing It is Useful: Similar to SQL Server, PowerShell is something that is instantly more applicable to on-premise CRM/D365E deployments. For example, the only way to modify the default number of Dashboard items is via the Get-CrmSetting/Set-CrmSetting cmdlets, executed against your on-premise organisation. I would also, again, argue that a general awareness of PowerShell can help greatly when performing administration work against an Office 365 tenant that contains a CRM/D365E organisation, such as user provisioning or license assignment. If you are utilising the Azure Service Bus to integrate CRM/D365E with Azure-based applications, then PowerShell immediately becomes a desirable skill to have in your arsenal, allowing you to remotely administer, deploy or update Azure resources programmatically.

Recommended Area of Study: The fact that PowerShell is now open source means there is a plethora of online tools and guides to refer to, and you can be assured of getting it working on your platform of choice. The GitHub page for PowerShell is a great place to get started. Beyond that, you have a few options for further practice. If you have spun up a D365E trial, then you can hook PowerShell up to Office 365 to see what you can do from a remote management perspective (such as granting Send On Behalf permissions for a shared mailbox). Alternatively, you can run it from your local Windows machine, connect it up to a Windows Server instance, or attempt to create new services in Azure and experiment that way.