When you have spent any length of time working with Dynamics CRM Online/Dynamics 365 for Enterprise (D365E) data programmatically, you become accustomed to how Option Set, State and Status Reason values are presented to you in code. To explain: the application does not store your Option Set display names within the SQL Server entity tables; rather, it stores the integer Value that you specify alongside each Label. That is why the application always requires you to provide both values:

The following benefits are realised as a result of this setup:

That being said, when working with these field types in code, you always need to have the application window open, or a list of all Labels/Values to hand, so that you don't get too confused… 🙂

I have previously extolled the virtues of the Data Export Service on the blog, and why you should consider it if you have basic integration requirements for your CRM/D365E deployment. One area in which it differs from other products on the market is how it handles the field types discussed above. For example, when exporting data to a SQL database via Scribe Online, additional columns are created that contain the "Display Name" (i.e. label value) corresponding to each Option Set, Status and Status Reason Label. So by running the following query against a Scribe export database:

SELECT DISTINCT statecode, statecode_displayname
FROM dbo.account

We get the best of both worlds – the underlying statecode values and their display names – in just two lines of code:

This is a big help, particularly when you are then using the data as part of a report, as no additional transformation steps are required and your underlying SQL query can be kept as compact as possible.

The Data Export Service differs from the above in an understandable way, as display name values for Status, Status Reason and Option Set column values are instead segregated out into their own separate table objects in your Azure SQL database:

OptionSetMetadata

GlobalOptionSetMetadata

StateMetadata

StatusMetadata

Why understandable? If you consider how the application can support multiple languages, then you realise that this also applies to metadata objects across the application – such as field names, view names and, wouldn't you have guessed it, Labels too. So when we inspect the OptionSetMetadata table, we can see that the table structure accommodates the storing of labels in multiple languages via the LocalizedLabelLanguageCode field:

Unlike the Scribe Online route above (which I assume only retrieves the Labels that correspond to the user account that authenticates with CRM/D365E), the Data Export Service becomes instantly more desirable if you are required to build multi-language reports referencing CRM/D365E application data.

The trade-off you have to reconcile yourself with is that your SQL queries, if expressed as natively as possible, instantly become a whole lot more complex. For example, to achieve the same results as the query above, it would have to be adapted as follows for the Data Export Service:

SELECT DISTINCT statecode, LocalizedLabel
FROM dbo.account
 LEFT JOIN dbo.StateMetadata
  ON EntityName = 'account'
  AND [State] = statecode
  AND LocalizedLabelLanguageCode = 1033

The above is a very basic example, but if your query is complex – and involves multiple Option Set values – then you may have to resort to Common Table Expressions (CTEs) to accommodate each additional JOIN required to get the information you want.
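To illustrate how quickly this grows, here is a sketch of a query returning labels for both the State field and one Option Set field. Note that the industrycode column is an assumption for illustration purposes – substitute whichever Option Set fields exist on your own entity:

```sql
--Sketch only: every additional Option Set field needs another join of this shape
SELECT DISTINCT a.statecode, sm.LocalizedLabel AS statecode_displayname,
       a.industrycode, osm.LocalizedLabel AS industrycode_displayname
FROM dbo.account AS a
 LEFT JOIN dbo.StateMetadata AS sm
  ON sm.EntityName = 'account'
  AND sm.[State] = a.statecode
  AND sm.LocalizedLabelLanguageCode = 1033
 LEFT JOIN dbo.OptionSetMetadata AS osm
  ON osm.EntityName = 'account'
  AND osm.OptionSetName = 'industrycode'
  AND osm.[Option] = a.industrycode
  AND osm.LocalizedLabelLanguageCode = 1033
```

With half a dozen Option Set fields in play, the join list alone dwarfs the rest of the query – hence the appeal of the function-based approach below.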

In these moments, we can look at some of the wider functionality provided as part of SQL Server to develop a solution that keeps things as simple as possible and, in this particular instance, a user-defined function is an excellent candidate to consider. Functions enable you to encapsulate complex operations against the database within simply expressed objects that can also accept parameters, and they can return either table objects or scalar (i.e. single) values.

Using a scalar function, we can therefore remove some of the complexity behind returning Option Set, Status and Status Reason labels by creating a function that returns the correct label based on its input parameters. You could look at creating a "master" function that, based on the inputs, queries the correct metadata table for the information you need; but in this example, we are going to create a function for each type of field – Status, Status Reason, Option Set and Global Option Set.

To do this, connect up to your Data Export Service database and open up a new query window, ensuring that the context is set to the correct database. Paste the following code in the window and then hit Execute:

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

--Create Function to return Global Option Set Labels

CREATE FUNCTION [dbo].[fnGetGlobalOptionSetLabel]
(
	@GlobalOptionSetName NVARCHAR(64), --The logical name of the Global Option Set
	@Option INT, --The option value to retrieve
	@LanguageCode INT --The language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.GlobalOptionSetMetadata WHERE OptionSetName = @GlobalOptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.GlobalOptionSetMetadata WHERE OptionSetName = @GlobalOptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		--Deliberately CAST text as INT to force a conversion error back to the caller
		SET @Label = CAST('An error has occurred. Could not obtain label for Global Option Set field ' + @GlobalOptionSetName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Option Set Labels

CREATE FUNCTION [dbo].[fnGetOptionSetLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Option Set field
	@OptionSetName NVARCHAR(64), --The logical name of the Option Set field
	@Option INT, --The option value to retrieve
	@LanguageCode INT --The language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.OptionSetMetadata WHERE EntityName = @EntityName AND OptionSetName = @OptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.OptionSetMetadata WHERE EntityName = @EntityName AND OptionSetName = @OptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		--Deliberately CAST text as INT to force a conversion error back to the caller
		SET @Label = CAST('An error has occurred. Could not obtain label for Option Set field ' + @OptionSetName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Status Labels

CREATE FUNCTION [dbo].[fnGetStateLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Status field
	@State INT, --The Status option value to retrieve
	@LanguageCode INT --The language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.StateMetadata WHERE EntityName = @EntityName AND [State] = @State AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.StateMetadata WHERE EntityName = @EntityName AND [State] = @State AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		--Deliberately CAST text as INT to force a conversion error back to the caller
		SET @Label = CAST('An error has occurred. Could not obtain State label for entity ' + @EntityName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Status Reason Labels

CREATE FUNCTION [dbo].[fnGetStatusLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Status Reason field
	@Status INT, --The Status Reason option value to retrieve
	@LanguageCode INT --The language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.StatusMetadata WHERE EntityName = @EntityName AND [Status] = @Status AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.StatusMetadata WHERE EntityName = @EntityName AND [Status] = @Status AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		--Deliberately CAST text as INT to force a conversion error back to the caller
		SET @Label = CAST('An error has occurred. Could not obtain Status label for entity ' + @EntityName AS INT);
	RETURN @Label;

END

GO

Executing this will create the four functions, which should then show up under the Programmability folder on your SQL database:

For those who are unsure what the SQL code is doing: it first determines whether exactly one Label can be found for the appropriate field type, based on the parameters provided. If so, that value is returned; otherwise, the CAST function deliberately forces an error back to the caller, indicating that zero or more than one Option Set value was found. In most cases, this indicates a typo in the parameters you have specified.
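The error-forcing trick relies on the fact that casting arbitrary text to INT always raises a conversion error, and the offending text – our message – is surfaced to the caller as part of that error. You can see the mechanism in isolation with a throwaway statement like this (the field name is just an example):

```sql
--Always fails; the error message quotes the text we attempted to cast:
--"Conversion failed when converting the varchar value
-- 'Could not obtain label for field myfield' to data type int."
SELECT CAST('Could not obtain label for field myfield' AS INT);
```

It is a slightly hacky but effective way of raising a descriptive error from inside a scalar function, where THROW/RAISERROR are not permitted.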

As with anything, the best way to see how something works is in practice! So, returning to the earlier example in this post, we would utilise the dbo.fnGetStateLabel function as follows to return the correct label in English:

SELECT DISTINCT statecode, dbo.fnGetStateLabel('account', statecode, 1033) AS statecode_displayname
FROM dbo.account
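The other functions are called in exactly the same way, so several label types can be returned in one flat query with no joins at all. As a sketch (the industrycode Option Set on account is assumed here for illustration):

```sql
SELECT DISTINCT statecode,
       dbo.fnGetStateLabel('account', statecode, 1033) AS statecode_displayname,
       statuscode,
       dbo.fnGetStatusLabel('account', statuscode, 1033) AS statuscode_displayname,
       industrycode,
       dbo.fnGetOptionSetLabel('account', 'industrycode', industrycode, 1033) AS industrycode_displayname
FROM dbo.account
```

One thing to bear in mind is that scalar functions execute once per row, so on very large result sets the explicit JOIN approach may still perform better – the functions trade a little performance for a lot of readability.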

With our results returning as follows:

Now we can expose this through our reports and not worry about having to do any kind of transformation/lookup table to get around the issue 🙂

Encapsulating complex functionality within simple, clearly expressed functions is an excellent way of keeping code streamlined, and also enables colleagues to accomplish complex tasks even if they do not have in-depth knowledge of Transact-SQL.

Although CRM Online/Dynamics 365 for Enterprise (D365E) provides a plethora of different tools aimed at satisfying users' reporting requirements, you are restricted in how data can be queried within the application. For example, you cannot simply connect straight up to the application's SQL database and start writing stored procedures that perform complex data transformations or joins. Traditionally, to achieve this, you would need to look at one of the several tools in the marketplace that enable you to export your data into a format of your choosing – or even take the plunge and get a developer to write a bespoke application that satisfies your integration requirements.

With the recent D365E release, and in line with Microsoft's longstanding approach to customer data within their applications (i.e. "It's yours! So just do what you want with it!"), the introduction of the Data Export Service last year further consolidates this approach and adds an arguably game-changing tool to the product's arsenal. By using the service, relatively straightforward integration requirements can be satisfied in a pinch, and a lot of the headache involved in setting up a backup of your organisation's databases/LOB reporting application can be eliminated. Perhaps the most surprising and crucial aspect of all of this is that using the tool is not going to break the bank either.

In this week’s blog post, I’m going to take a closer look at just what the Data Export Service is, the setup involved and the overall experience of using the service from end-to-end.

What is the Data Export Service?

The Data Export Service is a new, free* add-on for your CRM/D365E subscription, designed to accomplish basic integration requirements. Microsoft perhaps provides the best summary of what the tool is and what it can achieve via TechNet:

The Data Export Service intelligently synchronizes the entire Dynamics 365 data initially and thereafter synchronizes on a continuous basis as changes occur (delta changes) in the Microsoft Dynamics 365 (online) system. This helps enable several analytics and reporting scenarios on top of Dynamics 365 data with Azure data and analytics services and opens up new possibilities for customers and partners to build custom solutions.

The tool is compatible with versions 8.0, 8.1 and 8.2 of the application, which correspond to the following releases of CRM Online/D365E:

  • Dynamics CRM Online 2016
  • Dynamics CRM Online 2016 Update 1
  • Dynamics 365 December Update

*You will still need to pay for all required services in Azure, but the add-on itself is free to download.

The Installation Process

Getting everything configured for the Data Export Service can prove to be the most challenging – and potentially alienating – part of the entire process. For this, you will need the following at your disposal:

  • An active Azure Subscription.
  • An Azure SQL Server configured with a single database, or an Azure VM running SQL Server. Microsoft recommends a Premium P1 database or better if you are using an Azure SQL database, but I have been able to get the service working without issue on S0-tier databases. This is an important point, given that the cost difference can amount to hundreds of £ per month.
  • An Azure Key Vault. This is what will securely store the credentials for your DB.
  • PowerShell and access to the Azure Resource Manager (AzureRM) cmdlets. PowerShell can be installed as an OS feature on Windows-based platforms, and can now be downloaded onto OS X/Linux as well. PowerShell is required to create an Azure Key Vault, although you can also use it to create your Azure SQL Server instance/Windows VM with SQL Server.

It is therefore recommended that you have at least some experience in how to use Azure – such as creating Resource Groups, deploying individual resources, how the interface works etc. – before you start setting up the Data Export Service. Failing this, you will have to kindly ask your nearest Azure whizz for assistance 🙂 Fortunately, if you know what you're doing, you can get all of the above set up very quickly – in some cases, in less than 10 minutes if you opt to script out the entire deployment via PowerShell.
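As a rough illustration of what such a scripted deployment might look like, the sketch below creates the three Azure prerequisites via the AzureRM cmdlets. All resource names and the region are hypothetical placeholders, and exact parameters can vary between AzureRM module versions, so treat this as a starting point rather than a tested deployment script:

```powershell
# Sign in and create a Resource Group to hold everything
Login-AzureRmAccount
New-AzureRmResourceGroup -Name "MyDESResourceGroup" -Location "UK South"

# Create the Azure SQL Server and an S0-tier database for the export
New-AzureRmSqlServer -ResourceGroupName "MyDESResourceGroup" `
    -ServerName "mydesserver" -Location "UK South" `
    -SqlAdministratorCredentials (Get-Credential)
New-AzureRmSqlDatabase -ResourceGroupName "MyDESResourceGroup" `
    -ServerName "mydesserver" -DatabaseName "MyDESDatabase" `
    -RequestedServiceObjectiveName "S0"

# Create the Key Vault that will hold the database connection string
New-AzureRmKeyVault -VaultName "MyDESKeyVault" `
    -ResourceGroupName "MyDESResourceGroup" -Location "UK South"
```

Note that the Data Export Service additionally requires a secret containing your database connection string to be added to the vault, with an access policy granted to the service itself – Microsoft supplies a dedicated PowerShell script for that step as part of the official setup documentation.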

For your D365E setup, all that is required is the installation of the approved solution via the Dynamics 365 Administration Centre. Highlight the instance that you wish to deploy to and click on the pen icon next to Solutions:

Then click on the Solution with the name Data Export Service for Dynamics 365 and click the Install button. The installation process will take a couple of minutes, so keep refreshing the screen until the Status is updated to Installed. Then, within the Settings area of the application, you can access the service via the Data Export icon:

Because the Data Export Service is required to automatically sign into an external provider, you may also need to verify that your Web Browser pop-up settings/firewall is configured to allow the https://discovery.crmreplication.azure.net/ URL. Otherwise, you are likely to encounter a blank screen when attempting to access the Data Export Service for the first time. You will know everything is working correctly when you are greeted with a screen similar to the below:

Setting up an Export Profile

After accepting the disclaimer and clicking on the New icon, you will be greeted with a wizard-like form, enabling you to specify the following:

  • Mandatory settings required, such as the Export Profile Name and the URL to your Key Vault credentials.
  • Optional settings, such as which database schema to use, any object prefix that you would like to use, retry settings and whether you want to log when records are deleted.
  • The Entities you wish to use with the Export Service. Note that, although most system entities will be pre-enabled to use this service, you will likely need to go into Customizations and enable any additional entities you wish to utilise with the service via the Change Tracking option:

  • Any Relationships that you want to include as part of the sync. To clarify, this is basically asking whether you wish to include any default many-to-many (N:N) intersect tables as part of your export profile. The list of available options will depend on which entities you have chosen to sync. For example, if you select the Account, Lead and Product entities, then the following intersect tables will be available for synchronisation:

Once you have configured your profile and saved it, the service will then attempt to start the import process.

The Syncing Experience, a.k.a. Why Delta Syncing is Awesome

When the service first starts to sync, one thing to point out is that it may initially return a result of Partial Success and show that it has failed for multiple entities. In most cases, this will be because certain entities' dependent records have not yet been synced across (for example, any Opportunity record that references the Account named Test Company ABC Ltd. will not sync until that Account record has been exported successfully). So rather than interrogating the error logs straight away, I would suggest holding off a while. As you might expect, the first sync will take some time to complete, depending on the number of records involved; my experience, however, suggests it is reasonably quick – for example, just under 1 million records took around 3 hours to sync. The fact that the service is essentially an Azure-to-Azure export no doubt helps in ensuring smooth data transit.

Following on from the above, syncs then take place as and when entity data is modified within the application. The delay appears to be very small indeed – often tens of minutes, if not just a few minutes. This makes the Data Export Service an excellent candidate for a backup/primary reporting database to satisfy any requirements that cannot be achieved via FetchXML alone.

One small bug I have observed is with how the application deals with the listmember intersect entity. You may get errors thrown back indicating that records failed to sync across successfully which, upon closer inspection, is not the case. Hopefully this is something that will get ironed out; it is likely due to the rather strange way the listmember entity behaves when interacted with via the SDK.

Conclusions or Wot I Think

For a free add-on service, I have been incredibly impressed by the Data Export Service and what it can do. For those who have previously had to fork out big bucks for services such as Scribe Online or KingswaySoft to achieve very basic replication/reporting requirements within CRM/D365E, the Data Export Service offers an inexpensive replacement. That's not to say that the service should be your first destination if your integration requirements are complex – for example, integrating Dynamics 365 with SAP/Oracle ERP systems. In such cases, the products mentioned above will no doubt remain the best route to achieving your requirements.

I also have a few concerns that the setup involved could be a barrier to the Data Export Service's adoption. As mentioned above, experience with Azure is a mandatory requirement to even begin contemplating getting set up with the tool, and your organisation may also need to reconcile itself with utilising Azure SQL databases or SQL Server instances on Azure VMs. Hopefully, as time goes on, we may start to see the setup process simplified – for example, having the Export Profile Wizard go off and create all the required resources in Azure simply by entering your Azure login credentials.

The D365E release has brought a lot of great new functionality to the table, much of it oft-requested, adding real benefit to organisations that already use, or plan to use, the application. The Data Export Service is perhaps one of the great underdog features of D365E, and one that you should definitely consider if you want a relatively smooth-sailing data export experience.

When working with form-level JScript functionality on Dynamics CRM/Dynamics 365 for Enterprise (D365E), you often uncover some interesting pieces of exposed functionality that can be utilised neatly for a specific business scenario. I did a blog post last year on arguably one of the best of these functions when working with Lookup field controls – the Xrm.Page.getControl().addPreSearch method. Similar to other methods exposed via the SDK, its prudent and effective implementation can greatly reduce the amount of steps/clicks that are involved when populating Entity forms.

I've already covered, as part of last year's post, just what this method does, its sister method addCustomFilter, and also some of the interesting problems encountered when working with the Customer lookup field type – a special, recently introduced field type that allows you to create a multi-entity lookup/relationship onto the Account and Contact entities via one field. I was recently doing some work using these methods under the exact same conditions, and again came across some interesting quirks that are useful to know when determining whether the utilisation of these SDK methods is a journey worth starting in the first place. Without much further ado, here are two additional scenarios involving these methods and the lessons learned from each:

Pre-Filtering the Customer Lookup to return Account or Contact Records Only

Now, your first assumption here may be that, if you wanted your lookup control to only return one of the above entity types, surely it would be more straightforward to just set up a dedicated 1:N relationship between the corresponding entity types? The benefits seem pretty clear – this is a no-code solution that, with a bit of ingenious use of Business Rules/Workflows, could be implemented such that the user never even suspects what is taking place (e.g. a Business Rule to hide the corresponding Account/Contact lookup field when the other one contains a value). However, assume one (or all) of the following:

  • You are working with an existing System entity (e.g. Quote, Opportunity) that already has the Customer lookup field defined. This would therefore mean implementing duplicate schema changes to your entity to accommodate your scenario, a potential no-no from a best-practice point of view.
  • Your entity in question already has a significant number of custom fields, totalling more than 200-300. Additional performance overheads may occur if you then choose to create two separate lookup fields as opposed to one.
  • The entity you are customising already has a Customer lookup field built in, which is populated with data across hundreds, maybe thousands, of records within the application. Attempting to implement two separate lookups, and then going through the exercise of updating every record to populate the correct lookup field, could take many hours to complete and also have unexpected knock-on effects across the application.

In these instances, it may make more practical sense to implement a small JScript function to conditionally alter how the Customer lookup field allows the user to populate records on the form. The benefit of this is that you can take advantage of the multi-entity capabilities this field type was designed for, whilst also enforcing the integrity of your business logic/requirements on the application's form layer.

To that end, you can look at applying a custom FetchXML snippet that prevents either Account or Contact records from being returned when a user clicks on the control. Counterintuitively, this is not done by, as I first assumed, using the following snippet:

var filter = "<filter type='and'><condition attribute='accountid' operator='not-null' /></filter>";
Xrm.Page.getControl("mycustomerlookupfield").addCustomFilter(filter, "account");

This will lead to no records being returned on your lookup control. Rather, you will need to filter the opposite way – only return Contact records where the contactid equals null, i.e. the record does not exist:

var filter = "<filter type='and'><condition attribute='contactid' operator='null' /></filter>";
Xrm.Page.getControl("mycustomerlookupfield").addCustomFilter(filter, "contact");

Don't Try to Pass Parameters to your addPreSearch Handler Function (CRM 2016 Update 1)

If your organisation is currently on Dynamics CRM 2016 Update 1, then you may encounter a strange – and, from what I can gather, unresolvable – issue when working with multiple, parameterised functions in this scenario. To explain further, let's assume you have a Customer lookup and a Contact lookup field on your form. You want to filter the Contact lookup to only return Contacts associated with the Account populated in the Customer lookup. Assume there is already a mechanism in place to ensure that the Customer lookup will always have an Account record populated, and your functions for this specific scenario may look something like this:

function main() {

    //Filter Contact lookup field if Customer lookup contains a value

    var customerID = Xrm.Page.getAttribute('mycustomerlookupfield').getValue();

    if (customerID != null) {
   
        Xrm.Page.getControl("mycontactfield").addPreSearch(filterContactNameLookup(customerID[0].id));
    }
    
}

function filterContactNameLookup(customerID) {

    var filter = "<condition attribute='parentcustomerid' operator='eq' value='" + customerID + "' />";
    Xrm.Page.getControl("mycontactfield").addCustomFilter(filter, "account");

}

The above example seems a perfectly sensible means of implementing this – surely it makes more practical sense to obtain the ID of our Customer lookup field in one place and then pass it along to any subsequent functions? The problem is that CRM 2016 Update 1 throws some rather cryptic errors in the developer console when attempting to execute the code, and does nothing on the form itself:

Yet, when we re-write our functions as follows, explicitly obtaining our Customer ID on two occasions, this runs as we’d expect with no error:

function main() {

    //Filter Contact lookup field if Customer lookup contains a value

    var customerID = Xrm.Page.getAttribute('mycustomerlookupfield').getValue();

    if (customerID != null) {
   
        Xrm.Page.getControl("mycontactfield").addPreSearch(filterContactNameLookup);
    }
    
}

function filterContactNameLookup() {

    var customerID = Xrm.Page.getAttribute('mycustomerlookupfield').getValue()[0].id;
    var filter = "<condition attribute='parentcustomerid' operator='eq' value='" + customerID + "' />";
    Xrm.Page.getControl("mycontactfield").addCustomFilter(filter, "account");

}

I scratched my head over why this doesn't work, and one likely explanation is that addPreSearch expects to be handed a function reference. In the first example, filterContactNameLookup(customerID[0].id) is executed immediately, and it is its return value – undefined – that actually gets passed to addPreSearch, so no valid handler is ever registered. Because the handler is only triggered at the point the lookup control is selected, it also cannot receive arguments this way, which would explain why re-obtaining the value inside the handler works. If anyone else can shed further light on this, or confirm whether there is additionally a bug in Dynamics CRM 2016 Update 1, then do please let me know in the comments below.
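The reference-versus-invocation distinction can be demonstrated outside of CRM with plain JavaScript. In this generic sketch, register stands in for addPreSearch (it is not part of the Xrm API), and filterLookup stands in for our filter handler:

```javascript
// A stand-in for addPreSearch: it stores whatever "handler" it is given.
const handlers = [];
function register(handler) {
    handlers.push(handler);
}

// A stand-in for filterContactNameLookup: returns nothing, i.e. undefined.
function filterLookup(id) {
    // ...would build and apply a custom filter using id...
}

// Invoking the function at registration time passes its RETURN VALUE:
register(filterLookup("hypothetical-id")); // stores undefined - no handler!

// Passing the bare reference stores the function itself:
register(filterLookup); // stores a callable function

console.log(typeof handlers[0]); // "undefined"
console.log(typeof handlers[1]); // "function"
```

This is exactly why the second version of the code works: only a bare function reference survives to be invoked later, when the lookup control's pre-search event actually fires.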

Conclusions or Wot I Think

It could be argued quite strongly that the examples shown in this article have little or no practical use if you are approaching your CRM/D365E implementation from a purely functional point of view. Going back to my earlier example, it is surely a lot less hassle, and less error-prone, to implement a solution using a mix of out-of-the-box functionality within the application. The problem you may eventually find is that such a solution becomes cumbersome and, frankly, undecipherable to someone coming into your system cold. As with anything, a balance should always be striven for and, with a bit of practical knowledge of how to write JScript functionality (something any would-be CRM expert should have stored in their arsenal), you can put together a solution that is relatively clean from a coding point of view, whilst also benefiting from some great functionality built into the application.

Often, when working with the latest features of Dynamics CRM/Dynamics 365 for Enterprise (CRM/D365E), it very much feels like handling a double-edged sword. Whilst the functionality they bring to the table is often impressive, there is typically scant information available when things go wrong – either through official channels or via online forum/blog posts. This can act as a major barrier to early adoption, even when the business benefit of such adoption would far outweigh any wasted time or extraordinary effort.

One of these new features is Word Templates, a feature I have blogged about previously – particularly in how they compare against the more traditional SQL Server Reporting Services (SSRS) reports that CRM/D365E offers developers. They present CRM customisers with a much more familiar, and equally powerful, means of creating common templates that can be run against specific CRM record types, providing largely similar functionality to SSRS reports. For those not fortunate enough to have previous experience with SSRS, Word Templates present a far more accessible and friendly approach to addressing documentation needs within CRM/D365E.

When configuring access to Word (and indeed Excel) Templates for your users, you generally only need to be concerned with two sets of permissions – an adequate level of Read permission against the entity the template is run against (Lead, Account etc.) and the Document Generation privilege, found in the Business Management area of the Security Role window:

When we recently attempted to set up access to Word Templates for a specific group of users on CRM 2016 Online (8.1), we first verified that all of the above privileges had been granted – but we still encountered a strange issue when attempting to generate the Word Template:

This message should be familiar to those who work with the application frequently, as it is one of the error types generally thrown back when a database error has occurred – typically a timeout or similar. What is distinctly unfamiliar about the above is that we are unable to select the Download Log File button, which would normally present us with a suitably detailed error message about the underlying problem. Without it, the issue becomes significantly trickier to diagnose.

Before escalating the issue further with Microsoft, we were able to diagnose and observe the following:

  • Users that had been assigned the System¬†Administrator security role were able to generate the template without issue
  • Users within the root Business Unit, likewise, had no issue generating the template
  • We were unable to replicate the issue within a separate environment that had been configured in exactly the same way (same Business Units, Security Roles etc.)
  • When running a Fiddler trace whilst reproducing the issue, no additional error message was exposed behind the scenes. Fiddler, for those who are unaware, is one of the best tools to have in your arsenal when diagnosing web service request issues in CRM. As the act of generating a Word Template would (presumably) cause a web service request, it was hoped that some additional clues could be gathered by using Fiddler.

Due to the limitations of CRM 2016 Online compared with On-Premise CRM 2016 (i.e. we had no way in which we could interrogate the SQL database for the instance), we had to escalate the case to Microsoft for a resolution. And, as is generally the case in these matters, the proposed solution was informative and surprising in equal measure.

The support engineer assigned to the case took a copy of our instance database and deployed it into a test environment, to see if the issue could be reproduced. They also had the benefit of being able to access information regarding the instance that I would imagine most CRM Online administrators would give their right hand for 🙂 Because of this, we were able to determine the underlying database error message that was causing the problem:

The data type image cannot be used as an operand to the UNION, INTERSECT or EXCEPT operators because it is not comparable

The error is currently an acknowledged bug in CRM, which is targeted for resolution early in 2017. In the meantime, the engineer pointed me towards a tool that I was previously unaware of – the Dynamics CRM Organization Settings Editor. I was already aware that there are a number of settings relating to a CRM/D365E organisation that can be modified via a PowerShell/cmd line executable for On-Premise deployments only. These settings can achieve a number of potentially desirable changes to your CRM organisation, some of which cannot be achieved via the CRM interface alone – such as changing whether emails are sent synchronously or asynchronously (SendEmailSynchronously) or modifying the number of elements that are displayed in the tablet app, such as fields (TabletClientMaxFields), tabs (TabletClientMaxTabs) and lists (TabletClientMaxLists). To make these changes, there is a tool that you can download from Microsoft which enables you to change these settings via a cmd line window – but, arguably, the Dynamics CRM Organization Settings Editor is a far simpler tool to use, given that it is a managed solution that enables you to modify all of the settings from within CRM/D365E itself. Regardless of which tool you use, the solution suggested by the support engineer (which resolves the problem, incidentally) is outlined below, using the Dynamics CRM Organization Settings Editor. It is worth noting that, although the KB article above does not specifically reference this, I was advised that using the tool is unsupported by Microsoft. Therefore, any changes – and any issues that may be caused as a result – are made at your own risk:

  1. Install the managed solution file into CRM and then open up the Solution. You will be greeted with the Configuration page, which should look similar to the below:
  2. Navigate to the EnableRetrieveMultipleOptimization setting, which should be set to the default of not set.
  3. Click Add to modify the setting. You will be asked to confirm the change before it will take effect:
  4. Once you click OK, the default value of 1 will be applied to this setting. Fortunately, this is the precise setting we need to get things working, so verify that this is indeed set to 1 on your instance and then close out of the solution:
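For completeness, the same change can in principle be made via Microsoft's downloadable command-line tool mentioned above. The invocation below is a sketch only – verify the exact executable name and argument order against the tool's accompanying KB documentation, and note that "MyOrganisation" is a placeholder for your own organisation name:

```
REM Sketch only - check the syntax against the OrgDBOrgSettings tool's KB article
REM before running; "MyOrganisation" is a placeholder organisation name.
Microsoft.Crm.SE.OrgDBOrgSettingsTool.exe Update MyOrganisation EnableRetrieveMultipleOptimization 1
```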

Now, when you refresh CRM for the affected user, you should be able to download the Word Template without issue.

Although we can expect this error to be resolved shortly as part of the next update to D365E, hopefully the above workaround will help others who come across the same issue and allow you to use Word Templates without issue within your CRM/D365E environment 🙂

I have had an opportunity recently to start getting to grips with the wonderful world of PowerBI. For those who have walked the tightrope between Excel and SQL Server Reporting Services (SSRS) Reports, PowerBI appears to be the tool designed with these individuals in mind. It enables you to leverage existing Excel knowledge (through PowerQuery or Excel-based formulas/calculations), whilst also offering a number of easy-to-set-up Visualisations that are not too dissimilar to the charts, maps and other objects that can be set up on a .rdl report. What’s also good to know, from a Dynamics CRM/365 for Enterprise (D365E) point of view, is that you can very quickly hook up your CRM data to a PowerBI dashboard/report. In addition to this, integrating with other data sources – such as SQL, JSON or flat file sources – or even with completely different application systems is possible (even with SalesForce – shock, horror!). In these days of increasing need for tight integration between a dazzling array of different application and database systems, PowerBI gives you a broad framework to achieve any reporting goal you may have in mind. Having said that, it is still rough around the edges and in need of some further development before it, arguably, becomes the de facto tool to use. For example, I have found some of the formatting and design options available to be rather basic and light-years behind what is possible with SSRS or even Excel. There are also some features missing that are somewhat baffling, such as the ability to send out dashboards/reports via email on a predefined schedule. I would hope to see increasing attention towards PowerBI in the months and years ahead, bringing the very best features from these more traditional applications, but exposed in a non-daunting and wholly accessible way.

As referenced above, getting set up with your Online CRM/D365E data is incredibly straightforward via the inbuilt Dynamics 365 connector “wizard” – simply log into your online organisation, specify the entity data that you want to work with and PowerBI will push this data into a table for you, enabling you to start building your report in seconds. The connector “wizard” is suited to most typical data retrieval scenarios, providing a GUI interface to visualise the entity data within your CRM/D365E instance and the ability to pull together related entities and return them as part of your query. Before you start using it, however, I would highlight the following:

  • The OData feed is rather indiscriminate in its retrieval – all records from an entity will be returned. Some pre-filtering will occur based on CRM’s/D365E’s security model (e.g. if the account you log in as has Business Unit level Read privilege on the Lead entity, only records in that account’s Business Unit will be returned), but typically it will be System Administrators who set up a PowerBI dashboard, meaning you will have to deal with a lot of records being returned into PowerBI. Given that the two different PowerBI plans have limitations in regards to record retrieval, this could cause problems if you are working with large CRM datasets.
  • Tied in with the above, because you have no way via the “wizard” interface to specify retrieval criteria, you cannot filter your data based on certain attributes or take advantage of some of the native functionality within CRM to aggregate your data. Whilst PowerBI is certainly more than capable of doing this itself, relatively n00bish users may find this an unwelcome barrier that hinders adoption.
  • Lookup and Option Set attributes are returned as a special data type of Record – with the underlying properties of the related record (GUID, display name etc.) stored within this sub-record. Having the data stored in this manner causes additional administrative effort on the PowerBI side, as you will need to figure out how to access the underlying properties of this PowerBI data type.

Fortunately, if you are inclined towards building very specific and focused queries that you can execute against your Online CRM/D365E instance, there is a way – and the best thing is, we get to use something that has been recently introduced into CRM as well 🙂

The Web API to the rescue!

The Web API was introduced in Dynamics CRM 2016 and implements version 4 of the Open Data (OData) protocol; it will eventually replace the traditional means that developers would use to access CRM via web services (namely, the Organization service and the old OData service). CRM developers will need to become increasingly familiar with the Web API in the years ahead, and no doubt a lot of work will need to be done updating existing code to target the new Web API.

Because the Web API is a web service, PowerBI can connect to it via the Web connector. By querying the Web API, you have access to all of the messages that are exposed to get data from CRM – Retrieve, Retrieve Multiple and Predefined Query – with multiple options available in terms of how you return, filter and aggregate your data. Results will be returned in JSON format, so some additional work needs to be done to get the data into an accessible format. This post will now take a look at what you need to do in order to return data based on a FetchXML query, as well as (hopefully!) providing a useful code snippet that you can adapt to your environment.

Before starting, ensure that you have installed the CRM Rest Builder Managed Solution within your CRM development environment. This tool allows you to quickly generate code snippets in JScript that perform web service calls into CRM/D365E and is a massive help in a number of different ways. A big shout out and thanks to Jason Lattimer for the work he has done on this.

  1. To begin, you need a FetchXML query that returns the data you need. This can be written manually or generated automatically via Advanced Find. In this example, we are going to use the following snippet, which queries the Case entity and will return 7 sample data records from CRM:
<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="incident">
    <attribute name="title" />
    <attribute name="ticketnumber" />
    <attribute name="createdon" />
    <attribute name="incidentid" />
    <attribute name="caseorigincode" />
    <attribute name="customerid" />
    <order attribute="ticketnumber" descending="false" />
    <filter type="and">
      <condition attribute="prioritycode" operator="eq" value="1" />
    </filter>
  </entity>
</fetch>
  2. Next, we need to generate the URL that will be used to query the Web API endpoint. There are two challenges here – the first being that the FetchXML query needs to be included in the URL, and the second that it needs to be encoded so that it is a valid URL. The start of the URL is fairly straightforward to put together – it’s just your CRM organisation URL, in the following format:

https://<Organisation Name>.<Region Code>.dynamics.com

So if your organisation name is crmchap and your CRM tenant is based in the EMEA region, then the URL would be as follows:

https://crmchap.crm4.dynamics.com

The rest of the URL can be obtained from the CRM Rest Builder. Open the managed solution and navigate to the Configuration page. It will look something like this:


Update the page so that the Action is set to Predefined Query. Ensure that the Primary Entity is set to Incident (always something you have to remember when working with the Case entity 🙂 ) and then copy and paste the FetchXML query into the box below. The window should look similar to the below once ready:

Press Create Request to put together the code snippet. On line 2 of the generated snippet, you will see the following piece of code:

req.open("GET", Xrm.Page.context.getClientUrl() + "/api/data/v8.0/incidents?fetchXml=%3Cfetch%20version%3D%221.0%22%20output-format%3D%22xml-platform%22%20mapping%3D%22logical%22%20distinct%3D%22false%22%3E%3Centity%20name%3D%22incident%22%3E%3Cattribute%20name%3D%22title%22%20%2F%3E%3Cattribute%20name%3D%22ticketnumber%22%20%2F%3E%3Cattribute%20name%3D%22createdon%22%20%2F%3E%3Cattribute%20name%3D%22incidentid%22%20%2F%3E%3Cattribute%20name%3D%22caseorigincode%22%20%2F%3E%3Cattribute%20name%3D%22description%22%20%2F%3E%3Cattribute%20name%3D%22customerid%22%20%2F%3E%3Corder%20attribute%3D%22ticketnumber%22%20descending%3D%22false%22%20%2F%3E%3Cfilter%20type%3D%22and%22%3E%3Ccondition%20attribute%3D%22prioritycode%22%20operator%3D%22eq%22%20value%3D%221%22%20%2F%3E%3C%2Ffilter%3E%3C%2Fentity%3E%3C%2Ffetch%3E"

The bit we are interested in is the string value after the Xrm.Page.context.getClientUrl() function, which will need to be appended to our CRM URL. So based on the above, our URL to use with PowerBI would be as follows:

https://crmchap.crm4.dynamics.com/api/data/v8.0/incidents?fetchXml=%3Cfetch%20version%3D%221.0%22%20output-format%3D%22xml-platform%22%20mapping%3D%22logical%22%20distinct%3D%22false%22%3E%3Centity%20name%3D%22incident%22%3E%3Cattribute%20name%3D%22title%22%20%2F%3E%3Cattribute%20name%3D%22ticketnumber%22%20%2F%3E%3Cattribute%20name%3D%22createdon%22%20%2F%3E%3Cattribute%20name%3D%22incidentid%22%20%2F%3E%3Cattribute%20name%3D%22caseorigincode%22%20%2F%3E%3Cattribute%20name%3D%22description%22%20%2F%3E%3Cattribute%20name%3D%22customerid%22%20%2F%3E%3Corder%20attribute%3D%22ticketnumber%22%20descending%3D%22false%22%20%2F%3E%3Cfilter%20type%3D%22and%22%3E%3Ccondition%20attribute%3D%22prioritycode%22%20operator%3D%22eq%22%20value%3D%221%22%20%2F%3E%3C%2Ffilter%3E%3C%2Fentity%3E%3C%2Ffetch%3E

A bit of a mouthful, I agree!
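Rather than copying the encoded string by hand, the URL can also be assembled programmatically. The Python sketch below uses the example organisation URL from above and a cut-down version of the FetchXML query; `urllib.parse.quote` applies the same percent-encoding that the CRM Rest Builder produces:

```python
from urllib.parse import quote

# The CRM organisation URL and Web API version from the example above
base_url = "https://crmchap.crm4.dynamics.com/api/data/v8.0"

# A cut-down version of the FetchXML query used in this post
fetch_xml = (
    '<fetch version="1.0" output-format="xml-platform" '
    'mapping="logical" distinct="false">'
    '<entity name="incident">'
    '<attribute name="title" />'
    '<attribute name="ticketnumber" />'
    '</entity></fetch>'
)

# Percent-encode the query (safe="" ensures "/" is encoded too, matching
# the Rest Builder output) and append it to the incidents endpoint
request_url = base_url + "/incidents?fetchXml=" + quote(fetch_xml, safe="")
```

Running this yields the same style of URL as the Rest Builder output, so you can regenerate it quickly whenever the underlying FetchXML changes.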

  3. Now that we have the URL, we can connect up to CRM within PowerBI. Create or open a new PBIX file and select Get Data -> Web:


  4. On the From Web window, copy and paste the URL we’ve built and press OK. You will be prompted to log into your CRM organisation; select Organisational account and log in as you would normally via the Office 365 login page. Once logged in, the data will be retrieved and the Query Editor will open, displaying something similar to the below:
  5. Some additional work is required in order to get our data into a standard, tabular format. In addition, the data at the moment is returning the underlying Option Set and Lookup values from the Incident entity, as opposed to the Display Names; not good from a reporting point of view:

We therefore need to modify the underlying PowerQuery to flatten the data into a table and return the Display Name values instead.

Right click on Query1 and select Advanced Editor to open the underlying PowerQuery text. Delete everything here and then copy and paste the following into the window:

let
    //Get the CRM data, including the Formatted Values as part of the returned data
    Source = Json.Document(Web.Contents("https://crmchap.crm4.dynamics.com/api/data/v8.0/incidents?fetchXml=%3Cfetch%20version%3D%221.0%22%20output-format%3D%22xml-platform%22%20mapping%3D%22logical%22%20distinct%3D%22false%22%3E%3Centity%20name%3D%22incident%22%3E%3Cattribute%20name%3D%22title%22%20%2F%3E%3Cattribute%20name%3D%22ticketnumber%22%20%2F%3E%3Cattribute%20name%3D%22createdon%22%20%2F%3E%3Cattribute%20name%3D%22incidentid%22%20%2F%3E%3Cattribute%20name%3D%22caseorigincode%22%20%2F%3E%3Cattribute%20name%3D%22customerid%22%20%2F%3E%3Corder%20attribute%3D%22ticketnumber%22%20descending%3D%22false%22%20%2F%3E%3Cfilter%20type%3D%22and%22%3E%3Ccondition%20attribute%3D%22prioritycode%22%20operator%3D%22eq%22%20value%3D%221%22%20%2F%3E%3C%2Ffilter%3E%3C%2Fentity%3E%3C%2Ffetch%3E", [Headers=[#"Prefer"="odata.include-annotations=""OData.Community.Display.V1.FormattedValue"""]])),
    //Get the underlying list of records returned
    values = Source[value],
    //Create a new table with one column, populated with the values list of records
    #"Table from List" = Table.FromList(values, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    //Query will error if no results, therefore use an if statement to build an empty table with the correct column headers
    Expand = if List.IsEmpty(values) then #table({"title", "ticketnumber", "createdon", "incidentid", "caseorigincode", "_customerid_value@OData.Community.Display.V1.FormattedValue"}, {"title", "ticketnumber", "createdon", "incidentid", "caseorigincode", "_customerid_value@OData.Community.Display.V1.FormattedValue"}) else Table.ExpandRecordColumn(#"Table from List", "Column1", {"title", "ticketnumber", "createdon", "incidentid", "caseorigincode", "_customerid_value@OData.Community.Display.V1.FormattedValue"}, {"title", "ticketnumber", "createdon", "incidentid", "caseorigincode", "_customerid_value@OData.Community.Display.V1.FormattedValue"}),
    //For some reason, if there are no results, then the empty table contains error records - so need to specifically remove them
    #"Removed Errors" = Table.RemoveRowsWithErrors(Expand),
    //Finally, rename the _customerid_value field
    #"Renamed Columns" = Table.RenameColumns(#"Removed Errors",{{"_customerid_value@OData.Community.Display.V1.FormattedValue", "Customer Name"}})

in
    #"Renamed Columns"

The comments should hopefully explain what the code is doing, but to summarise: the PowerQuery parses the JSON data into a tabular format, uses the returned data to build column headers, and then renames the _customerid_value field to match our requirements. There is also a logic statement in there to check whether we actually have any data; if not, we build an empty table with the correct headers (since the JSON response gives us nothing to work with when 0 records are returned).
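For readers more comfortable outside of PowerQuery, the transformation the M code performs can be illustrated with a short Python sketch. The sample response body below is fabricated (a hypothetical record in the shape the Web API returns when formatted-value annotations are requested via the Prefer header); the flattening logic mirrors the steps above, including yielding an empty result rather than an error when no records come back:

```python
import json

# The annotation suffix the Web API appends to a column when the header
# Prefer: odata.include-annotations="OData.Community.Display.V1.FormattedValue" is set
FORMATTED = "@OData.Community.Display.V1.FormattedValue"

# Hypothetical response body, in the shape returned by the incidents query
sample_body = json.dumps({
    "value": [
        {
            "title": "Faulty product catalog",
            "ticketnumber": "CAS-00001",
            "caseorigincode": 3,
            "caseorigincode" + FORMATTED: "Web",
            "_customerid_value": "00000000-0000-0000-0000-000000000001",
            "_customerid_value" + FORMATTED: "A. Datum Corporation",
        }
    ]
})

def flatten(body, columns, renames):
    """Project each returned record onto the requested columns and rename
    annotation columns to friendly headers; an empty 'value' array simply
    yields an empty list, mirroring the empty-table branch in the M code."""
    records = json.loads(body).get("value", [])
    return [
        {renames.get(col, col): rec.get(col) for col in columns}
        for rec in records
    ]

rows = flatten(
    sample_body,
    ["title", "ticketnumber", "_customerid_value" + FORMATTED],
    {"_customerid_value" + FORMATTED: "Customer Name"},
)
```

The key point is the same as in the M version: the display name lives in a separate annotation column, so it must be selected and renamed explicitly rather than read from the raw lookup value.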

With our updated PowerQuery, our result set should look similar to the below:

6

Nice, presentable, filtered and exactly what we need to start building our PowerBI report! 🙂

Conclusions or Wot I Think

Whilst I’ll be the first to admit the above example is rather code-heavy and would require significant tinkering to suit specific scenarios, I would argue that the approach demonstrated adheres most closely to some of the common rules when it comes to querying data for reports:

  • Only return the columns you need
  • Filter your data to reduce the number of results returned
  • Ensure that your report-side columns give presentable names to the underlying database column values

This approach is the one that adheres most closely to the above and, therefore, I feel that the extra effort is warranted; particularly when it means that those who are more familiar with CRM can stick to the tools they know best. As the default Dynamics 365 connector uses the Organization Data service, I would imagine that it will eventually be updated to use the new Web API instead. I hope that when this happens, we can begin to achieve all of the above via the PowerBI interface, as opposed to resorting to code.