The Voice of the Customer (VoC) solution, available as part of Dynamics 365 Customer Engagement (D365CE), works most effectively when you tightly integrate your surveys with other features or datasets stored within the application. That’s not to say that it must only ever be utilised in tandem as opposed to in isolation. If you have a requirement to quickly send out a survey to a small list of individuals (regardless of whether they are D365CE Contact records), VoC presents a natural choice if you are already paying for D365CE Online, as it is included as part of your subscription cost. Whereas services such as SurveyMonkey tend to charge extra to let you develop more complex, bespoke branded surveys, VoC offers all of this at no additional cost. Just don’t go buying licenses solely to use this specific piece of functionality. 🙂 Ultimately, the upside of all this is that VoC represents a solid solution in and of itself, but working with it in this manner is just the icing on the cake. When you start to take into account the myriad of integration touchpoints that VoC can instantly support, thanks to its resident status within D365CE, things start to get really exciting. With this in mind, you can look to implement solutions that:

  • Send out surveys automatically via e-mail.
  • Route survey responses to specific users, based on answers to certain questions or the Net Promoter Score (NPS) of the response.
  • Tailor a specific email to a customer that is sent out after a survey is completed.
  • Include survey data as part of an Azure Data Export Profile and leverage the full power of T-SQL querying to get the most out of your survey data (a sketch of such a query follows this list).
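To illustrate that last point, the query below is a minimal sketch of the kind of T-SQL you could run against a Data Export Profile database that includes the VoC entities, counting responses per survey. The table and column names (msdyn_survey, msdyn_surveyresponse and their fields) are assumptions based on the solution's entity logical names, so verify them against your own schema before use:

--Hypothetical example only: table/column names are assumptions based on the
--VoC entity logical names and may differ in your Data Export Profile database.

SELECT s.msdyn_name AS SurveyName,
       COUNT(sr.msdyn_surveyresponseid) AS ResponseCount
FROM dbo.msdyn_survey AS s
LEFT JOIN dbo.msdyn_surveyresponse AS sr
    ON sr.msdyn_survey = s.msdyn_surveyid
GROUP BY s.msdyn_name
ORDER BY ResponseCount DESC;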

It is in the first of these scenarios – sending out surveys via a Workflow – that you may find yourself encountering the error referenced in the title of this post. The error can occur when you look to take advantage of the survey snippet feature as part of an Email Template – in layman’s terms, the ability to send out a survey automatically and tag the response back to the record that triggered it. To implement this, you would look towards creating a Workflow similar to the below:

Then, either the desired email message is typed out or a template is specified that contains the survey snippet code, available from a published survey’s form:

All of this should work fine and dandy up until the user in question attempts to trigger the workflow, at which point we see the error message in question returned when viewing the workflow execution in the System Jobs area of the application:

Those who are familiar with errors like these in D365CE will instantly realise that this is a security role permission issue. In the above example, the user in question had not been granted the Survey User role, which is included as part of the solution and gives you the basic “set menu” of permissions required to work with VoC. A short while after rectifying this, we tried executing the workflow again and, much to our frustration, the error still occurred. Our next step was to comb through the list of custom entity privileges on the Security Role tab to see if there was a permission called Azure Deployment or similar. Much to our delight, we came across the following permission, which looked like a good candidate for further scrutiny:

When viewing this entity’s privileges within the Survey User role, the Read permission for this organization-owned custom entity was not set. After rectifying this and attempting to run the workflow again, we saw that it executed successfully 🙂

It seems a little odd that the standard user security role for the VoC module is missing this privilege for an arguably key piece of functionality. In our situation, we simply updated the Survey User security role to include this permission and cascaded this change accordingly across the various dev/test/prod environments for this deployment. You may also choose to add this privilege to a custom security role instead, thereby ensuring that it is properly transported during any solution updates. Regardless, with this issue resolved, a really nice piece of VoC functionality can be utilised to streamline the process of distributing surveys out to D365CE customer records.

Repeatable and time-consuming tasks are typically excellent candidates for automation. The range of business benefits that can be realised is perhaps too broad to list, but I think that the simple ability to free up an individual’s time to accomplish something better represents the ideal end goal of such activity. I have generally found that the best kind of automation retains a degree of human involvement on a very minimal basis – what I would term “keeping the brain involved” – rather than blindly assuming that the computer will always make the correct choice. A lot of the tools afforded to us when working with Microsoft cloud technologies appear to be very firmly rooted within this mindset, with frameworks such as PowerShell providing the means of carrying out sequences of tasks far quicker than a human could achieve, whilst also providing the mechanism to facilitate human involvement at key steps during any code execution cycle.
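As a trivial sketch of this principle, PowerShell’s built-in prompting support lets you place a human decision in the middle of an otherwise automated sequence. The resource group name below is illustrative only:

#A minimal example of "keeping the brain involved": pause for explicit human
#confirmation before carrying out a destructive or costly step.

$choice = Read-Host "About to remove the resource group 'MyWebsite'. Continue? (Y/N)"
if ($choice -eq 'Y')
{
    Remove-AzureRmResourceGroup -Name 'MyWebsite' -Force
}
else
{
    Write-Host 'Operation cancelled by user.'
}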

When creating a Web App via the Azure portal, you have the option of specifying the creation of an Application Insights resource, which will be automatically associated with your newly created Web App during the deployment. In most cases, you are going to want to take advantage of what this service can deliver to your application in terms of monitoring, usage patterns and error detection; the fact that I am such a major proponent of Application Insights should come as no surprise to regular readers of the blog. Should you find yourself having to deploy both of these resources in tandem via PowerShell, your first destination will likely be the New-AzureRmAppServicePlan & New-AzureRmWebApp cmdlets. For example, the following script, when executed, will create a Basic App Service Plan and Web App in the UK South region called MyWebsite, contained within a resource group of the same name:

New-AzureRmResourceGroup -Name 'MyWebsite' -Location 'UK South'
New-AzureRmAppServicePlan -Name 'MyWebsite' -ResourceGroupName 'MyWebsite' -Location 'UK South' -Tier 'Basic'
New-AzureRmWebApp -Name 'MyWebsite' -ResourceGroupName 'MyWebsite' -Location 'UK South' -AppServicePlan 'MyWebsite'

Next comes the creation of the Application Insights resource, which you would be forgiven for thinking could be created as part of one of the cmdlets above (à la the portal). Instead, we must resort to a generic cmdlet that can be tinkered with to create any resource on the Azure platform, per the instructions outlined in this article. Therefore, the following cmdlet needs to be executed next to create an Application Insights resource using identical parameters to those defined for the App Service Plan/Web App:

$appInsights = New-AzureRmResource -ResourceName 'MyWebsite' -ResourceGroupName 'MyWebsite' `
-Tag @{ applicationType = 'web'; applicationName = 'MyWebsite'} `
-ResourceType 'Microsoft.Insights/components' -Location 'UK South' `
-PropertyObject @{'Application_Type'='web'} -Force

It’s worth pointing out at this stage that you may get an error returned along the lines of No registered resource provider found for location… when executing the New-AzureRmResource cmdlet. This is because not all resource providers are automatically registered for use via PowerShell on the Azure platform. This can be resolved by executing the below cmdlets to create the appropriate registration on your subscription, which can take a few minutes to update on the platform:

#Check to see if the Microsoft.Insights provider has a RegistrationState value of Registered.
#If not, execute Register-AzureRmResourceProvider to get it added.
#Then, keep running the first cmdlet until the registration is confirmed.

Get-AzureRmResourceProvider | Where ProviderNamespace -eq 'microsoft.insights'
Register-AzureRmResourceProvider -ProviderNamespace Microsoft.Insights
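If you would prefer not to keep re-running the first cmdlet by hand, a simple polling loop along the following lines (a sketch only) will wait until the registration completes:

#Poll the Microsoft.Insights provider until its RegistrationState reads 'Registered'.

Register-AzureRmResourceProvider -ProviderNamespace Microsoft.Insights
while ((Get-AzureRmResourceProvider -ProviderNamespace Microsoft.Insights |
        Select-Object -First 1 -ExpandProperty RegistrationState) -ne 'Registered')
{
    Write-Host 'Waiting for registration to complete...'
    Start-Sleep -Seconds 10
}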

We now have a Web App and an Application Insights resource deployed onto Azure. But, at this juncture, the two resources exist in isolation, with no link between them. To fix this, the final step involves updating the newly created Web App with the Application Insights Instrumentation Key, which is generated once the resource is created. Because the earlier snippet stores all details of the newly created resource within the $appInsights variable, we can very straightforwardly access this property and add a new application setting via the following cmdlets:

$appSetting = @{'APPINSIGHTS_INSTRUMENTATIONKEY'= $appInsights.Properties.InstrumentationKey}
Set-AzureRmWebApp -Name 'MyWebsite' -ResourceGroupName 'MyWebsite' -AppSettings $appSetting

With this final step accomplished, the resources are now associated together and this should be reflected accordingly when viewed in the portal. For completeness, the entire script to achieve the above (also including the necessary login steps) can be seen below:

Set-ExecutionPolicy Unrestricted

Login-AzureRmAccount

New-AzureRmResourceGroup -Name 'MyWebsite' -Location 'UK South'
New-AzureRmAppServicePlan -Name 'MyWebsite' -ResourceGroupName 'MyWebsite' -Location 'UK South' -Tier 'Basic'
New-AzureRmWebApp -Name 'MyWebsite' -ResourceGroupName 'MyWebsite' -Location 'UK South' -AppServicePlan 'MyWebsite'

#Check to see if the Microsoft.Insights provider has a RegistrationState value of Registered.
#If not, execute Register-AzureRmResourceProvider to get it added.
#Then, keep running the first cmdlet until the registration is confirmed.

Get-AzureRmResourceProvider | Where ProviderNamespace -eq 'microsoft.insights'
Register-AzureRmResourceProvider -ProviderNamespace Microsoft.Insights

$appInsights = New-AzureRmResource -ResourceName 'MyWebsite' -ResourceGroupName 'MyWebsite' `
-Tag @{ applicationType = 'web'; applicationName = 'MyWebsite'} `
-ResourceType 'Microsoft.Insights/components' -Location 'UK South' `
-PropertyObject @{'Application_Type'='web'} -Force

$appSetting = @{'APPINSIGHTS_INSTRUMENTATIONKEY'= $appInsights.Properties.InstrumentationKey}
Set-AzureRmWebApp -Name 'MyWebsite' -ResourceGroupName 'MyWebsite' -AppSettings $appSetting

The above example is interesting in the sense that Application Insights does not have a set of dedicated cmdlets for creating, retrieving and updating the resource. Instead, we must rely on fairly generic cmdlets – and their associated complexity – to work with this resource type. It also seems somewhat counter-intuitive that there is no option as part of the New-AzureRmWebApp cmdlet to create an Application Insights resource alongside the Web App, as we have established the ability to carry this out via the Azure portal. Being able to specify this as an additional parameter (that would also perform the required steps involving the Instrumentation Key) would help to greatly simplify what must be a fairly common deployment scenario. As a service that receives regular updates, we can hope that Microsoft eventually supports one or both of these scenarios to ensure that any complexity towards deploying Application Insights resources in an automated release is greatly reduced.
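In the meantime, the repetition involved can be reduced by wrapping the steps above into a single reusable function. The below is a sketch only, using the same cmdlets already demonstrated; the function name and parameters are my own invention rather than anything provided by the AzureRM module:

#Hypothetical helper that deploys an App Service Plan, Web App and linked
#Application Insights resource in one call, sharing a single name and location.
function New-WebAppWithAppInsights
{
    param
    (
        [string]$Name,
        [string]$Location,
        [string]$Tier
    )

    New-AzureRmResourceGroup -Name $Name -Location $Location
    New-AzureRmAppServicePlan -Name $Name -ResourceGroupName $Name -Location $Location -Tier $Tier
    New-AzureRmWebApp -Name $Name -ResourceGroupName $Name -Location $Location -AppServicePlan $Name

    $appInsights = New-AzureRmResource -ResourceName $Name -ResourceGroupName $Name `
        -Tag @{ applicationType = 'web'; applicationName = $Name } `
        -ResourceType 'Microsoft.Insights/components' -Location $Location `
        -PropertyObject @{'Application_Type' = 'web'} -Force

    #Link the two resources together via the Instrumentation Key.
    $appSetting = @{'APPINSIGHTS_INSTRUMENTATIONKEY' = $appInsights.Properties.InstrumentationKey}
    Set-AzureRmWebApp -Name $Name -ResourceGroupName $Name -AppSettings $appSetting
}

#Example usage, mirroring the script above:
New-WebAppWithAppInsights -Name 'MyWebsite' -Location 'UK South' -Tier 'Basic'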

PowerApps is very much the in vogue topic at the moment, particularly if you are a Dynamics 365 Customer Engagement professional reconciling yourself with the new state of affairs. The previous sentence may sound negative but is very much contrary to my opinion on PowerApps, for which I am finding increased use cases each day when working to address certain business requirements. Whether you are looking to implement a barcode scanning application or something much more expansive, the set of tools that PowerApps provides from the outset means that those without a traditional development background can very easily achieve solutions that would previously have taken Visual Studio and a whole breadth of programming knowledge to realise.

On the topic of developers, one thing they may have to come to terms with when working with PowerApps is the inability to display dialog messages within an app when some kind of alert needs to be provided to the user. A typical scenario for this could be to request that the user completes all fields on a form before moving to the next screen, ensuring that an appropriate message is displayed reflecting this fact. Whilst there is currently no way of achieving this via a dedicated pop-up control or similar, this behaviour can be imitated using existing Label controls and a bit of function wizardry.

The best way of illustrating how to accomplish this is to view an example PowerApps app. Below are screenshots of a very simple two-screen app, with several Text Input controls, a Button and a Label control:

The required functionality of the app is to allow navigation to the ‘Thanks for your submission!’ screen only if the user has entered data into all of the Text Input controls on the first screen; if this condition is not met, then the user should be prevented from moving to the next screen and an error message should be displayed to advise the user accordingly.

The first step is to create the error message and required text on the first screen. This can be straightforwardly achieved via an additional Label control, with some text formatting and colour changes to make it noticeable to the user.

As with many other controls within PowerApps, you have the ability to toggle the visibility of Labels either on a constant or variable basis. The Visible property is your go-to destination for this and, as the next step, the value of this field should be updated to read ErrorVisible – the name of a variable storing the state of the control’s visibility (either true or false). If done correctly, as indicated in the screenshot below, you will notice that the Label control will immediately disappear from the screen on the right. This is because the default value of the newly specified variable is false.
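For the avoidance of doubt, the only thing that needs to be entered against the Label control’s Visible property is the variable name itself:

ErrorVisible

Because the variable has not been set anywhere at this stage, it defaults to false, hence the label disappearing.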

The next step involves the invocation of a somewhat complex PowerApps function to implement the required logic on the Button control. The function to use is reproduced below in its entirety:

If(Or(IsBlank(TextInput1.Text),IsBlank(TextInput2.Text),IsBlank(TextInput3.Text),IsBlank(TextInput4.Text)), Set(ErrorVisible, true), Navigate(Screen2,ScreenTransition.Fade))

To break the above down for those who are unfamiliar with working with functions:

  • IsBlank does exactly what it says on the tin, but it’s important to emphasise that to check whether a field contains a value or not, you have to specify the Text property of the control.
  • The Set function enables us to specify the values of variables on the fly, whether or not they have already been declared elsewhere in the app. No additional syntax is required, making it very straightforward to create runtime variables that store values throughout an entire app session.
  • Navigate specifies any other screen on the app to open. We can also select a transition effect to use which, in this case, is the Fade transition.
  • Finally, all of the above is wrapped within an If logic statement that prevents the user from moving to the next screen if any of the Text Input controls do not contain a value (courtesy of the Or statement).

The function needs to be entered within the OnSelect property of the button control, and your form should resemble the below if done correctly:

With everything configured, it’s time to give the app a test drive. 🙂 The sequence below provides a demonstration of how the app should work if all of the above steps are followed:

A final, potentially optional step (depending on what your app is doing) is to ensure that the error message is hidden as soon as the user navigates away from the first screen. This can be achieved by setting the ErrorVisible variable back to false as soon as the user navigates onto the second screen, as indicated in the screenshot below:
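For reference, the formula behind this is a single Set call, entered within the second screen’s OnVisible property:

Set(ErrorVisible, false)

This fires each time the screen is displayed, so the error message will be hidden again if the user later returns to the first screen.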

Conclusions or Wot I Think

PowerApps is still very much a product in its infancy, fitting neatly into the new wave of Microsoft products with regular release cycles, feature updates and ongoing development. It is, therefore, unrealistic to expect a full range of features satisfying all business scenarios to be available at this juncture. Having said that, one feature that could be added to greatly benefit data entry into forms is the ability to display pop-out dialog messages, depending on field requirement levels or other conditional logic. The key benefit of this would be removing the need to resort to complex functions, instead allowing error messages or alerts to be configured via the PowerApps GUI. A similar comparison can perhaps be made with Business Rules in Dynamics 365 Customer Engagement. Before their introduction, developers had to resort to JScript functions to display form-level alerts based on conditional logic. Not everyone is familiar with JScript, meaning a significant barrier was in place for those looking to implement arguably straightforward business logic. Now, with Business Rules, we have the ability to replicate a lot of the functionality that JScript allows for, speeding up the time it takes to implement solutions and providing a much clearer mechanism for implementing straightforward business logic. Hopefully, in the months ahead, we can start to see a similar type of feature introduced within PowerApps to aid in developing solutions like the one demonstrated in this post.

When considering whether or not to shift your existing SQL workloads to the single database offering within Azure SQL Database, one of the major pros is the breadth of capabilities the service can offer when compared with other vendors or with SQL Server on an Azure Virtual Machine. A list of these may include:

  • High feature parity with the latest on-premises SQL Server offering.
  • Built-in support for Enterprise product features, such as Transparent Data Encryption.
  • Security management features, such as firewalls and (optional) integration with Azure SQL Database Threat Detection for proactive monitoring.
  • Ability to quickly scale a database from a 2GB database with low CPU consumption to a mammoth 4TB database, with a significant pool of CPU/memory resources to match.

It is the last of these that makes Azure SQL Database a particularly good fit for web application deployments that have unpredictable user loads at the time of deployment or, as we have seen previously on the blog, when you want to deploy a LOB reporting database that houses Dynamics 365 Customer Engagement instance data. Administrators can very straightforwardly scale or downscale a database at any time within the portal or, if you are feeling particularly clever, you can look to implement automatic scaling based on Database Throughput Unit (DTU) consumption. This can help to keep your query execution times as speedy as possible.
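Scaling can also be scripted. A sketch using the AzureRM cmdlets is below; the resource group, server and database names are illustrative only:

#Scale an existing database up to Standard S2 ahead of an intensive workload...
Set-AzureRmSqlDatabase -ResourceGroupName 'MyResourceGroup' -ServerName 'myserver' `
    -DatabaseName 'MyDatabase' -Edition 'Standard' -RequestedServiceObjectiveName 'S2'

#...and back down to Basic again afterwards.
Set-AzureRmSqlDatabase -ResourceGroupName 'MyResourceGroup' -ServerName 'myserver' `
    -DatabaseName 'MyDatabase' -Edition 'Basic' -RequestedServiceObjectiveName 'Basic'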

Database scaling, I have found, is very straightforward to get your head around and works like a charm for the most part…except, of course, when you get rather cryptic error messages like the one demonstrated below:

I got this error recently when attempting to scale an S0 5GB database down to the Basic 2GB tier. To cut a long story short, I had temporarily scaled up the database to give me increased DTU capacity for a particularly intensive query and wanted to scale it back to its original pricing tier. You can perhaps understand my confusion about why this error was occurring. After further research and escalation to Microsoft, it turns out that the database file was still holding on to unused allocated disk space on the platform, thereby violating the size limit imposed by the lower pricing tier. To resolve the issue, there are some tasks that need to be performed on the database to get it into a “downscale-ready state”. These consist of a series of T-SQL scripts, which I would caution against using if the database is currently in use, due to potential performance impacts. If you have found yourself in the same boat as me and are happy to proceed, the steps involved are as follows:

  1. To begin with, the script below will execute the DBCC SHRINKFILE command against the database file, shrinking it towards the value specified in the @DesiredFileSize parameter. The script is written to perform the shrinking in “chunks”, based on the value of the @ShrinkChunkSize parameter, which may be useful in managing DTU consumption:
SET NOCOUNT ON

DECLARE @CurrentFileSize INT, @DesiredFileSize INT, @ShrinkChunkSize INT, @ActualSizeMB INT,
		@ErrorIndication INT, @dbFileID INT = 1, @LastSize INT, @SqlCMD NVARCHAR(MAX),
		@msg NVARCHAR(100)

/*Set these values for the current operation, size is in MB*/
SET @DesiredFileSize = 2000  /* filesize is in MB */
SET @ShrinkChunkSize = 50 /* chunk size is in MB */

SELECT @CurrentFileSize = size/128 FROM sysfiles WHERE fileid = @dbFileID

SELECT @ActualSizeMB = (SUM(total_pages) / 128) FROM sys.allocation_units

SET @msg = 'Current File Size: ' + CAST(@CurrentFileSize AS VARCHAR(10)) + 'MB'
RAISERROR(@msg,0,0) WITH NOWAIT

SET  @msg = 'Actual used Size: ' + CAST(@ActualSizeMB AS VARCHAR(10)) + 'MB'
RAISERROR(@msg,0,0) WITH NOWAIT

SET @msg = 'Desired File Size: ' + CAST(@DesiredFileSize AS VARCHAR(10)) + 'MB'
RAISERROR(@msg,0,0) WITH NOWAIT

SET @msg = 'Iteration shrink size: ' + CAST(@ShrinkChunkSize AS VARCHAR(10)) + 'MB'
RAISERROR(@msg,0,0) WITH NOWAIT

SET @ErrorIndication = CASE
							WHEN @DesiredFileSize > @CurrentFileSize THEN 1
							WHEN @ActualSizeMB > @DesiredFileSize THEN 2
							ELSE 0 END

IF @ErrorIndication = 1  
	RAISERROR('[Error] Desired size bigger than current size',0,0) WITH NOWAIT
IF @ErrorIndication = 2  
	RAISERROR('[Error] Actual size is bigger than the desired size',0,0) WITH NOWAIT
IF @ErrorIndication = 0 
	RAISERROR('Desired Size check - OK',0,0) WITH NOWAIT

SET @LastSize = @CurrentFileSize + 1

WHILE @CurrentFileSize > @DesiredFileSize /*check if we got the desired size*/ AND @LastSize>@CurrentFileSize /* check if there is progress*/ AND @ErrorIndication=0
BEGIN
	SET @msg = CAST(GETDATE() AS VARCHAR(100)) + ' - Iteration starting'
	RAISERROR(@msg,0,0) WITH NOWAIT
	SELECT @LastSize = size/128 FROM sysfiles WHERE fileid = @dbFileID
	SET @sqlCMD = 'DBCC SHRINKFILE('+ CAST(@dbFileID AS VARCHAR(7)) + ',' + CAST(@CurrentFileSize-@ShrinkChunkSize AS VARCHAR(7)) + ') WITH NO_INFOMSGS;'
	EXEC (@sqlCMD)
	SELECT @CurrentFileSize = size/128 FROM sysfiles WHERE fileid  =@dbFileID
	SET @msg = CAST(getdate() AS VARCHAR(100)) + ' - Iteration completed. current size is: ' + CAST(@CurrentFileSize AS VARCHAR(10))
	RAISERROR(@msg,0,0) WITH NOWAIT
END
PRINT 'Done' 
  2. With the database successfully shrunk, verify that the size of the database does not exceed your target @DesiredFileSize value by running the following queries:
SELECT * FROM sys.database_files

SELECT (SUM(reserved_page_count) * 8192) / 1024 / 1024 AS DbSizeInMB
FROM sys.dm_db_partition_stats
  3. Although, by this stage, the database file size should be under 2GB, the maximum size of the database is still set to match the pricing tier level. To fix this, execute the following script, substituting the name of your database where appropriate:
ALTER DATABASE MyDatabase MODIFY (MAXSIZE=2GB) 

You can confirm that this command has executed successfully by running the following query and reviewing the output:

SELECT CAST(DATABASEPROPERTYEX ('MyDatabase', 'MaxSizeInBytes') AS FLOAT)/1024.00/1024.00/1024.00 AS 'DB Size in GB'
  4. With the above commands executed, you are now in a position to scale down your database without issue. There are a few ways this can be done but, as you likely already have SQL Server Management Studio or similar open to run the above queries, you can modify the tier of your database via this handy script:
--Scaling down to Basic is easy, as there is only one Max Size/Service Level Objective
--Therefore, just specify Edition

ALTER DATABASE MyDatabase MODIFY (EDITION = 'Basic');

--For other tiers, specify the size of the DB.
--In this example, we are scaling down from Premium P1 1TB to Standard S2 250GB tier

ALTER DATABASE MyDatabase MODIFY (EDITION = 'Standard', MAXSIZE = 250 GB, SERVICE_OBJECTIVE = 'S2');

Although the script will likely execute immediately and indicate as such in any output, the actual scaling operation on the backend Azure platform can take some time to complete – usually about 5-10 minutes for lower sized databases.
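If you would rather track the operation’s progress than wait blindly, Azure SQL Database surfaces this via the sys.dm_operation_status dynamic management view, queryable from the master database on your logical server:

--Run against the master database on the logical server.
--Lists recent management operations (including tier changes) and their state.

SELECT major_resource_id AS DatabaseName, operation, state_desc, percent_complete, start_time, last_modify_time
FROM sys.dm_operation_status
WHERE resource_type_desc = 'Database'
ORDER BY start_time DESC;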

Whilst I was relieved that a workaround was available to get the database scaled down correctly, it would have been useful if the above error message was signposted better or if there was some kind of online support article detailing that this could be a potential issue when moving a database between various pricing/sizing tiers. Hopefully, by sharing the above steps, others who are in the same boat can very quickly diagnose and resolve the issue without hammering their credit cards with increased database usage charges in the process. 🙂

Back only a few years ago, when events such as a reality TV star becoming President of the USA were the stuff of fantasy fiction, Microsoft had a somewhat niche Customer Relationship Management system called Dynamics CRM. Indeed, the very name of this blog still attests to this fact. The “CRM” acronym provides a succinct mechanism for describing the purposes of the system without necessarily confusing people straight out of the gate. For the majority of its lifespan, “CRM” very accurately summarised what the core system was about – namely, a sales and case management system to help manage customer relations.

Along the way, Microsoft acquired a number of organisations and products that offered functionality that, when bolted onto Dynamics CRM, greatly enhanced the application as a whole. I have discussed a number of these previously on the blog, so I don’t propose to retread old ground. However, some notable exceptions from this original list include:

Suffice to say, by the start of 2016, the range of functionality available within Dynamics CRM was growing each month and – perhaps more crucially – there was no clear mechanism in place from a billing perspective for each new solution offered. Instead, if you had a Dynamics CRM Professional license, guess what? All of the above and more was available to you at no additional cost.

Taking this and other factors into account, the announcement in mid-2016 of the transition towards Microsoft Dynamics 365 can be seen as a welcome recognition of the new state of play and the ushering in of Dynamics CRM out of the cold to stand proud amongst the other, major Microsoft product offerings. Here’s a link to the original announcement:

Insights from the Engineering Leaders behind Microsoft Dynamics 365 and Microsoft AppSource

The thinking behind the move was completely understandable. Dynamics CRM could no longer be accurately termed as such, as the core application was almost unrecognisable from its 2011 version. Since then, there has been a plethora of additional announcements and changes to how Dynamics 365, in the context of CRM, is referred to in the wider offering. The road has been…rocky, to say the least. Whilst this can reasonably be expected with such a seismic shift, it nevertheless presents some challenges when talking about the application to end users and customers. To emphasise this fact, let’s take a look at some of the “bumps” in the road and my thoughts on why this is still an ongoing concern.

Dynamics 365 for Enterprise and Business

The above announcement did not go into great detail about how the specific Dynamics 365 offerings would be tailored. One of the advantages of the other products within the Office 365 range is the separation of business and enterprise plans. These typically allow for a reduced total cost of ownership for organisations under a particular size, with an Enterprise version of the same plan available with (almost) complete feature parity but no seat limits. With this in mind, it made sense that the initial detail in late 2016 confirmed the introduction of business and enterprise Dynamics 365 plans. As part of this, the CRM aspect of the offering would have sat firmly within the Enterprise plan, with – you guessed it – Enterprise pricing to match. The following article from Encore at the time provides a nice breakdown of each offering and the envisioned target audience for each. Thus, we saw the introduction of Dynamics 365 for Enterprise as a replacement term for Dynamics CRM.

Perhaps understandably, when many businesses – typically used to paying £40-£50 per seat for their equivalent Dynamics CRM licenses – discovered that they would have to move to Enterprise plans, with pricing significantly in excess of what they were paying, more than a few heads were turned. Microsoft Partners also raised a number of concerns with the strategy, which is why it was announced that the Business edition and Enterprise edition labels were being dropped. Microsoft stated that they would:

…focus on enabling any organization to choose from different price points for each line of business application, based on the level of capabilities and capacity they need to meet their specific needs. For example, in Spring 2018, Dynamics 365 for Sales will offer additional price point(s) with different level(s) of functionality.

The expressed desire to enable organisations to “choose” what they want goes back to what I mentioned at the start of this post – providing a billing/pricing mechanism that supports the modular nature of the late Dynamics CRM product. Its introduction as a concept at this stage comes a little late in the whole conversation regarding Dynamics 365 and represents an important turning point in defining the vision for the product. Whether this came about through feedback from partners/customers or an internal realisation, I’m not sure. But its arrival represents a maturing in thinking concerning the wider Dynamics 365 offering.

Dynamics 365 for Customer Engagement

Following the retirement of the Business/Enterprise monikers and, in a clear attempt to simplify and highlight the CRM aspect of Dynamics 365, the term Customer Engagement started to pop up across various online support and informational posts. I cannot seem to locate a specific announcement concerning the introduction of this wording, but its genesis appears to be early or mid-2017. The problem is that I don’t think there is 100% certainty yet on how the exact phrasing of this terminology should be used.

The best way to emphasise the inconsistency is to look to some examples. The first is derived from the name of several courses currently available on the Dynamics Learning Portal, published this year:

Now, take a look at the title of the following Microsoft Docs article on impending changes to the application:

Notice it yet? There seems to be some confusion about whether for should be used when referring to the Customer Engagement product. Whilst the majority of articles I have found online seem to suggest that Dynamics 365 Customer Engagement is the correct option, again, I cannot point to a definitive source that states without question the correct phraseology that should be used. Regardless, we can see here the birth of the Customer Engagement naming convention, which I would argue is a positive step forward.

The Present Day: Customer Engagement and its 1:N Relationships

Rather conveniently, Customer Engagement takes us straight through to the present. Today, when you go to the pricing website for Dynamics 365, the following handy chart is presented that helps to simplify all of the various options available across the Dynamics 365 range:

This also indirectly confirms a few things for us:

  • Microsoft Dynamics 365 Customer Engagement and not Microsoft Dynamics 365 for Customer Engagement looks to be the approved terminology when referring to what was previously Dynamics CRM.
  • Microsoft Dynamics 365 is the overarching name for all modules that form a part of the overall offering. It has, by the looks of things, replaced the original Dynamics 365 for Enterprise designation.
  • The business offering – now known as Dynamics 365 Business Central – is in effect a completely separate offering from Microsoft Dynamics 365.

When rolled together, all of this goes a long way towards providing the guidance needed to correctly refer to the whole, or constituent parts, of the entire Dynamics 365 offering.

With that being said, can it be reliably said that the naming “crisis” has ended?

My key concern through all of this is a confusing and conflicting message being communicated to customers interested in adopting the system, with the potential end result of driving them away to competitor systems. This appears to have been the case for the 1-2 years following the original Dynamics 365 announcement, and a large part of it can perhaps be explained by the insane acquisition drive in 2015/16. Now, with everything appearing to slot together nicely and the pricing platform in place to support each Dynamics 365 “module” and the overall offering, I would hope that any further change in this area is minimal. As highlighted above, though, there is still some confusion about the correct replacement terminology for Dynamics CRM – is it Dynamics 365 Customer Engagement or Dynamics 365 for Customer Engagement? Answers on a postcard, please!

Another factor to consider amongst all of this is that naming will constantly be an issue should Microsoft go through another cycle of acquisitions focused towards enhancing the Dynamics 365 offering. Microsoft’s recent acquisition plays appear to be more focused towards providing cost optimization services for Azure and other cloud-based tools, so it can be argued that a period of calm can be expected when it comes to incorporating acquired ISV solutions into the Dynamics 365 product range.

I’d be interested to hear from others on this subject, so please leave a comment below if you have your own thoughts on the interesting journey from Dynamics CRM to Dynamics 365 Customer Engagement.