Very much like a stopped clock telling the correct time twice a day, you can guarantee there will be two Dynamics 365 Customer Engagement releases each year. The first such occasion this year has come around quickly, with Microsoft setting out its stall for the Spring 2018 release earlier this week. The headline messages around this release centre on providing reassurance that the application is GDPR ready and on emphasising the maturity of PowerApps and Microsoft Flow, both as products in their own right and in conjunction with Dynamics 365. I’ve been studying the release notes in greater detail and, as part of this week’s blog post, I wanted to delve underneath the headlines and highlight some of the less touted, but potentially most impactful, new features that I am most looking forward to.

Answer Tags for Voice of the Customer Surveys

I made a commitment earlier this year to utilise the Voice of the Customer solution more. When used correctly, and if you are already heavily invested in Dynamics 365, the solution can present a straightforward and cost-effective way of starting to understand what customers are thinking, both with respect to specific experiences they have with your business and towards the organisation overall. One new feature to be introduced with Voice of the Customer, which I am looking forward to getting my hands on, is the ability to use Answer Tags to dynamically structure any subsequent questions within the survey. A good example of how this works in practice can be seen below, as shown in the release notes:

The key driver behind the automation of customer feedback tools should be to ensure that customers receive tailored and relevant surveys relating to services they have received, whilst also taking away any administrative headache when distributing and collating feedback answers. The feature above helps to solidify the benefits that Voice of the Customer can deliver when utilised in tandem with Dynamics 365 Customer Engagement, as well as allowing for more powerful and broadly applicable surveys to be structured at design time.

The rise of the Unified Interface

The overhaul of the entire Dynamics 365 Customer Engagement interface has been much promised and touted over the past year. With this release, it becomes a reality. Pretty much every key application module – Customer Service, Sales, Field Service & Project Service Automation – has been updated to utilise the new Unified Interface. The following applications/solutions will also be Unified Interface ready as part of the release:

  • Dynamics 365 App for Outlook
  • LinkedIn Sales Navigator
  • Gamification

The Unified Interface is very much an offshoot of the Interactive Service Hub, which it now fully replaces as part of this release (Interactive Service Hub users should read this article carefully, as there are some important points to consider if you plan to upgrade in the near future). I saw the new Unified Interface in action when attending the CRMUG meeting in Reading last year, and its introduction represents one of the ways Microsoft is investing heavily in the Dynamics 365 product moving forward. Its key benefits in comparison to the current experience can be summarised as follows:

  • Consistent end-user experience when accessing the application from desktop, mobile or tablet operating systems.
  • A fully responsive layout that adjusts to your specific device to provide the optimal experience.
  • Better utilisation of empty space across entity views, forms etc.

With this release, administrators and developers need to start actively considering the impact the Unified Interface has on their systems and plan accordingly. Whilst I imagine there will be some pain involved as part of this, the end result – a much crisper and more effective end-user interface – is worth the trade-off.

PowerShell Management for PowerApps

Up until now, your options for automating administrative tasks in PowerApps were limited. This issue was addressed to a certain extent for Dynamics 365 Customer Engagement Online very recently, via the introduction of PowerShell modules to facilitate organisation backups, instance resets and administrative mode toggling. Capabilities like these can go a long way if you have implemented automated release management across your various environments, taking human error out of the equation and streamlining deployments.

PowerApps looks to be heading in a similar direction, as the Spring 2018 release will introduce a set of cmdlets covering the following:

  • Environments and environment permissions
  • PowerApps and app permissions
  • Flows and flow permissions
  • Export and import of resource packages across environments
  • PowerApps and Flow licenses report (of active users)

Whilst these are definitely more administrative than deployment focused, their introduction is no doubt a welcome step in the right direction.

Future of the Common Data Service

Microsoft released the Common Data Service (CDS) in late 2016, around the same time as Microsoft Flow and the Dynamics CRM rebrand. The premise was simple and admirable: a common framework for you to develop the data you need for your business, one that is instantly re-usable across multiple applications. My chief concern when this was first announced was where it left the traditional customisation experience for Dynamics CRM/365 Customer Engagement, commonly referred to as xRM. Having to countenance potential redevelopment of “legacy” xRM systems, just to make them compatible with the CDS, could have proved a costly and unnecessary exercise; this is perhaps best summed up by the old saying, “If it ain’t broke, don’t fix it!”.

There seems to have been a recognition of this dilemma as part of this release, with the following announcement regarding the Common Data Service and PowerApps specifically:

This release also includes major advancements to the Common Data Service for Apps (the data platform that comes with PowerApps) and client UX creation tools. These new capabilities are backward-compatible with the Dynamics 365 platform (frequently called the xRM platform), which means that Dynamics 365 customizers and partners can use already-acquired skills to create apps with PowerApps.

What I think this means, in simple terms, is that the customisation experience between Dynamics 365 Customer Engagement and PowerApps will, in time, become virtually indistinguishable. And this is great for a number of reasons – it removes any excuse that individuals or organisations may raise for not exploring PowerApps further, gives us the ability to quickly develop our own custom mobile applications for our particular Dynamics 365 solution and provides an easy framework for unifying business data across multiple applications. This very much parallels the experience that Power BI offers traditional Excel users – namely, a familiar toolbox that can be leveraged to quickly deploy solutions with reduced technical debt. As with a lot of these announcements, we’re not going to know exactly how things operate until they are in our hands, but the immediate upshot appears to be that there is no significant new learning curve involved in adopting the CDS.

If you are looking for further detail regarding this change, then the ever so excellent Jukka Niiranen has published a blog post which breaks it all down far better than I ever could 🙂

Yes, XRM Is The New Common Data Service

Email Notifications for Microsoft Flow Failures

Similar to Voice of the Customer, I also promised myself to use Microsoft Flow more this year. After some uneventful early testing, the tool has become (for me) an indispensable means of achieving integration requirements that would traditionally require custom code and a dedicated server environment to execute. Microsoft Flow gets some much-deserved love and attention as part of this release, and the new feature which I think will be of the biggest help is email notifications for flow failures. The announced feature details are as follows:

Enable email notifications to detect flow failures. To enable this feature, go to the Flow details page, and then, on the contextual menu (…), subscribe to receiving emails about flow failures. These useful email notifications provide:

  • Information about why your flow failed.

  • Meaningful remediation steps.

  • Additional resources to help you build robust flows that never fail.

There’s so much more about this release that you could talk for days about…

…but I would be unsure whether anyone would still be listening by the end! You can dive into the detail behind each of the above highlights and what else to expect in the next release by downloading the release notes yourself. Let me know in the comments below what you are looking forward to the most as part of the next release.

This is an accompanying blog post to my YouTube video Dynamics 365 Customer Engagement Deep Dive: Creating a Basic Custom Workflow Assembly. The video is part of my tutorial series on how to accomplish developer focused tasks within Dynamics 365 Customer Engagement. You can watch the video in full below:

Below you will find links to access some of the resources discussed as part of the video and to further reading topics:

PowerPoint Presentation (click here to download)

Full Code Sample

using System;
using System.Activities;

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Workflow;
using Microsoft.Xrm.Sdk.Query;

namespace D365.SampleCWA
{
    public class CWA_CopyQuote : CodeActivity
    {
        protected override void Execute(CodeActivityContext context)
        {
            // Obtain the workflow execution context, organisation service and tracing service
            IWorkflowContext workflowContext = context.GetExtension<IWorkflowContext>();

            IOrganizationServiceFactory serviceFactory = context.GetExtension<IOrganizationServiceFactory>();
            IOrganizationService service = serviceFactory.CreateOrganizationService(workflowContext.UserId);

            ITracingService tracing = context.GetExtension<ITracingService>();

            tracing.Trace("Tracing implemented successfully!");

            // The record the workflow is running against is exposed as the primary entity
            Guid quoteID = workflowContext.PrimaryEntityId;

            // Retrieve the source Quote record, with only the columns needed for the copy
            Entity quote = service.Retrieve("quote", quoteID, new ColumnSet("freightamount", "discountamount", "discountpercentage", "name", "pricelevelid", "customerid", "description"));

            // Clear the unique identifier so that Create generates a brand new record
            quote.Id = Guid.Empty;
            quote.Attributes.Remove("quoteid");

            quote.Attributes["name"] = "Copy of " + quote.GetAttributeValue<string>("name");
            Guid newQuoteID = service.Create(quote);

            // Retrieve the Quote Product and Note records associated with the source Quote
            EntityCollection quoteProducts = RetrieveRelatedQuoteProducts(service, quoteID);
            EntityCollection notes = RetrieveRelatedNotes(service, quoteID);

            tracing.Trace(quoteProducts.TotalRecordCount.ToString() + " Quote Product records returned.");

            // Create a copy of each Quote Product, re-parented to the new Quote
            foreach (Entity product in quoteProducts.Entities)
            {
                product.Id = Guid.Empty;
                product.Attributes.Remove("quotedetailid");
                product.Attributes["quoteid"] = new EntityReference("quote", newQuoteID);
                service.Create(product);
            }
            // Create a copy of each Note, re-parented to the new Quote
            foreach (Entity note in notes.Entities)
            {
                note.Id = Guid.Empty;
                note.Attributes.Remove("annotationid");
                note.Attributes["objectid"] = new EntityReference("quote", newQuoteID);
                service.Create(note);
            }
        }

        [Input("Quote Record to Copy")]
        [ReferenceTarget("quote")]
        public InArgument<EntityReference> QuoteReference { get; set; }

        private static EntityCollection RetrieveRelatedQuoteProducts(IOrganizationService service, Guid quoteID)
        {
            QueryExpression query = new QueryExpression("quotedetail");
            query.ColumnSet.AllColumns = true;
            query.Criteria.AddCondition("quoteid", ConditionOperator.Equal, quoteID);
            query.PageInfo.ReturnTotalRecordCount = true;

            return service.RetrieveMultiple(query);
        }

        private static EntityCollection RetrieveRelatedNotes(IOrganizationService service, Guid objectID)
        {
            QueryExpression query = new QueryExpression("annotation");
            query.ColumnSet.AllColumns = true;
            query.Criteria.AddCondition("objectid", ConditionOperator.Equal, objectID);
            query.PageInfo.ReturnTotalRecordCount = true;

            return service.RetrieveMultiple(query);
        }
    }
}
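As a side note, the QuoteReference input argument declared above is not actually consumed by the Execute method, which instead keys off the workflow's primary entity. If you would prefer the calling workflow to pass the Quote record in explicitly (for example, so the activity could be triggered from a different entity), a minimal sketch of the alternative retrieval logic inside Execute is shown below – treat it as illustrative only, as it was not covered in the video:

// Hypothetical alternative inside Execute: read the Quote to copy from the
// declared input argument rather than from the workflow's primary entity.
EntityReference quoteRef = QuoteReference.Get(context);
Guid quoteID = quoteRef.Id;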

Download/Resource Links

Visual Studio 2017 Community Edition

Set up a free 30-day trial of Dynamics 365 Customer Engagement

C# Guide (Microsoft Docs)

Source Code Management Solutions

Further Reading

Microsoft Docs – Create a custom workflow activity

MSDN – Register and use a custom workflow activity assembly

MSDN – Update a custom workflow activity using assembly versioning (This topic wasn’t covered as part of the video, but I would recommend reading this article if you are developing an ISV solution involving custom workflow assemblies)

MSDN – Sample: Create a custom workflow activity

You can also check out some of my previous blog posts relating to Workflows:

  • Implementing Tracing in your CRM Plug-ins – We saw as part of the video how to utilise tracing, but this post goes into more detail about the subject, as well as providing instructions on how to enable the feature within the application (in case you are wondering why nothing is being written to the trace log 🙂 ). All code examples are for Plug-ins, but they can easily be repurposed to work with a custom workflow assembly instead.
  • Obtaining the User who executed a Workflow in Dynamics 365 for Customer Engagement (C# Workflow Activity) – You may have a requirement to trigger certain actions within the application, based on the user who executed a Workflow. This post walks through how to achieve this utilising a custom workflow assembly (a short sketch of the key properties involved can be found just after this list).
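For quick reference, the crux of that second post comes down to two properties exposed by the workflow execution context. A minimal sketch, assuming the same CodeActivity structure as the sample above, would be:

// Inside a CodeActivity Execute method: distinguish the user the workflow runs as
// from the user whose action triggered it.
IWorkflowContext workflowContext = context.GetExtension<IWorkflowContext>();
Guid runningUserId = workflowContext.UserId;              // user the workflow executes under
Guid initiatingUserId = workflowContext.InitiatingUserId; // user who initiated the workflow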

If you have found the above video useful and are itching to learn more about Dynamics 365 Customer Engagement development, then be sure to take a look at my previous videos/blog posts using the links below:

Have a question or an issue when working through the code samples? Be sure to leave a comment below or contact me directly, and I will do my best to help. Thanks for reading and watching!

I started working with Azure Application Insights towards the end of last year, during a major project for an enterprise organisation. At the time, our principal requirement was to ensure that we could effectively track the usage patterns of a website – the pages visited, the amount of time spent on each page etc. – all information types that, traditionally, you may turn to tools such as Google Analytics to generate. What swung the decision in favour of Application Insights was the added value that the service provides from a developer standpoint, particularly if you are already heavily invested in .NET or Azure. The ability to view detailed information regarding errors on a web page, to automatically export this information out into Team Foundation Server/Team Services for further investigation and the very extensible and customisable manner in which data can be accessed or exported were all major benefits for our particular scenario. It’s a decision I have not regretted and, even today, the product is still finding very neat and clever ways of surprising me 🙂

One of the other features that makes Application Insights a more extensible solution than Google Analytics is the ability to add to the information that is captured from your website page requests or visitors. These custom properties will then be viewable within Application Insights Analytics, as well as being exportable (as we will discover shortly). For example, if you are interested in determining the previous URL that a web user was on before visiting a page within a C# MVC website, create a new class file within your project called ReferrerTelemetryInitializer and add the following sample code:

using Microsoft.ApplicationInsights.Extensibility;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;

namespace MyNamespace.MyProject.MyProjectName.MVC.Utilities
{
    public class ReferrerTelemetryInitializer : ITelemetryInitializer
    {
        public void Initialize(ITelemetry telemetry)
        {
            // Only stamp the referrer onto request telemetry items
            if (telemetry is RequestTelemetry)
            {
                string referrer = HttpContext.Current?.Request?.UrlReferrer?.ToString();
                telemetry.Context.Properties["Referrer"] = referrer;
            }
        }
    }
}
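One additional step to be aware of: the telemetry initializer needs to be registered with the Application Insights configuration before it will be invoked. The snippet below is a minimal sketch for a classic ASP.NET MVC project, assuming registration within Global.asax.cs at application start (the class and namespace names are illustrative); adding an equivalent entry to ApplicationInsights.config achieves the same result:

using System.Web;
using Microsoft.ApplicationInsights.Extensibility;
using MyNamespace.MyProject.MyProjectName.MVC.Utilities;

namespace MyNamespace.MyProject.MyProjectName.MVC
{
    public class MvcApplication : HttpApplication
    {
        protected void Application_Start()
        {
            // Register the custom telemetry initializer so that every telemetry item,
            // including RequestTelemetry, passes through Initialize()
            TelemetryConfiguration.Active.TelemetryInitializers.Add(new ReferrerTelemetryInitializer());
        }
    }
}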

Be sure to add all required assembly references and rename your namespace value accordingly before deploying out. After this is done, the following query can then be executed within Application Insights Analytics to access the Referrer information:

requests
| extend referrer = tostring(customDimensions.Referrer)
| project id, session_Id, user_Id, referrer

The extend portion of the query is required because Application Insights groups all custom property fields together into a key/value array. If you are working with other custom property fields, then you would just replace the value after the customDimensions. portion of the query with the property name declared within your code.

If you have very short-term and simplistic requirements for monitoring your website performance data, then the above solution will likely be sufficient for your purposes. But maybe you require data to be stored beyond the default 90-day retention limit or you have a requirement to incorporate the information as part of an existing database or reporting application. This is where the Continuous Export feature becomes really handy, by allowing you to continually stream all data that passes through the service to an Azure Storage Account. From there, you can look at configuring a Stream Analytics Job to parse the data and pass it through to your required location. Microsoft very handily provides two guides on how to get this data into a SQL database and also into a Stream Analytics Dashboard within Power BI.

What I like most about Stream Analytics is that it utilises a SQL-like language when interacting with data. For recovering T-SQL addicts like myself, this can help overcome a major learning barrier when first getting to grips with the service. I would still urge some caution, however, and recommend you have the online documentation for the Stream Analytics Query Language to hand, as there are a few things that may catch you out. A good example of this is that, whilst the language supports data type conversions via CAST operations, exactly like T-SQL, you are restricted to a limited list of data types as part of the conversion.

SQL developers may also encounter a major barrier when working with custom property data derived from Application Insights. Given the nature of how these are stored, there is specific Stream Analytics syntax that has to be utilised to access individual property values. We’ll take a closer look now to see just how this is done, continuing our earlier example utilising the Referrer field.

First of all, make sure you have configured an Input for your Request data from within the Stream Analytics job. The settings should look similar to the image below if done correctly:

The full Path pattern comprises the name of your Application Insights resource, followed by a GUID value (which can be found in the name of the Storage Account container utilised for Continuous Export) and then the pattern that determines the folder name and the date/time variables. It should look similar to this:

myapplicationinsightsaccount_16ef6b1f59944166bc37198a41c3fcf1/Requests/{date}/{time}

With your input configured correctly, you can now navigate to the Query tab and utilise the following query to return the same information that we accessed above within Application Insights:

SELECT
    requestflat.ArrayValue.id AS RequestID,
    r.context.session.id AS SessionID,
    r.context.[user].anonId AS AnonymousID,
    GetRecordPropertyValue(GetArrayElement(r.[context].[custom].[dimensions], 5), 'Referrer') AS ReferralURL
INTO [RequestsOutput]
FROM [Requests] AS r
CROSS APPLY GetElements(r.[request]) AS requestflat

Now, it’s worth noting that the GetArrayElement function relies on a position value to return the data correctly, and that this index is zero-based. For the example provided above, the Referrer key/value pair sat at index 5 within the exported array. It may therefore be necessary to inspect the values within the context.custom.dimensions object to determine the position of your required field; in the above query, you could add the field r.context.custom.dimensions to your SELECT clause to facilitate any further interrogation.

Application Insights, on its own, arguably provides feature-rich and expansive capabilities as a standalone web analytics tool. It is certainly a lot more “open” with respect to how you can access the data – a welcome design decision that is at the heart of a lot of Microsoft products these days. When you start to combine Application Insights with Stream Analytics, a whole world of additional reporting capabilities and long-term analysis can be unlocked for your web applications. Stream Analytics is also incredibly helpful for those who come from a more grounded background working with databases: using the tool, you can quickly interact with and convert data into the required format using a familiar language. It would be good to see, as part of some of the examples available online, tutorials on how to straightforwardly access custom dimensions properties, so as to make this task simpler to achieve. But this in and of itself does not detract from how impressive I think both tools are – both as standalone products and combined together.

This week’s blog post is sponsored by ActiveCrypt Software.

Encryption appears to be a topic of near-constant discussion at the moment, spearheaded primarily by the impending deadline of the General Data Protection Regulation (GDPR). This is, in essence, a new set of data protection rules that will apply to all organisations operating within the European Economic Area (EEA). A key aspect of the regulation concerns implementing appropriate technical controls over sensitive data categories, to mitigate any damage resulting from a data breach. Now, the key thing to highlight here is the “proportionality” aspect; i.e. any technical controls implemented should be reasonably expected, based on the size of the organisation in question and the nature of its data processing/controlling activity. You should, therefore, be carefully evaluating your organisation to identify whether a lack of encryption could result in damage to a data subject.

I’ve previously looked at database encryption in the context of Dynamics 365 Customer Engagement. What is nice about the application, and nearly all of Microsoft’s Software as a Service (SaaS) products at the moment, is that GDPR is very much at the centre of each individual offering. I have been genuinely impressed to see the level of effort Microsoft has been devoting to GDPR and to ensuring their SaaS product lines are compliant with the regulations – often without charging customers an arm and a leg in the process. The same can perhaps not be said for the on-premise equivalents of a particular SaaS product. This is, to be fair, expected – Microsoft has been incredibly vocal about adopting a “cloud first” strategy in all things. But for organisations who do find themselves having to support on-premise applications or database systems, the journey towards implementing the required technical solutions for encryption could be rocky.

Case in point: SQL Server has long provided the capability to implement Transparent Data Encryption (TDE), which satisfies the requirement for encryption at rest without the need to redevelop applications from the ground up. Setting up TDE can, however, be an onerous process (more on this in a second) and requires manual scripting. The following script outlines all the steps involved:

--First, a Master Key should be created on the Server instance

USE master;  
GO  
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'mymasterkey';  
GO

--Next, a Certificate for the Server should be created.

CREATE CERTIFICATE MyCert WITH SUBJECT = 'DEK Certificate for testing purposes';  
GO

--This then allows for a Database Encryption Key to be created for encrypting a database. This needs to be created for
--EVERY database that requires encryption

USE EncryptionTest;  
GO  
CREATE DATABASE ENCRYPTION KEY  
WITH ALGORITHM = AES_256  
ENCRYPTION BY SERVER CERTIFICATE MyCert;  
GO  

--Once created, Encryption can then be enabled/disabled using the snippets below

ALTER DATABASE EncryptionTest
SET ENCRYPTION ON;
GO

ALTER DATABASE EncryptionTest
SET ENCRYPTION OFF;
GO

--The Server Certificate should be backed up for disaster recovery scenarios or to enable databases to be restored to
--other SQL Server instances. First, backup the certificate with an encrypted private key...

USE master;
GO
BACKUP CERTIFICATE MyCert TO FILE = 'C:\MyCert.cer'  
    WITH PRIVATE KEY ( FILE = 'C:\MyCert.pvk',
    ENCRYPTION BY PASSWORD = 'mypassword');
GO  

--Once saved, execute the following code on the target instance to restore the certificate
--(a database master key must already exist in the master database on the target instance)...

CREATE CERTIFICATE MyCert FROM FILE = 'C:\MyCert.cer'
    WITH PRIVATE KEY (FILE = 'C:\MyCert.pvk', DECRYPTION BY PASSWORD = 'mypassword');

Whilst TDE is a neat solution, it does have some issues:

  • When working with TDE, it’s important to keep in mind any potential disaster recovery scenario by backing up the server certificate to a separate physical location. The above script provides the necessary snippet to accomplish this, so it is imperative that this is done for every certificate you plan to work with.
  • All required configuration steps have to be accomplished via scripting and the feature is not enabled by default, unlike with Azure SQL Database. Depending on your level of expertise when working with SQL Server, you may have to leverage assistance from other sources to get up and running with the feature.
  • Perhaps the biggest barrier to adopting TDE is the version restrictions: it is only available in the Developer and Enterprise editions of SQL Server. As the name suggests, the Developer edition is licensed strictly for non-production environments, and the Enterprise edition carries a staggering cost, licensed on the number of cores in the target server. To put this into better context, I was recently quoted a whopping £68,000 through Microsoft Volume Licensing! For most organisations, this can result in an incredibly high cost of ownership just to satisfy a single requirement.

Fortunately, for those who are wanting to implement database encryption via an accessible interface, there are a number of products available on the market to assist. The best one I have come across is DbDefence from ActiveCrypt, which offers a simple to use and efficient means of configuring encryption for your databases. In fact, depending on your database size, you can more than likely have your databases encrypted in less than 5 minutes 🙂 Let’s take a closer look at how straightforward the software is to use by encrypting a database from scratch:

  1. After downloading the installation package, you will need to run it on the server where your SQL Server instance resides. During the installation process, the Full installation option can be selected and you will also need to specify the SQL Server instance that you wish to utilise with the software:

  2. After the installation completes successfully, launch the application and then connect to your target SQL Server instance. Next, select the database that you want to encrypt. You should see a window similar to the below if done correctly:
  3. At this point, you could choose to accept the default Encryption and Protection options and proceed to the next step. However, I would recommend changing the options as follows:
    • Modify the AES Encryption Options value to 256-bit. Whilst the risk of a successful brute-force attack against either 128-bit or 256-bit AES is effectively zero, 256-bit supports longer keys and is, therefore, more secure.
    • In most cases, you just need to ensure data is encrypted at rest, without imposing any additional access restrictions beyond this. In these situations, I would recommend setting the required level of protection to Only Encryption. Maximum Transparency. This negates the need for any additional configuration after encryption to ensure your client applications still work successfully.

  4. To encrypt the database, a password/key is required. You should always utilise a random password that contains upper/lower case letters, numbers and symbols. I would also recommend having a separate password for each database you encrypt and ensuring that these are all stored separately and securely (as they may be required to decrypt the databases at a later date). The length of password to use will depend on the AES encryption mode; if you are using 256-bit, then an 18 character password is recommended.
  5. When you are ready to start the encryption process, press the Encrypt button and confirm the warning box that appears:

Give it a few minutes and you will then be able to see in the main window that your database has been encrypted successfully:

If you ever have the requirement to decrypt the database, then you can return to the application at any time, connect up to the database, enter the password and then press Decrypt:

  6. As a final step, you can then verify that your database files have been encrypted successfully by attempting to attach them to a separate SQL Server instance. You should get an error message similar to the below, indicating that the database has been encrypted successfully:

Conclusions or Wot I Think

The world of encryption can be a veritable nightmare for those approaching it for the first time, and GDPR can be blamed – but also, I would argue, welcomed – for raising the profile of the topic recently. As with a lot of things concerning GDPR, there is a real opportunity for organisations to get a handle on the personal data they work with every day and to implement the required processes and systems to ensure the right thing is being done when handling sensitive data. Database encryption is one weapon in your arsenal when it comes to satisfying a number of areas within GDPR; but, as we have seen, the total cost of ownership and technical expertise required to implement such a solution could – regrettably – force many to simply look the other way when it comes to securing their databases. DbDefence assists greatly in both these regards, by significantly reducing the cost and providing a simplified, easy-to-use interface to deploy database encryption within minutes. What’s great as well is that, as part of evaluating the software, I found the support team at ActiveCrypt incredibly responsive and helpful in dealing with the queries I had around the product. If you are looking for a cheaper, yet wholly effective, solution for implementing database encryption in SQL Server, then I would not hesitate to recommend DbDefence.