This is an accompanying blog post to my YouTube video Dynamics 365 Customer Engagement Deep Dive: Creating a Basic Custom Workflow Assembly. The video is part of my tutorial series on how to accomplish developer-focused tasks within Dynamics 365 Customer Engagement. You can watch the video in full below:

Below you will find links to access some of the resources discussed as part of the video and to further reading topics:

PowerPoint Presentation (click here to download)

Full Code Sample

using System;
using System.Activities;

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Workflow;
using Microsoft.Xrm.Sdk.Query;

namespace D365.SampleCWA
{
    public class CWA_CopyQuote : CodeActivity
    {
        protected override void Execute(CodeActivityContext context)
        {
            IWorkflowContext c = context.GetExtension<IWorkflowContext>();

            IOrganizationServiceFactory serviceFactory = context.GetExtension<IOrganizationServiceFactory>();
            IOrganizationService service = serviceFactory.CreateOrganizationService(c.UserId);

            ITracingService tracing = context.GetExtension<ITracingService>();

            tracing.Trace("Tracing implemented successfully!");

            Guid quoteID = c.PrimaryEntityId;

            Entity quote = service.Retrieve("quote", quoteID, new ColumnSet("freightamount", "discountamount", "discountpercentage", "name", "pricelevelid", "customerid", "description"));

            //Clear the unique identifier and rename the record so that a brand new Quote is created
            quote.Id = Guid.Empty;
            quote.Attributes["name"] = "Copy of " + quote.GetAttributeValue<string>("name");
            Guid newQuoteID = service.Create(quote);

            EntityCollection quoteProducts = RetrieveRelatedQuoteProducts(service, quoteID);
            EntityCollection notes = RetrieveRelatedNotes(service, quoteID);

            tracing.Trace(quoteProducts.TotalRecordCount.ToString() + " Quote Product records returned.");

            //Copy each related Quote Product across to the new Quote. The primary key attribute is removed
            //so that the platform generates a new unique identifier on create.
            foreach (Entity product in quoteProducts.Entities)
            {
                product.Id = Guid.Empty;
                product.Attributes.Remove("quotedetailid");
                product.Attributes["quoteid"] = new EntityReference("quote", newQuoteID);
                service.Create(product);
            }

            //Do the same for any Notes associated with the Quote
            foreach (Entity note in notes.Entities)
            {
                note.Id = Guid.Empty;
                note.Attributes.Remove("annotationid");
                note.Attributes["objectid"] = new EntityReference("quote", newQuoteID);
                service.Create(note);
            }
        }

        [Input("Quote Record to Copy")]
        public InArgument<EntityReference> QuoteReference { get; set; }

        private static EntityCollection RetrieveRelatedQuoteProducts(IOrganizationService service, Guid quoteID)
        {
            QueryExpression query = new QueryExpression("quotedetail");
            query.ColumnSet.AllColumns = true;
            query.Criteria.AddCondition("quoteid", ConditionOperator.Equal, quoteID);
            query.PageInfo.ReturnTotalRecordCount = true;

            return service.RetrieveMultiple(query);
        }

        private static EntityCollection RetrieveRelatedNotes(IOrganizationService service, Guid objectID)
        {
            QueryExpression query = new QueryExpression("annotation");
            query.ColumnSet.AllColumns = true;
            query.Criteria.AddCondition("objectid", ConditionOperator.Equal, objectID);
            query.PageInfo.ReturnTotalRecordCount = true;

            return service.RetrieveMultiple(query);
        }
    }
}
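One point worth noting about the sample is that the QuoteReference input argument is declared but never consumed; the activity instead resolves the Quote to copy from the workflow’s primary entity via IWorkflowContext.PrimaryEntityId. If you would prefer the Quote to be selectable when configuring the workflow step, the argument could be read at the start of the Execute method instead. The snippet below is a minimal sketch of this alternative (it is not covered in the video) and assumes the same class and argument names as the sample above:

//Resolve the Quote to copy from the workflow step's input argument, rather than the primary entity
EntityReference quoteRef = QuoteReference.Get(context);

if (quoteRef == null)
{
    throw new InvalidPluginExecutionException("A Quote record must be supplied to the workflow step.");
}

Guid quoteID = quoteRef.Id;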

Download/Resource Links

Visual Studio 2017 Community Edition

Set up a free 30-day trial of Dynamics 365 Customer Engagement

C# Guide (Microsoft Docs)

Source Code Management Solutions

Further Reading

Microsoft Docs – Create a custom workflow activity

MSDN – Register and use a custom workflow activity assembly

MSDN – Update a custom workflow activity using assembly versioning (This topic wasn’t covered as part of the video, but I would recommend reading this article if you are developing an ISV solution involving custom workflow assemblies)

MSDN – Sample: Create a custom workflow activity

You can also check out some of my previous blog posts relating to Workflows:

  • Implementing Tracing in your CRM Plug-ins – We saw as part of the video how to utilise tracing, but this post goes into more detail about the subject, as well as providing instructions on how to enable the feature within the application (in case you are wondering why nothing is being written to the trace log šŸ™‚ ). All code examples are for Plug-ins, but they can easily be repurposed to work with a custom workflow assembly instead.
  • Obtaining the User who executed a Workflow in Dynamics 365 for Customer Engagement (C# Workflow Activity) – You may have a requirement to trigger certain actions within the application, based on the user who executed a Workflow. This post walks through how to achieve this utilising a custom workflow assembly.

If you have found the above video useful and are itching to learn more about Dynamics 365 Customer Engagement development, then be sure to take a look at my previous videos/blog posts using the links below:

Have a question or an issue when working through the code samples? Be sure to leave a comment below or contact me directly, and I will do my best to help. Thanks for reading and watching!

I started working with Azure Application Insights towards the end of last year, during a major project for an enterprise organisation. At the time, our principal requirement was to ensure that we could effectively track the usage patterns of a website – the pages visited, the amount of time spent on each page etc. – all information types that, traditionally, you may turn to tools such as Google Analytics to generate. What swung the decision to go for Application Insights was the added value that the service provides from a developer standpoint, particularly if you are already heavily invested within .NET or Azure. The ability to view detailed information regarding errors on a web page, to automatically export this information out into Team Foundation Server/Team Services for further investigation, and the very extensible and customisable manner in which data can be accessed or exported were all major benefits for our particular scenario. It’s a decision I don’t think I will ever regret and, even today, the product is still finding very neat and clever ways of surprising me šŸ™‚

One of the other features that make Application Insights a more expandable solution compared with Google Analytics is the ability to extend the information that is captured from your website page requests or visitors. These additional properties will then be viewable within Application Insights Analytics, as well as being exportable (as we will discover shortly). For example, if you are interested in determining the previous URL that a web user was on before visiting a page within a C# MVC website, create a new class file within your project called ReferrerTelemetryInitializer and add the following sample code:

using Microsoft.ApplicationInsights.Extensibility;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;

namespace MyNamespace.MyProject.MyProjectName.MVC.Utilities
{
    public class ReferrerTelemetryInitializer : ITelemetryInitializer
    {
        public void Initialize(ITelemetry telemetry)
        {
            if (telemetry is RequestTelemetry)
            {
                //Capture the referring URL (where available) and store it as a custom property on the request
                string referrer = HttpContext.Current?.Request?.UrlReferrer?.ToString();
                telemetry.Context.Properties["Referrer"] = referrer;
            }
        }
    }
}
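A custom ITelemetryInitializer is not picked up automatically; it needs to be registered with the Application Insights configuration when the application starts, either declaratively within the ApplicationInsights.config file or programmatically. The snippet below is a minimal sketch of the programmatic approach for an ASP.NET MVC application, placed within Global.asax.cs; the namespace and class names are assumed to match the example above:

using Microsoft.ApplicationInsights.Extensibility;
using MyNamespace.MyProject.MyProjectName.MVC.Utilities;

namespace MyNamespace.MyProject.MyProjectName.MVC
{
    public class MvcApplication : System.Web.HttpApplication
    {
        protected void Application_Start()
        {
            //Register the custom telemetry initializer so that the Referrer property is stamped onto each request
            TelemetryConfiguration.Active.TelemetryInitializers.Add(new ReferrerTelemetryInitializer());

            //...any existing MVC start-up code (route registration, bundles etc.) remains here
        }
    }
}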

Be sure to add all required service references and rename your namespace value accordingly before deploying out. After this is done, the following query can then be executed within Application Insights Analytics to access the Referral information:

requests
| extend referrer = tostring(customDimensions.Referrer)
| project id, session_Id, user_Id, referrer

The extend portion of the query is required because Application Insights groups all custom property fields together into a key/value array. If you are working with other custom property fields, then you would just replace the value after the customDimensions. portion of the query with the property name declared within your code.

If you have very short-term and simplistic requirements for monitoring your website performance data, then the above solution will likely be sufficient for your purposes. But maybe you require data to be stored beyond the default 90-day retention limit or you have a requirement to incorporate the information as part of an existing database or reporting application. This is where the Continuous Export feature becomes really handy, by allowing you to continually stream all data that passes through the service to an Azure Storage Account. From there, you can look at configuring a Stream Analytics Job to parse the data and pass it through to your required location. Microsoft very handily provides two guides on how to get this data into a SQL database and also into a Stream Analytics Dashboard within Power BI.

What I like about Stream Analytics the most is that it utilises a SQL-like language when interacting with data. For recovering T-SQL addicts like myself, this can help overcome a major learning barrier when first getting to grips with the service. I would still urge some caution, however, and recommend you have the online documentation for the Stream Analytics Query Language to hand, as there are a few things that may catch you out. A good example of this is that whilst the language supports data type conversions via CAST operations, exactly like T-SQL, you are restricted to a limited list of data types as part of the conversion.

SQL developers may also encounter a major barrier when working with custom property data derived from Application Insights. Given the nature of how these are stored, there is specific Stream Analytics syntax that has to be utilised to access individual property values. We’ll take a closer look now to see just how this is done, continuing our earlier example utilising the Referrer field.

First of all, make sure you have configured an Input for your Request data from within Stream Analytics. The settings should look similar to the image below if done correctly:

The full Path pattern will be the name of your Application Insights resource, a GUID value that can be found on the name of the Storage Account container utilised for Continuous Export, and then the path pattern that determines the folder name and the variables for date/time. It should look similar to this:


With your input configured correctly, you can now navigate to the Query tab and utilise the following query to return the same information that we accessed above within Application Insights:

--Field paths below assume the standard Application Insights Continuous Export schema; adjust them to match your own export
SELECT
    requestflat.ArrayValue.id AS RequestID,
    r.context.session.id AS SessionID,
    r.context.[user].anonId AS AnonymousID,
    GetRecordPropertyValue(GetArrayElement(r.[context].[custom].[dimensions], 5), 'Referrer') AS ReferralURL
INTO [RequestsOutput]
FROM [Requests] AS r
CROSS APPLY GetElements(r.[request]) AS requestflat

Now, it’s worth noting that the GetArrayElement function relies on a position value within the array to return the data correctly. For the example provided above, the Referrer field is always the fifth key/value pair within the array. It may therefore be necessary to inspect the values within the context.custom.dimensions object to determine the position of your required field. In the above query, you could add the field r.context.custom.dimensions to your SELECT clause to facilitate any further interrogation.

Application Insights, in and of itself, arguably provides feature-rich and expansive capabilities as a standalone web analytics tool. It is certainly a lot more “open” with respect to how you can access the data – a welcome design decision that is at the heart of a lot of Microsoft products these days. When you start to combine Application Insights with Stream Analytics, a whole world of additional reporting capabilities and long-term analysis can be unlocked for your web applications. Stream Analytics is also incredibly helpful for those who have a much more grounded background working with databases, as you can quickly interact with and convert data into the required format using a familiar language. It would be good to see, as part of some of the examples available online, tutorials on how to access Custom Dimensions properties, so as to make this task simpler to achieve. But that does not detract from how impressive I think both tools are – both as standalone products and combined together.

This week’s blog post is sponsored by ActiveCrypt Software.

Encryption appears to be a topic of near-constant discussion at the moment, spearheaded primarily by the impending deadline of the General Data Protection Regulation (GDPR). This is, in essence, a new set of data protection rules that will apply to all organisations operating within the European Economic Area (EEA). A key aspect of the regulation concerns implementing appropriate technical controls over sensitive data categories, to mitigate any damage resulting from a data breach. Now, the key thing to highlight here is the “proportionality” aspect; i.e. any technical controls implemented should be reasonably expected, based on the size of the organisation in question and the nature of its data processing/controlling activity. You should, therefore, be carefully evaluating your organisation to identify whether the lack of encryption could result in damage to a data subject.

I’ve had a look previously at database encryption in the context of Dynamics 365 Customer Engagement. What is nice about the application, and nearly all of Microsoft’s Software as a Service (SaaS) products at the moment, is that GDPR is very much at the centre of each individual offering. I have been genuinely impressed to see the level of effort Microsoft has been devoting to GDPR and to ensuring their SaaS product lines are compliant with the regulation – often without charging customers an arm and a leg in the process. The same can perhaps not be said for the on-premise equivalent of a particular SaaS product. This is, to be fair, expected – Microsoft has been incredibly vocal about adopting a “cloud first” strategy in all things. But for organisations who do find themselves having to support on-premise applications or database systems, the journey towards implementing the required technical solutions for encryption could be rocky.

Case in point – SQL Server has long provided the capability to implement Transparent Data Encryption (TDE), which satisfies the requirement for at-rest encryption without the need to redevelop applications from the ground up. Setting up TDE can be an onerous process (more on this in a second), and it requires manual scripting. The following script outlines all the steps involved:

--NOTE: the passwords and file paths used below are placeholder values for demonstration purposes only

--First, a Master Key should be created on the Server instance

USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'mypassword';

--Next, a Certificate for the Server should be created.

CREATE CERTIFICATE MyCert WITH SUBJECT = 'DEK Certificate for testing purposes';

--This then allows for a Database Encryption Key to be created for encrypting a database. This needs to be created for
--EVERY database that requires encryption

USE EncryptionTest;
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE MyCert;

--Once created, Encryption can then be enabled/disabled using the snippets below

ALTER DATABASE EncryptionTest SET ENCRYPTION ON;

ALTER DATABASE EncryptionTest SET ENCRYPTION OFF;

--The Server Certificate should be backed up for disaster recovery scenarios or to enable databases to be restored to
--other SQL Server instances. First, backup the certificate with an encrypted private key...

USE master;
BACKUP CERTIFICATE MyCert TO FILE = 'C:\MyCert.cer'
    WITH PRIVATE KEY ( FILE = 'C:\MyCert.pvk',
    ENCRYPTION BY PASSWORD = 'mypassword');

--Once saved, execute the following code on the target instance to restore the certificate...

CREATE CERTIFICATE MyCert FROM FILE = 'C:\MyCert.cer'
    WITH PRIVATE KEY ( FILE = 'C:\MyCert.pvk',
    DECRYPTION BY PASSWORD = 'mypassword');
Whilst TDE is a neat solution, it does have some issues:

  • When working with TDE, it’s important to keep any potential disaster recovery scenario in mind by backing up the server certificate to a separate physical location. The above script provides the necessary snippet to accomplish this, so it is imperative that this is done for every certificate you plan to work with.
  • All required configuration steps have to be accomplished via scripting, and the feature is not enabled by default, unlike Azure SQL Database. Depending on your level of expertise when working with SQL Server, you may have to leverage assistance from other sources to get up and running with the feature.
  • Perhaps the biggest barrier to adopting TDE is the edition restriction: it is only available in the Developer and Enterprise editions of SQL Server. As the name suggests, the Developer edition is licensed strictly for non-Production environments, and the Enterprise edition carries a staggering cost, licensed based on the number of cores on the target server. To put this into better context, I was recently quoted a whopping £68,000 through Microsoft Volume Licensing! For most organisations, this can result in an incredibly high cost of ownership just to satisfy a single requirement.

Fortunately, for those who want to implement database encryption via an accessible interface, there are a number of products available on the market to assist. The best one I have come across is DbDefence from ActiveCrypt, which offers a simple-to-use and efficient means of configuring encryption for your databases. In fact, depending on your database size, you can more than likely have your databases encrypted in less than 5 minutes šŸ™‚ Let’s take a closer look at how straightforward the software is to use by encrypting a database from scratch:

  1. After downloading the installation package, you will need to run it on the server where your SQL Server instance resides. During the installation process, the Full installation option can be selected and you will also need to specify the SQL Server instance that you wish to utilise with the software:

  2. After the installation completes successfully, launch the application and then connect to your target SQL Server instance. Next, select the database that you want to encrypt. You should see a window similar to the below if done correctly:
  3. At this point, you could choose to accept the default Encryption and Protection options and proceed to the next step. However, I would recommend changing the options as follows:
    • Modify the AES Encryption Options value to 256-bit. Whilst the risk of a successful brute-force attack against either key length is effectively zero, 256-bit uses a longer key and is, therefore, more secure.
    • In most cases, you just need to ensure data is encrypted at rest, without any additional access restrictions beyond this. In these situations, I would recommend setting the required level of protection to Only Encryption. Maximum Transparency. This negates the need for any additional configuration after encryption to ensure your client applications still work successfully.

  4. To encrypt the database, a password/key is required. You should always ensure you utilise a random (non-sequential) password that contains upper/lower-case letters, numbers and symbols. I would also recommend using a separate password for each database you encrypt and ensuring that these are all stored separately (as they may be required to decrypt the databases at a later date). The length of the password to use will depend on the AES encryption mode, but if you are using 256-bit, then an 18-character password is recommended.
  5. When you are ready to start the encryption process, press the Encrypt button and confirm the warning box that appears:

Give it a few minutes and you will then be able to see in the main window that your database has been encrypted successfully:

If you ever have the requirement to decrypt the database, then you can return to the application at any time, connect up to the database, enter the password and then press Decrypt:

  6. As a final step, you can then test that your database files have been encrypted successfully by attempting to mount the encrypted database files onto a separate SQL Server instance. You should get an error message similar to the below, confirming that the encryption has worked:

Conclusions or Wot I Think

The world of encryption can be a veritable nightmare to those approaching it for the first time, and GDPR can be blamed – but also, I would argue, welcomed – for raising the profile of the topic recently. As with a lot of things concerning GDPR, there is a real opportunity for organisations to get a handle on the personal data they work with every day and to implement the required processes and systems to ensure the right thing is being done when handling sensitive data. Database encryption is one weapon in your arsenal when it comes to satisfying a number of areas within GDPR; but, as we have seen, the total cost of ownership and technical expertise required to implement such a solution could – regrettably – force many to simply look the other way when it comes to securing their databases. DbDefence assists greatly in both these regards, by significantly reducing cost and providing a simplified, easy-to-use interface to deploy database encryption within minutes. What’s great as well is that, as part of evaluating the software, I found the support team at ActiveCrypt incredibly responsive and helpful in dealing with the queries I had around the product. If you are looking for a cheaper, yet wholly effective, solution to implement database encryption for SQL Server, then I would not hesitate to recommend DbDefence.

When working with web applications and Azure App Service, it may sometimes be necessary to remove files from a website. Whether it is a deprecated feature or a bit of development “junk” that was accidentally left on your website, these files can often introduce processing overhead or even security vulnerabilities if left unattended. It is, therefore, good practice to ensure that they are removed during a deployment. Fortunately, this is where tools such as Web Deploy really come into their element. When using this from within Visual Studio via the Publish dialog box, you can specify to remove any file that does not exist within your Project via the File Publish Options section on the Settings tab:

There is also the Exclude files from the App_Data folder setting, which has a bearing on how the above operates, but we’ll come back to that shortly…

Whilst this feature is useful if you are deploying out to a dev/test environment manually from Visual Studio, it is less so if you have implemented an automated release strategy via a tool such as Visual Studio Team Services (the cloud version of Team Foundation Server). This is where all steps of a deployment are programmatically defined and then re-used whenever a new release to an environment needs to be performed. Typically, this may involve some coding as part of a PowerShell script or similar, but what makes Team Services really effective is the ability to “drag and drop” common deployment actions that cover most release scenarios. Case in point – a specific task is provided out of the box to handle deployments to Azure App Service:

What’s even better is the fact that we have the same option at our disposal à la Visual Studio – although you would be forgiven for overlooking it given how neatly the settings are tucked away šŸ™‚

Note that you have to specifically enable the option to Publish using Web Deploy for the Remove additional files at destination option to appear. It’s important, therefore, that you fully understand how Web Deploy works in comparison to other options, such as FTP deployment. I think you will find, though, that the list of benefits far outweighs any negatives. In fact, the only drawback of using this option is that you must be using Microsoft “approved” tools, such as Visual Studio, to facilitate it.

We saw earlier in this post the Exclude files from the App_Data folder setting. Typically, you may be using this folder as some form of local file store for configuration data or similar. It is also the location where any WebJobs configured for your website would be stored. Without the Exclude files from the App_Data folder setting enabled, you may assume that Web Deploy will indiscriminately delete all files residing within the App_Data directory. Luckily, the automated task instead throws an error if it detects any files within the directory that may be affected by the delete operation:

Helpful in the sense that the task is not deleting files which could be potentially important, frustrating in that the deployment will not complete successfully! As you may have already guessed, enabling the Exclude files from the App_Data folder setting in Visual Studio/on the Team Services task gets around this issue:

You can then sit back and verify as part of the Team Services Logs that any file not in your source project is deleted successfully during deployment:

Manual deployments to Production systems can be fraught with countless hidden dangers – the potential for an accidental action chief among them, but also the risk of outdated components of a system not being flagged up and removed as part of a release cycle. Automating deployments goes a long way towards taking human error out of this equation and, with the inclusion of this handy feature to remove files not within your source code during the deployment, also negates the need for any manual intervention after a deployment to rectify any potential issues. If you are still toying with introducing fully automated deployments within your environment, then I would wholeheartedly urge you to commit the effort towards achieving this outcome. Get in touch if you need any help with this, and I would be happy to lend some assistance šŸ™‚

Slight change of pace with this week’s blog post, which will be a fairly condensed and self-indulgent affair – due to personal circumstances, I have been waylaid somewhat when it comes to producing content for the blog, and I have also been unable to make any further progress with my new YouTube video series. I am hoping that normal service will resume shortly, meaning additional videos and more content-rich blog posts – so stay tuned.

I’ve been running the CRM Chap blog for just over 2 years now. Over this time, I have been humbled and proud to have received numerous visitors to the site, some of whom have been kind enough to provide feedback or to share some of their Dynamics CRM/365 predicaments with me. Having reached such a landmark, now seems as good a time as any to take a look back on the posts that have received the most attention and, potentially, to give those who missed them the opportunity to read them. In descending order, here is the list of the most viewed posts to date on the website:

  1. Utilising SQL Server Stored Procedures with Power BI
  2. Installing Dynamics CRM 2016 SP1 On-Premise
  3. Power BI Deep Dive: Using the Web API to Query Dynamics CRM/365 for Enterprise
  4. Utilising Pre/Post Entity Images in a Dynamics CRM Plugin
  5. Modifying System/Custom Views FetchXML Query in Dynamics CRM
  6. Grant Send on Behalf Permissions for Shared Mailbox (Exchange Online)
  7. Getting Started with Portal Theming (ADXStudio/CRM Portals)
  8. Microsoft Dynamics 365 Data Export Service Review
  9. What’s New in the Dynamics 365 Developer Toolkit
  10. Implementing Tracing in your CRM Plug-ins

I suppose it is a testament to the blog’s stated purpose that posts covering areas not exclusive to Dynamics CRM/365 rank so highly on the list; indeed, it reflects how deeply this application is intertwined with other technology areas within the Microsoft “stack”.

To all new and long-standing followers of the blog, thank you for your continued support and appreciation for the content šŸ™‚