The Voice of the Customer (VoC) add-on solution for Dynamics 365 Customer Engagement (D365CE) presents a really nice way of incorporating survey capabilities within your existing Dynamics application estate, without any additional cost or significant administrative overhead. I’ve talked about the tool previously, within the context of specific application errors, and I can attest to its capabilities – both as a standalone solution and as one that can be leveraged alongside other D365CE functionality to generate additional value.

One feature that is particularly useful is the ability to include diverse Survey Response controls. These cover the range of anticipated user inputs that most web developers will be used to – text inputs, ratings, date pickers etc. – along with more marketing-specific choices such as Net Promoter Score and even a Smilies rating control. The final one of these really does have to be seen to be wholly appreciated:

I hope you agree that this is definitely one of those features that becomes so fun that it soaks up WAY more time than necessary 🙂

One of the final options VoC provides is the ability to upload files to a Survey Response; these are stored within the application and retrievable at any time by locating the appropriate Survey Response record. You can customise the guidance text presented to the user for this control, as in the example below:

Uploaded files are then saved onto an Azure Blob Storage location (which you don’t have direct access to), with the access URL stored within D365CE – the sketch after the list below shows how this URL might be read back via the SDK. This feature provides the capability to accommodate several potential business scenarios, such as:

  • Allowing a service desk to create an automated survey that allows error logs or screenshots to be uploaded for further diagnosis.
  • The gathering of useful photographic information as part of a pre-qualification process for a product installation.
  • Enabling customers to upload a photo that provides additional context relating to their experience – either positive or negative.
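On the retrieval side, the stored URL is just an attribute on the Survey Response record, so it can be read back with a standard SDK Retrieve. A minimal sketch follows – note that the attribute name used here (msdyn_uploadfileurl) is a hypothetical placeholder, as is the record GUID, and the msdyn_surveyresponse entity name is an assumption; verify both against your own organisation’s metadata:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// Assumes an authenticated IOrganizationService named "service".
// "msdyn_uploadfileurl" is a hypothetical attribute name - check the actual
// schema name on the Survey Response (msdyn_surveyresponse) entity.
Guid surveyResponseId = new Guid("00000000-0000-0000-0000-000000000000"); // placeholder
Entity surveyResponse = service.Retrieve("msdyn_surveyresponse", surveyResponseId,
    new ColumnSet("msdyn_uploadfileurl"));

string fileUrl = surveyResponse.GetAttributeValue<string>("msdyn_uploadfileurl");
Console.WriteLine("Uploaded file is stored at: " + fileUrl);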

Putting all of this aside, however, there are a few things that you should bear in mind when first evaluating this feature for your particular requirements. What follows is my list of the major things to be aware of, along with some tips to sidestep any issues.

Privacy concerns…

To better understand why this is relevant, it helps to be aware of exactly how files are stored on Azure. Azure Blob Storage works on the principle of “blobs” (i.e. files), which can only be created within a corresponding Storage Container. These can be configured using a couple of different options, depending on how you would like to access your data, which is elaborated upon in this really helpful article:

You can configure a container with the following permissions:

  • No public read access: The container and its blobs can be accessed only by the storage account owner. This is the default for all new containers.

  • Public read access for blobs only: Blobs within the container can be read by anonymous request, but container data is not available. Anonymous clients cannot enumerate the blobs within the container.

  • Full public read access: All container and blob data can be read by anonymous request. Clients can enumerate blobs within the container by anonymous request, but cannot enumerate containers within the storage account.
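For context, this is how those access levels are set when you control the Storage Account yourself, using the classic WindowsAzure.Storage .NET client. You cannot do this against VoC’s managed storage, since you never receive the account keys, but it is a useful sketch for understanding what “Full public read access” means in practice (the connection string is a placeholder):

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Placeholder connection string - normally pulled from configuration.
CloudStorageAccount account = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=...");
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("survey-files");

// Container = full public read access; Blob = public read for blobs only;
// Off = no public read access (the default for new containers).
container.SetPermissions(new BlobContainerPermissions
{
    PublicAccess = BlobContainerPublicAccessType.Container
});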

Presumably to mitigate the need for complex deployments of the VoC solution, all uploaded Survey Response files are saved in Full public read access storage containers, meaning that anyone with the URL can access these files. And, as mentioned already, administrators have no direct access to the Azure Storage Account to modify these permissions, potentially compounding this access problem. Now, before you panic too much, the VoC solution deliberately structures the uploaded file URL in the following format:

https://<VoC Region Identifier>.blob.core.windows.net/<Survey Response GUID>-files/<File GUID>-<Example File>

The degree of complexity added here goes a long way towards satisfying any privacy concerns – it would be practically impossible for a human being or computer to guess what a particular uploaded file path is, even if they did have the Survey Response record GUID – but this still does not address the fact that the URL can be freely accessed and shared by anyone with sufficient permissions over the Survey Response entity in D365CE. You should, therefore, take appropriate care when scoping your security privileges within D365CE and look towards carrying out a Privacy Impact Assessment (PIA) over the type of data you are collecting via the upload file control.
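If you want to verify the anonymous accessibility of an uploaded file for yourself (both before and, as discussed next, after deleting its Survey Response record), a simple unauthenticated GET against the stored URL is enough – a status code of 200 means anyone with the link can read the file. A quick sketch, with the URL as a placeholder:

using System;
using System.Net.Http;

// Placeholder URL - substitute the blob URL stored against your Survey Response record.
string fileUrl = "https://example.blob.core.windows.net/container-files/example.png";

using (HttpClient client = new HttpClient())
{
    // No credentials or SAS token supplied - this is a fully anonymous request.
    HttpResponseMessage response = client.GetAsync(fileUrl).GetAwaiter().GetResult();
    Console.WriteLine(response.IsSuccessStatusCode
        ? "File is anonymously accessible."
        : "File is not accessible: " + (int)response.StatusCode);
}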

…even after you delete a Survey Response.

As mentioned above, the Blob Storage URL is tagged to the Survey Response record within D365CE. So what happens when you delete this record? The answer, courtesy of Microsoft via a support request:

Deleting Survey Response should delete the file uploaded as part of the Survey Response

Based on my testing, however, this does not look to be the case. My understanding of the VoC solution is that it needs to regularly synchronise with components located on Azure, which can lead to a delay in certain actions completing (publishing a Survey, creating a Survey Response record etc.). However, a file from a Survey Response test record that I deleted still remained accessible via its URL up to 8 hours after completing this action. This, evidently, raises a concern over what level of control you have over potentially critical and sensitive data types that may be included in uploaded files. I would urge you to carry out your own analysis as part of a PIA to sufficiently gauge what impact, if any, this may have on your data collection (and, more critically, storage) activities.

Restrictions

For the most part, file upload controls are not a heavily constrained feature, but it is worthwhile to keep the following restrictions in mind:

  • Executable file types are not permitted for upload (.exe, .ps1, .bat etc.)
  • Larger files may not upload successfully, generating 404 server errors within the control. There is no documented size limitation, but my testing would indicate that files as big as 60MB will not upload correctly.
  • Only one file upload control is allowed per survey.

The last of these limitations is perhaps the most significant constraint. If you do have a requirement for separate files to be uploaded, then the best option is to provide instructions on the survey, advising users to compress their files into a single .zip archive before upload.

Conclusions or Wot I Think

Despite what this post may seem to be leaning towards, I very much believe the VoC solution and, in particular, the ability to upload Survey Response files, is in perfect working condition. Going a step further, when viewed from a technical standpoint, I would even say that its method of execution is wholly justified. With the advent of the General Data Protection Regulation (GDPR) earlier this year, current attention is all around ensuring that appropriate access controls over data have been properly implemented, so that the privacy of individuals is fully upheld. Here is where the solution begins to fall over to a degree, and evidence of the journey that VoC has made in becoming part of the Dynamics 365 “family” becomes most apparent. As can be expected, any product which is derived from an external acquisition will always present challenges when being “smushed” together with a new application system. I have been informed that there is an update coming to the VoC solution in August this year, with a range of new features that may address some of the data privacy concerns highlighted earlier. For example, administrators will be given the option to delete any uploaded file within a Survey Response on demand. Changes like this will surely go a long way towards providing the appropriate opportunities for VoC to be fully utilised by businesses looking to integrate an effective, GDPR-proof customer survey tool.

The Voice of the Customer (VoC) solution, available as part of Dynamics 365 Customer Engagement (D365CE), works most effectively when you are tightly integrating your surveys with other features or datasets stored within the application. That’s not to say that it must only ever be utilised in tandem as opposed to in isolation. If you have the requirement to quickly send out a survey to a small list of individuals (regardless of whether they are D365CE Contact records), VoC presents a natural choice if you are already paying for D365CE Online, as it is included as part of your subscription cost. Whereas services such as SurveyMonkey tend to charge you to develop more complex, bespoke branded surveys, VoC, by comparison, offers all of this at no additional cost. Just don’t buy licenses to use only this specific piece of functionality. 🙂 Ultimately, VoC represents a solid solution in and of itself, but working with it in this manner is just the icing on the cake. When you start to take into account the myriad of different integration touchpoints that VoC can instantly support, thanks to its resident status within D365CE, this is where things start to get really exciting. With this in mind, you can look to implement solutions that:

  • Send out surveys automatically via e-mail.
  • Route survey responses to specific users, based on answers to certain questions or the Net Promoter Score (NPS) of the response.
  • Tailor a specific email to a customer that is sent out after a survey is completed.
  • Include survey data as part of an Azure Data Export Profile and leverage the full power of T-SQL querying to get the most out of your survey data (a brief sketch of this follows the list below).
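To illustrate the last of these, once a Data Export Profile is synchronising your survey entities to an Azure SQL Database, you can query them like any other table. A minimal sketch, assuming the profile has exported the Survey Response entity to a table named dbo.msdyn_surveyresponse – verify the actual table name against your own export database, and note the connection string is a placeholder:

using System;
using System.Data.SqlClient;

// Placeholder connection string for the Data Export Profile's Azure SQL Database.
string connectionString = "Server=tcp:myexportserver.database.windows.net;Database=MyExportDB;User ID=user;Password=<password>;Encrypt=True;";

using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();

    // "dbo.msdyn_surveyresponse" is an assumed table name - Data Export Profiles
    // create one table per exported entity, named after its logical name.
    string sql = "SELECT COUNT(*) FROM dbo.msdyn_surveyresponse";

    using (SqlCommand command = new SqlCommand(sql, connection))
    {
        int responseCount = (int)command.ExecuteScalar();
        Console.WriteLine(responseCount + " survey responses exported so far.");
    }
}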

It is in the first of these scenarios – sending out surveys via a Workflow – that you may find yourself encountering the error referenced in the title of this post. The error can occur when you look to take advantage of the survey snippet feature as part of an Email Template – in layman’s terms, the ability to send out a survey automatically and tag the response back to the record that it is triggered from. To implement this, you would look towards creating a Workflow that looks similar to the below:

You would then either type out the desired email message or specify a template that contains the survey snippet code, available from a published survey’s form:

All of this should work fine and dandy up until the user in question attempts to trigger the workflow, at which point we see the appropriate error message returned when viewing the workflow execution in the System Jobs area of the application:

Those who are familiar with errors like these in D365CE will instantly realise that this is a security role permission issue. In the above example, the user in question had not been granted the Survey User role, which is included as part of the solution and gives you the basic “set menu” of permissions required to work with VoC. A short while after rectifying this, we tried executing the workflow again – and the error still occurred, much to our frustration. Our next step was to start combing through the list of custom entity privileges on the Security Role tab to see if there was a permission called Azure Deployment or similar. Much to our delight, we came across the following permission, which looked like a good candidate for further scrutiny:

When viewing the Survey User role, the Read permission for this organization-owned custom entity was not set. After rectifying this and attempting to run the workflow again, we saw that it executed successfully 🙂

It seems a little odd that the standard user security role for the VoC module is missing this privilege for an arguably key piece of functionality. In our situation, we simply updated the Survey User security role to include this permission and cascaded this change accordingly across the various dev/test/prod environments for this deployment. You may also choose to add this privilege to a custom security role instead, thereby ensuring that it is properly transported during any solution updates. Regardless, with this issue resolved, a really nice piece of VoC functionality can be utilised to streamline the process of distributing surveys out to D365CE customer records.
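If you would rather script the role change than click through the Security Roles UI in each environment, the SDK’s AddPrivilegesRoleRequest message can apply the missing Read privilege. A rough sketch, assuming an authenticated IOrganizationService – note that the privilege name used (prvReadmsdyn_azuredeployment) is an assumption on my part, so verify it against the privilege list in your own organisation:

using System.Linq;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// Assumes an authenticated IOrganizationService named "service".
// Locate the Survey User role (note: a role record exists per business unit).
QueryExpression roleQuery = new QueryExpression("role") { ColumnSet = new ColumnSet("roleid") };
roleQuery.Criteria.AddCondition("name", ConditionOperator.Equal, "Survey User");
Entity surveyUserRole = service.RetrieveMultiple(roleQuery).Entities.First();

// Locate the Read privilege for the Azure Deployment entity.
// "prvReadmsdyn_azuredeployment" is an assumed privilege name - verify in your org.
QueryExpression privQuery = new QueryExpression("privilege") { ColumnSet = new ColumnSet("privilegeid") };
privQuery.Criteria.AddCondition("name", ConditionOperator.Equal, "prvReadmsdyn_azuredeployment");
Entity readPrivilege = service.RetrieveMultiple(privQuery).Entities.First();

// Grant the privilege at organisation depth, matching the organization-owned entity.
service.Execute(new AddPrivilegesRoleRequest
{
    RoleId = surveyUserRole.Id,
    Privileges = new[]
    {
        new RolePrivilege { PrivilegeId = readPrivilege.Id, Depth = PrivilegeDepth.Global }
    }
});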

After going through a few separate development cycles involving Dynamics 365 Customer Engagement (D365CE), you begin to get a good grasp of the type of tasks that need to be followed through each time. Most of these are what you may expect Рsuch as importing an unmanaged/managed solution into a production environment Рbut others can differ depending on the type of deployment. What ultimately emerges as part of this is the understanding that there are certain configuration settings and records that are not included as part of a Solution file and which must be migrated across to different environments in an alternate manner.

The application has many record types that fit under this category, such as Product or Product Price List. When it comes to migrating these record types into a Production environment, those who are only familiar with working inside the application may choose to utilise the Advanced Find facility in the following manner:

  • Generate a query to return all of the records that require migration, ensuring all required fields are returned.
  • Export out the records into an Excel Spreadsheet.
  • Import the above spreadsheet into your target environment via the Data Import wizard.

And there would be nothing wrong with doing things this way, particularly if your skillset sits more within a functional, as opposed to technical, standpoint. Where you may come unstuck with this approach is if you have a requirement to migrate Subject record types across environments. Whilst a sensible (albeit time-consuming) approach to this requirement could be to simply create them from scratch in your target environment, you may fall foul of this method if you are utilising Workflows or Business Rules that reference Subject values. When this occurs, the application looks for the underlying Globally Unique Identifier (GUID) of the Subject record, as opposed to the Display Name. If a record with this exact GUID value does not exist within your target environment, then your processes will error and fail to activate. Taking this into account, should you then choose to follow the sequence of tasks above involving Advanced Find, your immediate stumbling block will become apparent, as highlighted below:

As you can see, there is no option to select the Subject entity for querying, frustrating any attempts to get these records exported out of the application. Fortunately, there is a way to overcome this via the Configuration Migration tool. This has traditionally been bundled together as part of the application’s Software Development Kit (SDK). The latest version of the SDK for version 8.2 of the application can be downloaded from Microsoft directly, but newer versions – to your delight or chagrin – are only available via NuGet. For those who are unfamiliar with using this, you can download version 9.0.2.3 of the Configuration Migration tool alone using the link below:

Microsoft.CrmSdk.XrmTooling.ConfigurationMigration.Wpf.9.0.2.3

With everything downloaded and ready to go, the steps involved in migrating Subject records between different D365CE environments are as follows:

  1. The first step before any export can take place is to define a Schema – basically, a description of the record types and fields you wish to export. Once defined, schemas can be re-used for future export/import jobs, so it is definitely worth spending some time defining all of the record types that will require migration between environments. Select Create schema on the CRM Configuration Migration screen and press Continue.

  2. Log in to D365CE using the credentials and details for your specific environment.

  3. After logging in and reading your environment metadata, you then have the option of selecting the Solution and Entities to export. A useful aspect of all of this is that you have the ability to define which entity fields you want to utilise within the schema, and you can accommodate multiple Entities within the profile. For this example, we only want to export out the Subject entity, so select the Default Solution, the entity in question and hit the Add Entity > button. Your window should resemble the below if done correctly:

  4. With the schema fully defined, you can now save the configuration onto your local PC. After successfully exporting the profile, you will be asked whether you wish to export the data from the instance you are connected to. Hit Yes to proceed.

  5. At this point, all you need to do is define the Save to data file location, which is where a .zip file containing all exported record data will be saved. Once decided, press the Export Data button. This can take some time depending on the number of records being processed. The window should update to resemble the below once the export has successfully completed. Select the Exit button when you are finished to return to the home screen.

  6. You have two options at this stage – either exit the application entirely or, if you have your target import environment ready, select the Import data and Continue buttons, signing in as required.

  7. All that remains is to select the .zip file created in step 5), press the Import Data button, sit back and confirm that all record data imports successfully.

It’s worth noting that this import process works similarly to how the in-application Import Wizard operates with regard to record conflicts; namely, if a record with the same GUID value exists in the target instance, then the above import will overwrite the record data accordingly. This can be helpful, as it means that changes to records such as the Subject entity can be completed safely within a development context and promoted accordingly to new environments.
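For those more comfortable in code, it is worth knowing that the SDK honours an explicitly supplied GUID on Create, which is exactly what makes this overwrite/preserve behaviour possible. A small sketch, assuming an authenticated IOrganizationService connected to the target environment and a Subject GUID/title read from the source environment (both values below are placeholders):

using System;
using Microsoft.Xrm.Sdk;

// Assumes an authenticated IOrganizationService named "service" against the
// target environment. The GUID below is a placeholder - in practice, it would
// be the subjectid value read from the source environment.
Guid sourceSubjectId = new Guid("00000000-0000-0000-0000-000000000000");

Entity subject = new Entity("subject")
{
    // Supplying the source GUID keeps Workflow/Business Rule references intact.
    Id = sourceSubjectId
};
subject["title"] = "Warranty Claim"; // example title copied from the source record

service.Create(subject);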

The Configuration Migration tool is incredibly handy to have available but is perhaps not one that gets shouted about from the rooftops that often. Its usefulness extends not just to the Subject entity, but also to the other entity types discussed at the start of this post. Granted, if you do not find yourself working much with Processes that reference these so-called “configuration” records, then introducing the above step as part of any release management process could prove to be an unnecessary administrative burden. Regardless, there is at least some merit in factoring the above tool into an initial release of a D365CE solution, to ensure that all development-side configuration is quickly and easily moved across to your production environment.

This is an accompanying blog post to my YouTube video Dynamics 365 Customer Engagement Deep Dive: Creating a Basic Custom Workflow Assembly. The video is part of my tutorial series on how to accomplish developer focused tasks within Dynamics 365 Customer Engagement. You can watch the video in full below:

Below you will find links to access some of the resources discussed as part of the video and to further reading topics:

PowerPoint Presentation (click here to download)

Full Code Sample

using System;
using System.Activities;

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Workflow;
using Microsoft.Xrm.Sdk.Query;

namespace D365.SampleCWA
{
    public class CWA_CopyQuote : CodeActivity
    {
        protected override void Execute(CodeActivityContext context)
        {
            // Obtain the workflow context, organization service and tracing service
            IWorkflowContext workflowContext = context.GetExtension<IWorkflowContext>();

            IOrganizationServiceFactory serviceFactory = context.GetExtension<IOrganizationServiceFactory>();
            IOrganizationService service = serviceFactory.CreateOrganizationService(workflowContext.UserId);

            ITracingService tracing = context.GetExtension<ITracingService>();

            tracing.Trace("Tracing implemented successfully!");

            // The Quote to copy is the record the workflow executes against
            Guid quoteID = workflowContext.PrimaryEntityId;

            Entity quote = service.Retrieve("quote", quoteID, new ColumnSet("freightamount", "discountamount", "discountpercentage", "name", "pricelevelid", "customerid", "description"));

            // Clear the unique identifier so that Create generates a brand new record
            quote.Id = Guid.Empty;
            quote.Attributes.Remove("quoteid");

            quote.Attributes["name"] = "Copy of " + quote.GetAttributeValue<string>("name");
            Guid newQuoteID = service.Create(quote);

            // Copy all related Quote Product and Note records across to the new Quote
            EntityCollection quoteProducts = RetrieveRelatedQuoteProducts(service, quoteID);
            EntityCollection notes = RetrieveRelatedNotes(service, quoteID);

            tracing.Trace(quoteProducts.TotalRecordCount.ToString() + " Quote Product records returned.");

            foreach (Entity product in quoteProducts.Entities)
            {
                product.Id = Guid.Empty;
                product.Attributes.Remove("quotedetailid");
                product.Attributes["quoteid"] = new EntityReference("quote", newQuoteID);
                service.Create(product);
            }
            foreach (Entity note in notes.Entities)
            {
                note.Id = Guid.Empty;
                note.Attributes.Remove("annotationid");
                note.Attributes["objectid"] = new EntityReference("quote", newQuoteID);
                service.Create(note);
            }
        }

        // Input argument exposed on the workflow step designer (not used by the
        // logic above, which operates on the primary entity instead)
        [Input("Quote Record to Copy")]
        [ReferenceTarget("quote")]
        public InArgument<EntityReference> QuoteReference { get; set; }

        private static EntityCollection RetrieveRelatedQuoteProducts(IOrganizationService service, Guid quoteID)
        {
            QueryExpression query = new QueryExpression("quotedetail");
            query.ColumnSet.AllColumns = true;
            query.Criteria.AddCondition("quoteid", ConditionOperator.Equal, quoteID);
            query.PageInfo.ReturnTotalRecordCount = true;

            return service.RetrieveMultiple(query);
        }

        private static EntityCollection RetrieveRelatedNotes(IOrganizationService service, Guid objectID)
        {
            QueryExpression query = new QueryExpression("annotation");
            query.ColumnSet.AllColumns = true;
            query.Criteria.AddCondition("objectid", ConditionOperator.Equal, objectID);
            query.PageInfo.ReturnTotalRecordCount = true;

            return service.RetrieveMultiple(query);
        }
    }
}

Download/Resource Links

Visual Studio 2017 Community Edition

Set up a free 30-day trial of Dynamics 365 Customer Engagement

C# Guide (Microsoft Docs)

Source Code Management Solutions

Further Reading

Microsoft Docs – Create a custom workflow activity

MSDN – Register and use a custom workflow activity assembly

MSDN – Update a custom workflow activity using assembly versioning (This topic wasn’t covered as part of the video, but I would recommend reading this article if you are developing an ISV solution involving custom workflow assemblies)

MSDN – Sample: Create a custom workflow activity

You can also check out some of my previous blog posts relating to Workflows:

  • Implementing Tracing in your CRM Plug-ins – We saw as part of the video how to utilise tracing, but this post goes into more detail about the subject, as well as providing instructions on how to enable the feature within the application (in case you are wondering why nothing is being written to the trace log 🙂 ). All code examples are for Plug-ins, but they can easily be repurposed to work with a custom workflow assembly instead.
  • Obtaining the User who executed a Workflow in Dynamics 365 for Customer Engagement (C# Workflow Activity) – You may have a requirement to trigger certain actions within the application, based on the user who executed a Workflow. This post walks through how to achieve this utilising a custom workflow assembly.

If you have found the above video useful and are itching to learn more about Dynamics 365 Customer Engagement development, then be sure to take a look at my previous videos/blog posts using the links below:

Have a question or an issue when working through the code samples? Be sure to leave a comment below or contact me directly, and I will do my best to help. Thanks for reading and watching!

As part of developing Dynamics CRM/Dynamics 365 Customer Engagement (CRM/D365CE) plug-ins day in, day out, you can often forget about the Execution Mode setting. This can be evidenced by the fact that I make no mention of it in my recent tutorial video on plug-in development. In a nutshell, this setting enables you to choose whether your plug-in executes in Synchronous or Asynchronous mode. Now, you may be asking – just what the hell does that mean?!? The best way of understanding it is by rephrasing the terminology: it basically tells the system when you want your code to be executed. Synchronous plug-ins execute all of your business logic whilst the record is being saved by the user, with this action not being considered complete, and committed to the backend database, until the plug-in completes. By comparison, Asynchronous plug-ins are queued for execution after the record has been saved; a System Job record is created and queued alongside other jobs in the system via the Asynchronous Service. Another way of remembering the difference between each one is to think back to the options available to you as part of a Workflow. They can either be executed in real time (synchronously) or in the background (asynchronously). Plug-ins are no different and give you the flexibility to ensure your business logic is applied immediately or, if especially complex, queued so that the system has sufficient time to process it in the background.
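The Execution Mode is normally toggled within the Plugin Registration Tool, but a plug-in step is itself just a record (sdkmessageprocessingstep), so the mode can also be flipped via the SDK. A sketch of the idea, assuming an authenticated IOrganizationService and a known step GUID (the GUID shown is a placeholder); to my knowledge the mode option set uses 0 for Synchronous and 1 for Asynchronous, but verify this against your organisation before relying on it:

using System;
using Microsoft.Xrm.Sdk;

// Assumes an authenticated IOrganizationService named "service".
// Placeholder GUID - substitute the ID of the plug-in step to modify.
Guid stepId = new Guid("00000000-0000-0000-0000-000000000000");

Entity step = new Entity("sdkmessageprocessingstep", stepId);

// Execution Mode option set: 0 = Synchronous, 1 = Asynchronous.
step["mode"] = new OptionSetValue(1);

service.Update(step);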

I came across a strange issue with an arguably even stranger Synchronous plug-in the other day, which started failing after taking an inordinately long time saving the record:

Unexpected exception from plug-in (Execute): MyPlugin.MyPluginClass: System.AggregateException: One or more errors occurred.

The “strange” plug-in was designed so that, on the Create action of an Entity record, it goes out and creates various related records within the application, based on a set of conditions. We originally had issues with the plug-in erroring a few months back, due to the execution time exceeding the 2-minute limit for sandbox plug-ins. A rather ingenious and much more accomplished developer colleague got around the issue by implementing a degree of asynchronous processing within the plug-in, achieved like so:

// Fragment from within the plug-in's Execute logic. Note that the containing
// method must be declared async for await to compile, and System.Diagnostics
// is required for Stopwatch. The lock serialises access to the
// IOrganizationService, which is not guaranteed to be thread-safe, across tasks.
await Task.Factory.StartNew(() =>
{
    lock (service)
    {
        Stopwatch stopwatch = Stopwatch.StartNew();
        Guid record = service.Create(newRecord);
        tracing.Trace("Record with ID " + record.ToString() + " created successfully after: {0}ms.", stopwatch.ElapsedMilliseconds);
    }
});

I still don’t fully understand exactly what this is doing, but I put that down to my novice-level C# knowledge 🙂 The important thing was that the code worked… until some additional processing was added to the plug-in, leading to the error message above.

At this juncture, our only choice was to look at forcing the plug-in to execute in Asynchronous mode by modifying the appropriate setting on the plug-in step within the Plugin Registration Tool:

After making this change and attempting to create the record again in the application, everything worked as expected. However, this did create a new problem for us to overcome – end users of the application were previously used to seeing the related records created by the plug-in within sub-grids on the Primary Entity form, which would then be accessed and worked through accordingly. As the very act of creating these records now took place in the background and took some time to complete, we needed to display an informative message to the user, advising them to refresh the form after a few minutes. You do have the ability within plug-ins to display a custom message back to the user, but only in situations where you are throwing an error message, and it didn’t seem to be a particularly nice solution for this scenario.
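For completeness, the plug-in mechanism alluded to here is throwing an InvalidPluginExecutionException, which surfaces its message in an error dialog but also cancels the triggering operation – precisely why it was unsuitable when we simply wanted to inform, rather than block, the user:

// Displays the message to the user in an error dialog, but also aborts and
// rolls back the save - informative, yet destructive to the operation itself.
throw new InvalidPluginExecutionException("Records will be populated in the background. Refresh the form in a few minutes to see them.");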

In the end, the best way of achieving this requirement was to implement a JScript function on the form. This triggers whenever the form is saved and displays a message box that the user has to click OK on before the save action is carried out:

//Guard flag so the confirmed save is not intercepted a second time
var isSaveConfirmed = false;

function displaySaveMessage(context) {

    var eventArgs = context.getEventArgs();
    var saveMode = eventArgs.getSaveMode();

    //1 = Save, 2 = Save & Close, 59 = Save & New, 70 = Autosave
    if ((saveMode == 70 || saveMode == 2 || saveMode == 1 || saveMode == 59) && !isSaveConfirmed) {
        //Block this save until the user has acknowledged the message,
        //then re-trigger it from the alert callback
        eventArgs.preventDefault();
        var message = "Records will be populated in the background and you will need to refresh the form after a few minutes to see them on the Sub-Grid. Press OK to save the record.";
        Xrm.Utility.alertDialog(message, function () {
            isSaveConfirmed = true;
            Xrm.Page.data.save().then(function () {
                isSaveConfirmed = false;
                Xrm.Page.data.refresh();
            });
        });
    }
}

By feeding through the execution context parameter, you are able to determine the type of save action that the alert will trigger on; in this case, Save, Save & Close, Save & New and Autosave. Just make sure you configure your script with the correct properties on the form, which are:

  • Using the OnSave event handler
  • With the Pass execution context as first parameter setting enabled

From the end-users perspective, they will see something similar to the below when the record is saved:

It’s a pity that we don’t have a similar kind of functionality exposed via Business Rules, enabling us to display OnSave alerts that are more in keeping with the application’s look and feel. Nevertheless, the versatility of utilising JScript functions should be evident here, and they can often achieve these types of bespoke actions with a few lines of code.

When it comes to plug-in development, understanding the impact and processing time that your code has within the application is important for two reasons – first, in ensuring that end users are not frustrated by long loading times and, secondly, in informing the choice of Execution Mode when it comes to deploying out a plug-in. Whilst Asynchronous plug-ins can help to mitigate any user woes and present a natural choice when working with bulk operations within the application, make sure you fully understand the impact that these have on the Asynchronous Service and avoid a scenario where the System Job entity is queued with more jobs than it can handle.