Working in-depth amidst the Sales entities (e.g. Product, Price List, Quote etc.) within Dynamics CRM/Dynamics 365 Customer Engagement (CRM/D365CE) can produce some unexpected complications. What you may think is simple to achieve at the outset, based on how other entities work within the system, often leads you in a completely different direction. A good rule of thumb is that any overtly complex customisations to these entities will mean having to get down and dirty with C#, VB.Net or even JScript. For example, we’ve seen previously on the blog how, with a bit of developer expertise, it is possible to overhaul the entire pricing engine within the application to satisfy specific business requirements. There is no way in which this can be modified directly through the application interface, which can lead to CRM deployments that make imaginative and complicated use of features such as Workflows, Business Rules and other native functionality. Whilst there is nothing wrong with this approach per se, the end result is often an implementation that looks messy when viewed cold and becomes increasingly difficult to maintain in the long term. As always, there is a balance to be found, and an approach that makes prudent use of both application features and bespoke code is arguably the most desirable route towards achieving certain business requirements within CRM/D365CE.

To prove my point around Sales entity “oddities”, a good illustration can be found when it comes to working with relationship field mappings and Product records. One of the most useful features at the disposal of CRM customisers is the ability to configure automated field mappings between Entities that have a one-to-many (1:N) relationship. What this means, in simple terms, is that when you create a many (N) record from the parent entity (1), the system can automatically copy field values from the parent across to matching fields on the related record. This can help to save data entry time when qualifying a Lead to an Opportunity, as all the important field data you need to continue working on the record will be there ready on the newly created Opportunity record. Field mappings can be configured from the 1:N relationship settings window, via the Mappings button:

There are a few caveats to bear in mind – you can only map across fields that have the same underlying data type and you cannot map multiple source fields to the same target (it should be obvious why this is 🙂 ) – but on the whole, this is a handy application feature that those who are more accustomed to CRM development should always bear in mind when working with CRM/D365CE.
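
As a brief aside, these configured mappings are also what the platform applies whenever a record is initialised from its parent – for example, when a Lead is qualified into an Opportunity. If you ever need to trigger this behaviour from code, the InitializeFrom message will hand you back a target record pre-populated with the mapped values. The snippet below is a minimal sketch only; the class and namespace names are purely illustrative, an existing IOrganizationService reference and Lead record ID are assumed, and a Reference to Microsoft.Crm.Sdk.Proxy.dll is required for the message classes:

using System;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

namespace D365.BlogDemoAssets.Examples
{
    public static class FieldMappingExample
    {
        //Creates an Opportunity from an existing Lead, letting the platform apply any configured 1:N field mappings
        public static Guid CreateOpportunityFromLead(IOrganizationService service, Guid leadId)
        {
            InitializeFromRequest request = new InitializeFromRequest
            {
                EntityMoniker = new EntityReference("lead", leadId),
                TargetEntityName = "opportunity",
                TargetFieldType = TargetFieldType.All
            };

            InitializeFromResponse response = (InitializeFromResponse)service.Execute(request);

            //The returned Entity is pre-populated with the mapped field values, but not yet saved
            Entity opportunity = response.Entity;
            return service.Create(opportunity);
        }
    }
}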

Field mappings are, as indicated, a standard feature within CRM/D365CE – but when you inspect the field relationships between the Product and Quote Product entity, there is no option to configure mappings at all:

Upon closer inspection, many of the relationships between the Product entity and the other entities involved in the sales order process are missing the ability to configure field mappings. So, for example, if you have a requirement to map across the value of the Description field to a newly created Quote Product record, you would have to look at implementing a custom plugin to achieve your requirements. The main benefit of this route is that we have relatively unrestricted access to the record data we need as part of a plugin execution session and – in addition – we can piggyback onto the record creation process to add our required field “in-flight” – i.e. whilst the record is being created. The code for achieving all of this is as follows:

using System;

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

namespace D365.BlogDemoAssets.Plugins
{
    public class PreQuoteProductCreate_GetProductAttributeValues : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            //Obtain the execution context from the service provider.

            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            //Get a reference to the Organization service.

            IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = factory.CreateOrganizationService(context.UserId);

            //Extract the tracing service for use in debugging sandboxed plug-ins

            ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            tracingService.Trace("Tracing implemented successfully!");

            if (context.InputParameters.Contains("Target") && context.InputParameters["Target"] is Entity)

            {
                Entity qp = (Entity)context.InputParameters["Target"];

                //Only execute when the Quote Product references an existing Product record (i.e. not a write-in product)

                EntityReference product = qp.GetAttributeValue<EntityReference>("productid");

                if (product != null)

                {

                    Entity p = RetrieveProduct(service, product.Id);
                    string desc = p.GetAttributeValue<string>("description");
                    tracingService.Trace("Product Description = " + desc);
                    qp.Attributes["description"] = desc;

                }

                else

                {
                    tracingService.Trace("The target Quote Product record does not have an associated Product record, cancelling plugin execution.");
                    return;
                }
            }
        }

        public Entity RetrieveProduct(IOrganizationService service, Guid productID)
        {
            ColumnSet cs = new ColumnSet("description"); //Additional fields can be specified using a comma separated list

            //Retrieve matching record

            return service.Retrieve("product", productID, cs);
        }
    }
}

The key thing to remember when registering your plugin via the Plugin Registration Tool (steps which regular readers of the blog should have a good awareness of) is to ensure that the Event Pipeline Stage of Execution is set to Pre-operation. From there, the world is your oyster – you could look at returning additional fields from the Product entity to update on your Quote Product record, or you could even look at utilising the same plugin for the Order Product and Invoice Product entities (both of these entities also have a Description field, so the above code should work on them as well).
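
If you do decide to reuse the same assembly across all three entities, it may be worth adding a small guard so that the plugin only acts on the line entities you expect. The helper below is purely illustrative (the class and method names are my own invention), but the logical names – quotedetail, salesorderdetail and invoicedetail – are the standard ones for the Quote Product, Order Product and Invoice Product entities:

using Microsoft.Xrm.Sdk;

namespace D365.BlogDemoAssets.Plugins
{
    public static class SalesLineEntities
    {
        //The standard logical names of the three "product line" entities within the sales order process
        private static readonly string[] Supported = { "quotedetail", "salesorderdetail", "invoicedetail" };

        //Returns true if the Target entity passed into the plugin is one of the supported line entities
        public static bool IsSupported(Entity target)
        {
            return System.Array.IndexOf(Supported, target.LogicalName) >= 0;
        }
    }
}

A check along these lines could then be dropped in immediately after the Target entity is retrieved within the Execute method, returning early for anything else.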

It’s a real shame that Field Mappings are not available to streamline the population of record data from the Product entity, and that there is no way to utilise features such as Workflows to provide an alternative means of achieving the requirement exemplified in this post. This scenario is another good reason why you should always strive to be a Dynamics 365 Swiss Army Knife, ensuring that you have a good awareness of peripheral technology areas that can aid you greatly in mapping business requirements to CRM/D365CE.

Working with Dynamics CRM/Dynamics 365 Customer Engagement (CRM/D365CE) solution imports can often feel a lot like pursuing a new diet or exercise regime; we start out with the best of intentions of how we want things to proceed, but then something comes along to knock us off course and we end up back at square one 🙂 Anything involving a change to an IT system can generally be laborious to implement, due to the dependencies involved, and things can invariably go wrong at any stage in the process. The important thing is to always keep a cool head, take things slowly and try not to overcomplicate things from the outset, as often the simplest or most obvious explanation for an issue is where attention should be focused.

In the case of CRM/D365CE, we have the ability to access full log information relating to a solution import – regardless of whether it has failed or succeeded. This log can prove to be incredibly useful in troubleshooting solution import failures. Available as an XML download, it can be opened within Excel to produce a very readable two-tab spreadsheet containing the following information:

  • The Solution tab provides high-level information regarding the solution package, its publisher, the status of the import and any applicable error messages.
  • The Components tab lists every action that the solution attempted to execute against the target instance, providing a timestamp and any applicable error codes for each one.

The above document should always be your first port of call when a solution import fails, and it will almost certainly allow you to identify the root cause of the failure – as it did for me very recently.

An unmanaged solution import failed with the top-level error message Fields that are not valid were specified for the entity. Upon closer investigation within the import log, I was able to identify the affected component – a custom attribute on the Quote entity – and the specific error message generated – Attribute…has SourceType 0, but 1 was specified:

The error was being generated because a field with the same logical name, but a different definition, was already present within the target environment – something which, for clearly understandable reasons, is not allowed. In this particular scenario, we were doing some tidy-up of an existing solution and replacing a calculated field with a new field, with a different data type, using the same attribute name. The correct step would have been to delete the “old” field in the target environment before the solution import, but this step was accidentally omitted from the release notes. After deleting the field and re-attempting the solution import, it completed successfully.
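
As an aside, had we wanted to script that clean-up step rather than delete the field manually, the SDK’s DeleteAttributeRequest message could be run against the target instance ahead of the import. The sketch below is illustrative only – the field name shown is hypothetical and would need replacing with the actual publisher-prefixed attribute:

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

namespace D365.BlogDemoAssets.Examples
{
    public static class PreImportCleanup
    {
        //Deletes a custom attribute from the target instance ahead of a solution import.
        //The entity and field logical names below are illustrative only.
        public static void DeleteOldQuoteField(IOrganizationService service)
        {
            DeleteAttributeRequest request = new DeleteAttributeRequest
            {
                EntityLogicalName = "quote",
                LogicalName = "new_examplecalculatedfield"
            };

            service.Execute(request);
        }
    }
}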

The likelihood of this error ever occurring in the first place should be remote, assuming that you are customising your system the right way (i.e. using Solution Publisher prefixes for all custom attributes/entities). On this occasion, an appropriate note as part of the release documentation for the solution would have prevented the issue from occurring in the first place. So, as long as you have implemented a sufficiently robust change management procedure, one that includes full instructions for the steps required both before and after a solution import, you can avoid a similar situation when it comes to replacing entity attributes within your CRM/D365CE solution.

In last week’s post, we took a look at how a custom Workflow activity can be implemented within Dynamics CRM/Dynamics 365 for Customer Engagement to obtain the name of the user who triggered the workflow. It may be useful to retrieve this information for a variety of reasons, such as debugging, logging user activity or automating the population of key record information. I mentioned in that post the “treasure trove” of information that the IWorkflowContext interface exposes to developers. Custom Workflow activities are not unique in exposing execution-specific information; an equivalent interface is at our disposal when working with plug-ins. No prizes for guessing its name – the IPluginExecutionContext.

When comparing both interfaces, some comfort can be found in that they share almost identical properties, thereby allowing us to replicate the functionality demonstrated in last week’s post as a Post-operation Create step for the Lead entity. The order of work for this is virtually the same:

  1. Develop a plug-in C# class file that retrieves the User ID of the account that has triggered the plugin.
  2. Add supplementary logic to the above class file to retrieve the Display Name of the User.
  3. Deploy the compiled .dll file into the application via the Plug-in Registration Tool, adding on the appropriate execution step.

The emphasis of this approach, as will be demonstrated, is much more towards working outside of the application; something you may not necessarily be comfortable with. Nevertheless, I hope that the remaining sections will provide enough detail to enable you to replicate this within your own environment.

Developing the Class File

As before, you’ll need to have ready access to a Visual Studio C# class library project and the Dynamics 365 SDK. You’ll also need to ensure that your project has a Reference added to Microsoft.Xrm.Sdk.dll. Create a new Class file and copy and paste the following code into the window:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

namespace D365.BlogDemoAssets.Plugins
{
    public class PostLeadCreate_GetInitiatingUserExample : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            // Obtain the execution context from the service provider.

            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            // Obtain the organization service reference.
            IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);

            // The InputParameters collection contains all the data passed in the message request.
            if (context.InputParameters.Contains("Target") &&
                context.InputParameters["Target"] is Entity)

            {
                Entity lead = (Entity)context.InputParameters["Target"];

                //Use the context to obtain the GUID of the user who triggered the plugin - the context exposes only the user's ID, not their Display Name.
      
                Guid user = context.InitiatingUserId;

                //Then, use the custom GetUserDisplayName method (added further down in this post) to retrieve the fullname attribute value for that user record.

                string displayName = GetUserDisplayName(user, service);

                //Build out the note record with the required field values: Title, Regarding and Description field

                Entity note = new Entity("annotation");
                note["subject"] = "Test Note";
                note["objectid"] = new EntityReference("lead", lead.Id);
                note["notetext"] = @"This is a test note populated with the name of the user who triggered the Post Create plugin on the Lead entity:" + Environment.NewLine + Environment.NewLine + "Executing User: " + displayName;

                //Finally, create the record using the IOrganizationService reference

                service.Create(note);
            }
        }
    }
}

Note also that you will need to rename the namespace value to match the name of your project.

To explain, the code replicates the same functionality developed as part of the Workflow in last week’s post – namely, create a Note related to a newly created Lead record and populate it with the Display Name of the User who triggered the plugin.

Retrieving the User’s Display Name

After copying the above code snippet into your project, you may notice a squiggly red line on the following method call:

GetUserDisplayName is a custom method that needs to be added manually, since the Display Name of the user is not returned as part of the IPluginExecutionContext. We therefore need to query the User (systemuser) entity to return the Full Name (fullname) field, which we can then use to populate our newly created Note record. The method is provided below and should be placed after the closing brace of the Execute method, but before the final two closing braces of the class and namespace:

        private string GetUserDisplayName(Guid userID, IOrganizationService service)
        {
            Entity user = service.Retrieve("systemuser", userID, new ColumnSet("fullname"));
            return user.GetAttributeValue<string>("fullname");
        }

Deploy to the application using the Plug-in Registration Tool

The steps involved in this do not differ greatly from what was demonstrated in last week’s post, so I won’t repeat myself 🙂 The only thing you need to make sure you do after you have registered the plug-in is to configure the plug-in Step. Without this, your plug-in will not execute. Right-click your newly deployed plug-in on the main window of the Registration Tool and select Register New Step:

On the form that appears, populate the fields/values indicated below:

  • Message: Create
  • Primary Entity: Lead
  • Run in User’s Context: Calling User
  • Event Pipeline Stage of Execution: Post-Operation

The window should look similar to the below if populated correctly. If so, then you can click Register New Step to update the application:

All that remains is to perform a quick test within the application by creating a new Lead record. After saving, we can then verify that the plug-in has created the Note record as intended:

Having compared both solutions to achieve the same purpose, is there a recommended approach to take?

The examples shown in the past two blog posts illustrate nicely how solutions to specific scenarios within the application can be achieved in different ways. As clearly evidenced, one could argue that there is a code-heavy (plug-in) and a light-touch coding (custom Workflow assembly) option available, depending on how comfortable you are with working with the SDK. Plug-ins are a natural choice if you are confident working solely within Visual Studio or have a requirement to perform additional business logic as part of your requirements. This could range from complex record retrieval operations within the application to an external integration piece involving specific and highly tailored code. The Workflow path clearly favours those of us who prefer to work within the application in a supported manner and, in this particular example, can make certain tasks easier to accomplish. As we have seen, the act of retrieving the Display Name of a user is greatly simplified when we go down the Workflow route. Custom Workflow assemblies also offer greater portability and reusability, meaning that you can tailor logic that can be applied to multiple different scenarios in the future. Code reusability is one of the key drivers in many organisations these days, and the use of custom Workflow assemblies neatly fits into this ethos.

These are perhaps a few of the considerations you should weigh up when choosing the option that fits the needs of your particular requirement, but it may be that the approach you feel most comfortable with ultimately wins the day – so long as this does not compromise the organisation as a consequence, that is an acceptable stance to take. Hopefully, this short series of posts has demonstrated the versatility of the application and the ability to approach challenges via equally acceptable pathways for resolution.

It’s sometimes useful to determine the name of the user account that executes a Workflow within Dynamics CRM/Dynamics 365 for Customer Engagement (CRM/D365CE). What can make this a somewhat fiendish task to accomplish is the default behaviour within the application, which exposes very little contextual information each time a Workflow is triggered. Take, for example, the following simplistic Workflow which creates an associated Note record whenever a new Lead record is created:

The Note record is set to be populated with the default values available to us regarding the Workflow execution session – Activity Count, Activity Count including Process and Execution Time:

We can verify that this Workflow works – and view the exact values of these details – by creating a new Lead record and refreshing the record page:

The Execution Time field is somewhat useful, but the Activity Count and Activity Count including Process values relate to Workflow execution sessions and are arguably only useful for diagnostic review – not something that end users of the application will generally be interested in 🙂

Going back to the opening sentence of this post, if we wanted to develop this example further to include the name of the user who executed the Workflow in the note, we would have to look at deploying a Custom Workflow Assembly to extract this information. The IWorkflowContext interface is a veritable treasure trove of information that can be exposed to developers to retrieve not just the name of the user who triggers a Workflow, but also the time when the corresponding system job was created, the Business Unit it is being executed within and information to determine whether the Workflow was triggered by a parent Workflow. There are three steps involved in deploying custom code into the application for use in this manner:

  1. Develop a CodeActivity C# class file that performs the desired functionality.
  2. Deploy the compiled .dll file into the application via the Plugin Registration Tool.
  3. Modify the existing Workflow to include a step that accesses the custom Workflow Activity.

All of these steps will require ready access to Visual Studio, a C# class library plugin project (either new or existing) and the CRM SDK that corresponds to your version of the application.

Developing the Class File

To begin with, make sure your project includes References to the following Frameworks:

  • System.Activities
  • Microsoft.Xrm.Sdk
  • Microsoft.Xrm.Sdk.Workflow

Add a new Class (.cs) file to your project and copy & paste the below code, overwriting any existing code in the window. Be sure to update the namespace value to reflect your project name:

using System.Activities;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Workflow;

namespace D365.Demo.Plugins
{
    public class GetWorkflowInitiatingUser : CodeActivity
    {
        protected override void Execute(CodeActivityContext executionContext)
        {
            IWorkflowContext workflowContext = executionContext.GetExtension<IWorkflowContext>();
            CurrentUser.Set(executionContext, new EntityReference("systemuser", workflowContext.InitiatingUserId));
        }

        [Output("Current User")]
        [ReferenceTarget("systemuser")]
        public OutArgument<EntityReference> CurrentUser { get; set; }
    }
}
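
As an optional extra, InitiatingUserId is only one of the IWorkflowContext properties discussed earlier; if you wanted to surface some of the others for diagnostic purposes, the Execute method could be extended along the following lines (a sketch only, using the tracing service purely to illustrate the property names):

        protected override void Execute(CodeActivityContext executionContext)
        {
            IWorkflowContext workflowContext = executionContext.GetExtension<IWorkflowContext>();
            ITracingService tracingService = executionContext.GetExtension<ITracingService>();

            //A few of the other context properties mentioned earlier, written out to the trace log
            tracingService.Trace("System Job created on: " + workflowContext.OperationCreatedOn.ToString());
            tracingService.Trace("Business Unit ID: " + workflowContext.BusinessUnitId.ToString());
            tracingService.Trace("Triggered by a parent Workflow: " + (workflowContext.ParentContext != null).ToString());

            CurrentUser.Set(executionContext, new EntityReference("systemuser", workflowContext.InitiatingUserId));
        }

Nothing else about the class needs to change – the Output argument works exactly as before.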

Right-click your project and select Build. Verify that no errors are generated and, if so, then that’s the first step done and dusted 🙂

Deploy to CRM/D365CE

Open up the Plugin Registration Tool and connect to your desired instance. If you are deploying an existing, updated plugin class, then right-click it on the list of Registered Plugins & Custom Workflow Activities and click Update; otherwise, select Register -> Register New Assembly. The same window opens in any event. Load the newly built assembly from your project (can be located in the \bin\Debug\ folder by default) and ensure the Workflow Activity entry is ticked before selecting Register Selected Plugins:

After registration, the Workflow Activity becomes available for use within the application; so time to return to the Workflow we created earlier!

Adding the Custom Workflow Activity to a Process

By deactivating the Workflow Default Process Values Example Workflow and selecting Add Step, we can verify that the Custom Workflow Assembly is available for use:

Select the above, making sure first of all that the option Insert Before Step is toggled (to ensure it appears before the already configured Create Note for Lead step). It should look similar to the below if done correctly:

Now, when we go and edit the Create Note for Lead step, we will see a new option under Local Values which, when selected, brings up a whole range of fields that correspond to fields from the User entity. Modify the text within the Note to retrieve the Full Name value and save it onto the Note record, as indicated below:

After saving and reactivating the Workflow, we can verify it is working by again creating a new Lead record and refreshing to review the Note text:

All working as expected!

The example shown in this post is deliberately simple and of limited use in a practical business scenario as it stands, but the same approach could prove useful in a number of circumstances:

  • If your Workflow contains branching logic, then you can test whether a Workflow has been executed by a specific user and then perform bespoke logic based on this value.
  • Records can be assigned to other users/teams, based on who has triggered the Workflow.
  • User activity could be recorded in a separate entity for benchmarking/monitoring purposes.

It’s useful to know that the same kind of functionality can also be deployed when working with plugins in the application. We will take a look at how this works as part of next week’s blog post.

When it comes to technology learning, it can often feel as if you are fighting against a constant wave of change, as studying is outpaced by the introduction of new technical innovations. Fighting the tide is often the most desirable outcome to work towards, but it is understandable why individuals choose to specialise in a particular technology area. There is no doubt some comfort in becoming a subject matter expert and in not having to worry about “keeping up with the Joneses”. However, when working with an application such as Dynamics 365 for Customer Engagement (D365CE), I would argue it is almost impossible to ignore the wider context of what sits alongside the application, particularly Azure, Microsoft’s cloud services platform. Being able to understand how the application can be extended via external integrations is typically high on the list of any project requirements, and often these integrations require a light-touch Azure involvement, at a minimum. Therefore, the ability to say that you are confident in accomplishing certain key tasks within Azure instantly puts you ahead of others and in a position to support your business/clients more straightforwardly.

Here are 4 good reasons why you should start to familiarise yourself with Azure, if you haven’t done so already, or dedicate some additional time towards increasing your knowledge in an appropriate area:

Dynamics 365 for Customer Engagement is an Azure application

Well…we can perhaps not answer this definitively and say that 100% of D365CE is hosted on Azure (I did hear a rumour that some aspects of the infrastructure were hosted on AWS). Certainly, for instances that are provisioned within the UK, there is ample evidence to suggest this to be the case. What can be said with some degree of certainty is that D365CE is an Azure-leveraged application, because it uses key aspects of the service to deliver various pieces of functionality within the application:

  • Azure Active Directory: Arguably the crux of D365CE is the security/identity aspect, all of which is powered using Microsoft’s cloud version of Active Directory.
  • Azure Key Vault: Encryption is enabled by default on all D365CE databases, and the management of encryption keys is provided via Azure Key Vault.
  • Office 365: Similar to D365CE, Office 365 is – technically – an Azure cloud service provided by Microsoft. As both Office 365 and D365CE often need to be tightly knitted together, via features such as Server-Side Synchronisation, Office 365 Groups and SharePoint document management, it can be considered a de facto part of the base application.

It’s fairly evident, therefore, that D365CE can be considered as a Software as a Service (SaaS) application hosted on Azure. But why is all this important? For the simple reason that, as a D365CE professional supporting the full breadth of the application and all it entails, you are already an Azure professional by default. Not having even a cursory understanding of Azure and what it can offer will immediately put you at a disadvantage compared with others who do, and increasingly places you in a position where your D365CE expertise is severely blunted.

It proves to prospective employers that you are not just a one-trick pony

When it comes to interviews for roles focused around D365CE, I’ve been on both sides of the table. What I’ve found separates a good D365CE CV from an excellent one boils down to how effectively the candidate has been able to expand their knowledge into other areas. How much additional knowledge of other applications, programming languages etc. does the candidate bring to the business? How effectively has the candidate moved out of their comfort zone in the past in exploring new technologies, either in their current role or outside of work? More importantly, how much initiative and passion has the candidate shown in embracing change? A candidate who is able to answer these questions positively and can demonstrate, for example, extensive knowledge of Azure will instantly move up in my estimation of their ability. On the flip side, I believe that the interviews which have resulted in a job offer for me have been helped, in no small part, by the additional technical skills that I can make available to a prospective employer.

To get certain things done involving D365CE, Azure knowledge is a mandatory requirement

I’ve talked about one of these tasks before on the blog, namely, how to set up the Azure Data Export solution to automatically synchronise your application data to an Azure SQL Database. Unless you are in the fortunate position of having an Azure-savvy colleague who can assist you, the only way you are going to be able to successfully complete this task is to know how to deploy an Azure SQL Server instance, a database for this instance and the process for setting up an Azure Key Vault. Having at least some familiarity with how to deploy simple resources in Azure and accomplish tasks via PowerShell script execution will place you in an excellent position to achieve the requirements of this task, and others besides.

The above is just a flavour of some of the things you can do with D365CE and Azure together, and there are doubtless many more I have missed 🙂 The key point I would highlight is that you should not just naively assume that D365CE is containerised away from Azure; in fact, often the clearest and cleanest way of achieving more complex business/technical requirements will require a detailed consideration of what can be built out within Azure.

There’s really no good reason not to, thanks to the wealth of resources available online for Azure training

A sea change currently seems to be occurring at Microsoft with respect to online documentation/training resources. Previously, TechNet and MSDN would be your go-to resources to find out how something Microsoft-related works; now, the Microsoft Docs website is where you can find the vast majority of technical documentation. I really rate the new experience that Microsoft Docs provides, and there now seems to be a concerted effort to ensure that these articles are clear, easy to follow and include end-to-end steps on how to complete certain tasks. This is certainly the case for Azure and, with this in mind, I defy anyone to find a reasonable enough excuse not to begin reading through these articles. They are the quickest way towards expanding your knowledge within an area of Azure that interests you the most or to help prepare you to, for example, set up a new Azure SQL database from scratch.

For those who learn better via visual tools, Microsoft has also greatly expanded the number of online video courses available for Azure, which can be accessed for free. There are some excellent “deep-dive” topic areas as well, which can be used to help prepare you for Azure certification.

Conclusions or Wot I Think

I use the term “D365CE professional” a number of times throughout this post. This is perhaps an unhelpful label to ascribe to anyone working with D365CE today. A far better title is, I would argue, “Microsoft cloud professional”, as this gets to the heart of what I think anyone who considers themselves a D365CE “expert” should be. Building and supporting solutions within D365CE is by no means an isolated experience, as you might have argued a few years back. Rather, the onus is on ensuring that consultants, developers etc. are as multi-faceted as possible from a skillset perspective. I talked previously on the blog about becoming a Swiss Army Knife in D365CE. Whilst this is still a noble and recommended goal, I believe casting the net wider can offer a number of benefits, not just for yourself, but for the businesses and clients you work with every day. It puts you centre-forward in being able to offer the latest opportunities to implement solutions that can increase efficiency, reduce costs and deliver positive end-user experiences. And, perhaps most importantly, it means you can confidently and accurately attest to your wide-ranging expertise in any given situation.