With 2018 now very firmly upon us, it’s time again to see what’s new in the world of Dynamics 365 certification. Nothing much has changed this time around, but there are a few noteworthy updates to be aware of if you are keen to keep your certifications as up to date as possible. Here’s my round-up of what’s new – let me know in the comments if you think I have missed anything out.

MCSE Business Applications 2018

The introduction of a dedicated Microsoft Certified Solutions Associate (MCSA) and Microsoft Certified Solutions Expert (MCSE) certification track for Dynamics 365 was a positive step in highlighting the importance of Dynamics 365 alongside other core Microsoft products. Under the new system, those wishing to maintain a “good standing” MCSE will need to re-certify each year with a brand new exam to keep their certification current for the year ahead. Those who obtained their MCSE last year will now notice on their certification planner the opportunity to attain the 2018 version of the competency by passing a single exam. For Dynamics 365 Customer Engagement focused professionals, assuming you only passed either the Sales or Customer Service exam last year, passing the other exam should be all that is required to recertify – unless you fancy your chances trying some of the new exams described below.

New MCSE Exams

Regardless of which boat you find yourself in relating to the Business Applications MCSE, those looking to obtain the 2018 variant of the certification can expect to see two additional exams available that will count towards the necessary award requirements:

  • MB2-877: Microsoft Dynamics 365 for Field Service
  • MB6-897: Microsoft Dynamics 365 for Retail

The exams above currently only appear on the US Microsoft Learning website but expect them to appear globally within the next few weeks/months.

MB2-877 represents an interesting landmark in Microsoft’s journey towards integrating FieldOne into Dynamics CRM/Dynamics 365 Customer Engagement, and it arguably indicates the ultimate fruition of this journey. To pass the exam, you are going to need a good grasp of the various entities involved in the Field Service app, as well as a thorough understanding of the mobile application itself. As is typically the case when it comes to Dynamics 365 certification, the Dynamics Learning Portal (DLP) is going to be your destination for preparatory learning resources for the exam, along with a good play around with the application itself within a testing environment. If you have access to the DLP, it is highly recommended you complete the following courses at your own pace before attempting the exam:

Dynamics 365 for Retail is a fairly new addition to the “family” and one which – admittedly – I have very little experience with. Its rather speedy promotion to exam-level status is, therefore, somewhat surprising. This is emphasised further by the fact that there are no dedicated exams for other, similar applications, such as Portals. Similar to MB2-877, preparation for MB6-897 will need to be directed primarily through the DLP, with the following recommended courses for preparation:

Exam Preparation Tips

I’ve written several blog posts in the past discussing Dynamics CRM/Dynamics 365 exams, offering help to those who may be looking to achieve a passing grade. Rather than repeat myself (or risk breaking any non-disclosure agreements), I’d invite you to cast your eyes over these and (I hope!) find that they prove useful in some way as part of your preparation:

If you have an exam scheduled, then good luck and make sure you study! 🙂

Did you know that you can write plug-ins for Dynamics 365 Customer Engagement/Dynamics CRM (D365CE/CRM) using Visual Basic .NET (VB.NET)? You wouldn’t have thought so after a thorough look through the D365CE/CRM Software Development Kit (SDK). Whilst there is a plethora of code examples available for C# plug-ins, no examples are provided on how to write a basic plug-in for the application using VB.NET. This is perhaps to be expected, given the status that the language has when compared with C#. Whilst VB.NET knowledge is a great asset to have when extending Office applications via Visual Basic for Applications, you would struggle to find many at-scale application systems written using VB.NET. C# is pretty much the de facto language that you need to utilise when developing in .NET, and the commonly held view is that exclusive VB.NET experience is a detriment as opposed to an asset. With this in mind, it is somewhat understandable why the SDK does not have any in-depth VB.NET code examples.

Accepting the above, it is likely that many long-standing developers will have knowledge of the BASIC language, thereby making VB.NET a natural choice when attempting to extend D365CE/CRM. I do not have extensive experience using the language, but I was curious to see how difficult it would be to implement a plug-in with it – and to hopefully provide assistance to any lonely travellers out there who want to put their VB.NET expertise to the test. The best way to demonstrate this is to take an existing plug-in developed in C# and reverse engineer the code into VB.NET. We took a look previously on the blog at a fully implemented plug-in that can be used to retrieve the name of the user who created a Lead record. The entire class file for this is reproduced below:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

namespace D365.BlogDemoAssets.Plugins
{
    public class PostLeadCreate_GetInitiatingUserExample : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            // Obtain the execution context from the service provider.
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            // Obtain the organization service reference.
            IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);

            // The InputParameters collection contains all the data passed in the message request.
            if (context.InputParameters.Contains("Target") &&
                context.InputParameters["Target"] is Entity)
            {
                Entity lead = (Entity)context.InputParameters["Target"];

                //Use the Context to obtain the Guid of the user who triggered the plugin - this is the only piece of information exposed.
                Guid user = context.InitiatingUserId;

                //Then, use the GetUserDisplayName method to retrieve the fullname attribute value for the record.
                string displayName = GetUserDisplayName(user, service);

                //Build out the note record with the required field values: Title, Regarding and Description field
                Entity note = new Entity("annotation");
                note["subject"] = "Test Note";
                note["objectid"] = new EntityReference("lead", lead.Id);
                note["notetext"] = @"This is a test note populated with the name of the user who triggered the Post Create plugin on the Lead entity:" + Environment.NewLine + Environment.NewLine + "Executing User: " + displayName;

                //Finally, create the record using the IOrganizationService reference
                service.Create(note);
            }
        }

        private string GetUserDisplayName(Guid userID, IOrganizationService service)
        {
            Entity user = service.Retrieve("systemuser", userID, new ColumnSet("fullname"));
            return user.GetAttributeValue<string>("fullname");
        }
    }
}

The code above encapsulates a number of common operations that a plug-in can seek to accomplish – creating a record, obtaining context-specific values and performing a Retrieve operation against an entity – thereby making it a good example for what will follow in this post.

With everything ready to go, it’s time for less talking and more coding 🙂 We’ll build out a VB.NET version of the above class file, covering some of the “gotchas” to expect at each step, before bringing all of the code together in a finished state.

Importing References

As you would expect coming from C#, a class file requires references to the D365CE SDK DLL files. These should be added to your project and then imported into your class file using the Imports statement:
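
Imports Microsoft.Xrm.Sdk
Imports Microsoft.Xrm.Sdk.Query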

With these two lines of code, there are immediately two things which you may need to untrain yourself from doing if you have come from a C# background:

  1. Make sure not to add semi-colons at the end of each line, as they are not required in VB.NET
  2. You may be tempted to use the Return key to auto-complete your syntax, which works fine in a C# project…but will instead skip you down to the next line in VB.NET. Instead, use the Tab key to autocomplete any IntelliSense prompts.

Adding a Namespace

By default, a VB.NET class project does not implement a Namespace for your class. This will need to be added next, underneath the project references like so:
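
Namespace D365.BlogDemoAssets.VB

    'Class code goes here...

End Namespace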

Implementing the IPlugin Interface

So far so good…and things continue in the same vein when implementing the IPlugin interface. This is configured like so:
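
Public Class PostLeadCreate_GetInitiatingUserExample
    Implements IPlugin

End Class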

The only thing to remember here, if you are still in C# mode, is that your colon is replaced with the Implements statement and that this part of the code needs to be moved to the next line.

Putting together the Execute Method

The Execute method is the heart and soul of any plug-in, as this contains the code that will execute when the plug-in is triggered. In C#, this is implemented using a void method (i.e. a block of code that does not return a specific value, object etc.). Its equivalent within VB.NET is a Sub – short for “Subroutine” – which additionally needs to be peppered with an Implements statement pointing to IPlugin.Execute:
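
Private Sub IPlugin_Execute(serviceProvider As IServiceProvider) Implements IPlugin.Execute

    'Plug-in logic goes here...

End Sub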

Implementing Variables

Here’s where things start to get different. Variables within C# are generally implemented using the following syntax:

<Type> <Variable Name> = <Variable Value>;

So to declare a string object called text, the following code should be used:

string text = "This is my string text";

Variables in VB.NET, by contrast, are always declared using the Dim keyword, with the name and then the type of the variable declared afterwards. Finally, a value can (optionally) be provided. A complete example of this can be seen below in the implementation of the IPluginExecutionContext interface:
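
Dim context As IPluginExecutionContext = serviceProvider.GetService(GetType(IPluginExecutionContext))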

In the above example, we also see two additional differences that C# developers have to reconcile themselves with:

  • There is no need to explicitly cast the value as an IPluginExecutionContext object – a definite improvement over C# 🙂
  • Rather than using the typeof operator when obtaining the service, the VB.NET equivalent GetType should be used instead.

The process of creating variables can seem somewhat laborious when compared with C#, and there are a few other things to bear in mind with variables and this plug-in specifically. These will be covered shortly.

The If…Then Statement

Pretty much every programming language has an implementation of an If…Else construct to perform decisions based on conditions (answers in the comments if you have found a language that doesn’t!). VB.NET is no different, and we can see how this is implemented in the next bit of the plug-in code:
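
If (context.InputParameters.Contains("Target") And TypeOf context.InputParameters("Target") Is Entity) Then

    'Logic for the plug-in goes here...

End If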

Compared with C#, you have to specifically remember to add a Then statement after your conditional test and to include an End If at the end of your code block. It’s also important to highlight the use of different operators – in this case, And should be used as opposed to &&.

Assigning Values to a CRM Entity Object

The assignment of entity attribute values differs only slightly compared with C# – you just need to ensure that you surround your attribute logical name with parentheses as opposed to square brackets:
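
Dim note As Entity = New Entity("annotation")
note("subject") = "Test Note"
note("objectid") = New EntityReference("lead", lead.Id)
note("notetext") = "This is a test note populated with the name of the user who triggered the Post Create plugin on the Lead entity:" & vbCrLf & vbCrLf & "Executing User: " & displayName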

String concatenation also works slightly differently. Be sure to use & as opposed to + to achieve the same purpose. For new line breaks, there is also an equivalent VB.NET constant that can be used: vbCrLf.

Obtaining the User’s Display Name Value

The final part of the class file is the retrieval of the Full Name value of the user. This has to be done via a Function as opposed to a Sub, as a specific value needs to be returned. Keep in mind the following as well:

  • Parameters that are fed to the Function are prefaced with the ByVal modifier – this is the default behaviour in VB.NET, but another somewhat tedious thing to remember!
  • Note that for the GetAttributeValue method, we specify the attribute data type using the syntax (Of String) as opposed to <string>

Other than that, syntax-wise, C#-experienced developers should have little trouble re-coding this method into VB.NET. This is evidenced by the fact that the below code snippet is approximately 75% similar to how it needs to be in C#:
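
Private Function GetUserDisplayName(ByVal userID As Guid, ByVal service As IOrganizationService) As String

    Dim user As Entity = service.Retrieve("systemuser", userID, New ColumnSet("fullname"))
    Return user.GetAttributeValue(Of String)("fullname")

End Function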

Bringing it all together

Having gone through the class file from top to bottom, the entire code for the plug-in is reproduced below:

Imports Microsoft.Xrm.Sdk
Imports Microsoft.Xrm.Sdk.Query

Namespace D365.BlogDemoAssets.VB

    Public Class PostLeadCreate_GetInitiatingUserExample
        Implements IPlugin

        Private Sub IPlugin_Execute(serviceProvider As IServiceProvider) Implements IPlugin.Execute

            'Obtain the execution context from the service provider.

            Dim context As IPluginExecutionContext = serviceProvider.GetService(GetType(IPluginExecutionContext))

            'Obtain the organization service reference.
            Dim serviceFactory As IOrganizationServiceFactory = serviceProvider.GetService(GetType(IOrganizationServiceFactory))
            Dim service As IOrganizationService = serviceFactory.CreateOrganizationService(context.UserId)

            'The InputParameters collection contains all the data passed in the message request.
            If (context.InputParameters.Contains("Target") And TypeOf context.InputParameters("Target") Is Entity) Then

                Dim lead As Entity = context.InputParameters("Target")

                'Use the Context to obtain the Guid of the user who triggered the plugin - this is the only piece of information exposed.

                Dim user As Guid = context.InitiatingUserId

                'Then, use the GetUserDisplayName method to retrieve the fullname attribute value for the record.

                Dim displayName As String = GetUserDisplayName(user, service)

                'Build out the note record with the required field values: Title, Regarding and Description field

                Dim note As Entity = New Entity("annotation")
                note("subject") = "Test Note"
                note("objectid") = New EntityReference("lead", lead.Id)
                note("notetext") = "This is a test note populated with the name of the user who triggered the Post Create plugin on the Lead entity:" & vbCrLf & vbCrLf & "Executing User: " & displayName

                'Finally, create the record using the IOrganizationService reference

                service.Create(note)

            End If

        End Sub
        Private Function GetUserDisplayName(ByVal userID As Guid, ByVal service As IOrganizationService) As String

            Dim user As Entity = service.Retrieve("systemuser", userID, New ColumnSet("fullname"))
            Return user.GetAttributeValue(Of String)("fullname")

        End Function

    End Class

End Namespace

Conclusions or Wot I Think

C# is arguably the de facto choice when programming in D365CE/CRM and, more generally, within .NET. All of the code examples, both within the SDK and online, favour C#, and I do very much hold the view that development in C# should always be preferred over VB.NET. Is there ever, then, a good business case for developing in VB.NET over C#? Clearly, if you have a developer available within your business who can do amazing things in VB.NET, it makes sense for this to be the language of choice for time-saving purposes. There may also be a case for developing in VB.NET from a proprietary standpoint. VB.NET is, as acknowledged, not as widely disseminated as C#. By developing custom plug-ins that contain sensitive business logic in VB.NET, you are arguably safeguarding the business by utilising a language that C# developers may have difficulty deciphering at first glance.

All that being said, C# should be preferred when developing plug-ins, custom workflow assemblies or custom applications involving D365CE/CRM. Where some work could be done is in ensuring that all supported programming languages are adequately provisioned for within the D365CE SDK moving forward. Because, let’s be honest – there is no point in supporting something if people have no idea how to use it in the first place.

Working in-depth amidst the Sales entities (e.g. Product, Price List, Quote etc.) within Dynamics CRM/Dynamics 365 Customer Engagement (CRM/D365CE) can produce some unexpected complications. What you may think is simple to achieve at the outset, based on how other entities work within the system, often leads you in a completely different direction. A good rule of thumb is that any overtly complex customisations to these entities will mean having to get down and dirty with C#, VB.NET or even JScript. For example, we’ve seen previously on the blog how, with a bit of developer expertise, it is possible to overhaul the entire pricing engine within the application to satisfy specific business requirements. There is no way in which this can be modified directly through the application interface, which can lead to CRM deployments that make imaginative and complicated use of features such as Workflows, Business Rules and other native functionality. Whilst there is nothing wrong with this approach per se, the end result is often an implementation that looks messy when viewed cold and becomes increasingly difficult to maintain in the long term. As always, there is a balance to be found, and any approach which makes prudent use of both application features and bespoke code is arguably the most desirable end goal for achieving certain business requirements within CRM/D365CE.

To prove my point around Sales entity “oddities”, a good illustration can be found when it comes to working with relationship field mappings and Product records. Among the most useful features at the disposal of CRM customisers is the ability to configure automated field mapping between entities that have a one-to-many (1:N) relationship between them. What this means, in simple terms, is that when you create a many (N) record from the parent entity (1), you can automatically copy the field values to a matching field on the related entity. This can help to save data entry time when qualifying a Lead to an Opportunity, as all the important field data you need to continue working on the record will be there ready on the newly created Opportunity record. Field mappings can be configured from the 1:N relationship settings window, via the Mappings button:

There are a few caveats to bear in mind – you can only map across fields that have the same underlying data type and you cannot map multiple source fields to the same target (it should be obvious why this is 🙂 ) – but on the whole, this is a handy application feature that those who are more accustomed to CRM development should always bear in mind when working with CRM/D365CE.

Field mappings are, as indicated, a standard feature within CRM/D365CE – but when you inspect the field relationships between the Product and Quote Product entity, there is no option to configure mappings at all:

Upon closer inspection, many of the relationships between the Product entity and the other entities involved in the sales order process are missing the ability to configure field mappings. So, for example, if you have a requirement to map across the value of the Description field to a newly created Quote Product record, you would have to look at implementing a custom plug-in to achieve your requirements. The main benefit of this route is that we have relatively unrestricted access to the record data we need as part of a plug-in execution session and – in addition – we can piggyback onto the record creation process to add our required field “in-flight” – i.e. whilst the record is being created. The code for achieving all of this is as follows:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

namespace D365.BlogDemoAssets.Plugins
{
    public class PreQuoteProductCreate_GetProductAttributeValues : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            //Obtain the execution context from the service provider.
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            //Get a reference to the Organization service.
            IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = factory.CreateOrganizationService(context.UserId);

            //Extract the tracing service for use in debugging sandboxed plug-ins
            ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
            tracingService.Trace("Tracing implemented successfully!");

            if (context.InputParameters.Contains("Target") && context.InputParameters["Target"] is Entity)
            {
                Entity qp = (Entity)context.InputParameters["Target"];

                //Only execute for non-write-in Quote Product records
                EntityReference product = qp.GetAttributeValue<EntityReference>("productid");

                if (product != null)
                {
                    Entity p = RetrieveProductID(service, product.Id);
                    string desc = p.GetAttributeValue<string>("description");
                    tracingService.Trace("Product Description = " + desc);
                    qp.Attributes["description"] = desc;
                }
                else
                {
                    tracingService.Trace("Quote Product with record ID " + qp.GetAttributeValue<Guid>("quotedetailid").ToString() + " does not have an associated Product record, cancelling plugin execution.");
                }
            }
        }

        public Entity RetrieveProductID(IOrganizationService service, Guid productID)
        {
            ColumnSet cs = new ColumnSet("description"); //Additional fields can be specified using a comma separated list

            //Retrieve matching record
            return service.Retrieve("product", productID, cs);
        }
    }
}

The key thing to remember when registering your plug-in via the Plugin Registration Tool (steps which regular readers of the blog should have a good awareness of) is to ensure that the Event Pipeline Stage of Execution is set to Pre-operation. From there, the world is your oyster – you could look at returning additional fields from the Product entity to update on your Quote Product record, or you could even look at utilising the same plug-in for the Order Product and Invoice Product entities (both of these entities also have a Description field, so the above code should work on them as well).

It’s a real shame that field mappings are not available to streamline the population of record data from the Product entity, and that there is no way to utilise features such as Workflows as an alternative means of achieving the requirement exemplified in this post. This scenario is another good reason why you should always strive to be a Dynamics 365 Swiss Army Knife, ensuring that you have a good awareness of peripheral technology areas that can aid you greatly in mapping business requirements to CRM/D365CE.

Working with Dynamics CRM/Dynamics 365 Customer Engagement (CRM/D365CE) solution imports can often feel a lot like pursuing a new diet or exercise regime; we start out with the best of intentions of how we want things to proceed, but then something comes along to knock the wheels off the wagon and we end up back at square one 🙂 Anything involving a change to an IT system can generally be laborious to implement, due to the dependencies involved, and things can invariably go wrong at any stage in the process. The important thing is to always keep a cool head, take things slowly and try not to overcomplicate things from the outset, as often the simplest or most obvious explanation for an issue is where all due attention should be focused.

In the case of CRM/D365CE, we have the ability to access full log information relating to a solution import – regardless of whether it has failed or succeeded. This log can prove incredibly useful in troubleshooting solution import failures. Available as an XML download, it can be opened within Excel to produce a very readable two-tab spreadsheet containing the following information:

  • The Solution tab provides high-level information regarding the solution package, its publisher, the status of the import and any applicable error messages.
  • The Components tab lists every single action that the solution attempted to execute against the target instance, providing a timestamp and any applicable error codes for each one.

The above document should always be your first port of call when a solution import fails, and it will almost certainly allow you to identify the root cause of the failure – as it did for me very recently.

An unmanaged solution import failed with the top-level error message Fields that are not valid were specified for the entity. Upon closer investigation within the import log, I was able to identify the affected component – a custom attribute on the Quote entity – and the specific error message generated – Attribute…has SourceType 0, but 1 was specified:

The reason the error was being generated is that a field with the same logical name – but a different SourceType – was already present within the target environment, something which, for clearly understandable reasons, is not allowed. In this particular scenario, we were doing some tidy-up of an existing solution and replacing a calculated field with a new field, with a different data type, using the same attribute name. The correct step that should have been taken before the solution import was to delete the “old” field in the target environment, but this was accidentally not included in the release notes. After completing this step and re-attempting the solution import, it completed successfully.

The likelihood of this error ever occurring should be remote, assuming that you are customising your system the right way (i.e. using solution publisher prefixes for all custom attributes/entities). On this occasion, an appropriate note as part of the release documentation for the solution would have prevented the issue from occurring in the first place. So, as long as you have implemented a sufficiently robust change management procedure – one that includes full instructions for the steps required both before and after a solution import – you can avoid a similar situation when it comes to replacing entity attributes within your CRM/D365CE solution.

Microsoft Flow is a tool that I increasingly have to bring front and centre when considering how to straightforwardly accommodate certain business requirements. The problem I have had with it, at times, is that there are often some notable caveats when attempting to achieve something that looks relatively simple at the outset. A good example of this is the SQL Server connector which, based on a headline reading, enables you to trigger workflows when rows are added or changed within a database. Having the ability to trigger an email, create a document on OneDrive or even post a Tweet based on a modified database record instantly has a high degree of applicability for any number of different scenarios. When you read the fine print behind this, however, there are a few things which you have to bear in mind:


The triggers do have the following limitations:

  • It does not work for on-premises SQL Server
  • Table must have an IDENTITY column for the new row trigger
  • Table must have a ROWVERSION (a.k.a. TIMESTAMP) column for the modified row trigger

A slightly frustrating side to this is that Microsoft Flow doesn’t intuitively tell you when your table is incompatible with these requirements – contrary to what is stated in the above post. Whilst readers of this post may be correct in chanting “RTFM!”, it would still be nice to be informed of any potential incompatibilities within Flow itself. Certainly, this can help in preventing any needless head-banging early on 🙂

Getting around these restrictions is fairly straightforward if you have the ability to modify the table you are wanting to interact with using Flow. For example, executing the following script against the MyTable table will get it fully prepped for the service:

ALTER TABLE [dbo].[MyTable]
ADD [ID] INT IDENTITY(1,1) NOT NULL, --IDENTITY column, required for the new row trigger
	[RowVersion] ROWVERSION --ROWVERSION column, required for the modified row trigger

Accepting this fact, there may be certain situations when this is not the best option to implement:

  • The database/tables you are interacting with form part of a proprietary application, therefore making it impractical and potentially dangerous to modify table objects.
  • The table in question could contain sensitive information. Keep in mind the fact that the Microsoft Flow service would require service account access with full SELECT privileges against your target table. This could expose a risk to your environment, should the credentials or the service itself be compromised in future.
  • If your target table already contains an inordinately large number of columns and/or rows, then the introduction of additional columns and processing via an IDENTITY/ROWVERSION seed could start to tip your application over the edge.
  • Your target database does not use an integer field and IDENTITY seed to uniquely identify rows, meaning that such a column needs to be (arguably unnecessarily) added.

An alternative approach to consider would be to configure a “gateway” table for Microsoft Flow to access – one which contains only the fields that Flow needs to process, is linked back to the source table via a foreign key relationship and which involves the use of a database trigger to automate the creation of the “gateway” record. Note that this approach only works if you have a unique row identifier in your source table in the first place; if your table is recording important, row-specific information and this is not in place, then you should probably re-evaluate your table design 😉

Let’s see how the above example would work in practice, using the following example table:

CREATE TABLE [dbo].[SourceTable]
(
	[SourceTableUID] UNIQUEIDENTIFIER NOT NULL PRIMARY KEY DEFAULT NEWID(), --Unique row identifier, referenced by the "gateway" table
	[SourceTableCol1] VARCHAR(50) NULL,
	[SourceTableCol2] VARCHAR(150) NULL,
	[SourceTableCol3] DATETIME NULL
)

In this scenario, the table object is using the UNIQUEIDENTIFIER column type to ensure that each row can be…well…uniquely identified!

The next step would be to create our “gateway” table. Based on the table script above, this would be built out via the following script:

CREATE TABLE [dbo].[SourceTableLog]
(
	[SourceTableLogID] INT IDENTITY(1,1) NOT NULL PRIMARY KEY, --IDENTITY column, required for the Flow new row trigger
	[SourceTableUID] UNIQUEIDENTIFIER NOT NULL,
	CONSTRAINT FK_SourceTable_SourceTableLog FOREIGN KEY ([SourceTableUID])
		REFERENCES [dbo].[SourceTable] ([SourceTableUID])
		ON DELETE CASCADE
)

The use of a FOREIGN KEY here will help to ensure that the “gateway” table stays tidy in the event that any related record is deleted from the source table. This is handled automatically, thanks to the ON DELETE CASCADE option.

The final step would be to implement a trigger on the dbo.SourceTable object that fires every time a record is INSERTed into the table:

CREATE TRIGGER [trInsertNewSourceTableToLog]
ON [dbo].[SourceTable]
AFTER INSERT
AS
BEGIN
	INSERT INTO [dbo].[SourceTableLog] ([SourceTableUID])
	SELECT [SourceTableUID]
	FROM inserted
END

For those unfamiliar with how triggers work, the inserted table is a special object exposed during runtime that allows you to access the values that have been…OK, let’s move on!

With all of the above in place, you can now implement a service account for Microsoft Flow to use when connecting to your database that is sufficiently curtailed in its permissions. This can either be a database user associated with a server level login:

CREATE USER [mydatabase-flow] FOR LOGIN [mydatabase-flow]


GRANT CONNECT TO [mydatabase-flow]


GRANT SELECT ON [dbo].[SourceTableLog] TO [mydatabase-flow]


Or a contained database user account (this would be my recommended option):

CREATE USER [mydatabase-flow] WITH PASSWORD = 'P@ssw0rd1'


GRANT CONNECT TO [mydatabase-flow]


GRANT SELECT ON [dbo].[SourceTableLog] TO [mydatabase-flow]


From there, the world is your oyster – you can start to implement whatever actions, conditions etc. you require for your particular requirement(s). There are a few additional tips I would recommend when working with SQL Server and Azure:

  • If you need to retrieve specific data from SQL, avoid querying tables directly and encapsulate your logic in stored procedures instead.
  • In line with the ethos above, ensure that you always use a dedicated service account for authentication and scope the permissions to only those that are required.
  • If working with Azure SQL, you will need to ensure that you have ticked the Allow access to Azure services option on the Firewall rules page of your server.

Despite some of the challenges you may face in getting your databases up to spec to work with Microsoft Flow, this does not take away from the fact that the tool is incredibly effective in its ability to integrate disparate services together, once you have overcome some initial hurdles.