With two major Microsoft events taking place back to back over the last fortnight – Microsoft Inspire and the Business Applications Summit – there is, understandably, a plethora of major new announcements for those of us working in the Business Applications space today. The critical announcement from my perspective is the October 2018 Business Application Release Notes, which gives us all a nice, early look at what is going to be released soon for Dynamics 365, Microsoft Flow, PowerApps, Power BI and other related services. Unlike previous Spring or Fall releases, the sheer breadth of features that now sit within the Business Applications space makes it all the more important to consider each new announcement carefully and to ensure that they are adequately factored into any architectural decisions in the months ahead. If you are having trouble wading through all 239 pages of the document, I have been through the notes and picked out what I feel are the most relevant highlights from a Dynamics CRM/Dynamics 365 Customer Engagement (D365CE) perspective, along with their potential impact or applicability to business scenarios.

SharePoint Integration with Portals

This is a biggie and a feature that no doubt many portal administrators have been clamouring for, with the only other options to date being a complicated SDK solution or a third-party vendor approach. Document management directly within CRM/D365CE has always been a sketchy idea at best when you consider the database size limitations of the application and the cost of additional database storage. That’s why SharePoint has always represented the optimal choice for storing any documents related to a record, facilitating a much more inexpensive route and affording opportunities to take advantage of the vast array of SharePoint features. When you start adding portals into the mix – for example, to enable customers to upload documents relating to a loan application – the whole thing currently falls flat on its face, as documents (to the best of my knowledge) can only be uploaded and stored directly within CRM/D365CE. With this limitation removed, a significant adoption barrier for CRM Portals will be eliminated, and I am pleased to also see an obligatory Power BI reference included as part of this announcement 🙂

In addition, we are providing the ability to embed Power BI charts within a portal, allowing users to benefit from the interactive visualizations of Power BI.

Portal Configuration Migration

Another process that can regularly feel disjointed and laborious is deploying Portal changes from Dev -> UAT/Test -> Production environments, with no straightforward means of packaging up changes via a Solution or similar for easy transportation. This torment promises to change as part of the release in October, thanks to the following:

To reduce the time and effort required to manage portal configuration across environments, we are publishing schema for configuration migration that works with the Configuration Migration SDK tool.

If you are not aware of the Configuration Migration tool, then you owe it to yourself to find out more about what it can accomplish, as I am sure it will take a lot of the headache out of migrating everyday business settings, product catalogue data or other non-solution customisation work that you may be carrying out across multiple environments. The neat thing about this particular announcement is that an existing, well-established tool is being used to meet these new requirements, as opposed to an entirely new, unfamiliar mechanism. Integration with the current Configuration Migration tool will surely help this solution to be adopted more quickly and enable deployment profiles to be put together that contain nearly all of the configuration data required for migration.

Portal Access Restrictions

In Portal terms, this is a relatively minor one, but a welcome addition nonetheless. When developing and testing any software application, it is always prudent to restrict access to only the users or organisations who require it. This option has not been available to Portals to date, but that changes thanks to the following announcement:

This feature would allow administrators to define a list of IP addresses that are allowed to access your portal. The allow list can include individual IP addresses or a range of IP addresses defined by a subnet mask. When a request to the portal is generated from any user, their IP address is evaluated against the allow list. If the IP address is not in the list, the portal replies with an HTTP 403 status code

The capabilities exposed here demonstrate a lot of parity with Azure Web Apps, which is, I understand, what is used to host portals. I would hope that we can see the exposure of more Azure Web App configuration features for portal administrators in the years ahead.

Multi-resource Scheduling

There has been a real drive in recent years to make the Resource Scheduling experience within D365CE as visually polished and feature-rich as possible. There is a specific reason for this – the introduction of Project Service Automation and Field Service capability requires it as an almost mandatory prerequisite. There is a wide array of new features relating to resource scheduling as part of this update, but the announcement that caught my eye in particular was the ability to group related resources on the Resource Scheduler as predefined “crews”. This new feature is hugely welcome for many reasons:

  • Different types of jobs/work may require resources with a specific set of skills in combination to complete.
  • It may be prudent to group specific resources if, for example, previous experience tells you that they work well together.
  • Location may be a factor as part of all this, meaning that by scheduling a “crew” of resources together within the same locale, you can reduce the unnecessary effort involved in travelling and ensure your resources are utilising their time more effectively.

The release notes give us a teaser of how this will look, and I am eager to see how it works in practice:

Leave and absence management in Dynamics 365 Talent

I have been watching with casual, distant interest how the Dynamics 365 Talent product has been developing since its release, billed as one of the first applications built on top of the new Unified Interface/Common Data Service experience. Its primary utility to date has leaned towards the Human Resources hiring and onboarding process, leaving a significant feature gap that other HR systems on the market today would more than happily fill by providing central hubs for policy documents, personal information management and leave requests. I think there may be a recognition of this within Microsoft, which would explain the range of new features coming to Dynamics 365 Talent as part of the October 2018 release. The new feature that best epitomises the application’s maturity is the ability to manage leave and absences, noted as follows:

Organizations can configure rules and policies related to their leave and absence plans. They can choose how employees accrue their time off, whether it’s by years of service or by hours worked. They also can configure when this time off can be taken and if certain types of time off must be taken before others. If they allow employees to get a pay-out of their time off, this can be configured as well.

Managers can see an all-up calendar view of their team members’ time off as well as company holidays and closures. This view shows them where they may have overlap as well as time-off trends for their team and enables them to drill down to gain a better understanding of an individual’s time off.

This immediately places the system as a possible challenger to other HR systems and represents a natural, and much needed, coming-of-age development for the system. I would undoubtedly say that Dynamics 365 Talent is starting to become something that warrants much closer attention in future.

Develop Microsoft Flows Using Visio

Microsoft Flow is great. This fact should be self-evident to regular followers of the blog. As a rapidly evolving, relatively young product, though, it is understandable that some aspects of it require further work. A good example of this is managing the deployment of Flows between different environments or stages. While Flow’s big brother, Microsoft Logic Apps, has this pretty well covered, deploying development or concept Flows repeatedly often ends up being a case of manually recreating each Flow from scratch, which isn’t exactly fun.

The October release promises to change this with the introduction of a specific piece of integration with Microsoft Visio:

Microsoft Visio enables enterprises to capture their business processes using its rich modeling capabilities. Anyone who creates flowcharts or SharePoint workflows can now use Visio to design Microsoft Flow workflows. You can use Visio’s sharing and commenting capabilities to collaborate with multiple stakeholders and arrive at a complete workflow in little time. As requested here, you can publish the workflow to Microsoft Flow, then supply parameters to activate it.

This feature will be available to Visio Online Plan 2 subscription users. Office Insiders can expect early access in July 2018. In the future, you’ll also be able to export existing Flows and modify them in Visio.

Now, it’s worth noting, in particular, the requirement for Visio Online Plan 2 to accommodate this neat piece of functionality. But, assuming this is not an issue for your organisation, the potential here to define Flows locally, share them quickly for approval and deploy them en masse is enormous, bringing a much-needed degree of automation to a product that currently does not support this. I’m looking forward to getting my hands on this in due course.

Custom Fonts in Power BI

Continuing the theme of obligatory Power BI references, my final pick has to be the introduction of Custom Fonts into Power BI, which will be in Public Preview as part of October’s release:

Corporate themes often include specific fonts that are distributed and used throughout the company. You can use those fonts in your Power BI reports.

For any font property, Power BI Desktop will show a complete list of all the fonts installed on your computer. You can choose from these to use in your report. When distributing the report, anyone with the font installed will see it reflected in the report. If the end user doesn’t have it installed, it falls back to the default font.

For those who have particular branding requirements to accommodate within their Power BI Reports, this new feature completes the puzzle and takes you a step further in transforming your reports so that they are almost unrecognisable from a default Power BI Report. Hopefully, the preview period for this new feature will be relatively short, with a roll-out to general availability following soon after.

Conclusions or Wot I Think

The list above is just a flavour of my “choice cuts” of the most exciting features that will be in our hands within the next few months, and I really would urge you to read through the entire document if you have even a passing interest in any of the technologies included in these release notes. As you can tell, my list is ever so skewed towards Portals above everything else. This is for a good reason – ever since Microsoft’s acquisition of ADXStudio a few years back, we have seen some progress in the development of CRM Portals from Microsoft, mainly in the context of integrating the product more tightly for Online users. In my view, this has been the only significant effort to take the product forward, with a relatively extensive list of backlog feature requests looking to have been consigned to the recycling bin. The October release very much flips this on its head, and I am pleased to see a whole range of new, much clamoured-for features being made available for Portals, which take the product forward in strides and enable organisations to contemplate their introduction more easily.

As you will probably expect based on where things are going in the D365CE space at the moment, the announcements for Flow, PowerApps and the Common Data Service are all very much framed towards the end goal of integrating these and the “old” CRM/D365CE experience together as tightly as possible, a change that should be welcomed. The release notes also highlight how important it is for anyone working in this space to be as multi-skilled as possible from a technology standpoint. Microsoft is (quite rightly) encouraging all technology professionals to be fast and reactive to change, and expecting us to have a diverse range of skills to help the organisations and businesses we work with every day. There is no point in fighting this, and the best way to succeed in this climate is to identify the relevant opportunities arising from these product announcements and proactively implement them as part of the work you are doing each day. In a nutshell, you should know how to deploy a Power BI Dashboard, be familiar with the types of services that Flow connects to, understand the difference between Canvas and Model-driven PowerApps and – amongst all of this – understand how D365CE solutions operate. Be a Swiss Army Knife as much as possible and deliver as much value and benefit in your role as you possibly can.

I was very honoured and excited to be involved with the very first D365UG/CRMUG North West Chapter Meeting earlier this week, hosted at the Grindsmith just off Deansgate in Manchester. This is the first time that a D365UG/CRMUG event has taken place in the North West, and we were absolutely stunned by the level of interest this event generated – all in all, 37 people attended, representing a broad spectrum of Microsoft partners and organisations of varying sizes.

I very much got the impression that the number of Dynamics 365 Customer Engagement (D365CE) users in the North West far exceeds any number you might assume, and I am really looking forward to seeing how future events develop as we (hopefully!) get more people involved. Despite a few technical glitches with the AV facilities, the feedback we have received on both presentations has been overwhelmingly positive, so a huge thanks to everyone who turned up and to our presenters for the evening.

In this post, I wanted to share my thoughts on both sets of presentations, provide an answer to some of the questions that we didn’t get around to due to time constraints and, finally, provide a link to the slide deck from the evening.

Transform Group – The Patient Journey

The first talk of the evening was provided courtesy of Bill Egan at Edgewater Fullscope, who took us through Transform Group’s adoption of D365CE. Bill provided some really useful insights – from both an organisation’s and a Microsoft partner’s perspective – into the challenges that any business can face when moving across to a system like D365CE. As with any IT project, there were some major hurdles along the way, but Bill demonstrated how the business was able to roll with the punches, and the very optimistic 16-week planned deployment arguably presents an essential blueprint for how IT projects should be executed; namely, targeted towards delivering as much business benefit as possible in a near-immediate timeframe.

The key takeaway for me out of all this was the importance of adapting projects quickly to changing business priorities and of recognising the continued effort required to ensure that business systems are regularly reviewed and updated to suit the requirements of not just the users, but the wider business.

Power Up Your Power Apps

The second presentation was a literal “head to head” challenge between Craig Bird from Microsoft and Chris “The Tattooed CRM Guy” Huntingford from Hitachi Solutions, to see who could build the best PowerApp. In the end, the voting was pretty unanimous and Craig was the proud recipient of a prize worthy of a champion. I hope Craig will be wearing his belt proudly at future events 🙂

I found the presentation particularly useful in clearing up a number of worries I had around the Common Data Service and the future of D365CE. The changes that I saw are very much geared towards providing a much-needed facelift to the current customisation and configuration experience within D365CE, with little need to factor in migration work or the extensive learning of new tools to ensure that your D365CE entities are available within the Common Data Service. Everything “just works” and syncs across flawlessly.

https://twitter.com/joejgriffin/status/1009531079492079622

In terms of who had the best app, I think Craig knocked the socks off everyone with his translator application, myself included. Even so, I was surprised to see that PowerApps supports embedded Power BI content, courtesy of Chris – a really nice thing to take away for any aspiring PowerApps developer.

Questions & Answers

We managed to get around to most questions for the first presentation, but not for the second one. Here’s a list of all the questions that I am able to provide an answer to. I’m still in the process of collating responses to the other questions received, so please keep checking back if your burning question is not answered below:

Presentation

For those who missed the event or want to view the slides without a purple tinge, they will be downloadable for the next 30 days from the following location:

https://jamesgriffin-my.sharepoint.com/:p:/g/personal/joe_griffin_gb_net/EbRAws0urypMkrGyqCzoTdMB4ggjUQI4_npQlEZAYhea4w?e=U3lvf5

Looking Ahead

The next chapter meeting is scheduled to take place on the 2nd of October (venue TBC). If you are interested in getting involved, either through giving a presentation or in helping to organise the event, then please let us know by dropping us a message:

  • Email: crmuguknw@gmail.com
  • Twitter: @CRMUG_UK_NW

If you are looking for an easy-to-use and highly expandable mail relay service, SendGrid represents the most developer-friendly solution on the market today. What’s even better is that it’s available on Azure, making it the ideal choice if you are building or extending a solution on the Azure stack. The thing I like best about the service is the extensive documentation covering every aspect of its Web API, structured to provide a clear explanation of endpoint methods, required properties and example outputs – exactly the way that all technical documentation should be laid out.

I recently had a requirement to integrate with the SendGrid API to extract email statistic information into a SQL database. My initial thought was that I would need to resort to a bespoke C# solution to achieve these requirements. However, keenly remembering my commitment this year to find opportunities to utilise the service more, I decided to investigate whether Microsoft Flow could streamline this process. Suffice it to say, I was pleasantly surprised, and what I wanted to do as part of this week’s blog post was demonstrate how I was able to take advantage of Microsoft Flow to deliver my requirements. In the process, I hope to get you thinking about how you approach integration requirements in the future, challenging some of the preconceptions around this.

Before we get into creating the Flow itself…

…you will need to create a table within your SQL database to store the requisite data. This script should do the trick:

CREATE TABLE [dbo].[SendGridStatistic]
(
	[SendGridStatisticUID] [uniqueidentifier] NULL DEFAULT NEWID(),
	[Date] DATE NOT NULL,
	[CategoryName] VARCHAR(50) NULL,
	[Blocks] FLOAT NOT NULL,
	[BounceDrops] FLOAT NOT NULL,
	[Bounces] FLOAT NOT NULL,
	[Clicks] FLOAT NOT NULL,
	[Deferred] FLOAT NOT NULL,
	[Delivered] FLOAT NOT NULL,
	[InvalidEmail] FLOAT NOT NULL,
	[Opens] FLOAT NOT NULL,
	[Processed] FLOAT NOT NULL,
	[SpamReportDrops] FLOAT NOT NULL,
	[SpamReports] FLOAT NOT NULL,
	[UniqueClicks] FLOAT NOT NULL,
	[UniqueOpens] FLOAT NOT NULL,
	[UnsubscribeDrops] FLOAT NOT NULL,
	[Unsubscribes] FLOAT NOT NULL
)

A few things to point out with the above:

  • The CategoryName field is only required if you wish to return statistic information grouped by category from the API. The example that follows primarily covers this scenario, but I will also demonstrate how to return consolidated statistic information if you want to exclude this column.
  • Microsoft Flow will only be able to map the individual statistic count values to FLOAT fields. If you attempt to use an INT, BIGINT etc. data type, then the option to map these fields will not appear. Kind of annoying, given that FLOATs are effectively “dirty”, imprecise numbers, but since we are not working with decimal values, this shouldn’t cause any real problems.
  • The SendGridStatisticUID is technically optional and could be replaced by an INT/IDENTITY seed, or removed entirely. Remember, though, that it is always good practice to have a column that uniquely identifies each row in a table, to aid in individual record operations.

In addition, you will also need to ensure you have generated an API key for SendGrid that has sufficient privileges to access the Statistic information for your account.

With everything ready, we can now “flow” quite nicely into building things out. The screenshot below demonstrates how the completed Flow should look from start to finish. The sections that follow will discuss what is required for each constituent element.

Recurrence

The major boon when working with Flows is the diverse options you have for triggering them – either based on certain conditions within an application or simply off a recurring schedule. For this example, as we will be extracting statistic information for an entire 24-hour period, you should ensure that the Flow executes at least once daily. The precise timing of this is up to you, but for this example, I have suggested 2 AM local time each day. The configured recurrence settings should resemble the below if done correctly:

You should be aware that when your Flow is first activated, it will execute straightaway, regardless of what settings you have configured above.

HTTP

As the SendGrid Web API is an HTTP endpoint, we can utilise the built-in HTTP connector to retrieve the information we need. This is done via a GET operation, with authentication achieved via a Raw header value containing the API key generated earlier. The tricky bit comes in building the URI so that the Flow retrieves exactly the information we want – namely, all statistic information covering the previous day – along with the (optional) requirement of ensuring that statistic information is grouped by category when retrieved. Fortunately, we can solve this by using a bit of Expression trickery to build a dynamic URI value each time the Flow is executed. The expression code to use will depend on whether or not you require category grouping. I have provided both examples below, so simply choose the one that meets your specific requirement:

Retrieve Consolidated Statistics
concat('https://api.sendgrid.com/v3/stats?start_date=', string(getPastTime(1, 'day', 'yyyy-MM-dd')), '&end_date=', string(getPastTime(1, 'day', 'yyyy-MM-dd')))
Retrieve Statistics Grouped By Category
concat('https://api.sendgrid.com/v3/categories/stats?start_date=', string(getPastTime(1, 'day', 'yyyy-MM-dd')), '&end_date=', string(getPastTime(1, 'day', 'yyyy-MM-dd')), '&categories=cat1&categories=cat2')

Note: For this example, statistic information would be returned only for the categories that equal cat1 & cat2. These should be updated to suit your requirements, and you can add additional categories by extending the URI value like so: &categories=cat3&categories=cat4 etc. In both expressions, getPastTime(1, 'day', 'yyyy-MM-dd') resolves to yesterday’s date, so a Flow run on 2 July 2018, for example, would request statistics with start_date and end_date both set to 2018-07-01.

Your completed HTTP component should resemble the below if done correctly. Note in particular the requirement to have Bearer and a space before specifying your API key:

Parse JSON

A successful 200 response from the Web API endpoint will return a JSON object, listing all statistic information grouped by date (and category, if used). I always struggle when it comes to working with JSON – a symptom of working too long with relational databases, I think – and deserializing result sets is always a challenge for me. Once again, Flow comes to the rescue by providing a Parse JSON component. This was introduced with what appears to be little fanfare last year, but it really proves its capabilities in this scenario. The only bit you will need to worry about is providing a sample schema so that the service can properly interpret your data. The Use sample payload to generate schema option is the surest way of achieving this, and you can use the example payloads provided on the SendGrid website to facilitate this:

Retrieve Consolidated Statistics: https://sendgrid.com/docs/API_Reference/Web_API_v3/Stats/global.html

Retrieve Statistics Grouped By Category: https://sendgrid.com/docs/API_Reference/Web_API_v3/Stats/categories.html

An example screenshot is provided below in case you get stuck with this:

Getting the records into the database

Here’s where things get confusing…at least they did for me when I was building out this Flow for the first time. When you attempt to add an Insert row step to the Flow and specify your input from the Parse JSON step, Microsoft Flow will automatically add two Apply to each steps to properly handle the input. I can understand why this is the case, given that we are working with a nested JSON response, but it does provide an ample opportunity to revisit an internet meme of old…

Just pretend Xzibit is Microsoft Flow…

With the above ready and primed, you can begin to populate your Insert row step. Your first step here will more than likely be to configure your database connection settings using the + Add New Connection option:

The nicest thing about this is that you can utilise the on-premises gateway service to connect to a non-cloud database if required. The usual rules apply, regardless of where your database is located – use a minimally privileged account, configure any required IP whitelisting and so on.

With your connection configured, all that’s left is to provide the name of your table and then perform a field mapping exercise from the JSON response. If you are utilising the SendGridStatisticUID field, then this should be left blank to ensure that the default constraint kicks in correctly on the database side:

The Proof is in the Pudding: Testing your Flow

All that’s left now is to test your Flow. As highlighted earlier in the post, your Flow will automatically execute after being enabled, meaning that you will be in a position to determine very quickly if things are working or not. Assuming everything executes OK, you can verify that your database table resembles the below example:

This example output utilises the CategoryName value, which will result in multiple data rows for each date, depending on the number of categories you are working with. This is why the SendGridStatisticUID is so handy for this scenario 🙂
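If you would rather confirm things from the database side than rely on the Flow run history, a quick query against the table created at the start of this post will show exactly what has been inserted – the columns selected below are just an illustrative subset:

SELECT [Date], [CategoryName], [Delivered], [Opens], [Clicks]
FROM [dbo].[SendGridStatistic]
ORDER BY [Date] DESC, [CategoryName];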

Conclusions or Wot I Think

When it came to delivering the requirements set out in this post’s introduction, I cannot overemphasise how much Microsoft Flow saved my bacon. My initial scoping exercise strongly led me towards developing a fully bespoke solution in code, with the additional work required to deploy it out to a dedicated environment for continuous execution. This would surely have led to:

  • Increased deployment time
  • Additional cost for the provisioning of a dedicated execution environment
  • Wasted time/effort due to bug-fixing or unforeseen errors
  • Long-term problems resulting from maintaining a custom code base and ensuring that other colleagues within the business could interpret the code correctly.

Instead, I was able to design, build and fully test the above solution in less than 2 hours, utilising a platform that has a wide array of documentation and online support and which, for our particular scenario, did not result in any additional cost. And this, I feel, best summarises the value that Microsoft Flow can bring to the table. It overturns many of the assumptions that you generally have to make when implementing complex integration requirements, allowing you to instead focus on delivering an effective solution quickly and neatly. And, for those who need a bit more punch due to highly specific business requirements, Azure Logic Apps act as the perfect meeting ground for both sides of the spectrum. The next time you find yourself staring down the abyss of a seemingly impossible integration requirement, take a look at what Microsoft Flow can offer. You just might surprise yourself.

Very much like a stopped clock telling the correct time twice a day, you can guarantee there will be two Dynamics 365 Customer Engagement releases each year. The first such occasion this year has come around quickly, with Microsoft setting out its stall for the Spring 2018 release earlier this week. The headline messages for this release are all about providing reassurance that the application is GDPR ready and emphasising the maturity of PowerApps & Microsoft Flow as products in their own right and in conjunction with Dynamics 365. I’ve been studying the release notes in greater detail and, as part of this week’s blog post, I wanted to delve underneath the headlines and pick out some of the less touted, but potentially most impactful, new features that I am most looking forward to.

Answer Tags for Voice of the Customer Surveys

I made a commitment earlier this year to utilise the Voice of the Customer solution more. When used correctly, and if you are already heavily invested in Dynamics 365, the solution can present a straightforward and cost-effective way of starting to understand what customers are thinking, both with respect to specific experiences they have with your business and towards the organisation overall. One new feature to be introduced with Voice of the Customer, which I am looking forward to getting my hands on, is the ability to use Answer Tags to dynamically structure any subsequent questions within the survey. A good example of how this works in practice can be seen below, as shown in the release notes:

The key driver behind the automation of customer feedback tools should be to ensure that customers receive tailored and relevant surveys relating to services they have received, whilst also taking away any administrative headache when distributing and collating feedback answers. The feature above helps to solidify the benefits that Voice of the Customer can deliver when utilised in tandem with Dynamics 365 Customer Engagement, as well as allowing for more powerful and broadly applicable surveys to be structured at design time.

The rise of the Unified Interface

The rebrand of the entire Dynamics 365 Customer Engagement application has been much promised and touted over the past year. With this release, it becomes a reality. Pretty much every key application module – Customer Service, Sales, Field Service & Project Service Automation – has been updated to utilise the new Unified Interface. The following applications/solutions will also be Unified Interface ready as part of the release:

  • Dynamics 365 App for Outlook
  • LinkedIn Sales Navigator
  • Gamification

The Unified Interface is very much an offshoot of the Interactive Service Hub, which it now replaces fully as part of this release (Interactive Service Hub users should read this article carefully, as there are some important points to consider if you plan to upgrade in the near future). I saw the new unified interface in action when attending the CRMUG Meeting in Reading last year, and its introduction represents one of the ways Microsoft is investing heavily within the Dynamics 365 product moving forward. Its key benefits in comparison to the current experience can be summarised as follows:

  • Consistent end-user experience when accessing the application from desktop, mobile or tablet operating systems.
  • A fully mobile-responsive template that adjusts to your specific device to provide the optimal experience.
  • Better utilisation of empty spacing across entity views, forms etc.

With this release, administrators and developers need to start actively considering the impact the Unified Interface has on their systems and plan accordingly. Whilst I imagine there to be some pain involved as part of this, the end result – a much crisper and effective end-user interface – is worth the trade-off.

PowerShell Management for PowerApps

Up until now, your options for the automation of administrative tasks for PowerApps were limited. This issue was addressed to a certain extent for Dynamics 365 Customer Engagement Online very recently, via the introduction of PowerShell modules to facilitate organisation backups, instance resets and/or administrative mode toggling. These types of tools can go a long way if you have implemented automated release management tools for your various environments, taking human error out of the equation and streamlining deployments.

PowerApps looks to be going in the right direction in this regard, as the Spring Wave release will introduce a set of cmdlets covering the following areas:

  • Environments and environment permissions
  • PowerApps and app permissions
  • Flows and flow permissions
  • Export and import of resource packages across environments
  • PowerApps and Flow licenses report (of active users)

Whilst definitely more administrative than deployment-focused, their introduction is no doubt a welcome step in the right direction.

Future of the Common Data Service

Microsoft released the Common Data Service (CDS) in late 2016, around the same time as Microsoft Flow and the Dynamics CRM rebrand. The premise was simple and admirable: a common framework for you to develop the data you need for your business, one that is instantly re-usable across multiple applications. My chief concern when this was first announced was where it left the traditional customisation experience for Dynamics CRM/365 Customer Engagement, commonly referred to as xRM. Having to countenance potential redevelopment of “legacy” xRM systems just to make them compatible with the CDS could prove a costly and unnecessary exercise; this is perhaps best summed up by the old saying “If it ain’t broke, don’t fix it!”.

There seems to have been a recognition of this dilemma as part of this release, with the following announcement regarding the Common Data Service and PowerApps specifically:

This release also includes major advancements to the Common Data Service for Apps (the data platform that comes with PowerApps) and client UX creation tools. These new capabilities are backward-compatible with the Dynamics 365 platform (frequently called the xRM platform), which means that Dynamics 365 customizers and partners can use already-acquired skills to create apps with PowerApps.

What I think this means, in simple terms, is that the customisation experience between Dynamics 365 Customer Engagement and PowerApps will, in time, become virtually indistinguishable. And this is great for a number of reasons – it removes any excuse individuals or organisations may raise for not exploring PowerApps further, gives us the ability to quickly develop our own custom mobile applications for our particular Dynamics 365 solution and provides an easy framework to unify business data across multiple applications. This very much parallels the intended experience that Power BI offers traditional Excel users – namely, a familiar toolbox that can be leveraged to quickly deploy solutions with reduced technical debt. As with a lot of these announcements, we’re not going to know exactly how things operate until they are in our hands, but the immediate upshot appears to be that there is little new to learn to start working with the CDS.

If you are looking for further detail regarding this change, then the ever so excellent Jukka Niiranen has published a blog post which really breaks down the detail behind this better than I ever could 🙂

Yes, XRM Is The New Common Data Service

Email Notifications for Microsoft Flow Failures

Similar to Voice of the Customer, I also promised myself to use Microsoft Flow more this year. After some uneventful early testing, the tool has become (for me) an indispensable means of achieving integration requirements that would traditionally require custom code and a dedicated server environment to execute. Microsoft Flow does get some much-deserved love and attention as part of this release, and the one new feature which I think is going to be of the biggest help is email notifications for flow failures. The announced feature details are as follows:

Enable email notifications to detect flow failures. To enable this feature, go to the Flow details page, and then, on the contextual menu (…), subscribe to receiving emails about flow failures. These useful email notifications provide:

  • Information about why your flow failed.

  • Meaningful remediation steps.

  • Additional resources to help you build robust flows that never fail.

There’s so much more about this release that you could talk for days about…

…but I would be unsure whether anyone would still be listening by the end! You can dive into the detail behind each of the above highlights and what else to expect in the next release by downloading the release notes yourself. Let me know in the comments below what you are looking forward to the most as part of the next release.

Microsoft Flow is a tool that I increasingly have to bring front and centre when considering how to straightforwardly accommodate certain business requirements. The problem I have had with it, at times, is that there are often some notable caveats when attempting to achieve something that looks relatively simple from the outset. A good example of this is the SQL Server connector which, based on a headline reading, enables you to trigger workflows when rows are added or changed within a database. Being able to trigger an email from a database record update, create a document on OneDrive or even post a Tweet based on a modified database record – all of these instantly have a high degree of applicability for any number of different scenarios. When you read the fine print behind this, however, there are a few things which you have to bear in mind:

Limitations

The triggers do have the following limitations:

  • It does not work for on-premises SQL Server
  • Table must have an IDENTITY column for the new row trigger
  • Table must have a ROWVERSION (a.k.a. TIMESTAMP) column for the modified row trigger

A slightly frustrating side to this is that Microsoft Flow doesn’t intuitively tell you when your table is incompatible with the requirements – contrary to what is stated in the above post. Whilst readers of this post may be correct in chanting “RTFM!”, it would still be nice to be informed of any potential incompatibilities within Flow itself. Certainly, this would help prevent any needless head-banging early on 🙂
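One way to spare yourself that head-banging is to inspect the target table’s metadata up front. The query below is a quick sketch of this check, run against dbo.MyTable (the example table used in the script further down); it lists any existing IDENTITY column and any ROWVERSION column, which appears under the type name timestamp in the system catalog:

SELECT c.name AS ColumnName, t.name AS TypeName, c.is_identity
FROM sys.columns AS c
	INNER JOIN sys.types AS t
		ON c.user_type_id = t.user_type_id
WHERE c.object_id = OBJECT_ID('dbo.MyTable')
	AND (c.is_identity = 1 OR t.name = 'timestamp');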

Getting around these restrictions is fairly straightforward if you have the ability to modify the table you want to interact with using Flow. For example, executing the following script against the MyTable table will get it fully prepped for the service:

ALTER TABLE dbo.MyTable
ADD	[FlowID] INT IDENTITY(1,1) NOT NULL,
	[RowVersion] ROWVERSION
	

That said, there are certain situations where this may not be the best option to implement:

  • The database/tables you are interacting with form part of a proprietary application, making it impractical and potentially dangerous to modify table objects.
  • The table in question could contain sensitive information. Keep in mind the fact that the Microsoft Flow service would require service account access with full SELECT privileges against your target table. This could expose a risk to your environment, should the credentials or the service itself be compromised in future.
  • If your target table already contains an inordinately large number of columns and/or rows, then the introduction of additional columns and processing via an IDENTITY/ROWVERSION seed could start to tip your application over the edge.
  • Your target database does not use an integer field with an IDENTITY seed to uniquely identify rows, meaning that such a column needs to be (arguably unnecessarily) added.

An alternative approach to consider would be to configure a “gateway” table for Microsoft Flow to access – one which contains only the fields that Flow needs to work with, is linked back to the source table via a foreign key relationship and uses a database trigger to automate the creation of the “gateway” record. Note that this approach only works if you have a unique row identifier in your source table in the first place; if your table is recording important, row-specific information and this is not in place, then you should probably re-evaluate your table design 😉

Let’s see how the above example would work in practice, using the following example table:

CREATE TABLE [dbo].[SourceTable]
(
	[SourceTableUID] UNIQUEIDENTIFIER PRIMARY KEY NOT NULL,
	[SourceTableCol1] VARCHAR(50) NULL,
	[SourceTableCol2] VARCHAR(150) NULL,
	[SourceTableCol3] DATETIME NULL
)

In this scenario, the table object is using the UNIQUEIDENTIFIER column type to ensure that each row can be…well…uniquely identified!

The next step would be to create our “gateway” table. Based on the table script above, this would be built out via the following script:

CREATE TABLE [dbo].[SourceTableLog]
(
	[SourceTableLogID] INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
	[SourceTableUID] UNIQUEIDENTIFIER NOT NULL,
	CONSTRAINT FK_SourceTable_SourceTableLog FOREIGN KEY ([SourceTableUID])
		REFERENCES [dbo].[SourceTable] ([SourceTableUID])
		ON DELETE CASCADE,
	[TimeStamp] ROWVERSION
)

The use of a FOREIGN KEY here will help to ensure that the “gateway” table stays tidy in the event that any related record is deleted from the source table. This is handled automatically, thanks to the ON DELETE CASCADE option.

The final step would be to implement a trigger on the dbo.SourceTable object that fires every time a record is INSERTed into the table:

CREATE TRIGGER [trInsertNewSourceTableToLog]
ON [dbo].[SourceTable]
AFTER INSERT
AS
BEGIN
	INSERT INTO [dbo].[SourceTableLog] ([SourceTableUID])
	SELECT [SourceTableUID]
	FROM inserted
END

For those unfamiliar with how triggers work, the inserted table is a special object exposed during runtime that allows you to access the values that have been…OK, let’s move on!
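If you want to satisfy yourself that the trigger and the ON DELETE CASCADE behaviour are both doing their jobs before pointing Flow at the database, a quick test along the following lines (the values are purely illustrative) should do the trick:

INSERT INTO [dbo].[SourceTable] ([SourceTableUID], [SourceTableCol1])
VALUES (NEWID(), 'Trigger test');

-- Expect one log row per source row, courtesy of the trigger
SELECT [SourceTableLogID], [SourceTableUID], [TimeStamp]
FROM [dbo].[SourceTableLog];

-- Removing the source record should also clear down the log row via ON DELETE CASCADE
DELETE FROM [dbo].[SourceTable]
WHERE [SourceTableCol1] = 'Trigger test';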

With all of the above in place, you can now implement a service account for Microsoft Flow to use when connecting to your database that is sufficiently curtailed in its permissions. This can either be a database user associated with a server-level login:

CREATE USER [mydatabase-flow] FOR LOGIN [mydatabase-flow]
	WITH DEFAULT_SCHEMA = dbo

GO

GRANT CONNECT TO [mydatabase-flow]

GO

GRANT SELECT ON [dbo].[SourceTableLog] TO [mydatabase-flow]

GO

Or a contained database user account (this would be my recommended option):

CREATE USER [mydatabase-flow] WITH PASSWORD = 'P@ssw0rd1',
	DEFAULT_SCHEMA = dbo

GO

GRANT CONNECT TO [mydatabase-flow]

GO

GRANT SELECT ON [dbo].[SourceTableLog] TO [mydatabase-flow]

GO

From there, the world is your oyster – you can start to implement whatever actions, conditions etc. you need for your particular requirement(s). There are a few additional tips I would recommend when working with SQL Server and Azure:

  • If you need to retrieve specific data from SQL, avoid querying tables directly and instead encapsulate your logic into Stored Procedures (a quick sketch of this follows the list below).
  • In line with the ethos above, ensure that you always use a dedicated service account for authentication and scope the permissions to only those that are required.
  • If working with Azure SQL, you will need to ensure that you have ticked the Allow access to Azure services option on the Firewall rules page of your server.
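To illustrate the first point, rather than letting Flow query the tables directly, you could wrap the read in a simple stored procedure and grant the service account EXECUTE rights on that alone. The procedure below is a purely hypothetical sketch, based on the example tables and the mydatabase-flow user created earlier in this post:

-- Hypothetical example: expose only the data Flow needs via a stored procedure
CREATE PROCEDURE [dbo].[uspGetSourceTableLogRecords]
AS
BEGIN
	SET NOCOUNT ON;

	SELECT stl.[SourceTableLogID], st.[SourceTableUID], st.[SourceTableCol1], st.[SourceTableCol2], st.[SourceTableCol3]
	FROM [dbo].[SourceTableLog] AS stl
		INNER JOIN [dbo].[SourceTable] AS st
			ON stl.[SourceTableUID] = st.[SourceTableUID];
END

GO

GRANT EXECUTE ON [dbo].[uspGetSourceTableLogRecords] TO [mydatabase-flow]

GO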

Despite some of the challenges you may face in getting your databases up to spec to work with Microsoft Flow, this does not take away from the fact that the tool is incredibly effective in its ability to integrate disparate services together, once you have overcome some initial hurdles at the starting pistol.