With two major Microsoft events recently taking place back to back over the last fortnight – Microsoft Inspire & the Business Applications Summit – there is, understandably, a plethora of major new announcements that concern those of us working in the Business Applications space today. The critical announcement from my perspective is the October 2018 Business Application Release Notes, which gives us all a nice and early look at what is going to be released soon for Dynamics 365, Microsoft Flow, PowerApps, Power BI and other related services. Unlike previous Spring or Fall releases, the sheer breadth of different features that now sit within the Business Applications space makes it all the more important to consider any new announcement carefully and to ensure that they are adequately factored into any architectural decisions in the months ahead. If you are having trouble wading through all 239 pages of the document, then I have been through the notes and picked out what I feel are the most relevant highlights from a Dynamics CRM/Dynamics 365 Customer Engagement (D365CE) perspective, along with their potential impact or applicability to business scenarios.

SharePoint Integration with Portals

This is a biggie and a feature that no doubt many portal administrators have been clamouring for, with the only other option being a complicated SDK solution or a third-party vendor approach. Document management directly within CRM/D365CE has always been a sketchy idea at best when you consider the database size limitations of the application and the cost of additional database storage. That’s why SharePoint has always represented the optimal choice for storing any documents related to a record, facilitating a much more inexpensive route and affording opportunities to take advantage of the vast array of SharePoint features. When you start adding portals into the mix – for example, to enable customers to upload documents relating to a loan application – the whole thing currently falls flat on its face, as documents (to the best of my knowledge) can only be uploaded and stored directly within CRM/D365CE. With the removal of this limitation, a significant adoption barrier for CRM Portals will be eliminated, and I am pleased to also see an obligatory Power BI reference included as part of this announcement 🙂

In addition, we are providing the ability to embed Power BI charts within a portal, allowing users to benefit from the interactive visualizations of Power BI.

Portal Configuration Migration

Another process that can regularly feel disjointed and laborious is the set of steps involved in deploying Portal changes from Dev -> UAT/Test -> Production environments, with no straightforward means of packaging up changes via a Solution or similar for easy transportation. This torment promises to change as part of the release in October, thanks to the following:

To reduce the time and effort required to manage portal configuration across environments, we are publishing schema for configuration migration that works with the Configuration Migration SDK tool.

If you are not aware of the Configuration Migration tool, then you owe it to yourself to find out more about what it can accomplish, as I am sure it will take a lot of the headache out of any everyday business settings, product catalogue or other non-solution customisation activity that you may be carrying out across multiple environments. The neat thing about this particular announcement is that an existing, well-established tool can be used to achieve these new requirements, as opposed to an entirely new, unfamiliar mechanism. Integration with the current Configuration Migration tool will surely help in adopting this solution more quickly and enable deployment profiles to be put together that contain nearly all the configuration data required for migration.

Portal Access Restrictions

In Portal terms, this is a relatively minor one, but a welcome addition nonetheless. When testing and developing any software application, it is always prudent to restrict access to only the users or organisations who require it. This option has not been available for Portals to date, but that changes thanks to the following announcement:

This feature would allow administrators to define a list of IP addresses that are allowed to access your portal. The allow list can include individual IP addresses or a range of IP addresses defined by a subnet mask. When a request to the portal is generated from any user, their IP address is evaluated against the allow list. If the IP address is not in the list, the portal replies with an HTTP 403 status code.

The capabilities exposed here demonstrate a lot of parity with Azure Web Apps, which is, I understand, what is used to host portals. I would hope that we can see the exposure of more Azure Web App configuration features for portal administrators in the years ahead.
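To illustrate the mechanics being described, here is a minimal, purely illustrative PowerShell sketch of how a client IP address can be evaluated against an allow list entry defined by a subnet mask. The portal performs this check for you server-side; the function and values below are hypothetical:

#Illustrative only: evaluate a client IPv4 address against an allowed subnet
function Test-IPInSubnet {
    param([string]$ClientIP, [string]$SubnetIP, [string]$SubnetMask)
    $client = ([System.Net.IPAddress]::Parse($ClientIP)).GetAddressBytes()
    $subnet = ([System.Net.IPAddress]::Parse($SubnetIP)).GetAddressBytes()
    $mask   = ([System.Net.IPAddress]::Parse($SubnetMask)).GetAddressBytes()
    #Compare the masked client address with the masked subnet address, octet by octet
    for ($i = 0; $i -lt 4; $i++) {
        if (($client[$i] -band $mask[$i]) -ne ($subnet[$i] -band $mask[$i])) {
            return $false
        }
    }
    return $true
}

#Returns True; a request from, say, 10.0.5.1 would fail the check and, per the
#announcement above, receive an HTTP 403 response from the portal
Test-IPInSubnet -ClientIP '192.168.1.50' -SubnetIP '192.168.1.0' -SubnetMask '255.255.255.0'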

Multi-resource Scheduling

There has been a real drive in recent years towards getting the Resource Scheduling experience within D365CE looking as visually optimal and feature-rich as possible. There is a specific reason for this: the introduction of Project Service Automation and Field Service capabilities makes it an almost mandatory prerequisite. There is a wide array of new features relating to resource scheduling as part of this update, but the announcement that caught my eye, in particular, was the ability to group related resources on the Resource Scheduler as predefined “crews”. This new feature is hugely welcome for many reasons:

  • Different types of jobs/work may require resources with a specific set of skills in combination to complete.
  • It may be prudent to group specific resources if, for example, previous experience tells you that they work well together.
  • Location may be a factor as part of all this, meaning that by scheduling a “crew” of resources together within the same locale, you can reduce the unnecessary effort involved in travelling and ensure your resources are utilising their time more effectively.

The release notes give us a teaser of how this will look, and I am eager to see how it works in practice.

Leave and absence management in Dynamics 365 Talent

I have been watching with casual, distant interest how the Dynamics 365 Talent product has been developing since its release, billed as one of the first applications built on top of the new Unified Interface/Common Data Service experience. I have noted that its primary utility to date has been geared towards the Human Resources hiring and onboarding process, with a significant feature gap that other HR systems on the market today would more than happily fill by providing central hubs for policy documents, managing personal information and leave requests. I think there may be a recognition of this fact within Microsoft, which explains the range of new features contained within Dynamics 365 Talent as part of the October 2018 release. The new feature that best epitomises the application’s maturity is the ability to manage leave and absences, noted as follows:

Organizations can configure rules and policies related to their leave and absence plans. They can choose how employees accrue their time off, whether it’s by years of service or by hours worked. They also can configure when this time off can be taken and if certain types of time off must be taken before others. If they allow employees to get a pay-out of their time off, this can be configured as well.

Managers can see an all-up calendar view of their team members’ time off as well as company holidays and closures. This view shows them where they may have overlap as well as time-off trends for their team and enables them to drill down to gain a better understanding of an individual’s time off.

This immediately places the system as a possible challenger to other HR systems and represents a natural, and much needed, coming-of-age development for the system. I would undoubtedly say that Dynamics 365 Talent is starting to become something that warrants much closer attention in future.

Develop Microsoft Flows Using Visio

Microsoft Flow is great. This fact should be self-evident to regular followers of the blog. As a rapidly developing, relatively young product, though, it is understandable that some aspects of it require further work. An excellent example of this is the ability to manage the deployment of Flows between different environments or stages. While Flow’s big brother, Microsoft Logic Apps, has this pretty well covered, deploying development or concept Flows repeatedly often ends up being a case of manually creating each Flow again from scratch, which isn’t exactly fun.

The October release promises to change this with the introduction of a specific piece of integration with Microsoft Visio:

Microsoft Visio enables enterprises to capture their business processes using its rich modeling capabilities. Anyone who creates flowcharts or SharePoint workflows can now use Visio to design Microsoft Flow workflows. You can use Visio’s sharing and commenting capabilities to collaborate with multiple stakeholders and arrive at a complete workflow in little time. As requested here, you can publish the workflow to Microsoft Flow, then supply parameters to activate it.

This feature will be available to Visio Online Plan 2 subscription users. Office Insiders can expect early access in July 2018. In the future, you’ll also be able to export existing Flows and modify them in Visio.

Now, it’s worth noting, in particular, the requirement for Visio Online Plan 2 to accommodate this neat piece of functionality. But, assuming this is not an issue for your organisation, the potential here to define Flows locally, share them quickly for approval, and deploy them en masse is enormous, bringing a much-needed degree of automation to a product that currently does not support this. I’m looking forward to getting my hands on this in due course.

Custom Fonts in Power BI

Continuing the theme of obligatory Power BI references, my final pick has to be the introduction of Custom Fonts into Power BI, which will be in Public Preview as part of October’s release:

Corporate themes often include specific fonts that are distributed and used throughout the company. You can use those fonts in your Power BI reports.

For any font property, Power BI Desktop will show a complete list of all the fonts installed on your computer. You can choose from these to use in your report. When distributing the report, anyone with the font installed will see it reflected in the report. If the end user doesn’t have it installed, it falls back to the default font.

For those who have particular branding requirements that need accommodating within their Power BI Reports, this new feature completes the puzzle and takes you an additional step further in transforming your reports so that they are almost unrecognisable from a default Power BI Report. Hopefully, the preview period for this new feature will be relatively short, with the feature then rolled out as part of general availability.

Conclusions or Wot I Think

The list above is just a flavour of my “choice cuts” of the most exciting features that will be in our hands within the next few months, and I really would urge you to read through the entire document if you have even just a passing interest in any of the technologies included in these release notes. As you can tell, my list is ever so skewed towards Portals over everything else. This is for a good reason – ever since Microsoft’s acquisition of ADXStudio a few years back, we have seen some progress in the development of CRM Portals from Microsoft, mainly in the context of integrating the product more tightly for Online users. In my view, this has been the only significant effort we have seen in taking the product forward, with a relatively extensive list of backlog feature requests that looked to have been consigned to the recycling bin. The October Release very much seems to flip this on its head, and I am pleased to discover a whole range of new, much clamoured-for features being made available on Portals, which take the product forward in strides and enable organisations to contemplate their introduction more easily.

As you will probably expect based on where things are going in the D365CE space at the moment, the announcements for Flow, PowerApps and the Common Data Service are all very much framed towards the end goal of integrating these and the “old” CRM/D365CE experience together as tightly as possible, a change that should be welcomed. The release notes are also crucial in highlighting the importance for anyone working in this space of being as multi-skilled as possible from a technology standpoint. Microsoft is (quite rightly) encouraging all technology professionals to be fast and reactive to change, and expecting us to have a diverse range of skills to help the organisations/businesses we work with every day. There is no point in fighting this; the best way for you to succeed in this climate is to identify the relevant opportunities that you can drive forward from these product announcements and proactively implement them as part of the work you are doing each day. In a nutshell, you should know how to deploy a Power BI Dashboard, have familiarity with the types of services that Flow connects to, understand the difference between Canvas and Model-driven PowerApps and – amongst all of this – understand how D365CE solutions operate. Be a Swiss Army Knife as much as possible and deliver as much value and benefit in your role as you possibly can.

Earlier this year, the Business Applications team at Microsoft published a blog post titled Modernizing the way we update Dynamics 365, a significant article that anyone involved with Dynamics 365 Customer Engagement (D365CE) should take time to read through carefully. Indeed, as a direct consequence of the announcements contained in this post, you may now be receiving emails similar to the below if you are an administrator of a D365CE instance:

Changes to well-established processes can always produce a mixture of questions, confusion and, in some cases, frustration for IT teams. Once you have fully understood the broader context of where D365CE is going, and also the general sea change that has been occurring since Satya Nadella came to the helm of Microsoft, the modifications to the Update Policy are welcome and, arguably, necessary to ensure that D365CE users and administrators can take advantage of the different features available within a D365CE subscription today. For those who are still scratching their head at all of this, what follows is a summary of the most significant changes announced, along with some additional observations from me on why it is important to embrace all these changes wholeheartedly.

Version 9 or Bust

Longstanding D365CE online customers will be used to the regular update cycles and the ability to defer significant application updates for a period. While this can be prudent for more complex deployments, it does potentially lead to additional overhead in the long term, particularly if Microsoft were ever to force this decision upon you. The well-established advice has always been to proactively manage your updates at your own pace, ideally targeting at least one major update a year. If you haven’t been doing this, then you may now be in for a particularly nasty shock. As mentioned in the article:

Since every customer will be updated on the continuous delivery schedule, your organization needs to update to the latest version if you are running an older version of Dynamics 365…For customers who are currently running older versions of Dynamics 365, we will continue to provide you with the ability to schedule an update to the latest version and want to make sure this effort is as seamless as possible through continuous improvements in our update engine…For Dynamics 365 (online) customer engagement applications, we sent update communications in May to all customers running v8.1 and have scheduled updates. Customers running v8.2 should plan to update to the latest version by January 31, 2019.

This point is reinforced in a much more explicit manner in the email above:

ACTION NEEDED: Schedule an update for your organization by August 16, 2018. The date for the update should be on or before January 31, 2019. You can find instructions on how to schedule and approve updates here.

If you do not schedule an update in the timeframe mentioned above, Microsoft will schedule an automatic update for your organization on August 17, 2018 and communicate the dates. The automatic update would take place during your normal maintenance window.

The implications should be clear, and it certainly seems that, in this scenario, Microsoft has decided to eliminate any degree of upgrade flexibility for its customers.

No Changes to Minor/Major Updates?

Again, if you are familiar with how D365CE Online operates, there are two flavours of updates:

  • Minor updates, to address bugs, performance and stability issues, are continually pushed out “behind the scenes”. You have no control over when and how these are applied, but they will always be carried out outside your region’s regular business hours. The Office 365 Administrator Portal is your go-to place to view any past or upcoming minor updates.
  • Major updates, generally referred to as Spring Wave or Fall Update releases. There have always been two of these each year, and administrators can choose when to apply them to a D365CE instance. These updates can generally take much longer to complete but will introduce significant new features.

Microsoft’s new Update Policy seems to leave this convention intact, with a noteworthy change highlighted below:

We are transforming how we do service updates for Dynamics 365 (online). We will deliver two major releases per year – April and October – offering new capabilities and functionality. These updates will be backward compatible so your apps and customizations will continue to work post update. New features with major, disruptive changes to the user experience are off by default. This means administrators will be able to first test before enabling these features for their organization.

In addition to the two major updates, we will continue to deploy regular performance and reliability improvement updates throughout the year. We are phasing deployments over several weeks following safe deployment practices and monitoring updates closely for any issues.

Some additional detail around this will be welcome to determine its effectiveness, but I can imagine some parity with the Experimental Features area in PowerApps, which – contrary to the above – will often introduce new features that are left on by default. A derived version of this feature would, I think, work in practice and hopefully streamline the process of testing new functionality without unintentionally introducing it into Production environments.

On-Premise Implications

One question that all of this may raise is around the on-premise version of the application, in particular for those who consume online subscriptions, but use their dual-usage rights to create an on-premise instance instead. This situation becomes more pressing when you consider the following excerpt from the refreshed Update Policy:

Dynamics 365 (Online) version 8.2 will be fully supported until January 31, 2019. Customers running version 8.2 should plan to update to the latest version prior to this date.

Now, the important thing to stress is the fact that the above quotation makes explicit reference to Online as opposed to on-premise. Also, when we check Microsoft’s product lifecycle page, you can see that Mainstream support for this product ends in January 2021. On-premise administrators can, I would suggest, breathe a sigh of relief for now, but I would urge you to contact Microsoft to clarify your support arrangements. As an organisation, I also think you should start seriously asking yourself the following questions:

  • Is an online, Software as a Service (SaaS) version of the application going to be easier to maintain compared with dedicated server environment(s)?
  • Is it possible to achieve all of your required functionality and business requirements using the Online version of the application?
  • Do you want to ensure you have the latest features exposed to you and can take advantage of Online-only functionality, such as Export to Excel Online?

If the answer to all of the above questions is “Yes”, then a migration to the Online version of the application would be my recommended course of action, as it wouldn’t surprise me if Microsoft were to stop releasing new versions/service packs for the on-premise version of the product or eliminate it by providing inexpensive sandbox instance options.

Recommended Next Steps

The fundamental aim of this move is a housekeeping exercise for Microsoft. The announcement earlier this year of version 2 of the Common Data Service – which utilises the existing D365CE SQL database for all customisations – is the key driver behind a lot of the changes that are happening in the CRM/D365CE space today. The focus for the product team at Microsoft currently appears to be towards knitting together both experiences into the PowerApps interface. What this means in practice is that the traditional customisation experience is going to slowly fade away, to be replaced by Model-Driven App development instead. This refresh is excellent for several reasons – it provides a much-needed interface update, while also exposing additional functionality to us when creating business applications – but it is evident that such a massive change will require a consistent playing field for all of Microsoft’s existing version 8.2 and below D365CE customers. Getting everyone onto version 9 of the application is the obvious first step towards rolling out version 2 of the Common Data Service for all existing customers, while ensuring that D365CE can fit into the mould of other application release cycles across Microsoft today. Embracing the change should not be a difficult thing to do and, when you understand the broader context, there is no other option available on the table.

So what are the key takeaways from this that you should be thinking about in the weeks and months ahead? My suggested list would include the following:

  • Schedule your update to version 9 of the application manually, well in advance of August 16th 2018. DO NOT put yourself in a position where you are having an update forced upon you, and give yourself the time needed to successfully plan and test your upgrade before January 31st 2019. I would also anticipate that upgrade slots will start to fill up fast if you wait until as late as possible 🙂
  • Start considering your future strategy with regard to the on-premise version of the application, if you are still supporting these environments. I speak with literally zero authority here, but I would not be surprised if the on-premise version of the application receives no further updates at all in future or if dual-usage rights get revoked entirely.
  • Get familiar with the Common Data Service and PowerApps, as this is increasingly going to be the go-to area for D365CE development and administration in the future. If you get the opportunity to attend one of Microsoft’s PowerApps in a Day courses, then be sure to go along without any hesitation. I would also be happy to speak to and help anyone with training in this area.
  • As with anything in life, embrace change, be proactive and identify areas of opportunity from this. A good one from my perspective is the potential to more easily introduce the staggering array of differing Business Application functionality, with the outcome being the ability to quickly deploy bespoke business applications that achieve any possible requirement and integrate with a wide variety of different services or datasets.

On February 10th 2015, Microsoft published Security Bulletin MS15-011, which detailed a recently discovered critical flaw in every major version of Windows from Server 2003 right the way through to Windows 8.1. The flaw, relating to how Group Policy handles data, potentially allows:

…remote code execution if an attacker convinces a user with a domain-configured system to connect to an attacker-controlled network. An attacker who successfully exploited this vulnerability could take complete control of an affected system. An attacker could then install programs; view, change, or delete data; or create new accounts with full user rights.

Microsoft was quick to release a corresponding security patch via the Windows Update service, and each corresponding Operating System update can be downloaded and installed via the links below.

Now, you may be asking at this point, why are you posting about this now in mid-2018? Surely the vulnerability has been addressed on all affected systems that are patched regularly? Well, as I found out very recently, this security patch is one of many released by Microsoft that requires additional administrator intervention after installation to ensure that the exploit hole is properly filled. The modifications required can be completed either via the Local Group Policy Editor for single machines not joined to a domain or the Group Policy Management Console from a domain controller. The second option is naturally preferred if you are managing a large estate of Windows machines. Below are summarised steps that should provide the appropriate guidance on applying the fix for both environments:

  1. Navigate to the Group Policy Management Editor and expand the Computer Configuration/Policies/Administrative Templates/Network/Network Provider folder path.

  2. On the right-hand pane, you should see an item called Hardened UNC Paths, marked in a state of Not configured. Click on it to open its properties.

  3. There are then a couple of steps that need to be completed on the pop-up window that appears:
    • Ensure that the Enabled box is selected.
    • In the Options tab, scroll down to the Show… button and press it. The options at this stage depend upon your specific environment. For example, let’s assume that you have a domain with a file server called MyServer, which is configured for shared access. The most appropriate option, in this case, would be a Value name of \\MyServer\* with a Value of RequireMutualAuthentication=1, RequireIntegrity=1. Another example scenario could be that multiple servers are used for sharing out access to a share called Company. In this case, you could use a Value name of \\*\Company with a Value of RequireMutualAuthentication=1, RequireIntegrity=1. Both of these examples are reproduced in the screenshot below, for your reference. Press the OK button to confirm the UNC path fields and Apply to make the policy change (a registry-based equivalent of this change is sketched after these steps).


  4. The final step will be to enforce a group policy refresh on the target machine and any others on the domain. This can be done by executing gpupdate /force from the Command Prompt and confirming that no errors are generated in the output.
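For standalone machines, the same change can also be scripted, as the Hardened UNC Paths policy ultimately writes its values to the registry. A minimal sketch, reusing the MyServer example from step 3 (run from an elevated PowerShell session and adjust the UNC path to suit your environment):

#Apply the Hardened UNC Paths setting directly via the registry - this is the
#location the Group Policy setting above writes to
$key = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\NetworkProvider\HardenedPaths'
if (-not (Test-Path $key)) {
    New-Item -Path $key -Force | Out-Null
}
New-ItemProperty -Path $key -Name '\\MyServer\*' `
                 -Value 'RequireMutualAuthentication=1, RequireIntegrity=1' `
                 -PropertyType String -Force | Out-Null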

And that’s it! Your Windows domain/computer should now be properly hardened against the vulnerability 🙂

This whole example represents an excellent case study on the importance of regularly reviewing security bulletins or announcements from Microsoft. The process of carrying out Windows updates can often become one of those thankless tasks that can grind the gears of even the most ardent server administrators. With this in mind, it can be expected that a degree of apathy or lack of awareness regarding the context for certain updates can creep in, leading to situations where issues like this only get flagged up during a security audit or similar. I would strongly urge anyone who is still running one or all of the above Operating Systems to check their group policy configuration as soon as possible to verify that the required changes indicated in this post have been applied.

Once upon a time, there was a new cloud service known as Windows Azure. Over time, this cloud service developed with new features, became known more generally as just Azure, embraced the unthinkable from a technology standpoint and also went through a complete platform overhaul. Longstanding Azure users will remember the “classic” portal, with its very…distinctive…user interface:

Image courtesy of Microsoft

As the range of different services offered on Azure increased and the call for more efficient management tools became almost deafening, Microsoft announced the introduction of a new portal experience and Resource Group Management for Azure resources, both of which are now the de facto means of interacting with Azure today. The old-style portal indicated above was officially discontinued earlier this year. In line with these changes, Microsoft introduced new, Resource Manager compatible versions of pretty much every major service available on the “classic” portal…with some notable exceptions. The following “classic” resources can still be created and worked with today using the new Azure portal:

  • Cloud Services (classic)
  • Disks (classic)
  • Network Security groups (classic)
  • Reserved IP Addresses (classic)
  • Storage Accounts (classic)
  • Virtual Networks (classic)
  • VM Images (classic)

This provides accommodation for those who are still operating compute resources dating back to the days of yore, allowing you to create and manage resources that may be needed to ensure the continued success of your existing application deployment. In most cases, you will not want to create these “classic” resources as part of new project work, as the equivalent Resource Manager options should be more than sufficient for your needs. The only question mark around this concerns Cloud Services. There is no equivalent Resource Manager resource available currently, with the recommended option for new deployments being Azure Service Fabric instead. Based on my research online, there appears to be quite a feature gap between the two offerings, with Azure Service Fabric arguably being overkill for more simplistic requirements. There also appears to be some uncertainty over whether Cloud Services are technically considered deprecated or not. I would highly recommend reading Andreas Helland’s blog post on the subject and forming your own opinion from there.

For both experiences, Microsoft provided a full set of automation tools in PowerShell to help developers carry out common tasks on the Azure Portal. These are split out into the standard Azure cmdlets for the “classic” experience and a set of AzureRM cmdlets for the new Resource Management approach. Although the “classic” Azure resource cmdlets are still available and supported, they very much operate in isolation – that is, if you have a requirement to interchangeably create “classic” and Resource Manager resources as part of the same script file, then you are going to encounter some major difficulties and errors. One example of this is that the ability to switch subscriptions that you have access, but not ownership, to becomes nigh on impossible to achieve. For this reason, I would recommend utilising AzureRM cmdlets solely if you ever have a requirement to create classic resources to maintain an existing deployment. To help accommodate this scenario, the New-AzureRmResource cmdlet really becomes your best friend. In a nutshell, it lets you create any Azure Resource of your choosing when executed. The catch around using it is that the exact syntax to utilise as part of the -ResourceType parameter can take some time to discover, particularly in the case of working with “classic” resources. What follows are some code snippets that, hopefully, provide you with a working set of cmdlets to create the “classic” resources listed above.
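As an aside, if you ever need to track down a -ResourceType value yourself, the AzureRM module can enumerate everything a given provider exposes. A short sketch against one of the “classic” providers used below:

#List the resource types (and their supported locations) exposed by one of the
#"classic" resource providers - swap in Microsoft.ClassicCompute,
#Microsoft.ClassicStorage etc. as required
Get-AzureRmResourceProvider -ProviderNamespace 'Microsoft.ClassicNetwork' |
    Select-Object -ExpandProperty ResourceTypes |
    Select-Object ResourceTypeName, Locations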

Before you begin…

To use any of the cmdlets that follow, make sure you have connected to Azure, selected your target subscription and have a Resource Group created to store your resources using the cmdlets below. You can obtain your Subscription ID by navigating to its properties within the Azure portal:

#Replace the parameter values below to suit your requirements

$subscriptionID = '36ef0d35-2775-40f7-b3a1-970a4c23eca2'
$rgName = 'MyResourceGroup'
$location = 'UK South'

Set-ExecutionPolicy Unrestricted
Login-AzureRmAccount
Set-AzureRmContext -SubscriptionId $subscriptionID
#Create Resource Group
New-AzureRmResourceGroup -Name $rgName -Location $location

With this done, you should hopefully encounter no problems executing the cmdlets that follow.

Cloud Services (classic)

#Create an empty Cloud Service (classic) resource in MyResourceGroup in the UK South region

New-AzureRmResource -ResourceName 'MyClassicCloudService' -ResourceGroupName $rgName `
                    -ResourceType 'Microsoft.ClassicCompute/domainNames' -Location $location -Force
                    

Disks (classic)

#Create a Disk (classic) resource using a Linux operating system in MyResourceGroup in the UK South region.
#Needs a valid VHD in a compatible storage account to work correctly

New-AzureRmResource -ResourceName 'MyClassicDisk' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicStorage/storageaccounts/disks' `
                    -Location $location `
                    -PropertyObject @{'DiskName'='MyClassicDisk' 
                    'Label'='My Classic Disk' 
                    'VhdUri'='https://mystorageaccount.blob.core.windows.net/mycontainer/myvhd.vhd'
                    'OperatingSystem' = 'Linux'
                    } -Force
                    

Network Security groups (classic)

#Create a Network Security Group (classic) resource in MyResourceGroup in the UK South region.

New-AzureRmResource -ResourceName 'MyNSG' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/networkSecurityGroups' `
                    -Location $location -Force
                    

Reserved IP Addresses (classic)

#Create a Reserved IP (classic) resource in MyResourceGroup in the UK South region.

New-AzureRmResource -ResourceName 'MyReservedIP' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/reservedIps' `
                    -Location $location -Force
                    

Storage Accounts (classic)

#Create a Storage Account (classic) resource in MyResourceGroup in the UK South region.
#Storage account will use Standard Locally Redundant Storage

New-AzureRmResource -ResourceName 'MyStorageAccount' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicStorage/StorageAccounts' `
                    -Location $location -PropertyObject @{'AccountType' = 'Standard-LRS'} -Force
                    

Virtual Networks (classic)

#Create a Virtual Network (classic) resource in MyResourceGroup in the UK South Region

New-AzureRmResource -ResourceName 'MyVNET' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/virtualNetworks' `
                    -Location $location `
                    -PropertyObject @{'AddressSpace' = @{'AddressPrefixes' = '10.0.0.0/16'}
                                      'Subnets' = @{'name' = 'MySubnet'
                                                    'AddressPrefix' = '10.0.0.0/24'
                                                    }
                                     }
                                     

VM Images (classic)

#Create a VM image (classic) resource in MyResourceGroup in the UK South region.
#Needs a valid VHD in a compatible storage account to work correctly

New-AzureRmResource -ResourceName 'MyVMImage' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicStorage/storageAccounts/vmImages' `
                    -Location $location `
                    -PropertyObject @{'Label' = 'MyVMImage Label'
                    'Description' = 'MyVMImage Description'
                    'OperatingSystemDisk' = @{'OsState' = 'Specialized'
                                              'Caching' = 'ReadOnly'
                                              'OperatingSystem' = 'Windows'
                                              'VhdUri' = 'https://mystorageaccount.blob.core.windows.net/mycontainer/myvhd.vhd'}
                    }

Conclusions or Wot I Think

The requirement to work with the cmdlets shown in this post should only really be a concern for those who are maintaining “classic” resources as part of an ongoing deployment. It is therefore important to emphasise not to use these cmdlets to create resources for new projects. Alongside the additional complexity involved in constructing the New-AzureRmResource cmdlet, there is an abundance of new, updated AzureRM cmdlets at your disposal that enable you to create the correct types of resources more intuitively. The key benefit that these examples provide is the ability to use a single Azure PowerShell module for the management of your entire Azure estate, as opposed to having to switch back and forth between different modules. It is perhaps a testament to how flexible Azure is that cmdlets like New-AzureRmResource exist in the first place, ultimately enabling anybody to fine-tune deployment and maintenance scripts to suit any conceivable situation.

The Voice of the Customer (VoC) add-on solution for Dynamics 365 Customer Engagement (D365CE) presents a really nice way of incorporating survey capabilities within your existing Dynamics application estate, without any additional cost or significant administrative overhead. I’ve talked about the tool previously, within the context of specific application errors, and I can attest to its capabilities – both as a standalone solution and as one that can be leveraged alongside other D365CE functionality to generate additional value.

One feature that is particularly useful is the ability to include diverse Survey Response controls. These can cover the range of anticipated user inputs that most web developers would be used to – text inputs, ratings, date pickers etc. – along with more marketing-specific choices such as Net Promoter Score and even a Smilies rating control. The final one of these really does have to be seen to be wholly appreciated:

I hope you agree that this is definitely one of those features that becomes so fun that it soaks up WAY more time than necessary 🙂

One of the final options that VoC provides you is the ability to upload files to a Survey Response, which is stored within the application and made retrievable at any time by locating the appropriate Survey Response record. You can customise the guidance text presented to the user for this control, such as in the example below:

Uploaded files are then saved onto an Azure Blob Storage location (which you don’t have direct access to), with the access URL stored within D365CE. The inclusion of this feature does provide the capability to accommodate several potential business scenarios, such as:

  • Allowing a service desk to create an automated survey that allows error logs or screenshots to be uploaded for further diagnosis.
  • The gathering of useful photographic information as part of a pre-qualification process for a product installation.
  • Enabling customers to upload a photo that provides additional context relating to their experience – either positive or negative.

Putting all of this aside, however, there are a few things that you should bear in mind when first evaluating this feature for your particular requirements. What follows is my list of the major things to be aware of, along with some tips to sidestep any issues.

Privacy concerns…

To better understand why this is relevant, it helps to be aware of exactly how files can be stored on Azure. Azure Blob Storage works on the principle of “blobs” (i.e. files), which can only be created within a corresponding storage container. These can be configured using a couple of different options, depending on how you would like to access your data, which is elaborated upon in this really helpful article:

You can configure a container with the following permissions:

  • No public read access: The container and its blobs can be accessed only by the storage account owner. This is the default for all new containers.

  • Public read access for blobs only: Blobs within the container can be read by anonymous request, but container data is not available. Anonymous clients cannot enumerate the blobs within the container.

  • Full public read access: All container and blob data can be read by anonymous request. Clients can enumerate blobs within the container by anonymous request, but cannot enumerate containers within the storage account.
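For context, these three access levels map onto the -Permission parameter of the classic Azure.Storage PowerShell module when creating a container. A quick sketch, in which the storage account name and key are placeholders:

#The three public access levels, expressed when creating containers via PowerShell
$key = '<storage account access key>'
$ctx = New-AzureStorageContext -StorageAccountName 'mystorageaccount' -StorageAccountKey $key
New-AzureStorageContainer -Name 'privatefiles' -Permission Off -Context $ctx       #No public read access
New-AzureStorageContainer -Name 'blobsonly' -Permission Blob -Context $ctx         #Public read access for blobs only
New-AzureStorageContainer -Name 'fullpublic' -Permission Container -Context $ctx   #Full public read access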

Presumably to mitigate the need for complex deployments of the VoC solution, all uploaded Survey Response files are saved in Full public read access storage containers, meaning that anyone with the URL can access these files. And, as mentioned already, administrators have no direct access to the Azure Storage Account to modify these permissions, potentially compounding this access problem. Now, before you panic too much, the VoC solution deliberately structures the uploaded file path in the following format:

https://<VoC Region Identifier>.blob.core.windows.net/<Survey Response GUID>-files/<File GUID>-<Example File>

The degree of complexity added here goes a long way towards satisfying any privacy concerns – it would be practically impossible for a human being or computer to guess what a particular uploaded file path is, even if they did have the Survey Response record GUID – but this still does not address the fact that the URL can be freely accessed and shared by anyone with sufficient permissions over the Survey Response entity in D365CE. You should, therefore, take appropriate care when scoping your security privileges within D365CE and look towards carrying out a Privacy Impact Assessment (PIA) over the type of data you are collecting via the upload file control.

…even after you delete a Survey Response.

As mentioned above, the Blob Storage URL is tagged to the Survey Response record within D365CE. So what happens when you delete this record? The answer, courtesy of Microsoft via a support request:

Deleting Survey Response should delete the file uploaded as part of the Survey Response.

Based on my testing, however, this does not look to be the case. My understanding of the VoC solution is that it needs to regularly synchronise with components located on Azure, which can lead to a delay in certain actions completing (publishing a Survey, creating a Survey Response record etc.). However, a file from a Survey Response test record that I deleted remained accessible via its URL up to 8 hours after completing this action. This, evidently, raises a concern over what level of control you have over potentially critical and sensitive data types that may be included in uploaded files. I would urge you to carry out your own analysis as part of a PIA to sufficiently gauge what impact, if any, this may have on your data collection (and, more critically, storage) activities.
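If you would like to verify this behaviour in your own environment, one crude but effective check is to issue an HTTP HEAD request against the stored file URL after deleting the Survey Response record. A sketch, using a made-up URL that follows the format shown earlier:

#Probe the uploaded file's URL after deleting the Survey Response record to
#confirm whether the underlying blob has actually been removed ($fileUrl is a
#fictitious example - substitute the URL from your own Survey Response record)
$fileUrl = 'https://example.blob.core.windows.net/00000000-0000-0000-0000-000000000000-files/example.png'
try {
    $response = Invoke-WebRequest -Uri $fileUrl -Method Head -UseBasicParsing
    Write-Output "Blob is still accessible (HTTP $($response.StatusCode))"
}
catch {
    Write-Output "Blob is no longer accessible: $($_.Exception.Message)"
}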

Restrictions

For the most part, file upload controls are not a heavily constrained feature, but it is worthwhile to keep the following restrictions in mind:

  • Executable file types are not permitted for upload (.exe, .ps1, .bat etc.)
  • Larger file types may not upload successfully, generating 404 server errors within the control. There is not a documented size limitation, but my testing would indicate that files as big as 60MB will not upload correctly.
  • Only one file upload control is allowed per survey.

The last of these limitations is perhaps the most significant constraint. If you do have a requirement for separate files to be uploaded, then the best option is to provide instructions on the survey, advising users to compress their files into a single .zip archive before upload.
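On recent versions of Windows, PowerShell can produce the required archive in a single line, which may be handy if you are preparing files on a user’s behalf. A quick sketch, with example paths:

#Bundle multiple files into the single .zip archive the upload control expects
#(Compress-Archive requires PowerShell 5.0 or later; the paths are examples only)
Compress-Archive -Path 'C:\SurveyEvidence\*' -DestinationPath 'C:\SurveyEvidence\Upload.zip'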

Conclusions or Wot I Think

Despite what this post may be leaning towards, I very much believe that the VoC solution and, in particular, the ability to upload Survey Response files, works exactly as intended. Going a step further, when viewed from a technical standpoint, I would even say that its method of execution is wholly justified. With the advent of the General Data Protection Regulations (GDPR) earlier this year, current attention is all around ensuring that appropriate access controls over data have been properly implemented, so that the privacy of individuals is fully upheld. Here is where the solution begins to fall over to a degree, and evidence of the journey that VoC has made in becoming part of the Dynamics 365 “family” becomes most apparent. As can be expected, any product derived from an external acquisition will always present challenges when being “smushed” together with a new application system. I have been informed that there is an update coming to the VoC solution in August this year, with a range of new features that may address some of the data privacy concerns highlighted earlier. For example, the option will be provided for administrators to delete any uploaded file within a Survey Response on demand. Changes like this will surely go a long way towards providing the appropriate opportunities for VoC to be fully utilised by businesses looking to integrate an effective, GDPR-proof customer survey tool.