On February 10th 2015, Microsoft published Security Bulletin MS15-011, which detailed a recently discovered critical flaw in every major version of Windows from Server 2003 right the way through to Windows 8.1. The flaw, relating to how Group Policy handles data, potentially allows:

…remote code execution if an attacker convinces a user with a domain-configured system to connect to an attacker-controlled network. An attacker who successfully exploited this vulnerability could take complete control of an affected system. An attacker could then install programs; view, change, or delete data; or create new accounts with full user rights.

Microsoft was quick to release a corresponding security patch via the Windows Update service, and the update for each affected Operating System can be downloaded and installed via the links below:

Now, you may be asking at this point, why are you posting about this now in mid-2018? Surely the vulnerability has been addressed on all affected systems that are patched regularly? Well, as I found out very recently, this security patch is one of many released by Microsoft that require additional administrator intervention after installation to ensure that the exploit hole is properly filled. The modifications required can be completed either via the Local Group Policy Editor, for single machines not joined to a domain, or via the Group Policy Management Console from a domain controller. The second option is naturally preferred if you are managing a large estate of Windows machines. Below are summarised steps that should provide the appropriate guidance on applying the fix for both environments:

  1. Open the Group Policy Management Editor (or the Local Group Policy Editor for standalone machines) and expand the Computer Configuration/Policies/Administrative Templates/Network/Network Provider folder path. (Note that on a standalone machine, the Policies level is absent from this path.)

  2. In the right-hand pane, you should see an item called Hardened UNC Paths, marked in a state of Not configured. Click on it to open its properties.

  3. There are then a couple of steps that need to be completed on the pop-up window that appears:
    • Ensure that the Enabled box is selected.
    • In the Options tab, scroll down to the Show… button and press it. The options at this stage depend upon your specific environment. For example, let’s assume that you have a domain with a file server called MyServer, which is configured for shared access. The most appropriate option, in this case, would be a Value name of \\MyServer\* with a Value of RequireMutualAuthentication=1, RequireIntegrity=1. Another example scenario could be that multiple servers are used to share out access to a share called Company. In this case, you could use a Value name of \\*\Company with a Value of RequireMutualAuthentication=1, RequireIntegrity=1. Both of these examples are reproduced in the screenshot below, for your reference. Press the OK button to confirm the UNC path fields and Apply to make the policy change.


  4. The final step will be to enforce a group policy refresh on the target machine and any others on the domain. This can be done by running gpupdate /force from a Command Prompt and confirming that no errors are generated in the output (a scripted equivalent of the whole change is sketched below).
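
For those who prefer to script the change, the Hardened UNC Paths policy setting ultimately writes values to the HardenedPaths registry key, so the same settings can be applied locally with a few lines of PowerShell. A minimal sketch, using the example server and share names from step 3 (adjust these to suit your environment):

#Apply the Hardened UNC Paths settings directly via the registry
#Equivalent to enabling the "Hardened UNC Paths" Group Policy setting locally
$key = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\NetworkProvider\HardenedPaths'

#Create the key if it does not already exist
if (-not (Test-Path $key)) {
    New-Item -Path $key -Force | Out-Null
}

#Example values, matching the two scenarios described in step 3
New-ItemProperty -Path $key -Name '\\MyServer\*' -Value 'RequireMutualAuthentication=1, RequireIntegrity=1' -PropertyType String -Force | Out-Null
New-ItemProperty -Path $key -Name '\\*\Company' -Value 'RequireMutualAuthentication=1, RequireIntegrity=1' -PropertyType String -Force | Out-Null

#Refresh Group Policy and check the output for errors
gpupdate /force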

And that’s it! Your Windows domain/computer should now be properly hardened against the vulnerability 🙂

This whole example represents an excellent case study on the importance of regularly reviewing security bulletins or announcements from Microsoft. The process of carrying out Windows updates can often become one of those thankless tasks that can grind the gears of even the most ardent server administrators. With this in mind, it can be expected that a degree of apathy or lack of awareness regarding the context for certain updates can creep in, leading to situations where issues like this only get flagged up during a security audit or similar. I would strongly urge anyone who is still running one or all of the above Operating Systems to check their group policy configuration as soon as possible to verify that the required changes indicated in this post have been applied.

Once upon a time, there was a new cloud service known as Windows Azure. Over time, this cloud service developed with new features, became known more generally as just Azure, embraced the unthinkable from a technology standpoint and also went through a complete platform overhaul. Longstanding Azure users will remember the “classic” portal, with its very…distinctive…user interface:

Image courtesy of Microsoft

As the range of different services offered on Azure increased and the call for more efficient management tools became almost deafening, Microsoft announced the introduction of a new portal experience and Resource Group Management for Azure resources, both of which are now the de facto means of interacting with Azure today. The old-style portal indicated above was officially discontinued earlier this year. In line with these changes, Microsoft introduced new, Resource Manager compatible versions of pretty much every major service available on the “classic” portal…with some notable exceptions. The following “classic” resources can still be created and worked with today using the new Azure portal:

  • Cloud Services (classic)
  • Disks (classic)
  • Network Security Groups (classic)
  • Reserved IP Addresses (classic)
  • Storage Accounts (classic)
  • Virtual Networks (classic)
  • VM Images (classic)

This provides accommodation for those who are still operating compute resources dating back to the days of yonder, allowing you to create and manage resources that may be needed to ensure the continued success of your existing application deployment. In most cases, you will not want to create these “classic” resources as part of new project work, as the equivalent Resource Manager options should be more than sufficient for your needs. The only question mark around this concerns Cloud Services. There is currently no equivalent Resource Manager resource available, with the recommended option for new deployments being Azure Service Fabric instead. Based on my research online, there appears to be quite a gap in features between the two offerings, with Azure Service Fabric arguably being overkill for more simplistic requirements. There also appears to be some uncertainty over whether Cloud Services are technically considered deprecated or not. I would highly recommend reading Andreas Helland’s blog post on the subject and forming your own opinion from there.

For both experiences, Microsoft provided a full set of automation tools in PowerShell to help developers carry out common tasks against Azure. These are split out into the standard Azure cmdlets for the “classic” experience and a set of AzureRM cmdlets for the new Resource Management approach. Although the “classic” Azure resource cmdlets are still available and supported, they very much operate in isolation – that is, if you have a requirement to interchangeably create “classic” and Resource Manager resources as part of the same script file, then you are going to encounter some major difficulties and errors. One example of this is that the ability to switch to subscriptions that you have access, but not ownership, to becomes nigh on impossible to achieve. For this reason, if you ever have a requirement to create “classic” resources to maintain an existing deployment, I would recommend utilising the AzureRM cmdlets solely. To help accommodate this scenario, the New-AzureRmResource cmdlet really becomes your best friend. In a nutshell, it lets you create any Azure resource of your choosing when executed. The catch is that the exact syntax to use as part of the -ResourceType parameter can take some time to discover, particularly when working with “classic” resources. What follows are some code snippets that, hopefully, provide you with a working set of cmdlets to create the “classic” resources listed above.
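
As a side note, if you ever need to discover valid -ResourceType values for yourself, the resource providers can be interrogated directly. A quick sketch using Get-AzureRmResourceProvider (the provider namespace shown is just one example):

#List the resource types and supported API versions exposed by the
#"classic" compute resource provider
Get-AzureRmResourceProvider -ProviderNamespace 'Microsoft.ClassicCompute' |
    Select-Object -ExpandProperty ResourceTypes |
    Select-Object ResourceTypeName, ApiVersions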

Before you begin…

To use any of the cmdlets that follow, make sure you have connected to Azure, selected your target subscription and have a Resource Group created to store your resources using the cmdlets below. You can obtain your Subscription ID by navigating to its properties within the Azure portal:

#Replace the parameter values below to suit your requirements

$subscriptionID = '36ef0d35-2775-40f7-b3a1-970a4c23eca2'
$rgName = 'MyResourceGroup'
$location = 'UK South'

#Allow scripts to run and sign in to Azure
Set-ExecutionPolicy Unrestricted
Login-AzureRmAccount

#Select the target subscription
Set-AzureRmContext -SubscriptionId $subscriptionID

#Create the Resource Group that will hold the resources
New-AzureRmResourceGroup -Name $rgName -Location $location

With this done, you should hopefully encounter no problems executing the cmdlets that follow.
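
If you want to double-check that the correct subscription is now in context before creating anything, a quick call to Get-AzureRmContext will confirm it:

#Confirm the account and subscription currently in context
Get-AzureRmContext | Select-Object Account, Subscription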

Cloud Services (classic)

#Create an empty Cloud Service (classic) resource in MyResourceGroup in the UK South region

New-AzureRmResource -ResourceName 'MyClassicCloudService' -ResourceGroupName $rgName `
                    -ResourceType 'Microsoft.ClassicCompute/domainNames' -Location $location -Force
                    

Disks (classic)

#Create a Disk (classic) resource using a Linux operating system in MyResourceGroup in the UK South region.
#Needs a valid VHD in a compatible storage account to work correctly

New-AzureRmResource -ResourceName 'MyClassicDisk' -ResourceGroupName $rgName `
                    -ResourceType 'Microsoft.ClassicStorage/storageaccounts/disks' `
                    -Location $location `
                    -PropertyObject @{'DiskName' = 'MyClassicDisk'
                                      'Label' = 'My Classic Disk'
                                      'VhdUri' = 'https://mystorageaccount.blob.core.windows.net/mycontainer/myvhd.vhd'
                                      'OperatingSystem' = 'Linux'
                                     } -Force
                    

Network Security groups (classic)

#Create a Network Security Group (classic) resource in MyResourceGroup in the UK South region.

New-AzureRmResource -ResourceName 'MyNSG' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/networkSecurityGroups' `
                    -Location $location -Force
                    

Reserved IP Addresses (classic)

#Create a Reserved IP (classic) resource in MyResourceGroup in the UK South region.

New-AzureRmResource -ResourceName 'MyReservedIP' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/reservedIps' `
                    -Location $location -Force
                    

Storage Accounts (classic)

#Create a Storage Account (classic) resource in MyResourceGroup in the UK South region.
#Storage account will use Standard Locally Redundant Storage

New-AzureRmResource -ResourceName 'MyStorageAccount' -ResourceGroupName $rgName `
                    -ResourceType 'Microsoft.ClassicStorage/StorageAccounts' `
                    -Location $location -PropertyObject @{'AccountType' = 'Standard-LRS'} -Force
                    

Virtual Networks (classic)

#Create a Virtual Network (classic) resource in MyResourceGroup in the UK South Region

New-AzureRmResource -ResourceName 'MyVNET' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/virtualNetworks' `
                    -Location $location `
                    -PropertyObject @{'AddressSpace' = @{'AddressPrefixes' = @('10.0.0.0/16')}
                                      'Subnets' = @(@{'name' = 'MySubnet'
                                                      'AddressPrefix' = '10.0.0.0/24'
                                                     })
                                     } -Force
                                     

VM Images (classic)

#Create a VM image (classic) resource in MyResourceGroup in the UK South region.
#Needs a valid VHD in a compatible storage account to work correctly

New-AzureRmResource -ResourceName 'MyVMImage' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicStorage/storageAccounts/vmImages' `
                    -Location $location `
                    -PropertyObject @{'Label' = 'MyVMImage Label'
                                      'Description' = 'MyVMImage Description'
                                      'OperatingSystemDisk' = @{'OsState' = 'Specialized'
                                                                'Caching' = 'ReadOnly'
                                                                'OperatingSystem' = 'Windows'
                                                                'VhdUri' = 'https://mystorageaccount.blob.core.windows.net/mycontainer/myvhd.vhd'}
                                     } -Force
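
Finally, once you have executed any of the snippets above, it is worth confirming that the resources have landed in the Resource Group as expected. A quick check (assuming a reasonably recent version of the AzureRM module, where Get-AzureRmResource supports the -ResourceGroupName parameter):

#List all resources created in the Resource Group, along with their types
Get-AzureRmResource -ResourceGroupName $rgName |
    Select-Object Name, ResourceType, Location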

Conclusions or Wot I Think

The requirement to work with the cmdlets shown in this post should only really be a concern for those who are maintaining “classic” resources as part of an ongoing deployment. It is therefore important to emphasise that you should not use these cmdlets to create resources for new projects. Alongside the additional complexity involved in constructing the New-AzureRmResource cmdlet, there is an abundance of new, updated AzureRM cmdlets at your disposal that enable you to create the correct types of resources more intuitively. The key benefit that these examples provide is the ability to use a single Azure PowerShell module for the management of your entire Azure estate, as opposed to having to switch back and forth between different modules. It is perhaps a testament to how flexible Azure is that cmdlets like New-AzureRmResource exist in the first place, ultimately enabling anybody to fine-tune deployment and maintenance scripts to suit any conceivable situation.

The Voice of the Customer (VoC) add-on solution for Dynamics 365 Customer Engagement (D365CE) presents a really nice way of incorporating survey capabilities within your existing Dynamics application estate, without any additional cost or significant administrative overhead. I’ve talked about the tool previously, within the context of specific application errors, and I can attest to its capabilities – both as a standalone solution and as one that can be leveraged alongside other D365CE functionality to generate additional value.

One feature that is particularly useful is the ability to include diverse Survey Response controls. This can cover the range of anticipated user inputs that most web developers would be used to – text inputs, ratings, date pickers etc. – along with more marketing-specific choices such as Net Promoter Score and even a Smilies rating control. The final one of these really does have to be seen to be wholly appreciated:

I hope you agree that this is definitely one of those features that becomes so fun that it soaks up WAY more time than necessary 🙂

One of the final options that VoC provides is the ability to upload files as part of a Survey Response; these are stored within the application and retrievable at any time by locating the appropriate Survey Response record. You can customise the guidance text presented to the user for this control, such as in the example below:

Uploaded files are then saved onto an Azure Blob Storage location (which you don’t have direct access to), with the access URL stored within D365CE. The inclusion of this feature does provide the capability to accommodate several potential business scenarios, such as:

  • Allowing a service desk to create an automated survey that allows error logs or screenshots to be uploaded for further diagnosis.
  • The gathering of useful photographic information as part of a pre-qualification process for a product installation.
  • Enabling customers to upload a photo that provides additional context relating to their experience – either positive or negative.

Putting all of this aside, however, there are a few things that you should bear in mind when first evaluating this feature for your particular requirements. What follows is my list of the major things to be aware of, along with some tips to sidestep any issues.

Privacy concerns…

To better understand why this is relevant, it helps to be aware of exactly how files can be stored on Azure. Azure Blob Storage works on the principle of “blobs” (i.e. files), which can only be created within a corresponding storage container. These can be configured using a couple of different options, depending on how you would like to access your data, which is elaborated upon in this really helpful article:

You can configure a container with the following permissions:

  • No public read access: The container and its blobs can be accessed only by the storage account owner. This is the default for all new containers.

  • Public read access for blobs only: Blobs within the container can be read by anonymous request, but container data is not available. Anonymous clients cannot enumerate the blobs within the container.

  • Full public read access: All container and blob data can be read by anonymous request. Clients can enumerate blobs within the container by anonymous request, but cannot enumerate containers within the storage account.
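
For containers within a storage account that you do control, these access levels can be toggled using the classic Azure storage cmdlets. A minimal sketch for illustration purposes (the storage account name, key and container name are placeholders):

#Build a context for the storage account (key redacted)
$ctx = New-AzureStorageContext -StorageAccountName 'mystorageaccount' -StorageAccountKey '<storage account key>'

#Off = no public read access, Blob = blobs only, Container = full public read
Set-AzureStorageContainerAcl -Name 'mycontainer' -Permission Off -Context $ctx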

Presumably to mitigate the need for complex deployments of the VoC solution, all uploaded Survey Response files are saved in Full public read access storage containers, meaning that anyone with the URL can access these files. And, as mentioned already, administrators have no direct access to the Azure Storage Account to modify these permissions, potentially compounding this access problem. Now, before you panic too much, the VoC solution deliberately structures the uploaded file path in the following format:

https://<VoC Region Identifier>.blob.core.windows.net/<Survey Response GUID>-files/<File GUID>-<Example File>

This degree of obfuscation goes a long way towards satisfying most privacy concerns – it would be practically impossible for a human being or computer to guess what a particular uploaded file path is, even with the Survey Response record GUID in hand – but this still does not address the fact that the URL can be freely accessed and shared by anyone with sufficient permissions over the Survey Response entity in D365CE. You should, therefore, take appropriate care when scoping your security privileges within D365CE and look towards carrying out a Privacy Impact Assessment (PIA) over the type of data you are collecting via the upload file control.

…even after you delete a Survey Response.

As mentioned above, the Blob Storage URL is tagged to the Survey Response record within D365CE. So what happens when you delete this record? The answer, courtesy of Microsoft via a support request:

Deleting Survey Response should delete the file uploaded as part of the Survey Response

Based on my testing, however, this does not look to be the case. My understanding of the VoC solution is that it needs to regularly synchronise with components located on Azure, which can lead to a delay in certain actions completing (publishing a Survey, creating a Survey Response record etc.). However, a file from a Survey Response test record that I deleted was still accessible via its URL 8 hours after completing this action. This, evidently, raises a concern over what level of control you have over potentially critical and sensitive data types that may be included in uploaded files. I would urge you to carry out your own analysis as part of a PIA to sufficiently gauge what impact, if any, this may have on your data collection (and, more critically, storage) activities.
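
If you want to verify this behaviour for yourself, a simple HTTP HEAD request against the file’s URL will confirm whether it remains publicly accessible after the Survey Response record has been deleted. A rough sketch (the URL is a made-up example following the format shown earlier):

#Check whether an uploaded file's blob URL still responds after deletion
$fileUrl = 'https://myvocregion.blob.core.windows.net/surveyresponseguid-files/fileguid-example.png'

try {
    $response = Invoke-WebRequest -Uri $fileUrl -Method Head -UseBasicParsing
    Write-Host "File is still publicly accessible (HTTP $($response.StatusCode))"
}
catch {
    Write-Host "File is no longer accessible: $($_.Exception.Message)"
}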

Restrictions

For the most part, file upload controls are not a heavily constrained feature, but it is worthwhile to keep the following restrictions in mind:

  • Executable file types are not permitted for upload (.exe, .ps1, .bat etc.)
  • Larger files may not upload successfully, generating 404 server errors within the control. There is no documented size limitation, but my testing would indicate that files as large as 60MB will not upload correctly.
  • Only one file upload control is allowed per survey.

The last of these limitations is perhaps the most significant constraint. If you do have a requirement for multiple files to be uploaded, then the best option is to provide instructions on the survey, advising users to compress their files into a single .zip archive before upload.

Conclusions or Wot I Think

Despite what this post may be leaning towards, I very much believe the VoC solution and, in particular, the ability to upload Survey Response files, is in perfectly good working order. Going a step further, when viewed from a technical standpoint, I would even say that its method of execution is wholly justified. With the advent of the General Data Protection Regulation (GDPR) earlier this year, however, current attention is focused on ensuring that appropriate access controls over data have been properly implemented, so that the privacy of individuals is fully upheld. Here is where the solution begins to fall over to a degree, and evidence of the journey that VoC has made in becoming part of the Dynamics 365 “family” becomes most apparent. As can be expected, any product which derives from an external acquisition will always present challenges when being “smushed” together with a new application system. I have been informed that there is an update coming to the VoC solution in August this year, with a range of new features that may address some of the data privacy concerns highlighted earlier. For example, administrators will be given the option to delete any uploaded file within a Survey Response on demand. Changes like this will surely go a long way towards providing the appropriate opportunities for VoC to be fully utilised by businesses looking to integrate an effective, GDPR-proof customer survey tool.