On February 10th 2015, Microsoft published Security Bulletin MS15-011, which detailed a recently discovered critical flaw in every major version of Windows from Server 2003 right the way through to Windows 8.1. The flaw, relating to how Group Policy handles data, potentially allows:

…remote code execution if an attacker convinces a user with a domain-configured system to connect to an attacker-controlled network. An attacker who successfully exploited this vulnerability could take complete control of an affected system. An attacker could then install programs; view, change, or delete data; or create new accounts with full user rights.

Microsoft was quick to release a corresponding security patch via the Windows Update service, and each operating system update can be downloaded and installed via the links below:

Now, you may be asking at this point, why are you posting about this now in mid-2018? Surely the vulnerability has been addressed on all affected systems that are patched regularly? Well, as I found out very recently, this security patch is one of many released by Microsoft that requires additional administrator intervention after installation to ensure that the exploit hole is properly closed. The modifications required can be completed either via the Local Group Policy Editor, for single machines not joined to a domain, or the Group Policy Management Console from a domain controller. The second option is naturally preferred if you are managing a large estate of Windows machines. Below are summarised steps that should provide the appropriate guidance on applying the fix in both environments:

  1. Navigate to the Group Policy Management Editor and expand the Computer Configuration/Policies/Administrative Templates/Network/Network Provider folder path.

  2. In the right-hand pane, you should see an item called Hardened UNC Paths, marked in a state of Not configured. Click on it to open its properties.

  3. There are then a couple of steps that need to be completed on the pop-up window that appears:
    • Ensure that the Enabled box is selected.
    • In the Options tab, scroll down to the Show… button and press it. The options at this stage depend upon your specific environment. For example, let’s assume that you have a domain with a file server called MyServer, which is configured for shared access. The most appropriate option, in this case, would be a Value name of \\MyServer\* with a Value of RequireMutualAuthentication=1, RequireIntegrity=1. Another example scenario could be that multiple servers are used to share out access to a share called Company. In this case, you could use a Value name of \\*\Company with a Value of RequireMutualAuthentication=1, RequireIntegrity=1. Both of these examples are reproduced in the screenshot below, for your reference. Press the OK button to confirm the UNC path fields and Apply to make the policy change.

 

  4. The final step will be to enforce a group policy refresh on the target machine and any others on the domain. This can be done by running gpupdate /force from a Command Prompt and confirming that no errors are generated in the output. For reference, a scripted equivalent of the whole change is sketched below.
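The sketch that follows is a minimal PowerShell equivalent for a single machine, assuming the HardenedPaths policy registry key that the Group Policy setting is documented (in KB3000483) to write to, and reusing the two example values from step 3. Treat it as illustrative only – the Group Policy route above remains the preferred approach for domain-joined estates.

#Path of the Hardened UNC Paths policy key (assumed from the MS15-011/KB3000483 guidance)
$hardenedPaths = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\NetworkProvider\HardenedPaths'

#Create the key if it does not already exist
if (-not (Test-Path $hardenedPaths)) {
    New-Item -Path $hardenedPaths -Force | Out-Null
}

#Mirror the two example values from step 3
New-ItemProperty -Path $hardenedPaths -Name '\\MyServer\*' -PropertyType String `
                 -Value 'RequireMutualAuthentication=1, RequireIntegrity=1' -Force
New-ItemProperty -Path $hardenedPaths -Name '\\*\Company' -PropertyType String `
                 -Value 'RequireMutualAuthentication=1, RequireIntegrity=1' -Force

#Refresh group policy, as per step 4
gpupdate /force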

And that’s it! Your Windows domain/computer should now be properly hardened against the vulnerability 🙂

This whole example represents an excellent case study on the importance of regularly reviewing security bulletins or announcements from Microsoft. The process of carrying out Windows updates can often become one of those thankless tasks that can grind the gears of even the most ardent server administrators. With this in mind, it can be expected that a degree of apathy or lack of awareness regarding the context for certain updates can creep in, leading to situations where issues like this only get flagged up during a security audit or similar. I would strongly urge anyone who is still running one or all of the above Operating Systems to check their group policy configuration as soon as possible to verify that the required changes indicated in this post have been applied.

Once upon a time, there was a new cloud service known as Windows Azure. Over time, this cloud service developed with new features, became known more generally as just Azure, embraced the unthinkable from a technology standpoint and also went through a complete platform overhaul. Longstanding Azure users will remember the “classic” portal, with its very…distinctive…user interface:

Image courtesy of Microsoft

As the range of different services offered on Azure increased and the call for more efficient management tools became almost deafening, Microsoft announced the introduction of a new portal experience and Resource Group Management for Azure resources, both of which are now the de facto means of interacting with Azure today. The old-style portal shown above was officially discontinued earlier this year. In line with these changes, Microsoft introduced new, Resource Manager-compatible versions of pretty much every major service available on the “classic” portal…with some notable exceptions. The following “classic” resources can still be created and worked with today using the new Azure portal:

This provides accommodation for those who are still operating compute resources dating back to the days of yore, allowing you to create and manage resources that may be needed to ensure the continued success of your existing application deployment. In most cases, you will not want to create these “classic” resources as part of new project work, as the equivalent Resource Manager options should be more than sufficient for your needs. The only question mark around this concerns Cloud Services. There is currently no equivalent Resource Manager resource available, with the recommended option for new deployments being Azure Service Fabric instead. Based on my research online, there appears to be quite a feature gap between the two offerings, with Azure Service Fabric arguably being overkill for simpler requirements. There also appears to be some uncertainty over whether Cloud Services are technically considered deprecated or not. I would highly recommend reading Andreas Helland’s blog post on the subject and forming your own opinion from there.

For both experiences, Microsoft provided a full set of automation tools in PowerShell to help developers carry out common tasks on the Azure Portal. These are split out into the standard Azure cmdlets for the “classic” experience and a set of AzureRM cmdlets for the new Resource Management approach. Although the “classic” Azure resource cmdlets are still available and supported, they very much operate in isolation – that is, if you have a requirement to interchangeably create “classic” and Resource Manager resources as part of the same script file, then you are going to encounter some major difficulties and errors. One example of this is that switching to subscriptions that you have access to, but not ownership of, becomes nigh on impossible to achieve. For this reason, I would recommend utilising the AzureRM cmdlets exclusively if you ever have a requirement to create “classic” resources to maintain an existing deployment. To help accommodate this scenario, the New-AzureRmResource cmdlet really becomes your best friend. In a nutshell, it lets you create any Azure resource of your choosing when executed. The catch is that the exact syntax to use as part of the -ResourceType parameter can take some time to discover, particularly when working with “classic” resources. What follows are some code snippets that, hopefully, provide you with a working set of cmdlets to create the “classic” resources highlighted in the screenshot above.
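One trick that can speed up this discovery is to interrogate the “classic” resource providers directly. The hedged sketch below assumes you have already connected to Azure (as covered in the next section) and lists the resource types exposed by the three providers used in the examples that follow:

#List the resource types exposed by the "classic" resource providers - the provider
#namespace plus the ResourceTypeName gives the string used for -ResourceType
'Microsoft.ClassicCompute', 'Microsoft.ClassicStorage', 'Microsoft.ClassicNetwork' |
    ForEach-Object { Get-AzureRmResourceProvider -ProviderNamespace $_ } |
    Select-Object -ExpandProperty ResourceTypes |
    Select-Object ResourceTypeName, Locations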

Before you begin…

To use any of the cmdlets that follow, make sure you have connected to Azure, selected your target subscription and have a Resource Group created to store your resources using the cmdlets below. You can obtain your Subscription ID by navigating to its properties within the Azure portal:

#Replace the parameter values below to suit your requirements

$subscriptionID = '36ef0d35-2775-40f7-b3a1-970a4c23eca2'
$rgName = 'MyResourceGroup'
$location = 'UK South'

Set-ExecutionPolicy Unrestricted
Login-AzureRmAccount
Set-AzureRmContext -SubscriptionId $subscriptionID
#Create Resource Group
New-AzureRmResourceGroup -Name $rgName -Location $location
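If you are unsure of which Subscription ID to use, then – once logged in via Login-AzureRmAccount – the following should list every subscription (and its ID) that your account can see, as a quick alternative to digging through the portal:

#List all subscriptions available to the logged-in account
Get-AzureRmSubscription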

With this done, you should hopefully encounter no problems executing the cmdlets that follow.

Cloud Services (classic)

#Create an empty Cloud Service (classic) resource in MyResourceGroup in the UK South region

New-AzureRmResource -ResourceName 'MyClassicCloudService' -ResourceGroupName $rgName `
                    -ResourceType 'Microsoft.ClassicCompute/domainNames' -Location $location -Force
                    

Disks (classic)

#Create a Disk (classic) resource using a Linux operating system in MyResourceGroup in the UK South region.
#Needs a valid VHD in a compatible storage account to work correctly

New-AzureRmResource -ResourceName 'MyClassicDisk' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicStorage/storageaccounts/disks' `
                    -Location $location `
                    -PropertyObject @{'DiskName'='MyClassicDisk' 
                    'Label'='My Classic Disk' 
                    'VhdUri'='https://mystorageaccount.blob.core.windows.net/mycontainer/myvhd.vhd'
                    'OperatingSystem' = 'Linux'
                    } -Force
                    

Network Security groups (classic)

#Create a Network Security Group (classic) resource in MyResourceGroup in the UK South region.

New-AzureRmResource -ResourceName 'MyNSG' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/networkSecurityGroups' `
                    -Location $location -Force
                    

Reserved IP Addresses (classic)

#Create a Reserved IP (classic) resource in MyResourceGroup in the UK South region.

New-AzureRmResource -ResourceName 'MyReservedIP' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/reservedIps' `
                    -Location $location -Force
                    

Storage Accounts (classic)

#Create a Storage Account (classic) resource in MyResourceGroup in the UK South region.
#Storage account will use Standard Locally Redundant Storage

New-AzureRmResource -ResourceName 'MyStorageAccount' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicStorage/StorageAccounts' `
                    -Location $location -PropertyObject @{'AccountType' = 'Standard-LRS'} -Force
                    

Virtual Networks (classic)

#Create a Virtual Network (classic) resource in MyResourceGroup in the UK South Region

New-AzureRmResource -ResourceName 'MyVNET' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/virtualNetworks' `
                    -Location $location `
                    -PropertyObject @{'AddressSpace' = @{'AddressPrefixes' = '10.0.0.0/16'}
                                      'Subnets' = @{'name' = 'MySubnet'
                                                    'AddressPrefix' = '10.0.0.0/24'
                                                    }
                                     }
                                     

VM Images (classic)

#Create a VM image (classic) resource in MyResourceGroup in the UK South region.
#Needs a valid VHD in a compatible storage account to work correctly

New-AzureRmResource -ResourceName 'MyVMImage' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicStorage/storageAccounts/vmImages' `
                    -Location $location `
                    -PropertyObject @{'Label' = 'MyVMImage Label'
                    'Description' = 'MyVMImage Description'
                    'OperatingSystemDisk' = @{'OsState' = 'Specialized'
                                              'Caching' = 'ReadOnly'
                                              'OperatingSystem' = 'Windows'
                                              'VhdUri' = 'https://mystorageaccount.blob.core.windows.net/mycontainer/myvhd.vhd'}
                    }

Conclusions or Wot I Think

The requirement to work with the cmdlets shown in this post should only really be a concern for those who are maintaining “classic” resources as part of an ongoing deployment. It is therefore important to emphasise that these cmdlets should not be used to create resources for new projects. Alongside the additional complexity involved in constructing the New-AzureRmResource cmdlet, there is an abundance of new, updated AzureRM cmdlets at your disposal that enable you to create the correct types of resources more intuitively. The key benefit that these examples provide is the ability to use a single Azure PowerShell module for the management of your entire Azure estate, as opposed to having to switch back and forth between different modules. It is perhaps a testament to how flexible Azure is that cmdlets like New-AzureRmResource exist in the first place, ultimately enabling anybody to fine-tune deployment and maintenance scripts to suit any conceivable situation.

The Voice of the Customer (VoC) add-on solution for Dynamics 365 Customer Engagement (D365CE) presents a really nice way of incorporating survey capabilities within your existing Dynamics application estate, without any additional cost or significant administrative overhead. I’ve talked about the tool previously, within the context of specific application errors, and I can attest to its capabilities – both as a standalone solution and as one that can be leveraged alongside other D365CE functionality to generate additional value.

One feature that is particularly useful is the ability to include diverse Survey Response controls. These can cover the range of anticipated user inputs that most web developers would be used to – text inputs, ratings, date pickers etc. – along with more marketing-specific choices such as Net Promoter Score and even a Smilies rating control. The final one of these really does have to be seen to be wholly appreciated:

I hope you agree that this is definitely one of those features that becomes so fun that it soaks up WAY more time than necessary 🙂

One of the final options that VoC provides is the ability to upload files to a Survey Response; these are stored within the application and retrievable at any time by locating the appropriate Survey Response record. You can customise the guidance text presented to the user for this control, such as in the example below:

Uploaded files are then saved onto an Azure Blob Storage location (which you don’t have direct access to), with the access URL stored within D365CE. The inclusion of this feature does provide the capability to accommodate several potential business scenarios, such as:

  • Allowing a service desk to create an automated survey that allows error logs or screenshots to be uploaded for further diagnosis.
  • The gathering of useful photographic information as part of a pre-qualification process for a product installation.
  • Enabling customers to upload a photo that provides additional context relating to their experience – either positive or negative.

Putting all of this aside, however, there are a few things that you should bear in mind when first evaluating this feature for your particular requirements. What follows is my list of the major things to be aware of, along with some tips to sidestep any issues.

Privacy concerns…

To better understand why this is relevant, it helps to be aware of exactly how files can be stored on Azure. Azure Blob Storage works on the principle of “blobs” (i.e. files), which can only be created within a corresponding storage container. Containers can be configured using a couple of different options, depending on how you would like to access your data, which is elaborated upon in this really helpful article:

You can configure a container with the following permissions (a PowerShell illustration follows the list):

  • No public read access: The container and its blobs can be accessed only by the storage account owner. This is the default for all new containers.

  • Public read access for blobs only: Blobs within the container can be read by anonymous request, but container data is not available. Anonymous clients cannot enumerate the blobs within the container.

  • Full public read access: All container and blob data can be read by anonymous request. Clients can enumerate blobs within the container by anonymous request, but cannot enumerate containers within the storage account.
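Purely as an illustration for a storage account that you do control – remember, the VoC-managed account is not accessible to administrators – these three access levels map onto the -Permission parameter of the Set-AzureStorageContainerAcl cmdlet. The account name, key and container names below are placeholders:

#Build a storage context for a storage account you own (placeholder values)
$ctx = New-AzureStorageContext -StorageAccountName 'mystorageaccount' -StorageAccountKey '<storage account key>'

#No public read access
Set-AzureStorageContainerAcl -Name 'mycontainer' -Permission Off -Context $ctx

#Public read access for blobs only
Set-AzureStorageContainerAcl -Name 'mycontainer' -Permission Blob -Context $ctx

#Full public read access
Set-AzureStorageContainerAcl -Name 'mycontainer' -Permission Container -Context $ctx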

Presumably to mitigate the need for complex deployments of the VoC solution, all uploaded Survey Response files are saved in containers with Full public read access, meaning that anyone with the URL can access these files. And, as mentioned already, administrators have no direct access to the Azure Storage Account to modify these permissions, potentially compounding this access problem. Now, before you panic too much, the VoC solution deliberately structures the URL of each uploaded file in the following format:

https://<VoC Region Identifier>.blob.core.windows.net/<Survey Response GUID>-files/<File GUID>-<Example File>

The degree of complexity added here goes a long way towards satisfying any privacy concerns – it would be practically impossible for a human being or computer to guess a particular uploaded file path, even with the Survey Response record GUID – but this still does not address the fact that the URL can be freely accessed and shared by anyone with sufficient permissions over the Survey Response entity in D365CE. You should, therefore, take appropriate care when scoping your security privileges within D365CE and look towards carrying out a Privacy Impact Assessment (PIA) on the type of data you are collecting via the file upload control.

…even after you delete a Survey Response.

As mentioned above, the Blob Storage URL is tagged to the Survey Response record within D365CE. So what happens when you delete this record? The answer, courtesy of Microsoft via a support request:

Deleting Survey Response should delete the file uploaded as part of the Survey Response

Based on my testing, however, this does not appear to be the case. My understanding of the VoC solution is that it needs to regularly synchronise with components located on Azure, which can lead to a delay in certain actions completing (publishing a Survey, creating a Survey Response record etc.). However, a file from a Survey Response test record that I deleted remained accessible via its URL up to 8 hours after completing this action. This, evidently, raises a concern over what level of control you have over potentially critical and sensitive data types that may be included in uploaded files. I would urge you to carry out your own analysis as part of a PIA to sufficiently gauge what impact, if any, this may have on your data collection (and, more critically, storage) activities.

Restrictions

For the most part, file upload controls are not a heavily constrained feature, but it is worthwhile to keep the following restrictions in mind:

  • Executable file types are not permitted for upload (.exe, .ps1, .bat etc.)
  • Larger files may not upload successfully, generating 404 server errors within the control. There is no documented size limit, but my testing would indicate that files as big as 60MB will not upload correctly.
  • Only one file upload control is allowed per survey.

The last of these limitations is perhaps the most significant constraint. If you do have a requirement for separate files to be uploaded, then the best option is to provide instructions on the survey, advising users to compress their files into a single .zip archive before upload.

Conclusions or Wot I Think

Despite what this post may be leaning towards, I very much believe that the VoC solution and, in particular, the ability to upload Survey Response files, is in perfectly good working order. Going a step further, when viewed from a technical standpoint, I would even say that its method of execution is wholly justified. With the advent of the General Data Protection Regulation (GDPR) earlier this year, attention is currently focused on ensuring that appropriate access controls over data have been properly implemented, so that the privacy of individuals is fully upheld. Here is where the solution begins to fall over to a degree, and where evidence of the journey that VoC has made in becoming part of the Dynamics 365 “family” becomes most apparent. As can be expected, any product derived from an external acquisition will always present challenges when being “smushed” into a new application system. I have been informed that there is an update coming to the VoC solution in August this year, with a range of new features that may address some of the data privacy concerns highlighted earlier. For example, the option will be provided for administrators to delete any uploaded file within a Survey Response on demand. Changes like this will surely go a long way towards providing the appropriate opportunities for VoC to be fully utilised by businesses looking to integrate an effective, GDPR-compliant customer survey tool.

I was very honoured and excited to be involved with the very first D365UG/CRMUG North West Chapter Meeting earlier this week, hosted at the Grindsmith just off Deansgate in Manchester. This is the first time that a D365UG/CRMUG event has taken place in the North West, and we were absolutely stunned by the level of interest this event generated – all in all, 37 people attended, representing a broad spectrum of Microsoft partners and organisations of varying sizes.

I very much got the impression that the number of Dynamics 365 Customer Engagement (D365CE) users in the North West far exceeds any figure you might assume, and I am really looking forward to seeing how future events develop as we (hopefully!) get more people involved. Despite a few technical glitches with the AV facilities, the feedback we have received on both presentations has been overwhelmingly positive, so a huge thanks to everyone who turned up and to our presenters for the evening.

In this post, I wanted to share my thoughts on both sets of presentations, provide an answer to some of the questions that we didn’t get around to due to time constraints and, finally, provide a link to the slide deck from the evening.

Transform Group – The Patient Journey

The first talk of the evening was provided courtesy of Bill Egan at Edgewater Fullscope, who took us through Transform Group’s adoption of D365CE. Bill provided some really useful insights – from both an organisation’s and a Microsoft partner’s perspective – into the challenges that any business can face when moving across to a system like D365CE. As with any IT project, there were some major hurdles along the way, but Bill very much demonstrated how the business was able to roll with the punches, and the very optimistic 16-week planned deployment presents an arguably essential blueprint for how IT projects need to be executed; namely, targeted towards delivering as much business benefit as possible in a near-immediate timeframe.

The key takeaways for me out of all this were the importance of adapting projects quickly to changing business priorities and of recognising the continued effort required to ensure that business systems are regularly reviewed and updated to suit the requirements of not just the users, but the wider business.

Power Up Your Power Apps

The second presentation was a “head to head” challenge between Craig Bird from Microsoft and Chris “The Tattooed CRM Guy” Huntingford from Hitachi Solutions, to see who could build the best PowerApp. In the end, the voting was pretty unanimous and Craig was the proud recipient of a prize worthy of a champion. I hope Craig will be wearing his belt proudly at future events 🙂

I found the presentation particularly useful in clearing up a number of worries I had around the Common Data Service and the future of D365CE. The changes that I saw are very much geared towards providing a much-needed facelift to the current customisation and configuration experience within D365CE, with little requirement to factor in migration or extensive learning of new tools to ensure that your D365CE entities are available within the Common Data Service. Everything “just works” and syncs across flawlessly.

https://twitter.com/joejgriffin/status/1009531079492079622

In terms of who had the best app, I think Craig knocked the socks off everyone with his translator application. I include myself in that category, but I was still surprised to see that PowerApps supports embedded Power BI content, courtesy of Chris – a really nice takeaway for any aspiring PowerApp developer.

Questions & Answers

We managed to get around to most questions for the first presentation, but not for the second one. Here’s a list of all the questions that I am able to provide an answer to. I’m still in the process of collating responses to the other questions received, so please keep checking back if your burning question is not answered below:

Presentation

For those who missed the event or are wanting to view the slides without a purple tinge, they will be downloadable for the next 30 days from the following location:

https://jamesgriffin-my.sharepoint.com/:p:/g/personal/joe_griffin_gb_net/EbRAws0urypMkrGyqCzoTdMB4ggjUQI4_npQlEZAYhea4w?e=U3lvf5

Looking Ahead

The next chapter meeting is scheduled to take place on the 2nd of October (venue TBC). If you are interested in getting involved, either through giving a presentation or in helping to organise the event, then please let us know by dropping us a message:

  • Email: crmuguknw@gmail.com
  • Twitter: @CRMUG_UK_NW

If you are heavily involved with the management and deployment of Office 365 Click-to-Run installation packages on a daily basis, the chances are that you have come across all sorts of weird and wonderful error messages throughout your travels. Although I am admittedly a novice in this area, I know there is often a requirement to resort to registry modifications or similar to achieve certain kinds of management tasks, along with a whole list of other quirks that can test the patience of even the most ardent IT professionals and frustrate what may appear to be simple Office 365 deployments.

Case in point – installing the downloadable Click-to-Run Office 365 package (available from the Office 365 portal) can prove problematic when using the default Administrator account on a Windows machine. When attempting to do this, either by double-clicking the executable file or running it with Run as administrator privileges, you may see the error message displayed below:

The error can occur in any version of Windows, up to and including Windows 10 version 1803. The issue appears to pertain to the default Administrator account that is created on the operating system and can be observed when creating a new Windows 10 Virtual Machine on Azure. It’s possible that the error may also occur on desktop installations of the operating system, depending on how the initial local administrator user account is deployed. There are a couple of ways to resolve this error, depending on your preferred method, your familiarity with the inner workings of Windows and your inclination towards either a quick or a “proper” solution.

The quick and easy solution

This route involves creating a new user account on the machine, which can then be logged into and used to install the application via User Account Control (UAC) elevated privileges. The steps to achieve this are as follows (a scripted alternative is sketched after the list):

  1. Press Windows + R to open the Run box.
  2. Enter lusrmgr.msc in the Run box and press Enter. This will open the Local Users and Groups Microsoft Management Console (MMC) snap-in.
  3. Right-click on the Users folder and select New User….
  4. Create a new user account with the required details and password. Ensure that the account is not disabled via the Account is disabled checkbox and click Create to…well…create the account. 🙂
  5. Log out of Windows and log back in as the new user. When attempting to run the installation package again, you should be (correctly) prompted to enter Administrator credentials via the UAC dialog and the installation should start successfully.
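For those who prefer the command line, the account creation in steps 1 to 4 can also be scripted. Below is a hedged sketch using the LocalAccounts PowerShell module built into Windows 10; the account name is purely illustrative:

#Prompt for a password and create the new (standard) local user account
$password = Read-Host -Prompt 'Password for the new account' -AsSecureString
New-LocalUser -Name 'OfficeInstall' -FullName 'Office Install' -Password $password

Once created, log out and back in as this account and run the installer as per step 5.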

The proper solution

Although the above steps are perfectly acceptable and straightforward to follow if you are in a rush, they do not address the underlying problem with the default Administrator account – namely, that it will have issues installing any application that requires elevated privileges. In most cases, where an application requires installation onto a machine, it is generally better to log in as the Administrator user account as opposed to relying solely on the UAC elevation prompt. As a consequence, the ideal solution to save you from any future hassle is to fix the issue with the default Administrator account permanently.

Following some research online and testing on my side, I found this article which goes through the steps that will successfully address the problem and enable you to install Office 365 – and any other program – without issue. In my specific example, I had to follow the instructions listed in Method 2 and 3, followed by a restart of the machine in question, before I was able to install Office 365 successfully. Although the steps involved are a little more complex and error-prone, the article does provide clear instructions, along with screenshots, to help you along the way.

Conclusions or Wot I Think

I recently attended a Microsoft Partner training day covering Microsoft 365 Business and Enterprise, and the topic of Office 365 came up regularly, as you can imagine. The deployment settings afforded to you via Microsoft 365 let you perform automated actions with minimal effort, with perhaps the most common one being the removal of any local administrator privileges when a new machine is deployed using your organisation’s chosen template. As our instructor pointed out, this is incompatible with how Office 365 installations operate because, as we have seen, full administrative privileges are required to install the software. We therefore find ourselves in the strange state of affairs where the Microsoft 365 solution as a whole is in a glass half full (or half empty, if you are feeling pessimistic) situation and, more generally, Office 365 deployments are hampered by the need for local or domain-level Administrator privileges. I would hope that, eventually, Microsoft will look at providing an installation package for Office that does not require such extensive permissions to install. Doing this would kill two birds with one stone – it would help to make the deployment of all the components of Microsoft 365 a breeze whilst also avoiding the error message discussed in this post. Here’s hoping that we see this change in the future, to avoid interesting little side-journeys like this when deploying Office 365.