Welcome to post number nine in my series designed to provide a revision tool for Microsoft Exam 70-778, and for those looking to increase their expertise in Power BI. The topics covered so far in the series have primarily involved Power BI Desktop; we now move away from this as we evaluate how to Manage Custom Reporting Solutions with Power BI. This focus area for the exam measures the following skill areas:

Configure and access Microsoft Power BI Embedded; enable developers to create and edit reports through custom applications; enable developers to embed reports in applications; use the Power BI API to push data into a Power BI dataset; enable developers to create custom visuals.

Despite being, at first glance, a very technically focused area, the exam does not necessarily require you to know how to work with these features in-depth. What this post will try to do is fully explain what Power BI Embedded is (and how it can be tailored accordingly), the capabilities and benefits of the Power BI API and, finally, the options available for building custom visualizations that can then be used across any Power BI Report.

Power BI Embedded

A potential limitation of using Power BI as your Business Intelligence solution is that you must access your reporting content through one of two interfaces, depending on how you have licensed the product:

For strictly organisational-only access, this is all fine and dandy; but if you want to grant external users access to your reports, it becomes necessary to open up a door into a critical component of your IT infrastructure, often in tandem with any other systems your solution may contain. For example, if you have developed a support portal for your customers to submit cases with and wish to provide them with a range of Power BI visualizations, you would need to grant and deploy access to two separate application instances – your support portal and Power BI Online/Report Server. This can lead to a jarring end-user experience and severely limit your ability to provide a unified, bespoke and branded solution.

Power BI Embedded seeks to address this scenario by providing application developers with the means to embed Power BI reports and visualizations directly within their existing applications. The experience this offers is seamless, to the extent that end-users will not even need to know that you are using Power BI at all. Consequently, this potentially allows you to look exceptionally gifted when you begin to deploy engaging and highly interactive visualizations into your application quickly. Because it is an Azure-based service with a pricing mechanism to suit, you only need to estimate your required compute capacity, deploy that capacity and any corresponding reports, and then build the appropriate link to your Power BI content within your application.
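
As a rough illustration of the capacity deployment step, the sketch below shows how an example A1 capacity might be created via Azure PowerShell. This is a minimal sketch only, assuming the AzureRM.PowerBIEmbedded module is installed; the capacity name, resource group and administrator account are placeholder values that you would substitute for your own:

#Deploy an illustrative A1 Power BI Embedded capacity - all names below are placeholders

New-AzureRmPowerBIEmbeddedCapacity -Name 'mypbiecapacity' -ResourceGroupName 'MyResourceGroup' `
                                   -Location 'UK South' -Sku 'A1' -Administrator 'admin@mytenant.onmicrosoft.com'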

To get started with using Power BI Embedded, you need to make sure you have the following at your disposal:

To get a feel for the capabilities on offer, you can visit the Power BI Embedded Playground, made available courtesy of Microsoft. This tool allows you to test how the various Power BI Embedded components render themselves, tweak their appearance and generate working code samples that are usable within your application. The screenshot below shows an example of how a single Report visual would look when embedded into a bespoke application:

As the screenshot indicates, there is no loss of core functionality when consuming Power BI in this manner. You can hover over individual areas to gain insights into your data; you can drill down through the data; data is sortable in the conventional manner; and, finally, you can view the underlying data in the visualization or even export it out into an Excel/CSV document. You also have extensive options available to modify how a visual, report etc. is rendered on your application page, allowing you to ensure that everything renders optimally for your application.

All in all, Power BI Embedded represents a significant boon for application developers, enabling them to very quickly leverage the extensive reporting capabilities Power BI provides, all of which is cloud-based, highly scalable and minutely tailorable. It is important to highlight that all of this goodness comes with a potentially high cost, namely, that of requiring a sufficiently proficient application developer (preferably .NET focused) to join all of the various bits together. But, if you are already in the position where you have developed an extensive range of Power BI reports for consumption by your customer base, Power BI Embedded is the natural progression point in turning your solution into a real piece of intellectual property.

The Power BI API

If you are finding your feet with Power BI Embedded and need to carry out more complex actions against Power BI content pulled through from a workspace, then the API is an area you will also need to become familiar with. Microsoft exposes a whole range of functionality as part of the Power BI API that can assist with a wide variety of tasks – such as automation, deployment and allowing any bespoke application to further leverage its Power BI Embedded solution. Some examples of the types of things you can do with the API include:

The API requires you to authenticate against the Power BI service using a corresponding Application Registration in Azure Active Directory, which defines the list of privileges that can be carried out. This component can be straightforwardly created using the wizard provided by Microsoft, and a full tutorial is also available on how to generate an access token from your newly created Application Registration. The key thing in all of this is to ensure that your Application Registration is scoped to only the permissions you require (these can be changed in future if needed), rather than granting all permissions needlessly.
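
For illustration, the sketch below shows one possible way of acquiring such a token via PowerShell, using the older resource owner password credentials flow against the Azure AD v1 token endpoint and a dedicated service account. The client ID, username and password are placeholder values, and the resulting $authHeader is the same header referenced in the data source example further below:

#Request an access token for the Power BI API - all credential values below are placeholders

$clientId = '00000000-0000-0000-0000-000000000000'
$pbiUser = 'serviceaccount@mytenant.onmicrosoft.com'
$pbiPass = 'P@ssw0rd!'

$tokenBody = @{
    grant_type = 'password'
    resource = 'https://analysis.windows.net/powerbi/api'
    client_id = $clientId
    username = $pbiUser
    password = $pbiPass
}

$tokenResponse = Invoke-RestMethod -Uri 'https://login.microsoftonline.com/common/oauth2/token' -Method POST -Body $tokenBody

#Build the authorisation header used by subsequent API requests
$authHeader = @{'Authorization' = "Bearer $($tokenResponse.access_token)"}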

Because the API is a REST endpoint, there is a wide variety of programming languages and tools you can use to interact with it. PowerShell is an especially good candidate and, in the snippet below, you can see how it can be used to modify the User Name and Password for a SQL Server data source deployed onto Power BI Online:

#Make the request to patch the Data Source with the updated credentials

$sqluser = "MyUserName"
$sqlPass = "P@ssw0rd!"

$patchURI = "https://api.powerbi.com/v1.0/myorg/gateways/cfafbeb1-8037-4d0c-896e-a46fb27ff229/datasources/1e8176ec-b01c-4a34-afad-e001ce1a28dd/"
$person = @{
    credentialType='Basic'
    basicCredentials=@{
        username=$sqluser
        password=$sqlPass
        }
}
$personJSON = $person | ConvertTo-Json
$request3 = Invoke-RestMethod -Uri $patchURI -Headers $authHeader -Method PATCH -Verbose -Body $personJSON

This example deliberately excludes some of the previous steps needed to authenticate with Power BI and is, therefore, provided for strictly illustrative purposes only.

Creating Custom Visuals

Developers have primarily two options when it comes to building out bespoke visualizations, which can then be included in Power BI Online, Embedded and Report Server reports:

Last week’s post discussed this topic in more detail from an exam standpoint which, in a nutshell, only requires you to have a general awareness of the options available here; no need to start extensively learning a new programming language, unless you really want to 🙂

Key Takeaways

  • Power BI Embedded is an Azure-hosted offering that allows you to add Power BI Report content into bespoke applications. This deployment option can be incredibly useful if you wish to make your Power BI solution available to users outside of your organisation or if you have an existing, bespoke application that could benefit from utilising Power BI content. An Azure subscription is required to begin working with Power BI Embedded, and you are billed based on node size, not individual user licenses. All Power BI content must be published to the online service before it becomes available for Power BI Embedded to access. Report developers will, therefore, need to be granted a Power BI Pro license to carry out these activities.
  • The Power BI API allows developers to perform automation or administrative actions programmatically against the Power BI Online service. As a REST API, it lets developers choose their preferred programming language to interact with it, allowing them to streamline the deployment of Reports or Dashboards to the Power BI service or leverage additional functionality when utilising Power BI Embedded. The API can also cater to specific data load requirements, although more complex needs in this area would need to be addressed via alternative means (SSIS, Azure Data Factory etc.)
  • Developers can add their own bespoke visualizations to a Power BI Report by developing them using either Node.js or the R language. The first of these options facilitates a more streamlined deployment mechanism and allows developers to add their visualizations to AppSource, whereas the second option may be more useful for complex visualization types with an analytical or statistical function.

Compared to the other exam topics, a general awareness of these concepts is more than likely sufficient from a learning perspective and is (arguably) useful knowledge in any event, as it allows you to understand how developers can further extend a Power BI solution to suit a particular business need. In next week’s post, we will move into the final subject area for the exam, as the focus shifts towards how to work with Power BI outside of the Desktop application and the various tools available to integrate on-premises data sources into Power BI Online.

The whole concept of audio conferencing – the ability for a diverse range of participants to dial into a central location for a meeting – is hardly new in the 21st century. Its prevalence, however, has undoubtedly grown sharply in the last 15-20 years; to the point now where, to use an analogy, it very much feels like a DVD when compared to full video conferencing, à la Blu-Ray. When you also consider the widening proliferation of remote workers, globalisation and the meteoric rise of cloud computing, businesses suddenly find themselves having to find answers to the following questions:

  • How can I enable my colleagues to straightforwardly and dynamically collaborate across vast distances?
  • How do I address the challenges of implementing a scalable solution that meets any regulatory or contractual requirements for my organisation?
  • What is the most cost-effective route to ensuring I can have a genuinely international audio-conferencing experience?
  • How do I identify a solution that users can easily adopt, without introducing complex training requirements?

These questions are just a flavour of the sorts of things that organisations should be thinking about when identifying a suitable audio conferencing solution. And there are a lot of great products on the market that address these needs – GoToMeeting or join.me represent natural choices for specific scenarios. But to provide a genuinely unified experience for existing IT deployments that have a reliance on Skype for Business/Microsoft Teams, the audio conferencing add-on for Office 365 (also referred to as Skype for Business Online Audio Conferencing) may represent a more prudent choice. It ticks all of the boxes for the questions above, ensuring that users can continue utilising other tools they know and use every day – such as Microsoft Outlook and Office 365. Admittedly, though, the solution is undoubtedly more geared up for Enterprise deployments as opposed to utilisation by SMBs. It may, therefore, become too unwieldy a solution in the hands of smaller organisations.

I was recently involved in implementing an audio conferencing solution for Skype for Business Online, to satisfy the requirement for international dialling alluded to earlier. Having attended many audio conferences that utilise the service previously, I was familiar with the following experience when dialling in and – perhaps naively – assumed this to be the default configuration:

  1. User dials in and enters the meeting ID number.
  2. An audio cue is played for the conference leader, asking them to enter their leader code (this is optional).
  3. The user is asked to record their name and then press *.
  4. The meeting then starts automatically.

For most internal audio conferences (and even external ones), this process works well, particularly when, for example, the person organising the meeting is doing so on behalf of someone else and is unlikely to be dialling in themselves. However, I was surprised to learn that the actual, default experience is a little different:

  1. User dials in and enters the meeting ID number.
  2. An audio cue is played for the conference leader, asking them to enter their leader code (this is optional).
  3. The user is asked to record their name and then press *.
  4. If the leader has not yet dialled in, all other attendees sit in the lobby. The call does not start until the leader joins.

The issue does not relate to how the meeting organiser has configured their meeting in Outlook; it occurs regardless of which option is chosen in the These people don’t have to wait in the lobby drop-down box.

After some fruitless searching online, I eventually came across the following article, which clarified things for me:

Start an Audio Conference over the phone without a PIN in Skype for Business Online

Because this is a tenant-level configuration, there is a two-stage process involved in getting it reconfigured for existing Skype for Business Online Audio Conferencing deployments:

  • Set the AllowPSTNOnlyMeetingsByDefault setting to true on the tenant via a PowerShell cmdlet.
  • Configure the AllowPstnOnlyMeetings setting to true for every user set up for Audio Conferencing, either within the Skype for Business Administration Centre or via PowerShell.

The second step could be incredibly long-winded to achieve via the Administration Centre route, as you have to go into each user’s audio conferencing settings and toggle the appropriate control, as indicated below:

For a significantly larger deployment, this could easily result in carpal tunnel syndrome and loss of sanity 😧. Fortunately, PowerShell can take away some of the woes involved. By connecting to Skype for Business Online via PowerShell, it is possible both to enable the AllowPSTNOnlyMeetingsByDefault setting at the tenant level and to enable the equivalent setting for all users who currently have it disabled. The complete script to carry out these steps is below:

#Standard login for S4B Online

Import-Module SkypeOnlineConnector
$userCredential = Get-Credential
$sfbSession = New-CsOnlineSession -Credential $userCredential
Import-PSSession $sfbSession

#Login script for MFA enabled accounts

Import-Module SkypeOnlineConnector
$sfbSession = New-CsOnlineSession
Import-PSSession $sfbSession

#Enable dial in without leader at the tenant level - will only apply to new users moving forward

Set-CsOnlineDialInConferencingTenantSettings -AllowPSTNOnlyMeetingsByDefault $true

#Get all current PSTN users

$userIDs = Get-CsOnlineDialInConferencingUserInfo | Where Provider -eq "Microsoft" | Select DisplayName, Identity

$userIDs | ForEach-Object -Process {
        
    #Then, check whether the AllowPstnOnlyMeetings is false

    $identity = $_.Identity

    $user = Get-CsOnlineDialInConferencingUser -Identity $identity.RawIdentity
    Write-Host "$($_.DisplayName) AllowPstnOnlyMeetings value equals $($user.AllowPstnOnlyMeetings)"
    if ($user.AllowPstnOnlyMeetings -eq $false) {
        
        #If so, then enable

        Set-CsOnlineDialInConferencingUser -Identity $identity.RawIdentity -AllowPSTNOnlyMeetings $true
        Write-Host "$($_.DisplayName) AllowPstnOnlyMeetings value changed to true!"
    }

    else {
        Write-Host "No action required for user account"$_.DisplayName 
    }
}

Some notes/comments before you execute it in your environment:

  • Comment out whichever authentication snippet is not appropriate for your situation, depending on whether Multi-Factor Authentication is enabled for your user account.
  • Somewhat annoyingly, there is no way (that I found) to extract a sufficiently unique identifier for use with the Set-CsOnlineDialInConferencingUser cmdlet when obtaining the details of a user via the Get-CsOnlineDialInConferencingUser cmdlet. This is why the script first retrieves the complete LDAP string using the Get-CsOnlineDialInConferencingUserInfo cmdlet. Convoluted, I know, but it ensures that the script works correctly and avoids any issues that may arise from duplicate Display Names on the Office 365 tenant.

All being well, with very little modification, the above code can be utilised to enforce the setting across the board or for a specific subset of users, if required. It does seem strange that the option is turned off by default, but there are understandable reasons why it may be desirable to curate the whole meeting experience for attendees. If you are considering rolling out Skype for Business Online Audio Conferencing in your Office 365 tenant in the near future, then this represents one of many decisions you will have to make as part of the deployment. You should, therefore, think carefully and consult with your end-users to determine their preferred behaviour; you can then choose to enable/disable the AllowPSTNOnlyMeetingsByDefault setting accordingly.

The introduction of Azure Data Factory V2 represents the most opportune moment for data integration specialists to start investing their time in the product. Version 1 (V1) of the tool, which I started taking a look at last year, lacked a lot of critical functionality that – in most typical circumstances – I could implement in a matter of minutes via a SQL Server Integration Services (SSIS) DTSX package. The product had, to quote some specific examples:

  • No support for control flow logic (foreach loops, if statements etc.)
  • Support for only “basic” data movement activities, with a minimal capability to perform or call data transformation activities from within SQL Server.
  • Some support for the deployment of DTSX packages, but with incredibly complex deployment options.
  • Little or no support for Integrated Development Environments (IDEs), such as Visual Studio, or other typical DevOps scenarios.

In what seems like a short space of time, the product has come on leaps and bounds to address these limitations:

Supported now are Filter and Until conditions, alongside the expected ForEach and If conditionals.

When connecting to SQL Data destinations, we now have the ability to execute pre-copy scripts.

SSIS Integration Runtimes can now be set up from within the Azure Data Factory V2 GUI – no need to revert to PowerShell.

And finally, there is full support for storing all created resources within GitHub or Azure DevOps (formerly Visual Studio Team Services).

The final one is a particularly nice touch, and means that you can straightforwardly incorporate Azure Data Factory V2 as part of your DevOps strategy with minimal effort – an ARM Resource Template deployment, containing all of your data factory components, will get your resources deployed out to new environments with ease. What’s even better is that this deployment template is intelligent enough not to recreate existing resources and only update Data Factory resources that have changed. Very nice.
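
As a rough sketch of what that deployment step might look like from PowerShell (the file names and Resource Group below are assumptions for illustration), an incremental deployment leaves any unchanged resources alone:

#Deploy an exported Data Factory ARM template into a target Resource Group

New-AzureRmResourceGroupDeployment -ResourceGroupName 'MyTargetResourceGroup' `
                                   -TemplateFile '.\arm_template.json' `
                                   -TemplateParameterFile '.\arm_template_parameters.json' `
                                   -Mode Incremental -Verbose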

Although a lot is provided for by Azure Data Factory V2 to assist with a typical DevOps cycle, there is one thing that the tool does not account for satisfactorily.

A critical aspect of any Azure Data Factory V2 deployment is the implementation of Triggers. These define the circumstances under which your pipelines will execute, typically either via an external event or based on a pre-defined schedule. Once activated, they effectively enter a “read-only” state, meaning that any changes made to them via a Resource Group Deployment will be blocked and the deployment will fail – as we can see below when running the New-AzureRmResourceGroupDeployment cmdlet directly from PowerShell:

It’s nice that the error is provided in JSON, as this can help to facilitate any error handling within your scripts.

The solution is simple – stop the Trigger programmatically as part of your DevOps execution cycle via the handy Stop-AzureRmDataFactoryV2Trigger cmdlet. This involves just a single-line PowerShell cmdlet that is callable from an Azure PowerShell task. But what happens if you are deploying your Azure Data Factory V2 template for the first time?

I’m sorry, but your Trigger is in another castle.

The best (and only) resolution to get around this little niggle is to construct a script that checks whether the Trigger exists and skips the stop step if it does not. The following parameterised PowerShell script file achieves this by attempting to stop a Trigger called ‘MyDataFactoryTrigger’:

param($rgName, $dfName)

Try
{
   Write-Host "Attempting to stop MyDataFactoryTrigger Data Factory Trigger..."
   Get-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName 'MyDataFactoryTrigger' -ErrorAction Stop
   Stop-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName 'MyDataFactoryTrigger' -Force
   Write-Host -ForegroundColor Green "Trigger stopped successfully!"
}

Catch

{ 
    $errorMessage = $_.Exception.Message
    if($errorMessage -like '*NotFound*')
    {       
        Write-Host -ForegroundColor Yellow "Data Factory Trigger does not exist, probably because the script is being executed for the first time. Skipping..."
    }

    else
    {

        throw "An error occured whilst retrieving the MyDataFactoryTrigger trigger."
    } 
}

Write-Host "Script has finished executing."

To use successfully within Azure DevOps, be sure to provide values for the parameters in the Script Arguments field:

You can use pipeline Variables within arguments, which is useful if you reference the same value multiple times across your tasks.
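
For example, a Script Arguments value along the following lines would pass two (hypothetical) pipeline variables through to the script’s parameters:

#Example Script Arguments value, referencing hypothetical pipeline variables

-rgName "$(ResourceGroupName)" -dfName "$(DataFactoryName)"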

With some nifty copy + paste action, you can accommodate the stopping of multiple Triggers as well – although if you have more than 3-4, then it may be more sensible to perform some iteration involving an array containing all of your Triggers, passed at runtime.
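
A minimal sketch of that approach, reusing the same try/catch pattern as above and assuming the Trigger names are supplied at runtime, might look like the following:

param($rgName, $dfName, [string[]]$triggerNames)

foreach ($triggerName in $triggerNames)
{
    Try
    {
        #Confirm the Trigger exists before attempting to stop it
        Get-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName $triggerName -ErrorAction Stop | Out-Null
        Stop-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName $triggerName -Force
        Write-Host -ForegroundColor Green "$triggerName stopped successfully!"
    }
    Catch
    {
        Write-Host -ForegroundColor Yellow "$triggerName could not be found and has been skipped."
    }
}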

For completeness, you will also want to ensure that you restart the affected Triggers after any ARM Template deployment. The following PowerShell script will achieve this outcome:

param($rgName, $dfName)

Try
{
   Write-Host "Attempting to start MyDataFactoryTrigger Data Factory Trigger..."
   Start-AzureRmDataFactoryV2Trigger -ResourceGroupName $rgName -DataFactoryName $dfName -TriggerName 'MyDataFactoryTrigger' -Force -ErrorAction Stop
   Write-Host -ForegroundColor Green "Trigger started successfully!"
}

Catch

{ 
    throw "An error occured whilst starting the MyDataFactoryTrigger trigger."
}

Write-Host "Script has finished executing."

The Azure Data Factory V2 offering has no doubt come leaps and bounds in a short space of time…

…but you can’t shake the feeling that there is a lot that still needs to be done. The current release, granted, feels very stable and production-ready, but I think there is a whole range of enhancements that could be introduced to allow better feature parity when compared with SSIS DTSX packages. With this in place, and when taking into account the very significant cost differences between both offerings, I think it would make Azure Data Factory V2 a no-brainer option for almost every data integration scenario. The future looks very bright indeed 🙂

Once upon a time, there was a new cloud service known as Windows Azure. Over time, this cloud service developed with new features, became known more generally as just Azure, embraced the unthinkable from a technology standpoint and also went through a complete platform overhaul. Longstanding Azure users will remember the “classic” portal, with its very…distinctive…user interface:

Image courtesy of Microsoft

As the range of different services offered on Azure increased and the call for more efficient management tools became almost deafening, Microsoft announced the introduction of a new portal experience and Resource Group Management for Azure resources, both of which are now the de facto means of interacting with Azure today. The old style portal indicated above was officially discontinued earlier this year. In line with these changes, Microsoft introduced new, Resource Manager compatible versions of pretty much every major service available on the “classic” portal…with some notable exceptions. The following “classic” resources can still be created and worked with today using the new Azure portal:

This provides accommodation for those who are still operating compute resources dating back to the days of yore, allowing you to create and manage resources that may be needed to ensure the continued success of your existing application deployment. In most cases, you will not want to create these “classic” resources as part of new project work, as the equivalent Resource Manager options should be more than sufficient for your needs. The only question mark here concerns Cloud Services. There is currently no equivalent Resource Manager resource available, with the recommended option for new deployments being Azure Service Fabric instead. Based on my research online, there appears to be quite a gap in feature breadth between the two offerings, with Azure Service Fabric arguably being overkill for more simplistic requirements. There also appears to be some uncertainty over whether Cloud Services are technically considered deprecated or not. I would highly recommend reading Andreas Helland’s blog post on the subject and forming your own opinion from there.

For both experiences, Microsoft provided a full set of automation tools in PowerShell to help developers carry out common tasks on the Azure Portal. These are split into the standard Azure cmdlets for the “classic” experience and a set of AzureRM cmdlets for the new Resource Management approach. Although the “classic” Azure resource cmdlets are still available and supported, they very much operate in isolation – that is, if you have a requirement to interchangeably create “classic” and Resource Manager resources as part of the same script file, then you are going to encounter some major difficulties and errors. One example of this is that the ability to switch to subscriptions that you have access to, but not ownership of, becomes nigh on impossible to achieve. For this reason, I would recommend utilising AzureRM cmdlets solely, even if you have a requirement to create classic resources to maintain an existing deployment. To help accommodate this scenario, the New-AzureRmResource cmdlet really becomes your best friend. In a nutshell, it lets you create any Azure Resource of your choosing when executed. The catch is that the exact syntax to use for the -ResourceType parameter can take some time to discover, particularly when working with “classic” resources. What follows are some code snippets that, hopefully, provide you with a working set of cmdlets to create the “classic” resources highlighted in the screenshot above.

Before you begin…

To use any of the cmdlets that follow, make sure you have connected to Azure, selected your target subscription and have a Resource Group created to store your resources using the cmdlets below. You can obtain your Subscription ID by navigating to its properties within the Azure portal:

#Replace the parameter values below to suit your requirements

$subscriptionID = '36ef0d35-2775-40f7-b3a1-970a4c23eca2'
$rgName = 'MyResourceGroup'
$location = 'UK South'

Set-ExecutionPolicy Unrestricted
Login-AzureRmAccount
Set-AzureRmContext -SubscriptionId $subscriptionID
#Create Resource Group
New-AzureRMResourceGroup -Name $rgName -Location $location

With this done, you should hopefully encounter no problems executing the cmdlets that follow.

Cloud Services (classic)

#Create an empty Cloud Service (classic) resource in MyResourceGroup in the UK South region

New-AzureRmResource -ResourceName 'MyClassicCloudService' -ResourceGroupName $rgName `
                    -ResourceType 'Microsoft.ClassicCompute/domainNames' -Location $location -Force
                    

Disks (classic)

#Create a Disk (classic) resource using a Linux operating system in MyResourceGroup in the UK South region.
#Needs a valid VHD in a compatible storage account to work correctly

New-AzureRmResource -ResourceName 'MyClassicDisk' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicStorage/storageaccounts/disks' `
                    -Location $location `
                    -PropertyObject @{'DiskName'='MyClassicDisk' 
                    'Label'='My Classic Disk' 
                    'VhdUri'='https://mystorageaccount.blob.core.windows.net/mycontainer/myvhd.vhd'
                    'OperatingSystem' = 'Linux'
                    } -Force
                    

Network Security groups (classic)

#Create a Network Security Group (classic) resource in MyResourceGroup in the UK South region.

New-AzureRmResource -ResourceName 'MyNSG' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/networkSecurityGroups' `
                    -Location $location -Force
                    

Reserved IP Addresses (classic)

#Create a Reserved IP (classic) resource in MyResourceGroup in the UK South region.

New-AzureRmResource -ResourceName 'MyReservedIP' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/reservedIps' `
                    -Location $location -Force
                    

Storage Accounts (classic)

#Create a Storage Account (classic) resource in MyResourceGroup in the UK South region.
#Storage account will use Standard Locally Redundant Storage

New-AzureRmResource -ResourceName 'MyStorageAccount' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicStorage/StorageAccounts' `
                    -Location $location -PropertyObject @{'AccountType' = 'Standard-LRS'} -Force
                    

Virtual Networks (classic)

#Create a Virtual Network (classic) resource in MyResourceGroup in the UK South Region

New-AzureRmResource -ResourceName 'MyVNET' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicNetwork/virtualNetworks' `
                    -Location $location `
                    -PropertyObject @{'AddressSpace' = @{'AddressPrefixes' = '10.0.0.0/16'}
                                      'Subnets' = @{'name' = 'MySubnet'
                                                    'AddressPrefix' = '10.0.0.0/24'
                                                    }
                                     }
                                     

VM Images (classic)

#Create a VM image (classic) resource in MyResourceGroup in the UK South region.
#Needs a valid VHD in a compatible storage account to work correctly

New-AzureRmResource -ResourceName 'MyVMImage' -ResourceGroupName $rgName -ResourceType 'Microsoft.ClassicStorage/storageAccounts/vmImages' `
                    -Location $location `
                    -PropertyObject @{'Label' = 'MyVMImage Label'
                    'Description' = 'MyVMImage Description'
                    'OperatingSystemDisk' = @{'OsState' = 'Specialized'
                                              'Caching' = 'ReadOnly'
                                              'OperatingSystem' = 'Windows'
                                              'VhdUri' = 'https://mystorageaccount.blob.core.windows.net/mycontainer/myvhd.vhd'}
                    }

Conclusions or Wot I Think

The requirement to work with the cmdlets shown in this post should only really be a concern for those who are maintaining “classic” resources as part of an ongoing deployment. It is therefore important to emphasise not to use these cmdlets to create resources for new projects. Alongside the additional complexity involved in constructing the New-AzureRmResource cmdlet, there is an abundance of new, updated AzureRM cmdlets at your disposal that enable you to more intuitively create the correct types of resources. The key benefit these examples provide is the ability to use a single Azure PowerShell module for the management of your entire Azure estate, as opposed to having to switch back and forth between different modules. It is perhaps a testament to how flexible Azure is that cmdlets like New-AzureRmResource exist in the first place, ultimately enabling anybody to fine-tune deployment and maintenance scripts to suit any conceivable situation.

Repeatable and time-consuming tasks are typically an excellent candidate for automation. The range of business benefits that can be realised is perhaps too broad to list, but I think that the simple ability to free up an individual’s time to accomplish something better represents the ideal end goal of such activity. I have generally found that the best kind of automation retains a minimal degree of human involvement – what I would term “keeping the brain involved” – rather than blindly assuming that the computer will always make the correct choice. A lot of the tools afforded to us when working with Microsoft cloud technologies appear to be very firmly rooted within this mindset, with frameworks such as PowerShell providing the means of carrying out sequences of tasks far quicker than a human could achieve, whilst also providing the mechanism to facilitate human involvement at key steps during any code execution cycle.

When creating a Web App via the Azure portal, you have the option of specifying the creation of an Application Insights resource, which will be automatically associated with your newly created Web App during the deployment. In most cases, you are going to want to take advantage of what this service can deliver to your application in terms of monitoring, usage patterns and error detection; the fact that I am such a major proponent of Application Insights should come as no surprise to regular readers of the blog. Should you find yourself having to deploy both of these resources in tandem via PowerShell, your first destination will likely be the New-AzureRmAppServicePlan & New-AzureRmWebApp cmdlets. For example, the following script, when executed, will create a Basic App Service Plan and Web App in the UK South region called MyWebsite, contained within a resource group of the same name:

New-AzureRMResourceGroup -Name 'MyWebsite' -Location 'UK South'
New-AzureRmAppServicePlan -Name 'MyWebsite' -ResourceGroupName 'MyWebsite' -Location 'UK South' -Tier 'Basic'
New-AzureRmWebApp -Name 'MyWebsite' -ResourceGroupName 'MyWebsite' -Location 'UK South' -AppServicePlan 'MyWebsite'

Next involves the creation of the Application Insights resource, which you would be forgiven for thinking could be created as part of one of the cmdlets above (à la the portal). Instead, we must resort to a generic cmdlet that can be tinkered with to create any resource on the Azure platform, per the instructions outlined in this article. Therefore, the following cmdlet needs to be executed next to create an Application Insights resource using identical parameters defined for the App Service Plan/Web App:

$appInsights = New-AzureRmResource -ResourceName 'MyWebsite' -ResourceGroupName 'MyWebsite' `
-Tag @{ applicationType = 'web'; applicationName = 'MyWebsite'} `
-ResourceType 'Microsoft.Insights/components' -Location 'UK South' `
-PropertyObject @{'Application_Type'='web'} -Force

It’s worth pointing out at this stage that you may get an error returned along the lines of No registered resource provider found for location… when executing the New-AzureRmResource cmdlet. This is because not all resource providers are automatically registered for use via PowerShell on the Azure platform. This can be resolved by executing the below cmdlets to create the appropriate registration on your subscription. This can take a few minutes to update on the platform:

#Check to see if the Microsoft.Insights provider has a RegistrationState value of Registered.
#If not, execute Register-AzureRmResourceProvider to get it added.
#Then, keep running the first cmdlet until the registration is confirmed.

Get-AzureRmResourceProvider -ListAvailable | Where ProviderNamespace -eq 'microsoft.insights'
Register-AzureRmResourceProvider -ProviderNamespace Microsoft.Insights

We now have a Web App and an Application Insights resource deployed onto Azure. But, at this juncture, the Web App and Application Insights resources exist in isolation, with no link between them. To fix this, the final step involves updating the newly created Web App resource with the Application Insights Instrumentation Key, which is generated once the resource is created. Because the above snippet stores all details of the newly created resource within the $appInsights variable, we can very straightforwardly access the key and add a new application setting via the following cmdlets:

$appSetting = @{'APPINSIGHTS_INSTRUMENTATIONKEY'= $appInsights.Properties.InstrumentationKey}
Set-AzureRmWebApp -Name 'MyWebsite' -ResourceGroupName 'MyWebsite' -AppSettings $appSetting

With this final step accomplished, the resources are now associated together and this should be reflected accordingly when viewed in the portal. For completeness, the entire script to achieve the above (also including the necessary login steps) can be seen below:

Set-ExecutionPolicy Unrestricted

Login-AzureRmAccount

New-AzureRMResourceGroup -Name 'MyWebsite' -Location 'UK South'
New-AzureRmAppServicePlan -Name 'MyWebsite' -ResourceGroupName 'MyWebsite' -Location 'UK South' -Tier 'Basic'
New-AzureRmWebApp -Name 'MyWebsite' -ResourceGroupName 'MyWebsite' -Location 'UK South' -AppServicePlan 'MyWebsite'

#Check to see if the Microsoft.Insights provider has a RegistrationState value of Registered.
#If not, execute Register-AzureRmResourceProvider to get it added.
#Then, keep running the first cmdlet until the registration is confirmed.

Get-AzureRmResourceProvider -ListAvailable | Where ProviderNamespace -eq 'microsoft.insights'
Register-AzureRmResourceProvider -ProviderNamespace Microsoft.Insights

$appInsights = New-AzureRmResource -ResourceName 'MyWebsite' -ResourceGroupName 'MyWebsite' `
-Tag @{ applicationType = 'web'; applicationName = 'MyWebsite'} `
-ResourceType 'Microsoft.Insights/components' -Location 'UK South' `
-PropertyObject @{'Application_Type'='web'} -Force

$appSetting = @{'APPINSIGHTS_INSTRUMENTATIONKEY'= $appInsights.Properties.InstrumentationKey}
Set-AzureRmWebApp -Name 'MyWebsite' -ResourceGroupName 'MyWebsite' -AppSettings $appSetting

The above example is interesting in the sense that Application Insights does not have a set of dedicated cmdlets for creating, retrieving and updating the resource. Instead, we must rely on fairly generic cmdlets – and their associated complexity – to work with this resource type. It also seems somewhat counter-intuitive that there is no option as part of the New-AzureRmWebApp cmdlet to create an Application Insights resource alongside the Web App, as we have established the ability to carry this out via the Azure portal. Being able to specify this as an additional parameter (that would also perform the required steps involving the Instrumentation Key) would help to greatly simplify what must be a fairly common deployment scenario. As a service that receives regular updates, we can hope that Microsoft eventually supports one or both of these scenarios to ensure that any complexity towards deploying Application Insights resources in an automated release is greatly reduced.