In most scenarios, a software release involves several steps that need to be completed consistently each time you push out an update to your application. That is why tools such as Azure DevOps can deliver significant benefits for organisations if implemented correctly: they not only give you the confidence to release updates more frequently, but also massively reduce the risk of a failed software deployment due to human error. An adequately defined release pipeline within Azure DevOps allows you to set up each of the required steps for any application deployment, all of which are executed in a specified order. With a vast array of different tasks available out of the box, ranging from PowerShell script tasks through to Amazon Web Services deployments, developers can have the confidence that an Azure DevOps release pipeline can fit in with whatever workloads are involved.

Despite the endless, monotonous repetition associated with software deployments, there may be occasions where you want to go a little bit freestyle and modify how tasks execute, based on dynamic values supplied at the time of release. Fortunately, there is a way you can do this within Azure DevOps, via the use of release variables and custom task conditions. The range of additional functionality this opens up is vast and, in today’s post, we’ll see how it is possible to get started using them with minimal effort.

Setting up Release Variables

A release variable is defined from the Variables tab when creating a new release pipeline:

From here, you then specify the following settings:

  • Name: Ideally a descriptive name for the variable.
  • Value: The default value for the variable. This property can be overridden at release time, as we’ll see shortly.
  • Padlock: This tells Azure DevOps whether the Value provided is hidden from view once defined. This setting can be particularly useful if, for example, you are setting connection string or password values at release level.
  • Scope: This defines which part of your release pipeline the variable is accessible from – either any stage (Release) or a single one only (Stage 1, for example).
  • Settable at release time: Again, fairly self-explanatory 🙂 Lets you specify whether the release creator can override the variable value when creating a new release.

Passing Release Variable Values to a Task

Once defined, a variable is accessible from any task that falls within its scope in your pipeline. For example, if you set a variable called MySQLDBPassword, you can access its value by using the following syntax:

$(MySQLDBPassword)

So, to pass this to an Azure SQL Database Deployment task as the database login password, we would provide the following value in this field:

Alternatively, we can write out the value of the variable into a PowerShell task:
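
For reference, the inline script behind a task like this need only be a single line. As a minimal sketch, assuming the MySQLDBPassword variable from earlier, the script could simply be:

Write-Host "The MySQLDBPassword variable is currently set to: $(MySQLDBPassword)"

The $(MySQLDBPassword) token is substituted by the agent before the script runs, so the same syntax works within any task input.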

When this task is then executed during a release, we observe the following behaviour in its logs:

In this manner, variable values can be passed to almost any task property. For sensitive values, they also represent the most prudent route to go down to ensure that passwords and connection strings are handled securely.

Defining Release Variables On A New Release

We saw earlier the specific setting Settable at release time for release variables. If enabled, when creating a new release, you will be allowed to override its original, supplied value. So, by modifying the MySQLDBPassword variable from earlier to enable this property, we now get the following options exposed as part of creating a new release:

As observed above, the default value for this variable – p@ssw0rd123 – is automatically pulled through, and can be reviewed or even submitted without any further changes. For secret variables, the behaviour is slightly different, as expected, although we can still override the value if we wanted to; the only thing is that you won’t be able to see what you type, similar to a password field:

Implementing Conditional Logic Using Release Variables

As alluded to earlier, there may be occasions where you want certain tasks to be carried out or even skipped entirely, based on what value a variable holds. A good recent example that I was involved in illustrates where this may become desirable. We had several release pipelines that implemented a backup of an Azure SQL database before any updates were applied. This extra step was primarily to ensure that a pre-release version of the database was available in case a rollback was required. In situations where a re-release needed to be triggered, due to the pipeline failing because of a misconfiguration or other issue not related to the release artifacts themselves, having to go through the process of backing up the database again appeared to be unnecessary and a waste of time. We, therefore, set up a variable on the pipeline, configured as follows:

Then, on the PowerShell task that performed the database backup, we navigate to the Control Options tab within the task and, firstly, select the Custom conditions option on the Run this task dropdown:

Selecting this option makes a Custom condition field appear directly underneath, which allows you to input a wide array of different conditional logic using an easy-to-understand expression language. Through this, we can straightforwardly define a function that executes the task if the value of the BackupProdDB? variable equals true:

eq(variables['BackupProdDB?'], true)

Now, when overriding this variable’s value to false at release time, the task is skipped entirely:

If a Microsoft-hosted VS2017 agent is used for the task, some additional detail is made available by hovering over the little i icon next to the task name. This tooltip will chiefly indicate how the expression was evaluated at runtime; quite useful if you are debugging:

The example shown here only scratches the surface of what the expression language can ultimately facilitate. To take things further, you could look at implementing conditional logic that (see the sample snippets after this list):

  • Performs comparisons against numbers
  • Works with array values
  • Carries out joins or concatenations on multiple variable values
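
To give a flavour of each of the above, here are a few purely illustrative snippets, using hypothetical variable names rather than anything lifted from a real pipeline:

gt(variables['FailedTestCount'], 0)
in(variables['Environment'], 'UAT', 'PreProd', 'Production')
eq(format('{0}-{1}', variables['Environment'], variables['Region']), 'Production-UKSouth')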

As this post has attempted to demonstrate, release variables open up an additional layer of functionality that can take your release pipelines to the next level. They allow those triggering releases to modify the steps involved dynamically, or even to skip them entirely, without requiring any pipeline alterations to take place. Hopefully, today’s post has given you a flavour of how to get started using them. Let me know in the comments below if you identify a use case for them yourself or if you have been able to come up with any ingenious expressions 🙂

When you’re first starting with Microsoft Azure for straightforward projects or proof of concept designs, the Portal is your go-to destination for reviewing, managing, updating and creating resources. Even for larger scale deployments, you will typically find yourself in there most of the time; what may have changed, in this scenario, is the mechanism through which you deploy new resources or any changes to existing ones. Developers have a range of options at their disposal to programmatically carry out these types of activities:

The last of these options can be of most benefit if you are already using Git source control for your projects (via GitHub, Azure DevOps, BitBucket etc.) and you have a desire to implement Continuous Integration (CI) and automated release management for your Azure templates. Both of these options enable you to validate your templates before deploying, to ensure no obvious errors occur, and to reduce the risk of human error as part of frequent software deployments. Making a move from managing your Azure resources from within the portal to Azure Templates is relatively straightforward, thanks to the options available to us to export our existing resources as templates. Notwithstanding this fact, there will still be times where you find yourself hitting a few stumbling blocks as you begin to fully understand the behaviour when deploying resources out in this manner.

An example better illustrates this issue. Let’s assume we have deployed out an App Service resource manually via the Azure Portal and, over time, we have assigned it the following Application settings:

We now decide it would be desirable to move towards utilising Azure Resource Manager templates and, as such, define the following JSON template for this resource:

{
    "$schema": "http://schema.management.azure.com/schemas/2014-04-01-preview/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "name": {
            "type": "String"
        },
        "hostingPlanName": {
            "type": "String"
        },
        "location": {
            "type": "String"
        },
        "sku": {
            "type": "String"
        },
        "serverFarmResourceGroup": {
            "type": "String"
        },
        "subscriptionId": {
            "type": "String"
        }
    },
    "resources": [
        {
            "type": "Microsoft.Web/sites",
            "apiVersion": "2016-03-01",
            "name": "[parameters('name')]",
            "location": "[parameters('location')]",
            "dependsOn": [
                "[resourceId('Microsoft.Web/serverfarms', parameters('hostingPlanName'))]"
            ],
            "properties": {
                "name": "[parameters('name')]",
                "siteConfig": {
                    "appSettings": []
                },
                "serverFarmId": "[concat('/subscriptions/', parameters('subscriptionId'),'/resourcegroups/', parameters('serverFarmResourceGroup'), '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]"
            }
        },
        {
            "type": "Microsoft.Web/serverfarms",
            "apiVersion": "2016-09-01",
            "name": "[parameters('hostingPlanName')]",
            "location": "[parameters('location')]",
            "sku": {
                "name": "[parameters('sku')]"
            },
            "properties": {
                "name": "[parameters('hostingPlanName')]",
                "numberOfWorkers": "1"
            }
        }
    ]
}

And, within Azure DevOps, we have the following Release Pipeline task created:

Note in particular the selection of the Incremental option, recommended if you want to ensure that your deployment does not accidentally delete any resources not defined in the template.
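
If you want to test the same behaviour outside of a pipeline first, the deployment can also be reproduced locally. The following is a rough sketch using the AzureRM PowerShell module, with the resource group and file names made up purely for illustration:

# Validate the template first, then deploy it in Incremental mode (names below are illustrative)
Test-AzureRmResourceGroupDeployment -ResourceGroupName "MyAppRG" -TemplateFile ".\appservice.json" -TemplateParameterFile ".\appservice.parameters.json"
New-AzureRmResourceGroupDeployment -ResourceGroupName "MyAppRG" -TemplateFile ".\appservice.json" -TemplateParameterFile ".\appservice.parameters.json" -Mode Incremental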

After using the template above as part of a release and upon navigating back to our Application settings for the App Service, we notice that all of them have vanished completely:

With the Incremental option specified above, you would be forgiven for thinking that the template deployment is “broken”, as it would appear to have done the complete opposite of what the setting implies it will do. The fault here lies with the JSON template itself, which has not been updated to include all of the Application settings needed for the App Service. During the deployment step, Azure will compare your template against the existing App Service resource and, if any Application settings are missing from the template, they are permanently deleted from the resource. We can observe this behaviour in practice by adding our Application settings back on manually and by only specifying a single setting on our Microsoft.Web/sites resource within our JSON template:

{
    "type": "Microsoft.Web/sites",
    "apiVersion": "2016-03-01",
    "name": "[parameters('name')]",
    "location": "[parameters('location')]",
    "dependsOn": [
        "[resourceId('Microsoft.Web/serverfarms', parameters('hostingPlanName'))]"
    ],
    "properties": {
        "name": "[parameters('name')]",
        "siteConfig": {
            "appSettings": [
                {
                    "name": "MyAppSetting2",
                    "value": "value456"
                }
            ]
        },
        "serverFarmId": "[concat('/subscriptions/', parameters('subscriptionId'),'/resourcegroups/', parameters('serverFarmResourceGroup'), '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]"
    }
}

Post-deployment, we can observe the following on the App Service:

So the answer, at this point, is pretty clear: update the entire JSON template to include all required Application Settings as part of your App Service:

{
    "$schema": "http://schema.management.azure.com/schemas/2014-04-01-preview/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "name": {
            "type": "String"
        },
        "hostingPlanName": {
            "type": "String"
        },
        "location": {
            "type": "String"
        },
        "sku": {
            "type": "String"
        },
        "serverFarmResourceGroup": {
            "type": "String"
        },
        "subscriptionId": {
            "type": "String"
        }
    },
    "resources": [
        {
            "type": "Microsoft.Web/sites",
            "apiVersion": "2016-03-01",
            "name": "[parameters('name')]",
            "location": "[parameters('location')]",
            "dependsOn": [
                "[resourceId('Microsoft.Web/serverfarms', parameters('hostingPlanName'))]"
            ],
            "properties": {
                "name": "[parameters('name')]",
                "siteConfig": {
                    "appSettings": [
					{
						"name": "MyAppSetting1",
						"value": "value123"
					},
					{
						"name": "MyAppSetting2",
						"value": "value456"
					},
					{
						"name": "MyAppSetting3",
						"value": "value789"
					}
					]
                },
                "serverFarmId": "[concat('/subscriptions/', parameters('subscriptionId'),'/resourcegroups/', parameters('serverFarmResourceGroup'), '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]"
            }
        },
        {
            "type": "Microsoft.Web/serverfarms",
            "apiVersion": "2016-09-01",
            "name": "[parameters('hostingPlanName')]",
            "location": "[parameters('location')]",
            "sku": {
                "name": "[parameters('sku')]"
            },
            "properties": {
                "name": "[parameters('hostingPlanName')]",
                "numberOfWorkers": "1"
            }
        }
    ]
}

As alluded to already, the Incremental deployment option can be a little bit misleading in this context, as you may naturally assume that Azure would take no action to remove anything if this option is specified. The example outlined in this post clearly illustrates that this does not apply to resource properties, which can be subject to unintended alterations if your Azure Template does not specify them explicitly. Take care when migrating across to Azure RM templates to ensure that every single resource setting that you need is copied through and, also, carry out test deployments into dedicated testing/UAT environments to verify the exact behaviour that your templates have on your existing Azure resources.
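
As a starting point for that migration, it can also help to pull down what Azure currently holds for a resource group and compare it against your hand-written template. A quick sketch using the AzureRM PowerShell module (the resource group name is illustrative):

Export-AzureRmResourceGroup -ResourceGroupName "MyAppRG" -Path ".\exported-template.json"

Bear in mind that exported templates have limitations of their own and do not always capture every setting faithfully, so treat the output as a cross-check rather than as the definitive article.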

I’ve been using Azure DevOps (previously known as Visual Studio Team Services/Team Foundation Server) for a long time now and have blogged about it pretty frequently to date:

So, in case it’s not already clear, I’ve got a lot of love and affection for the product. More recently, I’ve just finished some work migrating numerous existing Projects across to a clean tenant and, as part of this, formalising how we record our Work Items within the application. The primary reasons behind this are so that we are a) tracking everything across the various projects our team is involved in and b) able to keep the whole damn thing up to date as efficiently as possible 🙂 . A challenge to be sure, but one that I think will drive benefits in the long term as we adjust to working in a more genuinely Agile fashion.

To enable you to categorise your work items across multiple projects more straightforwardly, you can take advantage of the Areas feature within the application. Typically, you would also put in place the various iterations for your sprints alongside this, but both features can be operated in isolation if required. When first setting up your project, you would preferably define all of these from the start, before populating your Backlog. Life is not always ideal in this respect, or it could be that, having got familiar with the application, you want to take the next step and report more effectively across your various work items. In this case, you may have a few issues getting things to display how you want when using the Boards feature.

For example, let’s assume we have the following Areas defined within our Azure DevOps project:

With a couple of different work items scattered across these various areas:

All looking good so far. But when we navigate to the Team Board for the project, we see that it is empty:

Likewise, the default team’s backlog is also mysteriously lacking in content:

At this point, you may be asking yourself – where the hell are my User Stories? They are still there and fully accessible within the application when running a query, but not in the place where (perhaps) you would like yourself and members of your team to access them. It turns out that, in addition to defining your Areas within your project’s settings, there is one more setting that needs toggling within the Team configuration sub-area. To do this:

  1. Navigate to the Project settings area and click on the Team configuration option underneath Boards
  2. Select the Areas tab and a screen similar to the one below should appear:
  3. Select the ellipses icon over the default Area and select the Include sub areas option highlighted below, pressing OK on the Are you sure you want to include sub-areas dialog box:

Once confirmed, we can then navigate back to our Board and Backlog and see that our Work Items are now displaying correctly:

Much better!

The great benefit of cloud solutions, like Azure DevOps, is how quickly and easily you can get fully functioning business systems up and running, with an almost limitless number of options at your disposal. In this scenario, you can very much feel like a child in a sweet shop, as you run around and try out the various features on offer. The scenario described in this post is perhaps one area where you can steam ahead over-excitedly and not fully appreciate the impact that implementing Areas may have on other users within your Azure DevOps project. Fortunately, the issue is relatively easy to resolve and, as a result, you can use Areas in tandem with your existing team Boards and Backlog with very little work or upheaval involved.

Functional consultants or administrators who have been using Dynamics CRM / Dynamics 365 Customer Engagement (D365CE) for any considerable length of time will likely have built up a solid collection of FetchXML queries that are usable for a variety of different scenarios. Such privileged individuals are in the fortunate position of being able to leverage them in the following ways:

In other words, you have a range of useful queries that can potentially meet any needs within the application from a reporting standpoint. This is all well and good if you find yourself working solely within CRM/D365CE all the time, but when you start to bring in separate tools, such as Power BI, there can be some difficulty in migrating these across straightforwardly. Typically, you may find yourself staring down the barrel of a complicated and costly redevelopment exercise, where you have to invest a lot of time within Power Query to replicate your existing FetchXML queries as efficiently as possible; this potentially puts a lot of the hard work and investment made in FetchXML query development down the drain almost immediately.

Fortunately, there is a way in which we can leverage our FetchXML queries using Power BI. I did a post on this very subject a few years ago, where I talked through an example from start to finish. The main limitations with this were, however, 1) the inability to return more than 5000 records at a time, given that paging was not correctly incorporated and 2) the fact that you had to manually define code for every query that you wished to utilise, which would take a lot of time to do and increase the risk of human error occurring.

As usual in these situations, the wonderful CRM/D365CE community has delivered a solution to address the first issue raised above. The Power Query (M) Builder tool is a handy plugin within the XrmToolBox that allows you to generate M query code snippets that you can use within Power BI Desktop. Most importantly, the tool incorporates a solution from Keith Mescha and the former Sonoma Partners Power BI Accelerator to get around the paging issue and allow you to return unlimited data from the application. You can find out more about the tool by checking out Ulrik “The CRM Chart Guy” Carlsson’s blog post dedicated to this very subject.

The tool is undoubtedly great, but if you have numerous FetchXML queries in a raw format that you wish to process within Power BI, it could take you some time to get these moved across – particularly given that the tool does not currently support the ability to “bring your own” FetchXML queries. By using the example code provided by the tool, and carrying out some further work to address the second concern, it is possible to compartmentalise all of the above functionality into the following, easy-to-call Power Query (M) function. Simply open a new blank query within Power Query and copy & paste the below into the window:

/*
    Generate FetchXML Query Results M Function
    Required Parameters:
        crmURL = The URL of your CRM/D365CE instance e.g. https://mycrm.crm11.dynamics.com
        entityName = The OData entity name that you are querying.
        query = The FetchXML query to execute. This should NOT include the top level <fetch> node, but only all subsequent nodes with double quotes escaped e.g. <entity name=""incident""><all-attributes /></entity>
    Credits: Big thanks to the Power Query Builder tool (https://crmchartguy.com/power-query-builder/) and Keith Mescha/Sonoma Partners Power BI Accelerator
             for figuring out the paging issue. Portions of the auto-generated code from the above tool is utilised within this function.
*/



let
    Func = (crmURL as text,entityName as text,query as text) =>
let
    FullURL = Text.Combine({crmURL, "/api/data/v9.1/", entityName},""),
    QueryAll = (z as text, x as number) =>
    let
                Source = Json.Document(Web.Contents(FullURL,
                        [
                            Headers=
                                [
                               #"Prefer"="odata.include-annotations=Microsoft.Dynamics.CRM.fetchxmlpagingcookie"
                                ],
                            Query=
                                [
                                fetchXml="<fetch distinct=""True"" page=""" & Text.From(x) & """ paging-cookie=""" & z & """>" & query & "</fetch>"
                                ]
                        ]
                    )
                ),
    Paging = try Xml.Document(Source[#"@Microsoft.Dynamics.CRM.fetchxmlpagingcookie"]) otherwise null,
    Retrieve = if Paging <> null 
                then List.Combine({Source[value],@QueryAll(Text.Replace(Text.Replace(Text.Replace(Uri.Parts("http://a.b?d=" & Uri.Parts("http://a.b?d=" & Paging{0}[Attributes]{1}[Value])[Query][d])[Query][d], ">", "&gt;"), "<", "&lt;"), """", "&quot;"), x + 1)})
                else Source[value]
    in 
        Retrieve,
    GenerateEmptyTable = (query as text) =>
    let
            XML = Xml.Document(query),
            #"Expanded Value" = Table.ExpandTableColumn(XML, "Value", {"Name", "Namespace", "Value", "Attributes"}, {"Value.Name", "Value.Namespace", "Value.Value", "Value.Attributes"}),
            #"Expanded Value.Value" = Table.ExpandTableColumn(#"Expanded Value", "Value.Value", {"Name", "Namespace", "Value", "Attributes"}, {"Value.Value.Name", "Value.Value.Namespace", "Value.Value.Value", "Value.Value.Attributes"}),
            #"Expanded Value.Attributes" = Table.ExpandTableColumn(#"Expanded Value.Value", "Value.Attributes", {"Name", "Namespace", "Value"}, {"Value.Attributes.Name", "Value.Attributes.Namespace", "Value.Attributes.Value"}),
            #"Filtered Rows" = Table.SelectRows(#"Expanded Value.Attributes", each ([Value.Attributes.Name] = "name")),
            #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows",{"Name", "Namespace", "Value.Name", "Value.Namespace", "Value.Value.Name", "Value.Value.Namespace", "Value.Value.Value", "Value.Value.Attributes", "Value.Attributes.Name", "Value.Attributes.Namespace", "Attributes"}),
            #"Transposed Table" = Table.Transpose(#"Removed Columns"),
            #"Promote Headers" = Table.PromoteHeaders(#"Transposed Table", [PromoteAllScalars=true]),
            #"Added Custom" = Table.AddColumn(#"Promote Headers", "@odata.etag", each ""),
            #"Reordered Columns" = Table.ReorderColumns(#"Added Custom", List.Sort(Table.ColumnNames(#"Added Custom"), Order.Ascending))
    in
        #"Reordered Columns",
    List = QueryAll("",1),
    Table = if List.IsEmpty(List)
        then GenerateEmptyTable(query)
        else #"D365CEData",
    #"D365CEData" = Table.FromList(List, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    Expand = Table.ExpandRecordColumn( #"D365CEData", "Column1", Record.FieldNames(Table.Column(#"D365CEData", "Column1"){0})),
    D365CE = Table.ReorderColumns(Expand, List.Sort(Table.ColumnNames(Expand), Order.Ascending)),
    Results = if List.IsEmpty(List)
        then Table
        else D365CE
    in
        Results
    in 
        Func

When saved, Power BI will then generate a function that should resemble the below screenshot:

From here, you can then populate each of the required parameters as follows:

And then you are good to go! As an example, the following FetchXML query:

  <entity name=""incident"">
    <attribute name=""title"" />
    <attribute name=""ticketnumber"" />
    <attribute name=""createdon"" />
    <attribute name=""incidentid"" />
    <attribute name=""caseorigincode"" />
    <attribute name=""casetypecode"" />
    <order descending=""false"" attribute=""title"" />
    <filter type=""and"">
      <condition attribute=""createdon"" operator=""this-year"" />
      <condition attribute=""casetypecode"" operator=""in"">
        <value>2</value>
        <value>1</value>
      </condition>
    </filter>
  </entity>
  

Would return results similar to the below via the above function:
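
Expressed as a call to the function from a blank query, and assuming you have saved the function under the (arbitrarily chosen) name GetD365CEData, that example would look something along these lines, with the instance URL swapped for your own:

let
    Source = GetD365CEData("https://mycrm.crm11.dynamics.com", "incidents", "<entity name=""incident""><attribute name=""title"" /><attribute name=""ticketnumber"" /><attribute name=""createdon"" /></entity>")
in
    Source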

One limitation with this function, at present, is that I haven’t yet found a way to ensure formatted values return correctly, even when there are no results available. I’ll report back if I figure out a way to do this 🙂 A huge thanks to Keith, Ulrik and Sonoma Partners for kindly supplying the paging code snippet into the community and in helping me to build out the above function.

The Audio Conferencing add-on for Skype for Business / Microsoft Teams can oft be a requirement if you find yourself needing to schedule frequent online meetings that involve external participants. While most tech-savvy organisations these days will be fully equipped for taking calls via their laptops or other devices, a lot of businesses still do rely on traditional telephone headsets as their primary mechanism for making and receiving calls. And while most companies in the UK should already have made the jump across to use SIP / VoIP solutions, due to the impending curtain call on ISDN lines, it is unlikely they will be using a solution akin to Skype for Business / Microsoft Teams. It, therefore, becomes necessary to have a traditional audio conferencing solution in place, that allows attendees to dial in using a fixed telephone number. Microsoft’s Audio Conferencing solution meets these needs surprisingly well and comes equipped with a whole host of options that can be customised. For example, we’ve seen previously on the blog how it is possible to disable the requirement for entering a PIN for meetings, and you have additional options at your disposal, all of which are manageable via the Microsoft Teams Admin center or using PowerShell. Having your audio conferencing solution controllable from directly within the Office 365 portal can assist organisations in reducing the complexity of their IT estates.

As an add-on license, however, there are some limitations around exactly how you can get it provisioned onto your tenant. These limitations may not be too apparent if you are a Cloud Solutions Provider (CSP) with Microsoft, which gives you full control to provision Microsoft online services on behalf of your customer, at a far lower price compared with purchasing the services directly. Direct customers may also struggle to determine what they need to get up and running with Audio Conferencing, particularly if you are a small business. The question then becomes: what exactly do I need in place to get started with Audio Conferencing?

As an add-on subscription, Audio Conferencing requires that one of the following base products exist on the Office 365 tenant in question first:

  • Skype for Business (Plan 2)
  • Office 365 Enterprise E3
  • Office 365 Enterprise E5

If you already have one of the above SKUs on your Office 365 tenant, then congratulations! You are ready to get started with using Audio Conferencing within your organisation. However, if you don’t or are, for example, a CSP partner provisioning licenses for a small organisation that uses Office 365 Essentials, but wants to have access to full Audio Conferencing functionality, then you will need to address this deficiency. The two hurdles that you will need to overcome are:

  1. The tenant in question will require one of the SKUs listed above to be provisioned first. Attempting to assign Audio Conferencing licenses beforehand will cause errors, and the licenses will not provision correctly.
  2. Each user that requires Audio Conferencing must also be assigned (at least) a Skype for Business (Plan 2) license. Once both licenses have been assigned, the user will then receive an automated email from Microsoft containing their unique conference ID and PIN.

To avoid the potentially prohibitive cost of an E3 or E5 plan, the most cost-effective option would be to provision a Skype for Business Online (Plan 2) license (approx. £4 per user, per month) or an Office 365 Business Premium (approx. £9 per user, per month) license. The second of these options includes the Plan 2 license, alongside a whole range of other functionality, so it would be the most natural option to turn to if, for example, you are currently using Office 365 Business or Essentials.
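
If you are provisioning these licenses for multiple users, the assignment can also be scripted. Below is a rough sketch using the MSOnline PowerShell module; the user and the tenant-specific SKU names are purely illustrative, so run Get-MsolAccountSku first to confirm the exact SKU identifiers available on your tenant:

Connect-MsolService
# Assign the base Skype for Business Online (Plan 2) license first, then the Audio Conferencing add-on
Set-MsolUserLicense -UserPrincipalName "jane.doe@contoso.com" -AddLicenses "contoso:MCOSTANDARD"
Set-MsolUserLicense -UserPrincipalName "jane.doe@contoso.com" -AddLicenses "contoso:MCOMEETADV"

Assigning the base license before the add-on mirrors the ordering constraint described above.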

Hopefully, this post has made it clear how exactly you can go about provisioning Audio Conferencing functionality for an organisation, either from an end-user perspective or as a CSP provider offering services to your customer.