A few weeks ago, I wrote a post on the process involved in migrating Microsoft Azure Cloud Solution Provider (CSP) subscriptions across tenants. Having done some actual work relating to this since then (shock horror!), I thought I'd follow up with a new post, sharing some additional thoughts and lessons learned.
One of the best things about Azure Data Factory is how quickly you can incorporate continuous integration and automated deployments alongside your solution. However, if you're working with SQL Server data sources and are using square brackets to interact with tables, then you may be in for a bumpy ride…
Dealing with “SQL Bulk Copy failed due to received an invalid column length from the bcp client” Errors in Azure Data Factory
When you are in the midst of a pesky IT issue, it can be difficult to determine whether the problem is down to a bug/system fault or to human error. As this recent example involving Azure Data Factory illustrates, it is generally best to assume the latter to avoid any prolonged difficulty.
Sink Limitations with the Dynamics 365 Customer Engagement/Common Data Service Connector for Azure Data Factory
The Dynamics 365 Customer Engagement/Common Data Service connector for Azure Data Factory can, in most cases, meet your data integration needs. However, it is worth highlighting the two field types that are not supported: namely, the Customer and Owner field types.
By default, the Dynamics 365 Customer Engagement connector for Azure Data Factory V2 exposes a predefined list of fields that must have data mapped to them for any Copy Data task to complete successfully. This behaviour can be impractical depending on your particular scenario; fortunately, there is a way to override it.
Azure Data Factory V2 has not one, but three separate connectors that all claim to hook up to Dynamics 365 Customer Engagement/Dynamics CRM! So which connector is the "right" one to use, and how do they differ? With a little help from Alan Partridge, we can clear up any confusion...
When working with Azure templates for the first time, there's always a risk of misconfiguring a setting. That's all fine when testing, until you realise you have been charged an unexpected amount on your credit card. In this post, I provide an example of this in practice when working with Stream Analytics Job resources.
Depending on the type of data you are working with in Power BI, you may find yourself unable to leverage Power Query to perform any required data transformation. In this scenario, such as when working with streaming Power BI datasets fed by Stream Analytics, DAX can come to the rescue, and we'll see how in this post.
The introduction of Azure Data Factory V2 represents the most opportune moment for data integration specialists to start investing their time in the product. Version 1 (V1) of the tool, which I started taking a look at last year, lacked a lot of critical functionality that - in most typical circumstances - I could do [...]
This may be somewhat obvious from some of my more recent blog posts, but I am a huge fan of Application Insights at the moment. The best analogy I can use for it is that it's a web developer's Swiss Army Knife that really puts you in the driving seat of understanding and maintaining your [...]