SQL Server Integration Services (SSIS) package execution can always throw up a few spanners, particularly when it comes to deploying packages out to a SQL Server SSISDB catalog – in essence, a specialised database for storing .dtsx packages, execution settings and other handy profile information to assist with automation. Problems can generally start creeping in if you decide to utilise non-standard connectors for your package data sources. For example, instead of employing the more commonly utilised Flat File Connection Manager for .csv file interaction, there may be a requirement to use the Excel Connection Manager instead. While I would generally favour the former Connection Manager where possible, the need to handle .xlsx file inputs (and to output into this file format) comes up more often than you might think. Bearing this in mind, it is always necessary to consider the impact that deploying out what I would term a “non-standard Connection Manager” (i.e. a non-Flat File Connection Manager) can have on your package release. Further, you should give some serious thought towards any steps that may need to be taken within your Production environment to ensure a successful deployment.

With all of this in mind, you may encounter the following error message when deploying out a package that utilises the ADO.NET Connector for MySQL – a convenient driver released by Oracle that lets you connect straightforwardly with MySQL Server instances, à la the default SQL Server ADO.NET connector:

Error: Microsoft.SqlServer.Dts.Runtime.DtsCouldNotCreateManagedConnectionException: Could not create a managed connection manager. at Microsoft.SqlServer.Dts.Runtime.ManagedHelper.GetManagedConnection

Specifically, this error will appear when first attempting to execute your package within your Production environment. The good news is that the cause of this error can be easily explained and, with minimal effort, resolved.

The reason why this error occurs is that the appropriate ADO.NET MySQL driver is missing from your target SSISDB server. There is no mechanism for dependent components to be transported as part of deploying a package to a catalog, meaning that the appropriate driver has to be downloaded and installed on the server that executes the packages to resolve the error. Sometimes, as part of long development cycles, this critical step can be overlooked by the package developer. Or it could be that the individual or team responsible for managing deployments is not well-briefed ahead of time on any additional requirements or dependencies needed as part of a release.
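If you need to confirm the underlying cause from the server side, the SSISDB catalog logs the full error text for every execution, and this can be retrieved with a query along the lines of the sketch below (this assumes your packages are deployed to the catalog, as per the scenario above):

--Retrieve error messages logged against recent SSISDB catalog executions
SELECT e.execution_id,
       e.package_name,
       em.message_time,
       em.message
FROM SSISDB.catalog.executions AS e
	INNER JOIN SSISDB.catalog.event_messages AS em
		ON em.operation_id = e.execution_id
WHERE em.message_type = 120 --120 = Error
ORDER BY em.message_time DESC;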

For this particular example, getting things resolved is as simple as downloading and installing the latest version of the MySQL Connector/NET drivers onto the SSISDB server, which can be found at the link below:

MySQL Connector/NET 8.0

If you find yourself in a similar situation that does not involve the above data connector, then your best bet is to interrogate the package in question further and identify the appropriate drivers that are needed.

Now, the key thing to remember in all of this is that the driver versions on the client development machine and the SSISDB server need to be precisely the same. Otherwise, you will more than likely get another error message on package execution, resembling this:

Could not load file or assembly ‘MySql.Data, Version=6.10.4.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d’ or one of its dependencies. The located assembly’s manifest definition does not match the assembly reference.

In this case, you will need to resolve the version conflict, ideally by ensuring that both machines are running the latest version of the corresponding driver. An uninstall and server reboot could be necessary at this juncture, so be sure to tread cautiously.

SSIS development can often feel like a protracted, but ultimately worthwhile, process. With this in mind, it is natural to expect some bumps in the road and for potentially essential steps to be overlooked, particularly in larger organisations or for more complex deployments. Putting appropriate thought towards release management notes, and even dedicated testing environments for deployments, can help to mitigate the problem that this post references, ensuring a smooth pathway towards a prosperous, error-free release 🙂

When considering whether or not to shift your existing SQL workloads to a single database offering on Azure SQL, one of the major pros is the breadth of capabilities the service can offer when compared with other vendors or with SQL Server on an Azure Virtual Machine. These include:

  • High feature parity with the latest on-premise SQL Server offering.
  • Built-in support for Enterprise product features, such as Transparent Database Encryption.
  • Security management features, such as firewalls and (optional) integration with Azure SQL Database Threat Detection for proactive monitoring.
  • Ability to quickly scale a database from a 2GB database with low CPU consumption to a mammoth 4TB database, with a significant pool of CPU/memory resources to match.

It is the last of these that makes Azure SQL Database a particularly good fit for web application deployments that have unpredictable user loads at the time of deployment or, as we have seen previously on the blog, when you want to deploy out a LOB reporting database that houses Dynamics 365 Customer Engagement instance data. Administrators can very straightforwardly scale or downscale a database at any time within the portal or, if you are feeling particularly clever, you can look to implement automatic scaling based on Database Transaction Unit (DTU) consumption. This helps to keep your query execution times as speedy as possible.
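As a starting point for this kind of DTU-based decision making, the sys.dm_db_resource_stats view reports the resource consumption of the database over roughly the last hour. The query below is a minimal sketch that averages the headline percentages, which you could then use to judge whether a scale up/down is warranted:

--Review average resource consumption for the current Azure SQL database.
--sys.dm_db_resource_stats holds roughly the last hour of 15 second samples
SELECT AVG(avg_cpu_percent) AS AvgCPUPercent,
       AVG(avg_data_io_percent) AS AvgDataIOPercent,
       AVG(avg_log_write_percent) AS AvgLogWritePercent,
       MAX(end_time) AS LastSampleTime
FROM sys.dm_db_resource_stats;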

Database scaling, I have found, is very straightforward to get your head around and works like a charm for the most part…except, of course, when you get rather cryptic error messages like the one demonstrated below:

I got this error recently when attempting to scale an S0 5GB database down to the Basic 2GB tier. To cut a long story short, I had temporarily scaled up the database to give me increased DTU capacity for a particularly intensive query and wanted to scale it back to its original pricing tier. You can perhaps understand my confusion as to why this error was occurring. After further research and escalation to Microsoft, it turned out that the database files were still holding on to allocated but unused disk space on the platform, thereby exceeding the size limit imposed by the lower pricing tier. To resolve the issue, there are some tasks that need to be performed on the database to get it into a “downscale-ready” state. These consist of a series of T-SQL scripts, which I would caution against running while the database is in use, due to potential performance impacts. If you have found yourself in the same boat as me and are happy to proceed, the steps involved are as follows:

  1. To begin with, the script below will execute the DBCC SHRINKFILE command against the database file, shrinking it down towards the value specified in the @DesiredFileSize parameter. The script is written to perform the shrinking in “chunks”, based on the value of the @ShrinkChunkSize parameter, which may be useful in managing DTU consumption:
SET NOCOUNT ON

DECLARE @CurrentFileSize INT, @DesiredFileSize INT, @ShrinkChunkSize INT, @ActualSizeMB INT,
		@ErrorIndication INT, @dbFileID INT = 1, @LastSize INT, @SqlCMD NVARCHAR(MAX),
		@msg NVARCHAR(100)

/*Set these values for the current operation, size is in MB*/
SET @DesiredFileSize = 2000  /* filesize is in MB */
SET @ShrinkChunkSize = 50 /* chunk size is in MB */

SELECT @CurrentFileSize = size/128 FROM sysfiles WHERE fileid = @dbFileID

SELECT @ActualSizeMB = (SUM(total_pages) / 128) FROM sys.allocation_units

SET @msg = 'Current File Size: ' + CAST(@CurrentFileSize AS VARCHAR(10)) + 'MB'
RAISERROR(@msg,0,0) WITH NOWAIT

SET  @msg = 'Actual used Size: ' + CAST(@ActualSizeMB AS VARCHAR(10)) + 'MB'
RAISERROR(@msg,0,0) WITH NOWAIT

SET @msg = 'Desired File Size: ' + CAST(@DesiredFileSize AS VARCHAR(10)) + 'MB'
RAISERROR(@msg,0,0) WITH NOWAIT

SET @msg = 'Iteration shrink size: ' + CAST(@ShrinkChunkSize AS VARCHAR(10)) + 'MB'
RAISERROR(@msg,0,0) WITH NOWAIT

SET @ErrorIndication = CASE
							WHEN @DesiredFileSize > @CurrentFileSize THEN 1
							WHEN @ActualSizeMB > @DesiredFileSize THEN 2
							ELSE 0 END

IF @ErrorIndication = 1  
	RAISERROR('[Error] Desired size bigger than current size',0,0) WITH NOWAIT
IF @ErrorIndication = 2  
	RAISERROR('[Error] Actual size is bigger than desired size',0,0) WITH NOWAIT
IF @ErrorIndication = 0 
	RAISERROR('Desired Size check - OK',0,0) WITH NOWAIT

SET @LastSize = @CurrentFileSize + 1

WHILE @CurrentFileSize > @DesiredFileSize /*check if we got the desired size*/ AND @LastSize>@CurrentFileSize /* check if there is progress*/ AND @ErrorIndication=0
BEGIN
	SET @msg = CAST(GETDATE() AS VARCHAR(100)) + ' - Iteration starting'
	RAISERROR(@msg,0,0) WITH NOWAIT
	SELECT @LastSize = size/128 FROM sysfiles WHERE fileid = @dbFileID
	SET @sqlCMD = 'DBCC SHRINKFILE('+ CAST(@dbFileID AS VARCHAR(7)) + ',' + CAST(@CurrentFileSize-@ShrinkChunkSize AS VARCHAR(7)) + ') WITH NO_INFOMSGS;'
	EXEC (@sqlCMD)
	SELECT @CurrentFileSize = size/128 FROM sysfiles WHERE fileid  =@dbFileID
	SET @msg = CAST(getdate() AS VARCHAR(100)) + ' - Iteration completed. current size is: ' + CAST(@CurrentFileSize AS VARCHAR(10))
	RAISERROR(@msg,0,0) WITH NOWAIT
END
PRINT 'Done' 
  2. With the database successfully shrunk, verify that the size of the database does not exceed your target @DesiredFileSize value by running the following queries:
SELECT * FROM sys.database_files
 
SELECT (SUM(reserved_page_count) * 8192) / 1024 / 1024 AS DbSizeInMB
FROM    sys.dm_db_partition_stats
  3. Although, by this stage, the database file sizes should be under 2GB, the maximum size of the database is still set to match the pricing tier level. To fix this, execute the following script, substituting the name of your database where appropriate:
ALTER DATABASE MyDatabase MODIFY (MAXSIZE=2GB) 

 You can confirm that this command has been executed successfully by then running the following query and reviewing the output:

SELECT CAST(DATABASEPROPERTYEX ('MyDatabase', 'MaxSizeInBytes') AS FLOAT)/1024.00/1024.00/1024.00 AS 'DB Size in GB'
  4. With the above commands executed, you are now in a position to scale down your database without issue. There are a few ways this can be done but, as you likely already have SQL Server Management Studio or similar open to run the above queries, you can modify the tier of your database via this handy script:
--Scaling down to Basic is easy, as there is only one Max Size/Service Level Objective
--Therefore, just specify Edition

ALTER DATABASE MyDatabase MODIFY (EDITION = 'Basic');

--For other tiers, specify the size of the DB.
--In this example, we are scaling down from Premium P1 1TB to Standard S2 250GB tier

ALTER DATABASE MyDatabase MODIFY (EDITION = 'Standard', MAXSIZE = 250 GB, SERVICE_OBJECTIVE = 'S2');

Although the script will likely execute immediately and indicate as such in any output, the actual scaling operation on the backend Azure platform can take some time to complete – usually about 5-10 minutes for smaller databases.
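If you want to keep tabs on the scaling operation itself, the sys.dm_operation_status view can be queried from the master database of your logical server. A quick sketch, assuming a connection to master:

--Run against the master database of the logical server to monitor
--ongoing management operations, such as a service tier change
SELECT resource_type_desc,
       major_resource_id AS database_name,
       operation,
       state_desc,
       percent_complete,
       start_time,
       last_modify_time
FROM sys.dm_operation_status
ORDER BY start_time DESC;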

Whilst I was relieved that a workaround was available to get the database scaled down correctly, it would have been useful if the above error message was signposted better or if there was some kind of online support article detailing that this could be a potential issue when moving a database between various pricing/sizing tiers. Hopefully, by sharing the above steps, others who are in the same boat can very quickly diagnose and resolve the issue without hammering their credit cards with increased database usage charges in the process. 🙂

This week’s blog post is sponsored by ActiveCrypt Software.

Encryption appears to be a topic of near-constant discussion at the moment, spearheaded primarily by the impending deadline of the General Data Protection Regulation (GDPR). This is, in essence, a new set of data protection rules that will apply to all organisations operating within the European Economic Area (EEA). A key aspect of the regulation concerns implementing appropriate technical controls over sensitive data categories, to mitigate any damage resulting from a data breach. Now, the key thing to highlight here is the “proportionality” aspect; i.e. any technical controls implemented should be reasonably expected, based on the size of the organisation in question and the nature of its data processing/controlling activity. You should, therefore, be carefully evaluating your organisation to identify whether the lack of encryption could result in damage to a data subject.

I’ve had a look previously at database encryption in the context of Dynamics 365 Customer Engagement. What is nice about the application, and nearly all of Microsoft’s Software as a Service (SaaS) products at the moment, is that GDPR is very much at the centre of each individual offering. I have been genuinely impressed to see the level of effort Microsoft has been devoting to GDPR and in ensuring their SaaS product lines are compliant with the regulations – often without the need for charging customers an arm and a leg in the process. The same can perhaps not be said for any on-premise equivalent of a particular SaaS product. This is, to be fair, expected – Microsoft has been incredibly vocal about adopting a “cloud first” strategy in all things. But for organisations who do find themselves having to support on-premise applications or database systems, the journey towards implementing the required technical solutions for encryption could be rocky.

Case in point – SQL Server has long provided the capability to implement Transparent Database Encryption (TDE), which satisfies the requirement for encryption at rest without the need to redevelop applications from the ground up. Setting up Transparent Database Encryption can be an onerous process (more on this in a second), and requires manual scripting. The following script outlines all the steps involved:

--First, a Master Key should be created on the Server instance

USE master;  
GO  
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'mymasterkey';  
GO

--Next, a Certificate for the Server should be created.

CREATE CERTIFICATE MyCert WITH SUBJECT = 'DEK Certificate for testing purposes';  
GO

--This then allows for a Database Encryption Key to be created for encrypting a database. This needs to be created for
--EVERY database that requires encryption

USE EncryptionTest;  
GO  
CREATE DATABASE ENCRYPTION KEY  
WITH ALGORITHM = AES_256  
ENCRYPTION BY SERVER CERTIFICATE MyCert;  
GO  

--Once created, Encryption can then be enabled/disabled using the snippets below

ALTER DATABASE MyTestDatabase 
SET ENCRYPTION ON;  
GO

ALTER DATABASE MyTestDatabase 
SET ENCRYPTION OFF;
GO

--The Server Certificate should be backed up for disaster recovery scenarios or to enable databases to be restored to
--other SQL Server instances. First, backup the certificate with an encrypted private key...

USE master;
GO
BACKUP CERTIFICATE MyCert TO FILE = 'C:\MyCert.cer'  
    WITH PRIVATE KEY ( FILE = 'C:\MyCert.pvk',
    ENCRYPTION BY PASSWORD = 'mypassword');
GO  

--Once saved, execute the following code on the target instance to restore the certificate...

CREATE CERTIFICATE MyCert FROM FILE ='C:\MyCert.cer'
	WITH PRIVATE KEY(FILE='C:\MyCert.pvk', DECRYPTION BY PASSWORD='mypassword');
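With the above in place, the encryption state of each database on the instance can be verified via the sys.dm_database_encryption_keys view; a quick check along the lines of the sketch below (an encryption_state value of 3 indicates that encryption has completed):

--Verify the encryption state of each database on the instance.
--An encryption_state value of 3 indicates encryption has completed
SELECT DB_NAME(dek.database_id) AS DatabaseName,
       dek.encryption_state,
       dek.key_algorithm,
       dek.key_length,
       d.is_encrypted
FROM sys.dm_database_encryption_keys AS dek
	INNER JOIN sys.databases AS d
		ON d.database_id = dek.database_id;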

Whilst TDE is a neat solution, it does have some issues:

  • It’s important, when working with TDE, to keep in mind any potential disaster recovery scenario by backing up the server certificate to a separate physical location. The above script provides the necessary snippet to accomplish this, so it is imperative that this is done for every certificate you plan to work with.
  • All required configuration steps have to be accomplished via scripting and the feature is not enabled by default, unlike on Azure SQL Database, where it is. Depending on your level of expertise when working with SQL Server, you may have to leverage assistance from other sources to get up and running with the feature.
  • Perhaps the biggest barrier to adopting TDE is the version restrictions. It is only made available as part of the Developer and Enterprise editions of SQL Server. As the name suggests, the Developer edition is licensed strictly for non-Production environments and the Enterprise edition has a staggering cost, licensed based on the number of cores the target server is running. To put this into better context, I was recently quoted a whopping £68,000 through Microsoft Volume Licensing! For most organisations, this can result in an incredibly high cost of ownership just to satisfy a single requirement.

Fortunately, for those wanting to implement database encryption via an accessible interface, there are a number of products available on the market to assist. The best one I have come across is DbDefence from ActiveCrypt, which offers a simple-to-use and efficient means of configuring encryption for your databases. In fact, depending on your database size, you can more than likely have your databases encrypted in less than 5 minutes 🙂 Let’s take a closer look at how straightforward the software is to use by encrypting a database from scratch:

  1. After downloading the installation package, you will need to run it on the server where your SQL Server instance resides. During the installation process, the Full installation option can be selected and you will also need to specify the SQL Server instance that you wish to utilise with the software:

  2. After the installation completes successfully, launch the application and then connect to your target SQL Server instance. Next, select the database that you want to encrypt. You should see a window similar to the below if done correctly:
  3. At this point, you could choose to accept the default Encryption and Protection options and proceed to the next step. However, I would recommend changing the options as follows:
    • Modify the AES Encryption Options value to 256-bit. Whilst the risk of a successful brute-force attack against either 128-bit or 256-bit keys is effectively zero, 256-bit uses a longer key and is, therefore, more secure.
    • In most cases, you just need to ensure data is encrypted at rest and not provide any additional access restrictions beyond this. In these situations, I would recommend setting the required level of protection to Only Encryption. Maximum Transparency. This negates the need for any additional configuration after encryption to ensure your client applications still work successfully.

  4. To encrypt the database, a password/key is required. You should always ensure you utilise a long, random password that contains upper/lower case letters, numbers and symbols. I would also recommend having a separate password for each database you encrypt and ensuring that these are all stored separately (as they may be required to decrypt the databases at a later date). The length of the password to use will depend on the AES encryption mode, but if you are using 256-bit, then an 18 character password is recommended.
  5. When you are ready to start the encryption process, press the Encrypt button and confirm the warning box that appears:

Give it a few minutes and you will then be able to see in the main window that your database has been encrypted successfully:

If you ever have the requirement to decrypt the database, then you can return to the application at any time, connect up to the database, enter the password and then press Decrypt:

  6. As a final step, you can then test that your database files have been encrypted successfully by attempting to mount the encrypted database files onto a separate SQL Server instance. You should get an error message similar to the below, indicating that your database has been encrypted successfully:
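For reference, the attach attempt itself is nothing more exotic than a standard CREATE DATABASE … FOR ATTACH statement; the database name and file paths below are purely illustrative:

--Illustrative only: attempt to attach the copied database files on a
--separate SQL Server instance. With the files encrypted by DbDefence,
--the attach attempt should fail with an error
CREATE DATABASE EncryptionTest
ON (FILENAME = 'C:\Data\EncryptionTest.mdf'),
   (FILENAME = 'C:\Data\EncryptionTest_log.ldf')
FOR ATTACH;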

Conclusions or Wot I Think

The world of encryption can be a veritable nightmare to those approaching it for the first time, and GDPR can be blamed – but also, I would argue, welcomed – for raising the profile of the topic recently. As with a lot of things concerning GDPR, there is a real opportunity for organisations to get a handle on the personal data they work with every day and to implement the required processes and systems to ensure the right thing is being done when handling sensitive data. Database encryption is one weapon in your arsenal when it comes to satisfying a number of areas within GDPR; but, as we have seen, the total cost of ownership and technical expertise required to implement such a solution could – regrettably – force many to simply look the other way when it comes to securing their databases. DbDefence assists greatly in both these regards – by significantly reducing cost and providing a simplified, easy-to-use interface to deploy database encryption within minutes. What’s great as well is that, as part of evaluating the software, I found the support team at ActiveCrypt incredibly responsive and helpful in dealing with the queries I had around the product. If you are looking for a cheaper, yet wholly effective, solution to implement database encryption for SQL Server, then I would not hesitate to recommend the DbDefence product.

Slight change of pace with this week’s blog post, which will be a fairly condensed and self-indulgent affair – due to personal circumstances, I have been waylaid somewhat when it comes to producing content for the blog, and I have also been unable to make any further progress with my new YouTube video series. I am hoping that normal service will resume shortly, meaning additional videos and more content-rich blog posts, so stay tuned.

I’ve been running the CRM Chap blog for just over 2 years now. Over this time, I have been humbled and proud to have received numerous visitors to the site, some of whom have been kind enough to provide feedback or to share some of their Dynamics CRM/365 predicaments with me. Having reached such a landmark, now seems as good a time as any to take a look back on the posts that have received the most attention and, potentially, to give those who missed them the opportunity to read them. In descending order, here is the list of the most viewed posts to date on the crmchap.co.uk website:

  1. Utilising SQL Server Stored Procedures with Power BI
  2. Installing Dynamics CRM 2016 SP1 On-Premise
  3. Power BI Deep Dive: Using the Web API to Query Dynamics CRM/365 for Enterprise
  4. Utilising Pre/Post Entity Images in a Dynamics CRM Plugin
  5. Modifying System/Custom Views FetchXML Query in Dynamics CRM
  6. Grant Send on Behalf Permissions for Shared Mailbox (Exchange Online)
  7. Getting Started with Portal Theming (ADXStudio/CRM Portals)
  8. Microsoft Dynamics 365 Data Export Service Review
  9. What’s New in the Dynamics 365 Developer Toolkit
  10. Implementing Tracing in your CRM Plug-ins

I suppose it is a testament to the blog’s stated purpose that posts covering areas not exclusive to Dynamics CRM/365 rank so highly on the list; indeed, it demonstrates how deeply this application is intertwined with other technology areas within the Microsoft “stack”.

To all new and long-standing followers of the blog, thank you for your continued support and appreciation for the content 🙂

If you are looking at automating the execution of SQL Server Integration Services .dtsx packages, then there are a few options at your disposal. The recommended and most streamlined route is to utilise the SSISDB catalog and deploy your packages to the catalog from within Visual Studio. This gives you additional flexibility when working with SQL Server 2016 or greater, as you can choose whether to deploy out single packages or multiple packages together. An alternative approach is to deploy your packages to the file system and then configure an Agent Job on SQL Server to execute the package on a schedule and with the required runtime settings specified. This is as simple as selecting the appropriate dropdown option on the Agent Job Step configuration screen and setting the Package source value to File system:

Deploying out in this manner is useful if you are restricted from setting up the SSISDB catalog on your target SQL Server instance or if you are attempting to deploy packages to a separate Active Directory domain (I have encountered this problem previously, much to my chagrin). You also have the benefit of utilising other features available via the SQL Server Agent, such as e-mail notifications on the failure/success of a job or conditional processing logic for job step(s). The in-built scheduling tool is also pretty powerful, enabling you to fine-tune your automated package execution to just about any schedule you could conjure up.
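For reference, the same job step that the configuration screen produces can also be scripted directly in T-SQL. The snippet below is a rough sketch only – the job name and package path are hypothetical, and the job itself, schedule and notifications would need to be created separately:

--Sketch only: add a job step to an existing Agent job that executes a
--.dtsx package deployed to the file system via the SSIS subsystem.
--The job name and package path below are purely illustrative
EXEC msdb.dbo.sp_add_jobstep
	@job_name = N'Nightly Package Run',
	@step_name = N'Execute MyPackage',
	@subsystem = N'SSIS',
	@command = N'/FILE "C:\SSISPackages\MyPackage.dtsx" /CHECKPOINTING OFF /REPORTING E';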

I encountered a strange issue recently with a DTSX package configured via the SQL Agent. Quite randomly, the package suddenly started failing each time it was scheduled for execution, with the following error generated in the log files:

Failed to decrypt an encrypted XML node because the password was not specified or not correct. Package load will attempt to continue without the encrypted information.

The issue was a bit of a head-scratcher, and a colleague and I tried the following steps in an attempt to fix it:

  • Forcing the package to execute manually generated the same error – this one was a bit of a longshot but worth trying anyway 🙂
  • When executing the package from within Visual Studio, no error was encountered and the package executed successfully.
  • After replacing the package on the target server with the package just executed in Visual Studio (same version) and manually executing it, once again the same error was thrown.

In the end, the issue was resolved by deleting the Agent Job and creating it again from scratch. Now, if you are diagnosing the same issue and are looking to perform these same steps, it may be best to use the Script Job as… option within SQL Server Management Studio to save yourself any potential headache when re-building your Job’s profile:

Then, for good measure, perform a test execution of the Job via the Start Job at Step… option to verify everything works.
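The same test can also be kicked off from a query window via sp_start_job, should you prefer; the job name below is illustrative:

--Start the re-created Agent job from T-SQL; the job name is illustrative
EXEC msdb.dbo.sp_start_job @job_name = N'Nightly Package Run';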

I am still stumped at just what exactly went wrong here, but it is good to know that an adapted version of the ancient IT advice of yore can be referred back to…