SQL Server Integration Services (SSIS) package execution can always throw up a few spanners, particularly when it comes to deploying packages out to a SQL Server SSISDB catalog – principally, a specialised database for the storage of .dtsx packages, execution settings and other handy profile information to assist with automation. Problems can start creeping in if you decide to utilise non-standard connectors for your package data sources. For example, instead of employing the more often utilised Flat File Connection Manager for .csv file interaction, there may be a requirement to use the Excel Connection Manager instead. While I would generally favour the Flat File Connection Manager where possible, the need to handle .xlsx file inputs (and to output into this file format) comes up more often than you might think. Bearing this in mind, it is always necessary to consider the impact that deploying out what I would term a “non-standard Connection Manager” (i.e. anything other than a Flat File Connection Manager) can have on your package release. Further, you should give some serious thought to any steps that may need to be taken within your Production environment to ensure a successful deployment.

With all of this in mind, you may encounter the following error message when deploying out a package that utilises the ADO.NET Connector for MySQL – a convenient driver released by Oracle that lets you connect straightforwardly to MySQL instances, à la the default SQL Server ADO.NET connector:

Error: Microsoft.SqlServer.Dts.Runtime.DtsCouldNotCreateManagedConnectionException: Could not create a managed connection manager. at Microsoft.SqlServer.Dts.Runtime.ManagedHelper.GetManagedConnection

Specifically, this error will appear when you first attempt to execute your package within your Production environment. The good news is that the reason for this error can be easily explained and, with minimal effort, resolved.

The reason why this error may be happening is that the appropriate ADO.NET MySQL driver is missing from your target SSISDB server. There is no mechanism for dependent components to be transported as part of deploying a package to the catalog, meaning that we have to resort to downloading and installing the appropriate driver on the server that executes the packages to resolve the error. Sometimes, as part of a long development cycle, this critical step can be overlooked by the package developer. Or it could be that a different individual/team responsible for managing deployments is not well-briefed ahead of time on any additional requirements or dependencies needed as part of a release.
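If you need to confirm exactly what the catalog recorded at the point of failure, the SSISDB logging views can be queried directly. A minimal sketch, assuming the default catalog name – the TOP value and ordering are simply illustrative:

-- Inspect the most recent error messages logged by the SSISDB catalog.
SELECT TOP (50)
	em.message_time,
	em.package_name,
	em.message_source_name,
	em.message
FROM SSISDB.catalog.event_messages AS em
WHERE em.event_name = N'OnError'
ORDER BY em.message_time DESC;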

For this particular example, getting things resolved is as simple as downloading the latest version of the MySQL Connector/NET drivers from the link below and installing them onto the SSISDB server:

MySQL Connector/NET 8.0

If you find yourself in the same situation, but with a different Data Connector involved, then your best bet is to interrogate the package in question further and identify the appropriate drivers that are needed.

Now, the key thing to remember in all of this is that the driver version on the client development machine and on the SSISDB server needs to be precisely the same. Otherwise, you will more than likely get another error message on package execution, resembling this:

Could not load file or assembly ‘MySql.Data, Version=6.10.4.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d’ or one of its dependencies. The located assembly’s manifest definition does not match the assembly reference.

In which case, you will need to resolve the version conflict, ideally by ensuring that both machines are running the latest version of the corresponding driver. An uninstall and server reboot could be necessary at this juncture, so be sure to tread cautiously.

SSIS development can often feel like a protracted, but ultimately worthwhile, process. With this in mind, it is natural to expect some bumps in the road and for potentially essential steps to be overlooked, particularly in larger organisations or for more complex deployments. Putting appropriate thought towards release management notes, and even dedicated testing environments for deployments, can help to mitigate the problem that this post references, ensuring a smooth pathway towards a prosperous, error-free release 🙂

If you are looking at automating the execution of SQL Server Integration Services .dtsx packages, then there are a few options at your disposal. The recommended and most streamlined route is to utilise the SSISDB catalog and deploy your packages to the catalog from within Visual Studio. This gives you additional flexibility when working with SQL Server 2016 or greater, letting you choose whether to deploy single or multiple packages together. An alternative approach is to deploy your packages to the file system and then configure an Agent Job on SQL Server to execute the package on a schedule, with runtime settings specified. This is as simple as selecting the appropriate dropdown option on the Agent Job Step configuration screen and setting the Package source value to File system.
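For reference, the same Job can also be created entirely in T-SQL against msdb – a minimal sketch, where the job name, package path, dtexec switches and schedule are all illustrative:

-- Create an Agent Job that runs a file-system .dtsx package on a daily schedule.
USE msdb;
GO

EXEC dbo.sp_add_job
	@job_name = N'Nightly Package Run';

-- The SSIS subsystem passes the @command string to dtexec.
EXEC dbo.sp_add_jobstep
	@job_name = N'Nightly Package Run',
	@step_name = N'Execute package',
	@subsystem = N'SSIS',
	@command = N'/FILE "C:\Packages\MyPackage.dtsx" /CHECKPOINTING OFF /REPORTING E';

-- Schedule: daily at 3am.
EXEC dbo.sp_add_jobschedule
	@job_name = N'Nightly Package Run',
	@name = N'Daily at 3am',
	@freq_type = 4,
	@freq_interval = 1,
	@active_start_time = 030000;

-- Target the local server.
EXEC dbo.sp_add_jobserver
	@job_name = N'Nightly Package Run';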

Deploying out in this manner is useful if you are restricted from setting up the SSISDB catalog on your target SQL Server instance or if you are attempting to deploy packages to a separate Active Directory domain (I have encountered this problem previously, much to my chagrin). You also have the benefit of utilising other features available via the SQL Server Agent, such as e-mail notifications on failure/success of a job or conditional processing logic for job step(s). The in-built scheduling tool is also pretty powerful, enabling you to fine-tune your automated package execution to almost any itinerary you could conjure up.

I encountered a strange issue recently with a .dtsx package configured via the SQL Server Agent. Quite randomly, the package suddenly started failing each time it was scheduled for execution, with the following error generated in the log files:

Failed to decrypt an encrypted XML node because the password was not specified or not correct. Package load will attempt to continue without the encrypted information.

The issue was a bit of a head-scratcher, with a colleague and I trying the following steps in an attempt to fix it:

  • Forcing the package to execute manually generated the same error – this one was a bit of a longshot but worth trying anyway 🙂
  • When executing the package from within Visual Studio, no error was encountered and the package executed successfully.
  • After replacing the package on the target server with the one just executed in Visual Studio (the same version) and manually executing it, once again the same error was thrown.

In the end, the issue was resolved by deleting the Agent Job and creating it again from scratch. Now, if you are diagnosing the same issue and are looking to perform these same steps, it may be best to use the Script Job as option within SQL Server Management Studio first, to save yourself from any potential headache when re-building your Job’s profile.
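Scripting the Job first leaves you with a re-runnable CREATE script; dropping the suspect Job can then be done safely with an existence check. A minimal sketch, with the job name being illustrative:

-- Drop the existing (suspect) Agent Job before re-creating it
-- from the script that SSMS generated via Script Job as.
IF EXISTS (SELECT 1 FROM msdb.dbo.sysjobs WHERE [name] = N'Nightly Package Run')
	EXEC msdb.dbo.sp_delete_job @job_name = N'Nightly Package Run';

-- Then re-run the CREATE script produced by SSMS to rebuild the Job.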

Then, for good measure, perform a test execution of the Job via the Start Job at Step… option to verify everything works.
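If you prefer T-SQL, the same test can be kicked off with sp_start_job, which also accepts a @step_name parameter should you want to begin at a specific step; the job name below is illustrative:

-- Start the re-created Agent Job to confirm it now executes cleanly.
EXEC msdb.dbo.sp_start_job @job_name = N'Nightly Package Run';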

I am still stumped as to what exactly went wrong here, but it is good to know that an adapted version of the ancient IT advice of yore can be referred back to…

Perhaps one of the most fiendish aspects of working with SQL Server Integration Services (SSIS) is the inevitable data transformation/conversion issues that get thrown up, even as part of relatively simple Extract, Transform & Load (ETL) packages. It doesn’t help if, having come from a strictly T-SQL focused background, you then have to familiarise yourself with the differently named data types that SSIS has in comparison to SQL Server. Ultimately, whether you are a newbie or a seasoned veteran in creating .dtsx packages, you should never be disheartened if you find yourself having to tackle data conversion issues during package development – put another way, there is always going to be a new system or data file format that comes out of nowhere to test your patience 🙂

I had a rather strange occurrence of this issue recently when working to import Globally Unique Identifier (GUID) data into SQL Server’s equivalent data type – the uniqueidentifier. GUIDs are very much the first choice these days if you are building large-scale applications that require unique values to distinguish database records. Whereas back in the old days you could get away with an integer column using the IDENTITY seed, the potential for current datasets to contain billions or more records makes this option less practical compared with GUIDs – a data type that is almost certainly going to be unique, even if you are generating values at an insane pace, and which has the headroom to accommodate huge datasets.
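For a quick side-by-side of the two approaches (table names here are purely illustrative):

-- An integer surrogate key, seeded automatically on insert...
CREATE TABLE [IdentityExample]
(
	[Id] INT IDENTITY(1,1) NOT NULL
);

-- ...versus a GUID key, generated on insert via NEWID().
CREATE TABLE [GuidExample]
(
	[Id] UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID()
);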

Going back to the strange occurrence I mentioned above – perhaps the best way to explain the issue (and its resolution) is to walk through the steps involved. To do this, access to a SQL Server database instance, interfaced with via SQL Server Management Studio (SSMS), is required. Once this has been obtained, a database needs to be created and the following example script executed against it to create the table used during this post:

CREATE TABLE [GUIDImportTest]
(
	[UID] UNIQUEIDENTIFIER NOT NULL,
	[TestCol1] NVARCHAR(MAX) NULL,
	[TestCol2] NVARCHAR(MAX) NULL
)

We then also have our test import file, saved in .csv format.
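For illustration, its contents might resemble the following, where the GUID reuses the example value from later in this post and the other columns are placeholders:

UID,TestCol1,TestCol2
06e82887-9afc-4064-abad-f6fb60b8a1f3,Test Value 1,Test Value 2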

With both of these ready, we can then get the error to generate using the SQL Server Import and Export Wizard – a handy tool that enables you to straightforwardly move simple datasets between applications and file formats. This tool can be accessed via SSMS by right-clicking on any database and selecting Tasks -> Import Data…

Begin the wizard as indicated above and, when specifying the Data Source settings, select Flat File Source. In the Advanced tab, you should also override the default data type settings for the UID field and set it to unique identifier (DT_GUID):

The Target destination (accessed further along the wizard) should be set to SQL Server Native Client, pointing to the server/database where the table created above resides.

On the Select Source Tables and Views screen, be sure that the correct table is selected in the Destination drop-down. By default, if your import source does not match the destination name, the wizard will assume you want to create a brand new table:

On the Review Data Type Mapping tab, a data conversion warning will be flagged up for the two TestCol fields; these can be safely disregarded, as the import package will successfully convert these values for you without further complaint:

After clicking Next and letting the package run, we can then see the titular error of this post occur, which halts the package execution:

Initially, I thought the error was being generated because the GUID values in the .csv file were not in upper case (when selecting uniqueidentifier data via a SQL query, values are always returned in this format), but the same error is thrown when importing data in this exact format. It turns out the issue was down to something I should have readily realised, based on my experience working with Dynamics CRM/Dynamics 365 for Customer Engagement. When working with URLs and query string parameters involving individual records in the application, GUID values require special URL encoding to convert the curly brace values – { and } respectively – into a “URL friendly” format. So, for example, the following:

{06e82887-9afc-4064-abad-f6fb60b8a1f3}

is converted into:

%7B06e82887-9afc-4064-abad-f6fb60b8a1f3%7D

What does this have to do with SSIS and the task at hand? Well, it turns out that when importing data into a DT_GUID column, SSIS expects each value to be surrounded by curly braces. Our source data, therefore, needs to resemble the following (the same illustrative values as before, now wrapped in braces) to import successfully:
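UID,TestCol1,TestCol2
{06e82887-9afc-4064-abad-f6fb60b8a1f3},Test Value 1,Test Value 2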

After making the appropriate changes to the source data, the package will then execute successfully, loading the data into the desired SQL table.
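As an aside, if editing the source file is impractical, another option is to stage the raw text and let T-SQL handle the conversion, since TRY_CONVERT to uniqueidentifier accepts GUIDs with or without the braces. A minimal sketch, with the staging table name being illustrative:

-- Stage the raw GUID text exactly as it appears in the .csv file.
CREATE TABLE [GUIDImportStaging]
(
	[UID] NVARCHAR(38) NULL,
	[TestCol1] NVARCHAR(MAX) NULL,
	[TestCol2] NVARCHAR(MAX) NULL
);

-- After bulk-loading the .csv into the staging table, convert and insert.
-- TRY_CONVERT returns NULL (rather than an error) for any malformed values.
INSERT INTO [GUIDImportTest] ([UID], [TestCol1], [TestCol2])
SELECT TRY_CONVERT(UNIQUEIDENTIFIER, [UID]),
	[TestCol1],
	[TestCol2]
FROM [GUIDImportStaging]
WHERE TRY_CONVERT(UNIQUEIDENTIFIER, [UID]) IS NOT NULL;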

I guess the lesson here is to never take for granted any knowledge you may have garnered from a particular source – even when dealing with what may, at first glance, be a completely disparate challenge. In all likelihood, this past experience could present a means of thinking differently about a problem and, ultimately, overcoming the challenge you are faced with.