Although CRM Online/Dynamics 365 for Enterprise (D365E) provides a plethora of different tools aimed at satisfying reporting requirements for users of the application, you are restricted in how data can be queried within the application. For example, you cannot simply connect straight to the application's SQL database and start writing stored procedures that perform complex data transformations or joins. Traditionally, to achieve this, you would need to look at one of the several tools in the marketplace that enable you to export your data into a format that best suits you, or even take the plunge and get a developer to write a bespoke application that satisfies your integration requirements.

With the recent D365E release, and in line with Microsoft's longstanding approach to customer data within their applications (i.e. "It's yours! So just do what you want with it!"), the parallel introduction of the Data Export Service last year further consolidates this approach and adds an arguably game-changing tool to the product's arsenal. By using the service, relatively straightforward integration requirements can be satisfied in a pinch, and a lot of the headache involved in setting up a backup of your organisation's databases or a line-of-business reporting application can be eliminated. Perhaps the most surprising and crucial aspect of all of this is that using this tool is not going to break the bank either.

In this week’s blog post, I’m going to take a closer look at just what the Data Export Service is, the setup involved and the overall experience of using the service from end-to-end.

What is the Data Export Service?

The Data Export Service is a new, free* add-on for your CRM/D365E subscription, designed to accomplish basic integration requirements. Microsoft perhaps provides the best summary of what the tool is and what it can achieve via TechNet:

The Data Export Service intelligently synchronizes the entire Dynamics 365 data initially and thereafter synchronizes on a continuous basis as changes occur (delta changes) in the Microsoft Dynamics 365 (online) system. This helps enable several analytics and reporting scenarios on top of Dynamics 365 data with Azure data and analytics services and opens up new possibilities for customers and partners to build custom solutions.

The tool is compatible with versions 8.0, 8.1 and 8.2 of the application, which correspond to the following releases of CRM Online/D365E:

  • Dynamics CRM Online 2016
  • Dynamics CRM Online 2016 Update 1
  • Dynamics 365 December Update

*You will still need to pay for all required services in Azure, but the add-on itself is free to download.

The Installation Process

Getting everything configured for the Data Export Service can prove to be the most challenging – and potentially alienating – part of the entire process. For this, you will need the following at your disposal:

  • An active Azure Subscription.
  • An Azure SQL Server configured with a single database, or an Azure VM running SQL Server. Microsoft recommends a Premium P1 database or better if you are using an Azure SQL database, but I have been able to get the service working without any issue on S0 tier databases. This is an important point to make, given that the cost difference per month can amount to hundreds of pounds.
  • An Azure Key Vault. This is what will securely store the credentials for your DB.
  • PowerShell and access to the Azure Resource Manager (AzureRM) cmdlets. PowerShell is available as a built-in feature on Windows-based platforms, and can now be downloaded for OS X/Linux as well. PowerShell is required to create an Azure Key Vault, although you can also use it to create your Azure SQL Server instance/Windows VM with SQL Server.

It is therefore recommended that you have at least some experience in how to use Azure – such as creating Resource Groups, deploying individual resources, how the interface works etc. – before you start setting up the Data Export Service. Failing this, you will have to kindly ask your nearest Azure whizz for assistance 🙂 Fortunately, if you know what you're doing, you can get all of the above set up very quickly; in some cases, in less than 10 minutes if you opt to script out the entire deployment via PowerShell, as the sketch below illustrates.
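
To give a flavour of what such a script might look like, here is a minimal sketch using the AzureRM cmdlets. All of the resource names, the location and the database tier below are hypothetical placeholders, so adjust them to suit your environment:

Login-AzureRmAccount

#Resource Group to contain all of the Data Export Service resources - the name/location are placeholders
New-AzureRmResourceGroup -Name "DataExportRG" -Location "UK South"

#Azure SQL logical server and database. As noted above, an S0 tier database has worked fine in my testing.
$creds = Get-Credential -Message "Specify the SQL Server admin username/password"
New-AzureRmSqlServer -ResourceGroupName "DataExportRG" -ServerName "mydataexportsql" -Location "UK South" -SqlAdministratorCredentials $creds
New-AzureRmSqlDatabase -ResourceGroupName "DataExportRG" -ServerName "mydataexportsql" -DatabaseName "D365EData" -RequestedServiceObjectiveName "S0"

#Key Vault, which will securely store the database credentials for the Export Profile
New-AzureRmKeyVault -VaultName "DataExportVault" -ResourceGroupName "DataExportRG" -Location "UK South"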

For your D365E setup, all that is required is the installation of the approved solution via the Dynamics 365 Administration Centre. Highlight the instance that you wish to deploy to and click on the pen icon next to Solutions:

Then click on the Solution with the name Data Export Service for Dynamics 365 and click the Install button. The installation process will take a couple of minutes, so keep refreshing the screen until the Status is updated to Installed. Then, within the Settings area of the application, you can access the service via the Data Export icon:

Because the Data Export Service is required to automatically sign in to an external provider, you may also need to verify that your web browser pop-up settings/firewall are configured to allow the https://discovery.crmreplication.azure.net/ URL. Otherwise, you are likely to encounter a blank screen when attempting to access the Data Export Service for the first time. You will know everything is working correctly when you are greeted with a screen similar to the below:

Setting up an Export Profile

After accepting the disclaimer and clicking on the New icon, you will be greeted with a wizard-like form, enabling you to specify the following:

  • Mandatory settings required, such as the Export Profile Name and the URL to your Key Vault credentials.
  • Optional settings, such as which database schema to use, any object prefix that you would like to use, retry settings and whether you want to log when records are deleted.
  • The Entities you wish to use with the Export Service. Note that, although most system entities will be pre-enabled to use this service, you will likely need to go into Customizations and enable any additional entities you wish to utilise with the service via the Change Tracking option:

  • Any Relationships that you want to include as part of the sync. To clarify, this is basically asking if you wish to include any default many-to-many (N:N) intersect tables as part of your export profile. The list of available options for this will depend on which entities you have chosen to sync. For example, if you select the Account, Lead and Product entities, then the following intersect tables will be available for synchronisation:

Once you have configured your profile and saved it, the service will then attempt to start the import process.

The Syncing Experience A.K.A Why Delta Syncing is Awesome

When the service first starts to sync, one thing to point out is that it may initially return a result of Partial Success and show that it has failed for multiple entities. In most cases, this will be because certain entities' dependent records have not yet been synced across (for example, any Opportunity record that references the Account named Test Company ABC Ltd. will not sync until this Account record has been exported successfully). So rather than attempting to interrogate the error logs straight away, I would suggest holding off a while. As you may also expect, the first sync will take some time to complete, depending on the number of records involved. My experience, however, suggests it is reasonably quick – for example, just under 1 million records took around 3 hours to sync. The fact that the service is essentially an Azure-to-Azure export no doubt helps in ensuring a smooth data transit.

Following on from the above, syncs will then take place as and when entity data is modified within the application. The delay before a change appears in your database seems to be very small indeed – often within tens of minutes, if not minutes. This therefore makes the Data Export Service an excellent candidate for a backup/primary reporting database to satisfy any requirements that cannot be achieved via FetchXML alone.

One small bug I have observed is with how the application deals with the listmember intersect entity. You may get errors thrown back indicating that records failed to sync across successfully when, upon closer inspection, this is not the case. Hopefully this is something that will get ironed out; it may be due to the rather strange way that the listmember entity appears to behave when interacting with it via the SDK.

Conclusions or Wot I Think

For a free add-on service, I have been incredibly impressed by the Data Export Service and what it can do. For those who have previously had to fork out big bucks for services such as Scribe Online or KingswaySoft to achieve very basic replication/reporting requirements within CRM/D365E, the Data Export Service offers an inexpensive way of replacing them. That's not to say that the service should be your first destination if your integration requirements are complex – for example, integrating Dynamics 365 with SAP/Oracle ERP systems. In such cases, the names mentioned above will no doubt remain the best services for achieving your requirements in a straightforward way. I also have a few concerns that the setup involved with the Data Export Service could be a barrier towards its adoption. As mentioned above, experience with Azure is a mandatory requirement to even begin contemplating getting set up with the tool, and your organisation may also need to reconcile itself with utilising Azure SQL databases or SQL Server instances on Azure VMs. Hopefully, as time goes on, we may start to see the setup process simplified – for example, seeing the Export Profile Wizard actually go off and create all the required resources in Azure by simply entering your Azure login credentials.

The D365E release has brought a lot of great new functionality and features to the table, much of it oft requested and of real benefit to organisations who already use, or plan to use, the application in the future. The Data Export Service is perhaps one of the great underdog features of this release, and one that you should definitely consider using if you want a relatively smooth-sailing data export experience.

Those who have experience working with an RDBMS like SQL Server will become accustomed to a certain way of going about things. These can often involve a mixture of "lazy" query writing (e.g. using SELECT *… as opposed to SELECT Column1, Column2…), the manner in which you write your queries (ALL CAPS or lower case) and best practice approaches. One arguable example of a best practice approach is the use of Stored Procedures, whose benefits are most readily demonstrated with an illustration. Take a look at the T-SQL query below, which should execute fine against the AdventureWorks2012 sample database:

SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]
FROM [HumanResources].[Employee] AS E
 INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH
  ON E.[BusinessEntityID] = DH.[BusinessEntityID]
 INNER JOIN [HumanResources].[Department] AS D
  ON DH.[DepartmentID] = D.[DepartmentID]
 INNER JOIN [Person].[Person] AS P
  ON E.[BusinessEntityID] = P.[BusinessEntityID]
WHERE DH.[EndDate] IS NULL
AND E.[JobTitle] = 'Production Technician - WC50'

The query returns the data we need, but not in the most secure or maintainable manner. Consider the following:

  • Executing a query like the above, in-code, as part of an end-user application could expose your database to the risk of an SQL Injection attack or similar.
  • The query exposes a lot of information regarding our underlying database structure – information which any client executing the query neither cares about nor should have to worry about.
  • The example is a very precise query, with a specific function – i.e. get me all the current employees who have the Job Title of Production Technician – WC50. If we wanted to modify it to instead obtain all Senior Tool Designers, we would have to write a completely separate query to accommodate this.

Implementing a Stored Procedure to encapsulate our query logic immediately addresses the above concerns, providing us with a single-line query into the database, giving us just the data we need and enabling us to utilise the query for other scenarios as well. Setting one up is very straightforward via the CREATE PROCEDURE command – the rest is pretty much what we have put together already:

CREATE PROCEDURE dbo.uspGetEmployeesByJobTitle 
	@JobTitle NVARCHAR(50)
AS
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

	SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]
	FROM [HumanResources].[Employee] AS E
	 INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH
	  ON E.[BusinessEntityID] = DH.[BusinessEntityID]
	 INNER JOIN [HumanResources].[Department] AS D
	  ON DH.[DepartmentID] = D.[DepartmentID]
	 INNER JOIN [Person].[Person] AS P
	  ON E.[BusinessEntityID] = P.[BusinessEntityID]
	WHERE DH.[EndDate] IS NULL
	AND E.[JobTitle] = @JobTitle
END
GO

By utilising a parameter for our WHERE clause filter on the Job Title, we can pass any valid value to our stored procedure, immediately making our initial query more versatile across our reporting/business application. And, as a bonus, we can now safely take a ten-line query down to one:

EXECUTE dbo.uspGetEmployeesByJobTitle @JobTitle = 'Senior Tool Designer'

So we have established that Stored Procedures are wicked cool awesome – but what does this have to do with PowerBI?!? 

Having worked with SQL Server Reporting Services (SSRS) extensively in the past, I have become accustomed to using Stored Procedures as a mechanism for storing underlying query logic within the database and having a straightforward means of referencing this from my .rdl file. I can only assume that this is the "norm" and the preferred method of querying SQL data, as opposed to a direct SELECT statement.

When recently doing some work within PowerBI involving Azure SQL Databases, I was therefore surprised that there was no option to return data via a stored procedure by default. Instead, PowerBI would prefer me to directly query the underlying table/view objects:

Thankfully, when inspecting the underlying PowerQuery used to return an example table from the above, it doesn’t use any kind of SELECT query to get the data:

let
    Source = Sql.Databases("mydatabaseinstance"),
    AdventureWorks2012 = Source{[Name="AdventureWorks2012"]}[Data],
    Production_ProductModel = AdventureWorks2012{[Schema="Production",Item="ProductModel"]}[Data]
in
    Production_ProductModel

Unfortunately, the same cannot be said if you select the Advanced options area and input your own SQL query directly:

let
    Source = Sql.Database("mydatabaseinstance", "AdventureWorks2012", [Query="SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]#(lf)FROM [HumanResources].[Employee] AS E#(lf) INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH#(lf)  ON E.[BusinessEntityID] = DH.[BusinessEntityID]#(lf) INNER JOIN [HumanResources].[Department] AS D#(lf)  ON DH.[DepartmentID] = D.[DepartmentID]#(lf) INNER JOIN [Person].[Person] AS P#(lf)  ON E.[BusinessEntityID] = P.[BusinessEntityID]#(lf)WHERE DH.[EndDate] IS NULL#(lf)AND E.[JobTitle] = 'Senior Tool Designer'"])
in
    Source

I do NOT recommend you use the above method to query your SQL Server data!

I have spoken previously on the blog with respect to conventions around working with datasets, i.e. only get what you need, and nothing else. As I work more and more with PowerBI, the tool very much seems to be geared towards flipping this mentality on its head; PowerBI has a number of built-in tools that seem to scream out "Just get ALL your data in here, we'll worry about the rest!". I realise that the difference between MB and GB these days, from a storage/cost point of view, is minimal; nevertheless, I still believe it is prudent not to put all your eggs in one basket and to ensure that your business data is not being stored in cavalier fashion across a multitude of different cloud services.

With this in mind, it is good to know that you can utilise stored procedures in PowerBI. You basically have two ways in which this can be achieved:

  • Going back to the Advanced options screen above on the SQL Server database wizard, you can EXECUTE your stored procedure directly using the following SQL Statement:
DECLARE @SP VARCHAR(100) = 'dbo.uspGetEmployeesByJobTitle @JobTitle = ''Senior Tool Designer'''
EXEC (@SP)

Be sure to specify your database, and don't forget to double up the single quotes around the parameter value!

  • If you prefer to use PowerQuery as opposed to the wizard above, then the following code will also work:
let
    Source = Sql.Database("mydatabaseinstance", "AdventureWorks2012", [Query="DECLARE @SP VARCHAR(100) = 'dbo.uspGetEmployeesByJobTitle @JobTitle = ''Senior Tool Designer'''#(lf)EXEC (@SP)"])
in
    Source

In both cases, you will be required to authenticate with the database and your result set should return as follows if using the AdventureWorks2012 example database/code:

Finally, as a best-practice security step, you should ensure that the account connecting to your SQL Server instance is restricted to only EXECUTE the procedures you have specified. This can be achieved via the following T-SQL snippet, executed against your database instance:

GRANT EXECUTE ON OBJECT::dbo.uspMyStoredProcedure  
    TO MySQLServerLogin 

Conclusions or Wot I Think

PowerBI is increasingly becoming a more relevant tool for professionals experienced in traditional Business Intelligence/Reporting Services. The bellwether for this can surely be seen in the current Technical Preview for SQL Server Reporting Services, which includes PowerBI reports built in to the application. Although we have no timescales at this stage as to when the next major version of SQL Server will be released, it is reasonable to assume it will arrive by the end of this year at the earliest, bringing PowerBI reports with it as a new feature. I am really excited about the introduction of PowerBI into SSRS, as it would appear to be a match made in heaven – giving those with experience in both products the flexibility to develop a unified, best-of-breed solution, using traditional .rdl reporting capability and/or PowerQuery/DAX functionality.

With the above on the horizon, the ability to integrate seamlessly with SQL Server and support for traditional, well-proven practices become crucial indicators of whether this match will be over before the honeymoon. And so, I would hope to see the option to access SQL Server data via Stored Procedures become standard when using the built-in data connector within PowerBI. Based on the feedback I have seen online, I'd wager that this feature would be most welcome, and an excellent way of reducing the need for direct coding to achieve a common requirement within PowerBI.

Getting to grips with how to use Dynamics CRM/365 for Enterprise (D365E) is no easy feat. You can imagine just how difficult it is for an end user to get to grips with how the application works and functions, with more detailed knowledge around customisation and development being an entirely different ball game altogether. Compounding this problem further is the fact that the product has evolved at an increasingly rapid pace over recent years, to the point where it is nigh on impossible to become a master of everything that you can do within CRM/D365E. Those venturing into the product for the first time may find their learning journey significantly simplified if they already have good general knowledge of some of the underlying technology that powers CRM/D365E. This was certainly true in my case: I had a good background in managing Office 365, writing SQL queries/reports and some experience with C#. This is all incredibly useful knowledge to have in your arsenal, and all of it is directly applicable to CRM/D365E in some way.

For those who are getting to grips with the product for the first time, either without this previous experience or as part of an apprentice/graduate type role, your journey may not be as swift and issue-free. With this in mind, here's my list of essential knowledge that you can add to your own "Swiss Army knife" of personal knowledge. Experience and good knowledge of these technologies will not only help you greatly in working with CRM/D365E, but also present an excellent learning opportunity for Microsoft technologies more generally and something that you can add to your CV with pride:

SQL Server

What is it? : SQL Server is Microsoft's proprietary database technology, based on the ANSI SQL standard. SQL stands for Structured Query Language and is one of the most widely used database programming languages on the planet.

Why Knowing It Is Useful: The underlying database technology that CRM/D365E uses is SQL Server, so having a general awareness of how relational database systems work and how SQL Server differs from the standard goes a long way in understanding what is possible from a customisation/development viewpoint. For example, you can very quickly grasp which data types the application supports, as they will all ultimately be based on a supported SQL Server column data type. If you are running on-premise versions of CRM/D365E, then knowledge of SQL Server immediately moves from being a nice bonus to an essential requirement. This is because administrators will need good knowledge of how to manage their Dynamics CRM database instance, perform backups and also, potentially, write Transact-SQL (T-SQL) queries against the database for reporting or diagnostic work.

Recommended Area of Study: Focusing your attention on SQL Server Reporting Services (SSRS) report writing will benefit you the most. Through this, you can begin to establish good knowledge of how SQL Server databases work generally, and be in a position to write FetchXML reports for Online/On-Premise deployments of the application or Transact-SQL (T-SQL) based reports for On-Premise only. Having a good awareness of what is possible via a standard SQL query will also stand you in good stead when working with FetchXML, as you can immediately make a number of assumptions about what is possible with FetchXML (for example, filtering results using an IN block containing multiple values and performing grouping/aggregate actions on datasets).
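
To make this concrete, here is a brief, hedged sketch of running a FetchXML query that uses the IN operator, via the community Microsoft.Xrm.Data.PowerShell module (the module and cmdlets are as I understand them; the entity/attribute values are illustrative):

#Requires the community module: Install-Module Microsoft.Xrm.Data.PowerShell
#Connect interactively to a CRM/D365E organisation
$conn = Connect-CrmOnlineDiscovery -InteractiveMode

#FetchXML equivalent of SELECT name FROM account WHERE address1_city IN ('London', 'Manchester')
$fetchXml = @"
<fetch>
  <entity name="account">
    <attribute name="name" />
    <filter>
      <condition attribute="address1_city" operator="in">
        <value>London</value>
        <value>Manchester</value>
      </condition>
    </filter>
  </entity>
</fetch>
"@

#Execute the query and list the returned records
(Get-CrmRecordsByFetch -conn $conn -Fetch $fetchXml).CrmRecords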

Office 365

What is it? : Office 365 is Microsoft's primary – and perhaps most popular – cloud offering for businesses, individuals and home users. Through a wide array of different subscription offers, home and business users can "pick 'n' mix" the range of solutions they require – from Exchange-based email accounts to licenses for Microsoft Visio/Project, through to PowerBI.

Why Knowing It Is Useful: Although it is arguable that knowledge of Office 365 is not essential if you anticipate working with on-premise versions of the application, you may be doing yourself a disservice in the long term. Microsoft is increasingly incentivising organisations to move towards the equivalent cloud versions of their on-premise applications, meaning that as much knowledge as possible of how CRM/D365E Online works in the context of Office 365 is going to become increasingly essential. If you are looking to secure a career change in the near future and have not had much experience with Office 365, then this is definitely an area that you should focus on for future learning. From a management point of view for the Online version of the product, basic awareness of how to navigate around and use Office 365 is pretty much essential if you are going to succeed working with the product on a day-to-day basis.

Recommended Area of Study: Spin up a D365E trial and you can very quickly start getting to grips with how the product sits within the Office 365 "ecosystem". Practise licensing users, configuring security group level access to your D365E trial tenant and modifying the details on Office 365 user accounts to see how these details are synced through into D365E; a sketch of the licensing piece follows below. The Microsoft Virtual Academy also has a number of general courses related to Office 365; however, due to the frequent updates, these may not always be in line with the current version. The official curriculum/certification paths for Office 365 may suffer from the same issue, but are worthwhile in demonstrating your experience and ability to integrate D365E with the various related Office 365 services.
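
For instance, licensing a user from PowerShell via the MSOnline module might look like the following sketch – the UPN and licence SKU shown are purely illustrative, so substitute your own tenant's values (Get-MsolAccountSku will list what is available):

#Connect to the Office 365 tenant
Connect-MsolService

#A usage location must be set on the account before a licence can be assigned
Set-MsolUser -UserPrincipalName "jane.doe@mytenant.onmicrosoft.com" -UsageLocation "GB"

#List the licence SKUs available on the tenant, then assign the relevant D365E plan to the user
Get-MsolAccountSku
Set-MsolUserLicense -UserPrincipalName "jane.doe@mytenant.onmicrosoft.com" -AddLicenses "mytenant:DYN365_ENTERPRISE_PLAN1"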

Active Directory

What is it? : For rookie, intermediate and experienced IT admins alike, Active Directory needs no introduction. It is essentially Microsoft's implementation of the Lightweight Directory Access Protocol (LDAP), having first been introduced in Windows Server 2000, and provides a means of managing users, security and access rights for domain objects. There are now two distinct versions of Active Directory available – the more traditional, Windows Server based, on-premise Active Directory, and Azure Active Directory, which is utilised primarily by Office 365.

Why Knowing It Is Useful: User account records for both On-Premise and Online versions of CRM/D365E use Active Directory objects, with a number of important attributes synchronised between an Active Directory user and the equivalent User entity record. For example, as indicated in this MSDN article, the only way in which you can synchronise a user's Job Title through to CRM/D365E is by updating the equivalent field in Azure Active Directory. Active Directory objects are also the only means of authenticating with the application, whether via the web interface or otherwise, with no option to create a database user or any other kind of authenticated user type.

Recommended Area of Study: It’s free to set up your own Azure Active Directory, so this is an excellent starting point for getting to grips with the technology. There’s also nothing preventing you from downloading a trial of Windows Server and installing the Active Directory server role on this machine. Once configured, you can then start to create users, update attributes, configure permissions and set up roles that contain collections of privileges. If you already have an Office 365 tenant with CRM/D365E Online, then you can use the Office 365 portal to manage your user accounts and test the synchronisation of attribute values from Active Directory through to the application, as the sketch below demonstrates.
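
As a quick example of the Job Title synchronisation discussed above, the attribute can be updated on the Azure Active Directory user from PowerShell and the change then verified within D365E. A minimal sketch, with an illustrative UPN:

#Connect to Azure Active Directory via the MSOnline module
Connect-MsolService

#Update the Job Title attribute on the Azure AD user account...
Set-MsolUser -UserPrincipalName "jane.doe@mytenant.onmicrosoft.com" -Title "Sales Manager"

#...and confirm the new value has been applied. The change should then sync through to the equivalent D365E User record.
(Get-MsolUser -UserPrincipalName "jane.doe@mytenant.onmicrosoft.com").Title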

PowerShell

What is it? : A good way to remember what PowerShell is: it is essentially a blue command prompt window 🙂 . Traditionally only relevant for those working extensively with Windows Server or Exchange, PowerShell is now increasingly important for administrating on-premise CRM/D365E, Office 365 and Azure, to name a few. Indeed, one of the major shock announcements this year was that PowerShell has become open source and can be installed on Linux, reflecting the increasing demand for, and importance of, Linux-based resources within the Microsoft cloud.

Why Knowing It Is Useful: Similar to SQL Server, PowerShell is something that is instantly more applicable for on-premise CRM/D365E deployments. For example, the only way to modify the default number of Dashboard items is via the Get-CrmSetting/Set-CrmSetting cmdlets, executed against your on-premise organisation. I would also, again, argue that a general awareness of PowerShell can help greatly when performing administration work against an Office 365 tenant that contains a CRM/D365E organisation, such as user provisioning or license assignment. If you are utilising the Azure Service Bus to integrate CRM/D365E with Azure-based applications, then PowerShell immediately becomes a desirable skill to have in your arsenal, allowing you to remotely administer, deploy or update Azure resources programmatically.
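
Based on my understanding of the on-premise deployment cmdlets (treat this as a hedged sketch rather than gospel), retrieving and updating the dashboard limit looks something like the below, run on a machine where the Microsoft.Crm.PowerShell snap-in is registered:

#Register the CRM deployment snap-in and retrieve the current dashboard settings
Add-PSSnapin Microsoft.Crm.PowerShell
$settings = Get-CrmSetting -SettingType "DashboardSettings"

#Increase the maximum number of controls allowed on a dashboard (the default is 6), then apply the change
$settings.MaximumControlsLimit = 8
Set-CrmSetting -Setting $settings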

Recommended Area of Study: The fact that PowerShell is now open source means there is a plethora of online tools and guides to refer to, and you can be assured of getting it working on your platform of choice. The GitHub page for PowerShell is a great place to get started. Beyond that, you have a few options for practising further. If you have spun up a D365E trial, then you can hook PowerShell up to Office 365 to see what you can do from a remote management perspective (such as granting Send On Behalf permissions for a shared mailbox – see the sketch below). Alternatively, you can run it from your local Windows machine, connect it up to a Windows Server instance, or attempt to create new services in Azure and experiment that way.
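
As a concrete starting exercise, the Send On Behalf example just mentioned can be achieved through a remote session into Exchange Online. A sketch, with illustrative mailbox/user names:

#Open a remote PowerShell session to Exchange Online
$creds = Get-Credential
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://outlook.office365.com/powershell-liveid/" -Credential $creds -Authentication Basic -AllowRedirection
Import-PSSession $session

#Grant a user Send On Behalf rights over a shared mailbox, then tidy up the session
Set-Mailbox -Identity "sharedmailbox@mytenant.onmicrosoft.com" -GrantSendOnBehalfTo "jane.doe@mytenant.onmicrosoft.com"
Remove-PSSession $session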

For those businesses or individuals who are currently considering Dynamics CRM, one of the decisions that you will ultimately need to make is whether you intend to use CRM Online or CRM On-Premise. For those whose first reaction to the previous statement is "Say what?", here's a brief breakdown of the two different options:

  • CRM Online: An instance of CRM that is accessible via Office 365.
  • CRM On-Premise: An installation of CRM on your own server(s).

The word on the street these days is all about cloud computing and services, and that all organisations should have most, if not all, of their infrastructure within a hybrid public/private cloud configuration. However, it could be that you are required to host your CRM within a specific location due to regulatory or contractual requirements for your business, or that you already have existing costs allocated towards server infrastructure that must be used as part of your CRM project. The list of potential reasons is endless, which is why CRM On-Premise exists in the first place and is still an essential requirement for many organisations.

So here’s a breakdown of some of the factors to consider, and my recommendation on the best approach to go for:

If you are already using Exchange Online with other Office 365 services, then CRM Online is the way to go

One of the potential headaches when it comes to configuring CRM for first use is e-mail synchronisation, something which I have hinted at in a previous blog post. If you already use Exchange Online, then the setup steps involved are greatly reduced, as CRM will automatically detect your Exchange Online profile and settings if it is on the same tenant as your CRM Online instance. If not, then you're going to have to look at other options, such as the E-mail Router, in order to link up your On-Premise Exchange Server, SMTP or POP3 e-mail system. These can be fiddly to set up and maintain, as you will require a dedicated machine that hosts the E-mail Router software, and you may potentially have to liaise with third party e-mail system providers in order to troubleshoot any issues.

Sometimes it’s nice having the latest new product without paying extra for it

For On-Premise CRM, you would need to factor any future upgrade plans into your initial cost investment in the system. Given the increased frequency of CRM releases, this could start adding up to big bucks after the first year or two. As an On-Premise customer, you will also miss out on any major updates in between versions, such as Update 1 for CRM Online last year. This was quite a fundamental and significant update, in my opinion, that helped to make CRM even easier to use. With CRM Online, you are always guaranteed to get the latest updates and thereby take advantage of some of the latest and best features available within CRM. The trade-off is that you must eventually upgrade to the latest version of CRM. You'll be offered a date and time for the upgrade and can delay it, but you can't stay in the past forever! This could present issues if, for example, you have written bespoke customisations that are no longer supported or have been deprecated. Be sure you have read and fully understand how updates work in CRM Online before making your decision.

Evaluate your internal resources first

It could be, for example, that your organisation is moving from an internal application system that uses SQL Server as the backend database, and that you have several team members in whose T-SQL administration/development training you have invested heavily. The great thing about On-Premise CRM is that this skillset will not be lost, as you will need to maintain and manage your organisation(s) databases. And, if you're really nice, you can also let them write beautifully bespoke SSRS reports directly against your databases and do all sorts of other fun data integration pieces using SQL Server.

The flip side of this should be obvious: if you and your organisation don't know your SELECTs from your WHEREs, then CRM Online could be the best choice, as you don't have to worry about managing and maintaining a SQL database – Microsoft handles all of this for you. You can instead focus your and your team's attention on learning potentially more relevant things relating to CRM Online directly.

Do you trust Microsoft?

It's a serious question nowadays. In the world of cloud computing, can you say with 100% certainty that the organisation(s) hosting or storing some of your business' key data and applications will a) ensure your data is kept securely and b) be able to offer satisfactory guarantees in relation to service availability? Whilst (touching wood) Office 365/CRM Online outages have been few and far between, the risk is still ever-present. You will therefore need to evaluate the maximum amount of outage time that is acceptable for your business and put in place procedures to ensure that your business can keep working (for example, nightly backups of your CRM data so that you can still access it via a spreadsheet/database export). The benefit of having an On-Premise CRM system is that you will more than likely have control over your server machines, as well as all the data that is stored on them, and can ensure your infrastructure is built to satisfy any concerns around outages or system failures.

Nice words for your Finance Team: Operational Expenditure is better than Capital Expenditure! 

Deciding to go with CRM Online could significantly simplify your organisation's visibility over its month-by-month costs. If done correctly, you could even make the bold claim that you have successfully eliminated all capital expenditure (i.e. upfront software costs) relating to CRM systems within your business. Based on experience, most finance departments are happier knowing they have to pay X amount over a 12-month period, as opposed to being hit by large and unwieldy costs in a sporadic and uncertain way. So if you want to make BFFs within your finance/accounts team, then CRM Online is the fastest and best way to achieve this.

Legacy Systems or that annoying finance system that’s 15 years old, but runs 20% of our business work and cannot be replaced

Without the traditional database access that On-Premise provides, it may prove difficult to integrate your CRM system with any legacy system, particularly if it's a non-SQL database. That's not to say that it's impossible to find a solution using CRM Online, but you may have to expend significantly more resources setting up staging environments between your CRM Online instance and the legacy system, or look at writing customised code that performs the tasks needed to "glue" both systems together.

Make sure your technical team understand the limitations of CRM Online clearly, and that their feedback is factored in as part of the decision making process

I’ve already mentioned the most obvious limitation of CRM Online in the form of not having access to the CRM SQL database, but there are other limitations too. For example, you are unable to directly query all of the information contained within the audit data entity. There is a very good (although outdated) article on TechNet which gives a flavour of some of the limitations within CRM Online. It is very important that, as part of any scoping exercise, your technical team is fully aware of the limitations of CRM Online, so that any potential difficulties around integration or data access can be mitigated from the outset.

Conclusions – or Wot I Think

If you are a small or medium-sized business already using Office 365, or planning to move across in the near future, then CRM Online is the obvious choice to ensure the most streamlined user experience and ease of management and setup. If, however, you are a much larger organisation, or are required to operate under specific compliance or regulatory requirements in respect of your business applications/data, then these are the types of scenarios where On-Premise CRM is pretty much an absolute requirement.

The Scenario: You are running CRM Online in conjunction with some legacy database/application systems. These systems are set up with a SQL Server Reporting Services instance pointing to a SQL Server, OLE DB or similar database.

The Problem: You need to make data from your legacy systems visible within your CRM. The information needs to be displayed on the Entity Form and show specific information from the legacy database that relates to the CRM record.

Admittedly, the above is perhaps a somewhat unlikely situation to find yourself in, but one which I recently had to try and address. I suppose the most straightforward resolution is to just say "Get rid of the legacy system!". Unfortunately, the suggestion didn't go down too well when I voiced it myself…

So at this point, the next best answer looked to be to try and utilise what we already had within the existing infrastructure: an all-singing, all-dancing SSRS and SQL Server database instance.

What if we were to try uploading an .rdl file that includes a non-FetchXML (i.e. our SQL/OLE DB database) data source into CRM? Whenever you try to perform this, you will get this error message:

ReportUploadError_NoFetchXML

Rats! So there is no way in which we can bring a non-FetchXML Data Source from our separate SSRS instance into CRM. Is there anything else within CRM that can be utilised to help in this situation? Let's first take a quick look at the following nifty little feature within CRM, courtesy of our good friend MSDN:

You can use an IFRAME to display the contents from another website in a form, for example, in an ASP.NET page. Displaying an entity form within an IFrame embedded in another entity form is not supported.

Use the getValue method on the attributes that contain the data that you want to pass to the other website, and compose a string of the query string arguments the other page will be able to use. Then use a Field OnChange event, IFRAME OnReadyStateComplete event, or Tab TabStateChange event and the setSrc method to append your parameters to the src property of the IFRAME or web resource.

You may want to change the target of the IFRAME based on such considerations as the data in the form or whether the user is working offline. You can set the target of the IFRAME dynamically.

Source: https://msdn.microsoft.com/en-gb/library/gg328034.aspx

Having worked extensively with SSRS in the past, I am also aware that you can use an SSRS URL string to specify properties about how the report is rendered, its size and – most crucially – what the value of required parameters should be. The friend that keeps on giving has a great article that goes through everything you can do with an SSRS report URL, including how to use Parameters as part of your URL. In theory, therefore, we can place an IFRAME on our form and then use JScript to access form-level field values and modify the IFRAME URL accordingly.

Here are the steps involved:

  1. Go into Form Editor and add a new IFRAME to the form, specifying the following settings:

Name: The logical name of the control. This will be required as part of the JScript code used later, so make a note of it.

URL: As this is a mandatory field, you can specify any value here, as it will change when the form is loaded by the user. This is not practical, as we don't want the IFRAME to be displayed if, for example, the field that we are passing to the URL has no value in it. Our JScript code will sort this out in a few moments.

Label: This can be anything, and defaults to whatever is entered into the Name field.

Restrict cross-frame scripting, where supported: Untick this option.

Ensure that ‘Visible by default’ is ticked.

Your settings should look something like this:

IFRAMESettings

  2. Create or modify an existing JScript Library for the form, adding in the following function (after modifying the values accordingly):
function onLoad_LoadSSRSReport() {

    //First, get the page type (Create, Update etc.). A full list of form types can be found here:
    //https://msdn.microsoft.com/en-us/library/gg327828.aspx#BKMK_getFormType

    var pageType = Xrm.Page.ui.getFormType();

    //Only proceed if the Form Type DOES NOT equal Create; this can be changed depending on requirements.

    if (pageType != 1) {

        //Get the value that you want to parameterise. In this case, we are on the Account entity and need the Account Name.

        var accountName = Xrm.Page.getAttribute("name").getValue();

        //Get the name of the IFRAME we want to update.

        var iFrame = Xrm.Page.ui.controls.get("IFRAME_myssrsreport");

        //If there is no value in the Account Name field, hide the IFRAME and stop. Note that this check must
        //happen before we attempt to call replace() on the value, which would otherwise throw an error.

        if (accountName == null) {
            iFrame.setVisible(false);
            return;
        }

        //In order for the URL to "accept" the parameter, spaces need to be replaced with "+" characters.

        accountName = accountName.replace(/ /g, "+");

        //Specify the Report Server URL and Report Name...

        var reportURL = "https://myssrsserver/ReportServer?/My+Reports/My+Parameterised+Report&MyParameter=";

        //...then combine the report URL and parameter together into a full URL string and update the IFRAME accordingly.

        var parameterizedReportURL = reportURL + accountName;

        iFrame.setSrc(parameterizedReportURL);
    }
}
  3. Add the function to the OnLoad event handler on the form. Now, when the form loads, it will update the IFRAME with the new URL containing our required parameter.

And there we go, we now have our separate SSRS instance report working within CRM! A few things to point out though:

  • If the report parameter supplied does not load any matching records, then SSRS will display a standard message to this effect. You would need to modify the report settings in order to display a custom message here, if desired.
  • It is recommended that you have an https:// binding set up on your report instance and supply this as part of the setSrc method. An http:// binding works, but you may need to change settings in your web browser in order to support mixed-mode content. Full instructions on how to set this up can be found here.
  • This may be stating the obvious, but if your SSRS instance is not internet-facing, then you will get an error message in your IFRAME when you are not working from the same network as your SSRS instance. Fortunately, SSRS can be configured for an Internet deployment.
  • The steps outlined in step 1 can also be used to specify a non-parameterised SSRS report within an IFRAME dashboard too. I would recommend using the following SSRS system parameters as part of the URL though:
    • rs:ClearSession=true
    • rc:Toolbar=false

e.g.

https://myssrsserver/ReportServer/Pages/ReportViewer.aspx?%2fMy+Reports%2fMy+Non+Parameterised+Report&rs:ClearSession=true&rc:Toolbar=false

One of the most challenging things about any system migration is ensuring that information from other business systems can be made available, and it is good to know that CRM has supported approaches that can help to bridge the gap.