More recently I have been writing a bit more about data migration techniques and best practices. I have to say, it is an area where we all make mistakes. If you don't map and migrate the data properly, whatever solution you deliver will not work, especially when the client's current work depends on existing data. If the CRM solution does not work because of bad data, you will get the blame no matter how good the rest of your solution is.
As I mentioned in my previous articles, the Data Import Wizard of Dynamics 365 is good for small sets of data. But what if you have terabytes of data? What should you do? For enterprise-scale data migration tasks like this, there are many tools available, but my preference is to use SSIS. The following are two commonly used solutions:
2. CozyRoc – http://www.cozyroc.com/products
My personal preference is CozyRoc. I have used it many times; it is reliable and easy to use, and the product support is excellent. In this article I will describe a solution that can be used for a data migration scenario. Here is what you should do:
1. First, identify the data in the data source and design flat raw-data entities for it in CRM.
2. Create temporary entities to store this flat raw data. Make sure you add a status field, because you will need to trigger workflows based on the status.
3. Create SSIS packages to read data from the source and load it into the CRM raw-data entities. Make sure you set the record status so that it triggers the workflow. For instance, when a record is uploaded to CRM, set its status to New. A workflow registered on the raw-data entity should then pick up the record and update the actual entity.
4. As mentioned above, create the workflows that trigger on the status. The workflow should set the raw-data record to the In Progress state while it is being processed, and to Complete once processing has finished, so that the next time the workflow executes it will not pick up completed records. By that point the actual record has been properly updated.
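The status-driven flow in the steps above can be sketched as a small state machine. This is a minimal simulation of the pattern, not real Dynamics 365 code: the record shape and the status values (New, In Progress, Complete) are illustrative assumptions taken from the steps, and `upsert_actual` stands in for whatever updates the actual entity.

```python
# Minimal sketch of the staging-entity status flow described above.
# Record shapes and status values are illustrative assumptions.

NEW, IN_PROGRESS, COMPLETE = "New", "In Progress", "Complete"

def process_raw_records(raw_records, upsert_actual):
    """Simulates the workflow: pick up New records, mark them In Progress,
    update the actual entity, then mark them Complete."""
    processed = 0
    for record in raw_records:
        if record["status"] != NEW:        # skip In Progress / Complete records
            continue
        record["status"] = IN_PROGRESS     # claim the record while processing
        upsert_actual(record["data"])      # update the actual CRM entity
        record["status"] = COMPLETE        # never picked up on the next run
        processed += 1
    return processed

# Example: two staged records, one already migrated on a previous run.
staging = [
    {"data": {"name": "Contoso"}, "status": NEW},
    {"data": {"name": "Fabrikam"}, "status": COMPLETE},
]
migrated = []
count = process_raw_records(staging, migrated.append)
print(count)                 # 1 -- only the New record was processed
print(staging[0]["status"])  # Complete
```

The key design point is that the status field makes the workflow idempotent: re-running it over the staging entity only touches records still in the New state.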
One important thing to remember here is that, for this solution to execute efficiently, you must allocate adequate resources. How much depends on the magnitude of the data you are migrating; for instance, you could use Azure resources, which scale easily.
Also make sure that any other workflows or plugins that execute on the actual entities are deactivated until the whole migration is complete. If any of them remain active, they will fire during the load and might end up making incomplete data updates, which is the last thing you want.
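Deactivating workflows before the load can be scripted against the Dynamics 365 Web API by setting each workflow back to the Draft state. The sketch below is hedged on purpose: it only builds the PATCH requests rather than sending them, the org URL and workflow ID are hypothetical, and the Draft state values (statecode 0, statuscode 1) are my assumption about the workflow entity's standard state codes, which you should verify against your environment before use.

```python
# Sketch: build (but do not send) Web API PATCH requests that set workflows
# to Draft (deactivated) before migration. The org URL, workflow ID, API
# version, and state code values below are assumptions to verify.
import json

def build_deactivate_request(org_url, workflow_id):
    """Return a dict describing the PATCH request to deactivate one workflow."""
    return {
        "method": "PATCH",
        "url": f"{org_url}/api/data/v9.2/workflows({workflow_id})",
        "headers": {"Content-Type": "application/json"},
        # Assumed Draft state for the workflow entity: statecode 0 / statuscode 1.
        "body": json.dumps({"statecode": 0, "statuscode": 1}),
    }

# Hypothetical org and workflow ID, for illustration only.
req = build_deactivate_request(
    "https://contoso.crm.dynamics.com",
    "11111111-1111-1111-1111-111111111111",
)
print(req["method"], req["url"])
```

Keeping the request construction separate from sending makes it easy to dry-run the script against a list of workflow IDs and review exactly what will be changed before reactivating everything after the migration.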