A Data Snapshot Framework for Dynamics CRM

Although Dynamics CRM offers several different ways to report on data stored in the system, there is no out-of-the-box mechanism for reporting on how data changes over time. That is to say, while you can easily report on how many active accounts are owned by a particular sales rep today, you can't report on how the number of active accounts compares to last week, last month or even just yesterday. Salesforce.com supports this type of historical summary reporting with a feature called "analytic snapshots," but Dynamics CRM has typically required the use of an external data warehouse that is populated with summary data on a periodic basis. In this post, I will show how to set up a basic data snapshot framework inside Dynamics CRM that behaves like the analytic snapshot functionality of Salesforce.

Please note that this is not intended to replace a true enterprise data warehouse (EDW), but rather my solution is designed to complement existing CRM functionality by offering basic trend and summary data that can be used in reports and dashboard components. Using a separate data warehouse gives report writers a lot of power and flexibility, and I highly recommend using one to maximize your analytical capabilities.

Analytic Snapshots Explained

According to the Salesforce help documentation:

An analytic snapshot lets you report on historical data. Authorized users can save tabular or summary report results to fields on a custom object, then map those fields to corresponding fields on a target object. They can then schedule when to run the report to load the custom object's fields with the report's data. Analytic snapshots enable you to work with report data similarly to how you work with other records in Salesforce.

Translated to Dynamics CRM-speak: an administrator creates a new custom entity to hold the summary data, maps the fields from an existing report to the custom entity and then schedules the snapshot to run daily, weekly or monthly. Once the summary data is captured in the custom entity's records, the data is available for reporting.

My solution achieves the same thing using a FetchXML query, a custom snapshot target entity, a custom snapshot definition entity, a workflow and a custom workflow activity. The custom snapshot definition entity, workflow and custom workflow activity constitute the snapshot processing framework, and then you will need to write a FetchXML query, create a snapshot definition record and define a custom target entity for each snapshot you want to set up. Let's take a closer look at how these pieces fit together.

Getting Started

You can download my solution from the MSDN code gallery here: Taking Scheduled Data Snapshots in Dynamics CRM. Make sure to register the custom workflow assembly using the CRM SDK plug-in registration tool, and then import and publish the "DataSnapshots" CRM solution. The solution also contains a custom snapshot definition and target entity for this example.

If you would prefer to put together your own solution, skip ahead to the Snapshot Definition section and set up your own custom snapshot definition entity, workflow and custom workflow activity as explained. Once you're done, come back here to read about how to create the query and the target entity.

FetchXML Query

Because Dynamics CRM doesn't think of reports the same way as Salesforce, I decided to use a FetchXML query as the data source for my snapshot solution. This query will be stored on the snapshot definition record, but you need to prepare it before you define your target entity so you can make sure to create all the right fields to store your snapshot data.

Here is a basic query that aggregates contact count and sum of annual income by owner and state:

<fetch mapping="logical" aggregate="true" distinct="false">
  <entity name="contact">
    <attribute name="fullname" alias="fullname_count" aggregate="count" />
    <attribute name="annualincome" alias="annualincome_sum" aggregate="sum" />
    <attribute name="address1_stateorprovince" groupby="true" alias="state_province" />
    <attribute name="ownerid" groupby="true" alias="owner" />
  </entity>
</fetch>

Custom Snapshot Target Entity

After you figure out the data you want to capture, you have to create a custom entity to store it. For the query I showed above, I have created a custom entity called "Snapshot - Contacts by State" (la_snapshotcontactsbystate). You can name your entities anything you want, but I suggest some sort of common prefix to keep them organized. I have also chosen organization ownership, but user/team ownership shouldn't cause any problems.

Snapshot target entity

On this entity, I have created custom fields as shown in the following screenshot:

Snapshot target custom fields

Obviously your custom fields will vary depending on the data your query returns. One critically important thing to remember is that you must use the exact same data types for your custom fields as are returned by the query. My custom workflow activity code is relatively unforgiving, so even something like trying to store an integer value in a decimal field, which would not have resulted in data loss, caused errors in my testing.
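For the sample query above, the aggregate output types determine the target field types. A plausible field list (the display names and types here are assumptions based on the standard contact schema, where annualincome is a Currency field) would be:

```
fullname_count    -> Whole Number         (count aggregates return integers)
annualincome_sum  -> Currency             (sum over the Money field annualincome)
state_province    -> Single Line of Text  (grouped-by text field)
owner             -> Lookup (User)        (grouped-by owner reference)
```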

Snapshot Definition

Once you have your target entity defined, you are ready to create a snapshot definition record to store the query, field mappings and frequency/scheduling details. The following screenshot shows a snapshot definition that corresponds to the query and target entity explained above.

Snapshot definition

  1. The target entity name is the system name of the target entity, in this case la_snapshotcontactsbystate.
  2. The next run date field is when the snapshot will execute. The workflow that executes the snapshot runs according to a timeout whenever this value changes, and at the end of the process it sets the next run date in the future according to the value of the frequency field (add one day/seven days/one month), thus triggering the next run. If you want to enable different frequency choices, you can update the picklist options and the next run date update logic in the workflow.
  3. The query field holds your FetchXML query.
  4. The field mappings are in the format source>target, one per line. You can use a separator character other than ">" if you update the custom workflow activity code and redeploy the assembly.
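For the sample query and target entity above, the mappings text might look like the following (the la_* schema names are hypothetical; use the actual schema names of your target entity's fields):

```
fullname_count>la_contactcount
annualincome_sum>la_annualincomesum
state_province>la_stateprovince
owner>la_owner
```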

Custom Workflow Activity

The custom workflow activity behaves a lot like the custom workflow activity I created for my Scheduling recurring Dynamics CRM workflows with FetchXML post. It takes a FetchXML query as an input parameter, executes the query and then loops through the results to do some additional work. For the data snapshots, we also need to know the name of the target entity and the field mappings. Here are the relevant input parameters:

[Input("FetchXML query")]  
public InArgument<string> FetchXMLQuery { get; set; }  
  
[Input("Snapshot mappings")]  
public InArgument<string> SnapshotMappings { get; set; }  
  
[Input("Target entity")]  
public InArgument<string> TargetEntity { get; set; }  

In the interest of saving space, I'm not posting the entire Execute method, but here is the section that does the query/loop/create bits:

try  
{  
    //get the source->target field mappings from the input parameters  
    string mappingsText = SnapshotMappings.Get(executionContext);  
  
    //split the mappings into lines, discarding the empty entries that  
    //splitting on the \r\n character pair would otherwise produce  
    string[] mappingLines = mappingsText.Split(Environment.NewLine.ToCharArray(), StringSplitOptions.RemoveEmptyEntries);  
  
    //create a dictionary object to hold the mappings  
    Dictionary<string, string> mappingDictionary = new Dictionary<string, string>();  
  
    //for each line in the array, split it on the ">" character and store the source->target as a key-value pair  
    foreach (string s in mappingLines)  
    {  
        try  
        {  
            string sourcefield = s.Split('>')[0];  
            string targetfield = s.Split('>')[1];  
            tracingService.Trace("line: " + s + " source: " + sourcefield + " target: " + targetfield);  
            mappingDictionary.Add(sourcefield, targetfield);  
        }  
        catch (Exception)  
        {  
            throw new InvalidPluginExecutionException("Error parsing source->target mappings.");  
        }  
    }  
  
    //get the target entity name from the input parameters  
    string targetEntityType = TargetEntity.Get(executionContext);  
  
    //execute the fetchxml query and loop through the results  
    EntityCollection recordsToProcess = service.RetrieveMultiple(new FetchExpression(FetchXMLQuery.Get(executionContext)));  
    recordsToProcess.Entities.ToList().ForEach(a =>  
    {  
        tracingService.Trace("instantiating new entity of type: " + targetEntityType);  
        Entity targetRecord = new Entity(targetEntityType);  
  
        //loop through the mapping key-value pairs  
        foreach (string key in mappingDictionary.Keys)  
        {  
            //check to see if the attribute exists in this particular result  
            //fetchxml results don't return fields with null values  
            if (a.Contains(key))  
            {  
                //if the result field is an AliasedValue, we need to unwrap it before assignment  
                if (a[key] is AliasedValue)  
                {  
                    //cast it to AliasedValue  
                    var resultfield = (AliasedValue)a[key];  
  
                    //set the target field to the unwrapped value  
                    targetRecord[mappingDictionary[key]] = resultfield.Value;  
  
                    //trace just in case  
                    tracingService.Trace("new source column [" + key + "]: destination [" + mappingDictionary[key] + "]: " + resultfield.Value.ToString());  
                }  
                else  
                {  
                    //set the target field to the result field directly  
                    //the target field must be the EXACT same type as the result field -  
                    //for example, this code fails if you try to store an integer in a decimal field  
                    targetRecord[mappingDictionary[key]] = a[key];  
  
                    //trace just in case  
                    tracingService.Trace("new source column [" + key + "]: destination [" + mappingDictionary[key] + "]: " + a[key].ToString());  
                }  
            }  
        }  
        try  
        {  
            //create the record  
            service.Create(targetRecord);  
        }  
        catch (FaultException e)  
        {  
            //catch organization service faults  
            tracingService.Trace("Exception: {0}", e.ToString());  
  
            // Handle the exception.  
            throw new InvalidPluginExecutionException("Could not create snapshot record.");  
        }  
        catch (Exception)  
        {  
            //catch anything else  
            throw new InvalidPluginExecutionException("Could not create snapshot record.");  
        }  
    });  
}

The Snapshotting Workflow

Finally, I created a workflow process called "Take Snapshot." As explained above, it runs whenever the next run date field on a snapshot definition record changes. Here's the business logic it executes at a high level:

  1. Timeout until snapshot definition's next run date
  2. Execute the Take Snapshot custom workflow activity with input parameters taken from the snapshot definition query, mapping and target entity fields
  3. Update the next run date based on the frequency

Snapshot Results

Once you set everything up and the workflow executes, you'll be able to query snapshot data. Here's a screenshot of an Advanced Find view with the snapshot from this example.
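As a sketch of what that query might look like, the following FetchXML (using the same hypothetical la_* schema names as the earlier examples) would pull the last 30 days of snapshot records for trend reporting:

```
<fetch mapping="logical">
  <entity name="la_snapshotcontactsbystate">
    <attribute name="la_contactcount" />
    <attribute name="la_stateprovince" />
    <attribute name="createdon" />
    <filter type="and">
      <condition attribute="createdon" operator="last-x-days" value="30" />
    </filter>
    <order attribute="createdon" />
  </entity>
</fetch>
```

Because each workflow run creates a fresh set of target records, the createdon date effectively timestamps each snapshot, so filtering and sorting on it gives you the time-series axis for charts and dashboards.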

Snapshot results

Happy data snapshotting!
