Till CRM 2011, we used Xrm.Page.getControl("grid") to get a grid control available on the form and perform runtime activities like changing its view/query, etc. We also had to write the complete script in the form's OnLoad event and add a timeout until the subgrid loaded, because subgrids are loaded asynchronously after the form. There was no way to get a trigger on load of a subgrid till CRM 2015 Update 1. What we used to do is explained in an older post here.
But with CRM 2015 Update 1, we now have an option to execute scripts when data is loaded into a subgrid. Because of this, we no longer need to add a timeout to the script and loop until the data is loaded into the subgrid.
With the addition of the GridControl.addOnLoad method, we can now add an event handler for the OnLoad event of a subgrid, which is more reliable than a form OnLoad handler with a timeout. This event is triggered whenever data is bound to the subgrid.
Adding this event is not like adding a form OnLoad event (from the form customization popup); instead, you register it in code from another event (e.g. form OnLoad) using the GridControl.addOnLoad method. Similarly, use GridControl.removeOnLoad to remove event handlers.
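Registration can be sketched as below. The control name "grid" and the handler names are illustrative, and the script assumes it runs on a CRM form where Xrm.Page is available.

```javascript
// A minimal sketch of registering a subgrid OnLoad handler from the form's
// OnLoad event. "grid" is a placeholder subgrid control name.
function onSubgridLoad() {
    // Runtime work against the subgrid goes here, e.g. changing its view/query.
    console.log("Subgrid data loaded");
}

function onFormLoad() {
    var gridControl = Xrm.Page.getControl("grid");
    // Guard against older versions where addOnLoad is not available.
    if (gridControl && gridControl.addOnLoad) {
        // Fires every time data is bound to the subgrid, so no timeout loop is needed.
        gridControl.addOnLoad(onSubgridLoad);
    }
}
```

Register onFormLoad as the form's OnLoad handler from the form customization popup; removal works symmetrically via gridControl.removeOnLoad(onSubgridLoad).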
While working on one project, we were required to move data from a legacy system into a live production system within a limited blackout window. One can calculate the time required based on the number of records, server configuration, etc., and even keep a buffer. But you never know when Murphy will play his role.
Below are a few points that, if considered, can help keep Murphy in his seat. (Just some pre-checks that help the process :) )
- Confirm disk space on the SQL, web and application servers. Keep some buffer space, as logs will grow.
- As your target system is a live production system, there will surely be scheduled maintenance jobs running to maintain server health as part of disaster management. These jobs are life savers, but they are scheduled during downtime because they consume a lot of resources, and unfortunately downtime is also the only window in which we can perform our import. You may therefore have to pause the maintenance jobs, but remember to turn them back on once the import is done. Some of the resource-consuming jobs are: consistency check, database backup, async-operation cleanup, POA, etc. Pausing these jobs lets the import utilize the maximum available server resources.
- Check the database log size and clear it. The log grows during the import, and if it reaches the maximum available threshold the import process throws timeout errors. Also check the shrink process; the Simple recovery model is preferred.
- Check CPU and memory usage on the SQL, web and application servers.
- Check for blocking on the SQL server: any blocking scripts or slow-running queries.
- Clear the AsyncOperationBase table.
- You may run into a scenario where you need to restart the SQL service. Make sure this does not affect any other process. Also, in case of NLB, restarting the SQL service switches the active node, so you will have to include that node in your performance checks.
- You may have to disable user logins and turn off workflows and plug-ins.
And last but not least: take a FULL database backup before starting the process. You may also have to do the migration in several passes due to the amount of data and the limited downtime. Identify the steps that can be performed outside the blackout window without affecting the live system; this provides extra time for completing the critical steps.
I would recommend doing the import in multiple small passes, which helps keep a buffer and reduces the chances of breaking things or running on the edge. After all, "Rome was not built in a day".
These are some of the steps that helped me. As always, they may not match your requirement exactly, but some of them surely will. And I don't guarantee anything from these steps; the risk is yours, as it is your production system.
If you have anything to add, please write it in a comment and I will update the content. Thanks!
Note: Contents re-blogged as-is from the TechNet blog.
The Configuration Migration tool enables you to move configuration data across Microsoft Dynamics CRM instances and organizations.
The Configuration Migration tool allows you to:
- Select the entities and fields from where you want to export the configuration data.
- Avoid duplicate records on the target system by defining a uniqueness condition for each entity based on a combination of fields in the entity, which is used to compare against the values on the target system. If there are no matching values, a unique record is created on the target system. If a matching record is found, the record is updated on the target system. If no duplicate detection (uniqueness) condition is specified for an entity that is being exported, the tool uses the primary field name of the entity to compare against the existing data on the target system.
- Disable plug-ins before exporting data and then re-enable them on the target system after the import is complete for all the entities or selected entities.
- Validate the schema for the selected entities to be exported to ensure that all the required data/information is present.
- Reuse an existing schema to export data from a source system.
- Embed the exported modules created from this tool (schema and data files) in other programs. For example, you can use the exported data in Microsoft Dynamics CRM Package Deployer along with other solutions files and data to create and deploy packages on a CRM instance.
Note that the Configuration Migration tool does not support filtering of records in an entity. By default, all the records in the selected entity will be exported.
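The duplicate-detection behavior described above can be sketched as follows; the function and sample field names are illustrative, not part of the tool itself.

```javascript
// Sketch of the tool's duplicate-detection decision: compare the configured
// uniqueness fields (or the primary field when none are configured) against
// the records already on the target system.
function decideAction(sourceRecord, targetRecords, uniquenessFields, primaryField) {
    var keys = (uniquenessFields && uniquenessFields.length > 0)
        ? uniquenessFields
        : [primaryField];
    var match = targetRecords.find(function (target) {
        return keys.every(function (key) { return target[key] === sourceRecord[key]; });
    });
    // A match means the target record is updated; otherwise a new one is created.
    return match ? { action: "update", target: match } : { action: "create" };
}
```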
The following diagram illustrates how the Configuration Migration tool is used for migrating configuration data.
Note: This is all about UNSUPPORTED database-level updates. These kinds of updates are not recommended (unless you are left with no other option).
Though any kind of database-level update is not recommended in CRM, there are scenarios where we are required to do them. I had one such requirement, where I had to perform a simple update (setting a two-options field's value to Yes) for millions of records. Doing the same in a supported way (bulk update or bulk workflow execution) would have taken days due to CRM service calls, but the same update using a SQL script was done in a few hours.
I had auditing enabled for these records in CRM, but as this was an unsupported db-level update I was not expecting much help from audit history. What I expected from auditing was either of the below:
- The change tracked as a new update request in audit history (if auditing is maintained by some kind of trigger at the db level).
- No updates at all in audit history (assuming auditing is maintained by some kind of workflow/plugin that triggers at the application level).
Unfortunately, neither of these happened, and what did happen was neither audit-friendly nor expected. This is what happened:
The change made by the SQL script was tracked against the create event itself in audit history. Only the field value change was tracked, not the change date/time; the change date/time still showed when the record was created.
Though these kinds of changes are not recommended, I still feel that in some cases they are required, and this behavior of auditing affects consistency. I feel that things should be either completely baked or not baked at all rather than half baked. But again, this is only what I feel. :)
I created a Contact record on 7/6/2015 at 11:52 AM with ‘First R~~~~~ Created’ set to ‘No’. Audit history shows something like below.
Now, on 7/6/2015 at 12:00 noon, I updated the value of 'First R~~~~~ Created' to 'Yes' using a database-level update script. Ideally, a new entry should have been made in the audit log to track this change, with no changes to existing audit history records. But CRM updated the changed value on the create event itself, without even tracking the change time.
Hope this helps, and thanks if you have read all the way down here.
I have an OOB workflow process that was developed some time back and activated. Now, when I try to open the process, I get a generic CRM error (below image). Also, the process status is changed to "Draft" automatically, though no updates were made to the process.
I have seen these issues frequently, especially when working with child workflow processes, where a child workflow is called from a master workflow. If you have a master workflow calling a child workflow and, for some reason, the child workflow is activated as a "Process Template", users get this error when they try to open the master workflow.
CRM throws a generic exception, and the workflow's status is changed to Draft, if it is linked/related to a "Process Template" instead of a "Process".
Another common issue faced while working with workflow processes is an "Invalid Expression" error in a "Condition Expression" step. This occurs if the field used in the condition expression has either been deleted from the system or had its value changed. Once you fix the field/value issue, all conditions in the process are automatically fixed.
We all know that whenever we create an entity in CRM, CRM creates a few fields by default. Some of these, like Owner, Owning Business Unit, Created On, Modified On, etc., are used extensively while working with Dynamics CRM. But there are some fields that we use rarely or even never, and thus tend to forget about. Some of these fields are really important and can save a lot of custom work; it's just that we should be aware of them.
Recently there was a question on the Dynamics Community regarding the use of these fields, which inspired me to research and document them. A few fields can be understood from their names, but some don't even have a proper description in CRM. For such fields, the table below can help. The links against the fields have more explanation and scenarios where they can be used.
- createdby: User who created the record.
- createdon: Date and time when the record was created.
- createdonbehalfby: To create records on behalf of another user. Used for impersonation in CRM 2011.
- exchangerate: When I create a record, such as an opportunity, and use a non-base currency, such as GBP in this hypothetical scenario, this is what happens: I set the Currency field to GBP, put a value in the money field (called Estimated Revenue) and save the record. During the save operation, the Exchange Rate field is populated with the current rate from the GBP currency record, and the Estimated Revenue_base field is populated with the value from the Estimated Revenue field converted to its inflated USD value. Both of these updates occur whether the Exchange Rate and Estimated Revenue_base fields are exposed on the form or not. Afterwards, when the dollar makes that incredible comeback, I need to update my exchange rate on the GBP currency. I do so; it's now 1, reflecting a 1:1 conversion rate. What happens to my opportunity? Nothing, at least not immediately. The Exchange Rate field on my opportunity is not automatically populated with the new value of 1. However, as soon as I change the value of ANY money field on the opportunity and save the form, the Exchange Rate field is updated with the new value from the GBP currency record, and the Estimated Revenue_base field is updated with the new converted value (and so are any other _base fields for money fields on the form; remember, there is only one Currency and one Exchange Rate field, so these values apply to all money fields on the form). This also happens if I change the state of the record, such as closing the opportunity as won or lost, activating or closing a quote, etc.
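The conversion described above can be expressed as a small helper. This assumes, as CRM does, that the exchange rate is stored as the number of transaction-currency units per one base-currency unit, so the _base value is the money value divided by the rate.

```javascript
// Compute the _base (base currency) value of a money field.
// exchangeRate = transaction-currency units per 1 base-currency unit.
function toBaseValue(moneyValue, exchangeRate) {
    return moneyValue / exchangeRate;
}
```

For example, with a GBP rate of 0.5 per USD, an Estimated Revenue of 100 yields an Estimated Revenue_base of 200; after the rate is updated to 1:1 and a money field is re-saved, it would recalculate to 100.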
- importsequencenumber: The import sequence number is a whole-number field (the range can be customized if needed). The basic idea behind this field is to store the sequence number (ID) of the source record during data import to CRM. If this field is mapped during migration package/script design, it provides a one-to-one link between the source row and the destination CRM record.
- modifiedby: User who modified the record.
- modifiedon: Date and time when the record was modified.
- modifiedonbehalfby: To modify records on behalf of another user. Used for impersonation in CRM 2011.
- overriddencreatedon: Date and time that the record was migrated.
- timezoneruleversionnumber: The definition of a recurrence pattern of how local time is converted to/from Coordinated Universal Time (UTC). This includes information about Daylight Saving Time (DST) versus Standard Time. Time zone rules can change over time, so a time zone can have multiple rules from a historic point of view, but only one rule that is current and in effect.
- transactioncurrencyid: Currency associated with the entity.
- utcconversiontimezonecode: Time zone code that was in use when the record was created.
- versionnumber: This column is used mainly for concurrency support. The VERSIONNUMBER field is a unique value that gets incremented as records are updated; it can be very useful.
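As a hedged illustration of how a version number supports optimistic concurrency, here is a generic sketch (not a CRM API): an update goes through only if the caller still holds the record's current version.

```javascript
// Generic optimistic-concurrency check: apply changes only when the caller's
// version number matches the record's current version number.
function tryUpdate(record, expectedVersion, changes) {
    if (record.versionnumber !== expectedVersion) {
        return { success: false, reason: "record was modified by someone else" };
    }
    Object.assign(record, changes);
    record.versionnumber += 1; // incremented on every successful update
    return { success: true };
}
```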
One of my colleagues recently had a requirement to import a few records into CRM with back-dated values. Isn't that a simple task? Just map/set the back-dated value to the 'overriddencreatedon' field while creating/importing the records. But has anyone checked what happens to the value of the 'createdon' field?
All views and fields in CRM display the 'createdon' field, not 'overriddencreatedon'. So whatever value we set for 'overriddencreatedon' ends up in the 'createdon' field, and the actual date/time when the record was created in CRM is set on 'overriddencreatedon'.
E.g.: I create a record at 2015-06-16 11:45:23.000 UTC with overriddencreatedon set to 2015-06-16 11:44:39.000 UTC. Upon completion, if you query the database you will see that the value I set for overriddencreatedon landed in 'createdon', and 'overriddencreatedon' was replaced with the actual date/time when the record was created.
Create request at 2015-06-16 11:45:23.000 UTC:
createdon = null
overriddencreatedon = 2015-06-16 11:44:39.000 UTC
Values stored in the database:
createdon = 2015-06-16 11:44:39.000 UTC
overriddencreatedon = 2015-06-16 11:45:23.000 UTC
Note: Setting ‘overriddencreatedon’ on update of record is not supported.
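A minimal sketch of a create payload that sets overriddencreatedon. The helper and sample values are hypothetical; actually sending the request to the organization service/OData endpoint is omitted here.

```javascript
// Hypothetical helper that builds a create payload for a contact with a
// back-dated created-on value. Field names follow CRM schema names.
function buildContactPayload(firstName, lastName, backDatedCreatedOn) {
    return {
        firstname: firstName,
        lastname: lastName,
        // CRM copies this value into createdon and stamps overriddencreatedon
        // with the actual creation time, as described above. Create only.
        overriddencreatedon: backDatedCreatedOn
    };
}
```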
Update 20th Feb 2016:
The Microsoft Dynamics CRM content pack for Power BI Preview allows you to easily access and analyze CRM data in Power BI. The content pack uses the OData feed to create a descriptive model, with all the entities and measures needed such as Accounts, Activities, Opportunities, Product, Leads, Users and more.
After you have a Power BI subscription, click Get Data on the welcome screen.
Select Microsoft Dynamics CRM and click Connect.
Make sure your popup blocker is disabled or is set to allow popups from app.powerbi.com.
Provide the OData URL associated with your account. This will be in the form "https://mytenant.crm.dynamics.com/XRMServices/2011/OrganizationData.svc".
When prompted, provide your credentials (this step might be skipped if you are already signed in with your browser). For Authentication Method, enter oAuth2 and click Sign In:
After connecting, you’ll see a dashboard customized for a Sales Manager, populated with your own data:
For more information, see the link below. Happy analyzing the data :)
[Original post from the Microsoft CRM Team blog]
CRM 2015 introduced a new API for enabling creation of applications across multiple platforms.
This new Web API is available as a preview feature to Microsoft Dynamics CRM Online organizations that use Dynamics CRM Online 2015 Update 1.
To support this preview feature we have put together documentation and samples you can find at https://msdn.microsoft.com/dynamics/crm/webapipreview.
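As a sketch of what querying the new endpoint looks like, here is a hypothetical URL-builder; the "/api/data/" path segment is taken from the preview documentation linked above (treat it as an assumption), and the query options are standard OData.

```javascript
// Hypothetical helper that builds a Web API query URL for the preview
// endpoint, using standard OData query options ($select, $top).
function buildWebApiUrl(orgUrl, entitySet, selectFields, top) {
    var query = "?$select=" + selectFields.join(",");
    if (top) {
        query += "&$top=" + top;
    }
    return orgUrl + "/api/data/" + entitySet + query;
}
```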
While working on one upgrade project, multiple performance issues were experienced after successful completion of all activities. While trying to troubleshoot the issue, maybe in the wrong direction, I found this great article from Sudhir Nory which explains the root cause. This may not be the cause of every performance issue on an upgrade project, but it is surely helpful to know.
When a database is upgraded to SQL Server 2014 from an earlier version of SQL Server, the database retains its existing compatibility level if it is at least 100 (SQL Server 2008/2008 R2). If the compatibility level is 110 or lower, the database uses the old query optimizer, which may not be effective. We found that the compatibility level shown in the Compatibility level list box was SQL Server 2008 for the ORG_MSCRM database. SQL Server 2014 includes substantial improvements to the component that creates and optimizes query plans, and these are utilized only when the database compatibility level is set to 120 (SQL Server 2014).