Sonoma Partners Microsoft CRM and Salesforce Blog

Power BI Online Integration with Dynamics CRM On-Premise

Today's blog post was written by Hayden Thomas, Associate Developer at Sonoma Partners.

Integrating Power BI online with Dynamics CRM On-Premise is not currently supported natively. Recently we needed to integrate a Power BI report with a Dynamics CRM On-Premise environment, so we created a custom solution to enable embedding reports and dashboards from Power BI into CRM.

Power BI natively allows reports to be ‘Published to Web.’ Doing this would let us simply IFrame the report on a Dynamics form or dashboard, but it also makes the report accessible to anyone who has the link, which is not secure. That is unsuitable here, because the reports we’re looking to embed may contain sensitive data that we need to keep from any outside access.

In our case, we are connecting to Power BI through Azure. Azure uses OAuth 2.0 and Active Directory services for authentication. We need to be able to store:

  1. A Client ID that represents a connection to Power BI through Azure.
  2. A Tenant ID for our Azure Active Directory.
  3. An access token that will allow us to request a report from Power BI.
  4. A refresh token that will allow us to programmatically keep our authentication alive, so we don’t need to keep re-entering our username and password.
  5. The lifespan of the authentication and the date-time we obtained it, so we can check if our current authentication is still good.
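As a rough sketch of how items 3–5 above fit together, a helper along these lines (the function and parameter names here are ours, not from the actual solution) can decide whether the stored authentication is still good:

```javascript
// Hypothetical helper: decide whether the stored access token is still usable.
// `obtainedOn` is the date-time we saved when the token was issued;
// `lifespanSeconds` is the "expires_in" value returned by the token endpoint.
function isAuthenticationValid(accessToken, obtainedOn, lifespanSeconds, now) {
    if (!accessToken) {
        return false; // nothing stored yet, must authenticate from scratch
    }
    var expiresOn = obtainedOn.getTime() + lifespanSeconds * 1000;
    // Refresh a little early (60s buffer) so we never send an expired token.
    return (now || new Date()).getTime() < expiresOn - 60 * 1000;
}
```

The 60-second buffer is a design choice so that a token is refreshed slightly before it actually expires, rather than failing mid-request.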

The Client and Tenant ID are the same for all of the users in our org, so we simply created a configuration record to hold these values.

The other fields must either be created for every user, which means asking each user to enter their credentials to populate them, or we need to create a service account that can be authenticated in the background, without the user needing any knowledge of how the reports are being displayed.

Our solution used the second option. Our reports are shared with a group in Power BI, and having everyone log in with their own tokens would require adding them all to that group (and, in turn, require everyone to have a Power BI Pro license). Using a service account also allowed us to simply add the authentication fields to the configuration record, along with the service account’s username and password.

[Image: Powerbi brendan 1]

Our next step is to make sure we actually have an app registered in our Azure AD that we can authenticate this user against. If we log in to the app registration portal, we can register one directly to ensure that it’s set up correctly. To avoid having to handle redirect URLs, since we expect to move this solution between orgs without much issue, we choose Native app from the App Type drop-down and use the standard native-client redirect URI. For our case, where we only want to be able to read dashboards and reports, we give it only the read all dashboards and read all reports access levels. Once done, we can click Register App to obtain the Client ID we will use for our configuration record.

We have our app created, but our service account hasn’t yet granted the app permission to log it in programmatically. To grant that permission, we wrote a LINQPad script that does nothing but connect to the Client ID of our app and allow the user to log in to grant access.

[Image: Powerbi brendan 2]

Running the script will pop up a dialog to allow a user to log into the App created with the specified client ID.

[Image: Powerbi brendan 3]

To connect and display the report, we look to the Power BI documentation on how to show a report in an IFrame. We see that we need an embed URL and an access token. Since we need to send information to the IFrame after its source is already set, and because we want to be able to use different reports in different areas, we create an HTML web resource that contains an IFrame, and set the frame contents using JavaScript.

[Image: Powerbi brendan 4]

Excess code for styling and other libraries used in JavaScript removed for brevity.

The JavaScript in this page does a number of things. When the resource initially loads, it parses the report ID, and the ID of the group in which the report is stored, from the query string. This lets us use the same web resource on the same page to load multiple reports. In the web resource properties in Dynamics, we can set the report and group ID accordingly in the custom parameters.
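That parsing step might look like the following sketch, assuming the IDs arrive via the web resource's standard `data` query-string parameter (the function name is hypothetical):

```javascript
// Hypothetical sketch: pull reportId/groupId out of a web resource URL.
// Dynamics passes custom parameters to a web resource in a single
// URL-encoded "data" parameter, e.g. ?data=reportId%3D123%26groupId%3D456
function getCustomParameters(queryString) {
    var result = {};
    var pairs = queryString.replace(/^\?/, '').split('&');
    for (var i = 0; i < pairs.length; i++) {
        var parts = pairs[i].split('=');
        result[decodeURIComponent(parts[0])] = decodeURIComponent(parts[1] || '');
    }
    // The "data" value is itself a URL-encoded parameter string, so unwrap it.
    return result.data ? getCustomParameters(result.data) : result;
}
```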

[Image: Powerbi brendan 5]

It then triggers a custom action that takes in both of those parameters. The custom action triggers some plugin code that loads the configuration record, ensures the authentication is up to date, and then queries Power BI for the embed URL for that report.

To ensure our authentication is up to date, we check whether we have an access token and whether it has expired (based on the authentication lifespan and authentication obtained date-time fields we have on our configuration). If we don’t have a refresh token, we use the password grant_type along with the service account username and password. If we can refresh, we do something similar using the refresh_token grant_type and the refresh token stored in our configuration record. More details on Azure OAuth operations can be found in Microsoft’s Azure AD documentation. In this example, we deserialize into an AccessToken model class that’s described by that documentation.
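A minimal sketch of the two request bodies, assuming the standard Azure AD token endpoint parameters (`buildTokenRequestBody` and the config field names are our own, not from the actual plugin):

```javascript
// Hypothetical sketch of the form-encoded body sent to the Azure AD token
// endpoint. Which grant we use depends on whether a refresh token is stored.
function buildTokenRequestBody(config) {
    var fields = config.refreshToken
        ? { grant_type: 'refresh_token', refresh_token: config.refreshToken }
        : { grant_type: 'password', username: config.username, password: config.password };
    fields.client_id = config.clientId;
    // Well-known Power BI resource identifier for Azure AD.
    fields.resource = 'https://analysis.windows.net/powerbi/api';
    return Object.keys(fields).map(function (key) {
        return key + '=' + encodeURIComponent(fields[key]);
    }).join('&');
}
```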

[Image: Powerbi brendan 6]

With our access token, we can query Power BI for the reports shared with the group ID we passed in as a parameter, by issuing an HTTP GET request against the Power BI REST API’s reports endpoint for that group with an Authorization: Bearer ACCESS_TOKEN header.

[Image: Powerbi brendan 7]

This will give us a JSON response containing an array of all of the reports for the group. Each entry in the array will contain the report ID and the embed URL. There are additional fields, such as the display name for the report, but they’re unimportant for what we’re doing. We simply need to find the entry that has the report ID that we passed in, and return the embed URL and access token back to the web resource.

[Image: Powerbi brendan 8]
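Finding that entry might look like this sketch, assuming the response uses the usual OData-style `value` wrapper (`findEmbedUrl` is a hypothetical name):

```javascript
// Sketch: given the JSON returned by the group reports call, find the
// embed URL for the report we were asked to display.
function findEmbedUrl(responseJson, reportId) {
    var reports = JSON.parse(responseJson).value || [];
    for (var i = 0; i < reports.length; i++) {
        if (reports[i].id === reportId) {
            return reports[i].embedUrl;
        }
    }
    return null; // report not shared with this group
}
```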

Once we have those fields in the client side, we can simply set the source of our Iframe to the embed URL we received, and post the access token to it.
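A sketch of that client-side step follows; the exact message shape expected by the Power BI embed page is an assumption here (based on the embedding samples of the era), not a confirmed contract:

```javascript
// Hypothetical sketch: point the IFrame at the embed URL, then post the
// access token to it once the frame has loaded.
function showReport(iframe, embedUrl, accessToken) {
    iframe.onload = function () {
        iframe.contentWindow.postMessage(
            JSON.stringify({ action: 'loadReport', accessToken: accessToken }), '*');
    };
    iframe.src = embedUrl;
}
```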

Now we can see our Power BI report in an IFrame. In this case, we embedded it as a dashboard in Dynamics CRM.


Topics: Microsoft Dynamics 365, Microsoft Dynamics CRM

Async is Smooth, Smooth is Fast

Today's blog post was written by William "Dibbs" Dibbern, Principal Developer at Sonoma Partners.

Having worked with Dynamics 365 for Sales (CRM) for nearly seven years now, we've seen a lot of things which cannot be unseen when it comes to client-side code. Whenever we get called in to evaluate the current state of an implementation, a good gauge by which to judge the state of the union is a glance at the JavaScript. Since JavaScript as a language is so forgiving, you can end up with a lot of code that "just works" but suffers in the areas of durability, maintainability, and performance.

What's the number one offense we see? Synchronous service calls.

When code degrades the end user experience, it shoots straight to the top of our "fix it now" list.

What's so bad about synchronous service calls?

Synchronous service calls have been proven to be detrimental to the user experience. The browser literally stops everything to wait for the result of the call. While the browser is locked up waiting, the user can't click anywhere else, enter any other information, cancel the operation, or see any updates. Accordingly, users can't be given a glimpse into any synchronous operation's progress. In certain browsers, like Google Chrome, these types of synchronous requests are even being deprecated, so you won't be able to make them much longer.

How do we fix this on forms?

Synchronous requests from the form are fairly straightforward to transition to asynchronous calls. Either continue processing the result in a callback function or, even better, implement promises for a cleaner way of dealing with multiple service calls. This does require a bit of rethinking, but after a few refactors it quickly becomes second nature. Hastily written code almost always has side effects: a few minutes saved coding could cost users hours of productivity in the long run.

Below is a contrived example where we retrieve the full name of the primary contact for the current record's parent account. Phew that's a mouthful. Long story short: let's retrieve a value from a related record. This example is designed as such purely to demonstrate the difference when multiple service calls are required, so yes, you can actually retrieve the required information in one service call. Also note that I've abstracted away all of the underlying XMLHttpRequest code, expecting that you are using a library to wrap that as well (though hopefully not jQuery, but that's another subject for another post).

  /* Bad */
  function onParentAccountChanged(parentAccountId) {
    var parentAccount, primaryContact;
    if (!parentAccountId) {
      return;
    }
    parentAccount = WebAPI.get('/accounts?$filter=accountid eq ' + parentAccountId);
    if (!parentAccount || !parentAccount._primarycontactid_value) {
      return;
    }
    primaryContact = WebAPI.get('/contacts?$filter=contactid eq ' + parentAccount._primarycontactid_value);
    if (primaryContact && primaryContact.fullname) {
      /* use primaryContact.fullname */
    }
  }

  /* Good (using callbacks) */
  function onParentAccountChanged(parentAccountId) {
    if (!parentAccountId) {
      return;
    }
    WebAPI.get('/accounts?$filter=accountid eq ' + parentAccountId,
      function onSuccess(parentAccount) {
        WebAPI.get('/contacts?$filter=contactid eq ' + parentAccount._primarycontactid_value,
          function onSuccess(primaryContact) {
            if (primaryContact && primaryContact.fullname) {
              /* use primaryContact.fullname */
            }
          },
          function onError(e) { /* handle error */ });
      },
      function onError(e) { /* handle error */ });
  }

  /* Good (using promises) */
  function onParentAccountChanged(parentAccountId) {
    if (!parentAccountId) {
      return;
    }
    WebAPI.get('/accounts?$filter=accountid eq ' + parentAccountId)
      .then(function onSuccess(parentAccount) {
        return WebAPI.get('/contacts?$filter=contactid eq ' + parentAccount._primarycontactid_value);
      })
      .then(function onSuccess(primaryContact) {
        if (primaryContact && primaryContact.fullname) {
          /* use primaryContact.fullname */
        }
      })
      .catch(function onError(e) { /* handle error */ });
  }


Did you notice how, in the last good example using promises, the flow is actually very similar to how you would do things synchronously? That's one of many reasons why we love promises.

Did you also notice how the code in the callbacks example, with all the indentation, starts to look like a pyramid? That's one of the many reasons why we don't like callbacks as much as promises. You could flatten that out by pulling the callback functions out and defining them alongside onParentAccountChanged, but let's be honest here: that doesn't usually happen until it's too late.

What about the command bar (ribbon)?

OK. Let's address the tricky bit: custom enable rules. You might think you need your code to immediately return the result of your service call so that Dynamics knows whether to show the button as enabled or disabled, but this is not the case. You can return a smart default (usually disabled), make your service calls to determine what the actual state should be, and then refresh the ribbon (by invoking Xrm.Page.ui.refreshRibbon()) so that it presents the newly determined state. A bit steppy, with the potential for a flash of disabled -> enabled or vice versa, but overall a better experience than having the form lock up.

Looking for an example? Have a look below. The following example checks if the parent account's primary contact field is set:

  /* Bad */
  function isPrimaryContactSet(accountId) {
    var account;
    if (!accountId) {
      throw new Error('`accountId` is a required parameter for `isPrimaryContactSet`');
    }
    account = WebAPI.get('/accounts?$filter=accountid eq ' + accountId);
    if (account._primarycontactid_value) {
      return true;
    }
    return false;
  }

  /* Good */
  var isPrimaryContactCheckComplete = false,
      isPrimaryContactSetResult = false;

  function isPrimaryContactSet(accountId) {
    if (!accountId) {
      throw new Error('`accountId` is a required parameter for `isPrimaryContactSet`');
    }
    if (isPrimaryContactCheckComplete) {
      return isPrimaryContactSetResult;
    }
    WebAPI.get('/accounts?$filter=accountid eq ' + accountId)
      .then(function onSuccess(parentAccount) {
        if (parentAccount._primarycontactid_value) {
          isPrimaryContactSetResult = true;
        }
        else {
          isPrimaryContactSetResult = false;
        }
        isPrimaryContactCheckComplete = true;
        Xrm.Page.ui.refreshRibbon();
      })
      .catch(function onError(e) { /* handle error */ });
    return false;
  }


Let's take note of a few key differences found in the "good" example:

  1. The service call is only ever run once as the result is cached.
  2. We default the button to disabled: the first time the rule runs, the check hasn't completed, so we return false.
  3. The variables storing the state of the call live outside the function being invoked. They shouldn't be global variables, however; they should instead be local to a parent scope that is not the global scope. This will help avoid conflicts.
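One way to satisfy that last point is to close the state variables inside an immediately-invoked function expression. A sketch (the namespace and function names are ours):

```javascript
// Sketch: keep the cached enable-rule state in a namespace created by an
// IIFE instead of polluting the global scope.
var RibbonRules = (function () {
    var checkComplete = false;
    var result = false;

    return {
        // Called once the async service call finishes.
        setResult: function (value) {
            result = value;
            checkComplete = true;
        },
        // The enable rule itself: smart default until the check completes.
        isPrimaryContactSet: function () {
            return checkComplete ? result : false;
        }
    };
})();
```

Only `RibbonRules` is exposed; the two state variables are invisible to any other script loaded on the form.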

A Special Note on Progress Indicators

I'm guessing by now you may have had the thought that since we're not locking up the browser any more, the user is now free to double-click buttons and duplicate actions. I believe there are certain time frames in which that is fine. As long as a user sees a relatively instantaneous response of any positive form, they are not very likely to anger-click again and again.

However, if you do have a longer running request being executed, you'll probably need to introduce some form of progress notification in order to keep the user informed. For example, when you have a ribbon button that processes a bunch of records, perhaps have the button pop open a dialog which would display a progress indicator, and let the actual operation run inside the dialog's code. This way the user sees something "productive" happening immediately.

What's the cutoff that determines if you need a progress indicator? The Nielsen Norman Group provides a good rule of thumb: if the operation is going to take longer than 10 seconds, you should provide a progress indicator, and consider displaying detail of the changes in progress if possible. If the operation averages between 2 and 10 seconds, the recommendation is to provide an indeterminate progress indicator.
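That rule of thumb, expressed as a tiny (hypothetical) helper:

```javascript
// Sketch of the NN/g rule of thumb as code: pick a feedback style based on
// the expected duration (in seconds) of the operation.
function progressIndicatorFor(expectedSeconds) {
    if (expectedSeconds > 10) {
        return 'percent-done'; // show real progress, ideally with detail
    }
    if (expectedSeconds >= 2) {
        return 'indeterminate'; // spinner-style "working" indicator
    }
    return 'none'; // fast enough that the result itself is the feedback
}
```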

In Summary

You should (almost) never be using synchronous service calls. 99.99% of the time you can, with a little bit more elbow grease and human processing power, accomplish the same exact thing with the application of an asynchronous pattern.

Topics: Microsoft Dynamics 365

Dynamics 365 Demo Video: Auto Capture

Today's blog post and video were created by Bryson Engelen, Sales Engineer at Sonoma Partners.

I'd like to share one of my favorite features of Dynamics 365 in this demo video.

Auto Capture lets you see emails relevant to an Account or Contact from within Dynamics 365 before they are tracked into the system.

Dynamics 365 now finds email messages within Microsoft Exchange to or from relevant email addresses and displays them right against the related records in Dynamics 365. Then, with just a click, you can move them from the personal intelligence of your inbox up to the collective intelligence of Dynamics 365, making them available to your team and for use by other Relationship Insights features.

You can still track messages in Dynamics 365 in all the ways that you could before, this just adds another way to bring emails from Exchange into Dynamics.  The emails found by auto capture show up inline with the rest of the record’s activities and are displayed on Contact, Opportunity, Lead, Account, Case, and Custom entities.

It should be noted that each email is private and visible only to you unless you choose to convert it to a tracked email. It displays in gray with a dotted border and includes a Track link and a private email label. To bring a message into Dynamics 365, all you need to do is click the Track link. To see Auto Capture in action, watch the video below.

Questions or comments? Drop us a line.

Topics: Microsoft Dynamics 365

Bulkify Bulk Edit

Today's blog post was written by Mike Dearing, Development Principal at Sonoma Partners.

Microsoft Dynamics’ bulk edit functionality is a convenient way to mass update multiple records of the same entity at the same time.

While some additional avenues exist for doing bulk edits, such as native excel imports, or excel online and editable grids for Dynamics 365 Online customers, the bulk edit dialog still provides efficient means for quickly mass updating one or two fields across a filtered set of records. There are a couple of limitations, however, that clients tend to ask about:

How do I blank out values?

There unfortunately isn’t a ‘clear’ keyword like you may be familiar with via workflow functionality. Since every field from the most recently visited main form for the entity currently being edited is available for updating, only those fields which have had values supplied are actually updated once you apply your edit. There are a few workarounds here though:

  1. Create a workflow/plugin that executes when the field you want to clear has had a specific value populated within it. For instance, if you have a text field that you want to clear through a bulk edit, perhaps entering the phrase <Clear> triggers logic (the workflow ‘Clear’ action, or a plugin) that blanks out the value. This approach is largely limited to specific field types, such as single line of text or multi-line of text fields. For example, if you want to clear out Lead’s Description field when a value of <Clear> is entered, you would configure your workflow as follows:

    1. When description changes

      [Image: Bulk 1]

    2. Then clear Description

      [Image: Bulk 2]

    3. If Description's text is <Clear>

      [Image: Bulk 3]

      If you enter <Clear> into the Description field and press 'Change,' you'll see that each of the selected records has had its Description value cleared.

  2. Create a ‘Clear X’ field and a workflow/plugin to support it. This could be one additional bitfield per field that you’d like to clear, or an option set that lists out field names that correspond to a list of fields that you want to enable for clearing. Similar to the first approach, you’ll create a workflow/plugin that executes here to do the heavy lifting for you, but this time it will be based on the value within your ‘Clear X’ field. You should also remove the selected value within the ‘Clear X’ field, so that it remains stateless. Once you have this field on the form, it will appear on the bulk edit dialog as well, and users can select it as they’d select any other field within the dialog. You’ll want to hide this field on the main entity form since it doesn’t serve much purpose within a normal edit form, but make sure that it still is available from the bulk edit form. I discuss how to do this later in the ‘How do I control which fields are available for bulk edit?’ part of this post. For example, if you want to clear out Lead’s Description field or Lead’s Industry field, you would do the following:

    1. Create a new ‘Clear Value’ option set.

    2. Add 2 options, one called ‘Description’ and one called ‘Industry’

      [Image: Bulk 4]

    3. Place ‘Clear Value’ on the lead form

    4. Create your workflow to clear ‘Description’ if Clear Value is set to Description, or to clear ‘Industry’ if Clear Value is set to Industry. Also revert Clear Value back to blank.

      [Image: Bulk 5]
      [Image: Bulk 6]
      [Image: Bulk 7]
      [Image: Bulk 8]
      [Image: Bulk 9]

      If you select ‘Description’ for Clear Value and press ‘Change’, you’ll see that each of the selected records has had its Description value cleared. Similarly, if you select ‘Industry’ for Clear Value and press ‘Change’, you’ll see that each of the selected records has had its Industry value cleared.

  3. Create an on demand workflow per field that needs to be cleared. Using the <Clear> action mentioned above, this can be accomplished without much effort. This detracts from the seamless experience of managing all mass edits through the bulk edit dialog, and also doesn’t scale well if there are many fields that you want to enable for bulk clear functionality.

The second option above is the most elegant and flexible if you have the time to implement it. For the first or second option, I prefer plugins over workflows due to the efficiency of doing these clears in the pre-operation pipeline, meaning the changes are written to the database as part of the same update initiated by the bulk edit, whereas workflows execute post-operation, kicking off a second update once the bulk edit’s update has completed. I’ve also considered leveraging dialogs, but their single-record execution prevents them from being a viable approach for bulk edits.

How do I control which fields are available for bulk editing?

Rather than there being a ‘bulk edit’ form type, Dynamics leverages the main form type. In the case of multiple main forms, the most recent form that the user has visited will be the basis of their bulk edit form. Even though you can’t edit subgrids and certain other fields, those will still appear on the bulk edit dialog, as well as fields that you conditionally hide via business rules or JavaScript, since neither business rules nor JavaScript execute on bulk edit forms. The suggested approach here is as follows:

  1. To set a field’s visibility such that it doesn’t appear on the bulk edit dialog, hide fields via the main form’s customizations. These fields presumably need to be displayed on the main form still, so define a business rule ‘show’ action per field that should be displayed, or JavaScript to similarly show these fields on load of the main form.

    For example, if you’d like to hide the Website field from Lead’s bulk edit, but still want it to appear on the main form, you would define a business rule as follows:

    [Image: Bulk 10]

    You would also need to set this field as not visible through the form’s customizations.

    [Image: Bulk 11]

  2. To set a field’s visibility such that it only appears on the bulk edit dialog, you’ll want to do the opposite of above. Show fields via the main form’s customizations, then define a business rule ‘hide’ action per field that should be hidden, or JavaScript to similarly hide these fields on load of the main form.

    For example, if you’d like to show the field that we created above named ‘Clear Value’ on the bulk edit form, but don’t want it to appear on the main form, you would define a business rule as follows:

    [Image: Bulk 12]

Since business rules and JavaScript don’t execute on bulk edit forms, the visibility option that you specify for that field through the main form’s customizations will be the visibility of that field on the bulk edit dialog.

With minimal effort, you’ve now enhanced your bulk edit dialogs to be more powerful and more user-friendly. Happy editing!

Topics: Microsoft Dynamics 365

Dynamics 365: Miscellaneous Security Permissions

Today's blog post was written by Jen Ford, Principal QA at Sonoma Partners.

There are so many permissions to consider when you are setting up access for your users. Should I remove delete privileges from Contacts? Should a user be able to view all Cases or should some roles have no access? Should I restrict Account permissions to only see those that the user owns? In addition to making these decisions for entity-specific permissions, there are a slew of Miscellaneous Privileges on each tab of the Security Role that we can set for additional access to special privileges that aren’t a blanket permission on whether or not a user has read, write, or delete privileges to a specific entity. Some of them are very straightforward: Publish Reports or Publish Duplicate Detection Rules. But some of them are more nuanced, or their function doesn’t easily match the name of the permission. What is the difference between the Browse Availability and the Search Availability permissions? What are these, anyway? Let’s take a look at the Miscellaneous permissions on each tab of the Security Role:

Core Records Tab

  • Add Report Services Reports
    • Ability to publish reports.
  • Bulk Delete
    • Ability to delete data in bulk (under Settings > Data Management).
  • Delete Audit Partitions
    • Ability to delete Audit Partitions from Settings > Auditing > Audit Log Management.
  • Manage Data Encryption key – Activate
    • In order to support server-side sync and Yammer integration capabilities, Dynamics 365 needs to store passwords for email services and Yammer authentication tokens. Dynamics 365 uses standard Microsoft SQL Server cell level encryption for a set of default entity attributes that contain sensitive information, such as user names and email passwords. Under Settings > Data Management > Data Encryption (ability to set this value initially).
  • Manage Data Encryption key – Change
    • In order to support server-side sync and Yammer integration capabilities, Dynamics 365 needs to store passwords for email services and Yammer authentication tokens. Dynamics 365 uses standard Microsoft SQL Server cell level encryption for a set of default entity attributes that contain sensitive information, such as user names and email passwords. Under Settings > Data Management > Data Encryption (the "Change" button).
  • Manage Data Encryption key – Read
    • In order to support server-side sync and Yammer integration capabilities, Dynamics 365 needs to store passwords for email services and Yammer authentication tokens. Dynamics 365 uses standard Microsoft SQL Server cell level encryption for a set of default entity attributes that contain sensitive information, such as user names and email passwords. Under Settings > Data Management > Data Encryption (ability to read the Data Encryption Key and view the encrypted data).
  • Manage User Synchronization Filters
    • Manage Offline and Outlook sync filters.
  • Promote User to Microsoft Dynamics CRM User Administrator Role
    • For Online only. Allows you to elevate the privileges of a specific user to System Administrator with the "Promote to Admin" button in the ribbon.
  • Publish Duplicate Detection Rules
    • Ability to publish duplicate detection rules.
  • Publish Email Templates
    • Ability to make Email Templates available to the organization. Under Settings > Templates > Email Templates, there is an option on the Actions menu on the Email Template form for "Make Template Available to Organization."
  • Publish Mail Merge Templates to Organization
    • Ability to make Mail Merge Templates available to the organization. Under Settings > Templates > Mail Merge Templates, there is an option on the More Actions menu for "Make Available to Organization."
  • Publish Reports
    • Ability to set "Viewable By" = "Organization" on the Report Administration tab.
  • Run SharePoint Integration Wizard
    • Allows the user to run the "Enable Server-based Authentication" wizard in Dynamics 365.
  • Turn on Tracing
    • User is able to generate trace files for the organization.
  • View Audit History
    • Ability to view Audit History records off of a related record.
  • View Audit Partitions
    • Able to view the Audit Partitions (under Settings > Auditing > Audit Log Management).
  • View Audit Summary
    • Ability to view Audit History via Settings > Auditing > Audit Summary View.

Marketing Tab

  • Configure Internet Marketing module
    • Internet Lead Capture for CRM 2011. No longer available.
  • Use internet marketing module
    • Internet Lead Capture for CRM 2011. No longer available.
  • Create Quick Campaign
    • Ability to create a Quick Campaign.

Sales Tab

  • Override Invoice Pricing
    • Allows the user to select a Write-In Product, or select 'Override Pricing' on the Invoice Product.
  • Override Opportunity Pricing
    • Allows the user to select a Write In Product, or select 'Override Pricing' on the Opportunity Product.
  • Override Order Pricing
    • Allows the user to select a Write In Product, or select 'Override Pricing' on the Order Product.
  • Override Quote Order Invoice Delete
    • Allows the user to delete an inactive Quote, Order, or Invoice.
  • Override Quote Pricing
    • Allows the user to select a Write In Product, or select 'Override Pricing' on the Quote Product.

Service Tab

  • Approve Knowledge Articles
    • Ability to click "Approve" on a Knowledge Article. If this permission is not granted, the user will not see this button.
  • Publish Articles
    • Ability to publish an Article. This is the old Article entity, not the newer Knowledge Article entity.
  • Publish Knowledge Articles
    • Ability to click "Publish" on a Knowledge Article. If this permission is not granted, the user will not see this button.

Business Management Tab - Privacy Related Privileges

  • Document Generation
    • Allows the user to download a template from CRM (Templates > Document Templates).
  • Dynamics 365 for mobile
    • Allows access to the Dynamics 365 app on a mobile device.
  • Dynamics 365 for phones express
    • Allows access to the Dynamics 365 for phones express app on a mobile phone.
  • Export to Excel
    • Ability to export data from Views and Advanced Find to excel. If this permission is not granted, the user will not see this button.
  • Go Offline in Outlook
    • Allow users to sync offline while they are using Dynamics for Outlook. If this permission is not granted, the user will not see an option to 'Go Offline' in the Outlook client.
  • Mail Merge
    • Able to perform a Mail Merge in the Outlook client. The Web Mail Merge permission is required to perform a Mail Merge in the web client.
  • Print
    • Able to create a printer-friendly display of a grid, by selecting Print Preview in the personal Settings Menu.
  • Sync to Outlook
    • Allow users to sync Contacts and Activities to Outlook.
  • Use Dynamics 365 App for Outlook
    • Allows access to the Dynamics 365 app for Outlook.

Business Management Tab - Miscellaneous Privileges

  • Act on Behalf of Another User
    • Needed to publish workflows. Also can be used for impersonation.
  • Approve Email Addresses for Users or Queues
    • Able to click on 'Approve Email' and 'Reject Email' from the User record or the Queue record.
  • Assign manager for a user
    • Able to set the Manager field on a User record.
  • Assign position for a user
    • Able to set or change a Position for a User, using Hierarchy Modeling.
  • Assign Territory to User
    • Able to set the Territory field on a User record.
  • Bulk Edit
    • Ability to select multiple records at the same time, and click Edit.
  • Change Hierarchy Security Settings
    • Able to change from Position to Manager Hierarchy, Enable Hierarchy Modeling, and set the Entities to include in Hierarchy Modeling.
  • Dynamics 365 Address Book
    • Able to search on Dynamics 365 Contacts in the To, From, and Bcc fields of an Email opened through the Dynamics 365 App for Outlook.
  • Enable or Disable Business Unit
    • Able to select Enable/Disable on a Business Unit (under Settings > Security).
  • Enable or Disable User
    • Able to select Enable/Disable on a User (under Settings > Security).
  • Language Settings
    • Able to provision other Languages (under Settings > Administration).
  • Merge
    • Ability to merge records. If this permission is not granted, the user will not see this button.
  • Override Created on or Created by for Records during Data Import
    • Allows user to set Created On & Created By during import, instead of setting these to the import time and import User, respectively.
  • Perform in sync rollups on goals
    • Permits the user to roll up goal data on demand, instead of waiting for the next scheduled update period, by using the 'Recalculate' button on the Goal record.
  • Read License info
    • Able to access information about the CRM License via the API.
  • Reparent Business unit
    • Able to change the Parent Business field on a Business Unit record.
  • Reparent team
    • Able to change the Business Unit on a Team record (Under Settings > Security).
  • Reparent user
    • Able to change the Business Unit on a User record (Under Settings > Security).
  • Send Email as Another User
    • Able to change "From" on an Email to be a different User.
  • Send Invitation
    • Able to click 'Send Invitation' to a User record when using CRM Online (pre-integration with O365). Doesn't apply to On Premise.
  • Update Business Closures
    • Create / Update Business Closure records (under Settings > Business Management).
  • Web Mail Merge
    • Able to perform a Mail Merge in the web client. If this is not set, and the Mail Merge permission is set, the user will only be able to perform a Mail Merge in the Outlook client. The user can initiate the Mail Merge request from Advanced Find results.

Service Management Tab

  • Browse Availability
    • Able to view the Service Calendar (in the Service area).
  • Control Decrement Terms
    • Able to determine if a Case should not decrement from the Entitlement Terms. User will receive a permissions error when selecting "Do Not Decrement Entitlement Terms" on a Case if they do not have this permission.
  • Create own calendar
    • Able to set up a New Weekly Schedule, a Work Schedule for One Day, or Time Off in the logged in User's Calendar (open a User record, and look for Calendar in the related entities. When the Calendar displays, these options are under the Setup menu).
  • Delete own calendar
    • Able to delete a New Weekly Schedule, a Work Schedule for One Day, or Time Off in the logged in User's Calendar (open a User record, and look for Calendar in the related entities. When the Calendar displays, this is displayed as an X).
  • Read own calendar
    • Able to view the logged in User's Calendar (open a User record, and look for Calendar in the related entities).
  • Search Availability
    • Permits the user to search for available times when scheduling a Service activity.
  • Update Holiday Schedules
    • Able to create/update Holiday Schedule (under Settings > Service Management).
  • Write own calendar
    • Able to update the Weekly Schedule, Work Schedule for One Day, or Time Off in the logged in User's Calendar (open a User record, and look for Calendar in the related entities. When the Calendar displays, these options are under the Setup menu).

Customization Tab

  • Activate Business Process Flows
    • Able to click 'Activate' when setting up a business process flow (in customizations, under Processes).
  • Activate Business Rules
    • Able to click 'Activate' when setting up Business Rules (in the entity customizations).
  • Activate Real-time Processes
    • Able to click 'Activate' when setting up a workflow, dialog, or action (in customizations, under Processes).
  • Configure Yammer
    • Able to configure Yammer to work with Dynamics CRM.
  • Execute Workflow Job
    • Able to run a workflow over a record/set of records.
  • Export Customizations
    • Ability to export a solution.
  • Import Customizations
    • Able to import customizations and solutions into the environment.
  • ISV Extensions
    • Not currently in use.
  • Learning Path Authoring
    • Ability to create Learning Path training: contextual training that can include videos and walkthroughs.
  • Modify Customization constraints
    • Not currently in use.
  • Publish Customizations
    • Ability to publish customization updates.
  • Retrieve Multiple Social Insights
    • Used in conjunction with Microsoft Social Listening.

Any questions? Let us know.

Topics: Microsoft Dynamics 365

Import and Export Better than Art Vandelay

Today's blog post was written by Nick Costanzo (Principal Consultant) and Nathan Williams (Consultant) at Sonoma Partners.

If you've ever had to use the native import tool for Dynamics 365, you've more than likely had the experience of running into import errors of some sort. These errors are not always easy to resolve, and if you're importing large volumes of data, sorting through the errors can be very time consuming. Here at Sonoma Partners, we've had situations where client import files have 50k+ records and have resulted in thousands of errors on the initial import into a test environment. Dynamics 365 offers the ability to export the errored rows, but it doesn’t include the error codes. The export only includes the rows with the data you had already included in your import, which is not very helpful. Furthermore, you cannot access the error logs through Advanced Find.

Our team set out to find a better way to tackle this situation using Power BI.

Through our efforts, we came up with the following approach to better analyze these errors and resolve them more quickly. After all, we don’t want you to start yelling, “George is getting angry!” while dealing with import errors.

Here’s the approach we took:

1. First connect to CRM by choosing Get Data > OData Feed:

Nick c 1

2. Then choose the Import Logs and Import Files entities.

3. Next pull in the Web Service Error Codes published on MSDN, by choosing Get Data > Web:

Nick c 2

a. Note: Power BI will recognize the table of error codes on this page, but you will need to massage the data to get the Error IDs and text into their own columns:

Nick c 3

4. Now you can create your data model with relationships between these 3 tables:

Nick c 4

5. With your relationships in place, you can now create a report with visualizations to categorize your errors:

Nick c 5

  1. Create a slicer for the Import Date.
  2. Create a slicer for the File Name, in the event you have multiple files to import.
  3. Create a slicer for the Target Entity.
  4. Create a bar chart to count the errors per Field and Error Name.
  5. Create a bar chart to group the errors by the field Value (i.e., the GUID from the source system).
  6. Create a table to display the record(s) based on which slicers have been selected.

6. The report now allows you to easily focus on which errors need to be fixed. In this case, we can see that 2 records were responsible for 1468 and 305 errors where the lookup could not be found. By fixing these 2 values, we’re much closer to a clean data set and can move on to the next ones.

7. Once you have resolved all errors in your source files, you can now reimport with a much higher level of confidence that the job will be successful.

If you wanted to take this a step further, you could set this up to analyze your data before importing to make sure it's clean. You would need to set up your lookup tables as data sources and update the data model with those as well. If you’d like help with this, feel free to contact us, and our Power BI team would be glad to help! Either way, you can certainly do more importing and exporting than Art Vandelay ever did!
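
If you ever need the same error-code join outside Power BI, the lookup itself is trivial to script. A minimal Python sketch (the field names and the sample error code here are hypothetical, shaped loosely like the Import Log entity and the MSDN error code table):

```python
def attach_error_text(import_logs, error_codes):
    """Join each import log row's error number to its published description,
    mirroring the relationship built between the tables in Power BI."""
    return [
        {**row, "errortext": error_codes.get(row.get("errornumber"), "Unknown error")}
        for row in import_logs
    ]

# Hypothetical rows shaped like the Import Log entity
logs = [{"linenumber": 12, "errornumber": -2147220969}]
codes = {-2147220969: "The specified record was not found."}
print(attach_error_text(logs, codes)[0]["errortext"])
```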

Download our infographic on D365 for manufacturing

Topics: Microsoft Dynamics 365

Data Migration Testing 101

Today's blog post was written by Sid Thakkar, Senior QA at Sonoma Partners.

The concept of data migration is very simple: testing is conducted to compare the source data to the migrated data. In other words, we try to discover any discrepancies that occur when moving data from one database system to another. As simple as it might sound, the testing effort involved in a data migration project is enormous, and it often ends up taking a lot of time.

A well-defined testing strategy is essential for delivering a successful data migration.

One of the important aspects of a successful data migration test can be achieved using an “Automated” approach to testing. It also saves significant time, minimizes the typical iterative testing approach, and gives us the ability to test 100% of the migrated data. The different phases of data migration testing include:

  1. Data Migration Design Review
  2. Pre-Data Migration Testing
  3. Post-Data Migration Testing

Data Migration Design Review

It is important for a Quality Analyst to review the design of the migration specification during the early stages of the migration implementation/configuration. The QA should go through a detailed analysis of the Data Mapping requirements document prior to the start of any testing. Ideally, we would want to note whether any of the columns or fields match the below criteria.

  1. Change in data type from source to target (e.g., data in the source may be represented as a character while the same field in the target table is represented as an integer)
  2. Modifying the existing data (e.g., a requirement that “status = in progress” in the source system be migrated as “Status = lost,” or “telephone = 1234567890” be migrated as “telephone = 123-456-7890”)
  3. Document all Option Set values, lookups, and user mappings
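
Transformation rules like the telephone example above are easiest to verify later if they are encoded as small functions during the design review. A Python sketch of that particular mapping (the 10-digit format is taken from the requirement example; the handling of non-conforming values is an assumption):

```python
def expected_phone(source_value):
    """Apply the documented mapping: 1234567890 -> 123-456-7890."""
    digits = "".join(ch for ch in source_value if ch.isdigit())
    if len(digits) != 10:
        # Leave non-conforming values untouched for manual review.
        return source_value
    return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"

print(expected_phone("1234567890"))  # -> 123-456-7890
```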

Pre-Data Migration Testing

Before we jump into any kind of data testing, one should test the source and target system connections from the migration platform.

Pre-data migration testing can also be called definition testing, and it takes place before the data migration itself. During definition testing, we should check the data type and length of every field in the source database table against the target. For example, the Address_line1 field in the source is of data type Varchar with a length of 50, whereas the Address_line1 field in the target is listed as Varchar(30). This means there is a potential truncation issue for any source values longer than 30 characters.

For each entity, run a SQL query similar to the one listed below against both the source and target tables to confirm that the field definitions in the two tables match.

Sid 1
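
If you prefer scripting to a SQL window, the same definition check can be sketched in Python. Here the column metadata (such as you might pull from each database's INFORMATION_SCHEMA) is represented as plain dictionaries, and the table and field names are the hypothetical ones from the examples above:

```python
def definition_mismatches(source_cols, target_cols):
    """Compare {column: (data_type, max_length)} definitions between the
    source and target tables and return the columns that differ."""
    mismatches = {}
    for col, src_def in source_cols.items():
        tgt_def = target_cols.get(col)
        if tgt_def is not None and tgt_def != src_def:
            mismatches[col] = {"source": src_def, "target": tgt_def}
    return mismatches

# Address_line1 is Varchar(50) in the source but Varchar(30) in the target,
# flagging potential truncation for values longer than 30 characters.
source = {"Address_line1": ("varchar", 50), "Account_ID": ("uniqueidentifier", 16)}
target = {"Address_line1": ("varchar", 30), "Account_ID": ("uniqueidentifier", 16)}
print(definition_mismatches(source, target))
```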

Post-Data Migration Testing

Post-data migration testing is by far the most important phase of migration testing. In a situation where we do not have enough time assigned for testing, we can jump directly into this phase. The testing is divided into two parts:

  1. Record Counts
  2. Data Mapping
    1. Unmapped Record Counts
    2. Unmapped Record Values

This can be really easy to test once you understand the data structure of the migration process. In order to successfully automate some of the testing, you will need to find out the database names, table names, and primary keys for the entity you are testing. For example, let’s assume that you are testing account migration, the source table name is “Source_Accounts,” the target table name is “Target_Accounts,” and the primary key for both tables is “Account_ID.”

Record Counts

I prefer using Microsoft Excel to automate some of this testing, but you can write programs to do the same. As you can see in the image, I have listed the source and target table names, columns, and primary key in “sheet1” of an Excel file.

Sid 2
Image 1

You can create a new Excel sheet and write this command to auto-generate the record count queries (see image below).

="select "&Sheet1!B5&" = count ("&Sheet1!B5&") From "&Sheet1!$A$5&" where "&Sheet1!B5 &" is not null"

Sid 3
Image 2

select Address1_AddressId = count (Address1_AddressId)
From Project_Database.[dbo].[Source_Accounts]
where Address1_AddressId is not null

The next step is to run these queries in a SQL window and store the results. Once you repeat the same process for the target table, you should be able to compare record counts for all fields between the source and target tables.
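
The query generation that Excel does here can also be scripted. A minimal Python sketch producing one not-null count query per column, mirroring the Excel formula above (the table and column names are the hypothetical ones from Image 1):

```python
def count_queries(table, columns):
    """Mirror the Excel formula: one record-count query per column."""
    return [
        f"select {col} = count ({col}) From {table} where {col} is not null"
        for col in columns
    ]

for query in count_queries(
    "Project_Database.[dbo].[Source_Accounts]",
    ["Account_ID", "Address1_AddressId"],
):
    print(query)
```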

Data Mapping

Once we have done the row count testing, we can go one step further and verify that the content matches as well. This phase effectively covers all of the testing we have done so far (which is one of the reasons we jump directly to data mapping testing in time-crunch situations).

Unmapped Record Counts

Using Image 1, create a new tab in the same Excel file and write the command listed below to auto-generate data mapping queries. It’s easier and safer to first find the record counts that did not match and then dive into finding those records. Counting unmapped records is the first step in this process.

="select count(*) From "&Sheet1!$A$5&" t1 join "&Sheet1!$D$5&" t2 on t1."&Sheet1!C$5&"= t2."&Sheet1!$F$5&" where t1."&Sheet1!B5& " <>  t2."&Sheet1!E5& " and t2."&Sheet1!E5& " is not null"

Sid 4

Sid 5
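
The join query the formula above builds can likewise be generated in Python. A sketch, with the table, key, and column names as hypothetical placeholders:

```python
def unmapped_count_query(src_table, tgt_table, src_key, tgt_key, src_col, tgt_col):
    """Count rows whose values differ between the joined source and target tables."""
    return (
        f"select count(*) From {src_table} t1 "
        f"join {tgt_table} t2 on t1.{src_key} = t2.{tgt_key} "
        f"where t1.{src_col} <> t2.{tgt_col} "
        f"and t2.{tgt_col} is not null"
    )

print(unmapped_count_query(
    "Source_Accounts", "Target_Accounts",
    "Account_ID", "Account_ID",
    "Account_Name", "name",
))
```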

Unmapped Record Values

If the above query for unmapped record counts returns zero for all fields, the likelihood of a successful migration is greater. But it isn’t wise to stop the testing effort just yet. I highly recommend that, regardless of the results of the above queries, one go a step further and run the query below to verify the exact value mapping between the source and target tables.

Let’s use Image 1 again and create a new tab in the same Excel file to auto-generate the query for unmapped record values.

="select t1."&Sheet1!B5&" , t2."&Sheet1!E5&" From "&Sheet1!$A$5&" t1 join "&Sheet1!$D$5&" t2 on t1."&Sheet1!C$5&"= t2."&Sheet1!$F$5&" where t1."&Sheet1!B5& " <> t2."&Sheet1!E5& " and t2."&Sheet1!E5& " is not null"

Sid 6
Sid 7


In the next blog, I will discuss how a QA can get involved in writing SSIS packages to be more self-sufficient during data migration projects.

Topics: Microsoft Dynamics 365

Tips on Submitting a Dynamics 365 App to Microsoft AppSource

Microsoft's updated business app store, AppSource, has launched and, as we mentioned before, has been steadily gaining momentum. We’ve submitted a few apps to the store for the Dynamics 365 CRM product, and I want to share some tips to get through the evolving process more efficiently.

Note: Developing an app for AppSource is outside the scope of this article. Instead, I will focus on the submission process once you have a managed solution developed and ready to submit.


Process Overview

The app submission process encompasses more than just your Dynamics 365 managed solution file. Microsoft AppSource expects you to have your marketing, support, and image files ready for submission in addition to your solution. Because of this, please consider the following tips:

  • Start your marketing efforts in parallel with your solution packaging efforts; this includes the creation of marketing data sheets, product images, and application icons.
  • Application Icons
    • Icon & image sizing should match exactly the sizes the submission process requests.
    • Try to create all of the app icons requested.
    • Your solution package requires a 32x32 .png logo file. Don't forget to create it; otherwise, you can't complete the solution packaging process.
  • You will need an Azure subscription to store your solution package for the submission process to retrieve and test it. Use this handy tool for this process.
    • Note: Microsoft hosts your final solution file for AppSource. This is a temporary location for the submission process to evaluate your solution prior to publishing.
  • You will need to have a license, privacy statement, and supporting marketing data sheet documentation.
  • Don't select CSV for your lead source; this creates daily Excel files that end up being difficult to manage. Since you already have an Azure subscription for file storage, you can use an Azure table or select your cloud-based CRM system.
  • The AppSource review team will send you a document to complete the end-to-end testing steps. This will happen during the process, so be prepared to see it and send to them when requested.

Solution Packaging

You need to take your managed solution file and 'package' it using the solution packaging tool. Follow the article steps for more detail, but the part that might confuse a Visual Studio novice (like me) is updating your references. Here are the minimum steps you need to get a packaged file ready to zip.

  1. Assuming you have installed the package from the link above, create a new CRM project
  2. Click References, right-click and select Manage NuGet Packages
  3. Click Updates, select all, and update (this updates your references with the latest files from NuGet)
  4. Copy your managed solution to PkgFolder
  5. Update ImportConfig.xml with package name (and any other settings necessary)
  6. Build and note the location of your debug output file

AppSource Packaging

Microsoft provides you with detailed instructions for this process. This is a lengthy document, so here are the steps I take when preparing an app for the store submission.

First, the sample template zip file originally sent to me was incorrect: it failed to include the required ImportConfig.xml in the PkgFolder. And, while not a mistake, you don't need the privacy.htm file included. Here are the steps I take AFTER I have the solution package built from Visual Studio.

  1. You need a 32x32 logo file! Be sure to have it ahead of time; the size must be exactly 32x32.
  2. Create a folder called Package.
  3. Copy the dll and PkgFolder files from your debug build.
  4. Inside the PkgFolder, delete the /en folder. The only two files necessary are the managed solution zip and the ImportConfig.xml file.
    BE SURE THE ImportConfig.xml file is properly updated with your values.

    image2017-2-16 16-12-43
  5. IMPORTANT: Add the content_types xml file! Grab this file from the template folder (or a previous submission).
    image2017-2-15 14-58-28
  6. Zip this folder. Be careful when you zip that you don't include the parent folder. The inside of the zip should match the screenshot of step 5 exactly.
  7. Create another folder (I usually name it AppSource_<AppName>).
  8. In this folder, copy the zip file you just created.
  9. Add the content types xml again, a license file, input.xml file, and the logo. All of these files are required.
  10. Be sure to update the input.xml file with your specific settings.

    image2017-2-15 15-0-33

    image2017-2-15 15-0-57
  11. Zip up the contents of this folder. The zip file will need to be placed in Azure storage, and the resulting URL entered into the AppSource submission request.

Wrap Up

The process of loading your app to AppSource may appear intimidating at first. However, Microsoft and your service partner can assist you throughout, and Microsoft is continuing to improve the entire submission experience. While these tips don't cover every step required, hopefully they provide a jumpstart past some of the more common missteps we see.

Topics: Microsoft Dynamics 365

How to: Migrating Unified Service Desk Configuration Data

Today's blog post was written by Michael Maloney, Principal Developer at Sonoma Partners.

As with many projects, we typically follow a development, staging, and production model of deployments. On larger projects, it’s not unheard of to have four, five, or even more environments. When it comes to deploying Unified Service Desk, this can be a challenge due to the heavy reliance on data as configuration. Today, we are going to walk through how you can easily migrate this configuration data from one environment to another. For the purposes of this walk-through, we will assume the environment(s) already have the required USD solutions installed. If not, take a look at one of our previous posts on how to get Unified Service Desk up and running.

Before getting started, be sure to download the latest version of the Dynamics CRM and UII SDK from here and extract each to a designated folder, e.g., D365\SDK and D365\UII.

Exporting Unified Service Desk Configuration Data from the Source Environment

To export the configuration data, run the DataMigrationUtility.exe file found in the D365\SDK\Tools\ConfigurationManager folder and choose Export Data on the main screen, then click Continue.

Maloney 1

Enter credentials for the organization you would like to export data from and click Login.

On the next screen, select the default Unified Service Desk configuration data schema file (USDDefaultSchema.xml) to be used for the data export. This is found in the UII\USD Developer Assets\USD Configuration Tool Schema folder.

Specify the name and location of the data file to be exported.

Maloney 2

Click Export Data. The screen displays the export progress and the location of the exported file at the bottom of the screen once the export is complete.

Maloney 3

Click Exit to return to the main menu.

Importing Unified Service Desk Configuration Data to the Target Environment

Before importing the USD configuration data to the target environment, be sure to import the necessary packages and/or solutions first.

From the main screen of the CRM Configuration Manager, select Import Data then click Continue.

Maloney 4

Enter credentials for the organization you would like to import data to and click Login.

The next screen prompts you to provide the data file (.zip) to be imported. Browse to the data file, select it, and then click Import Data.

The next screen displays the import status of your records. The data import is done in multiple passes to first import the foundation data while queuing up the dependent data, and then import the dependent data in the subsequent passes to handle any data dependencies or linkages. This ensures clean and consistent data import.

Maloney 5

Click Exit to close the tool.

To verify the changes in the target environment, open up the Unified Service Desk app and click the “Change Credentials” link on the loading screen.

Maloney 6

If you have more complex customizations involving many solutions and configuration data, you can opt to create a custom package instead. These packages bundle everything up so that you can run them from the Package Deployer Tool, just like the original Unified Service Desk packages you see when setting up for the first time. We’ve written in the past on how to get started creating your own package, and you can find more detail on MSDN on how to include your configuration data along with the package.

Dynamics 365: Editable Grids

Topics: Microsoft Dynamics 365

SystemForm with Id Does Not Exist

Today's blog post was written by Matt Dearing, Principal Developer at Sonoma Partners.

I had a customer reach out recently saying they were trying to open contact records from one of their sandbox Dynamics 2016 online instances and were getting the following popup:

Matt dearing 1

The log file showed the following:

"systemform With Id = 04238d8a-dbf8-467c-805f-4af4b757870 Does Not Exist"

I asked the user if they had deleted any forms recently. They said they had deleted a secondary "test" form in that org. My thought was that something had cached that old form id, and CRM was continuing to try to load it even though it no longer existed. I asked the user to clear their browser cache, but they still received the same error. I then asked them to try to load the same record in a secondary browser while I queried "userentityuisettings.lastviewedformxml" via a FetchXML query and noticed that the old form's id was still there.

lastviewedformxml <MRUForm><Form Type="Main" Id="04238d8a-dbf8-467c-805f-4af4b757870f" /></MRUForm>
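
The query itself can be sketched in FetchXML along these lines (a sketch; lastviewedformxml and objecttypecode are attributes of the standard userentityuisettings entity, and 2 is the object type code for contact):

```xml
<fetch>
  <entity name="userentityuisettings">
    <attribute name="lastviewedformxml" />
    <filter>
      <condition attribute="objecttypecode" operator="eq" value="2" />
    </filter>
  </entity>
</fetch>
```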


I did a "publish all" and queried again and saw that the correct form id was now stored.

lastviewedformxml <MRUForm><Form Type="Main" Id="1fed44d1-ae68-4a41-bd2b-f13acac4acfa" /></MRUForm>


This meant the publish all may have triggered a refresh, or it was a coincidence and what actually refreshed "lastviewedformxml" was the user's secondary browser. Either way, I asked the user to try again in the primary browser, expecting everything to work, but they still received the same error. I navigated to the same record, which loaded fine, so I decided to take a quick look at local storage via the dev tools. I noticed form ids were cached there.

I had the user run "localStorage.clear()" from the console window of the dev tools instance in their primary browser, then reload the page, and everything loaded correctly. Although the user had cleared their cache, it appears some browsers tie clearing local storage to clearing cookies, so depending on what your cache clear actually does, it may not clear local storage.

The need for deleting a form rarely arises, but if you find yourself in a similar situation be very careful. If the form must be deleted and users have been using it, you may need them to fully clear their browser cache in order to get the correct form loaded.

Topics: Microsoft Dynamics 365