Sonoma Partners Microsoft CRM and Salesforce Blog

Bulkify Bulk Edit

Today's blog post was written by Mike Dearing, Development Principal at Sonoma Partners.

Microsoft Dynamics’ bulk edit functionality is a convenient way to mass update multiple records of the same entity at the same time.

While some additional avenues exist for doing bulk edits, such as native Excel imports, or Excel Online and editable grids for Dynamics 365 Online customers, the bulk edit dialog still provides an efficient means of quickly mass updating one or two fields across a filtered set of records. There are a couple of limitations, however, that clients tend to ask about:

How do I blank out values?

Unfortunately, there isn’t a ‘clear’ keyword like the one you may be familiar with from workflow functionality. Since every field from the most recently visited main form for the entity being edited is available for updating, only the fields that have had values supplied are actually updated once you apply your edit. There are a few workarounds, though:

  1. Create a workflow/plugin that executes when the field you want to clear has had a specific value populated within it. For instance, if you have a text field that you want to clear through a bulk edit, perhaps entering the phrase <Clear> triggers a workflow, or a plugin, that blanks out the value. This approach is limited to specific field types, such as single-line or multi-line text fields. For example, if you want to clear out Lead’s Description field when a value of <Clear> is entered, you would configure your workflow as follows:

    1. When description changes

      Bulk 1

    2. Then clear Description

      Bulk 2

    3. If Description's text is <Clear>

      Bulk 3

      If you enter <Clear> into the Description field and press 'Change,' you'll see that each of the selected records has had its Description value cleared.

  2. Create a ‘Clear X’ field and a workflow/plugin to support it. This could be one additional bit (two-option) field per field that you’d like to clear, or an option set that lists field names corresponding to the fields you want to enable for clearing. Similar to the first approach, you’ll create a workflow/plugin that does the heavy lifting for you, but this time it will be based on the value within your ‘Clear X’ field. The workflow/plugin should also remove the selected value within the ‘Clear X’ field, so that it remains stateless. Once you have this field on the form, it will appear on the bulk edit dialog as well, and users can select it as they’d select any other field within the dialog. You’ll want to hide this field on the main entity form, since it doesn’t serve much purpose during a normal edit, but make sure it is still available from the bulk edit form. I discuss how to do this later in the ‘How do I control which fields are available for bulk editing?’ part of this post. For example, if you want to clear out Lead’s Description field or Lead’s Industry field, you would do the following:

    1. Create a new ‘Clear Value’ option set.

    2. Add 2 options, one called ‘Description’ and one called ‘Industry’

      Bulk 4

    3. Place ‘Clear Value’ on the lead form

    4. Create your workflow to clear ‘Description’ if Clear Value is set to Description, or to clear ‘Industry’ if Clear Value is set to Industry. Also revert Clear Value back to blank.

      Bulk 5
      Bulk 6
      Bulk 7
      Bulk 8
      Bulk 9

      If you select ‘Description’ for Clear Value and press ‘Change’, you’ll see that each of the selected records has had its Description value cleared. Similarly, if you select ‘Industry’ for Clear Value and press ‘Change’, each of the selected records has had its Industry value cleared.

  3. Create an on-demand workflow per field that needs to be cleared. Using the clear action mentioned above, this can be accomplished without much effort. However, this detracts from the seamless experience of managing all mass edits through the bulk edit dialog, and it doesn’t scale well if there are many fields that you want to enable for bulk clearing.

The second option above is the most elegant and flexible if you have the time to implement it. For the first or second option, I prefer plugins over workflows because of the efficiency of doing these clears in the pre-operation pipeline, meaning the changes are written to the database as part of the same update initiated by the bulk edit, whereas workflows execute post-operation, kicking off a second update once the bulk edit’s update has completed. I’ve also considered leveraging dialogs, but their single-record execution prevents them from being a viable approach for bulk edits.
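For one-off cleanup done from outside the application, the same clearing can also be performed directly through the Dynamics 365 Web API, where sending a JSON null for an attribute clears its value. A minimal Python sketch of building such a request; the org URL, entity set, and record ID are placeholders, and authentication plus the actual HTTP send are omitted:

```python
import json

# Illustrative placeholder; substitute your organization's Web API root.
ORG_URL = "https://yourorg.api.crm.dynamics.com/api/data/v9.0"

def build_clear_request(entity_set, record_id, field):
    """Build the PATCH request that nulls out one field on one record.

    Sending JSON null for an attribute clears its value, mirroring the
    <Clear> behavior described above, but without a workflow or plugin.
    """
    url = f"{ORG_URL}/{entity_set}({record_id})"
    body = json.dumps({field: None})
    headers = {
        "Content-Type": "application/json",
        "If-Match": "*",  # update existing records only; never create
    }
    return url, headers, body

# One PATCH per selected record clears the Description field on a lead:
url, headers, body = build_clear_request(
    "leads", "00000000-0000-0000-0000-000000000001", "description")
```

Looping this over the record IDs from an Advanced Find export gives a bulk clear without touching the form customizations at all, at the cost of stepping outside the bulk edit dialog.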

How do I control which fields are available for bulk editing?

Rather than having a dedicated ‘bulk edit’ form type, Dynamics leverages the main form type. If there are multiple main forms, the most recent form the user has visited is the basis of their bulk edit form. Even though you can’t edit subgrids and certain other fields, those still appear on the bulk edit dialog, as do fields that you conditionally hide via business rules or JavaScript, since neither business rules nor JavaScript execute on bulk edit forms. The suggested approach is as follows:

  1. To set a field’s visibility such that it doesn’t appear on the bulk edit dialog, hide fields via the main form’s customizations. These fields presumably need to be displayed on the main form still, so define a business rule ‘show’ action per field that should be displayed, or JavaScript to similarly show these fields on load of the main form.

    For example, if you’d like to hide the Website field from Lead’s bulk edit, but still want it to appear on the main form, you would define a business rule as follows:

    Bulk 10


    You would also need to set this field as not visible through the form’s customizations.

    Bulk 11

  2. To set a field’s visibility such that it only appears on the bulk edit dialog, you’ll want to do the opposite of above. Show fields via the main form’s customizations, then define a business rule ‘hide’ action per field that should be hidden, or JavaScript to similarly hide these fields on load of the main form.

    For example, if you’d like to show the field that we created above named ‘Clear Value’ on the bulk edit form, but don’t want it to appear on the main form, you would define a business rule as follows:

    Bulk 12

Since business rules and JavaScript don’t execute on bulk edit forms, the visibility option that you specify for that field through the main form’s customizations will be the visibility of that field on the bulk edit dialog.

With minimal effort, you’ve now enhanced your bulk edit dialogs to be more powerful and more user-friendly. Happy editing!

Topics: Microsoft Dynamics 365

Dynamics 365: Miscellaneous Security Permissions

Today's blog post was written by Jen Ford, Principal QA at Sonoma Partners.

There are so many permissions to consider when you are setting up access for your users. Should I remove delete privileges from Contacts? Should a user be able to view all Cases or should some roles have no access? Should I restrict Account permissions to only see those that the user owns? In addition to making these decisions for entity-specific permissions, there are a slew of Miscellaneous Privileges on each tab of the Security Role that we can set for additional access to special privileges that aren’t a blanket permission on whether or not a user has read, write, or delete privileges to a specific entity. Some of them are very straightforward: Publish Reports or Publish Duplicate Detection Rules. But some of them are more nuanced, or their function doesn’t easily match the name of the permission. What is the difference between the Browse Availability and the Search Availability permissions? What are these, anyway? Let’s take a look at the Miscellaneous permissions on each tab of the Security Role:

Core Records Tab

  • Add Report Services Reports
    • Ability to publish reports.
  • Bulk Delete
    • Ability to delete data in bulk (under Settings > Data Management).
  • Delete Audit Partitions
    • Ability to delete Audit Partitions from Settings > Auditing > Audit Log Management.
  • Manage Data Encryption key – Activate
    • In order to support server-side sync and Yammer integration capabilities, Dynamics 365 needs to store passwords for email services and Yammer authentication tokens. Dynamics 365 uses standard Microsoft SQL Server cell level encryption for a set of default entity attributes that contain sensitive information, such as user names and email passwords. Under Settings > Data Management > Data Encryption (ability to set this value initially).
  • Manage Data Encryption key – Change
    • Same context as the Activate privilege above; this grants the "Change" button under Settings > Data Management > Data Encryption.
  • Manage Data Encryption key – Read
    • Same context as the Activate privilege above; this grants the ability to read the Data Encryption Key and view the encrypted data under Settings > Data Management > Data Encryption.
  • Manage User Synchronization Filters
    • Manage Offline and Outlook sync filters.
  • Promote User to Microsoft Dynamics CRM User Administrator Role
    • For Online only. Allows you to elevate the privileges of a specific user to System Administrator with the "Promote to Admin" button in the ribbon.
  • Publish Duplicate Detection Rules
    • Ability to publish duplicate detection rules.
  • Publish Email Templates
    • Ability to make Email Templates available to the organization. Under Settings > Templates > Email Templates, there is an option on the Actions menu on the Email Template form for "Make Template Available to Organization."
  • Publish Mail Merge Templates to Organization
    • Ability to make Mail Merge Templates available to the organization. Under Settings > Templates > Mail Merge Templates, there is an option on the More Actions menu for "Make Available to Organization."
  • Publish Reports
    • Ability to set "Viewable By" = "Organization" on the Report Administration tab.
  • Run SharePoint Integration Wizard
    • Allows the user to run the "Enable Server-based Authentication" wizard in Dynamics 365.
  • Turn on Tracing
    • User is able to generate trace files for the organization.
  • View Audit History
    • Ability to view Audit History records off of a related record.
  • View Audit Partitions
    • Able to view the Audit Partitions (under Settings > Auditing > Audit Log Management).
  • View Audit Summary
    • Ability to view Audit History via Settings > Auditing > Audit Summary View.

Marketing Tab

  • Configure Internet Marketing module
    • Internet Lead Capture for CRM 2011. No longer available.
  • Use internet marketing module
    • Internet Lead Capture for CRM 2011. No longer available.
  • Create Quick Campaign
    • Ability to create a Quick Campaign.

Sales Tab

  • Override Invoice Pricing
    • Allows the user to select a Write-In Product, or select 'Override Pricing' on the Invoice Product.
  • Override Opportunity Pricing
    • Allows the user to select a Write In Product, or select 'Override Pricing' on the Opportunity Product.
  • Override Order Pricing
    • Allows the user to select a Write In Product, or select 'Override Pricing' on the Order Product.
  • Override Quote Order Invoice Delete
    • Allows the user to delete an inactive Quote, Order, or Invoice.
  • Override Quote Pricing
    • Allows the user to select a Write In Product, or select 'Override Pricing' on the Quote Product.

Service Tab

  • Approve Knowledge Articles
    • Ability to click "Approve" on a Knowledge Article. If this permission is not granted, the user will not see this button.
  • Publish Articles
    • Ability to publish an Article. This is the old Article entity, not the newer Knowledge Article entity.
  • Publish Knowledge Articles
    • Ability to click "Publish" on a Knowledge Article. If this permission is not granted, the user will not see this button.

Business Management Tab - Privacy Related Privileges

  • Document Generation
    • Allows the user to download a template from CRM (Templates > Document Templates).
  • Dynamics 365 for mobile
    • Allows access to the Dynamics 365 app on a mobile device.
  • Dynamics 365 for phones express
    • Allows access to the Dynamics 365 for phones express app on a mobile phone.
  • Export to Excel
    • Ability to export data from Views and Advanced Find to Excel. If this permission is not granted, the user will not see this button.
  • Go Offline in Outlook
    • Allow users to sync offline while they are using Dynamics for Outlook. If this permission is not granted, the user will not see an option to 'Go Offline' in the Outlook client.
  • Mail Merge
    • Able to perform a Mail Merge in the Outlook client. The Web Mail Merge permission is required to perform a Mail Merge in the web client.
  • Print
    • Able to create a printer-friendly display of a grid, by selecting Print Preview in the personal Settings Menu.
  • Sync to Outlook
    • Allow users to sync Contacts and Activities to Outlook.
  • Use Dynamics 365 App for Outlook
    • Allows access to the Dynamics 365 app for Outlook.

Business Management Tab - Miscellaneous Privileges

  • Act on Behalf of Another User
    • Needed to publish workflows. Also can be used for impersonation.
  • Approve Email Addresses for Users or Queues
    • Able to click on 'Approve Email' and 'Reject Email' from the User record or the Queue record.
  • Assign manager for a user
    • Able to set the Manager field on a User record.
  • Assign position for a user
    • Able to set or change a Position for a User, using Hierarchy Modeling.
  • Assign Territory to User
    • Able to set the Territory field on a User record.
  • Bulk Edit
    • Ability to select multiple records at the same time, and click Edit.
  • Change Hierarchy Security Settings
    • Able to change from Position to Manager Hierarchy, Enable Hierarchy Modeling, and set the Entities to include in Hierarchy Modeling.
  • Dynamics 365 Address Book
    • Able to search on Dynamics 365 Contacts in the To, From, and Bcc fields of an Email opened through the Dynamics 365 App for Outlook.
  • Enable or Disable Business Unit
    • Able to select Enable/Disable on a Business Unit (under Settings > Security).
  • Enable or Disable User
    • Able to select Enable/Disable on a User (under Settings > Security).
  • Language Settings
    • Able to provision other Languages (under Settings > Administration).
  • Merge
    • Ability to merge records. If this permission is not granted, the user will not see this button.
  • Override Created on or Created by for Records during Data Import
    • Allows user to set Created On & Created By during import, instead of setting these to the import time and import User, respectively.
  • Perform in sync rollups on goals
    • Permits the user to roll up goal data on demand, instead of waiting for the next scheduled update period, by using the 'Recalculate' button on the Goal record.
  • Read License info
    • Able to access information about the CRM License via the API.
  • Reparent Business unit
    • Able to change the Parent Business field on a Business Unit record.
  • Reparent team
    • Able to change the Business Unit on a Team record (Under Settings > Security).
  • Reparent user
    • Able to change the Business Unit on a User record (Under Settings > Security).
  • Send Email as Another User
    • Able to change "From" on an Email to be a different User.
  • Send Invitation
    • Able to click 'Send Invitation' to a User record when using CRM Online (pre-integration with O365). Doesn't apply to On Premise.
  • Update Business Closures
    • Create / Update Business Closure records (under Settings > Business Management).
  • Web Mail Merge
    • Able to perform a Mail Merge in the web client. If this is not set, and the Mail Merge permission is set, the user will only be able to perform a Mail Merge in the Outlook client. The user can initiate the Mail Merge request from Advanced Find results.

Service Management Tab

  • Browse Availability
    • Able to view the Service Calendar (in the Service area).
  • Control Decrement Terms
    • Able to determine if a Case should not decrement from the Entitlement Terms. User will receive a permissions error when selecting "Do Not Decrement Entitlement Terms" on a Case if they do not have this permission.
  • Create own calendar
    • Able to set up a New Weekly Schedule, a Work Schedule for One Day, or Time Off in the logged in User's Calendar (open a User record, and look for Calendar in the related entities. When the Calendar displays, these options are under the Setup menu).
  • Delete own calendar
    • Able to delete a New Weekly Schedule, a Work Schedule for One Day, or Time Off in the logged in User's Calendar (open a User record, and look for Calendar in the related entities. When the Calendar displays, this is displayed as an X).
  • Read own calendar
    • Able to view the logged in User's Calendar (open a User record, and look for Calendar in the related entities).
  • Search Availability
    • Permits the user to search for available times when scheduling a Service activity.
  • Update Holiday Schedules
    • Able to create/update Holiday Schedule (under Settings > Service Management).
  • Write own calendar
    • Able to update the Weekly Schedule, Work Schedule for One Day, or Time Off in the logged in User's Calendar (open a User record, and look for Calendar in the related entities. When the Calendar displays, these options are under the Setup menu).

Customization Tab

  • Activate Business Process Flows
    • Able to click 'Activate' when setting up a business process flow (in customizations, under Processes).
  • Activate Business Rules
    • Able to click 'Activate' when setting up Business Rules (in the entity customizations).
  • Activate Real-time Processes
    • Able to click 'Activate' when setting up a workflow, dialog, or action (in customizations, under Processes).
  • Configure Yammer
    • Able to configure Yammer to work with Dynamics CRM.
  • Execute Workflow Job
    • Able to run a workflow over a record/set of records.
  • Export Customizations
    • Ability to export a solution.
  • Import Customizations
    • Able to import customizations and solutions into the environment.
  • ISV Extensions
    • Not currently in use.
  • Learning Path Authoring
    • Ability to create Learning Path training: contextual training that can include videos and walkthroughs.
  • Modify Customization constraints
    • Not currently in use.
  • Publish Customizations
    • Ability to publish customization updates.
  • Retrieve Multiple Social Insights
    • Used in conjunction with Microsoft Social Listening.

Any questions? Let us know.

Topics: Microsoft Dynamics 365

Import and Export Better than Art Vandelay

Today's blog post was written by Nick Costanzo (Principal Consultant) and Nathan Williams (Consultant) at Sonoma Partners.

If you've ever had to use the native import tool for Dynamics 365, you've more than likely had the experience of running into import errors of some sort. These errors are not always easy to resolve, and if you're importing large volumes of data, sorting through the errors can be very time consuming. Here at Sonoma Partners, we've had situations where client import files have 50k+ records and have resulted in thousands of errors on the initial import into a test environment. Dynamics 365 offers the ability to export the errored rows, but it doesn’t include the error codes. The export only includes the rows with the data you had already included in your import, which is not very helpful. Furthermore, you cannot access the error logs through Advanced Find.

Our team set out to find a better way to tackle this situation using Power BI.

Through our efforts, we came up with the following approach to better analyze these errors and resolve them more quickly. After all, we don’t want you to start yelling, “George is getting angry!” while dealing with import errors.

Here’s the approach we took:

1. First connect to CRM by choosing Get Data > OData Feed:

Nick c 1

2. Then choose the Import Logs and Import Files entities.

3. Next pull in the Web Service Error Codes published on MSDN, by choosing Get Data > Web:

Nick c 2

a. Note: Power BI will recognize the table of error codes on this page, but you will need to massage the data to get the Error IDs and text into their own columns:

Nick c 3

4. Now you can create your data model with relationships between these 3 tables:

Nick c 4

5. With your relationships in place, you can now create a report with visualizations to categorize your errors:

Nick c 5

  1. Create a slicer for the Import Date.
  2. Create a slicer for the File Name, in the event you have multiple files to import.
  3. Create a slicer for the Target Entity.
  4. Create a bar chart to count the errors per Field and Error Name.
  5. Create a bar chart to group the errors by the field Value (i.e. the GUID from the source system).
  6. Create a table to display the record(s) based on which slicers have been selected.

6. The report now allows you to easily focus on which errors need to be fixed. In this case, we can see that 2 records were responsible for 1468 and 305 errors where the lookup could not be found. By fixing these 2 values, we’re much closer to a clean data set and can move on to the next ones.

7. Once you have resolved all errors in your source files, you can now reimport with a much higher level of confidence that the job will be successful.
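If you'd rather script this analysis than build the Power BI report, the same join-and-count can be sketched with pandas. The frames below stand in for the Import Logs export and the error-code table; the column names and sample error codes are illustrative assumptions, not the entity's real schema:

```python
import pandas as pd

# Stand-in for the Import Logs rows pulled from the OData feed.
logs = pd.DataFrame({
    "errorcode":    [1001, 1001, 1002],
    "headercolumn": ["parentaccountid", "parentaccountid", "industrycode"],
})

# Stand-in for the published Web Service Error Codes table.
error_codes = pd.DataFrame({
    "errorcode": [1001, 1002],
    "name":      ["LookupNotFound", "InvalidValue"],
})

# Equivalent of the report's bar chart: error count per field and error name.
summary = (logs.merge(error_codes, on="errorcode", how="left")
               .groupby(["headercolumn", "name"])
               .size()
               .reset_index(name="errors")
               .sort_values("errors", ascending=False))
print(summary.to_string(index=False))
```

Sorting by the error count surfaces the handful of values responsible for most failures, the same insight the slicers and bar charts give you interactively.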

If you wanted to take this a step further, you could set this up to analyze your data before importing to make sure it's clean. You would need to set up your lookup tables as data sources and update the data model with those as well. If you’d like help with this, feel free to contact us, and our Power BI team would be glad to help! Either way, you can certainly do more importing and exporting than Art Vandelay ever did!


Topics: Microsoft Dynamics 365

Data Migration Testing 101

Today's blog post was written by Sid Thakkar, Senior QA at Sonoma Partners.

The concept of data migration testing is very simple: testing is conducted to compare the source data to the migrated data. In other words, we try to discover any discrepancies that take place when moving the data from one database system to another. As simple as it might sound, the testing effort involved in a data migration project is enormous, and it often ends up taking a lot of time.

A well-defined testing strategy is essential for delivering a successful data migration.

An important part of a successful data migration test can be achieved using an automated approach to testing. It saves significant time, minimizes the typical iterative testing approach, and gives us the ability to test 100% of the migrated data. The phases of data migration testing include:

  1. Data Migration Design Review
  2. Pre-Data Migration Testing
  3. Post-Data Migration Testing

Data Migration Design Review

It is important for a Quality Analyst to review the design of the migration specification during the early stages of the migration implementation/configuration. The QA should do a detailed analysis of the Data Mapping requirements document before any testing starts. Ideally, we want to note any columns or fields that match the criteria below:

  1. A change in data type from source to target (e.g. data represented as a character in the source but as an integer in the target table)
  2. Modification of existing data (e.g. a requirement that “status = in progress” in the source system be migrated as “status = lost,” or “telephone = 1234567890” as “telephone = 123-456-7890”)
  3. Document all option set values, lookups, and user mappings

Pre-Data Migration Testing

Before we jump into any kind of data testing, one should test the source and target system connections from the migration platform.

Pre-data migration testing can also be called definition testing. Definition testing takes place before any data is actually moved. During definition testing, we check the data type and length of every field in the source database table against the target. For example, say the Address_line1 field in the source is of data type Varchar(50), whereas Address_line1 in the target is Varchar(30). This means there is a potential issue with any source data longer than 30 characters.

For each entity, run a SQL query similar to the one below against both the source and target tables to confirm that the field definitions in the two tables match.

Sid 1
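If you have many entities to cover, the definition query can be generated per table. A short Python sketch that builds one way of reading column names, types, and lengths on SQL Server via the INFORMATION_SCHEMA views (the table name is from the running example); run the output against both databases and diff the results:

```python
def definition_query(table_name):
    """Build a query listing each column's data type and length for one table.

    Running this against both source and target and diffing the results
    catches mismatches such as Varchar(50) vs. Varchar(30).
    """
    return (
        "SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH "
        "FROM INFORMATION_SCHEMA.COLUMNS "
        f"WHERE TABLE_NAME = '{table_name}' "
        "ORDER BY COLUMN_NAME"
    )

print(definition_query("Source_Accounts"))
```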

Post-Data Migration Testing

Post-data migration testing is by far the most important phase of migration testing. When there is not enough time assigned for testing, we can jump directly to this phase. The testing is divided into two parts:

  1. Record Counts
  2. Data Mapping
    1. Unmapped Record Counts
    2. Unmapped Record Values

This can be easy to test once you understand the data structure of the migration process. To automate some of the testing, you will need to find the database names, table names, and primary keys for the entity you are testing. For example, let’s assume you are testing the account migration, the source table name is “Source_Accounts,” the target table name is “Target_Accounts,” and the primary key for both tables is “Account_ID.”

Record Counts

I prefer using Microsoft Excel to automate some of this testing, but you can write programs to do the same. As you can see in the image, I have listed the source and target table names, columns, and primary key in “Sheet1” of an Excel file.

Sid 2
Image 1

You can create a new Excel sheet and use this formula to auto-generate record count queries (see image below).

="select "&Sheet1!B5&" = count ("&Sheet1!B5&") From "&Sheet1!$A$5&" where "&Sheet1!B5 &" is not null"

Sid 3
Image 2

select Address1_AddressId = count (Address1_AddressId)
From Project_Database.[dbo].[Source_Accounts]
where Address1_AddressId is not null

The next step is to run these queries in a SQL window and store the results. Once you repeat the same process for the target table, you should be able to compare record counts for all fields between the source and target tables.
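The record-count queries that the Excel formula produces can also be generated with a short Python sketch, which scales better when there are many entities to cover (table and column names below are from the running example):

```python
def count_query(table, column):
    """Mirror the Excel formula: count the non-null values in one column."""
    return (f"select {column} = count({column}) "
            f"From {table} "
            f"where {column} is not null")

# One query per mapped column, for the source table of the running example.
columns = ["Account_ID", "Name", "Address1_AddressId"]
queries = [count_query("Project_Database.[dbo].[Source_Accounts]", c)
           for c in columns]
for q in queries:
    print(q)
```

Generating the same list for the target table and pairing the results column by column gives the full count comparison.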

Data Mapping

Once we have done the row count testing, we can go one step further and verify that the content matches as well. This phase of testing effectively covers all the testing we have done so far (which is one reason why we jump directly to data mapping testing in time-crunch situations).

Unmapped Record Counts

Using Image 1, create a new tab in the same Excel file and write the formula below to auto-generate data mapping queries. It’s easier and safer to first find the counts of records that did not match and then dive into finding those records. Counting unmapped records is the first step in this process.

="select count(*) From "&Sheet1!$A$5&" t1 join "&Sheet1!$D$5&" t2 on t1."&Sheet1!C$5&"= t2."&Sheet1!$F$5&" where t1."&Sheet1!B5& " <>  t2."&Sheet1!E5& " and t2."&Sheet1!E5& " is not null"

Sid 4

Sid 5

Unmapped Record Values

If the above query for unmapped record counts returns zero for all fields, then the possibility of a successful migration is greater. But it isn’t wise to stop the testing effort just yet. Regardless of the results of the queries above, I highly recommend going a step further and running the query below to compare the exact values mapped between the source and target tables.

Using Image 1 again, create a new tab in the same Excel file to auto-generate the query for unmapped record values.

="select t1."&Sheet1!B5&" , t2."&Sheet1!E5&" From "&Sheet1!$A$5&" t1 join "&Sheet1!$D$5&" t2 on t1."&Sheet1!C$5&"= t2."&Sheet1!$F$5&" where t1."&Sheet1!B5& " <> t2."&Sheet1!E5& " and t2."&Sheet1!E5& " is not null"

Sid 6
Sid 7
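As with the record counts, the unmapped-value query can be generated outside Excel. A Python sketch mirroring the formula above (table, column, and key names are from the running example):

```python
def unmapped_values_query(src_table, src_col, src_key,
                          tgt_table, tgt_col, tgt_key):
    """Mirror the Excel formula: list source/target value pairs that disagree,
    joined on the primary key shared by the two tables."""
    return (f"select t1.{src_col}, t2.{tgt_col} "
            f"From {src_table} t1 "
            f"join {tgt_table} t2 on t1.{src_key} = t2.{tgt_key} "
            f"where t1.{src_col} <> t2.{tgt_col} "
            f"and t2.{tgt_col} is not null")

print(unmapped_values_query("Source_Accounts", "Name", "Account_ID",
                            "Target_Accounts", "Name", "Account_ID"))
```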

 

In the next blog, I will discuss how a QA can be involved in writing SSIS packages to be more self-sufficient during data migration projects.

Topics: Microsoft Dynamics 365

Tips on Submitting a Dynamics 365 App to Microsoft AppSource

Microsoft's updated business app store, AppSource, has launched and, as we mentioned before, has been steadily gaining momentum. We’ve submitted a few apps to the store for the Dynamics 365 CRM product, and I want to share some tips to get through the evolving process more efficiently.

Note: Developing an app for AppSource is outside the scope of this article. Instead, I will focus on the submission process once you have a managed solution developed and ready to submit.

 

Process Overview

The app submission process encompasses more than just your Dynamics 365 managed solution file. Microsoft AppSource expects you to have your marketing, support, and image files ready for submission in addition to your solution. Because of this, please consider the following tips:

  • Start your marketing efforts in parallel with your solution packaging efforts; this includes the creation of marketing data sheets, product images, and application icons.
  • Application Icons
    • Icon and image sizes should exactly match the sizes the submission process requests.
    • Try to create all of the app icons requested.
    • Your solution package requires a 32x32 .png logo file. Don't forget to get this completed; otherwise, you can't complete the packaging process.
  • You will need an Azure subscription to store your solution package for the submission process to retrieve and test it. Use this handy tool for this process.
    • Note: Microsoft hosts your final solution file for AppSource. This is a temporary location for the submission process to evaluate your solution prior to publishing.
  • You will need to have a license, privacy statement, and supporting marketing data sheet documentation.
  • Don't select CSV for your lead source. This creates daily Excel files that end up being difficult to manage. Since you already have an Azure subscription for file storage, you can use Azure Table storage or select your cloud-based CRM system.
  • The AppSource review team will send you a document to complete the end-to-end testing steps. This happens during the process, so be prepared to see it and send it back when requested.

Solution Packaging

You need to take your managed solution file and 'package' it using the solution packaging tool. Follow the article's steps for more detail, but the part that might confuse a Visual Studio novice (like me) is updating your references. Here are the minimum steps to get a packaged file ready to zip.

  1. Assuming you have installed the package from the link above, create a new CRM project
  2. Right-click References and select Manage NuGet Packages
  3. Click Updates, select all, and update (this updates your references with the latest files from NuGet)
  4. Copy your managed solution to PkgFolder
  5. Update ImportConfig.xml with package name (and any other settings necessary)
  6. Build and note the location of your debug output file
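For reference, a minimal ImportConfig.xml tends to look something like the following sketch. The solution file name here is a placeholder; consult the packaging article for the full set of attributes:

```xml
<?xml version="1.0" encoding="utf-16"?>
<configdatastorage xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                   xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                   installsampledata="false"
                   waitforsampledatatoinstall="false"
                   crmmigdataimportfile="">
  <solutions>
    <!-- Must match the file name of the managed solution you copied into PkgFolder -->
    <configsolutionfile solutionpackagefilename="YourSolution_managed.zip" />
  </solutions>
</configdatastorage>
```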

AppSource Packaging

Microsoft provides you with detailed instructions for this process. This is a lengthy document, so here are the steps I take when preparing an app for the store submission.

First, the sample template zip file originally sent to me was incorrect: it failed to include the required ImportConfig.xml in the PkgFolder. And, while not a mistake, you don't need the privacy.htm file included. Here are the steps I take AFTER I have the solution package built from Visual Studio.

  1. You need a 32x32 logo file! Be sure to have it ready ahead of time, and the size must be exactly 32x32.
  2. Create a folder called Package.
  3. Copy the dll and PkgFolder files from your debug build.
  4. Inside the PkgFolder, delete the /en folder. The only two files necessary are the managed solution zip and the ImportConfig.xml file.
    BE SURE THE ImportConfig.xml file is properly updated with your values.

    image2017-2-16 16-12-43
  5. IMPORTANT: Add the content_types xml file! Grab this file from the template folder (or a previous submission).
    image2017-2-15 14-58-28
  6. Zip this and call it Package.zip. Be careful that the zip doesn't include the parent folder; the inside of the zip should exactly match the screenshot in step 5.
  7. Create another folder (I usually name it AppSource_<AppName>).
  8. In this folder, copy your Package.zip file you just created.
  9. Add the content types xml again, plus a license file, an input.xml file, and the logo. All of these files are required.
  10. Be sure to update the input.xml file with your specific settings.

    image2017-2-15 15-0-33

    image2017-2-15 15-0-57
  11. Zip up the contents of this folder. The zip file needs to be uploaded to Azure storage, and the URL to that storage location is what you enter into the AppSource submission request.
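Putting the steps above together, the final folder layout should look roughly like this before you create the outer zip (all file names here are hypothetical examples):

```
AppSource_MyApp/
├── [Content_Types].xml
├── MyAppLogo_32x32.png
├── License.txt
├── input.xml
└── Package.zip
    ├── [Content_Types].xml
    ├── MyPackage.dll
    └── PkgFolder/
        ├── ImportConfig.xml
        └── MySolution_managed.zip
```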

Wrap Up

The process of loading your app to the AppStore may appear intimidating at first. However, Microsoft and your service partner can assist you throughout the process and Microsoft is continuing to improve the entire submission process experience. While these tips don't cover every step required, hopefully they provide a jumpstart to some of the more common missteps we see.

Topics: Microsoft Dynamics 365

How to: Migrating Unified Service Desk Configuration Data

Today's blog post was written by Michael Maloney, Principal Developer at Sonoma Partners.

As with many projects, we typically follow a development, staging, and production model of deployments. On larger projects, it’s not unheard of to have four, five, or even more environments. When it comes to deploying Unified Service Desk, this can be a challenge due to the heavy reliance on data as configuration. Today, we are going to walk through how you can easily migrate this configuration data from one environment to another. For the purposes of this walk-through, we will assume the environment(s) already have the required USD solutions installed. If not, take a look at one of our previous posts on how to get Unified Service Desk up and running.

Before getting started, be sure to download the latest version of the Dynamics CRM and UII SDK from here and extract each to a designated folder, e.g., D365\SDK and D365\UII.

Exporting Unified Service Desk Configuration Data from the Source Environment

To export the configuration data, run the DataMigrationUtility.exe file found in the D365\SDK\Tools\ConfigurationManager folder and choose Export Data on the main screen, then click Continue.

Maloney 1

Enter credentials for the organization you would like to export data from and click Login.

On the next screen, select the default Unified Service Desk configuration data schema file (USDDefaultSchema.xml) to be used for the data export. This is found in the UII\USD Developer Assets\USD Configuration Tool Schema folder.

Specify the name and location of the data file to be exported.

Maloney 2

Click Export Data. The screen displays the export progress and the location of the exported file at the bottom of the screen once the export is complete.

Maloney 3

Click Exit to return to the main menu.

Importing Unified Service Desk Configuration Data to the Target Environment

Before importing the USD configuration data to the target environment, be sure to import the necessary packages and/or solutions first.

From the main screen of the CRM Configuration Manager, select Import Data then click Continue.

Maloney 4

Enter credentials for the organization you would like to import data to and click Login.

The next screen prompts you to provide the data file (.zip) to be imported. Browse to the data file, select it, and then click Import Data.

The next screen displays the import status of your records. The data import is done in multiple passes to first import the foundation data while queuing up the dependent data, and then import the dependent data in the subsequent passes to handle any data dependencies or linkages. This ensures clean and consistent data import.

Maloney 5

Click Exit to close the tool.

To verify the changes in the target environment, open up the Unified Service Desk app and click the “Change Credentials” link on the loading screen.

Maloney 6

If you have more complex customizations involving many solutions and configuration data, you can opt to create a custom package instead. These packages bundle everything up so that you can then run them from the Package Deployer Tool, just as the original Unified Service Desk packages you see when setting up for the first time. We’ve written in the past on how to get started creating your own package, and you can find more detail on MSDN on how to include your configuration data along with the package. 

Topics: Microsoft Dynamics 365

SystemForm with Id Does Not Exist

Today's blog post was written by Matt Dearing, Principal Developer at Sonoma Partners.

I had a customer reach out recently saying they were trying to open contact records from one of their sandbox Dynamics 2016 online instances and were getting the following popup:

Matt dearing 1

The log file showed the following:

"systemform With Id = 04238d8a-dbf8-467c-805f-4af4b757870 Does Not Exist"

I asked the user if they had deleted any forms recently. They said they had deleted a secondary "test" form in that org. My thought was that something had cached the old form id and CRM was continuing to try to load it even though it no longer existed. I asked the user to clear their browser cache, but they still received the same error. I asked them to try loading the same record in a second browser while I queried "userentityuisettings.lastviewedformxml" via a FetchXML query, and noticed that the old form's id was still there.
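A FetchXML query along these lines will show you what's stored (a sketch; the objecttypecode of 2 here assumes the contact entity):

```xml
<fetch>
  <entity name="userentityuisettings">
    <attribute name="lastviewedformxml" />
    <filter>
      <!-- 2 is the object type code for contact -->
      <condition attribute="objecttypecode" operator="eq" value="2" />
      <!-- limit the results to the affected user's settings -->
      <condition attribute="ownerid" operator="eq-userid" />
    </filter>
  </entity>
</fetch>
```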

lastviewedformxml <MRUForm><Form Type="Main" Id="04238d8a-dbf8-467c-805f-4af4b757870f" /></MRUForm>

I did a "publish all" and queried again and saw that the correct form id was now stored.

lastviewedformxml <MRUForm><Form Type="Main" Id="1fed44d1-ae68-4a41-bd2b-f13acac4acfa" /></MRUForm>

This meant the publish all may have triggered a refresh, or it was a coincidence and the user's secondary browser was what actually refreshed "lastviewedformxml". Either way, I asked the user to try again in the primary browser, expecting everything to work, but they still received the same error. I navigated to the same record, which loaded fine, so I decided to take a quick look at local storage via the dev tools. I noticed form ids were cached there.

I had the user run "localStorage.clear()" from the console window of the dev tools instance in their primary browser, then reload the page, and everything loaded correctly. Although the user had cleared their cache, it appears some browsers tie clearing local storage to clearing cookies, so depending on what your cache clear is actually doing, it may not be clearing local storage.

The need for deleting a form rarely arises, but if you find yourself in a similar situation be very careful. If the form must be deleted and users have been using it, you may need them to fully clear their browser cache in order to get the correct form loaded.

Topics: Microsoft Dynamics 365

Plotting CRM Audit Data

Today's blog post was written by Angel Shishkov, Principal Developer at Sonoma Partners.

CRM field audit data is notoriously hard to retrieve. It is not available through Advanced Find and is not exportable through the Audit view on the CRM record. Going the custom route, it is possible to retrieve audit records through FetchXML queries, but getting to the concrete values that changed becomes harder. The recommended and supported way of getting to the field audit logs then is to use the RetrieveAttributeChangeHistoryRequest message.

I will demonstrate how to use the RetrieveAttributeChangeHistoryRequest message in CRM by setting up a simple feature that requires us to read the Audit logs.

Requirement

We have a custom currency field on the Account entity called new_revenue. We have an integration or some other process that updates this field periodically with the total revenue on this Account. We would like to plot the value of revenue on the Account over time, so we can visualize the trend.

Design

We will make use of CRM’s native field auditing to track the historical values of the revenue field on Account. We need to enable auditing on the CRM org, on the Account entity, and on the custom new_revenue field. We will use a custom console application to retrieve the field audit data for new_revenue through the RetrieveAttributeChangeHistoryRequest request of the CRM SDK. The data retrieved will be dumped into a CSV file, which can be opened in Excel and the data can be analyzed and plotted on a graph.

Implementation

The relevant C# code for the console app is below.
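This is a sketch, assuming an already-authenticated IOrganizationService named service and the target Account's id in accountId; error handling is omitted:

```csharp
using System;
using System.IO;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

public static class RevenueHistoryExporter
{
    public static void Export(IOrganizationService service, Guid accountId)
    {
        // Ask CRM for the full audit history of new_revenue on this Account.
        var request = new RetrieveAttributeChangeHistoryRequest
        {
            Target = new EntityReference("account", accountId),
            AttributeLogicalName = "new_revenue"
        };

        var response = (RetrieveAttributeChangeHistoryResponse)service.Execute(request);

        using (var writer = new StreamWriter("output.csv"))
        {
            writer.WriteLine("Created On,Old Value,New Value");

            foreach (var detail in response.AuditDetailCollection.AuditDetails)
            {
                var attributeDetail = detail as AttributeAuditDetail;
                if (attributeDetail == null)
                {
                    continue;
                }

                // FormattedValues gives us createdon converted to the local time zone.
                var createdOn = attributeDetail.AuditRecord.FormattedValues["createdon"];

                var oldValue = GetRevenue(attributeDetail.OldValue);
                var newValue = GetRevenue(attributeDetail.NewValue);

                writer.WriteLine("{0},{1},{2}", createdOn, oldValue, newValue);
            }
        }
    }

    private static string GetRevenue(Entity snapshot)
    {
        // A null snapshot (or a missing attribute) means the field had no value.
        if (snapshot == null || !snapshot.Contains("new_revenue"))
        {
            return string.Empty;
        }

        return ((Money)snapshot["new_revenue"]).Value.ToString();
    }
}
```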

This is what it does:

  • Creates a RetrieveAttributeChangeHistoryRequest and sets the Account record as the Target and the new_revenue field as the AttributeLogicalName. This request will return all the audit history for this field.
  • Executes the request against the CRM OrgService and receives the response.
  • Opens a stream to write to a file called output.csv. This file will contain the audit history output in a comma-separated format that is readable by Excel.
  • The response returns a collection of AttributeAuditDetails, each of which represents a single change in the value of the new_revenue field.
  • We loop through all AttributeAuditDetails, and for each one we extract the following:
    • createdon: This is the date and time of the change. We retrieve it from FormattedValues, so that we get the date and time converted in our local time zone, instead of UTC.
    • new_revenue old value: This is the value new_revenue held before the change occurred.
    • new_revenue new value: This is the new value that was assigned to new_revenue.
  • We format these values into a comma-separated line, and write the line to the CSV file.

When we run this code, it produces a file with contents that look like this:

We can open this file in Excel and see the three columns of data we extracted, the date and time of the change, the old value, and the new value of revenue. Now that it is in Excel, we can set up a simple chart to visualize the data. Select columns A and C (the date and the new value), open the Insert tab in Excel, and select a chart, for example a 2D Line. Excel automatically sorts out the axes, plotting the revenue dollar value on the vertical and the date and time values on the horizontal.

Here is what it looks like:

Angel image 1

Conclusion

Mining your audit data has lots of uses, and this is just one of them. If you are looking to expand on this, or build something more complex and you need some help, give us a call. Thanks for reading!

Topics: Microsoft Dynamics 365

Shifting to a PowerBI World

Today's blog post was written by Keith Mescha, a Principal Architect at Sonoma Partners.

Power B-What?

There's been a bit of buzz recently in CRM circles around Power BI and how it fits into the overall CRM landscape. There is certainly no shortage of blogs, podcasts, and event demos showcasing the power of this toolset. On a personal note, I'm seeing the same level of interest based on the meetings on my calendar over the past few weeks.

What we're finding at Sonoma is that the topic of Power BI is still very confusing for most customers. In this multi-part series we hope to clear up some of that confusion through real-life examples and suggest some best practices based on learnings from our internal use as well as implementations with our clients.

The Basics

Power BI is Microsoft's analytics and dashboard Business Intelligence (BI) suite of tools and solutions. It's made up of a few main offerings:

  • Power BI Desktop – Free development tool for building data models and dashboards
  • Power BI Service – Online PaaS offering to host and share Power BI solutions, with a monthly per-user subscription licensing model
  • Power BI Embedded – Online PaaS offering for embedding Power BI solutions into applications for external consumption, with a session-based subscription model
  • Power BI Mobile – Free mobile app for consumption of deployed Power BI solutions on a mobile device
  • Power BI Enterprise Gateway – On Premise software used to extend on premise data to the Power BI Service

For latest and greatest features and release notes, refer to the Power BI site.

Real-Life Example

Every company I know has data quality issues or challenges to overcome; unfortunately, we here at Sonoma are not exempt. One of the things we constantly struggle with is incomplete and stale data. Oftentimes we will create a record without knowing all of the various data points we typically want to capture. If we do not go back and fill that information in once it becomes known, we are left with an incomplete and potentially inaccurate record. The genesis of the original report request was to provide a means for the data steward team to monitor newly created records.

The original ask was for a custom SSRS report that listed out all Accounts that were missing specific data across a pre-determined list of fields. This report would then be provided to our data steward team on a regular basis to ensure the data is augmented appropriately. When one of my colleagues stopped by to chat about the report, I happened to be in the middle of building a dashboard in Power BI for a customer. He was curious what I was working on, and as we talked and he explained his need, I fired up a new instance of Power BI desktop in an attempt to solve the problem.

image

Get Your Data

I pulled up the 'Filtered Account' view from the 'Get Data' option in Power BI. My user account is a System Admin, and the Filtered Account view resolves all the option set and lookup fields for us, so this was a quick and easy solution. When getting data from many of the source types, you have the option to Import or DirectQuery. I'll go deeper on that topic in my next post, but for this solution I chose to import.

image


Shape Your Data

Initially, Power BI pulls in all the attributes in the table, which can be a bit overwhelming, so filtering down the attributes was our next step. My colleague had a list of the fields he wanted to audit and report on, so I filtered the data set down to only those attributes.

image

Within 15 minutes my colleague and I had a prototype dashboard built with the fields he needed, including a couple charts to use as filters. From there we iterated on a few designs, and ultimately landed on the below look and feel. End to end this entire process took a couple hours to build, including some custom columns to allow linking to our Account Forms on Dynamics and Salesforce.

image


Deploy Your Solution

Once we had a solid solution built in Power BI Desktop, I created a new O365 group for internal reporting, added a few users to the group and then deployed the Power BI solution to the Power BI service.

One key thing to note here is that we had to assign Power BI licenses to these users before they could log in and see our dashboard. The fact that we wanted to schedule the refresh a couple times per day, and that we were managing access through our O365 AD, bumped us from the free to the paid license. We had a handful of users requiring access and plenty of licenses available, so this was a decent option for us. If you have a different need, there are other options we can consider for staying on the free tier.

image

Manage Your Solution

From there my colleague could interact with the report by logging into PowerBI.com.

image

After my colleague was happy with the presentation of the data, and after a few more cleanup steps, I was able to set up the refresh of the data set in Power BI. We already had a Power BI Enterprise Gateway installed on our SQL Server for another solution, so including my data set in the refresh was very simple with just a few clicks.

Now that our solution was deployed, our data stewards have logged in and subscribed to the dashboards. Every morning they get an email with a snapshot of the dashboard and a link to access the dashboard where they get direct access to the records in Dynamics or Salesforce so they can easily update those with missing data.

In summary, the Power BI platform is a great set of tools that are easy to learn. There is quite a bit of information available through various internet communities, so getting started is easy. As shown here, from idea to solution was less than one day for a very simple deliverable. The toolset can extend into much larger solutions as needed, so please give us a call and let us help you take the next step in your analytics journey within your CRM applications.

In our next Power BI post, we will discuss the step of getting your data in more detail, and how to decide between Data Import and DirectQuery.

Topics: CRM Best Practices CRM for Professional Services Microsoft Dynamics 365 Salesforce

Infographic - Dynamics 365: Fast Facts and Advice for Transitioning

In our continued effort to keep you up-to-date on the new and exciting Dynamics 365 platform, check out our infographic on Dynamics 365: Fast Facts + Advice for Transitioning.

Dynamics 365 infographic preview

Download the Infographic >>

Topics: Microsoft Dynamics 365