Sonoma Partners Microsoft CRM and Salesforce Blog

Sonoma Partners Becomes a Microsoft CSP

Today's blog post was written by Corey O'Brien, VP of Development at Sonoma Partners.

We're excited to announce that Sonoma Partners has joined the ranks of the Microsoft Cloud Solution Providers (CSPs). This means that not only will we continue to provide the best-in-class Dynamics services you have come to expect, but also that we now sell and support subscriptions to Microsoft services.

What does this mean?

We were already supporting these solutions, but adding this offering means we are now officially a one-stop shop for our clients. Our support solutions and accelerators – more coming soon! – will not only add value to the platform but can now be included as part of our CSP offering. As said by Microsoft, “The Cloud Solution Provider program puts our partners in the middle of the relationship with the customer, delivering on solution success, and managing the business relationship.” You can learn more about the benefits of being, and working with, a Microsoft CSP partner here.

Our team has been working with Dynamics for years and is recognized as a leader in the Microsoft Dynamics world. We already have support tools to monitor our clients’ Dynamics instances proactively, providing insight into potential issues before they arise. Our team continues to look for ways to bundle in additional IP with subscriptions to provide more value to our customers.

Why should you care?

Starting immediately, our clients can buy subscriptions to Microsoft Dynamics, Office 365, and Azure from us directly. Purchasing through a CSP partner, as opposed to going directly to Microsoft, can provide you with a broader range of purchasing methods, more flexible and customized solutions, and a more direct line of support. Whether you have already taken advantage of our services or would like to start with licenses first, we can help.

If you would like more information on this topic, please contact us.

Topics: Microsoft Dynamics 365

Webpack and Dynamics 365

Today's blog post was written by William "Dibbs" Dibbern, Principal Developer at Sonoma Partners.

These days if you're building any complex custom UI, you're going to come across webpack, and for good reason. What is webpack? It's a build tool that you can read more about here.

Webpack is great, but when using it with D365 you will likely hit a few bumps in the road. You'll notice that you are unable to load SVG or font assets into D365, stumble across the restrictions on which characters can appear in web resource filenames, and flip a table trying to get sourcemaps into D365 as well.

How do we get around these hurdles? The most obvious answer might be to use the url-loader module to embed your assets as Data URIs. That approach may work out fine in some cases, but there's a whole host of very valid reasons not to use the Data URI approach. Since that approach carries some significant downsides, let's take a look at other ways of solving each of our problems.

Note: the examples below target webpack v3.4.1.

Unsupported File Types - Fonts

The way around most of the unsupported file types we care about is to append .css to the filename; we use this primarily for font files. All you need to do to fix this in webpack-land is adjust your output name pattern so that .css is appended, and you're all set. A font called something like awesomebold.ttf therefore becomes awesomebold.ttf.css.
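
As a rough sketch (assuming file-loader is installed and the test pattern below matches your font extensions), the relevant rule in webpack.config.js might look something like this:

  module.exports = {
    module: {
      rules: [
        {
          // Fonts: append .css so D365 accepts the file as a web resource.
          // awesomebold.ttf is emitted as awesomebold.ttf.css.
          test: /\.(ttf|eot|woff|woff2)$/,
          use: [
            {
              loader: 'file-loader',
              options: {
                name: '[name].[ext].css'
              }
            }
          ]
        }
      ]
    }
  };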

Unsupported File Types - SVG

While SVG web resource support is supposedly coming in the next release, how do we handle this until then? You could certainly go with the same append-.css-to-the-filename trick we used for fonts, but in our experience we like another approach. Our SVG assets do not change all that often and are generally pretty small, so one would think we might lean back towards the url-loader approach mentioned above. And we do. Kinda. url-loader always Base64-encodes your assets, but SVG under the hood is just XML. Since it is therefore human readable, we prefer to pull the XML in pretty much as-is with svg-url-loader.
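
A minimal sketch of the SVG rule, assuming the svg-url-loader package is installed (option names come from that loader's documentation and may differ by version):

  module: {
    rules: [
      {
        // SVG: inline as a readable UTF-8 data URI instead of Base64.
        test: /\.svg$/,
        use: [
          {
            loader: 'svg-url-loader',
            options: {
              // Omit the surrounding quotes so the URI works in url() and src attributes.
              noquotes: true
            }
          }
        ]
      }
    ]
  }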

Filename Special Character Restrictions

If you've dealt with web resources for more than a hot second, you've likely stumbled across the fact that D365 will not allow you to name a web resource with a hyphen in it. According to the docs they technically only support:

[The] name can only include letters, numbers, periods, and nonconsecutive forward slash ("/") characters.

This means no hyphens. Hyphens are pretty darn common in asset filenames, though, so thankfully we have a way around it! What we typically do here is simply ask webpack to strip out the hyphens using the customInterpolateName option. While customInterpolateName is not well documented, it is certainly available for us to hook into and thus solves our problem.
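
In webpack 2/3 that hook lives at the top level of the config object, where loaders pick it up when interpolating asset names. A sketch, with the caveat that this leans on the loosely documented behavior mentioned above:

  module.exports = {
    // ...entry, output, and module rules as usual...

    // Called for every interpolated asset name; strip characters that
    // D365 web resource names do not allow (hyphens, in our case).
    customInterpolateName: function (url, name, options) {
      return url.replace(/-/g, '');
    }
  };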

Sourcemaps

While it is arguable whether or not you want to store a sourcemap in D365, for homegrown applications sourcemaps can be very handy for debugging, especially if you're not using something awesome like Imposter for Fiddler. Thankfully, getting them into D365 is fairly easy using something similar to our first trick. In this case we prepend the word map, as transforming the filename in any other way doesn't seem to work in all cases. Again, only a quick change to our webpack config is needed and we're off to the races.
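
A sketch of that change, assuming your bundles are plain .js web resources (the exact filename pattern here is our assumption; the point is that the generated map keeps a .js extension D365 will accept, e.g. map.bundle.js):

  module.exports = {
    devtool: 'source-map',
    output: {
      filename: '[name].js',
      // Prepend "map" rather than appending ".map" so the sourcemap's
      // file name is still a valid web resource name.
      sourceMapFilename: 'map.[file]'
    }
  };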

Conclusion

So you see, with a few simple tweaks to our webpack config, we can now use just about any resources we want. Make a few config updates, kick off a new webpack build, and you've got your assets all set and ready to be uploaded to D365.

Have you come across any other issues with web resource development in D365 that you've yet to solve? We'd love to hear about them in the comments!

Topics: Microsoft Dynamics 365

Microsoft AppSource Admin Portal Improvements

Did you know that Microsoft AppSource recently celebrated its one-year anniversary? Me neither… time really flies, as I just assumed it had been around much longer. The progress made over one year has been incredible!

Microsoft recently improved the app submission process experience with the release of their Cloud Partner Portal (https://cloudpartner.azure.com). In a previous post, I discussed how to prep your Dynamics 365 Customer Engagement solution to make it AppSource ready. This post will delve a bit more into the submission portal pages and detail the assets you need to have ready to submit an app.

Getting Started

First, you need to sign in with an account authorized to publish apps.

After signing in, you'll be taken to your offers list. This shows the status of your offers, including whether any have errors in the submission process.

To create a new Dynamics 365 Customer Engagement (i.e., CRM) submission, click New offer and select Dynamics 365 Customer Engagement.

This will take you to the new and improved Editor experience. As before, you can save your progress at any point (I always recommend you save often!).

The submission is broken into four key areas:

  • Offer Settings sets the Id, Publisher, and Name.
  • Technical Info deals with package information. The assets you need for this area are covered in detail in my previous post.
  • Storefront Details contains all of the marketing information required for the store.
  • Contacts simply collects the Engineering and Support contact info.

Storefront Details Section

Let's delve into my key takeaways from the Storefront Details section.

As you develop your solution for AppSource, be sure to have the following items ready from your content, product, and marketing teams:

App distribution logistics

  • A short summary and description of your application
  • Your app's targeted industries, categories, and the countries you plan to support
  • URLs to the following:
    • Help guide
    • App's privacy policy
    • Customer support
  • Terms of use
    • For some reason, this isn't a URL; the portal expects HTML. I prefer to have our terms on our website, so we use the following HTML snippet:
    • <p>[App Name] is governed by the following terms located on our website.</p>
      <p>Please visit:
      <a href="http://[url to license agreement]" target="_blank">http://[url to license agreement]</a>
      for full details.</p>

Marketing Artifacts

  • The new UI only requires two sizes of app logos: a 48x48 and a 216x216 logo. Be sure to get the sizes correct.
  • You can optionally connect videos that are stored on YouTube or Vimeo.
  • You need at least one document. We usually add a product datasheet PDF.
  • You can also add up to 5 screenshots of your app. Ensure your image size is 1280x720.

Lead Management

Finally, you need to decide where you want AppSource to send the information it collects when users download your application. Your choices are:

  • None
  • Azure Blob
  • Azure Table
  • Dynamics 365 (CRM Online)
  • HTTPS Endpoint
  • Marketo
  • Salesforce

Additional Features

Compare

One final really cool new feature is the Compare function. You can use Compare to find differences between the published version and your current draft version. This is helpful when you are making updates to your app and trying to keep track of them.

Additional Portal Users

The new Cloud Partner Portal also allows the admin to create additional users who can have rights to contribute to the submission process.

This ensures that I'm not a bottleneck in our app submissions and has been a huge help to our organization.

Final Thoughts

Once you are ready to submit, you will click the Publish button. Microsoft AppSource will go through a series of validation steps and email the contacts (and any additional email addresses you submit as part of the process) with updates on completion, or send you an email alert if it finds an error in your submission.

If you are an ISV and would like assistance with this process, please don't hesitate to contact us!

Topics: ISV for CRM Microsoft Dynamics 365

Connecting the Power BI On-Premise Gateway to a SQL Server Analysis Services Tabular Database

Today's blog post was written by Neil Erickson, Principal Developer at Sonoma Partners.

We've been surfacing a lot of our information for internal users via Power BI, and recently we wanted to author a round of reports that accessed an SSAS tabular database directly. We ran into some early snags, which were easily resolved by following some of the solutions posted by the Power BI team here.

Our takeaways from these early issues were to ensure a few things:

  1. Confirm that the account entered in the gateway configuration has been given Full Access within the tabular database.

  2. Confirm that the UPN that Power BI is using, the one on your Office 365 User object, matches the UPN of the target User in Active Directory.

Even with these changes in place we continued to see our report failing to load anything. We knew that the gateway was using EffectiveUserName to impersonate the end user, so we tried to determine if this was the culprit. In SSMS you can pass in the EffectiveUserName by going to the Connection Options and selecting the “Additional Connection Parameters” tab. By doing this, we captured in SQL Profiler the same error that we were seeing when the Power BI report would fail.
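
For reference, the text entered on that tab is just the property assignment with the target user's UPN (the UPN below is a placeholder):

  EffectiveUserName=someuser@yourdomain.com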

The following system error occurred: The user name or password is incorrect.

At this point we were confident that permissions were set properly within Analysis Services, so we began looking elsewhere. The step of verifying the UPN led us to Active Directory, and we discovered that making the account that runs Analysis Services a Domain Administrator solved the issue. This was not desirable as a long-term solution for obvious reasons, so we continued to try various settings. Eventually we whittled the necessary permissions down to Read on the AD user that is specified by EffectiveUserName. To achieve this, we took the following steps.

1. Create a new Security Group and add the account running the SSAS Service to this group

We did this because we like to separate the accounts that run different applications, and we like to retire old accounts when migrating to new versions of applications. It seems like this permission will be common, so our preference was to add the permission one time and manage the group membership as needed.

2. Grant the new security group read permissions to user objects in Active Directory

This permission can be granted to individual objects, or to an OU at a level that is sufficient to cover any users that would be passed into EffectiveUserName.

If this is being granted to an OU, click into “Advanced” and verify that it will apply to descendant objects as well. If it reads “This object only,” edit the row and set “Applies To:” to include descendant objects.

3. Restart the SSAS Service

Initially the report in Power BI was still showing the error, but after waiting a few minutes and trying again, it worked. Our assumption here is that Power BI does some caching, so the key point is: when changing domain-level security, make sure you provide ample time between tests for the settings to propagate through the various layers.

Now our report is accessible on powerbi.com and is connecting to our on-premises Analysis Services tabular model, deployed in our data center, via the Enterprise Gateway.

Please reach out if you have any questions by filling out our contact us form or commenting below.

Topics: Analytics Microsoft Dynamics 365

Deleting Fields? Be Wary of Personal Views.

Today's blog post was written by Mike Dearing, Principal Developer at Sonoma Partners.

Dynamics System Administrators have free rein to do just about anything in a Dynamics organization… except with personal views. While you can manage your own, of course, the personal views of others are locked down even for those of us with the greenest of bubbles in our privilege lists. And while the restriction itself makes sense, I suppose – with 10s, 100s, or even 1000s of users, you can imagine how cluttered a System Administrator's view lists would become if subjected to the whims of their user base's individual views – this limitation does pose a problem when paired with the desire to delete custom fields.

Behold the cursed yellow in a field of green, clipping our System Administrator wings, humbling us down to the level of a standard user.

Issue

You’ve been tasked with deleting a field in your organization - an easy feat for any System Administrator to accomplish. Or is it? Remember the 1000s of users mentioned before? Although this field may no longer serve a purpose for your organization, there was a point in time when it did, and someone is bound to have leveraged it as a column or criteria for one of their views. Dynamics will not notify you of this. It will tell you when the field is on a form, a system view, even a plugin image, but it won’t even warn of the possibility of this field being a part of a personal view – even your own.

Solution

Before I get into my code-based solution, there are several adequate non-code alternatives here, ones which most of us have already been employing when facing a field deletion request. These range from simply warning users of the upcoming deletion and having them take it upon themselves to clean up any potentially affected personal views before the deletion occurs, to tagging the field with a z_ at the front and setting it to non-searchable, making the field fairly out of sight for the average user.

But I wanted to take a different approach this time. A generic, programmatic approach, to clean any and all personal views of upcoming field deletions, without being bottlenecked by users, or corrupting their views in the process. My goal wasn’t to make the most optimized or prettiest resulting view definitions, but to safely eliminate references to fields with minimal disturbance to users.

Caveats

  1. It is possible to programmatically create personal views that introduce situations that this code may not reliably clean.
  2. Microsoft may update advanced find view creation to incorporate functionality that this code does not anticipate, and therefore cannot reliably clean.
  3. You cannot impersonate and perform actions on behalf of disabled users. This means any personal views belonging to a disabled user will be uncleanable – so be careful of re-enabling users without running this script again.
  4. If the fields being cleaned are the primary criteria or columns of a view, the view may no longer serve much of a purpose. Views are never fully deleted as a part of this script, just cleaned, so users may still want to go clean out views that are no longer useful to them post-cleanup.

Prerequisites

  1. This script was built to be run with LINQPad 5, a free tool that can execute C# code amongst other things. With minor adjustments it can be run from Visual Studio or your code editor of choice.
  2. You'll need to reference the Microsoft.CrmSdk.CoreAssemblies NuGet package, or the Microsoft.Xrm.Sdk assembly directly. This was written as of version 8.2.0.2.
  3. The user executing this code will need to be able to impersonate other users, meaning they will need the 'ActOnBehalfOf' privilege. If you're unsure whether you have sufficient privileges, just run as a System Administrator.

Code Summary

  1. Use the Organization Service Proxy class to impersonate all enabled system users for the organization by setting the Caller Id
  2. Iterate over all personal views, for all entities, for these users
  3. Use XPath expressions to detect and clean view fetchxml and layoutxml based upon inputted entity and field combos. A first pass is done on the primary entity’s fetchxml, then a secondary pass for linked entities. Fields are matched by their entity context, so fields with the same schema across different entities will not be incorrectly cleaned in the process, and layoutxml leverages the proper entity aliases to ensure it is cleaned correctly. Multiple levels of links are also cleaned properly.
  4. Execute view updates per user, if the script is instructed that the cleanup should be performed

Code

Example

I've created a contact personal view with a link to account and referencing a few fields specified in the code to be cleaned for both entities:

And here are the accompanying view columns: 

FetchXml:

LayoutXml:

Result

The cleaned view definition:

And the cleaned view columns:

FetchXml:

LayoutXml:

After the script executes, we’re left with a functioning, albeit not fully optimized, view. I think we can all agree that it may not be the prettiest result given the sample we were fed, but we can now safely delete our fields without users receiving errors, and let the users decide what to do with their affected views on their own timetables.

Topics: Microsoft Dynamics 365

What if Power BI could help you forecast better?

Today's blog post was written by Brendan Landers, VP of Consulting at Sonoma Partners.

A new feature in the August release of Power BI is the What If parameter. This allows you to define values for a slicer that can be used in DAX formulas, letting you look at data differently based on what-if scenarios. From my perspective, this is applicable when looking at opportunities.

Traditionally, we look at weighted revenue in forecasting opportunities, with varying levels of success. Probability is typically self-reported by the sales representative or based on an opportunity status that is self-reported by the representative. While we are working on predicting opportunity outcomes using Azure Machine Learning, I still believe past performance is the best indicator of future results. I know with certainty what percentage of opportunity revenue we've closed in the past, so we can apply that to future opportunities to obtain a forecast.

Below I hope to highlight how you can use the What If parameter in Power BI to look at a forecast.

The example will be rather basic to illustrate the capability of the feature, but you could get quite complex with this type of analysis.

First, in the Power BI Desktop I’ll start with a simple data model that includes Customers, Opportunities, and Users. I’ll create two charts. First, using a Clustered Bar Chart I’ll show the Expected Revenue by Sales Rep. Next, using a Clustered Column Chart we’ll show the Expected Revenue by Year and Quarter.

Next I am going to add in a What If Parameter.  To do so, under Modeling in the What If section, select New Parameter. 

The What-if parameter dialog is launched. Here I will name my parameter What If Percentage. I will leave the Data Type as a Whole Number, and set the Minimum to 25 and the Maximum to 125. This will allow me to see what-if scenarios between 25% and 125% of the current pipeline, which I will show in a minute. Leave the Add slicer to this page box checked and click OK.

You will now see a new table on the right-hand side for What If Percentage.  Under the table, you will see two options.  The top option is the slicer values.  To add this to the page, create a new slicer chart and select the field.

We can now toggle the slicer between the values of 25 and 125, but we need to apply this to the expected revenue field to see the data in motion.  To do so, I’ll create a new measure in the Opportunity table called What If Revenue.  The formula is as follows:

What If Revenue = sum(Opportunities[Expected Revenue]) * ('What If Percentage'[Parameter Value]/100)

You can see we are multiplying the Opportunity Expected Revenue by the What If parameter (after converting it to a percentage by dividing by 100). We now have a dynamic value in our Opportunity table. Next, we will add the new measure as values in our bar and column charts to view the Expected Revenue side by side with the What If Revenue.

So, now you can see that with the slicer set to 100, the values are equal, but as I slide the slicer I see the what-if value change in real time. For example, if I slide the slicer to 60, I can see what the numbers would look like if we hit 60% of the expected revenue.

I hope this illustrates how you can apply the new what if parameter to your Dynamics 365 for Sales data to help forecast your pipeline. Let us know if you have any questions by commenting below.

Topics: Analytics Microsoft Dynamics 365

D365 Edition: Dynamic Forms vs. Business Rules

Today's blog post was written by Justin Concepcion, Developer at Sonoma Partners.

Microsoft first released Business Rules in 2013 as a feature of Dynamics CRM 2013, as a way to automate an entity form without requiring the use or knowledge of JavaScript. At that time, we released a blog post comparing Business Rules to our own entity form automation solution, Dynamic Forms. Since then, however, many updates have been added to Business Rules, and we wanted to come back to this comparison and review it with an updated view.

In the initial blog post, we found that Dynamic Forms provided a lot of functionality that Business Rules do not. In 2017, there are still gaps in Business Rules that can be filled using Dynamic Forms. For example, both Business Rules and Dynamic Forms have the capability to manipulate fields on a form, including setting field values, disabling fields, or requiring them. However, only Dynamic Forms is able to manipulate sections, tabs, and even the forms themselves. With regards to conditions, Dynamic Forms is also able to check many things that Business Rules cannot. For example, while both can check field values when deciding whether to run a rule or not, only Dynamic Forms can check the form type, related entity field values, and even the current user's security role.

Since 2013, however, a few new features have been added to Business Rules that Dynamic Forms currently does not provide. For example, Business Rules in Dynamics 365 can now show Field Recommendations, which display a tool tip next to fields suggesting certain values. This allows you to recommend values for fields without actually setting the fields themselves. Another large change with Business Rules is that they now run server-side instead of client-side on a user's computer. Since Dynamic Forms only runs client-side, this gives Business Rules an advantage in performance and in easier maintenance for scenarios where JavaScript would otherwise have to be paired with a plugin.

Overall, in 2017, Dynamic Forms continues to provide a lot of functionality that Business Rules do not. However, since 2013, new features have been added to Business Rules that are not currently provided in Dynamic Forms. When deciding between the two, we recommend reviewing your business requirements, choosing the one that best suits your needs, and making sure the rules interact correctly. Potential issues to watch out for include plugins and workflows getting triggered by server-side Business Rules, and client-side scripts (such as Dynamic Forms) behaving unexpectedly alongside server-side Business Rules.

Please use the table below as a guide to see which features are included in Dynamic Forms, Business Rules, or both.

If you wish to try out Dynamic Forms, the Dynamic Forms Community Edition is free to download but limited to a maximum of 10 rules. If you need more or have any questions, please contact us.

Learn more about Dynamic Forms and download the free Community Edition here.

Topics: Microsoft Dynamics 365

Dynamics 365 Multi-Select Fields

One of the questions we're asked most often by our customers is “does Microsoft Dynamics CRM have multi-select fields?” Well, now we can safely answer “Yes” to that question, as D365 version 9.0 will have this in the product.

Note that I'm calling this D365 version 9.0, as this was originally the Spring release of D365, then the July 2017 release of D365, and now that July has come and gone, it's not clear when the release will drop. However, we do know that it'll be the next major version, v9.0.

Multi Select Pre D365 v9.0

Before the release of D365 v9.0, there were many ways to get around the fact that Microsoft CRM did not have multi-select fields built in:

  • Custom 1:N entities where each child record represented a selected value
  • Custom N:N relationship where linking records in each entity together represented a selected value
  • Multiple custom fields on the entity where making selections in each field represented a selected value
  • Custom third-party solutions such as Sonoma Partners' multi-select tool for D365

These solutions came with their own pros and cons, and none was 100% the way our customers wanted the solution to work. Some came with a heavier investment, some came with a less than ideal user experience for either adding values in CRM or importing data in bulk, etc.

However, all that changes with the v9.0 release of D365.

How Does It Work?

To use multi-select fields in D365, you configure them just as you would configure any other field. Microsoft has introduced a new field type called “MultiSelect Option Set” that you can select when creating a new field.

All properties that exist for the current field types also exist for this new multi-select field type (e.g., enabling auditing).

Note that you can also make a multi-select option leverage a global option set and therefore you can have multiple multi-select option sets using the same list of values.  There isn’t a difference between global option sets that can be used for single-select or multi-select option set fields – it’s the same list of available global option sets to use for both.

System Administrators and System Configurators can add the multi-select field to forms and views, just as you would any other field.

How Does It Look?

When you have a multi-select option set on a form and click into the field, you'll initially see a slim dropdown appear with no values and the text “enter text here,” which allows you to start typing and then displays the values that match the text you entered.

Alternatively, if you don't want to start typing to see values that match your text, you can click the dropdown arrow on the right side of the field to display all values. This is somewhat of a pain, as it requires two clicks to see all values in a multi-select option set, whereas a single-select option set still requires only a single click to see all values.

As users start selecting values by clicking the checkboxes, the values appear above the dropdown with a little “x” on each value that a user can click to remove that value (or just uncheck the checkbox). The value you select first drives what value appears first in the list above (it's not in alphabetical order).

When done selecting values, simply click somewhere else on the form, and you'll see your selected values semicolon delimited. The values, once selected, now appear in alphabetical order in the field.

When viewing the data in a view, you'll see the selected values semicolon delimited as well. Also note that view filtering is supported, so users can select values to filter the records with, just as they would with a traditional single-select option set. However, with a multi-select option set, records are returned if any value selected in the filter matches at least one value selected in the multi-select option set. In other words, if you filter on one value that is contained in a record and another value that is not contained in that same record, the record will still be returned, because it met the criteria that at least one value existed in its field.

And what about editable grids (recently released in December 2016)? Yup, multi-select option sets support those as well!

A new “Contain Values” operator was also added to Advanced Find.  When this is selected, records are returned where ANY of the values selected are contained in the field for those records (think of this as an OR statement).  The “Equals” operator only returns records where there’s an exact match to the values selected in Advanced Find (think of this as an AND statement).

Technical Details

There are a few additional details to note about the new multi-select field that will be released with v9.0 of D365.

  • You cannot convert an existing single-select option set field to a multi-select option set field at this time.
  • Multi-select option set fields cannot be calculated or rollup fields (single-select option set fields can be calculated fields).
  • Multi-select option sets support the web client, unified interface, Advanced Find, FetchXML, the Platform SDK, and the Client SDK (see the client-side sketch after this list).
  • There is full platform support to use SDK messages for retrieves.
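
As a quick client-side sketch (v9 form scripting, with a hypothetical multi-select field named new_interests), a multi-select attribute reads and writes an array of option values:

  // Hypothetical multi-select option set field on the form.
  var attr = Xrm.Page.getAttribute("new_interests");

  // getValue() returns an array of the selected option values (numbers), or null.
  var selected = attr.getValue();   // e.g. [100000000, 100000002]

  // getText() returns the corresponding array of option labels.
  var labels = attr.getText();

  // setValue() takes an array of option values to select.
  attr.setValue([100000000, 100000001]);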

Additional Resources

As mentioned earlier, this is one of the most sought-after features that is finally making its way to the product, and we’re excited for its release.  For additional information on this feature, along with what else is coming with v9.0, check out Microsoft’s documentation.

Topics: Microsoft Dynamics 365 Microsoft Dynamics CRM Microsoft Dynamics CRM Online

Leading CRM for Leader Dogs

Today's blog post was written by Kayla Silverstein, Marketing Specialist at Sonoma Partners.

Bogged down by an inflexible custom CRM solution (Quilogy), Leader Dogs for the Blind wanted a more efficient way to track operations and client records in Microsoft Dynamics CRM (now Dynamics 365).

Who is Leader Dogs for the Blind?

Leader Dogs for the Blind is a nonprofit organization based in Rochester Hills, Michigan. Founded in 1939, they provide guide dogs to the blind and visually impaired. Through their programs, Leader Dogs helps clients find and work with guide dogs for greater mobility, independence, and quality of life.

Leader Dogs Project Fast Facts:

  • Industry: Nonprofit
  • Workload type: Customer Service
  • # of employees: 65
  • # of users in deployment: 65
  • Platform: Dynamics 365
  • Fun fact: Leader Dogs operates as the only facility in the Western Hemisphere to teach deaf-blind students how to work with a guide dog.

The Challenge:

  • Previously, Leader Dogs used a custom solution called Quilogy to manage operations, client services, puppy breeding, and training. While functional, Quilogy was an old system with limitations in both capability and scalability. For example, the outdated solution was not built to track puppy production schedules or many of the other unique operational components Leader Dogs required.
  • An outside consulting firm developed Leader Dogs’ custom solution several years ago and their relationship with the firm had since dissolved, making any opportunity to further customize or update the system impossible.

The Solution:

  • Replace Quilogy with Dynamics 365 to maintain all department records within the organization.

The Result:

  • Automation in CRM manages daily tasks with the dogs (such as flea checks, baths, etc.). Based on different triggers in the system, CRM creates Task records and assigns them automatically. When dogs are handed off between different teams, Leader Dogs can see which Tasks have and haven’t been completed.
  • The Breeding department uses CRM to trace dogs and their performance over their lifetime. A lot of analysis and science goes into picking the right dogs to breed for key traits that are essential parts of a strong guide dog. With CRM, they’re able to see how the dogs perform both in training and on the job, creating a strong feedback loop.
  • Their portal now meets accessibility standards for use by the visually-impaired.

Topics: CRM Best Practices Microsoft Dynamics 365

Compare Fields Across D365 Orgs to help Debug Solution Import Failures

Today's blog post was written by Matt Dearing, Principal Developer at Sonoma Partners.

When working with a solution, it is common to make changes to customizations. Scenarios like an incorrect attribute data type that needs to be recreated are very common, especially early in development. Although it is easy to recreate attributes in a development environment, it can be challenging to import that solution into a target organization with a previous version of the solution. The import process may fail in the target organization with a generic error message. When this occurs, it is generally related to attributes being recreated. The following is a simple LINQPad script you can use to compare attributes in two different environments. It will print out any differences in casing of schema name or data type, which are two of the more common reasons recreated attributes will cause a solution import to fail.

The script itself connects to two different CRM environments: a source where customization changes are made, and a target where the updated solution will be deployed. The script scans the target to get all of the custom entities with a given prefix, where the prefix matches that of the publisher for the solution. If an entity is new in the source, there is no reason to compare it to the target, as it will not fail the import.

Next the script loops through the custom entities for the target environment, capturing each entity's attributes. Schema Name and Data Type are the two most important metadata properties to compare, so those are returned. If an attribute is deleted and recreated and the casing of the Schema Name does not match exactly, the solution import will fail. This is because CRM does a case-sensitive comparison of attributes on solution import to determine whether any attributes are new and should be created. It will see two attributes with differently cased schema names as two unique attributes, and the failure will occur in the SQL database because two columns cannot have the same name. Unfortunately, this won't show up in the solution import UI as a helpful error. The second, more common failure is that the data type has changed. Maybe the attribute was an integer/whole number and should have been of type money, or it was a picklist and should have been a string. In that case the solution import will also fail, when Schema Names match but the data type differs.

Next the source environment is queried for its attributes to do the comparison. If the entity no longer exists in the source environment, an exception will be thrown, caught, and printed to the screen. This means the entire entity no longer exists in the source environment, so there is no comparison that can be made.

Finally the comparisons are done by Schema Name and Type and any discrepancies are printed out to the screen. If the source attribute no longer exists, there is no need to compare.

Knowing the discrepancies in attributes between environments can help tremendously when attempting to figure out why a solution import is failing. Here is the full source code:

Let us know if you need any help or have any questions by commenting below.

Topics: Microsoft Dynamics 365