Sonoma Partners Microsoft CRM and Salesforce Blog

Lightning Locker Service and You: Working with SecureDOM

Today's blog post was written by Nathen Drees, Senior Data Integration Developer at Sonoma Partners.

One of the major changes in Salesforce’s recent releases is the forced enablement of the Lightning Locker Service. While Salesforce has done its best to give plenty of warning about the Locker Service being force enabled, we still see some of our partners and developers confused about what exactly it is and whether it affects them.

Today we wanted to review what the Locker Service is, explain when it will affect you, and offer some recommendations for working within it.

What is the Lightning Locker Service?

When Salesforce released the Lightning Experience, it had a vision for the AppExchange: developers could write lots of tiny components, and administrators could install them in their org, promoting code reuse and reducing the need to hire for custom development. It was a win-win proposition for both parties – developers have more chances to sell their individual widgets, while administrators have more options to choose from. But this vision comes with a problem: for the exchange to work, Salesforce needs to guarantee that the various components installed by administrators stay safe from one another, to prevent runaway scripts (accidental or otherwise) from causing too much damage. Unfortunately, browsers don’t natively provide a way for multiple scripts within a page to be sandboxed from one another – everyone has access to (more-or-less) everything on the page.

Enter Lightning Locker Service

This is where the Lightning Locker Service comes into play. It gives each Lightning component a “locker” that it must fit within, stricter guidelines on what it’s allowed to do, and rules for how it can communicate with other components.

There’s a comprehensive introduction to the technical aspects of the Locker Service on the Salesforce blog, but the key idea to understand is that components can no longer access anything and everything on the page. If components wish to communicate, they need to coordinate in some agreed-upon manner (such as application events).

Does the Lightning Locker Service Affect Me?

If you write Lightning Components, the Lightning Locker Service affects you.

Going forward (API version 40+), you will need to ensure that your components adhere to the Locker Service’s rules or you may find that they no longer function as expected. However, adhering to the Locker Service’s rules isn’t as difficult as it may seem.

Working Within the Lightning Locker Service

If you are sticking to the native APIs provided by Lightning Components, enabling the Locker Service doesn’t change much for you during development. The main area it impacts is when you try to navigate out of your current component into another – this may no longer work as expected, specifically if those other components are not in the same namespace as yours.

In an attempt to help developers ensure that they’re staying within the guidelines (and also pass a security review), Salesforce released the Salesforce Lightning CLI, which lints the components for known errors and mistakes. Using this tool liberally will help to avoid the common cross-component scripting mistakes that may break your components going forward.

If you do need to work across components, you need to start designing and writing events to let the components communicate amongst themselves. Depending on the structure of the components, you’ll want to consider either an application event or a component event.
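Conceptually, both application and component events follow a publish/subscribe pattern: one component fires a named event, and any component that registered a handler for that name receives it, with no direct references between the two. Here is a minimal plain-JavaScript sketch of that pattern (illustrative only – this is not the actual Lightning $A event API):

```javascript
// Minimal publish/subscribe sketch of how event-based component
// communication works. Names here are illustrative, not Lightning APIs.
class EventBus {
  constructor() { this.handlers = {}; }
  // A component registers interest in a named event...
  subscribe(name, handler) {
    (this.handlers[name] = this.handlers[name] || []).push(handler);
  }
  // ...and another component fires it without knowing who is listening.
  publish(name, payload) {
    (this.handlers[name] || []).forEach((h) => h(payload));
  }
}

const bus = new EventBus();
const received = [];
bus.subscribe("recordSelected", (evt) => received.push(evt.recordId));
bus.publish("recordSelected", { recordId: "001xx0000000001" });
console.log(received); // -> ["001xx0000000001"]
```

The key property, and the reason the Locker Service allows it, is that neither side ever touches the other's DOM – they only share the agreed-upon event name and payload shape.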

Wrapping Up

There are of course many more tricks for making sure your components continue to work with the Locker Service, most of them being not very difficult to implement. If you would like help reviewing and/or correcting your components, or are looking to launch your product on the platform but could use the help of experienced developers, contact us and we can help.

Topics: ISV for CRM

Managing Multi-Domain Mailboxes in Dynamics 365

Today's blog post was written by Neil Erickson, Principal Developer at Sonoma Partners.

We often get asked about the viability of complex CRM scenarios. Below is one such scenario:

The customer had multiple instances of Exchange running in multiple domains. Both users existed in Active Directory on the same domain, so both could exist as users within Dynamics CRM. One user was set up with a mailbox in the primary Exchange instance. The other was set up as a mail contact with an email address outside the primary Exchange domain. We were asked whether it was possible for both users to be email-enabled within CRM.

The first step was to determine the sync method to use for the mail contact user, which was easy: the preferred option was for the user to sync with CRM using the Microsoft Dynamics CRM for Outlook plugin. From there, we followed these steps to test out the account.

Step 1: Creating the users

Primary Domain: sonomacrm1.local

Primary Exchange server: sonoma-crm1exch.sonomacrm1.local

CRM server: sonoma-crm1.sonomacrm1.local

Secondary Exchange domain (for mail contact user): sonomacrm2.local

First user: Control user

AD: SonomaTest1

Domain: sonomacrm1.local

Exchange Mailbox: SonomaTest1@sonomacrm1.local

CRM user: Sonoma Test 1

Second user: Mail contact user

AD: SonomaTest2

Domain: sonomacrm1.local

Mail contact user email address: SonomaTest3@sonomacrm2.local

CRM user: Sonoma Test 2

Step 2: Adding the users to CRM

Both users were added from their existing AD users using the New Multiple Users function. CRM correctly imported their information and email addresses.

Step 3: Setting up users' mailboxes in CRM

Both users' mailboxes had to have their email addresses verified and then approved. Once approved, both mailboxes were activated in CRM to mark them as active.

Step 4: Setting up the Microsoft Dynamics CRM for Outlook plugin

This step consisted of setting up the plugin for each user on separate client machines. Both connected without issue. The setup for the control user was straightforward. The caveat was pointing Outlook at the right mailbox login for the mail contact user. In this case, we connected it to SonomaTest3@sonomacrm2.local. Outlook attempts to connect, but it initially passes the password from the SonomaTest2 account. Correcting that is an easy but critical part of the setup: SonomaTest2 is used to authenticate to the domain and to the CRM instance, but not to Exchange.

Once they are connected properly to Exchange, configuring the CRM for Outlook plugin is as simple as plugging in the correct CRM organization URL.

Step 5: Testing out emails, appointments, and tracking

For testing, we developed scenarios that would work seamlessly in the control environment for SonomaTest1. These included:

  • Tracking an existing email in CRM from Outlook
  • Composing an email from the CRM dialog and tracking it
  • Composing an email from the CRM dialog and tracking it with a regarding field set
  • Tracking an existing Outlook appointment in CRM
  • Composing a new meeting request from the CRM dialog in Outlook, tracking it, and setting a regarding field
  • Composing an email to a contact from within CRM

These all worked without incident in both the control environment and in the environment for the secondary domain. Monitoring SonomaTest3's external mailbox confirmed emails and appointments were being sent out.

Topics: Microsoft Dynamics 365

Create Dynamics 365 Plugins with VS Code and the .NET Core SDK

Today's blog post was written by Ian Moore, Senior Developer at Sonoma Partners.

All the buzz in the Microsoft development ecosystem in recent months has been about .NET Core. If you haven’t been able to keep up with all the new technology and terms, here is a quick recap:

  • .NET Framework – The version of .NET you know and love. It is installed at the operating system level and runs on Windows.
  • .NET Core – A new implementation and runtime of .NET that is cross-platform across Windows, Linux, and MacOS. Applications that target .NET Core include the necessary framework assemblies with the application.
  • Xamarin – A .NET implementation that runs on mobile devices with Android or iOS.
  • .NET Standard – This is a specification of .NET APIs – things like System.Xml, System.Linq, etc. – that are available on a given .NET runtime. Code that targets .NET Standard can run on .NET Framework, .NET Core, and Xamarin.
  • ASP.NET Core – This is the latest version of the ASP.NET web application framework. It can run on both .NET Core and .NET Framework.
  • .NET Core SDK – This is a new, cross-platform, standalone SDK for .NET applications. It includes a new project file format, NuGet enhancements, and a command line tool that can build and run .NET projects. This SDK can build projects for .NET Core, .NET Standard, and .NET Framework.

Okay, not so quick after all – but there’s a lot of new and great things happening in the .NET world these days. What do these .NET Core developments mean for developers who work with Dynamics 365? Since the Dynamics SDK Assemblies are still built for .NET Framework, we can’t use them to build anything cross-platform. But we can use the new .NET Core tools to develop plugin assemblies and console apps that connect to Dynamics 365. With Visual Studio Code, a new text editor from Microsoft, we can do all the development we need without ever opening Visual Studio.

Here’s what you need to get started:

1. Install the .NET Core SDK from Microsoft.

2. Open a command prompt and run ‘dotnet’ to confirm the SDK command line tools were installed successfully.

3. Install Visual Studio Code.

4. Start Visual Studio Code and open the Extensions menu. Search for the ‘C#’ extension and install it.

With those installed, we can make a Dynamics 365 plugin project with a few simple steps.

1. Open VS Code and go to File -> Open Folder and choose a new folder for your plugin source code.

2. Open the VS Code Integrated Terminal from the View menu or using Ctrl+` . This terminal will be PowerShell or cmd depending on your VS Code settings.

3. In the terminal, run dotnet new classlib to create a new Class Library project in your current directory.

Ian moore blog 1

4. The command places two files in your directory: a stub .cs class file, and a .csproj file named after your directory. Open the .csproj file in the editor.

5. If you have ever looked at a csproj file in Visual Studio before, you will immediately notice the new format is significantly smaller. However, we need to change the TargetFramework to .NET Framework 4.5.2 to match the Dynamics SDK.
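For reference, in the new SDK-style format that change is a one-line edit. Assuming the default classlib template, the csproj ends up looking something like this (net452 is the target framework moniker for .NET Framework 4.5.2):

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <!-- Changed from the template default to match the Dynamics SDK -->
    <TargetFramework>net452</TargetFramework>
  </PropertyGroup>

</Project>
```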

Ian moore blog 2

6. Now we need to add the Dynamics SDK package from NuGet. In the terminal run dotnet add package Microsoft.CrmSdk.CoreAssemblies. You will see this command adds a PackageReference to your project file.

7. Run dotnet restore to resolve the package reference.

Ian moore blog 3

8. Rename the Class1.cs file to MyPlugin.cs and open the file in the editor. Try making a simple plugin and notice that you have full intellisense support.

Ian moore blog 4

Here is an example plugin I wrote:

Ian moore blog 5

9. Back in the terminal, run dotnet build to build the project. You will find the output DLL in the bin/Debug/net452 folder.

Ian moore blog 6

10. There is still one more step required before we can register the assembly in Dynamics 365. We need to give the assembly a strong name using the strong name tool (sn.exe).

11. If you have a strong name key file already, you can skip this step. Otherwise, search for “Developer Command Prompt for VS2015” in the Start Menu and create a new key file with the command sn -k [YourProject].snk.

Ian moore blog 7

12. Back in VS Code, add the following lines to your csproj file within the Project element:
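Assuming the key file from the previous step is named MyPlugin.snk and sits next to the project file, the properties would look something like this (the file name is illustrative – use your own .snk file):

```xml
<PropertyGroup>
  <SignAssembly>true</SignAssembly>
  <AssemblyOriginatorKeyFile>MyPlugin.snk</AssemblyOriginatorKeyFile>
</PropertyGroup>
```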



13. Your assembly will now have a strong name. Run dotnet build again from the command line to rebuild your project.

14. Now you can register your DLL with Dynamics 365 through the Plugin Registration Tool.

Ian moore blog 8

With these steps complete, you can continue to add new plugin types by adding .cs files and including them in the .csproj file. This is exciting news for Dynamics 365 developers - it allows you to have all the tools needed to build plugins with the fully configurable VS Code editor. While Visual Studio will continue to be the go-to development solution for Dynamics development, it is nice to have a new and reliable option moving forward.

Topics: Microsoft Dynamics 365

Dynamics 365 Demo Video: Power BI Query Accelerator

Today's video and blog post were created by Kristian Altuve, Business Analyst at Sonoma Partners.

Microsoft Power BI combines a collection of software services, apps, and connectors to turn your complex sources of data into a visual and interactive format. However, when connecting to Dynamics CRM, there is still a lot of data cleaning needed before it is in a user-friendly state to begin creating Dashboard components. Power BI Query Accelerator was designed by Sonoma Partners to get your Power BI environment set up in a fraction of the time! Check out the video below to learn more or go directly to our download page here.

Topics: Microsoft Dynamics 365

Handling a Project Curveball with Project Service Automation and MS Project

Today's blog post was written by Nick Costanzo, Principal Consultant at Sonoma Partners.

For the last several months, Dynamics 365 has allowed project managers to use the MS Project Add In to link project plans to Project Service Automation (PSA) and manage their projects. The instructions here walk you through how to install, configure, and manage your WBS in MS Project while syncing it back to the linked project record in PSA. This works great if all projects run smoothly and everything stays on track. But what if your client throws you a curveball and the project hits a delay? We’ve all dealt with this, and the broad impact it has on the project team, resource managers, and financial forecasts.

Today I am going to cover how to leverage MS Project and PSA to address these challenges.

If we manage each change correctly, we can avoid striking out when we are faced with that hot, stinky, cheddar pitch.

1. First, let's start with our baseline project plan to create the project record in PSA and link them up by going to the Project Service tab > Publish > New PSA Project:

Nick cos 1

2. We’ve also gotten our resources booked, so now let’s fast forward to the completion of our Discover phase and our planned kick-off of the Iteration 1 Define phase on 5/29:

Nick cos 2

3. Let’s take a snapshot of our booked resources for reference. You can see the detailed view from the Bookable Resource Booking (BRB) records related to the project, which I’ve added as a navigation tile to the project form. For simplicity, I’ve also filtered the records down to Alan, our PM, and the week of 5/29/17.  As you can see, it’s a bit challenging to summarize the data as there is a record for each day, and the duration is displayed in hours in the view, but minutes in the chart:

Nick cos 3

4. Alternatively, once you refresh the plan in MS Project, you can get a much nicer view by going to Resource Usage. As you can see, we had Alan booked for 40 hours this week:

Nick cos 4

5. Next we will simulate what would happen if the client hits a delay with the Iteration 1 Define phase, by moving this date out by 1 month to 6/29/17 in MS Project and publishing it to PSA. You could do this directly in PSA, but in order to do so, you would have to unlink the MS Project plan.

Nick cos 5

Pro tip: make sure all of your MS Project tasks are scheduled with Fixed Work – by selecting all, then clicking Information > Advanced > Task Type > Fixed Work – to avoid unnecessary BRB records remaining on the original baseline dates.

Nick cos 6

6. Now you can see that the Iteration 1 Define phase start date and all subsequent dates have been moved out in the PSA WBS:

Nick cos 7

7. However, we also need to make sure that the associated resource bookings have been moved as well. You can see that all of the BRB records for the project have been adjusted according to the new schedule from the Iteration 1 Define phase onward. Again, this is displayed more clearly in the MS Project Resource Usage view:

Nick cos 8

8. With that complete, you have successfully re-baselined your project! And by keeping your resources in sync with the new dates, you gain the advantages of:

  • Giving project team resources an updated view of their project bookings
  • Avoiding extra work for resource managers to change these bookings, while also allowing them to book these resources to other projects during the delay
  • Ensuring the financial forecast is accurate for your project

Simple as that: by fighting off those curveballs, you can stay in control of your project and keep everyone on track. After all, it's a lot easier to lay down a bunch of singles than it is to hit a home run!

Topics: Microsoft Dynamics 365

Sonoma Partners at D365UG / CRMUG Summit Nashville

Today's blog post was written by Ariel Upton, Marketing Manager at Sonoma Partners.

As June arrives and summer begins to spread its welcoming arms, it may pain you to turn your attention to the fall and look ahead to October. But if you’re planning to attend the D365UG/CRMUG Summit Nashville, it’s time to take a brief pause from the BBQ to take advantage of early-bird pricing (valid until June 29th, 2017).

Summit is the leading live event for Dynamics 365 users, featuring user-produced education on how to maximize the use of Microsoft Dynamics 365 and CRM. If you and your team are undecided about attending this year’s events, here are four reasons you should add Summit to your fall conference calendar.

Crmug blog 1

1. Content to benefit your business and your personal development.

We’ve all been to an event where we walk away thinking, “Well that was fun, but I’m not sure I actually learned anything meaningful.” Unlike other product conferences, the folks planning Summit go to great lengths to build a schedule filled with topics that CRM users are interested in learning about. The concentrated learning paths for Microsoft Dynamics 365 and CRM users (based on both role and skill level) include sessions focused on individual role responsibilities, industry best practices, and broader business goals. At Summit, you can engage in a tailored learning experience that best suits your individual needs. Thus, you’re guaranteed to leave with inspired ideas and actionable items that you can immediately layer into your deployment.

2. Network and swap best practices with peers.

Only at Summit does every other attendee talk the same talk and walk the same walk as you. At Summit you have authentic opportunities to meet with Dynamics peers and create connections that last after the event comes to a close. This is the perfect environment for you to talk, grow, build, and share your experiences with D365 & CRM with other users. 

3. Learn from the experts to get the greatest ROI.

Learn best practices and real world use cases from Dynamics 365 experts and Microsoft MVPs. Engage with a network of professionals throughout the User Group for Dynamics 365 & CRM who can help you uncover the best way to configure and customize, while maximizing out-of-the-box capabilities. When you compare high-quality training experiences to Summit, the ROI is unmatched. You and your team will walk away with tangible lessons to improve your implementation.

“Summit training is 3 ½ days of learning and networking. The total cost of the trip equals 9 hours of billable time. Awesome bargain!” – Joni, IT Manager, past Summit attendee

4. Spend time with Sonoma Partners.

With the sunset on Convergence, it’s inspiring to see the Dynamics 365 & CRM community pivot their energy and full attention to Summit. With year-over-year attendance growth and more active involvement than ever, we’re proud to be part of a group of people dedicated to the product’s evolution and success. Sonoma Partners is committed to helping people maximize their investment of Dynamics 365 & CRM, and we’ll be active participants and sponsors of this year’s event. Check us out at booth 811.

Music City, and the future success of your CRM deployment, is calling. Stay tuned for more announcements on our involvement at Summit 2017 and register now to take advantage of early-bird pricing.

Topics: Microsoft Dynamics 365

Best Practices for Microsoft Security Settings: Append vs. Append To

Today's blog post was written by Jen Ford, Principal QA at Sonoma Partners.

I frequently forget the difference between these two security settings. When setting up security permissions for a particular entity, I have always set them to the same values as each other, as a best practice, “just in case.” These two values do not always need to be the same, so let’s discuss how to set them properly with a few examples.

Say I am on the Contact form. I have a lookup to Account, and I have a subgrid of related Cases on the form. How can I set up my Append and Append To privileges on the Contact, Account, and Case entities so that they make sense for someone to create a Contact and associate these records?

To correctly set the Account lookup on the Contact record, set ‘Append’ on the Contact, and ‘Append To’ on the Account.

To correctly associate Cases to the Contact on the subgrid, set ‘Append To’ on the Contact, and ‘Append’ on the Case.

In this case, we are setting both Append and Append To on the Contact, but it is to provide security to fulfill two different relationships with the Contact.

If I wanted to provide the user with access to set all lookups on the Contact, and to associate any related records, this is a general rule to help set security settings:

For 1:N relationships on the Contact entity, set APPEND TO on the Contact entity, and APPEND on the related entities.

For N:1 relationships on the Contact entity, set APPEND on the Contact entity, and APPEND TO on the related entities.

In terms of other security, here are some considerations that will minimize the number of security errors you could receive when creating or updating records. Let’s return to our example of a Contact record, with a lookup to the Account and a related subgrid of Cases:

Do not set the APPEND TO privilege on the Account lower than the READ privilege on the Account. If you do, you will be able to select an Account that you do not have access to append a Contact to, and you will get a security error.

Do not set the WRITE privilege on the Contact higher than the APPEND privilege on the Contact. If you do, you will be able to set an Account on a Contact that you do not have access to append, and you will get a security error (this also depends on the READ level of the Account entity).

The same would hold true for associating a Case to a Contact:

Do not set the APPEND TO privilege on the Contact lower than the READ privilege on the Contact. If you do, you will be able to select a Contact that you do not have access to append a Case to, and you will get a security error.

Do not set the WRITE privilege on the Case higher than the APPEND privilege on the Case. If you do, you will be able to set a Contact on a Case that you do not have access to append, and you will get a security error (this also depends on the READ level of the Contact entity).
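These rules can be encoded as simple checks when reviewing a security role. Here is a hypothetical sketch (the helper name and the numeric privilege depths are illustrative, not part of any Dynamics SDK):

```javascript
// Hypothetical checker for the privilege-depth rules above.
// Depths as numbers: 0 = None, 1 = User, 2 = Business Unit,
// 3 = Parent/Child BU, 4 = Organization.
function checkLookup(parent, child) {
  const warnings = [];
  // e.g. parent = Account, child = Contact (Contact has a lookup to Account)
  if (parent.appendTo < parent.read) {
    warnings.push("APPEND TO on " + parent.name +
      " is lower than READ: users can select a record they cannot append to.");
  }
  if (child.write > child.append) {
    warnings.push("WRITE on " + child.name +
      " is higher than APPEND: users can set a lookup they cannot append.");
  }
  return warnings;
}

// A misconfigured role: org-wide READ/WRITE but only BU-level append rights.
const account = { name: "Account", read: 4, appendTo: 2 };
const contact = { name: "Contact", write: 4, append: 2 };
console.log(checkLookup(account, contact).length); // -> 2 warnings
```

The same function applies to the Case subgrid scenario by passing Contact as the parent and Case as the child.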

This is intended to be a starting point; there are always situations unique to customers who may need to set their permissions in a slightly different way. For example, Activities require both APPEND and APPEND TO to be set in order to successfully create the records and associate them to other entities. As always, thoroughly test your security roles to ensure they are fully functioning.

Topics: Microsoft Dynamics 365

MSCRM Database Maintenance and Configuration Tips Checklist

Today's blog post was written by Rob Jasinksi, Principal Developer at Sonoma Partners.

If you are running Microsoft Dynamics CRM On-Premise, you also have to maintain and support the database that it uses. In this post, I have put together a list of some basic best practice tips that you should consider for your local CRM databases.

MSCRM uses a database for each organization plus a general configuration database that all organizations share. The naming conventions are generally [OrgName]_MSCRM for the organization database plus MSCRM_CONFIG for the configuration database. This article will focus on the organization database, but the same tips and best practices can also be applied to the configuration database.

Configuration Tips

  • Have the database data files (mdf) and log files (ldf) on separate disks or volumes. Both files are heavily used during normal CRM use and having the files on separate disks will spread the disk I/O across both disks, giving better overall performance.

  • Change the database auto grow setting to a higher value than its default of 1MB. Growing a database is I/O intensive, and you want to avoid it when possible, especially during business hours. I usually use a value of 256MB so growth happens less frequently, but if your database grows faster or slower, this value can be adjusted up or down. Another option is to turn off auto grow and grow the files manually, only when needed and during off hours. However, this requires monitoring – if you forget, the database will fill up and CRM will degrade or stop functioning altogether.

Rob jas 1

  • For organizations strictly used for development (non-production) purposes, consider changing the recovery model to Simple. This model doesn’t keep a history of transactions, which keeps the transaction log from growing until it fills the entire disk. Since data is constantly flowing in and being deleted for re-testing, the log can fill up fast. For most development environments, daily full or differential backups are sufficient (unless there is a specific reason not to use Simple – for example, to match production).

Rob jas 2

Maintenance Tips Checklist

  • Make sure regular backups are being done. This sounds pretty basic, but after installing CRM, it’s possible that no one actually set up a daily backup plan. You can check by right clicking the database and choosing Reports -> Standard Reports -> Backup and Restore Events.

  • Defragment or rebuild the indexes on a regular basis. As data is added, updated, and deleted from CRM, the underlying indexes become fragmented over time. This can cause performance issues that only get worse. You can see whether your indexes need maintenance by right clicking the database and choosing Reports -> Standard Reports -> Index Physical Statistics. This generates a report listing every index in the database, with a recommendation in the right-hand column if an index needs to be rebuilt or reorganized. If you see a lot of these, it’s time to defragment those indexes.

  • Make sure you run CHECKDB against the database on a regular basis. This checks the consistency of the database and finds any problems that need to be looked at. It’s better to find problems early rather than later, after they grow to the point where the database won’t even start. In that case, you may have to restore from the last backup. Note: only non-data loss CHECKDB operations are supported by CRM.

  • Check the size of your database files and the amount of disk space you have left at a regular interval. I have seen many cases where a database continues to grow until it fills the entire disk and cannot grow anymore. Resolving this issue can be time consuming, and your system will be down the entire time, so it’s better to catch it early and fix it.

  • Look for any table growing out of control in size; it may indicate a more serious problem in your CRM. It’s common for some tables, like AuditBase and Activity[Pointer/Party]Base, to grow large. But if the contact table grows to a million records when you think you should only have 50,000 contacts, it may indicate that many duplicates have made their way into the system – something to investigate, both how they got in and how to clean them up. To get a report of large tables, right click the database and choose Reports -> Standard Reports -> Disk Usage By Top Tables.
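As a sketch in T-SQL, the index maintenance and consistency checks above might look like this (the table name and organization database name are illustrative, and the fragmentation thresholds are common guidance rather than hard rules):

```sql
-- Rebuild all indexes on a heavily fragmented table
-- (guideline: reorganize at roughly 5-30% fragmentation, rebuild above 30%):
ALTER INDEX ALL ON dbo.ContactBase REBUILD;

-- Reorganize instead when fragmentation is moderate:
ALTER INDEX ALL ON dbo.ContactBase REORGANIZE;

-- Consistency check (non-data-loss, as required for CRM support):
DBCC CHECKDB ('YourOrg_MSCRM') WITH NO_INFOMSGS;
```

Schedule these during off hours, since rebuilds in particular are I/O intensive.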

Hopefully these tips will help keep your CRM database in top condition and performing at its best. If I get feedback below asking to dive into specific topics in more detail, I might write those up as separate posts later.

Topics: Microsoft Dynamics 365

Accelerate Your Dynamics 365 to Power BI Analytics Path

Today's blog post was written by Keith Mescha, Principal Architect at Sonoma Partners.

If you have tried to connect to your Dynamics 365 org via the OData API feed through Power BI, you probably struggled to make sense of how to build something useful from that experience. I’m sure you have read all the blogs out there suggesting you sync the data down to a SQL database and build your reports from there. That is often a decent solution, but it requires additional setup, configuration, and licensing, and it gets beyond the abilities of many Dynamics admins. We do support that approach and have delivered several solutions using this method, including for our own reporting needs. However, we also felt there is a need for a better, quicker option for customers.

To that end, we sat down and designed an accelerator for Dynamics to make this a reality.

Why is this a better option?

  1. You can use existing Dynamics system or personal views to auto-create Power BI queries. We don’t limit you to this – you can also augment those views with other fields – but they serve as a good starting point. Not sure what you need in your report? Use Advanced Find to get close, save that as a view, and then build it out more as you go.

  2. Option sets and lookup values are automatically mapped for you in the query. When hitting the OData API directly, by default you get the integer values and GUIDs of option sets and related records. Relinking those related tables and values requires a lot of additional, tedious work that is hard to maintain over time. With this tool, we do that mapping for you based on the Dynamics configuration.

  3. All fields are renamed to the labels as you have defined them in your Dynamics configuration. This cuts down on a bunch of the work typically required when pulling in data other ways, since schema names are what usually come over. No more typos or copy-paste errors, and the reports you build will have the same field names users see in Dynamics on forms and views.

  4. Also, when using the well-documented FetchXML query options in the WebAPI, you are limited to 5,000 records per query. We have written some sweet M code to overcome that limit with paging. Best of all, you can deploy this to the Power BI Service and schedule updates of your data, so you are not limited to just Power BI Desktop.
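The paging trick in item 4 is conceptually simple: each WebAPI response for a page-limited query carries an @odata.nextLink URL, and you keep following it until it is absent. Here is a sketch of that loop in plain JavaScript (fetchPage stands in for the real HTTP call; the names are illustrative, not our actual M code):

```javascript
// Page past the 5,000-record WebAPI limit by following @odata.nextLink
// until no further link is returned.
function fetchAll(fetchPage, url) {
  const records = [];
  let next = url;
  while (next) {
    const page = fetchPage(next);           // one WebAPI response
    records.push(...page.value);            // accumulate this page's rows
    next = page["@odata.nextLink"] || null; // follow the link, if any
  }
  return records;
}

// Tiny fake API returning two pages of accounts:
const pages = {
  "/accounts": {
    value: [{ name: "A" }, { name: "B" }],
    "@odata.nextLink": "/accounts?page=2",
  },
  "/accounts?page=2": { value: [{ name: "C" }] },
};
const all = fetchAll((u) => pages[u], "/accounts");
console.log(all.length); // -> 3
```

Our accelerator does the equivalent in Power BI's M language so the paging survives deployment to the Power BI Service.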

*Currently this solution works in Power BI Desktop for Online and on-premises IFD-enabled Dynamics orgs. However, if you deploy your pbix file to the online Power BI Service, you will only be able to schedule data refreshes if you have a Dynamics 365 Online org.

Overall, the main goal here is to let you spend more time in Power BI building cool charts and analysis to solve your challenging business problems, and less time plumbing and wrangling your data into Power BI from Dynamics.

Here is a quick walkthrough of what this solution provides.

Once you install our managed solution into your org, you can access the tool via the solutions area of Dynamics.

Keith 1

Upon launch, you can pick any entity and any view saved against that entity. In this example, we are using the out-of-the-box Active Accounts view. After selecting that view, I can choose additional attributes or even select the “All Attributes” option and pull all fields. I will caution that the more you pull, the slower and harder your data will be to work with in Power BI, so the best practice is to be as selective as you can in building out a query.

Keith 2

Once you have all the attributes you need, click the Generate button.

Keith 3

At this point we present a screen where you can copy queries into the Power BI Desktop application. The first item is the URL for the API of the org you are in. Be diligent about the version at the end of the URL; for Dynamics 365 it will be v8.2. This query serves as the anchor for the others and should be created first in Power BI, as the other queries rely on it.

Keith 4

Copy this URL into a blank query in Power BI Desktop and rename that query “CRMServiceUrl”. Casing is important, so be sure to enter it exactly as shown here.

Keith 5

On the Accelerator page, hit the Next button to move on to the next query, which is the main data query for the Account view we selected. As you can see, there is quite a bit going on here. This step executes a FetchXML query through an OData query against the Web API. The result comes back as a JSON document, which we parse into a table using Power BI's M language, rename all the fields to the labels as defined in your org configuration, and finally return as a usable data set, essentially doing all the heavy lifting of pulling your data out of Dynamics for you so you can focus on building reports.
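The schema-name-to-label renaming described above can be illustrated in plain JavaScript (the real work happens in M; `labelMap` here is a hypothetical stand-in for the label metadata read from your Dynamics configuration):

```javascript
// Rename each record's keys from schema names (e.g. address1_city) to the
// display labels defined in Dynamics; keys without a mapping pass through.
function renameFields(records, labelMap) {
  return records.map(record => {
    const renamed = {};
    for (const [schemaName, value] of Object.entries(record)) {
      renamed[labelMap[schemaName] || schemaName] = value;
    }
    return renamed;
  });
}
```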

Keith 6

Paste that query into Power BI as another blank query and rename it “Accounts.” Notice that this query references the CRMServiceUrl query we created first.

Keith 7

At this point you should have a valid data set with fields that match your labels on the entity as set up in Dynamics. You will need to follow the authentication prompts and log in to your org to refresh the data.

Keith 8

You can build whatever reports you need at this stage in Power BI. If you want to pull other related data from Dynamics, you can do that and model the relationships in Power BI as needed.

Keith 9

Here I have created a simple chart counting the number of accounts by city. I then deploy that solution to My Workspace in the Power BI Service. At this point I can go into my deployed data set and schedule a refresh for my data.

Keith 10

From here I can enter credentials for my org. Note that the data refresh will use that account when making calls to the API, so be sure to use an account that has the rights to read all the data.

Keith 11

Choose OAuth2 as the authentication method, which, from what we have found, is currently only available with Online orgs. We have tried on IFD-enabled orgs and do not get the OAuth2 prompt that allows us to configure the refresh. You will be prompted to enter your credentials on a few other pages not shown here. After this step is complete, your data will start refreshing on the schedule you set.

Keith 12

You can check the Refresh history from the dataset area in the Power BI Service.

Keith 13

From here you can keep building out additional queries using our solution and copying those queries into your Power BI Desktop file. After you make changes, deploy them to the Power BI Service for use in dashboards.

We have made this Dynamics Managed Solution available on our website for you to try out.

*Please make sure that you have downloaded and are always using the latest version of Power BI Desktop when using this solution.

If you'd like to see this solution in action, you can watch our demo video here.

We are happy to assist with your BI projects and have a team of data professionals ready to help as you extend your Dynamics system to other O365 offerings like Power BI.

Topics: Microsoft Dynamics 365

Form Script Wars: Revenge of the Events

Today's blog post was written by William Dibbern, Principal Developer at Sonoma Partners.

A long time ago, in a Dynamics 365 environment not that far away...

An odd bug in a client's pre-existing codebase popped up recently where we found that the same event handler was occasionally firing twice. An easy-to-digest scenario to replicate this: every time the phone number field changes, a field containing a phone number revision count is incremented by one. In this scenario, after a record had been created, if you then changed the phone number field, the revision field would increment by two instead of one.
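As a minimal sketch of that scenario (using a hypothetical attribute object rather than the real Xrm form API), the change handler looks something like:

```javascript
// Bump the revision-count attribute by one each time the phone number changes.
// `revisionAttribute` is an illustrative stand-in exposing getValue()/setValue()
// like a form attribute would.
function onPhoneNumberChange(revisionAttribute) {
  const current = revisionAttribute.getValue() || 0;
  revisionAttribute.setValue(current + 1);
}
```

Attached once, one change means one increment; the bug comes from this handler being attached twice.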

Dibbs 1a

What was the underlying issue? The event handler was firing twice because it was actually being attached twice: the handler was attached to the field in question through code, in the form load event handler.

Dibbs 1

You see, the form load event can indeed fire multiple times in the lifecycle of a form, but only twice at a maximum, and only in one situation. When you open a create form, the form load event fires. Then, when you save the record and it switches to an update form, the form load event fires again. This is the only time this happens. Therefore, if you were to open an existing record, this issue would not be reproducible.
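Here is a self-contained sketch of why that matters, with a tiny event emitter standing in for the form field (illustrative only, not the Dynamics API). Attaching the same handler on each load means a single change fires it twice:

```javascript
// Minimal stand-in for a form field that supports change handlers.
class Field {
  constructor() { this.handlers = []; }
  addOnChange(fn) { this.handlers.push(fn); }
  fireChange() { this.handlers.forEach(fn => fn()); }
}

let revisions = 0;
const incrementRevision = () => { revisions += 1; };

const phone = new Field();
phone.addOnChange(incrementRevision); // form load #1: create form
phone.addOnChange(incrementRevision); // form load #2: after the first save
phone.fireChange();                   // a single change to the field...
// ...and `revisions` is now 2 instead of 1.
```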

What all of this means is that you can't count on the form load event to fire only once, but you also can't lean on it to fire every time the user saves. We therefore must account for this inconsistency and leverage other methodologies if we need to do something every time the user saves.

Don't worry young Padawan, we can fix this!

Dibbs 2a

You bet Ani! Just keep reading...

I will teach you everything I know, and you will become a far greater developer than I could ever hope to be...

So how do we solve this conundrum?

Dibbs 2

One possible solution you might think of would be to use the event registration dialog in the form customizations. In other words, use the Dynamics UI to register the events. While this is not an altogether horrible idea, we would not recommend this approach. Why is that, you ask? There are many benefits to registering your events in code instead, such as:

  1. You can see all of your events that are registered on that form, all in one place (you don't need to go to the Dynamics environment to see what's attached to what).
  2. You can leverage source control to more easily see how event registrations changed over the lifespan of the codebase.
  3. Registering your events in code should feel natural: when developing in JavaScript outside of Dynamics, we attach our events in code. We almost never attach them in the presentation (HTML) layer. Consistency, woo!

Given all of those benefits, we need to look for another approach. Good thing there is one! Let's instead use a variable to track when the events have been attached. The process: check a variable like isFormLoaded before attaching events; if it's false, go ahead and attach your events, then be sure to set isFormLoaded = true when you're done.
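A minimal sketch of that guard (the names here are illustrative; `attachEvents` stands in for your real event registration code):

```javascript
// Only attach handlers the first time the form load event fires; subsequent
// loads (e.g. the create-to-update transition) skip re-attachment.
var isFormLoaded = false;

function onFormLoad(attachEvents) {
  if (!isFormLoaded) {
    attachEvents();
    isFormLoaded = true;
  }
}
```

Calling onFormLoad twice, as happens across the create-to-update transition, attaches the events only once.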

Dibbs 2b

Yep, it's that simple. Just remember to scope your variables appropriately so as not to leak that new isFormLoaded variable of yours into the global scope, dodging another potential bug!

Dibbs 3

Take this information and use it wisely, young Padawan. With this knowledge, prevent many bugs, we shall.

Dibbs 4

Topics: Microsoft Dynamics 365