Sonoma Partners Microsoft CRM and Salesforce Blog

Running Server Side Code from the Microsoft Portal (Online)

Today's blog post was written by Matt Dearing, Principal Developer at Sonoma Partners.

Executing server side code from the Microsoft Portal (online) has some limitations. By default, CRUD plugins and asynchronous workflows can be triggered. However, if you want to add your own custom button to the portal and have it execute synchronous server side code, or if you have a liquid template where you need server side logic to determine what to render, a pattern we call "Web Service Plugins" will work.

When web resources became part of the CRM platform (back in CRM 2011), we used a pattern we internally called "Web Service Plugins" to execute server side code. The basic idea was to have an entity called prefix_webservice, with no instances persisted in CRM, and make "RetrieveMultiple" requests against it. The entity would have two fields: prefix_logicclass and prefix_data. We'd then register a plugin on post "RetrieveMultiple" of this entity that would look at the query's prefix_logicclass to determine which server side class to instantiate and execute, passing along the value in prefix_data (generally serialized JSON) to that logic class. The benefits included bundling HTTP calls, reuse of server side logic, and avoiding timeouts (we could batch calls from the client and make sure each ran in under 2 minutes). Custom actions have since replaced "Web Service Plugins" for us in CRM, but the portal does not allow direct execution of custom actions. Luckily, we can leverage the "Web Service Plugins" pattern from the portal.

Take the following example where we will call a custom action "SomeAction" from the portal. There are a few parts we need to configure to make this happen. First the liquid template:
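The original post embedded the template itself; a minimal sketch along the lines the post describes is below. The entity, field, and class names (prefix_webservice, prefix_logicclass, SomeActionLogic, the data parameter) are illustrative; the fetchxml tag and request.params are portal liquid features.

```liquid
{% comment %} Web page template "SomeAction"; Mime Type: application/json.
   Entity, field, and class names are illustrative. {% endcomment %}
{% fetchxml result %}
<fetch>
  <entity name="prefix_webservice">
    <attribute name="prefix_data" />
    <filter>
      <condition attribute="prefix_logicclass" operator="eq" value="SomeActionLogic" />
      <condition attribute="prefix_data" operator="eq" value="{{ request.params['data'] }}" />
    </filter>
  </entity>
</fetch>
{% endfetchxml %}
{% for row in result.results.entities %}{{ row.prefix_data }}{% endfor %}
```

The filter conditions are how the logic class name and its input reach the plugin; the loop writes out whatever the plugin put into the query results.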

You'll notice this liquid template executes a fetch and writes out a result. The "Mime Type" of this template should be set to "application/json".

Next is javascript you would add to the portal to execute the liquid template:
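The original post embedded the script; a sketch of the idea is below. The partial path, input names, and handler are illustrative, and the jQuery call is shown in a comment.

```javascript
// Build the URL for the liquid template's web page. The throwaway
// "stopCache" timestamp defeats the portal's output caching, so the
// fetch -- and therefore the plugin -- actually runs each time.
function buildServiceUrl(partialPath, params) {
  var pairs = Object.keys(params).map(function (key) {
    return encodeURIComponent(key) + "=" + encodeURIComponent(params[key]);
  });
  pairs.push("stopCache=" + new Date().getTime());
  return "/" + partialPath + "/?" + pairs.join("&");
}

// In the portal this URL would be handed to jQuery, e.g.:
// $.ajax({
//   url: buildServiceUrl("SomeAction", { input1: "foo", input2: "bar" }),
//   method: "GET"
// }).done(function (result) {
//   // show a button, hide a grid, alert a message, etc.
// });
```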


It's a simple ajax call leveraging jQuery. It makes a GET request (you could also POST) to the liquid template's web page (with a partial path of "SomeAction"), passing a couple of inputs and a "stopCache" value. The "stopCache" value helps ensure the output of the liquid template won't be cached, but will instead actually execute the fetch, and therefore our plugin. When the result comes back, a button could display, a grid could hide, a message could be alerted, etc.

Next is the Web Service Plugin that handles routing the request:
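The original post embedded the plugin source; a sketch of the routing idea is below. The namespace, prefix, and class names are illustrative, and depending on how the portal issues the query, the "Query" input parameter may arrive as a FetchExpression whose XML you would parse instead of a QueryExpression.

```csharp
using System;
using System.Linq;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// Common contract every logic class implements (an abstract base class).
public abstract class WebServiceLogic
{
    public abstract string DoWork(IOrganizationService service, string data);
}

// Registered on a single post-operation "RetrieveMultiple" step
// for the prefix_webservice entity.
public class WebServicePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        // Pull prefix_logicclass and prefix_data out of the incoming query.
        var query = (QueryExpression)context.InputParameters["Query"];
        var conditions = query.Criteria.Conditions;
        var logicClass = (string)conditions.First(c => c.AttributeName == "prefix_logicclass").Values[0];
        var data = (string)conditions.First(c => c.AttributeName == "prefix_data").Values[0];

        // Instantiate the logic class by name (reflection) and run it.
        var type = Type.GetType("MyPlugins." + logicClass);
        var logic = (WebServiceLogic)Activator.CreateInstance(type);
        var result = logic.DoWork(service, data);

        // Hand the serialized result back as a row in the query results.
        var output = (EntityCollection)context.OutputParameters["BusinessEntityCollection"];
        var row = new Entity("prefix_webservice");
        row["prefix_data"] = result;
        output.Entities.Add(row);
    }
}
```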


This plugin has a single step registered on post "RetrieveMultiple". The plugin starts by getting the prefix_logicclass and prefix_data values from the query. It then dynamically instantiates the logic class by name (leveraging reflection) and calls a specific method (in this case "DoWork"). Next, it JSON serializes the result for return to the client that made the request. For this to work, all of the logic classes need a common interface (in this case an abstract base class). As new logic classes are added, no change needs to be made to the plugin class or its registered steps.

Finally here is an example of a logic class:
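The original post embedded the class; a sketch is below. The action name, its Input/Output parameter names, and the base class shape are illustrative (the base class is repeated here so the sketch stands alone).

```csharp
using Microsoft.Xrm.Sdk;

// Common contract the routing plugin dispatches to.
public abstract class WebServiceLogic
{
    public abstract string DoWork(IOrganizationService service, string data);
}

// Wraps the custom action "prefix_SomeAction".
public class SomeActionLogic : WebServiceLogic
{
    public override string DoWork(IOrganizationService service, string data)
    {
        var request = new OrganizationRequest("prefix_SomeAction");
        request["Input"] = data;                 // action input parameter

        var response = service.Execute(request);
        return (string)response["Output"];       // action output parameter
    }
}
```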


This method executes a specific action named "prefix_SomeAction", passes along the input, and returns the output. It could be made more generic to support many custom actions, especially if those actions use generic string inputs and outputs containing JSON serialized data.

Although not as straightforward as making CRM service calls directly from the portal or writing server side code there, a pattern like the above still gives us the ability to execute logic to help extend our customers' portal implementations.

Topics: Microsoft Dynamics 365

Living Large with Salesforce CPQ Summer '17

Today's blog post was written by Troy Oliveira, Principal Architect at Sonoma Partners.

Salesforce CPQ released their Summer ’17 upgrade this month.

Did you hear about it? My guess is probably not. It appears that the Summer ’17 release came without pomp and circumstance, which I think is a real shame.

For those of you who want to dive directly into release notes, you’ll have to look in a couple places because Summer ’17 is actually a combination of both Spring ’17 and Summer ’17. Confusing, I know. But for those of you paying attention, you may have noticed that there never was a Spring ’17 release, just a series of updated releases of Winter ’17.

The big-ticket item for Summer ’17 is Quote Self-Service for Partner Portals. While this is a huge step towards making CPQ less about the Salesperson and more about the end-user, it is starting to feel a bit too eCommerce-y to me. Hopefully this doesn’t start to blur the lines between Salesforce CPQ and Commerce Cloud in the future. Or maybe it’s the first step towards a perfect marriage, who am I to judge?

For me, the big-ticket features are listed in the Spring ’17 release notes. The list is comprehensive, ranging from theming options to advanced order management. If either of those functionalities are important to you, look at the release notes for more information on what they do and how to enable them.

What I’m most interested in is what this release means for customers who have large quotes.

Aptly named, Salesforce CPQ Large Quote Experience introduces a whole new way to bring your quotes into existence. To see the power of large quotes in action, let me paint a picture.

As a Sales Rep, I am consistently creating quotes that have several hundred consumer-oriented products. To provide a quality quote for my customer, I need to have several points of reference data about my products in-line so that I don’t have to keep flipping back and forth between my product catalog and CPQ.

Without the Large Quote Experience, I have an endless number of lines that scroll both vertically and horizontally.

Troyo 1_750

Troyo 2_750

This isn’t the end of the world as I’m used to being browser-based for my Salesforce needs.

However, I noticed a couple things.

  • Once I have more than a few lines on the quote, the Summaries are no longer visible and they are pushed “below the fold”.
  • When I scroll down to view my items lower in the list, I no longer have the headers and am easily confused by which column I am trying to update.

Troyo 3_750

Enter the Large Quote Experience. My System Administrator can update the CPQ Package Settings and “Enable Large Quote Experience.”

Troyo 4_750

Once enabled, the issues of not being able to see my Summaries and losing my column headings are a thing of the past. Now I can scroll through my quote lines and have visibility into the overall context of my Quote. No more accidentally updating the wrong field because I don’t know what I’m looking at.

Troyo 5_750

You may have noticed that the space my quote header and footer have added to my screen has limited the number of visible lines. Don’t worry, Salesforce CPQ also took this into consideration. Notice the button in the top-right corner of the header. This button puts CPQ into Full Screen mode, getting rid of the unnecessary junk that stands between me and my quote.

Troyo 6_350

Exciting if I do say so myself. But wait, there’s more.

Not only did the Large Experience update bring the ability to keep better track of quote lines, but it also introduced Quote Line Editor Drawers. With Drawers, I can reduce the clutter of all the columns that I am using purely for reference sake, allowing me to put them into a “drawer” whenever I don’t need to see them, and open the drawer whenever I do. This keeps my quote editing more focused on the end price and less focused on scrolling.

Troyo 7_750

Troyo 8_750

Only you can figure out the right balance of which fields to put in the editor versus which fields you hide away in the drawer, but for me, this adds a whole new dimension to being able to quote efficiently.

This just scratches the surface of what the Summer ’17 release of Salesforce CPQ has done. Look through the release notes in the links below for more details.

Release notes:

New to Salesforce CPQ? Needing some help? Contact us, and we’ll give you a helping hand.

Topics: Salesforce

Lightning Locker Service and You: Working with SecureDOM

Today's blog post was written by Nathen Drees, Senior Data Integration Developer at Sonoma Partners.

One of Salesforce’s recent major releases includes the forced enablement of the Lightning LockerService. While Salesforce has done its best to give plenty of warning about the LockerService being force-enabled, we still see some of our partners and developers confused about what exactly it is and whether it affects them.

Today we wanted to review what the LockerService is, provide context for when it will affect you, and some recommendations on working within it.

What is the Lightning LockerService?

When Salesforce released the Lightning Experience, it had a vision for the AppExchange: developers could write lots of tiny components, and administrators could install them in their org, promoting code reuse and reducing the need to hire for custom development. It was a win-win proposition for both parties – developers have more chances to sell their individual widgets, while administrators have more options to choose from. But this vision comes with a problem: for this exchange to work, Salesforce needs to guarantee that the various components installed by administrators stay safe from one another, to prevent (accidental or otherwise) runaway scripts from causing too much damage. Unfortunately, browsers don’t natively provide a way for multiple scripts within a page to be sandboxed from one another – everyone has access to (more-or-less) everything on the page.

SFDC_Lightning_Locker_service_blog

Enter Lightning LockerService

This is where the Lightning LockerService comes into play. It gives each lightning component a “locker” that it must fit within, stricter guidelines on what it’s allowed to do, and rules for how it can communicate with other components.

There’s a comprehensive introduction to the technical aspects of the LockerService on the Salesforce blog, but the key idea to understand is that components can no longer access everything on the page. If components wish to communicate, they need to coordinate in some agreed-upon manner (such as application events).

Does the Lightning LockerService Affect Me?

If you write Lightning Components, the Lightning LockerService affects you.

Going forward (API version 40+), you will need to ensure that your components adhere to the LockerService’s rules or you may find that they no longer function as expected. However, adhering to the LockerService’s rules isn’t as difficult as it may seem.

Working Within the Lightning LockerService

If you are sticking to the native APIs provided by Lightning Components, enabling the LockerService doesn’t change much for you during development. The main area it impacts is when you try to navigate out of your current component into another – this may no longer work as expected, specifically if those other components are not in the same namespace as yours.

In an attempt to help developers ensure that they’re staying within the guidelines (and also pass a security review), Salesforce released the Salesforce Lightning CLI, which lints the components for known errors and mistakes. Using this tool liberally will help to avoid the common cross-component scripting mistakes that may break your components going forward.

If you do need to work across components, you need to start designing and writing events to let the components communicate amongst themselves. Depending on the structure of the components, you’ll want to consider either an application event or a component event.
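An application event can be sketched roughly as follows; the event, attribute, and handler names are illustrative (the Aura markup syntax is standard, but check your own component bundle names).

```html
<!-- c:notifyEvent.evt: an application event any component can fire -->
<aura:event type="APPLICATION" description="Cross-component notification">
    <aura:attribute name="message" type="String"/>
</aura:event>

<!-- In the receiving component's markup: -->
<aura:handler event="c:notifyEvent" action="{!c.onNotify}"/>

<!-- In the sending component's controller (JavaScript):
     var evt = $A.get("e.c:notifyEvent");
     evt.setParams({ message: "record saved" });
     evt.fire();
-->
```

Because both sides only agree on the event's name and attributes, neither component reaches into the other's DOM, which is exactly what the LockerService requires.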

Wrapping Up

There are of course many more tricks for making sure your components continue to work with the LockerService, most of them being not very difficult to implement. If you would like help reviewing and/or correcting your components, or are looking to launch your product on the platform but could use the help of experienced developers, contact us and we can help.

Topics: ISV for CRM Salesforce

Managing Multi-Domain Mailboxes in Dynamics 365

Today's blog post was written by Neil Erickson, Principal Developer at Sonoma Partners.

We often get asked about the viability of complex CRM scenarios. Below is one such scenario:

The customer had multiple instances of Exchange running in multiple domains. Both Dynamics CRM users existed as Active Directory users on the same domain, and both existed as users within Dynamics CRM. One user was set up in the primary Exchange instance. The other was set up as a mail contact with an email address outside the primary Exchange domain. We were asked whether it was possible for both users to be email-enabled within CRM.

The first step was to determine the sync method to use for the mail contact user, which was easy in this case: the preferred option was for the user to sync with CRM via the Microsoft Dynamics for Outlook plugin. From there, we followed these steps to test out the account.

Step 1: Creating the users

Primary Domain: sonomacrm1.local

Primary Exchange server: sonoma-crm1exch.sonomacrm1.local

CRM server: sonoma-crm1.sonomacrm1.local

Secondary Exchange domain (for mail contact user): sonomacrm2.local

First user: Control user

AD: SonomaTest1

Domain: sonomacrm1.local

Exchange Mailbox: SonomaTest1@sonomacrm1.local

CRM user: Sonoma Test 1

Second user: Mail contact user

AD: SonomaTest2

Domain: sonomacrm1.local

Mail contact user email address: SonomaTest3@sonomacrm2.local

CRM user: Sonoma Test 2

Step 2: Adding the users to CRM

Both users were added from their existing AD users using the New Multiple Users function. CRM correctly imported their information and email addresses.

Step 3: Setting up user's mailboxes in CRM

Both users' mailboxes had to have their email addresses verified, then approved. Once approved, both mailboxes were activated in CRM to mark them as active.

Step 4: Setting up the Microsoft Dynamics CRM for Outlook plugin

This step consisted of setting up the plugin for each user on separate client machines. Both connected without issue. The setup for the control user was straightforward. The caveat was pointing Outlook to the right mailbox login for the mail contact user. In this case, we connected it to SonomaTest3@sonomacrm2.local; Outlook attempts to connect to that mailbox but initially passes the password from the SonomaTest2 account. Correcting that is an easy but critical part of the setup, since SonomaTest2 is used to authenticate to the domain and to the CRM instance but not to Exchange.

Once they are connected properly to Exchange, configuring the CRM for Outlook plugin is as simple as plugging in the correct CRM organization URL.

Step 5: Testing out emails, appointments, and tracking

For testing, we developed scenarios that would work seamlessly in the control environment for SonomaTest1. These included:

  • Tracking an existing email in CRM from Outlook
  • Composing an email from the CRM dialog and tracking it
  • Composing an email from the CRM dialog and tracking it with a regarding field set
  • Tracking an existing Outlook appointment in CRM
  • Composing a new meeting request from the CRM dialog in Outlook, tracking it, and setting a regarding field.
  • Composing an email to a contact from within CRM

These all worked without incident in both the control environment and in the environment for the secondary domain. Monitoring SonomaTest3's external mailbox confirmed emails and appointments were being sent out.

Topics: Microsoft Dynamics 365

Create Dynamics 365 Plugins with VS Code and the .NET Core SDK

Today's blog post was written by Ian Moore, Senior Developer at Sonoma Partners.

All the buzz in the Microsoft development ecosystem in recent months has been about .NET Core. If you haven’t been able to keep up with all the new technology and terms, here is a quick recap:

  • .NET Framework – The version of .NET you know and love. It is installed at the operating system level and runs on Windows.
  • .NET Core – A new implementation and runtime of .NET that is cross-platform across Windows, Linux, and MacOS. Applications that target .NET Core include the necessary framework assemblies with the application.
  • Xamarin – A .NET implementation that runs on mobile devices with Android or iOS.
  • .NET Standard – A specification of .NET APIs – things like System.Xml, System.Linq, etc. – that are available on a given .NET runtime. Code that targets .NET Standard can run on .NET Framework, .NET Core, and Xamarin.
  • ASP.NET Core – This is the latest version of the ASP.NET web application framework. It can run on both .NET Core and .NET Framework.
  • .NET Core SDK – This is a new, cross-platform, standalone SDK for .NET applications. It includes a new project file format, NuGet enhancements, and a command line tool that can build and run .NET projects. This SDK can build projects for .NET Core, .NET Standard, and .NET Framework.

Okay, not so quick after all – but there’s a lot of new and great things happening in the .NET world these days. What do these .NET Core developments mean for developers who work with Dynamics 365? Since the Dynamics SDK Assemblies are still built for .NET Framework, we can’t use them to build anything cross-platform. But we can use the new .NET Core tools to develop plugin assemblies and console apps that connect to Dynamics 365. With Visual Studio Code, a new text editor from Microsoft, we can do all the development we need without ever opening Visual Studio.

Here’s what you need to get started:

1. Install the .NET Core SDK from Microsoft - https://www.microsoft.com/net/core#windowscmd

2. Open a command prompt and run ‘dotnet’ to confirm the SDK command line tools were installed successfully.

3. Install Visual Studio Code - https://code.visualstudio.com/

4. Start Visual Studio Code and open the Extensions menu. Search for the ‘C#’ extension and install it.

With those installed, we can make a Dynamics 365 plugin project with a few simple steps.

1. Open VS Code and go to File -> Open Folder and choose a new folder for your plugin source code.

2. Open the VS Code Integrated Terminal from the View menu or using Ctrl+` . This terminal will be PowerShell or cmd depending on your VS Code settings. https://code.visualstudio.com/docs/editor/integrated-terminal

3. In the terminal, run dotnet new classlib to create a new Class Library project in your current directory.

Ian moore blog 1

4. The command places two files in your directory: a stub .cs class file, and a .csproj file named after your directory. Open the .csproj file in the editor.

5. If you have ever looked at a csproj file in Visual Studio before, you will immediately notice the new format is significantly smaller. However, we need to change the TargetFramework to .NET Framework 4.5.2 to match the Dynamics SDK.

Ian moore blog 2

6. Now we need to add the Dynamics SDK package from NuGet. In the terminal run dotnet add package Microsoft.CrmSdk.CoreAssemblies. You will see this command adds a PackageReference to your project file.

7. Run dotnet restore to resolve the package reference.

Ian moore blog 3

8. Rename the Class1.cs file to MyPlugin.cs and open the file in the editor. Try making a simple plugin and notice that you have full intellisense support.

Ian moore blog 4

Here is an example plugin I wrote:

Ian moore blog 5
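For reference, a minimal plugin along the same lines might look like the following sketch (the entity and attribute are illustrative, not the code from the screenshot):

```csharp
using System;
using Microsoft.Xrm.Sdk;

namespace CorePlugins
{
    // Minimal example: stamp a description on any account passed to the step.
    public class MyPlugin : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            var context = (IPluginExecutionContext)serviceProvider
                .GetService(typeof(IPluginExecutionContext));

            if (context.InputParameters.Contains("Target") &&
                context.InputParameters["Target"] is Entity target &&
                target.LogicalName == "account")
            {
                target["description"] = "Created via a .NET Core SDK-built plugin";
            }
        }
    }
}
```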

9. Back in the terminal, run dotnet build to build the project. You will find the output DLL in the bin/Debug/net452 folder.

Ian moore blog 6

10. There is still one more step required before we can register the assembly in Dynamics 365. We need to give the assembly a strong name using the strong name tool (sn.exe).

11. If you have a strong name key file already, you can skip this step. Otherwise, search for “Developer Command Prompt for VS2015” in the Start Menu and create a new key file with the command sn -k [YourProject].snk.

Ian moore blog 7

12. Back in VS Code, add the following lines to your csproj file within the Project element:

<PropertyGroup>
    <SignAssembly>true</SignAssembly>
    <AssemblyOriginatorKeyFile>CorePlugins.snk</AssemblyOriginatorKeyFile>
</PropertyGroup>

13. Your assembly will now have a strong name. Run dotnet build again from the command line to rebuild your project.

14. Now you can register your DLL with Dynamics 365 through the Plugin Registration Tool.

Ian moore blog 8

With these steps complete, you can continue to add new plugin types by adding .cs files and including them in the .csproj file. This is exciting news for Dynamics 365 developers - it allows you to have all the tools needed to build plugins with the fully configurable VS Code editor. While Visual Studio will continue to be the go-to development solution for Dynamics development, it is nice to have a new and reliable option moving forward.

Topics: Microsoft Dynamics 365

Dynamics 365 Demo Video: Power BI Query Accelerator

Today's video and blog post were created by Kristian Altuve, Business Analyst at Sonoma Partners.

Microsoft Power BI combines a collection of software services, apps, and connectors to turn your complex sources of data into a visual, interactive format. However, when connecting to Dynamics CRM, there is still a lot of data cleaning needed before the data is in a user-friendly state for building dashboard components. Power BI Query Accelerator was designed by Sonoma Partners to get your Power BI environment set up in a fraction of the time! Check out the video below to learn more, or go directly to our download page here.

Topics: Microsoft Dynamics 365

Handling a Project Curveball with Project Service Automation and MS Project

Today's blog post was written by Nick Costanzo, Principal Consultant at Sonoma Partners.

For the last several months, Dynamics 365 has allowed project managers to use the MS Project Add In to link project plans to Project Service Automation (PSA) and manage their projects. The instructions here walk you through how to install, configure, and manage your WBS in MS Project while syncing it back to the linked project record in PSA. This works great if all projects run smoothly and everything stays on track. But what if your client throws you a curveball and the project hits a delay? We’ve all dealt with this, and the broad impact it has on the project team, resource managers, and financial forecasts.

Today I am going to cover how to leverage MS Project and PSA to address these challenges.

If we manage each change correctly, we can avoid striking out when we are faced with that hot, stinky, cheddar pitch.

1. First, let's start with our baseline project plan to create the project record in PSA and link them up by going to the Project Service tab > Publish > New PSA Project:

Nick cos 1

2. We’ve also gotten our resources booked, so now let’s fast forward to the completion of our Discover phase and our planned kick-off of the Iteration 1 Define phase on 5/29:

Nick cos 2

3. Let’s take a snapshot of our booked resources for reference. You can see the detailed view from the Bookable Resource Booking (BRB) records related to the project, which I’ve added as a navigation tile to the project form. For simplicity, I’ve also filtered the records down to Alan, our PM, and the week of 5/29/17.  As you can see, it’s a bit challenging to summarize the data as there is a record for each day, and the duration is displayed in hours in the view, but minutes in the chart:

Nick cos 3

4. Alternatively, once you refresh the plan in MS Project, you can get a much nicer view by going to Resource Usage. As you can see, we had Alan booked for 40 hours this week:

Nick cos 4

5. Next we will simulate what would happen if the client hits a delay with the Iteration 1 Define phase, by moving this date out by 1 month to 6/29/17 in MS Project and publishing it to PSA. You could do this directly in PSA, but in order to do so, you would have to unlink the MS Project plan.

Nick cos 5

Pro tip: make sure all of your MS Project tasks are scheduled with Fixed Work (select all, then click Information > Advanced > Task Type > Fixed Work) to avoid unnecessary BRB records remaining on the original baseline dates.

Nick cos 6

6. Now you can see that the Iteration 1 Define phase start date and all subsequent dates have been moved out in the PSA WBS:

Nick cos 7

7. However, we also need to make sure that the associated resource bookings have moved as well. You can see that all of the BRB records for the project have been adjusted to the new schedule from the Iteration 1 Define phase onward. Again, this is more clearly displayed in the MS Project Resource Usage view:

Nick cos 8

8. With that complete, you have successfully re-baselined your project! And by keeping your resources in sync with the new dates you gain the advantage of:

  • Giving project team resources an updated view of their project bookings
  • Avoiding extra work for resource managers to change these bookings, while allowing them to book those resources to other projects during the delay
  • Ensuring the financial forecast is accurate for your project

Simple as that, by fighting off those curveballs, you can stay in control of your project and keep everyone on track.  After all, it's a lot easier to lay down a bunch of singles than it is to hit a home run!

Topics: Microsoft Dynamics 365

Sonoma Partners at D365UG / CRMUG Summit Nashville

Today's blog post was written by Ariel Upton, Marketing Manager at Sonoma Partners.

As June arrives and summer begins to spread its welcoming arms, it may pain you to turn your attention to the fall and look ahead to October. But if you’re planning to attend the D365UG/CRMUG Summit Nashville, it’s time to take a brief pause from the BBQ to take advantage of early-bird pricing (valid until June 29th, 2017).

Summit is the leading live event for Dynamics 365 users, featuring user-produced education on how to maximize the use of Microsoft Dynamics 365 and CRM. If you and your team are undecided about attending this year’s events, here are four reasons you should add Summit to your fall conference calendar.

Crmug blog 1

1. Content to benefit your business and your personal development.

We’ve all been to an event where we walk away thinking, “Well that was fun, but I’m not sure I actually learned anything meaningful.” Unlike other product conferences, the folks planning Summit go to great lengths to build a schedule filled with topics that CRM users are interested in learning about. The concentrated learning paths for Microsoft Dynamics 365 and CRM users (based on both role and skill level) include sessions focused on individual role responsibilities, industry best practices, and broader business goals. At Summit, you can engage in a tailored learning experience that best suits your individual needs. Thus, you’re guaranteed to leave with inspired ideas and actionable items that you can immediately layer into your deployment.

2. Network and swap best practices with peers.

Only at Summit does every other attendee talk the same talk and walk the same walk as you. At Summit you have authentic opportunities to meet with Dynamics peers and create connections that last after the event comes to a close. This is the perfect environment for you to talk, grow, build, and share your experiences with D365 & CRM with other users. 

3. Learn from the experts to get the greatest ROI.

Learn best practices and real world use cases from Dynamics 365 experts and Microsoft MVPs. Engage with a network of professionals throughout the User Group for Dynamics 365 & CRM who can help you uncover the best way to configure and customize, while maximizing out-of-the-box capabilities. When you compare high-quality training experiences to Summit, the ROI is unmatched. You and your team will walk away with tangible lessons to improve your implementation.


“Summit training is 3 ½ days of learning and networking. The total cost of the trip equals 9 hours of billable time. Awesome bargain!” – Joni, IT Manager, past Summit attendee

4. Spend time with Sonoma Partners.

With the sunset on Convergence, it’s inspiring to see the Dynamics 365 & CRM community pivot their energy and full attention to Summit. With year-over-year attendance growth and more active involvement than ever, we’re proud to be part of a group of people dedicated to the product’s evolution and success. Sonoma Partners is committed to helping people maximize their investment of Dynamics 365 & CRM, and we’ll be active participants and sponsors of this year’s event. Check us out at booth 811.

Music City, and the future success of your CRM deployment, is calling. Stay tuned for more announcements on our involvement at Summit 2017 and register now to take advantage of early-bird pricing.

Topics: Microsoft Dynamics 365

Best Practices for Microsoft Security Settings: Append vs. Append To

Today's blog post was written by Jen Ford, Principal QA at Sonoma Partners.

I frequently forget the difference between these two security settings. When setting up security permissions for a particular entity, I have always set them to the same values as each other, as a best practice, “just in case.” These two values do not always need to be the same, so let’s discuss how to set them properly with a few examples.

Say I am on the Contact form. I have a lookup to Account, and I have a subgrid of related Cases on the form. How can I set up my Append and Append To privileges on the Contact, Account, and Case entities so that they make sense for someone to create a Contact and associate these records?

To correctly set the Account lookup on the Contact record, set ‘Append’ on the Contact, and ‘Append To’ on the Account.

To correctly associate Cases to the Contact on the subgrid, set ‘Append To’ on the Contact, and ‘Append’ on the Case.

In this case, we are setting both Append and Append To on the Contact, but it is to provide security to fulfill two different relationships with the Contact.

If I wanted to provide the user with access to set all lookups on the Contact, and to associate any related records, this is a general rule to help set security settings:

For 1:N relationships on the Contact entity, set APPEND TO on the Contact entity, and APPEND on the related entities.

For N:1 relationships on the Contact entity, set APPEND on the Contact entity, and APPEND TO on the related entities.
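In SDK terms, the rules above govern operations like the following sketch (the service instance and record ids are assumed to exist; entity names match the example):

```csharp
using System;
using Microsoft.Xrm.Sdk;

public static class AppendExample
{
    public static void Associate(IOrganizationService service, Guid accountId, Guid contactId)
    {
        // Setting the Account lookup on a Contact requires APPEND on Contact
        // and APPEND TO on Account.
        var contact = new Entity("contact");
        contact["parentcustomerid"] = new EntityReference("account", accountId);
        service.Create(contact);

        // Associating a Case to a Contact requires APPEND on Case (incident)
        // and APPEND TO on Contact.
        var incident = new Entity("incident");
        incident["customerid"] = new EntityReference("contact", contactId);
        service.Create(incident);
    }
}
```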

In terms of other security considerations, here are guidelines that will minimize the number of security errors you could receive when creating or updating records. Let's return to our example of a Contact record with a lookup to the Account and a related subgrid of Cases:

Do not set the APPEND TO level on the Account lower than the READ privilege on the Account. If you do, you will be able to select an Account that you do not have permission to APPEND TO a Contact, and you will get a security error.

Do not set the WRITE level on the Contact higher than the APPEND privilege on the Contact. If you do, you will be able to set an Account on a Contact that you may not have permission to APPEND, and you will get a security error (this also depends on the READ level of the Account entity).

The same would hold true for associating a Case to a Contact:

Do not set the APPEND TO level on the Contact lower than the READ privilege on the Contact. If you do, you will be able to select a Contact that you do not have permission to APPEND TO a Case, and you will get a security error.

Do not set the WRITE level on the Case higher than the APPEND privilege on the Case. If you do, you will be able to set a Contact on a Case that you may not have permission to APPEND, and you will get a security error (this also depends on the READ level of the Contact entity).

This is intended to be a starting point. There are always customer-specific situations that require setting permissions in a slightly different way. For example, Activities require both APPEND and APPEND TO to be set in order to successfully create the records and associate them with other entities. As always, thoroughly test your security roles to ensure they function as intended.

Topics: Microsoft Dynamics 365

MSCRM Database Maintenance and Configuration Tips Checklist

Today's blog post was written by Rob Jasinski, Principal Developer at Sonoma Partners.

If you are running Microsoft Dynamics CRM On-Premise, you also have to maintain and support the databases it uses. In this post, I have put together a checklist of basic best-practice tips that you should consider for your local CRM databases.

MSCRM uses a database for each organization plus a general configuration database that all organizations share. The naming conventions are generally [OrgName]_MSCRM for the organization database plus MSCRM_CONFIG for the configuration database. This article will focus on the organization database, but the same tips and best practices can also be applied to the configuration database.

Configuration Tips

  • Have the database data files (mdf) and log files (ldf) on separate disks or volumes. Both files are heavily used during normal CRM operation, and separating them spreads the disk I/O across both disks for better overall performance.
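To see where the data and log files currently live, you can query `sys.master_files` (replace `Contoso_MSCRM` below with your own `[OrgName]_MSCRM` database name — it is a placeholder throughout these examples):

```sql
-- List the physical location of each data (ROWS) and log (LOG) file
SELECT name, type_desc, physical_name
FROM sys.master_files
WHERE database_id = DB_ID('Contoso_MSCRM');
```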

  • Increase the database auto grow increment from its default of 1MB. Growing a database is I/O intensive, and you want to avoid it when possible, especially during business hours. I usually use a value of 256MB so growth happens less frequently; if your database grows faster or slower, adjust this value up or down. Another option is to turn off auto grow and grow the files manually, only when needed and during off hours. However, this requires monitoring, and if you forget, the database will fill up and CRM will degrade or stop functioning altogether.
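As a sketch, the growth increment can be changed with `ALTER DATABASE ... MODIFY FILE`. The logical file name varies per installation, so look it up in `sys.database_files` first; `mscrm` below is only an assumption:

```sql
-- Find the logical file names for this database
SELECT name, type_desc FROM sys.database_files;

-- Raise the autogrow increment on the data file to 256MB
-- (N'mscrm' is a placeholder logical name; use the one returned above)
ALTER DATABASE [Contoso_MSCRM]
MODIFY FILE (NAME = N'mscrm', FILEGROWTH = 256MB);
```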


  • For organizations strictly used for development (non-production) purposes, consider changing the recovery model to Simple. This keeps the database from retaining a full history of transactions, which prevents the transaction log from growing until it fills the entire disk. Since data is constantly flowing in and being deleted for re-testing, the log can fill up fast. For most development environments, daily full or differential backups are sufficient (unless there is a specific reason not to use Simple, for example, to match production).
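Switching a development org database to the Simple recovery model is a one-line change, and you can confirm the result from `sys.databases`:

```sql
-- Switch a development org database to the Simple recovery model
ALTER DATABASE [Contoso_MSCRM] SET RECOVERY SIMPLE;

-- Verify the change
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'Contoso_MSCRM';
```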


Maintenance Tips Checklist

  • Make sure regular backups are being done. This sounds pretty basic, but after installing CRM, it’s possible that no one actually set up a daily backup plan. You can check this by right-clicking the database and choosing Reports -> Standard Reports -> Backup and Restore Events.
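You can also check backup history directly, since SQL Server records every backup in `msdb`. A minimal query (types: D = full, I = differential, L = log):

```sql
-- Most recent backups recorded for the org database
SELECT TOP (10) database_name, type, backup_finish_date
FROM msdb.dbo.backupset
WHERE database_name = 'Contoso_MSCRM'
ORDER BY backup_finish_date DESC;
```

If this returns no rows, no backups have ever been taken on this server for that database.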

  • Defragment or rebuild the indexes on a regular basis. Over time, as data is added, updated, and deleted in CRM, the underlying indexes become fragmented. This can cause performance issues that only get worse. You can see whether your indexes need maintenance by right-clicking the database and choosing Reports -> Standard Reports -> Index Physical Statistics. This generates a report listing every index in the database; the column on the right displays a recommendation if the index needs to be rebuilt or reorganized. If you see a lot of these, it’s time to defragment those indexes.
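The same fragmentation data behind that report is available from the `sys.dm_db_index_physical_stats` DMV. A common rule of thumb (not a CRM-specific requirement) is to reorganize between roughly 5% and 30% fragmentation and rebuild above 30%; the index and table names in the comments are hypothetical:

```sql
-- Average fragmentation per index in the current database
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 5
ORDER BY ips.avg_fragmentation_in_percent DESC;

-- Then, per index, for example:
-- ALTER INDEX [ndx_SomeIndex] ON [ContactBase] REBUILD;    -- heavy fragmentation
-- ALTER INDEX [ndx_SomeIndex] ON [ContactBase] REORGANIZE; -- light fragmentation
```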

  • Make sure you run CHECKDB against the database on a regular basis. This checks the consistency of the database and surfaces any problems that need attention. It’s better to find corruption early rather than later, after it grows to the point where the database won’t even start; in that case, you may have to restore from the last backup. Note: only non-data-loss CHECKDB operations are supported by CRM.
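A basic consistency check looks like this; given the note above about non-data-loss operations, avoid the `REPAIR_ALLOW_DATA_LOSS` option against a CRM database and restore from backup instead:

```sql
-- Full consistency check; NO_INFOMSGS suppresses the informational noise
DBCC CHECKDB ('Contoso_MSCRM') WITH NO_INFOMSGS;

-- A faster variant for more frequent runs (checks page-level integrity only)
DBCC CHECKDB ('Contoso_MSCRM') WITH PHYSICAL_ONLY, NO_INFOMSGS;
```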

  • Check the size of your database files and the amount of disk space you have left on a regular interval. I have seen many cases where a database continues to grow until it fills up the entire disk and cannot grow anymore. Resolving this issue can be time consuming, and your system will be down the entire time, so it’s better to catch and fix it early.
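File sizes and the free space remaining inside each file can be checked with a quick query (`size` is stored in 8KB pages, so dividing by 128 converts to MB):

```sql
-- Current size and internal free space of each file, in MB
SELECT name,
       size / 128 AS size_mb,
       (size - FILEPROPERTY(name, 'SpaceUsed')) / 128 AS free_mb
FROM sys.database_files;
```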

  • Look for any table growing out of control in size; it may indicate a more serious problem in your CRM. It’s common for some tables, like AuditBase and Activity[Pointer/Party]Base, to grow large. But if you see the contact table grow to a million records when you believe you should only have 50,000 contacts, it may mean that many duplicates have made their way into the system, and you should investigate how they got there and how to clean them up. To get a report of large tables, right-click the database and choose Reports -> Standard Reports -> Disk Usage By Top Tables.
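The same information is available in script form from `sys.dm_db_partition_stats`, which is handy if you want to log table sizes over time rather than eyeball the report:

```sql
-- Ten largest tables in the current database by row count
SELECT TOP (10) OBJECT_NAME(ps.object_id) AS table_name,
       SUM(ps.row_count) AS row_count
FROM sys.dm_db_partition_stats AS ps
WHERE ps.index_id IN (0, 1)  -- count the heap or clustered index only
GROUP BY ps.object_id
ORDER BY row_count DESC;
```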

Hopefully these tips will help keep your CRM database in top condition and performing at its best. If I get feedback below requesting a deeper dive into any of these topics, I may write it up as a separate post.

Topics: Microsoft Dynamics 365