Sonoma Partners Microsoft CRM and Salesforce Blog

Implementing Machine Learning in D365 – Part 2

This post is the final part of our two-part series on Implementing Machine Learning in D365. Head here to read part one if you haven’t been following along yet.

In part one, we built an Azure Machine Learning (AML) algorithm against our Opportunity data in D365 to come up with a probability that the Opportunity will win. In this post, we will use AML to generate a web service for us and then we will use Microsoft Flow to retrieve the probability and update newly created Opportunities with the value.

Within AML, after you have successfully run the experiment, select Set Up Web Service –> Predictive Web Service.

Once that is finished, you will have to click Run to go through the experiment again. When that run completes, click Deploy Web Service –> Deploy Web Service [New] Preview. You will have to create a new Price Plan (fortunately there is a free one with 1,000 transactions) and then click Deploy. Once it is deployed, click Use Web Service. On the next page, make note of the Primary Key and Request-Response values, as we will need these shortly.

Next, head over to Microsoft Flow and sign in with a Microsoft account. Click "Create from blank" and then search for the D365 "When a record is created" trigger.

You will need to authenticate to D365, select your organization, and then choose the Opportunities entity.

Click + New step –> Add an action. Search for and select the "HTTP – HTTP" connector. We will use this to call the AML web service we deployed earlier. Set the Method to POST and the Uri to the "Request-Response" value that we noted earlier from AML. Next, add a header with the key "Authorization" and the value "Bearer xxx", where "xxx" is the "Primary Key" value that we noted earlier from AML. Then add another header with the key "Content-Type" and the value "application/json". Lastly, set the Body to the following:

{
    "Inputs": {
        "input1": [
            {
                "description": <description>,
                "estimatedvalue": <estimatedvalue>,
                "statecode": 0
            }
        ]
    },
    "GlobalParameters": {}
}

Using the dynamic content helper on the right, replace <description> in the Body by selecting Description from the list of dynamic content. Do the same for <estimatedvalue>.
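
If you want to test the web service outside of Flow first, here is a minimal sketch of the equivalent request in Python. The URL and key are placeholders for the Request-Response and Primary Key values noted earlier, and the sample row values are made up:

import requests

# Placeholders: paste the Request-Response URL and the Primary Key
# that you noted when deploying the AML web service.
url = "<Request-Response URL>"
api_key = "<Primary Key>"

headers = {
    "Authorization": "Bearer " + api_key,
    "Content-Type": "application/json",
}

# The same body shape the Flow HTTP action sends, with sample values.
body = {
    "Inputs": {
        "input1": [
            {"description": "Wine", "estimatedvalue": 90, "statecode": 0}
        ]
    },
    "GlobalParameters": {},
}

response = requests.post(url, headers=headers, json=body)
print(response.json())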

Next, click + New Step –> Add an action and select Parse JSON. Using the dynamic content helper, set the Content field to the Body from the HTTP action. Set the Schema field to the following text:

{
    "type": "object",
    "properties": {
        "Results": {
            "type": "object",
            "properties": {
                "output1": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "estimatedvalue": { "type": "string" },
                            "description": { "type": "string" },
                            "statecode": { "type": "string" },
                            "Scored Labels": { "type": "string" },
                            "Scored Probabilities": { "type": "string" }
                        },
                        "required": [
                            "estimatedvalue",
                            "description",
                            "statecode",
                            "Scored Labels",
                            "Scored Probabilities"
                        ]
                    }
                }
            }
        }
    }
}

This will help parse the response from the AML web service call so we can get to the actual probability value that is returned.
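
To see what that buys us, here is a small Python sketch of the same extraction against a sample response shaped like the schema above (the values are made up for illustration):

import json

# A sample response body shaped like the schema above.
raw = '''
{
    "Results": {
        "output1": [
            {
                "estimatedvalue": "90",
                "description": "Wine",
                "statecode": "0",
                "Scored Labels": "1",
                "Scored Probabilities": "0.87"
            }
        ]
    }
}
'''

parsed = json.loads(raw)

# Mirrors the Flow expression in the next step: drill down to the first
# output row, read Scored Probabilities, convert to float, scale to percent.
probability = float(parsed["Results"]["output1"][0]["Scored Probabilities"]) * 100
print(probability)  # 87.0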

Lastly, click + New Step –> Add an action and select the D365 "Update a record (V2)" action. Set your organization and the Opportunities entity again. Then, using the dynamic content helper, set the required fields to the respective values from the D365 trigger. For the probability, I created a custom Decimal field called Predicted Probability. This field shows up in the "Show advanced options" section in Flow. Click that, find the Predicted Probability field, and set its expression to the following:

mul(float(body('Parse_JSON')?['Results']?['output1'][0]['Scored Probabilities']), 100)

This expression uses the Parse JSON action we added to get to the Scored Probabilities value from AML, converts it to a float, and then multiplies it by 100 to show as a percentage in D365.

Now you can save the Flow and make sure it is turned on. Then, in D365, create a new Opportunity. I set the Description to "Wine" with an Est. Revenue of $90. Back in Flow, you can check the Run History to make sure the Flow ran and was successful. It may take a minute or two to run.

Once you see a successful run in Flow, refresh your D365 Opportunity and you should see the Predicted Probability updated.

Now we have a complete round trip: getting the data out of D365 to train the model, and pushing the predicted probability back into D365. For existing records, you will want to build a process that retrieves them all and calls the AML web service for each one.
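
As a rough sketch of what that backfill could look like in Python, this reads open Opportunities from the replicated Azure SQL database with pyodbc and scores each one. The connection string, URL, and key are placeholders, and writing the result back to D365 is left as a comment:

import pyodbc
import requests

conn_str = "<Azure SQL connection string>"  # the Data Export Service database
url = "<Request-Response URL>"
api_key = "<Primary Key>"

headers = {"Authorization": "Bearer " + api_key,
           "Content-Type": "application/json"}

cursor = pyodbc.connect(conn_str).cursor()
# Open Opportunities only (statecode = 0), same columns the model expects.
cursor.execute("select opportunityid, description, "
               "cast(estimatedvalue as float) from opportunity "
               "where statecode = 0")

for opp_id, description, estimatedvalue in cursor:
    body = {"Inputs": {"input1": [{"description": description,
                                   "estimatedvalue": estimatedvalue,
                                   "statecode": 0}]},
            "GlobalParameters": {}}
    result = requests.post(url, headers=headers, json=body).json()
    probability = float(
        result["Results"]["output1"][0]["Scored Probabilities"]) * 100
    # From here you would write the probability back to the Opportunity,
    # e.g. via the D365 Web API or another Flow.
    print(opp_id, probability)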

Azure Machine Learning has many different algorithms (here is a nice cheat sheet) for all different types of business processes. This is just one example and will hopefully give you an idea of what you can accomplish in your own organization. If you have any questions or are interested in enhancing your organization with machine learning, contact us.

Happy Machine Learning!

Topics: Integrations, Microsoft Dynamics 365, Microsoft Dynamics CRM, Microsoft Dynamics CRM Online

Implementing Machine Learning in D365 – Part 1

Machine Learning is the latest hot topic, and for many good reasons. Being able to take a manual, sometimes inaccurate data point and transform it into an automated and consistently accurate one has huge advantages. Take, for example, the probability that an opportunity will close. It is up to your salesperson to input this data, and you are left relying on it to be accurate. With machine learning, we can provide this data point automatically and have it be consistently accurate based on historical data.

In this two-part series, I will be walking through just that: how to implement Azure Machine Learning (AML) to provide the probability that an opportunity will win. I will be using a very simple algorithm with only a couple of factors as an example. It will be up to you to come up with the factors that matter for your business, and to make sure your CRM has that data, in order to produce an accurate opportunity probability.

I will be using D365 Online, and the first thing we will want to do is set up the Data Export Service so we can replicate our data into an Azure SQL database, which AML can easily connect to. If you haven't set this up yet, head here for a complete guide. For this walkthrough, you will only need to sync the Opportunity entity, but if you want to expand your factors further, you will most likely want to sync other entities as well.

Note: If you are on an on-premises version, you can use SSIS to replicate your on-premises CRM database to an Azure SQL database.

Next, go to your Azure Portal, create a new resource, and search for "Machine Learning Studio Workspace".

Once everything is set up, you can navigate to the new Machine Learning Studio workspace and click "Launch Machine Learning Studio". This will take you to a separate website where you will need to sign in using your Azure credentials.

Then create a new blank experiment.

You will then see a blank canvas where you can drag and drop different components. The first thing we will want to do is search for "Import Data" in the top left and drag the "Import Data" component onto the canvas.

Then, highlight the Import Data component and click the "Launch Import Data Wizard" button on the right. Select Azure SQL Database and enter the right settings for your environment. Next, enter this SQL query:

select
    cast(estimatedvalue as float) as estimatedvalue,
    description,
    CASE
        WHEN statecode = 2 THEN 0
        ELSE 1
    END as statecode
from opportunity
where statecode = 1 or statecode = 2

In this query, we are retrieving the estimated value and description of the opportunity. With my sample data, the description is acting more like a "type" of opportunity; the values in this field in my org are either "Beer" or "Wine". Estimated value and description are the two factors we are using to determine whether the Opportunity is Won or Lost. In a production scenario, you will most likely want to capture and use more data points for an accurate opportunity probability: factors such as how long the opportunity sits in each stage, how many activities are associated with the opportunity, or maybe even the salesperson that owns the opportunity.

The first thing to note about the query is that AML currently doesn't support decimals, so we need to cast estimatedvalue as a float.

cast(estimatedvalue as float) as estimatedvalue

Next, we're going to be using the Two-Class Boosted Decision Tree algorithm, which requires us to pass it a 0 for negative and a 1 for positive. We want to label the Lost opportunities as negative and the Won opportunities as positive, so we use a SQL CASE statement to map the Lost statecode (2) to 0. Won opportunities already have a statecode of 1, but without an ELSE branch the CASE would return NULL for them, so we pass the 1 through explicitly.

CASE
    WHEN statecode = 2 THEN 0
    ELSE 1
END as statecode

Lastly, we only want to return Won and Lost opportunities, as we are building the AML model on opportunities that have already been Won or Lost, since that outcome is what we are trying to predict.

where statecode = 1 or statecode = 2

Next, search for the “Split Data” component and drag it onto the canvas. Then, connect the Import Data and Split Data components together.

Best practice with machine learning is to split the data so that most of it is used to build the model and the rest is used to evaluate how accurate the model is. That is what the Split Data component is for. You can play around with its settings to split the data how you like.
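
For intuition, here is the same idea outside of AML in a few lines of Python with scikit-learn. This is just an analogy to the Split Data component, with made-up rows; description is encoded as 0 = Beer, 1 = Wine:

from sklearn.model_selection import train_test_split

# Factors: [estimatedvalue, description]; labels: 1 = Won, 0 = Lost.
X = [[90, 1], [250, 0], [40, 1], [700, 0]]
y = [1, 0, 1, 0]

# 75% of the rows build the model; the held-out 25% evaluates it.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)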

Next, drag the Two-Class Boosted Decision Tree component to the left of the Split Data component. Then, drag the Train Model component below both of them, and connect the Two-Class Boosted Decision Tree to the left input of Train Model and the Split Data to its right input.

Highlight the Train Model component and, on the right side, click the "Launch column selector" button. Next, enter "statecode" and click the checkmark.

This takes a subset of the data (based on the Split Data component) and applies the Two-Class Boosted Decision Tree algorithm to it, using statecode as the label column, to build a model from the Won and Lost opportunities.

Lastly, drag the Score Model and Evaluate Model components onto the canvas. Connect the Train Model output and the second Split Data output to the Score Model inputs, and connect the Score Model output to Evaluate Model.

This runs the held-out portion of the data from the Split Data component against the trained model to determine the score, which in our case is the opportunity probability. It then takes that score and evaluates it to provide metrics on how well the model performs.

Now click the Run button at the bottom, which will process each component. Once it is finished, you can right-click Score Model and select Scored datasets –> Visualize. This will show your data with the score columns applied. Scored Probabilities is the probability the model assigns to the positive value, which in our case is a "Won" opportunity. The Scored Labels column holds the 0 or 1 value for a Lost or Won opportunity: if Scored Probabilities is greater than 50%, Scored Labels will be 1; otherwise it will be 0. More information on the Score Model can be found here.
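
That thresholding is easy to reproduce. Here is a small Python sketch using scikit-learn's gradient-boosted trees as a rough stand-in for the Two-Class Boosted Decision Tree (the data is made up; description is encoded as 0 = Beer, 1 = Wine):

from sklearn.ensemble import GradientBoostingClassifier

# Made-up training rows: [estimatedvalue, description]; labels 1 = Won, 0 = Lost.
X_train = [[90, 1], [250, 0], [40, 1], [700, 0], [120, 1], [500, 0]]
y_train = [1, 0, 1, 0, 1, 0]

model = GradientBoostingClassifier().fit(X_train, y_train)

# "Scored Probabilities": the probability of the positive (Won) class.
probability = model.predict_proba([[90, 1]])[0][1]

# "Scored Labels": 1 if the probability clears 50%, otherwise 0.
label = 1 if probability > 0.5 else 0
print(probability, label)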

Now right-click Evaluate Model and select Evaluation results –> Visualize. This will display a chart of how accurate your model is, with data points on the accuracy. Based on your results, you can tweak the settings in the other components, or bring in and remove factors, and re-evaluate the model until you reach an accuracy that you are comfortable with. More information on the Evaluate Model component and what the different data points mean can be found here.

That concludes the first part of our series. Now that we have a trained model with high accuracy, in the next part I will show you how to generate a web service that we will use to retrieve the probability prediction and update D365 with the data. The best part is, we can use the power of Microsoft Flow to do all this without any code…stay tuned!

Topics: Integrations, Microsoft Dynamics 365, Microsoft Dynamics CRM, Microsoft Dynamics CRM Online

Introduction to Dynamics CRM and Salesforce Data Migrations Using SQL Server Integration Services

Today's blog post was written by Rob Jasinski, Principal Developer at Sonoma Partners.

There are many occasions when you'll need to migrate a large amount of data from an outside data source into your Dynamics CRM or Salesforce environment. Both systems have native import tools, but those have many limitations and can be difficult to use with a large amount of data. If your company owns SQL Server licenses, those also come with SSIS licenses, and SSIS can be a viable alternative for migrating data into CRM. This article will give you an introduction to setting up a simple data migration to get you started creating your own CRM migration solutions.

First you'll need SQL Server Data Tools, which can be downloaded from this link. You should have Visual Studio installed, but if you don't, this will also install a light version, enough to run Data Tools and allow you to create SSIS integration packages.

Next, since you'll be migrating data to CRM, you'll need a third-party destination component. You could write your own custom code, either in C# or VB, directly against the API of the destination CRM system, but this can be tedious and very time consuming. These third-party tools encapsulate all of that code into easy-to-use drag-and-drop components. They are very reasonably priced and will save a lot of time and frustration. There are many third-party components to choose from, from vendors such as CozyRoc, CData, and KingswaySoft. For this example, we'll be using KingswaySoft, but the other vendors would work fine as well. You can download the component used in this example from this link.

Once you have Data Tools installed, launch Visual Studio, create a new solution, and choose "New SSIS Package."

This will create an empty canvas ready for you to start creating your first SSIS data migration.

Next you'll need to create a data source. This is the location of the source data that exists outside of your current CRM solution. SSIS has options to pull data from many sources, including Excel, SQL Server, ODBC, CSV files, etc. For this example, we'll assume our data is located in a SQL Server database, so right-click in the Connection Managers area and choose "New OLE DB Connection." Configure the server name, credentials, and database name, and click OK. In this example, I've renamed the connection to "Source."

Next drag and drop the “Data Flow Task” from the SSIS Toolbox onto the Control Flow canvas.

Double-click the Data Flow Task just created to open the Data Flow canvas. This is where you’ll create the actual migration using source, destination, and transformation components (if needed).

Next drag and drop the OLE DB Source component from the SSIS Toolbox onto the Data Flow canvas and configure the component along with the query to pull the data that you need. In this example, we’ll be pulling basic account information from some outside system into our Dynamics CRM environment.

Now you'll need to configure a Destination connection. In this example, we'll be using the KingswaySoft Dynamics CRM adapter to connect to our CRM environment. Right-click in the Connection Managers area and choose "New Connection." From the Connection list, choose DynamicsCRM and click Add. Note that the KingswaySoft Dynamics CRM adapter must already be installed for it to appear in this list.

Next you'll need to configure the CRM connection. Choose the appropriate Authentication Type, depending on whether you're online or on-premises. Then enter the CRM Discovery URL, enter any credentials, and finally choose the appropriate organization name from the drop-down list. Since there are many environment configurations you may be using, please refer to this KingswaySoft documentation for further details.

You should now have two connection managers; in this example, I've renamed the new one to DynamicsCRM. Most migrations you create will most likely have just two connections, one source and one destination, but there may be cases where you have multiple sources that are merged together within the Data Flow process before migrating to CRM.

Next, drag and drop the Dynamics CRM Destination component from the SSIS Toolbox onto the Data Flow canvas. Then manually connect the arrow from the OLE DB Source to the new CRM Destination. The component displays a little red X because it still needs to be configured in the next step.

Next, double-click the Destination component to bring up the configuration screen. Choose the CRM connection manager you just created. For this example, we're migrating data into accounts, so choose that as the destination entity.

Next, select the Columns tab and map the fields from the source query to the fields in CRM. Using a third-party component like KingswaySoft, it's that easy, and no custom coding is required.

Press OK and you're done. You've created your first, albeit very simple, migration of data from outside CRM into CRM. The nice thing about SSIS is that it comes with many transformation tools out of the box that allow you to clean up, manipulate, or even de-dupe data. For example, here is a link to a more complicated example of an SSIS data migration that also de-duplicates data. From this simple example as a starting point, you can expand to perform almost any kind of complicated data migration your business may require.

Topics: Integrations