Deploy Model & Configure Watson OpenScale
In this section, we will deploy a machine learning model that predicts credit risk and set up OpenScale to monitor the model.
The first notebook in the project you imported will deploy a model and configure OpenScale to start monitoring that model.
In Watson Studio, select the project that you previously imported and click on the 'Assets' tab on the top of the project page.
Click on the model deployment/configuration notebook '1-deploysparkmlmodel-wos-configuration', then click on the pencil icon so you can edit and run the notebook.
After the notebook environment starts up, scroll down to the section titled 'Service Credentials'. Copy and paste the Watson Machine Learning service credentials and the Cloud API key that you saved to a text editor earlier.
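The credential cells in the notebook expect values in roughly the shape sketched below. The key names shown are the common form of Watson Machine Learning service credentials, but verify them against your own service instance; the placeholder values are, of course, hypothetical.

```python
# Hypothetical placeholders -- substitute the credentials you copied
# from the IBM Cloud console earlier in the lab.
WML_CREDENTIALS = {
    "url": "https://us-south.ml.cloud.ibm.com",  # endpoint varies by region
    "apikey": "PASTE_YOUR_WML_API_KEY_HERE",
}
CLOUD_API_KEY = "PASTE_YOUR_CLOUD_API_KEY_HERE"
```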
Go back to the first cell in the notebook and run the notebook. You can run the cells individually by clicking on each cell and then clicking the Run button at the top of the notebook.
By the end of the notebook, you should have a model created and deployed. With an online endpoint, you can submit scoring requests. You will also have set up an OpenScale subscription to the model.
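An online scoring request follows the Watson Machine Learning v4 "input_data" payload shape. The sketch below uses a hypothetical subset of the German credit-risk features from this lab; match the field names and values to the schema your deployed model actually expects.

```python
# Sketch of an online scoring payload (WML v4 "input_data" shape).
# Field names and values here are illustrative, not the full schema.
scoring_payload = {
    "input_data": [{
        "fields": ["CheckingStatus", "LoanDuration", "LoanAmount", "Age"],
        "values": [["no_checking", 13, 1343, 46]],
    }]
}

# The notebook submits payloads like this via the WML Python client,
# roughly: wml_client.deployments.score(deployment_uid, scoring_payload)
```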
Now that you have created a machine learning model and configured OpenScale, you can use the OpenScale dashboard to monitor the model. Although we have not enabled any type of monitoring yet, with the deployment approach we are using for this lab (Watson Machine Learning as the model engine), we will be able to see payload and some performance information out of the box.
Open the Watson OpenScale dashboard in the same browser (but separate tab) as you used to run the Watson Studio notebook.
When the dashboard loads, click on the 'Model Monitors' tab and you will see the one deployment you configured in the previous section.
From the 'Model Monitors' tab, click on the deployment tile you have created. You will see a graph of average throughput over time (we have only run a small number of requests through our model, so we see one spike in the graph).
Click on the bar in the throughput graph to find the actual transactions (if you executed the notebook cells in the previous section only once, you will see just the 8 scoring requests made after enabling the subscription).
You can also use the OpenScale dashboard to visualize the confidence of the model's predictions.
Click on the 'Predictions by Confidence' option on the left panel.
You will see the number of 'Risk' and 'No Risk' predictions for each confidence range.
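The chart is essentially a histogram of predictions bucketed by confidence range. A minimal sketch of that bucketing, using made-up sample predictions rather than real payload records:

```python
from collections import Counter

# Hypothetical sample of (label, confidence) predictions, standing in
# for the payload records OpenScale has logged for the deployment.
predictions = [
    ("No Risk", 0.91), ("No Risk", 0.78), ("Risk", 0.66),
    ("No Risk", 0.97), ("Risk", 0.83), ("No Risk", 0.55),
]

def confidence_bucket(conf, width=0.1):
    """Map a confidence score to the lower bound of its range, e.g. 0.83 -> 0.8."""
    return round(int(conf / width) * width, 2)

# Count predictions per (label, confidence-range) pair, mirroring the chart.
chart = Counter((label, confidence_bucket(c)) for label, c in predictions)
```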
If you see payload information, you have successfully subscribed Watson OpenScale to the deployed machine learning model. You're ready to continue to the next section to enable different model monitors.
Click on the back arrow icon next to the chart title to return to the deployment overview.