Scenario
Have you ever built a machine learning model and asked yourself what the next step is? Bingo: deploying the machine learning model to the internet so you can use it anywhere you want. Sometimes, you struggle with this process and may find yourself asking the following questions:
What are the accepted formats for my model?
How can I do it?
What are the things that I need in order to deploy my model?
In this blog, we will go through a step-by-step coding guide, from converting our model to ONNX format all the way to using it in our Power Apps application.
Technical Architecture
We will start by creating a resource group inside the Azure Platform and then creating an Azure Machine Learning Workspace, Compute, and Notebook. Inside the notebook, we will run some code that will help us to build, save, package, register, and deploy our Machine Learning Model. In addition, we will use the Power Apps template that I created to trigger a Power Automate Cloud Flow to consume the deployed model, make requests, and ingest responses.
Prerequisites:
An Azure subscription.
- If you don’t already have one, you can sign up for an Azure free account.
- For students, you can use the free Azure for Students offer, which doesn’t require a credit card, only your school email.
A Power Apps environment.
Code Notebook and Solution Zip File from GitHub.
Summary of the steps:
Step 1: Open your Azure Portal and Sign in
Step 2: Create an Azure Machine Learning Workspace
Step 3: Setup your Environment
Step 4: Build and Save a Machine Learning Model
Step 5: Package the Model using ONNX
Step 6: Register the Model on Azure ML
Step 7: Deploy the Model to Azure ML
Step 8: Open Power Apps and Import the Solution
Step 9: Edit the Power Automate Flow
Step 10: Publish your Power App
Step 1: Open your Azure Portal and Sign in
Go to https://portal.azure.com and sign in.
Choose your preferred account and proceed
Now you are inside the Azure portal!
Step 2: Create an Azure Machine Learning Workspace
Search for Azure Machine Learning and select it.
Click on Create and choose New Workspace to create a new machine learning workspace.
What do you need to create it?
- Azure Subscription (All resources in an Azure subscription are billed together. Learn more: here)
- Azure Resource Group (A resource group is a collection of resources that share the same life cycle, permissions, and policies. Learn more: here)
- Workspace Name (Unique name that matches the constraints for naming on Azure)
- Region (Choose the region closest to you and your customers. Learn more: here)
- Storage Account (A storage account is used as the default datastore for the workspace. You may create a new Azure Storage resource or select an existing one in your subscription. Learn more: here)
- Key vault (A key vault is used to store secrets and other sensitive information that is needed by the workspace. You may create a new Azure Key Vault resource or select an existing one in your subscription. Learn more: here)
- Application Insights (The workspace uses Azure Application Insights to store monitoring information about your deployed models. You may create a new Azure Application Insights resource or select an existing one in your subscription. Learn more: here)
- Container Registry (A container registry is used to register docker images used in training and deployments. To minimize costs, a new Azure Container Registry resource is created only after you build your first image. Alternatively, you may choose to create the resource now or select an existing one in your subscription. Learn more: here)
For simplicity, we will click on Create new for the resource group, provide a name, and click OK. Then provide a name for the workspace; all the other options will be automatically populated for us. If you want to learn more about each option, you may look at the attached links above.
Then Click Review + Create.
Wait for the deployment to finish then click on Go to resource.
Step 3: Setup Your Environment
We need to create a compute instance to enable us to run notebooks in Azure ML Studio; our code is inside a notebook that we will obtain later from GitHub.
Open the Azure Machine Learning Studio using the Studio web URL.
Click on Compute from the left side menu under Manage
Click on + New to create a new Compute.
Choose a unique name for your compute, select CPU (we won't need GPU capabilities) and Standard_DS11_v2 for the machine size, as we only need it for lightweight processing, then Click on Create.
While the instance is being created, let's get our code from GitHub, as the creation will take some time.
Go to github.com/John0Isaac/house-price-predicition-aml-powerapp, Click on Code, and Copy the URL to Clone the repo locally.
Open Git Bash and type the following
git clone https://github.com/John0Isaac/house-price-predicition-aml-powerapp.git
See Screenshot Below
You can find inside the cloned repo the following:
- Solution/ folder that we will need later to import in power apps.
- Images/ folder that has screenshots from the application.
- Deploy_House_Price_Prediciton_Model.ipynb notebook that has all the code we need to run in order to build, save, package, register, and deploy our ML Model.
Go back to Azure ML Studio to find that your compute is now ready then Click on Notebooks from the left side menu under Authoring.
Now we need to upload our notebook from the repo that we cloned. Click on the + Icon and Choose Upload files.
Click on the + Button, Select the notebook file, then Click on Upload.
Double-click on the newly uploaded notebook and make sure that you are connected to the compute we created earlier, that its State is Running, and that the kernel you are connected to is Python 3.8 - AzureML.
Then Click on Authenticate.
Step 4: Build and Save a Machine Learning Model
We are not going to go through the notebook code line by line; only the essential parts will be highlighted.
For this demo, I created a simple machine learning model using TensorFlow; you may be using PyTorch or any other library.
The most important thing is that after you build your machine learning model, you test it and save it.
For a reference on saving a model in TensorFlow, see here.
For a reference on saving a model in PyTorch, see here.
When you finish and save your model you will find a new folder created in your notebook files explorer called tf-model/ (Note: If it's not visible refresh your files.)
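As a rough illustration, here is a minimal sketch of building and saving such a model with TensorFlow; the layer sizes and the placeholder data are assumptions for demonstration, not the exact code from the notebook.
import numpy as np
import tensorflow as tf

# Placeholder training data for illustration only: area, number of rooms, city flag -> price.
X = np.array([[120.0, 3.0, 1.0], [80.0, 2.0, 0.0], [200.0, 4.0, 1.0]], dtype=np.float32)
y = np.array([300000.0, 150000.0, 500000.0], dtype=np.float32)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(3,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, verbose=0)

# Saving in the SavedModel format creates the tf-model/ folder mentioned below.
model.save("tf-model")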
Step 5: Package the Model using ONNX
We are using ONNX as a unified deployment format for all our models as it gives us the ability to build our model using any framework and in the end use .onnx format for deployment. (Learn more: here)
Does it only work with TensorFlow? The simple answer is no.
You can find examples of how to convert your model from any machine-learning framework here.
When you run the code to generate the ONNX model you will find a new folder created in your notebook files explorer called model/ and inside it our .onnx model that we are going to deploy. (Note: If it's not visible refresh your files.)
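For reference, a conversion with the tf2onnx package might look like the sketch below; it assumes the Keras model from the previous sketch is still in memory, and the input name passed here is only an example (the actual name is visible in the conversion logs, as noted in the next part).
import os
import tensorflow as tf
import tf2onnx

os.makedirs("model", exist_ok=True)

# The input signature name ("dense_input") is illustrative; check your conversion logs
# for the real name of your model's input.
spec = (tf.TensorSpec((None, 3), tf.float32, name="dense_input"),)
onnx_model, _ = tf2onnx.convert.from_keras(model, input_signature=spec, output_path="model/model.onnx")

# Print the input names recorded in the exported ONNX graph.
print([i.name for i in onnx_model.graph.input])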
A notable part here is in 5.c Test the ONNX format.
The ONNX model receives its input as a dictionary, with the name of the input layer as the key.
How can we find the name of the key to use for our input? See the screenshot below; it is visible in the conversion logs.
So, now we have two options: either hard-code it or use this line of code to format our input according to the input shape accepted by ONNX.
feed = dict([(input.name, input_data[n]) for n, input in enumerate(onnx_session.get_inputs())]) # {'dense_input': [[2.0, 1.0, 1.0]]}
You can see the output in the comment above; then we feed this dictionary to the model to test it.
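Putting it together, a quick test with onnxruntime could look like this sketch; the sample input values are just placeholders matching the commented output above.
import numpy as np
import onnxruntime as rt

onnx_session = rt.InferenceSession("model/model.onnx")

# Same feed-dict construction as the line above, with a placeholder input.
input_data = [np.array([[2.0, 1.0, 1.0]], dtype=np.float32)]
feed = dict([(inp.name, input_data[n]) for n, inp in enumerate(onnx_session.get_inputs())])
prediction = onnx_session.run(None, feed)
print(prediction)  # compare this against the output of the saved TensorFlow model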
If you get the expected output from your model, you are now ready to move forward!
Step 6: Register the Model on Azure ML
In order to register a model in the Azure Model Registry, you only need the model file (Learn more: here),
so we provide the path to the model folder, the workspace variable (which contains the subscription ID, Azure ML workspace name, and resource group), the model file name, and optional tags.
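As a sketch, registration with the Azure ML SDK (v1) looks roughly like this; the model name and tags below are placeholders, not the values used in the notebook.
from azureml.core import Workspace
from azureml.core.model import Model

ws = Workspace.from_config()  # picks up subscription ID, resource group, and workspace name

registered_model = Model.register(
    workspace=ws,
    model_path="model",                    # folder containing the .onnx file
    model_name="house-price-prediction",   # placeholder name for the registry
    tags={"framework": "onnx"},            # optional tags
)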
You may be prompted to log in with a message that looks like this: “Performing interactive authentication. Please follow the instructions on the terminal. To sign in, use a web browser to open the page https://microsoft.com/devicelogin and enter the code XXXXXXXXXX to authenticate.” To log in, click the link, enter the code, and follow any login prompts to continue.
After you finish you can find the model in the Models tab from the left side menu under Assets.
Step 7: Deploy the Model to Azure ML
To deploy the model, we need the following:
- Inference Configuration:
  - Entry Script => Score.py (needed to load the model and make a prediction when anyone invokes our model endpoint; a sketch of such a script appears after the deployment code below).
  - Conda File => myenv.yml (our environment dependencies, which are the libraries needed to run our model).
  - Runtime => python.
- Registered Model.
- Deployment Configuration:
  - Number of virtual CPUs.
  - Memory in GB.
  - Whether we want to enable authentication or not.
- Workspace variable that contains:
  - Subscription ID.
  - Azure ML workspace name.
  - Resource group.
from azureml.core import Workspace
from azureml.core.model import Model, InferenceConfig
from azureml.core.webservice import AciWebservice

ws = Workspace.from_config()
model = Model(workspace=ws, name=aml_model_name)
inference_config = InferenceConfig(runtime="python",
                                   entry_script=entry_script,
                                   conda_file=conda_file)
# Deploy the registered model as an Azure Container Instance web service.
deployment_config = AciWebservice.deploy_configuration(cpu_cores=0.8, memory_gb=1, auth_enabled=is_secure)
service = Model.deploy(ws, aci_service_name, [model], inference_config, deployment_config)
Just run the code and everything you need will be created for you.
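For context, an entry script like Score.py usually defines an init() and a run() function; the sketch below shows the general shape for an ONNX model, assuming a {"data": [[...]]} payload, and is not the exact contents of the repo's Score.py.
import json
import os
import numpy as np
import onnxruntime as rt

def init():
    global session
    # AZUREML_MODEL_DIR points at the registered model files inside the container;
    # the exact relative path depends on how the model was registered.
    model_path = os.path.join(os.getenv("AZUREML_MODEL_DIR"), "model.onnx")
    session = rt.InferenceSession(model_path)

def run(raw_data):
    # Parse the incoming JSON, build the feed dict, and return the prediction.
    data = np.array(json.loads(raw_data)["data"], dtype=np.float32)
    feed = {session.get_inputs()[0].name: data}
    result = session.run(None, feed)
    return {"prediction": result[0].tolist()}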
Once the endpoint is ready, you can find it under Endpoints in the left side menu, under Assets.
In order to use the deployment, we need two things: the Scoring URL and the API Key.
Save them, as we will use them later.
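If you want to sanity-check the endpoint before wiring up Power Automate, you can call it directly from Python; this sketch assumes the Scoring URL and API Key you just copied and the same {"data": [[...]]} payload shape as the entry script sketch above.
import json
import requests

scoring_url = "<your-scoring-url>"  # paste the Scoring URL here
api_key = "<your-api-key>"          # paste the API Key here

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {api_key}",  # only needed when authentication is enabled
}
payload = {"data": [[120.0, 3.0, 1.0]]}    # placeholder values: area, rooms, city flag

response = requests.post(scoring_url, data=json.dumps(payload), headers=headers)
print(response.json())
The Power Automate flow we edit in Step 9 sends an equivalent HTTP request with these same two values.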
Step 8: Open Power Apps and Import the Solution
Open Power Apps using this link https://make.powerapps.com/.
Click on More from the left side window then Choose Solutions.
Click on Import Solution to load the Zip file from the Solution/ folder of the repo that we cloned.
Click on Browse, Choose the Zip file, then Click on Open and Next.
After it finishes processing Click on Import and wait for a couple of minutes.
You can now find a Solution named AML PowerApps HousePricePrediciton Sample in the solutions table ready for you to use.
Step 9: Edit the Power Automate Flow
Click on the Solution Display Name to open it, then Click on the Canvas App to Open the application in edit mode.
Now you can edit the application if you want to change colors, fonts, and backgrounds or leave it as it is and continue by clicking on Power Automate from the left side menu.
Hover over the flow name (HousePricePredictionFlow) and from the three dots Click on Edit.
Add the values you copied in the places shown below then Click on Save and Close the flow from the X button.
Step 10: Publish your Power App
Now let's preview our application to test that everything works: Click on the Play button from the Navigation Menu or press F5.
Enter the area of the house, Choose the number of rooms from the drop-down and whether it's in a city or rural location, then Click on Predict and Wait for the response to be received.
The last step is to save our changes and publish them by Clicking on the Save Icon from the Navigation Menu, waiting until it finishes saving, then Clicking on the Publish Icon from the same Menu.
Now you can find the application in the Apps section of Microsoft 365 Online: office.com/apps
You can share your application with anyone inside your Organization.
Thank you so much for following along...
Found this useful? Share it with others and follow me to get updates on:
- Twitter (twitter.com/john00isaac)
- LinkedIn (linkedin.com/in/john0isaac)
You can learn more at:
- Explore the Azure Machine Learning workspace - Training | Microsoft Learn
- Create a canvas app in Power Apps - Training | Microsoft Learn
- Integrate Power Automate flows and Dataverse - Training | Microsoft Learn
Feel free to share your comments and/or inquiries in the comment section below...
See you in future demos!