Learn how to access Azure resources from your scoring script with an online endpoint and either a system-assigned managed identity or a user-assigned managed identity.
Both managed online endpoints and Kubernetes online endpoints let Azure Machine Learning take on the burden of provisioning your compute resource and deploying your machine learning model. Typically, your model needs to access Azure resources, such as Azure Container Registry or your blob storage, for inferencing. With a managed identity, you can access these resources without needing to manage credentials in your code. Learn more about managed identities.
This guide assumes you don't have a managed identity, a storage account, or an online endpoint. If you already have these components, skip to the Give access permission to the managed identity section.
To use Azure Machine Learning, you must have an Azure subscription. If you don't have an Azure subscription, create a trial subscription before you begin.
An Azure resource group, in which you (or the service principal you use) need to have User Access Administrator and Contributor access. You have such a resource group if you configured your ML extension per the preceding article.
An Azure Machine Learning workspace. You already have a workspace if you configured your ML extension per the preceding article.
A trained machine learning model ready for scoring and deployment. If you're following along with the sample, a model is provided.
If you haven't already set the defaults for the Azure CLI, save your default settings. To avoid passing in the values for your subscription, workspace, and resource group multiple times, run this code:
az account set --subscription <subscription ID>
az configure --defaults workspace=<Azure Machine Learning workspace name> group=<resource group>
To follow along with the sample, clone the samples repository and then change directory to cli.
git clone https://github.com/Azure/azureml-examples --depth 1
cd azureml-examples/cli
To use Azure Machine Learning, you must have an Azure subscription. If you don't have an Azure subscription, create a trial subscription before you begin.
An Azure resource group, in which you (or the service principal you use) need to have User Access Administrator and Contributor access. You have such a resource group if you configured your ML extension per the preceding article.
An Azure Machine Learning workspace. You already have a workspace if you configured your ML extension per the preceding article.
A trained machine learning model ready for scoring and deployment. If you're following along with the sample, a model is provided.
If you haven't already set the defaults for the Azure CLI, save your default settings. To avoid passing in the values for your subscription, workspace, and resource group multiple times, run this code:
az account set --subscription <subscription ID>
az configure --defaults workspace=<Azure Machine Learning workspace name> group=<resource group>
To follow along with the sample, clone the samples repository and then change directory to cli.
git clone https://github.com/Azure/azureml-examples --depth 1
cd azureml-examples/cli
To use Azure Machine Learning, you must have an Azure subscription. If you don't have an Azure subscription, create a trial subscription before you begin.
Install and configure the Azure Machine Learning Python SDK (v2). For more information, see Install and set up SDK (v2).
An Azure resource group, in which you (or the service principal you use) need to have User Access Administrator and Contributor access. You have such a resource group if you configured your ML extension per the preceding article.
An Azure Machine Learning workspace. You already have a workspace if you configured your ML extension per the preceding article.
A trained machine learning model ready for scoring and deployment. If you're following along with the sample, a model is provided.
Clone the samples repository, then change the directory.
git clone https://github.com/Azure/azureml-examples --depth 1
cd azureml-examples/sdk/endpoints/online/managed/managed-identities
To follow along with this notebook, access the companion example notebook in the sdk/endpoints/online/managed/managed-identities directory.
Other Python packages are required for this example, such as the azure-mgmt-storage, azure-mgmt-authorization, and azure-storage-blob client libraries used later in this article.
To use Azure Machine Learning, you must have an Azure subscription. If you don't have an Azure subscription, create a trial subscription before you begin.
Role creation permissions for your subscription or the Azure resources accessed by the user-assigned identity.
Install and configure the Azure Machine Learning Python SDK (v2). For more information, see Install and set up SDK (v2).
An Azure resource group, in which you (or the service principal you use) need to have User Access Administrator and Contributor access. You have such a resource group if you configured your ML extension per the preceding article.
An Azure Machine Learning workspace. You already have a workspace if you configured your ML extension per the preceding article.
A trained machine learning model ready for scoring and deployment. If you're following along with the sample, a model is provided.
Clone the samples repository.
git clone https://github.com/Azure/azureml-examples --depth 1
cd azureml-examples/sdk/endpoints/online/managed/managed-identities
To follow along with this notebook, access the companion example notebook in the sdk/endpoints/online/managed/managed-identities directory.
Other Python packages are required for this example, such as the azure-mgmt-storage, azure-mgmt-authorization, and azure-storage-blob client libraries used later in this article.
The identity for an endpoint is immutable. During endpoint creation, you can associate it with a system-assigned identity (default) or a user-assigned identity. You can't change the identity after the endpoint is created.
Configure variables for deployment
Configure the variable names for the workspace, workspace location, and the endpoint you want to create for use with your deployment.
Next, specify what you want to name your blob storage account, blob container, and file. These variable names are defined here, and are referred to in az storage account create and az storage container create commands in the next section.
Export those values as environment variables.
After these variables are exported, create a text file locally. When the endpoint is deployed, the scoring script accesses this text file using the system-assigned managed identity that's generated upon endpoint creation.
Decide on the names of your endpoint, workspace, and workspace location, then export those values as environment variables.
Next, specify what you want to name your blob storage account, blob container, and file. These variable names are defined here, and are referred to in az storage account create and az storage container create commands in the next section.
After these variables are exported, create a text file locally. When the endpoint is deployed, the scoring script accesses this text file using the user-assigned managed identity used in the endpoint.
Decide on a name for your user-assigned managed identity, and export that value as an environment variable:
export UAI_NAME="<USER_ASSIGNED_IDENTITY_NAME>"
Assign values for the workspace and deployment-related variables, as sketched below.
Next, specify what you want to name your blob storage account, blob container, and file. These variable names are defined here, and are referred to in the storage account and container creation code by the StorageManagementClient and ContainerClient.
After these variables are assigned, create a text file locally. When the endpoint is deployed, the scoring script accesses this text file using the system-assigned managed identity that's generated upon endpoint creation.
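For reference, here's a minimal sketch of these assignments; every value shown is a placeholder or an assumption, and the file name and contents are only examples:

subscription_id = "<SUBSCRIPTION_ID>"
resource_group = "<RESOURCE_GROUP>"
workspace_name = "<AML_WORKSPACE_NAME>"
endpoint_name = "<ENDPOINT_NAME>"
storage_account_name = "<STORAGE_ACCOUNT_NAME>"
storage_container_name = "<CONTAINER_TO_ACCESS>"
file_name = "hello.txt"

# Create the local text file that the scoring script later reads from blob storage.
with open(file_name, "w") as f:
    f.write("Hello from the managed identity example.")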
Now, get a handle to the workspace and retrieve its location:
from azure.ai.ml import MLClient
from azure.identity import AzureCliCredential
from azure.ai.ml.entities import (
ManagedOnlineDeployment,
ManagedOnlineEndpoint,
Model,
CodeConfiguration,
Environment,
)
credential = AzureCliCredential()
ml_client = MLClient(credential, subscription_id, resource_group, workspace_name)
workspace_location = ml_client.workspaces.get(workspace_name).location
Use this value to create a storage account.
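One way to do that from Python is with the azure-mgmt-storage management client. The following sketch is an assumption (the SKU, kind, and client calls aren't taken from this article); the resulting storage_account object is what the later role-assignment code refers to as storage_account.id.

from azure.mgmt.storage import StorageManagementClient

storage_client = StorageManagementClient(credential, subscription_id)

# Create (or update) the storage account in the workspace's region.
storage_account = storage_client.storage_accounts.begin_create(
    resource_group,
    storage_account_name,
    {
        "location": workspace_location,
        "sku": {"name": "Standard_LRS"},
        "kind": "StorageV2",
    },
).result()

# Create the blob container that holds the text file.
storage_client.blob_containers.create(
    resource_group, storage_account_name, storage_container_name, {}
)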
Assign values for the workspace and deployment-related variables, as sketched below.
Next, specify what you want to name your blob storage account, blob container, and file. These variable names are defined here, and are referred to in the storage account and container creation code by the StorageManagementClient and ContainerClient.
After these variables are assigned, create a text file locally. When the endpoint is deployed, the scoring script will access this text file using the user-assigned managed identity that's generated upon endpoint creation.
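As in the system-assigned example, here's a minimal sketch of these assignments; all values are placeholders, and the file name and contents are only examples:

subscription_id = "<SUBSCRIPTION_ID>"
resource_group = "<RESOURCE_GROUP>"
workspace_name = "<AML_WORKSPACE_NAME>"
endpoint_name = "<ENDPOINT_NAME>"
storage_account_name = "<STORAGE_ACCOUNT_NAME>"
storage_container_name = "<CONTAINER_TO_ACCESS>"
file_name = "hello.txt"

# Create the local text file that the scoring script later reads from blob storage.
with open(file_name, "w") as f:
    f.write("Hello from the managed identity example.")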
Decide on a name for your user-assigned managed identity:
uai_name = "<USER_ASSIGNED_IDENTITY_NAME>"
Now, get a handle to the workspace and retrieve its location:
from azure.ai.ml import MLClient
from azure.identity import AzureCliCredential
from azure.ai.ml.entities import (
ManagedOnlineDeployment,
ManagedOnlineEndpoint,
Model,
CodeConfiguration,
Environment,
)
credential = AzureCliCredential()
ml_client = MLClient(credential, subscription_id, resource_group, workspace_name)
workspace_location = ml_client.workspaces.get(workspace_name).location
To deploy an online endpoint with the CLI, you need to define the configuration in a YAML file. For more information on the YAML schema, see the online endpoint YAML reference.
The YAML files in the following examples are used to create online endpoints.
The following YAML example is located at endpoints/online/managed/managed-identities/1-sai-create-endpoint. The file:
Defines the name by which you want to refer to the endpoint, my-sai-endpoint.
Specifies the type of authorization to use to access the endpoint, auth-mode: key.
To deploy an online endpoint with the CLI, you need to define the configuration in a YAML file. For more information on the YAML schema, see the online endpoint YAML reference.
The YAML files in the following examples are used to create online endpoints.
The following YAML example is located at endpoints/online/managed/managed-identities/1-uai-create-endpoint.yml. The file:
Defines the name by which you want to refer to the endpoint, my-uai-endpoint.
Specifies the type of authorization to use to access the endpoint, auth-mode: key.
Indicates the identity type to use, type: user_assigned
To deploy an online endpoint with the Python SDK (v2), you can use objects to define the following configuration. Alternatively, you can load YAML files by using the .load method.
The following Python endpoint object:
Assigns the name by which you want to refer to the endpoint to the variable endpoint_name.
Specifies the type of authorization to use to access the endpoint, auth_mode="key".
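A minimal sketch of such an endpoint object might look like the following; because no identity is specified, the endpoint gets a system-assigned managed identity by default:

endpoint = ManagedOnlineEndpoint(
    name=endpoint_name,
    auth_mode="key",
)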
To deploy an online endpoint with the Python SDK (v2), you can use objects to define the following configuration. Alternatively, you can load YAML files by using the .load method.
For a user-assigned identity, you define the endpoint configuration after the user-assigned managed identity is created.
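A sketch of that configuration is shown below; it assumes uai_identity is the identity object you create later, and it uses the IdentityConfiguration and ManagedIdentityConfiguration classes from azure.ai.ml.entities:

from azure.ai.ml.entities import IdentityConfiguration, ManagedIdentityConfiguration

endpoint = ManagedOnlineEndpoint(
    name=endpoint_name,
    auth_mode="key",
    identity=IdentityConfiguration(
        type="user_assigned",
        user_assigned_identities=[
            ManagedIdentityConfiguration(resource_id=uai_identity.id)
        ],
    ),
)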
This deployment object:
Specifies, via the ManagedOnlineDeployment class, the type of deployment you want to create.
Indicates that the endpoint has an associated deployment called blue.
Configures the details of the deployment, such as the name and instance_count.
Defines more objects inline and associates them with the deployment: Model, CodeConfiguration, and Environment.
Includes environment variables needed for the user-assigned managed identity to access storage.
Adds a placeholder environment variable for UAI_CLIENT_ID, which is set after the identity is created and before this configuration is actually deployed.
deployment = ManagedOnlineDeployment(
name="blue",
endpoint_name=endpoint_name,
model=Model(path="../../model-1/model/"),
code_configuration=CodeConfiguration(
code="../../model-1/onlinescoring/", scoring_script="score_managedidentity.py"
),
environment=Environment(
conda_file="../../model-1/environment/conda.yml",
image="mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04:20210727.v1",
),
instance_type="Standard_DS2_v2",
instance_count=1,
environment_variables={
"STORAGE_ACCOUNT_NAME": storage_account_name,
"STORAGE_CONTAINER_NAME": storage_container_name,
"FILE_NAME": file_name,
# We will update this after creating an identity
"UAI_CLIENT_ID": "uai_client_id_place_holder",
},
)
Create the managed identity
To access Azure resources, create a system-assigned or user-assigned managed identity for your online endpoint.
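A system-assigned identity is created automatically when you create the endpoint, so no extra step is needed for it. For a user-assigned identity, one option from Python is the azure-mgmt-msi client shown in this sketch; the client and parameters here are assumptions rather than steps taken from this article:

from azure.mgmt.msi import ManagedServiceIdentityClient

msi_client = ManagedServiceIdentityClient(credential, subscription_id)

# Create (or update) the user-assigned managed identity in the workspace's region.
uai_identity = msi_client.user_assigned_identities.create_or_update(
    resource_group_name=resource_group,
    resource_name=uai_name,
    parameters={"location": workspace_location},
)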
For this example, create a blob storage account and blob container, and then upload the previously created text file to the blob container. You give the online endpoint and managed identity access to this storage account and blob container.
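If you're working in Python, a ContainerClient for that container might look like the following sketch; the blob endpoint suffix matches the one used in the scoring script and otherwise depends on your cloud, and uploading with an Azure AD credential assumes you hold a data-plane role (such as Storage Blob Data Contributor) on the account:

from azure.storage.blob import ContainerClient

container_client = ContainerClient(
    account_url=f"https://{storage_account_name}.blob.core.chinacloudapi.cn",
    container_name=storage_container_name,
    credential=credential,  # alternatively, pass an account key
)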
Then, upload a blob to the container with the ContainerClient:
file_path = "hello.txt"
with open(file_path, "rb") as f:
container_client.upload_blob(name=file_name, data=f.read())
Create an online endpoint
The following code creates an online endpoint without specifying a deployment.
Warning
The identity for an endpoint is immutable. During endpoint creation, you can associate it with a system-assigned identity (default) or a user-assigned identity. You can't change the identity after the endpoint has been created.
az ml online-endpoint create --name $ENDPOINT_NAME -f endpoints/online/managed/managed-identities/1-uai-create-endpoint.yml --set identity.user_assigned_identities[0].resource_id=$uai_id
Check the status of the endpoint with the az ml online-endpoint show command.
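With the Python SDK, a sketch of the equivalent calls (assuming the endpoint object defined earlier) looks like this:

# Create the endpoint; for a system-assigned identity, this call also creates the identity.
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

# Check the endpoint's status and identity.
created_endpoint = ml_client.online_endpoints.get(name=endpoint_name)
print(created_endpoint.provisioning_state)
print(created_endpoint.identity)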
Online endpoints require the Azure Container Registry pull permission (AcrPull) on the container registry and the Storage Blob Data Reader permission on the default datastore of the workspace.
You can give the online endpoint access to your storage via its system-assigned managed identity, or you can give the user-assigned managed identity permission to access the storage account created in the previous section.
Retrieve the system-assigned managed identity that was created for your endpoint.
system_identity=`az ml online-endpoint show --name $ENDPOINT_NAME --query "identity.principal_id" -o tsv`
From here, you can give the system-assigned managed identity permission to access your storage.
az role assignment create --assignee-object-id $system_identity --assignee-principal-type ServicePrincipal --role "Storage Blob Data Reader" --scope $storage_id
Retrieve the client ID and principal ID of the user-assigned managed identity:
uai_clientid=`az identity list --query "[?name=='$UAI_NAME'].clientId" -o tsv`
uai_principalid=`az identity list --query "[?name=='$UAI_NAME'].principalId" -o tsv`
Retrieve the user-assigned managed identity ID.
uai_id=`az identity list --query "[?name=='$UAI_NAME'].id" -o tsv`
Get the container registry associated with the workspace.
container_registry=`az ml workspace show --name $WORKSPACE --query container_registry -o tsv`
Retrieve the default storage of the workspace.
storage_account=`az ml workspace show --name $WORKSPACE --query storage_account -o tsv`
Give the user-assigned managed identity permission to access the storage account.
az role assignment create --assignee-object-id $uai_principalid --assignee-principal-type ServicePrincipal --role "Storage Blob Data Reader" --scope $storage_id
Give the user-assigned managed identity permission to pull images from the container registry.
az role assignment create --assignee-object-id $uai_principalid --assignee-principal-type ServicePrincipal --role "AcrPull" --scope $container_registry
Give the user-assigned managed identity permission to access the default workspace storage.
az role assignment create --assignee-object-id $uai_principalid --assignee-principal-type ServicePrincipal --role "Storage Blob Data Reader" --scope $storage_account
First, make an AuthorizationManagementClient to list role definitions:
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.v2018_01_01_preview.models import RoleDefinition
import uuid
role_definition_client = AuthorizationManagementClient(
credential=credential,
subscription_id=subscription_id,
api_version="2018-01-01-preview",
)
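The role-assignment calls that follow also need a client for creating role assignments, plus the principal ID of the endpoint's system-assigned identity. Here's a sketch, in which the role-assignment API version is an assumption:

from azure.mgmt.authorization.v2018_09_01_preview.models import (
    RoleAssignmentCreateParameters,
)

role_assignment_client = AuthorizationManagementClient(
    credential=credential,
    subscription_id=subscription_id,
    api_version="2018-09-01-preview",
)

# Read the principal ID of the system-assigned identity back from the created endpoint.
endpoint = ml_client.online_endpoints.get(name=endpoint_name)
system_principal_id = endpoint.identity.principal_id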
Next, assign the Storage Blob Data Reader role to the endpoint. The role definition is retrieved by name and passed along with the Principal ID of the endpoint. The role is applied at the scope of the storage account created above and allows the endpoint to read the file.
role_name = "Storage Blob Data Reader"
scope = storage_account.id
role_defs = role_definition_client.role_definitions.list(scope=scope)
role_def = next((r for r in role_defs if r.role_name == role_name))
role_assignment_client.role_assignments.create(
scope=scope,
role_assignment_name=str(uuid.uuid4()),
parameters=RoleAssignmentCreateParameters(
role_definition_id=role_def.id, principal_id=system_principal_id
),
)
First, make an AuthorizationManagementClient to list role definitions:
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.v2018_01_01_preview.models import RoleDefinition
import uuid
role_definition_client = AuthorizationManagementClient(
credential=credential,
subscription_id=subscription_id,
api_version="2018-01-01-preview",
)
Then, get the principal ID and client ID of the user-assigned managed identity. To assign roles, you only need the principal ID. However, you use the client ID to fill the UAI_CLIENT_ID placeholder environment variable before creating the deployment.
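Here's a sketch of those lookups, assuming the azure-mgmt-msi client from the earlier sketch, together with the role-assignment client that the following calls use (the API version is an assumption):

from azure.mgmt.authorization.v2018_09_01_preview.models import (
    RoleAssignmentCreateParameters,
)

# Read the user-assigned identity back and capture its IDs.
uai_identity = msi_client.user_assigned_identities.get(
    resource_group_name=resource_group, resource_name=uai_name
)
uai_principal_id = uai_identity.principal_id
uai_client_id = uai_identity.client_id

role_assignment_client = AuthorizationManagementClient(
    credential=credential,
    subscription_id=subscription_id,
    api_version="2018-09-01-preview",
)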
Next, assign the Storage Blob Data Reader role to the endpoint. The role definition is retrieved by name and passed along with the principal ID of the endpoint. The role is applied at the scope of the storage account created above to allow the endpoint to read the file.
role_name = "Storage Blob Data Reader"
scope = storage_account.id
role_defs = role_definition_client.role_definitions.list(scope=scope)
role_def = next((r for r in role_defs if r.role_name == role_name))
role_assignment_client.role_assignments.create(
scope=scope,
role_assignment_name=str(uuid.uuid4()),
parameters=RoleAssignmentCreateParameters(
role_definition_id=role_def.id, principal_id=uai_principal_id
),
)
For the next two permissions, you need the workspace and container registry objects:
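For example, you can read both from the workspace entity returned by ml_client; the container_registry and storage_account attributes are the ARM resource IDs that the role assignments below use as scopes:

workspace = ml_client.workspaces.get(workspace_name)
container_registry = workspace.container_registry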
Next, assign the AcrPull role to the user-assigned identity. This role allows images to be pulled from an Azure Container Registry. The scope is applied at the level of the container registry associated with the workspace.
role_name = "AcrPull"
scope = container_registry
role_defs = role_definition_client.role_definitions.list(scope=scope)
role_def = next((r for r in role_defs if r.role_name == role_name))
role_assignment_client.role_assignments.create(
scope=scope,
role_assignment_name=str(uuid.uuid4()),
parameters=RoleAssignmentCreateParameters(
role_definition_id=role_def.id, principal_id=uai_principal_id
),
)
Finally, assign the Storage Blob Data Reader role to the endpoint at the scope of the workspace's default storage account. Together with the earlier assignment, this lets the endpoint read blobs in the workspace storage account as well as in the newly created storage account.
The role has the same name and capabilities as the first role assigned above; however, it's applied at a different scope and has a different ID.
role_name = "Storage Blob Data Reader"
scope = workspace.storage_account
role_defs = role_definition_client.role_definitions.list(scope=scope)
role_def = next((r for r in role_defs if r.role_name == role_name))
role_assignment_client.role_assignments.create(
scope=scope,
role_assignment_name=str(uuid.uuid4()),
parameters=RoleAssignmentCreateParameters(
role_definition_id=role_def.id, principal_id=uai_principal_id
),
)
Scoring script to access Azure resources
Refer to the following script to understand how to use your identity token to access Azure resources; in this scenario, the script reads the file in the storage account created in previous sections.
import os
import logging
import json
import numpy
import joblib
import requests
from azure.identity import ManagedIdentityCredential
from azure.storage.blob import BlobClient
def access_blob_storage_sdk():
credential = ManagedIdentityCredential(client_id=os.getenv("UAI_CLIENT_ID"))
storage_account = os.getenv("STORAGE_ACCOUNT_NAME")
storage_container = os.getenv("STORAGE_CONTAINER_NAME")
file_name = os.getenv("FILE_NAME")
blob_client = BlobClient(
account_url=f"https://{storage_account}.blob.core.chinacloudapi.cn/",
container_name=storage_container,
blob_name=file_name,
credential=credential,
)
blob_contents = blob_client.download_blob().content_as_text()
logging.info(f"Blob contains: {blob_contents}")
def get_token_rest():
"""
Retrieve an access token via REST.
"""
access_token = None
msi_endpoint = os.environ.get("MSI_ENDPOINT", None)
msi_secret = os.environ.get("MSI_SECRET", None)
    # If UAI_CLIENT_ID is provided, assume the endpoint was created with a user-assigned identity;
    # otherwise, assume a system-assigned identity deployment.
client_id = os.environ.get("UAI_CLIENT_ID", None)
if client_id is not None:
token_url = (
msi_endpoint + f"?clientid={client_id}&resource=https://storage.azure.com/"
)
else:
token_url = msi_endpoint + f"?resource=https://storage.azure.com/"
logging.info("Trying to get identity token...")
headers = {"secret": msi_secret, "Metadata": "true"}
resp = requests.get(token_url, headers=headers)
resp.raise_for_status()
access_token = resp.json()["access_token"]
logging.info("Retrieved token successfully.")
return access_token
def access_blob_storage_rest():
"""
Access a blob via REST.
"""
logging.info("Trying to access blob storage...")
storage_account = os.environ.get("STORAGE_ACCOUNT_NAME")
storage_container = os.environ.get("STORAGE_CONTAINER_NAME")
file_name = os.environ.get("FILE_NAME")
logging.info(
f"storage_account: {storage_account}, container: {storage_container}, filename: {file_name}"
)
token = get_token_rest()
blob_url = f"https://{storage_account}.blob.core.chinacloudapi.cn/{storage_container}/{file_name}?api-version=2019-04-01"
auth_headers = {
"Authorization": f"Bearer {token}",
"x-ms-blob-type": "BlockBlob",
"x-ms-version": "2019-02-02",
}
resp = requests.get(blob_url, headers=auth_headers)
resp.raise_for_status()
logging.info(f"Blob contains: {resp.text}")
def init():
global model
# AZUREML_MODEL_DIR is an environment variable created during deployment.
# It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)
# For multiple models, it points to the folder containing all deployed models (./azureml-models)
# Please provide your model's folder name if there is one
model_path = os.path.join(
os.getenv("AZUREML_MODEL_DIR"), "model/sklearn_regression_model.pkl"
)
# deserialize the model file back into a sklearn model
model = joblib.load(model_path)
logging.info("Model loaded")
# Access Azure resource (Blob storage) using system assigned identity token
access_blob_storage_rest()
access_blob_storage_sdk()
logging.info("Init complete")
# note you can pass in multiple rows for scoring
def run(raw_data):
logging.info("Request received")
data = json.loads(raw_data)["data"]
data = numpy.array(data)
result = model.predict(data)
logging.info("Request processed")
return result.tolist()
This deployment can take approximately 8 to 14 minutes, depending on whether the underlying environment or image is being built for the first time. Subsequent deployments that use the same environment finish more quickly.
az ml online-deployment create --endpoint-name $ENDPOINT_NAME --all-traffic --name blue --file endpoints/online/managed/managed-identities/2-sai-deployment.yml --set environment_variables.STORAGE_ACCOUNT_NAME=$STORAGE_ACCOUNT_NAME environment_variables.STORAGE_CONTAINER_NAME=$STORAGE_CONTAINER_NAME environment_variables.FILE_NAME=$FILE_NAME
Note
The value of the --name argument may override the name key inside the YAML file.
Check the status of the deployment.
az ml online-deployment show --endpoint-name $ENDPOINT_NAME --name blue
The init method in the scoring script reads the file from your storage account using the system-assigned managed identity token.
To check the init method output, see the deployment log with the following code.
# Check deployment logs to confirm blob storage file contents read operation success.
az ml online-deployment get-logs --endpoint-name $ENDPOINT_NAME --name blue
az ml online-deployment create --endpoint-name $ENDPOINT_NAME --all-traffic --name blue --file endpoints/online/managed/managed-identities/2-uai-deployment.yml --set environment_variables.STORAGE_ACCOUNT_NAME=$STORAGE_ACCOUNT_NAME environment_variables.STORAGE_CONTAINER_NAME=$STORAGE_CONTAINER_NAME environment_variables.FILE_NAME=$FILE_NAME environment_variables.UAI_CLIENT_ID=$uai_clientid
Note
The value of the --name argument may override the name key inside the YAML file.
Once the command executes, you can check the status of the deployment.
When your deployment completes, the model, the environment, and the endpoint are registered to your Azure Machine Learning workspace.
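If you're deploying with the Python SDK rather than the CLI, a sketch of the equivalent calls looks like the following; it assumes the endpoint and deployment objects defined earlier, and that you replaced the UAI_CLIENT_ID placeholder when using a user-assigned identity:

# Create (or update) the deployment and route all traffic to it.
ml_client.online_deployments.begin_create_or_update(deployment).result()
endpoint.traffic = {"blue": 100}
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

# Check the deployment status, then confirm in the logs that init() read the blob.
print(ml_client.online_deployments.get(name="blue", endpoint_name=endpoint_name).provisioning_state)
print(ml_client.online_deployments.get_logs(name="blue", endpoint_name=endpoint_name, lines=50))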
Test the endpoint
Once your online endpoint is deployed, test and confirm its operation with a request. Details of inferencing vary from model to model; for this guide, the request is a JSON payload with a data key that contains rows of numeric input features for the sample regression model.
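For example, with the Python SDK you might invoke the endpoint with a small JSON file; the payload shape below mirrors what the run() function in the scoring script expects, and the specific values and file path are only illustrative assumptions:

import json

# A hypothetical request body: a "data" key with rows of numeric features for the sample regression model.
sample_request = {"data": [[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]]}
with open("sample-request.json", "w") as f:
    json.dump(sample_request, f)

response = ml_client.online_endpoints.invoke(
    endpoint_name=endpoint_name,
    deployment_name="blue",
    request_file="sample-request.json",
)
print(response)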
If you don't plan to continue using the deployed online endpoint and storage, delete them to reduce costs. When you delete the endpoint, all of its associated deployments are deleted as well.