Secure an Azure Machine Learning inferencing environment with virtual networks
In this article, you learn how to secure inferencing environments (online endpoints) with a virtual network in Azure Machine Learning. There are two inference options that can be secured using a VNet:
Azure Machine Learning managed online endpoints
Tip
Azure recommends using an Azure Machine Learning managed virtual network instead of the steps in this article when securing managed online endpoints. With a managed virtual network, Azure Machine Learning handles network isolation for your workspace and managed computes. You can also add private endpoints for resources the workspace needs, such as an Azure Storage account. For more information, see Workspace managed network isolation.
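For reference, a managed virtual network is enabled through the workspace configuration. The following workspace YAML is an illustrative sketch only; the workspace name, location, and file name are placeholders, and `allow_internet_outbound` is one of the supported isolation modes:

```yaml
# workspace.yml -- illustrative sketch; replace name/location with your own values
$schema: https://azuremlschemas.azureedge.net/latest/workspace.schema.json
name: myworkspace
location: eastus
managed_network:
  isolation_mode: allow_internet_outbound
```

You could then create the workspace with a command such as `az ml workspace create --resource-group myresourcegroup --file workspace.yml`. See the Workspace managed network isolation article for the authoritative configuration options.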
Azure Kubernetes Service
Tip
This article is part of a series on securing an Azure Machine Learning workflow. See the other articles in this series:
- Virtual network overview
- Secure the workspace resources
- Secure the training environment
- Enable studio functionality
- Use custom DNS
- Use a firewall
For a tutorial on creating a secure workspace, see Tutorial: Create a secure workspace, Bicep template, or Terraform template.
Prerequisites
Read the Network security overview article to understand common virtual network scenarios and overall virtual network architecture.
An existing virtual network and subnet that are used to secure the Azure Machine Learning workspace.
To deploy resources into a virtual network or subnet, your user account must have permissions to the following actions in Azure role-based access control (Azure RBAC):
- "Microsoft.Network/*/read" on the virtual network resource. This permission isn't needed for Azure Resource Manager (ARM) template deployments.
- "Microsoft.Network/virtualNetworks/join/action" on the virtual network resource.
- "Microsoft.Network/virtualNetworks/subnets/join/action" on the subnet resource.
For more information on Azure RBAC with networking, see the Networking built-in roles article.
- If using Azure Kubernetes Service (AKS), you must have an existing AKS cluster secured as described in the Secure Azure Kubernetes Service inference environment article.
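If no built-in role fits, the three network actions listed above can be granted through a custom role. The following role definition is a hypothetical sketch (the role name and scope placeholder are assumptions, not values from this article):

```json
{
  "Name": "Virtual Network Joiner (example)",
  "IsCustom": true,
  "Description": "Can read virtual network resources and join virtual networks and subnets.",
  "Actions": [
    "Microsoft.Network/*/read",
    "Microsoft.Network/virtualNetworks/join/action",
    "Microsoft.Network/virtualNetworks/subnets/join/action"
  ],
  "NotActions": [],
  "AssignableScopes": [ "/subscriptions/<subscription-id>" ]
}
```

Saved as a file, this definition could be registered with `az role definition create --role-definition <file>.json` and then assigned to the user account that performs the deployment.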
Secure managed online endpoints
For information on securing managed online endpoints, see the Use network isolation with managed online endpoints article.
Secure Azure Kubernetes Service online endpoints
To use an Azure Kubernetes Service cluster for secure inference, use the following steps:
Create or configure a secure Kubernetes inferencing environment.
Deploy Azure Machine Learning extension.
You can deploy a model with a Kubernetes online endpoint by using the CLI v2, the Python SDK v2, or the Studio UI:
- CLI v2 - https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online/kubernetes
- Python SDK v2 - https://github.com/Azure/azureml-examples/tree/main/sdk/python/endpoints/online/kubernetes
- Studio UI - Follow the steps in managed online endpoint deployment through the Studio. After you enter the Endpoint name, select Kubernetes as the compute type instead of Managed.
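As a minimal CLI v2 sketch, a Kubernetes online endpoint is defined in YAML and points at the AKS cluster that's attached to the workspace as a compute target. The endpoint and compute names below are placeholders, not values from this article:

```yaml
# endpoint.yml -- illustrative sketch; replace names with your own values
$schema: https://azuremlschemas.azureedge.net/latest/kubernetesOnlineEndpoint.schema.json
name: my-kubernetes-endpoint
compute: azureml:my-aks-compute
auth_mode: key
```

The endpoint could then be created with a command such as `az ml online-endpoint create --file endpoint.yml --resource-group <rg> --workspace-name <workspace>`; see the linked CLI v2 examples for complete, tested deployment files.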
Limit outbound connectivity from the virtual network
If you don't want to use the default outbound rules and instead want to limit the outbound access of your virtual network, you must allow access to Azure Container Registry. For example, make sure that your network security groups (NSGs) contain a rule that allows access to the AzureContainerRegistry.RegionName service tag, where RegionName is the name of an Azure region.
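As a sketch, such an NSG rule could be created with the Azure CLI. The resource group, NSG name, rule name, priority, and region below are assumptions for illustration:

```
# Illustrative only -- substitute your own resource group, NSG name, and region
az network nsg rule create \
  --resource-group myresourcegroup \
  --nsg-name mynsg \
  --name AllowAzureContainerRegistry \
  --priority 200 \
  --direction Outbound \
  --access Allow \
  --protocol Tcp \
  --destination-port-ranges 443 \
  --destination-address-prefixes AzureContainerRegistry.EastUS
```

The rule uses the regional service tag as the destination so that outbound HTTPS traffic to Azure Container Registry in that region is permitted while other outbound traffic remains subject to your more restrictive rules.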
Next steps
This article is part of a series on securing an Azure Machine Learning workflow. See the other articles in this series: