Provision a service principal by using Terraform

Note

To provision a Microsoft Entra ID managed service principal by using the Azure portal and the Azure Databricks user interface instead, see Manage service principals.

Microsoft Entra ID managed service principals differ from managed identities for Azure resources, which Azure Databricks also supports for authentication. To learn how to use managed identities for Azure resources instead of Microsoft Entra ID managed service principals for Azure Databricks authentication, see Set up and use Azure managed identities authentication for Azure Databricks automation.

A service principal is an identity for automated tools and systems like scripts, apps, and CI/CD platforms. Databricks recommends using a service principal and its OAuth token or personal access token instead of your Azure Databricks user account and personal access token. Benefits include:

  • Granting and restricting access to resources independently of a user.
  • Enabling users to better protect their own access tokens.
  • Disabling or deleting a service principal without affecting other users.
  • Removing a user when they leave the organization without impacting any service principal.

Follow these instructions to use Terraform to create a Microsoft Entra ID managed service principal in Azure, use the Databricks Terraform provider to link the Microsoft Entra ID service principal to your Azure Databricks workspace, and then optionally create a Microsoft Entra ID token or Azure Databricks OAuth token for the service principal.

Requirements

Step 1: Create the service principal

If you already have a Microsoft Entra ID managed service principal available, skip ahead to Step 2.

  1. In your terminal, create an empty directory and then switch to it. (Each separate set of Terraform configuration files must be in its own directory.) For example:

    mkdir terraform_azure_service_principal_demo && cd terraform_azure_service_principal_demo
    
  2. In this empty directory, create a file named main.tf. Add the following content to this file, and then save the file.

    variable "azure_service_principal_display_name" {
      description = "A display name for the <entra-service-principal>."
      type        = string
    }
    
    terraform {
      required_providers {
        azuread = {
          source  = "hashicorp/azuread"
        }
      }
    }
    
    provider "azurerm" {
      features {}
    }
    
    resource "azuread_application" "this" {
      display_name = var.azure_service_principal_display_name
    }
    
    resource "azuread_service_principal" "this" {
      application_id = azuread_application.this.application_id
    }
    
    resource "time_rotating" "month" {
      rotation_days = 30
    }
    
    resource "azuread_service_principal_password" "this" {
      service_principal_id = azuread_service_principal.this.object_id
      rotate_when_changed  = { rotation = time_rotating.month.id }
    }
    
    output "azure_client_id" {
      description = "The Azure AD service principal's application (client) ID."
      value       = azuread_application.this.application_id
    }
    
    output "azure_client_secret" {
      description = "The Azure AD service principal's client secret value."
      value       = azuread_service_principal_password.this.value
      sensitive   = true
    }
    
  3. In the same directory, create a file named terraform.tfvars. Add the following content to this file, replacing the following value, and then save the file:

    • Replace the azure_service_principal_display_name value with a display name for the Microsoft Entra ID service principal.
    azure_service_principal_display_name = "<A display name for the <entra-service-principal>>"
    
  4. Initialize the working directory containing the main.tf file by running the terraform init command. For more information, see Command: init on the Terraform website.

    terraform init
    
  5. Check whether there are any syntax errors in the configuration by running the terraform validate command. For more information, see Command: validate on the Terraform website.

    terraform validate
    
  6. Apply the changes required to reach the desired state of the configuration by running the terraform apply command. For more information, see Command: apply on the Terraform website.

    terraform apply
    

After you create the service principal, copy the azure_client_id and azure_client_secret output values, as you will need them later.

To get the azure_client_secret value, see the value of outputs.azure_client_secret.value in the terraform.tfstate file, which is in the working directory containing the main.tf file.
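
Because the azure_client_secret output is marked as sensitive, Terraform redacts it in the terraform apply output. As an alternative to reading the state file, you can print the value by running the terraform output command from the same working directory, for example:

    terraform output -raw azure_client_secret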

Step 2: Add the service principal to the Azure Databricks workspace

Note

The following content adds a service principal at the Azure Databricks workspace level. If your Azure Databricks workspace is enabled for identity federation, then the following content also automatically synchronizes the service principal to the related Azure Databricks account.

  1. In your terminal, create an empty directory and then switch to it. Each separate set of Terraform configuration files must be in its own directory. For example:

    mkdir terraform_databricks_service_principal_demo && cd terraform_databricks_service_principal_demo
    
  2. In this empty directory, create a file named main.tf. Add the following content to this file, and then save the file.

    variable "databricks_host" {
      description = "The Azure Databricks workspace URL."
      type = string
    }
    
    variable "azure_client_id" {
      type        = string
      description = "The application (client) ID of the <entra-service-principal> to link to an Azure Databricks service principal. This application (client) ID will be the application ID of the Azure Databricks service principal."
    }
    
    variable "databricks_service_principal_display_name" {
      type        = string
      description = "A workspace display name for the Azure Databricks service principal."
    }
    
    terraform {
      required_providers {
        databricks = {
          source = "databricks/databricks"
        }
      }
    }
    
    provider "databricks" {
      host = var.databricks_host
    }
    
    resource "databricks_service_principal" "sp" {
      application_id = var.azure_client_id
      display_name   = var.databricks_service_principal_display_name
    }
    
    output "databricks_service_principal_application_id" {
      value       = databricks_service_principal.sp.application_id
      description = "Application ID of the Azure Databricks service principal."
    }
    
    output "databricks_service_principal_display_name" {
      value       = databricks_service_principal.sp.display_name
      description = "Workspace display name of the Azure Databricks service principal."
    }
    
    output "databricks_workspace_service_principal_id" {
      value       = databricks_service_principal.sp.id
      description = "Workspace ID of the Azure Databricks service principal. This ID is generated by Azure Databricks for this workspace."
    }
    

    Note

    To add this service principal to groups, and to add entitlements to this service principal, see databricks_service_principal on the Terraform website.
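
    For example, a minimal sketch that adds this service principal to an existing workspace group and grants it entitlements might look like the following, appended to the same main.tf. The group name My Group is a placeholder for illustration; replace it with a group that exists in your workspace.

    # Placeholder: look up an existing workspace group by its display name.
    data "databricks_group" "my_group" {
      display_name = "My Group"
    }

    # Add the service principal to that group.
    resource "databricks_group_member" "sp_in_group" {
      group_id  = data.databricks_group.my_group.id
      member_id = databricks_service_principal.sp.id
    }

    # Grant workspace entitlements directly to the service principal.
    resource "databricks_entitlements" "sp_entitlements" {
      service_principal_id  = databricks_service_principal.sp.id
      allow_cluster_create  = true
      databricks_sql_access = true
      workspace_access      = true
    }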

  3. In the same directory, create a file named terraform.tfvars. Add the following content to this file, replacing the following values, and then save the file:

    • Replace the databricks_host value with the URL of the Azure Databricks workspace.
    • Replace the azure_client_id value with the azure_client_id value from Step 1.
    • Replace the databricks_service_principal_display_name value with a workspace display name for the Azure Databricks service principal.
    databricks_host                           = "<The Azure Databricks workspace URL, starting with https://>"
    azure_client_id                           = "<The application (client) ID of the Microsoft Entra ID service principal>"
    databricks_service_principal_display_name = "<A workspace display name for the Azure Databricks service principal>"
    
  4. Initialize the working directory containing the main.tf file by running the terraform init command. For more information, see Command: init on the Terraform website.

    terraform init
    
  5. Check whether there are any syntax errors in the configuration by running the terraform validate command. For more information, see Command: validate on the Terraform website.

    terraform validate
    
  6. Apply the changes required to reach the desired state of the configuration by running the terraform apply command. For more information, see Command: apply on the Terraform website.

    terraform apply
    

After you create the service principal, copy the databricks_service_principal_application_id output value, as you will need it to create a Microsoft Entra ID token for the service principal.

(Optional) Step 3: Create a Microsoft Entra ID access token for a Microsoft Entra ID service principal

Databricks does not recommend that you create Microsoft Entra ID tokens for Microsoft Entra ID service principals manually. This is because each Microsoft Entra ID token is short-lived, typically expiring within one hour, after which you must manually generate a replacement. Instead, use one of the participating tools or SDKs that implement the Databricks client unified authentication standard. These tools and SDKs automatically generate and replace expired Microsoft Entra ID tokens for you by using Microsoft Entra ID service principal authentication.

If you need to manually create a Microsoft Entra ID token for a Microsoft Entra ID service principal, gather the following information, and then follow the instructions in Get a Microsoft Entra ID access token with the Microsoft identity platform REST API or Get a Microsoft Entra ID access token with the Azure CLI (an illustrative request follows the list below):

  • The tenant ID for your Microsoft Entra ID service principal, which you will use as the Tenant ID / Directory (tenant) ID / <tenant-id> in the instructions.
  • The databricks_service_principal_application_id value from Step 2, which you will use as the Client ID / Application (client) ID / <client-id> in the instructions.
  • The azure_client_secret value from Step 1, which you will use as the Client secret / Value / <client-secret> in the instructions.
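
For example, with the Microsoft identity platform REST API, the token request looks similar to the following, where 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the programmatic ID for the Azure Databricks resource and the placeholders correspond to the values in the preceding list. The Microsoft Entra ID token is returned in the access_token field of the JSON response.

    curl -X POST -H 'Content-Type: application/x-www-form-urlencoded' \
      https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token \
      -d 'client_id=<client-id>' \
      -d 'grant_type=client_credentials' \
      -d 'scope=2ff814a6-3304-4ab8-85cb-cd0e6f879c1d%2F.default' \
      -d 'client_secret=<client-secret>'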

After you create the Microsoft Entra ID token, copy the access_token value, as you will need to provide it to your script, app, or system.

(Optional) Step 4: Create an Azure Databricks OAuth token for a Microsoft Entra ID service principal

Databricks does not recommend that you create Azure Databricks OAuth tokens for Microsoft Entra ID managed service principals manually. This is because each Azure Databricks OAuth token is short-lived, typically expiring within one hour, after which you must manually generate a replacement. Instead, use one of the participating tools or SDKs that implement the Databricks client unified authentication standard. These tools and SDKs automatically generate and replace expired Azure Databricks OAuth tokens for you by using OAuth machine-to-machine (M2M) authentication. See Authenticate access to Azure Databricks with a service principal using OAuth (OAuth M2M).

If you need to manually create an Azure Databricks OAuth token for a Microsoft Entra ID service principal, see Manually generate and use access tokens for OAuth M2M authentication.
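
For reference, the manual flow on that page requests the token from your workspace's OAuth token endpoint. A minimal sketch, assuming the service principal has an OAuth secret (see the linked page for how to create one), might look similar to the following; the Azure Databricks OAuth token is returned in the access_token field of the JSON response.

    curl --request POST \
      --url https://<databricks-instance>/oidc/v1/token \
      --user '<client-id>:<client-secret>' \
      --data 'grant_type=client_credentials&scope=all-apis'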