Quickstart: Get started with Azure Data Factory
APPLIES TO: Azure Data Factory | Azure Synapse Analytics
Tip
Try out Data Factory in Microsoft Fabric, an all-in-one analytics solution for enterprises. Microsoft Fabric covers everything from data movement to data science, real-time analytics, business intelligence, and reporting. Learn how to start a new trial for free!
Welcome to Azure Data Factory! This quickstart helps you create your first data factory and pipeline in under five minutes. The ARM template below creates and configures everything you need to try it out. Then you only need to navigate to your demo data factory and make one more click to trigger the pipeline, which moves some sample data from one Azure Blob Storage folder to another.
Prerequisites
If you don't have an Azure subscription, create a free account before you begin.
Try your first demo with one click
In this first demo scenario, you use the Copy activity in a data factory to copy an Azure blob named moviesDB2.csv from an input folder in Azure Blob Storage to an output folder. In a real-world scenario, this copy operation could be between any of the many supported data sources and sinks available in the service, and could also involve transformations of the data.
Try it now with one click! After you select the button below, the following objects are created in Azure:
- A data factory account
- A pipeline within the data factory with one copy activity
- An Azure Blob Storage account with moviesDB2.csv uploaded into an input folder as the source
- A linked service to connect the data factory to the Azure Blob Storage account
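The pipeline the template creates contains a single copy activity. If you'd like to see roughly what such a pipeline looks like when defined in code, here's a minimal sketch using the azure-mgmt-datafactory Python SDK. The resource group, factory, dataset, and pipeline names are placeholders rather than the names the template actually uses, and the dataset definitions themselves are omitted.

```python
# A minimal sketch (not the template itself): defining a pipeline with one
# Copy activity through the azure-mgmt-datafactory SDK. All names below are
# placeholders; the demo template chooses its own names.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink, BlobSource, CopyActivity, DatasetReference, PipelineResource
)

subscription_id = "<your-subscription-id>"
resource_group = "<your-resource-group>"
factory_name = "<your-data-factory>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Copy from an input blob dataset to an output blob dataset.
# (The dataset and linked service definitions are omitted here.)
copy_activity = CopyActivity(
    name="CopyMoviesDB2",
    inputs=[DatasetReference(reference_name="InputDataset")],
    outputs=[DatasetReference(reference_name="OutputDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

adf_client.pipelines.create_or_update(
    resource_group,
    factory_name,
    "CopyPipeline",
    PipelineResource(activities=[copy_activity]),
)
```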
Step 1: Select the button to start
Select the button below to try it out! (If you already selected the one above, you don't need to do it again.)
You're redirected to the configuration page shown in the image below to deploy the template. Here, you only need to create a new resource group. (You can leave all the other values with their defaults.) Then select Review + create, and select Create to deploy the resources.
Note
The user deploying the template needs to assign a role to a managed identity. This requires permissions that can be granted through the Owner, User Access Administrator, or Managed Identity Operator roles.
All of the resources referenced above will be created in the new resource group, so you can easily clean them up after trying the demo.
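If you'd rather script the deployment than use the portal button, the following minimal sketch uses the azure-mgmt-resource Python SDK to create a resource group and deploy an ARM template into it. The template URI, resource group name, and location are placeholders; use the template linked from the button above.

```python
# A minimal sketch of deploying an ARM template from a script instead of the
# portal button. The template URI, resource group name, and location are
# placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import (
    Deployment, DeploymentProperties, TemplateLink
)

subscription_id = "<your-subscription-id>"
resource_group = "ADFQuickStartRG"  # new resource group for the demo

client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

# Create the resource group that will hold all of the demo resources.
client.resource_groups.create_or_update(resource_group, {"location": "eastus"})

# Deploy the quickstart template (placeholder URI) into that group.
poller = client.deployments.begin_create_or_update(
    resource_group,
    "adf-quickstart-deployment",
    Deployment(
        properties=DeploymentProperties(
            mode="Incremental",
            template_link=TemplateLink(uri="<quickstart-template-uri>"),
        )
    ),
)
poller.wait()  # block until the deployment completes
```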
Step 2: Review deployed resources
Select Go to resource group after your deployment is complete.
In the resource group, you will see the new data factory, Azure Blob Storage account, and managed identity that were created by the deployment.
Select the data factory in the resource group to view it. Then select the Launch Studio button to continue.
Select the Author tab, and then select the pipeline created by the template. Then check the source data by selecting Open.
In the source dataset that appears, select Browse, and note the moviesDB2.csv file, which was already uploaded into the input folder.
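You can also verify the uploaded source file from a script. This minimal sketch uses the azure-storage-blob SDK to list the blobs under the input path; the storage account and container names are placeholders, since the template generates its own.

```python
# A minimal sketch: list the blobs the template uploaded, to confirm that
# moviesDB2.csv exists. Account and container names are placeholders.
# Your identity needs a data-plane role on the account, such as
# Storage Blob Data Reader.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://<your-storage-account>.blob.core.windows.net"
service = BlobServiceClient(account_url=account_url,
                            credential=DefaultAzureCredential())

# The demo stores the source file under an "input" folder; the actual
# container name comes from the template.
container = service.get_container_client("<your-container>")
for blob in container.list_blobs(name_starts_with="input/"):
    print(blob.name)  # expect something like input/moviesDB2.csv
```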
Step 3: Trigger the demo pipeline to run
- Select Add Trigger, and then Trigger Now.
- In the right pane under Pipeline run, select OK.
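If you prefer to trigger the run programmatically instead of from the Studio, here's a minimal sketch using the azure-mgmt-datafactory SDK. The factory and pipeline names are placeholders for the names the template generated in your resource group.

```python
# A minimal sketch: trigger the demo pipeline from a script instead of the
# Studio UI. Factory and pipeline names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

subscription_id = "<your-subscription-id>"
resource_group = "<your-resource-group>"
factory_name = "<your-data-factory>"
pipeline_name = "<your-pipeline>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Start a pipeline run and keep the run ID for monitoring.
run = adf_client.pipelines.create_run(
    resource_group, factory_name, pipeline_name, parameters={}
)
print(f"Started pipeline run: {run.run_id}")
```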
Monitor the pipeline
Select the Monitor tab.
You can see an overview of your pipeline runs in the Monitor tab, such as the run start time, status, and more.
In this quickstart, the pipeline has only one activity type: Copy. Select the pipeline name to see the details of the copy activity's run results.
Select Details, and the detailed copy process is displayed. In the results, the data read and written sizes are the same, and one file was read and one file was written, which confirms that all the data was successfully copied to the destination.
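The same run details are available programmatically. This minimal sketch, assuming you kept the run ID returned when the run was triggered, polls the pipeline run status and then queries the copy activity's results; the resource names are again placeholders.

```python
# A minimal sketch: check the pipeline run and copy activity results from a
# script, assuming run_id was captured when the run was triggered.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

subscription_id = "<your-subscription-id>"
resource_group = "<your-resource-group>"
factory_name = "<your-data-factory>"
run_id = "<run-id-from-the-trigger-step>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Overall run status (Queued, InProgress, Succeeded, Failed, ...).
pipeline_run = adf_client.pipeline_runs.get(resource_group, factory_name, run_id)
print(f"Pipeline run status: {pipeline_run.status}")

# Per-activity results, including data read and written for the copy activity.
now = datetime.now(timezone.utc)
activity_runs = adf_client.activity_runs.query_by_pipeline_run(
    resource_group,
    factory_name,
    run_id,
    RunFilterParameters(last_updated_after=now - timedelta(hours=1),
                        last_updated_before=now + timedelta(hours=1)),
)
for activity in activity_runs.value:
    print(activity.activity_name, activity.status, activity.output)
```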
Clean up resources
You can clean up all the resources you created in this quickstart in either of two ways. You can delete the entire Azure resource group, which includes all the resources created in it. Or, if you want to keep some resources intact, browse to the resource group and delete only the specific resources you want to remove, keeping the others. For example, if you're using this template to create a data factory for use in another tutorial, you can delete the other resources and keep only the data factory.
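If you'd rather script the cleanup, this minimal sketch deletes the entire demo resource group with the azure-mgmt-resource SDK; the resource group name is a placeholder for the one you created in Step 1.

```python
# A minimal sketch: delete the demo resource group and everything in it.
# The resource group name is a placeholder for the one you created earlier.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "<your-subscription-id>"
resource_group = "ADFQuickStartRG"

client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

# begin_delete returns a poller; wait() blocks until the deletion finishes.
client.resource_groups.begin_delete(resource_group).wait()
print(f"Deleted resource group {resource_group}")
```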
Related content
In this quickstart, you created an Azure data factory containing a pipeline with a copy activity. To learn more about Azure Data Factory, continue to the article and Learn module below.