Creating and Accessing Blob Storage with Azure Data Factory: A Complete Guide
Introduction:
This guide will walk you through creating and accessing Azure Blob Storage and integrating it with Azure Data Factory to automate data pipelines. From setting up a storage account and managing containers to configuring pipelines and transferring data to an Azure SQL Database, this step-by-step tutorial ensures you gain a comprehensive understanding of the process.
Steps:
1. Open the Azure Portal and sign in to your account.

2. Navigate to Storage Accounts.

3. Click on + Create to initiate the creation of a new storage account.

4. Fill in the required fields like subscription, resource group, and region. Review all the settings before proceeding.

5. Click Create to provision the storage account and wait for the deployment to complete.
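
If you prefer to script steps 3-5 instead of using the portal, the storage account can also be provisioned with the Azure SDK for Python (azure-identity and azure-mgmt-storage). This is a minimal sketch; the subscription ID, resource group, account name, and region are placeholders you would replace with your own values.

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

# Placeholder values -- substitute your own subscription, resource group, and names.
subscription_id = "<subscription-id>"
client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

# Create a general-purpose v2 storage account; this is a long-running operation.
poller = client.storage_accounts.begin_create(
    "my-resource-group",
    "mystorageacct01",  # must be globally unique, 3-24 lowercase letters and digits
    {
        "location": "eastus",
        "kind": "StorageV2",
        "sku": {"name": "Standard_LRS"},
    },
)
account = poller.result()
print(f"Provisioned {account.name} (state: {account.provisioning_state})")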

6. Once the storage account is created, go to the resource by clicking on Go to Resource.

7. In the storage account, navigate to the Containers section and click + Container to create a new container for storing your files.

8. Click on the container you just created to access its contents.

9. Upload the desired JSON file into the container by clicking on Upload and selecting the file from your local system.

10. Ensure that the uploaded file is now listed in the container.
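
Equivalently, steps 7-10 can be scripted with the azure-storage-blob Python package. The sketch below assumes a connection string copied from the storage account's Access keys blade; the container and file names are placeholders.

from azure.storage.blob import BlobServiceClient

# Placeholder connection string -- copy yours from the storage account's Access keys blade.
conn_str = "<storage-account-connection-string>"
service = BlobServiceClient.from_connection_string(conn_str)

# Create the container (step 7); skip this call if the container already exists.
container = service.create_container("input-files")

# Upload the JSON file from the local system (step 9).
with open("sample-data.json", "rb") as data:
    container.upload_blob(name="sample-data.json", data=data, overwrite=True)

# Confirm the uploaded file is listed in the container (step 10).
for blob in container.list_blobs():
    print(blob.name)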

11. Go back to the Azure Portal and search for Azure Data Factory to open the ADF service.

12. From the ADF home screen, go to Author > Datasets. Click + New Dataset to create a new dataset for your Blob Storage.

13. Select the Azure Blob Storage dataset type, as you are working with data stored in Blob Storage.

14. Choose the data format that matches the file you uploaded, such as JSON, and click Continue.

15. Enter the necessary details for your dataset, including the file path and format settings. Select the appropriate Authentication type and specify the Storage account where the Blob Storage resides. Click Create to finalize the dataset creation.

16. Verify the settings and click OK to confirm the dataset configuration.
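
Behind the portal UI, ADF stores each dataset as a JSON definition that you can inspect from the dataset's Code view. The Python dict below sketches roughly what a JSON-format dataset on Blob Storage looks like; the dataset name, linked service name, container, and file name are hypothetical and depend on your own setup.

# Approximate shape of the dataset's Code (JSON) view; names and paths are hypothetical.
blob_json_dataset = {
    "name": "BlobJsonDataset",
    "properties": {
        "type": "Json",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLinkedService",
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "input-files",
                "fileName": "sample-data.json",
            }
        },
    },
}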


17. Navigate to the Pipelines section and click + New Pipeline to create a pipeline that will define your data flow.

18. The pipeline is created successfully and appears under the Pipelines section.

19. Inside the pipeline, add a new dataset for the destination: select Azure SQL Database as the dataset type and click Continue to set up the SQL Database dataset.

20. Provide the necessary Linked Service details for your SQL database and click Create.
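
As with the Blob dataset, the SQL-side artifacts are stored as JSON definitions. The sketch below shows roughly what the Azure SQL linked service and table dataset look like in their Code views; the names, connection string, and table are hypothetical, and in practice you would keep the password in Azure Key Vault rather than inline.

# Approximate Code (JSON) views of the Azure SQL linked service and dataset; all names are hypothetical.
azure_sql_linked_service = {
    "name": "AzureSqlDatabaseLinkedService",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": "Server=tcp:myserver.database.windows.net,1433;"
                                "Database=mydb;User ID=sqladmin;Password=<password>;"
        },
    },
}

sql_table_dataset = {
    "name": "SqlTableDataset",
    "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": {
            "referenceName": "AzureSqlDatabaseLinkedService",
            "type": "LinkedServiceReference",
        },
        "typeProperties": {"schema": "dbo", "table": "SampleData"},
    },
}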



21. After configuring the source and target datasets and the pipeline, click Publish All to save your work.
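
The pipeline that moves the data is typically a single Copy activity that reads from the Blob JSON dataset and writes to the SQL table dataset. A rough sketch of its Code (JSON) view, reusing the hypothetical dataset names from the sketches above:

# Approximate pipeline definition with one Copy activity; names are hypothetical.
copy_pipeline = {
    "name": "CopyBlobJsonToSql",
    "properties": {
        "activities": [
            {
                "name": "CopyJsonToSqlTable",
                "type": "Copy",
                "inputs": [{"referenceName": "BlobJsonDataset", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "SqlTableDataset", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "JsonSource"},
                    "sink": {"type": "AzureSqlSink"},
                },
            }
        ]
    },
}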



22. Trigger the pipeline (for example, with Debug or Trigger Now). Once it runs successfully, you can verify its functionality by querying the destination database to ensure data is being transferred properly.
a. Go to SQL databases in the Azure Portal and select the relevant database.

b. Open the query editor for that database.

c. Log in with your credentials.

d. Write a simple test query to verify data has been transferred from Blob Storage to the SQL Database. Execute the query and confirm that the expected output is returned.
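
The verification in step 22 can also be done from Python with pyodbc rather than the query editor. The server, database, credentials, and table name below are placeholders for this sketch.

import pyodbc

# Placeholder connection details -- use your own server, database, credentials, and table.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"
    "UID=sqladmin;PWD=<password>;Encrypt=yes;"
)
cursor = conn.cursor()

# Count the rows the pipeline copied into the target table.
cursor.execute("SELECT COUNT(*) FROM dbo.SampleData")
print("Rows copied:", cursor.fetchone()[0])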

Conclusion:
Integrating Azure Blob Storage with Azure Data Factory is a powerful way to manage and automate data workflows in the cloud. This guide walks you through creating a storage account, configuring containers, uploading data, and designing a pipeline to process and transfer data to Azure SQL Database. By following these steps, you can efficiently handle large-scale data integration and ensure seamless communication between your data sources and destinations.
Azure Data Factory not only simplifies the process of orchestrating data pipelines but also provides robust options for monitoring and optimizing workflows. Whether you are managing JSON files, processing transactional data, or setting up complex ETL processes, Azure’s ecosystem offers a reliable and scalable solution.
Start exploring these tools today to unlock new possibilities in data-driven operations!
We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.