Category Archives: Azure
Building the AI Bridge: How CloudFronts Helps You Connect Systems So They Talk to Each Other
When we say we're "building a bridge," what exactly isn't connected? It's AI itself and your systems. Even if your AI can reach into your systems to pull information, the results are unreliable and slow when those systems aren't properly connected.

So what does AI need to be successful? The pitfalls to avoid are disconnected systems and inconsistent data. To eliminate them, we need a "catalog" layer that houses all business data together so that a common vocabulary is established between systems. AI then pulls from this data catalog to perform agentic actions. The diagram below shows, at a high level, how this looks.

And all of this is defined by how well the integrations between these systems are established.

How Can CloudFronts Help?

CloudFronts has deep integration expertise, connecting cloud-based applications with one another. Often, we find ready-made, plug-and-play cloud integration solutions that come with hefty licensing costs that climb every few years. Using such integration tools not only affects cash flow but also adds a layer of opaqueness: we don't control the flow of the integration, and we cannot granularize it beyond what's offered. Custom integration gives you control and analytics that ready-made solutions can't. Here's a CloudFronts case study published by Microsoft, in which we connected multiple systems for our customer to drive data and insights.

To conclude, AI agents aren't optimized to work for your organization right out of the box. This disconnect needs to be engineered, just like any other implementation project today. The gap is real, and it is bridged by a data catalog (such as Unity Catalog) plus well-designed integrations. CloudFronts can help close this gap and make AI work for your organization, so you can continue to optimize cash flow against rising costs.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
When to Use Azure Data Factory vs Logic Apps in Dynamics 365 Integrations
You're integrating Dynamics 365 CRM with other systems, and you're stuck on one question: should I use Azure Data Factory or Logic Apps? Both support connectors, data transformation, and scheduling, yet they serve different purposes. When you're working on integrating Dynamics 365 with other systems, two Azure tools come up again and again: Azure Logic Apps and Azure Data Factory (ADF). I've been asked many times, "Which one should I use?", and honestly, there's no one-size-fits-all answer. Based on real-world experience integrating D365 CRM and Finance, here's how I approach choosing between Logic Apps and ADF.

When to Use Logic Apps

Azure Logic Apps is ideal when your integration involves:
1. Event-driven / real-time integration
2. REST APIs and lightweight automation
3. Business process workflows
4. Quick and visual flow creation

When to Use Azure Data Factory

Azure Data Factory is better for:
1. Large-volume, batch data movement
2. ETL / ELT scenarios
3. Integration with data lakes and warehouses
4. Advanced data flow transformation

Feature Comparison Table

| Feature | Logic Apps | Data Factory |
|---|---|---|
| Trigger on record creation/update | Yes | No (batch only) |
| Handles APIs (HTTP, REST, OData) | Excellent | Limited |
| Real-time integration | Yes | No |
| Large data volumes (batch) | Limited | Excellent |
| Data lake / warehouse integration | Basic (via connectors) | Deep support |
| Visual workflow | Visual designer | Visual (for data flows) |
| Custom code / transformation | Limited (use Azure Functions) | Strong via data flows |
| Cost for high volume | Higher (per run) | Cost-efficient for batch |

Real-World Scenarios

1. Use Logic Apps when the integration is real-time, low-volume, and API-based.
2. Use ADF when you need batch ETL pipelines, high-volume exports, or reporting pipelines.

To conclude, choose Logic Apps for real-time, low-volume, API-based workflows, and use Data Factory for batch ETL pipelines, high-volume exports, and reporting pipelines. Integrations in Dynamics 365 CRM aren't one-size-fits-all; pick the right tool based on data size, speed, and transformation needs.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Create Powerful No-Code AI Agents – Azure AI Foundry
An AI agent is a smart program that can think, make decisions, and do tasks. Sometimes it works alone, and sometimes it works with people or other agents. The main difference between an agent and a regular assistant is that agents can act on their own: you give them a goal, and they try to reach it. Agents take input, such as a message or a prompt, and respond with answers or actions; for example, they might look something up or start a process based on what you asked. Azure AI Foundry is a platform that brings all of these pieces together so you can build, train, and manage AI agents easily.

References
- What is Azure AI Foundry Agent Service? – Azure AI Foundry | Microsoft Learn
- Understanding deployment types in Azure AI Foundry Models – Azure AI Foundry | Microsoft Learn
- https://learn.microsoft.com/en-us/azure/ai-foundry/how-to/index-add

Usage

First, we create a project in Azure AI Foundry. Click Next, give your project a name, and wait until the setup finishes. Once project creation completes, we're greeted with the project screen. Click the Agents tab, then click Next to choose the model. I'm currently using GPT-4o Mini; the picker also includes descriptions of all the available models.

Then we configure the deployment details. There are multiple deployment types available: Global deployments, Data Zone deployments, and Standard deployments.

Standard deployments [Standard] follow a pay-per-use model, perfect for getting started quickly. They're best for low-to-medium usage with occasional traffic spikes; for high, steady loads, performance may vary. Provisioned deployments [ProvisionedManaged] let you pre-allocate the amount of processing power you need, measured in Provisioned Throughput Units (PTUs). Each model and version requires a different number of PTUs and offers different performance levels. Provisioned deployments ensure predictable, stable performance for large or mission-critical workloads. This is how the deployment details look for Global Standard; I'll be choosing Standard deployment for our use case. Click Deploy and wait for a few seconds.

Once the deployment is complete, you can give your agent a name and some instructions for its behavior. You should specify the tone, end goal, verbosity, and so on. You can also set the Temperature and Top P values, both of which control the randomness or creativity of the model.

Temperature controls how bold or cautious the model is. Lower temperature = safer, more predictable answers (factual Q&A, code summarization). Higher temperature = more creative or surprising answers (poetry, creative writing). Top P (nucleus sampling) controls how wide the model's word choices are. Lower Top P = only picks from the most likely words (legal or financial writing). Higher Top P = includes less likely, more diverse words (brainstorming names). The short sketch below shows how these two parameters map onto an API call.

Next, I'll add a knowledge base to my agent. For this example, I'll just upload a single file. However, you also have the option to add a SharePoint folder or files, or to connect Bing Search, Microsoft Fabric, Azure AI Search, and more, as required.
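Although Foundry itself is a no-code experience, it can help to see how Temperature and Top P surface when calling a deployed model programmatically. Below is a minimal sketch assuming the Azure.AI.OpenAI SDK (1.0.0-beta series); the endpoint, key, and deployment name are placeholders, not values from this walkthrough.

```csharp
// Minimal sketch: calling an Azure OpenAI deployment with explicit
// sampling controls. Endpoint, key, and deployment name are placeholders.
using Azure;
using Azure.AI.OpenAI;

var client = new OpenAIClient(
    new Uri("https://<your-resource>.openai.azure.com/"),
    new AzureKeyCredential("<api-key>"));

var options = new ChatCompletionsOptions
{
    DeploymentName = "gpt-4o-mini",   // the model deployment created above
    Temperature = 0.2f,               // low = safer, more predictable answers
    NucleusSamplingFactor = 0.9f      // Top P: how wide the word choices are
};
options.Messages.Add(new ChatRequestSystemMessage("You are a concise support agent."));
options.Messages.Add(new ChatRequestUserMessage("Summarize our refund policy."));

Response<ChatCompletions> response = await client.GetChatCompletionsAsync(options);
Console.WriteLine(response.Value.Choices[0].Message.Content);
```

As the guidance above suggests, it's common practice to tune one of the two parameters and leave the other at its default, rather than adjusting both at once.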
A vector store in Azure AI Foundry helps your AI agent retrieve relevant information based on meaning rather than just keywords. It works by breaking your content (like a PDF) into smaller parts, converting them into numerical representations (embeddings), and storing them. When a user asks a question, the AI finds the most semantically similar parts from the vector store and uses them to generate accurate, context-aware responses. (The short sketch at the end of this post illustrates this similarity idea.)

Once you select the file, click Upload and Save. At this point, you can start to interact with your model: to "play around" with it, click the "Try in Playground" button. And here, we can see the output based on our provided knowledge base. One more example, just because it is kind of fun.

Some terminology: every input you provide to the agent is called a "message"; every time the agent is invoked to process the provided input, that's a "run"; and every interaction session with the agent is called a "thread". We can see all the open threads in the Threads section.

To conclude, Azure AI Foundry makes it easy to build and use AI agents without writing any code. You can choose models, set how they behave, and connect your data, all through a simple interface. Whether you're testing ideas, automating tasks, or building custom bots, Foundry gives you the tools to do it. If you're curious about AI or want to try building your own agent, Foundry is a great place to begin.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
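As promised above, here is a small illustration of the "semantic similarity" idea behind the vector store. This is purely didactic: the vectors below are made up and four-dimensional, whereas real embeddings come from an embedding model and have hundreds or thousands of dimensions.

```csharp
// Illustrative only: how a vector store ranks content chunks against a
// question by comparing embedding vectors (here, fake 4-D vectors).
static double CosineSimilarity(double[] a, double[] b)
{
    double dot = 0, normA = 0, normB = 0;
    for (int i = 0; i < a.Length; i++)
    {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return dot / (Math.Sqrt(normA) * Math.Sqrt(normB));
}

double[] question = { 0.9, 0.1, 0.3, 0.0 }; // "How do I reset my password?"
double[] chunkA   = { 0.8, 0.2, 0.4, 0.1 }; // password-reset section of the PDF
double[] chunkB   = { 0.0, 0.9, 0.1, 0.7 }; // unrelated billing section

// The chunk with the higher score is retrieved and handed to the model.
Console.WriteLine(CosineSimilarity(question, chunkA)); // high -> retrieved
Console.WriteLine(CosineSimilarity(question, chunkB)); // low  -> skipped
```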
Struggling with Siloed Systems? Here’s How CloudFronts Gets You Connected
In today's world, we use many different applications for our daily work. No single application can handle everything, because some apps are designed for specific tasks. That's why organizations use multiple applications, which often leads to data being stored separately, in isolation. In this blog, we'll take you on a journey from siloed systems to connected systems through a customer success story.

About BÜCHI

Büchi Labortechnik AG is a Swiss company renowned for providing laboratory and industrial solutions for R&D, quality control, and production. Founded in 1939, Büchi specializes in technologies for sample preparation, formulation, and analysis. Its equipment is widely used in pharmaceuticals, chemicals, food & beverage, and academia, and the company is known for its precision, innovation, and strong customer support worldwide.

Systems Used by BÜCHI

To streamline operations and ensure seamless collaboration, BÜCHI leverages a variety of enterprise systems. Infor and SAP Business One are utilized for managing critical business functions such as finance, supply chain, manufacturing, and inventory.

Reporting Challenges Due to Siloed Systems

Organizations often rely on multiple disconnected systems across departments, such as ERP, CRM, marketing platforms, spreadsheets, and legacy tools. These silos result in fragmented, inconsistent data that makes reliable reporting difficult.

The Need for a Single Source of Truth

To solve these challenges, it's critical to establish a Single Source of Truth (SSOT): a central, trusted data platform where all key business data is consolidated, consistent, and up to date.

How We Helped BÜCHI Connect Their Systems

To build a seamless and scalable integration framework, we leveraged the following Azure services:
- Azure Logic Apps – enabled no-code/low-code automation for integrating applications quickly and efficiently.
- Azure Functions – provided serverless compute for lightweight data transformations and custom logic execution.
- Azure Service Bus – ensured reliable, asynchronous communication between systems, with FIFO message processing and decoupling of sender/receiver availability (a minimal sketch of this messaging pattern follows below).
- Azure API Management (APIM) – secured and simplified access to backend services by exposing only the required APIs, enforcing policies like authentication and rate limiting, and unifying multiple APIs under a single endpoint.

BÜCHI's case study was published on the Microsoft website, highlighting how CloudFronts helped connect their systems and prepare their data for insights and AI-driven solutions.

Why a Single Source of Truth (SSOT) Is Important

A Single Source of Truth means having one trusted location where your business stores consistent, accurate, and up-to-date data, so every team works from the same numbers.

How We Did This

We used Azure Function Apps, Service Bus, and Logic Apps to seamlessly connect the systems. Databricks was implemented to build a Unity Catalog, establishing the Single Source of Truth. On top of this unified data layer, we enabled advanced analytics and reporting using Power BI.

In May, we hosted an event with BÜCHI at the Microsoft office in Zurich. During the session, one of the attending customers remarked, "We are five years behind BÜCHI." Another added, "If we don't start now, we'll be out of the race in the future." This clearly reflects the urgent need for businesses to evolve. Today, connected systems, a Single Source of Truth, advanced analytics, and AI are not optional; they are essential for sustainable growth and improved human efficiency.
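To make the decoupling idea concrete, here is a minimal sketch of sending and receiving a message with the Azure.Messaging.ServiceBus SDK. The queue name, connection string, and payload are illustrative placeholders, not BÜCHI's actual configuration.

```csharp
// Sketch: asynchronous, decoupled messaging between two systems via a queue.
using Azure.Messaging.ServiceBus;

await using var client = new ServiceBusClient("<service-bus-connection-string>");

// Sender side: an integration (e.g., a Logic App or Function) publishes a
// change and moves on - it does not need the receiver to be online.
ServiceBusSender sender = client.CreateSender("erp-sync");
await sender.SendMessageAsync(
    new ServiceBusMessage("{\"orderId\":1042,\"status\":\"Shipped\"}"));

// Receiver side: the downstream system processes messages whenever it is
// available; sessions can be enabled on the queue for strict FIFO per key.
ServiceBusReceiver receiver = client.CreateReceiver("erp-sync");
ServiceBusReceivedMessage message = await receiver.ReceiveMessageAsync();
Console.WriteLine(message.Body.ToString());
await receiver.CompleteMessageAsync(message); // remove from queue after success
```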
The pace of transformation has accelerated: tasks that once took months can now be achieved in days, and soon, perhaps, with just a prompt. To conclude, if you're operating with multiple disconnected systems and relying heavily on manual processes, it's time to rethink your approach. System integration and automation free your teams from repetitive work and empower them to focus on high-impact, strategic activities. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
How to Use Webhooks for Real-Time CRM Integrations in Dynamics 365
Are you looking for a reliable way to integrate Dynamics 365 CRM with external systems in real time? Polling APIs or scheduling batch jobs can delay updates and increase complexity. What if you could instantly notify external services when key events happen inside Dynamics 365? This blog shows you how to use webhooks, an event-driven mechanism that triggers real-time updates and data exchange with external services, making your integrations faster and more efficient.

A webhook is a user-defined HTTP callback that is triggered by specific events. Instead of your external system repeatedly asking Dynamics 365 for updates, Dynamics 365 pushes updates to your service instantly. Dynamics 365 supports registering webhooks through plugin steps that execute when specific messages (create, update, delete, etc.) occur. This approach provides low latency and ensures that your external systems always have fresh data. This walkthrough covers the end-to-end process of configuring webhooks in Dynamics 365, registering them via plugins, and securely triggering external services.

What You Need to Get Started

Step 1: Create Your Webhook Endpoint (a minimal endpoint sketch appears at the end of this post)

Step 2: Register Your Webhook in Dynamics 365

Step 3: Create a Plugin to Trigger the Webhook. In the snippet below, yourWebhookRegistrationId stands for the GUID of your webhook's service endpoint registration:

```csharp
public void Execute(IServiceProvider serviceProvider)
{
    // Retrieve the execution context; the original snippet used "context"
    // without obtaining it from the service provider.
    var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
    var notificationService = (IServiceEndpointNotificationService)serviceProvider.GetService(typeof(IServiceEndpointNotificationService));

    Entity entity = (Entity)context.InputParameters["Target"];

    // Notify expects an EntityReference to the registered service endpoint
    // (your webhook registration), plus the message body to post.
    var webhookEndpoint = new EntityReference("serviceendpoint", yourWebhookRegistrationId);
    notificationService.Notify(webhookEndpoint, entity.ToEntityReference().Id.ToString());
}
```

Register this plugin step for your message (Create, Update, Delete) on the entity you want to monitor.

Step 4: Test Your Webhook Integration

Step 5: Security and Best Practices

Real-World Use Case

A company wants to notify its external billing system immediately when a new invoice is created in Dynamics 365. By registering a webhook triggered by the invoice creation event, the billing system receives data instantly and processes payment without delays or manual intervention.

To conclude, webhooks offer a powerful way to build real-time, event-driven integrations with Dynamics 365, reducing latency and complexity in your integration solutions. We encourage you to start by creating a simple webhook endpoint and registering it with Dynamics 365 using plugins; this can transform how your CRM communicates with external systems. For deeper technical support and advanced integration patterns, explore CloudFronts' resources or get in touch with our experts to accelerate your Dynamics 365 integration project.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
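To accompany Step 1 above, here is a minimal sketch of a receiving endpoint, written as an ASP.NET Core minimal API. The route, header name, and shared secret are illustrative; match them to whatever you configure in the Plugin Registration Tool.

```csharp
// Sketch: a webhook receiver for Dynamics 365. D365 posts the
// RemoteExecutionContext as JSON in the request body.
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapPost("/api/d365-webhook", async (HttpRequest request) =>
{
    // Validate the shared secret sent as a header (HttpHeader auth type).
    if (request.Headers["x-webhook-key"] != "<shared-secret>")
        return Results.Unauthorized();

    using var reader = new StreamReader(request.Body);
    string payload = await reader.ReadToEndAsync();
    Console.WriteLine($"Received webhook payload: {payload}");

    // Return 200 quickly; do heavy processing asynchronously (queue, etc.).
    return Results.Ok();
});

app.Run();
```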
Migrating Data from Azure Files Share to Azure Blob Storage Using C#
For growing businesses, efficient data management is as critical as streamlined processes and actionable reporting. As organizations scale, the volume and complexity of data stored in systems like Azure Files increase, necessitating robust, scalable storage solutions like Azure Blob Storage. Are you struggling to manage your file storage efficiently? If you're looking to automate data migration from an Azure Files share to Azure Blob Storage using C#, this article is for you.

Research shows that 70% of customers value seamless experiences with efficient systems, impacting brand loyalty. Businesses automating data management processes can reduce retrieval times by up to 90%, while organizations leveraging cloud storage solutions like Azure Blob Storage report a 25% increase in operational productivity and 60% improved satisfaction in data workflows. This article provides a structured guide to migrating data using C#, drawing on practical implementation insights to help team leads, CTOs, and CEOs optimize their data storage for scalability and efficiency.

Why Migrate to Azure Blob Storage?

Azure Files offers managed file shares via the Server Message Block (SMB) protocol, suitable for traditional file-system needs. Azure Blob Storage, however, excels in scalability, cost efficiency, and integration with advanced Azure services like Azure Data Lake and AI/ML workloads.

Migrating Data Using C#: A Step-by-Step Approach

To migrate data from an Azure Files share to Azure Blob Storage programmatically, you can leverage C# with the Azure SDKs. Below is a structured approach, referencing a C# implementation that uses a timer-triggered Azure Function to automate the process.

Step 1: Set Up Your Environment

Step 2: Design the Migration Logic. The C# code uses an Azure Function triggered on a schedule (e.g., every 5 seconds) to process files.

Step 3: Execute the Migration

Step 4: Optimize and Automate

Step 5: Validate and Test

A Glimpse of the C# Implementation

The C# code leverages an Azure Function to automate the migration. It connects to the file share, enumerates files, uploads them to a blob container, and deletes them from the source upon successful transfer (a simplified sketch follows at the end of this post). This approach ensures minimal manual intervention and robust error handling, aligning with the needs of growing businesses.

Benefits of Programmatic Migration

Using C# for migration offers automation, repeatability, and fine-grained control over the transfer process.

To conclude, migrating data from an Azure Files share to Azure Blob Storage using C# empowers growing businesses to achieve scalable, cost-efficient, and automated data management. By implementing a structured approach with Azure Functions, you can streamline operations and unlock advanced analytics capabilities. Evaluate your current data management processes and identify one area for improvement, such as automating file transfers with C#. Start today to enhance efficiency and customer satisfaction.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
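As promised above, here is a simplified sketch of the migration loop, using the Azure.Storage.Files.Shares and Azure.Storage.Blobs SDKs. The connection string, share name, and container name are placeholders; the real implementation wraps this logic in a timer-triggered Azure Function with error handling and logging.

```csharp
// Sketch: copy every file from an Azure Files share to a blob container,
// deleting each source file only after a successful upload.
using Azure.Storage.Blobs;
using Azure.Storage.Files.Shares;
using Azure.Storage.Files.Shares.Models;

string connectionString = "<storage-connection-string>";
var share = new ShareClient(connectionString, "source-share");
var container = new BlobContainerClient(connectionString, "target-container");
await container.CreateIfNotExistsAsync();

ShareDirectoryClient rootDir = share.GetRootDirectoryClient();
await foreach (ShareFileItem item in rootDir.GetFilesAndDirectoriesAsync())
{
    if (item.IsDirectory) continue; // flat copy; recurse here for subfolders

    ShareFileClient file = rootDir.GetFileClient(item.Name);
    ShareFileDownloadInfo download = await file.DownloadAsync();

    // Upload to Blob Storage, then delete the source on success.
    await container.GetBlobClient(item.Name).UploadAsync(download.Content, overwrite: true);
    await file.DeleteAsync();
    Console.WriteLine($"Migrated {item.Name}");
}
```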
Top 5 Ways to Integrate Microsoft Dynamics 365 with Other Systems
When it comes to Microsoft Dynamics 365, one of its biggest strengths (and challenges) is how many ways there are to integrate it with other platforms. Whether you're syncing with an ERP, pushing data to a data lake, or triggering notifications in Teams, the real question becomes: which integration method should you choose? In this blog, we'll break down the top five tools used by teams around the world to integrate Dynamics 365 with other systems. Each has its strengths, and each fits a different type of use case.

1. Power Automate – Best for Quick, No-Code Automations

What it is: A low-code platform built into the Power Platform suite.
When to use it: Internal automations, approvals, email notifications, basic integrations.
Lesser-known tip: Power Automate runs on two plans, per user and per flow. If you have dozens of similar flows, the per-flow plan can be more cost-effective than individual licenses.
Advanced feature: You can call Azure Functions or hosted APIs directly within a flow, effectively turning it into a lightweight integration framework.
Example: When a new lead is created in D365, send an email alert and create a task in Outlook.

2. Azure Logic Apps – Best for Scalable Integrations

What it is: A cloud-based workflow engine for system-to-system integrations.
When to use it: Large-scale or backend integrations, especially when working with APIs.
Lesser-known tip: Logic Apps come in two flavours, Consumption and Standard. The Standard tier offers VNet integration, local development, and built-in connectors at a flat rate, which is ideal for predictable, high-throughput scenarios.
Advanced feature: Use a Logic Apps Integration Account to manage schemas, maps, and certificates for B2B scenarios (AS2, X12).
Example: Sync Dynamics 365 opportunities with a SQL database in real time.

3. Data Export Service / Azure Synapse Link – Best for Analytics

What it is: Tools to replicate D365 data into Azure SQL or Azure Data Lake.
When to use it: Advanced reporting, Power BI, historical data analysis.
Lesser-known tip: Data Export Service is being deprecated in favour of Azure Synapse Link, which provides both near-real-time and "materialized view" patterns. You can even write custom analytics in Spark directly against your live CRM data.
Advanced feature: With Synapse Link, you can enable a change data feed (CDC) and query Delta tables in Synapse, unlocking time-travel queries for historical analysis.
Example: Export all account and contact data to Azure Synapse and visualize KPIs in Power BI.

4. Dual-write – Best for D365 F&O Integration

What it is: A Microsoft-native framework to connect D365 CE (Customer Engagement) and D365 F&O (Finance & Operations).
When to use it: Bi-directional, real-time sync between CRM and ERP.
Lesser-known tip: Dual-write leverages the Common Data Service (now Dataverse) pipeline under the covers, so any customization (custom entities, fields) you add to Dataverse automatically flows through to F&O once you map it.
Advanced feature: You can extend dual-write with custom Power Platform flows to handle pre- or post-processing logic before records land in F&O.
Example: Automatically sync customer and invoice records between D365 Sales and Finance.

5. Custom APIs & Webhooks – Best for Complex, Real-Time Needs

What it is: Developer-driven integrations using HTTP APIs or Dynamics 365 webhooks.
When to use it: External systems, fast processing, custom business logic.
Lesser-known tip: Dynamics 365 supports registering multiple webhook subscribers on the same event, so you can chain independent systems (e.g., call your middleware, then a monitoring service) without writing code.
Advanced feature: Combine webhooks with Azure Event Grid for enterprise-grade event routing, retry policies, and dead-lettering.
Example: Trigger an API call to a shipping provider when a case status changes to "Ready to Ship." (A short sketch of this pattern follows below.)

To conclude, Microsoft Dynamics 365 gives you a powerful set of integration tools, each designed for a different type of business need. Whether you need something quick and simple (Power Automate), enterprise-ready (Logic Apps), or real-time and custom (webhooks), there's a solution that fits. Take a moment to evaluate your integration scenario: What systems are involved? How much data are you moving? What's your tolerance for latency and failure? If you're unsure which route to take, or need help designing and implementing your integrations, reach out to our team for a free consultation. Let's make your Dynamics 365 ecosystem work smarter, together.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
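To illustrate option 5, here is a hedged sketch of middleware (an Azure Function or any HTTP service) that receives a D365 webhook and calls a shipping provider's API. The URL, payload shape, and the "Ready to Ship" status value are illustrative placeholders, not a real provider's contract.

```csharp
// Sketch: react to a case status change by calling an external shipping API.
using System.Net.Http.Json;

var http = new HttpClient();

// Pretend these fields were parsed from the webhook's RemoteExecutionContext body.
Guid caseId = Guid.NewGuid();
int statusCode = 5;

const int ReadyToShip = 5; // example option-set value; look up your own
if (statusCode == ReadyToShip)
{
    HttpResponseMessage response = await http.PostAsJsonAsync(
        "https://api.shipping-provider.example/shipments",
        new { caseId, service = "express" });
    response.EnsureSuccessStatusCode(); // let the caller retry on failure
}
```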
Using OpenAI and Logic Apps to Develop a Copilot Agent for Elevator Pitches & Lead Qualification
In today's competitive landscape, the ability to prepare quickly and deliver relevant, high-impact sales conversations is more critical than ever. Sales teams often spend valuable time gathering case studies, reviewing past opportunities, and preparing client-specific messaging, time that could be better spent engaging prospects. To address this, we developed "Smart Pitch", a Microsoft Teams-integrated AI copilot designed to equip our sales professionals with instant, contextual access to case studies, opportunity data, and procedural documentation.

Challenge

Sales professionals routinely face hurdles that not only slow down the sales cycle but also affect the consistency and quality of conversations with prospects.

How It Works

CloudFronts SmartPitch pulls information from knowledge sources across our platform, combining AI integration with our internal data.

Key Features

MQL-SQL Summary Generator

Users can request an MQL-SQL document. The copilot prompts the user to provide the prospect name, contact person name, and client requirement; this is achieved via an adaptive card for better UX. The copilot then sends an HTTP request to a Logic App, where we use the ChatGPT API to fetch company and client information, extracting the company's location and, similarly, its industry. The result is rendered back to the custom copilot via the Logic App response, and a Generative Answers node displays the results with proper formatting driven by the prompt/agent instructions.

Generative AI can also be instructed to directly produce formatted JSON from the parsed values. This JSON is then converted to an actual JSON object and used to populate a Liquid template for the MQL-SQL file, dynamically creating an MQL-SQL document for every searched company and contact person. The flow returns an HTML file populated with company and contact details, similar case studies, and prior work with clients in a similar region and industry, and it triggers an automatic download of the generated MQL-SQL as a PDF file. (For developers, a hedged sketch of the enrichment call appears at the end of this post.)

Content Search

Users can ask questions related to:
1. Case Study FAQ: Helps users ask questions about client success stories and project case studies, retrieves relevant information from a knowledge source, and offers follow-up FAQs before ending the conversation. The official CloudFronts website is used for fetching case study information.
2. Opportunities: Helps users inquire about past projects or opportunities, detailing client names, roles, estimated revenue, and outcomes.
3. SOPs: Provides quick answers and summaries for frequently asked questions related to organizational processes and SOPs.

For these questions, "Smart Pitch" searches SharePoint documents, public case studies, and the opportunity table to return relevant results that are structured and easy to consume.

Security & Governance

Smart Pitch is integrated into Microsoft Teams, so it uses the same authentication as Teams. Access to Dataverse and SharePoint is read-only and scoped to organizational permissions.

To conclude, Smart Pitch reflects our commitment to leveraging AI to drive business outcomes. By combining Microsoft's AI ecosystem with our internal data strategy, we've created a practical and impactful sales assistant that improves productivity, accelerates deal cycles, and enhances client engagement.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
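As noted above, here is a minimal sketch of the kind of enrichment call the Logic App makes to the ChatGPT API. In the actual flow this is an HTTP action inside Logic Apps; the model name and prompts below are illustrative placeholders.

```csharp
// Sketch: fetching company/contact context from the OpenAI Chat Completions API.
using System.Net.Http.Headers;
using System.Net.Http.Json;

var http = new HttpClient();
http.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", "<openai-api-key>");

var request = new
{
    model = "gpt-4o-mini",
    messages = new[]
    {
        new { role = "system", content = "Return the company's industry and headquarters location." },
        new { role = "user", content = "Company: Contoso Ltd. Contact: Jane Doe." }
    }
};

HttpResponseMessage response = await http.PostAsJsonAsync(
    "https://api.openai.com/v1/chat/completions", request);
response.EnsureSuccessStatusCode();

// The raw JSON response; the Logic App parses out location and industry,
// which then feed the Liquid template for the MQL-SQL document.
Console.WriteLine(await response.Content.ReadAsStringAsync());
```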
Copy On-Premises SQL Database to Azure SQL Server Using ADF: A Step-by-Step Guide
Migrating an on-premises SQL database to the cloud can streamline operations and enhance scalability. Azure Data Factory (ADF) is a powerful tool that simplifies this process by enabling seamless data transfer to Azure SQL Server. In this guide, we'll walk you through the steps to copy your on-premises SQL database to Azure SQL Server using ADF, ensuring a smooth and efficient migration.

Prerequisites

Before you begin, ensure you have access to your on-premises SQL Server and an Azure subscription.

Step 1: Create an Azure SQL Server Database. First, set up your target database in Azure.

Step 2: Configure the Azure Firewall. To allow ADF to access your Azure SQL Database, configure the firewall settings.

Step 3: Connect Your On-Premises SQL Database to ADF. Next, use ADF Studio to link your on-premises database.

Step 4: Set Up a Linked Service. A linked service is required to connect ADF to your on-premises SQL database.

Step 5: Install the Integration Runtime for On-Premises Data. Since your data source is on-premises, you need a self-hosted integration runtime.

Step 6: Verify and Test the Connection. Finally, ensure everything is set up correctly. (A small connectivity-check sketch follows below.)

To conclude, migrating your on-premises SQL database to Azure SQL Server using ADF is a straightforward process when broken down into these steps. By setting up the database, configuring the firewall, and establishing the necessary connections, you can ensure a secure and efficient data transfer. With your data now in the cloud, you can leverage Azure's scalability and performance to optimize your workflows. Happy migrating!

Please refer to our City Council case study at https://www.cloudfronts.com/case-studies/city-council/ to learn more about how we used Azure Data Factory and other Azure Integration Services to deliver seamless integration.

We hope you found this blog post helpful! If you have any questions or want to discuss further, please contact us at transform@cloudfronts.com.
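For Step 6, a quick way to confirm the target database is reachable and populated is a small connectivity check with the Microsoft.Data.SqlClient package. The server, database, credentials, and table name below are placeholders.

```csharp
// Sketch: verify connectivity to the target Azure SQL database and count
// rows in one migrated table.
using Microsoft.Data.SqlClient;

var builder = new SqlConnectionStringBuilder
{
    DataSource = "tcp:<your-server>.database.windows.net,1433",
    InitialCatalog = "<your-database>",
    UserID = "<sql-admin>",
    Password = "<password>",
    Encrypt = true // Azure SQL requires encrypted connections
};

await using var connection = new SqlConnection(builder.ConnectionString);
await connection.OpenAsync(); // fails here if the firewall rule is missing

await using var command = new SqlCommand("SELECT COUNT(*) FROM dbo.Customers;", connection);
var rowCount = (int)(await command.ExecuteScalarAsync() ?? 0);
Console.WriteLine($"Connected. dbo.Customers contains {rowCount} rows.");
```

If the connection fails, recheck Step 2 (the server-level firewall rule) and the integration runtime status from Step 5 before rerunning the pipeline.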