
Category Archives: Azure

Migrating Data from Azure Files Share to Azure Blob Storage Using C#

For growing businesses, efficient data management is as critical as streamlined processes and actionable reporting. As organizations scale, the volume and complexity of data stored in systems like Azure Files increase, necessitating robust, scalable storage solutions like Azure Blob Storage. Are you struggling to manage your file storage efficiently? If you're looking to automate data migration from Azure Files to Azure Blob Storage using C#, this article is for you. Research shows that 70% of customers value seamless experiences with efficient systems, impacting brand loyalty. Businesses automating data management processes can reduce retrieval times by up to 90%, while organizations leveraging cloud storage solutions like Azure Blob Storage report a 25% increase in operational productivity and 60% improved satisfaction in data workflows. This article provides a structured guide to migrating data using C#, drawing on practical implementation insights to help Team Leads, CTOs, and CEOs optimize their data storage for scalability and efficiency.

Why Migrate to Azure Blob Storage?

Azure Files offers managed file shares via the Server Message Block (SMB) protocol, suitable for traditional file system needs. However, Azure Blob Storage excels in scalability, cost efficiency, and integration with advanced Azure services such as Azure Data Lake and AI/ML workloads.

Migrating Data Using C#: A Step-by-Step Approach

To migrate data from an Azure file share to Azure Blob Storage programmatically, you can leverage C# with the Azure SDKs. Below is a structured approach, referencing a C# implementation that uses a timer-triggered Azure Function to automate the process.

Step 1: Set Up Your Environment
Step 2: Design the Migration Logic. The C# code uses an Azure Function triggered on a schedule (e.g., every 5 seconds) to process files.
Step 3: Execute the Migration
Step 4: Optimize and Automate
Step 5: Validate and Test

A Glimpse of the C# Implementation

The C# code leverages an Azure Function to automate the migration. It connects to the file share, enumerates files, uploads them to a blob container, and deletes them from the source upon successful transfer; a hedged sketch of this pattern appears at the end of this post. This approach ensures minimal manual intervention and robust error handling, aligning with the needs of growing businesses.

Benefits of Programmatic Migration

Using C# for migration gives you automation, repeatable runs, and programmatic control over error handling.

To conclude, migrating data from an Azure file share to Azure Blob Storage using C# empowers growing businesses to achieve scalable, cost-efficient, and automated data management. By implementing a structured approach with Azure Functions, you can streamline operations and unlock advanced analytics capabilities. Evaluate your current data management processes and identify one area for improvement, such as automating file transfers with C#. Start today to enhance efficiency and customer satisfaction. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
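The full implementation referenced above isn't reproduced in this excerpt. As a rough guide, here is a minimal sketch of the core enumerate-upload-delete loop, assuming the Azure.Storage.Files.Shares and Azure.Storage.Blobs packages; the connection string, share name, and container name are placeholders.

```csharp
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Files.Shares;
using Azure.Storage.Files.Shares.Models;

public static class FileShareMigrator
{
    // Minimal sketch: copy every file in the root of an Azure file share
    // to a blob container, then delete the source file on success.
    // The connection string, share, and container names are placeholders.
    public static async Task MigrateFilesAsync(string connectionString)
    {
        var share = new ShareClient(connectionString, "source-share");
        var container = new BlobContainerClient(connectionString, "target-container");
        await container.CreateIfNotExistsAsync();

        ShareDirectoryClient rootDir = share.GetRootDirectoryClient();
        await foreach (ShareFileItem item in rootDir.GetFilesAndDirectoriesAsync())
        {
            if (item.IsDirectory) continue; // recurse here if nested folders are needed

            ShareFileClient file = rootDir.GetFileClient(item.Name);
            BlobClient blob = container.GetBlobClient(item.Name);

            using var stream = await file.OpenReadAsync();
            await blob.UploadAsync(stream, overwrite: true);

            // Delete from the share only after a successful upload.
            await file.DeleteAsync();
        }
    }
}
```

Deleting the source file only after a successful upload gives the loop at-least-once semantics: a crash mid-run leaves the remaining files on the share, and the next scheduled run simply picks them up again.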



Top 5 Ways to Integrate Microsoft Dynamics 365 with Other Systems 

When it comes to Microsoft Dynamics 365, one of its biggest strengths—and challenges—is how many ways there are to integrate it with other platforms. Whether you're syncing with an ERP, pushing data to a data lake, or triggering notifications in Teams, the real question becomes: which integration method should you choose? In this blog, we'll break down the top 5 tools used by teams around the world to integrate Dynamics 365 with other systems. Each has its strengths, and each fits a different type of use case.

1. Power Automate – Best for Quick, No-Code Automations

What it is: A low-code platform built into the Power Platform suite.
When to use it: Internal automations, approvals, email notifications, basic integrations.
Lesser-Known Tip: Power Automate runs on two plans—per user and per flow. If you have dozens of similar flows, the "per flow" plan can be more cost-effective than individual licenses.
Advanced Feature: You can call Azure Functions or hosted APIs directly within a flow, effectively turning it into a lightweight integration framework.
Example: When a new lead is created in D365, send an email alert and create a task in Outlook.

2. Azure Logic Apps – Best for Scalable Integrations

What it is: A cloud-based workflow engine for system-to-system integrations.
When to use it: Large-scale or backend integrations, especially when working with APIs.
Lesser-Known Tip: Logic Apps come in two flavours—Consumption and Standard. The Standard tier offers VNET integration, local development, and built-in connectors at a flat rate, which is ideal for predictable, high-throughput scenarios.
Advanced Feature: Use a Logic Apps Integration Account to manage schemas, maps, and certificates for B2B scenarios (AS2, X12).
Example: Sync Dynamics 365 opportunities with a SQL database in real time.

3. Data Export Service / Azure Synapse Link – Best for Analytics

What it is: Tools to replicate D365 data into Azure SQL or Azure Data Lake.
When to use it: Advanced reporting, Power BI, historical data analysis.
Lesser-Known Tip: Data Export Service is being deprecated in favour of Azure Synapse Link, which provides both near-real-time and "materialized view" patterns. You can even write custom analytics in Spark directly against your live CRM data.
Advanced Feature: With Synapse Link, you can enable the change data feed (CDC) and query Delta tables in Synapse, unlocking time-travel queries for historical analysis.
Example: Export all account and contact data to Azure Synapse and visualize KPIs in Power BI.

4. Dual-write – Best for D365 F&O Integration

What it is: A Microsoft-native framework to connect D365 CE (Customer Engagement) and D365 F&O (Finance & Operations).
When to use it: Bi-directional, real-time sync between CRM and ERP.
Lesser-Known Tip: Dual-write leverages the Common Data Service (Dataverse) pipeline under the covers—so any customization (custom entities, fields) you add to Dataverse automatically flows through to F&O once you map it.
Advanced Feature: You can extend dual-write with custom Power Platform flows to handle pre- or post-processing logic before records land in F&O.
Example: Automatically sync customer and invoice records between D365 Sales and Finance.

5. Custom APIs & Webhooks – Best for Complex, Real-Time Needs

What it is: Developer-driven integrations using HTTP APIs or Dynamics 365 webhooks.
When to use it: External systems, fast processing, custom business logic.
Lesser-Known Tip: Dynamics 365 supports registering multiple webhook subscribers on the same event. You can chain independent systems (e.g., call your middleware, then a monitoring service) without writing code.
Advanced Feature: Combine webhooks with Azure Event Grid for enterprise-grade event routing, retry policies, and dead-lettering.
Example: Trigger an API call to a shipping provider when a case status changes to "Ready to Ship." (A minimal receiver sketch appears at the end of this post.)

To conclude, Microsoft Dynamics 365 gives you a powerful set of integration tools, each designed for a different type of business need. Whether you need something quick and simple (Power Automate), enterprise-ready (Logic Apps), or real-time and custom (Webhooks), there's a solution that fits. Take a moment to evaluate your integration scenario. What systems are involved? How much data are you moving? What's your tolerance for latency and failure? If you're unsure which route to take, or need help designing and implementing your integrations, reach out to our team for a free consultation. Let's make your Dynamics 365 ecosystem work smarter—together. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
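As a supplement to option 5, here is a minimal sketch of a webhook receiver, written as an HTTP-triggered Azure Function (in-process model). Dynamics 365 posts the serialized RemoteExecutionContext as JSON; the properties read below reflect that contract, while the shipping-provider call is left as a placeholder.

```csharp
using System.IO;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class D365WebhookReceiver
{
    // Receives the RemoteExecutionContext that Dynamics 365 posts
    // to a registered webhook endpoint when the event fires.
    [FunctionName("D365WebhookReceiver")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        string body = await new StreamReader(req.Body).ReadToEndAsync();
        using var doc = JsonDocument.Parse(body);

        string messageName = doc.RootElement.GetProperty("MessageName").GetString();
        string entityName = doc.RootElement.GetProperty("PrimaryEntityName").GetString();
        log.LogInformation("Received {Message} on {Entity}", messageName, entityName);

        // Placeholder: call the shipping provider's API here when the
        // case status indicates "Ready to Ship".

        return new OkResult();
    }
}
```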


Using Open AI and Logic Apps to develop a Copilot agent for Elevator Pitches & Lead Qualification

In today's competitive landscape, the ability to prepare quickly and deliver relevant, high-impact sales conversations is more critical than ever. Sales teams often spend valuable time gathering case studies, reviewing past opportunities, and preparing client-specific messaging — time that could be better spent engaging prospects. To address this, we developed "Smart Pitch" — a Microsoft Teams-integrated AI Copilot designed to equip our sales professionals with instant, contextual access to case studies, opportunity data, and procedural documentation.

Challenge

Sales professionals routinely face hurdles that not only slow down the sales cycle but also affect the consistency and quality of conversations with prospects.

How It Works

CloudFronts SmartPitch pulls information from knowledge sources such as SharePoint documents, public case studies on the CloudFronts website, and the opportunity table in Dataverse.

Key Features

MQL-SQL Summary Generator

Users can request an MQL-SQL summary document. The copilot prompts the user to provide the prospect name, contact person name, and client requirement; this is achieved via an adaptive card for better UX. The copilot then makes an HTTP request to a Logic App, where we used the ChatGPT API to fetch company and client information, extract the company location from the company information and, similarly, extract the industry. The result is rendered back to the custom copilot via the Logic App's response. A Generative Answers node displays the results with proper formatting driven by the prompt/agent instructions; the generative AI can also be instructed to directly create formatted JSON from the parsed values. This formatted JSON is then converted into an actual JSON object and used to populate a Liquid template for the MQL-SQL file, dynamically creating an MQL-SQL for every searched company and contact person. This returns an HTML file populated with the company and contact details, similar case studies, and past work with clients in a similar region and industry, and triggers an automatic download of the generated MQL-SQL as a PDF file on your system. (A hedged sketch of the ChatGPT call appears at the end of this post.)

Content Search

Users can ask questions related to:
1. Case Study FAQ: Helps users ask questions about client success stories and project case studies, retrieves relevant information from a knowledge source, and offers follow-up FAQs before ending the conversation. The CloudFronts official website is used for fetching case study information.
2. Opportunities: Helps users inquire about past projects or opportunities, detailing client names, roles, estimated revenue, and outcomes.
3. SOPs: Provides quick answers and summaries for frequently asked questions related to organizational processes and SOPs.

"Smart Pitch" searches SharePoint documents, public case studies, and the opportunity table to return relevant results — structured and easy to consume.

Security & Governance

Because the agent is integrated into Microsoft Teams, it uses the same authentication as Teams. Access to Dataverse and SharePoint is read-only and scoped to organizational permissions.

To conclude, Smart Pitch reflects our commitment to leveraging AI to drive business outcomes. By combining Microsoft's AI ecosystem with our internal data strategy, we've created a practical and impactful sales assistant that improves productivity, accelerates deal cycles, and enhances client engagement. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
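The Logic App itself is low-code, but for illustration, here is a hedged C# sketch of the same kind of ChatGPT API call it makes to fetch company information. The model name, prompt wording, and API-key handling are assumptions rather than details from the original post.

```csharp
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

public static class CompanyInfoFetcher
{
    // Rough equivalent of the Logic App's HTTP action: ask the OpenAI
    // chat completions API for a short company profile. Model name,
    // prompt, and API-key handling are illustrative only.
    public static async Task<string> FetchCompanyInfoAsync(string companyName, string apiKey)
    {
        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", apiKey);

        var payload = new
        {
            model = "gpt-4o-mini", // assumption: any chat-capable model works
            messages = new[]
            {
                new { role = "user", content =
                    $"Give a brief profile of {companyName}, including its location and industry." }
            }
        };

        var response = await http.PostAsync(
            "https://api.openai.com/v1/chat/completions",
            new StringContent(JsonSerializer.Serialize(payload), Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();

        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        return doc.RootElement.GetProperty("choices")[0]
            .GetProperty("message").GetProperty("content").GetString();
    }
}
```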


Copy On-Premises SQL Database to Azure SQL Server Using ADF: A Step-by-Step Guide

Posted On June 12, 2025 by Deepak Chauhan

Migrating an on-premises SQL database to the cloud can streamline operations and enhance scalability. Azure Data Factory (ADF) is a powerful tool that simplifies this process by enabling seamless data transfer to Azure SQL Server. In this guide, we'll walk you through the steps to copy your on-premises SQL database to Azure SQL Server using ADF, ensuring a smooth and efficient migration.

Prerequisites

Before you begin, ensure you have an active Azure subscription, an on-premises SQL Server database, and access to Azure Data Factory (ADF Studio).

Step 1: Create an Azure SQL Server Database. First, set up your target database in Azure.
Step 2: Configure the Azure Firewall. To allow ADF to access your Azure SQL Database, configure the firewall settings.
Step 3: Connect Your On-Premises SQL Database to ADF. Next, use ADF Studio to link your on-premises database.
Step 4: Set Up a Linked Service. A Linked Service is required to connect ADF to your on-premises SQL database.
Step 5: Install the Integration Runtime for On-Premises Data. Since your data source is on-premises, you need a self-hosted Integration Runtime.
Step 6: Verify and Test the Connection. Finally, ensure everything is set up correctly. (A quick connectivity-check snippet appears at the end of this post.)

To conclude, migrating your on-premises SQL database to Azure SQL Server using ADF is a straightforward process when broken down into these steps. By setting up the database, configuring the firewall, and establishing the necessary connections, you can ensure a secure and efficient data transfer. With your data now in the cloud, you can leverage Azure's scalability and performance to optimize your workflows. Happy migrating! Please refer to our City Council case study https://www.cloudfronts.com/case-studies/city-council/ to learn more about how we used Azure Data Factory and other Azure Integration Services (AIS) to deliver seamless integration. We hope you found this blog post helpful! If you have any questions or want to discuss further, please contact us at transform@cloudfronts.com.
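The portal's "Test connection" button on the linked service covers step 6; if you also want a quick check from code that the Azure SQL target is reachable, a short snippet like the following works. It assumes the Microsoft.Data.SqlClient package; the server, database, and credentials are placeholders.

```csharp
using System;
using Microsoft.Data.SqlClient;

// Quick sanity check that the target Azure SQL database is reachable
// before running the ADF copy pipeline. Server, database, and
// credentials below are placeholders.
var connectionString =
    "Server=tcp:yourserver.database.windows.net,1433;" +
    "Database=yourdb;User ID=youruser;Password=yourpassword;Encrypt=True;";

using var conn = new SqlConnection(connectionString);
conn.Open();

using var cmd = new SqlCommand("SELECT @@VERSION;", conn);
Console.WriteLine(cmd.ExecuteScalar()); // prints the SQL engine version on success
```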


Error Handling in Azure Data Factory (ADF): Part 1

Posted On June 10, 2025 by Deepak Chauhan

Azure Data Factory (ADF) is a powerful ETL tool, but when it comes to error handling, things can get tricky—especially when you're dealing with parallel executions or want to notify someone on failure. In this two-part blog series, we'll walk through how to build intelligent error handling into your ADF pipelines. This post—Part 1—focuses on the planning phase: understanding ADF's behavior, the common pitfalls, and how to set your pipelines up for reliable error detection and notification. In Part 2, we'll implement everything you've planned using ADF's control flow features.

Part 1: Planning for Failures

Step 1: Understand ADF Dependency Behavior. In ADF, activities can be connected via dependency conditions such as Succeeded, Failed, Skipped, and Completed. When multiple dependency conditions are attached between the same pair of activities, ADF uses an OR condition. However, if you have parallel branches, ADF uses an AND condition for the following activity—meaning the next activity runs only if all parallel branches succeed.

Step 2: Identify the Wrong Approach. Many developers attempt to add a "failure email" activity after each pipeline activity, assuming it will trigger if any activity fails. This doesn't work as expected.

Step 3: Design with a Centralized Failure Handler in Mind. So, what's the right approach? Plan your pipeline in a way that allows you to handle any failure from a centralized point—a dedicated failure handler.

Step 4: Plan Your Notification Strategy. Error detection is one half of the equation; the other half is communication. Start thinking about Logic Apps, Webhooks, or Azure Functions that you can plug in later to send customized notifications (a hypothetical sketch follows at the end of this post). We'll cover the "how" in the next blog, but the "what" needs to be defined now.

To conclude, planning for failure isn't pessimism—it's smart architecture. By understanding ADF's behavior and avoiding common mistakes with parallel executions, you can build pipelines that fail gracefully, alert intelligently, and recover faster. In Part 2, we'll take this plan and show you how to implement it step-by-step using ADF's built-in tools. Please refer to our case study https://www.cloudfronts.com/case-studies/city-council/ to learn more about how we used Azure Data Factory and other Azure Integration Services (AIS) to deliver seamless integration. We hope you found this blog post helpful! If you have any questions or want to discuss further, please contact us at transform@cloudfronts.com.
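Part 2 will cover the implementation in detail; as a placeholder for your planning, here is a small, hypothetical sketch of the kind of notification hook the post suggests: an HTTP-triggered Azure Function that a centralized failure handler could call from a Web activity. The payload fields (pipelineName, runId, errorMessage) and the Teams webhook URL are assumptions, not part of the original post.

```csharp
using System.IO;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class PipelineFailureNotifier
{
    private static readonly HttpClient Http = new HttpClient();

    // Hypothetical payload posted by an ADF Web activity from the
    // centralized failure handler: pipeline name, run ID, error text.
    [FunctionName("NotifyPipelineFailure")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        using var doc = JsonDocument.Parse(await new StreamReader(req.Body).ReadToEndAsync());
        var root = doc.RootElement;

        string text = $"ADF pipeline '{root.GetProperty("pipelineName").GetString()}' " +
                      $"(run {root.GetProperty("runId").GetString()}) failed: " +
                      root.GetProperty("errorMessage").GetString();

        // Forward to a Teams incoming webhook (URL is a placeholder).
        var card = JsonSerializer.Serialize(new { text });
        await Http.PostAsync("https://example.webhook.office.com/your-webhook",
            new StringContent(card, Encoding.UTF8, "application/json"));

        return new OkResult();
    }
}
```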


Automating File Transfers from Azure File Share to Blob Storage with a Function App

Efficient file management is essential for businesses leveraging Azure cloud storage. Automating file transfers between Azure File Share and Azure Blob Storage enhances scalability, reduces manual intervention, and ensures data availability. This blog provides a step-by-step guide to setting up an Azure Timer Trigger Function App to automate the transfer process.

Why Automate File Transfers?

Steps to Implement the Solution

1. Prerequisites. To follow this guide, ensure you have an Azure subscription, a storage account containing the source file share and target blob container, and a local Azure Functions development environment.
2. Create a Timer Trigger Function App.
3. Install Required Packages. For C#: the Azure.Storage.Files.Shares and Azure.Storage.Blobs NuGet packages. For Python: the azure-storage-file-share and azure-storage-blob packages.
4. Implement the File Transfer Logic (C# implementation; a hedged sketch appears at the end of this post).
5. Deploy and Monitor the Function.

To conclude, automating file transfers from Azure File Share to Blob Storage using a Timer Trigger Function streamlines operations and enhances reliability. Implementing this solution optimizes file management, improves cost efficiency, and ensures compliance with best practices. Begin automating your file transfers today! Need expert assistance? Reach out for tailored Azure solutions to enhance your workflow. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
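The post's original C# listing isn't reproduced in this excerpt; below is a minimal sketch of the timer-triggered entry point, assuming the in-process Azure Functions model, the packages listed in step 3, and placeholder share, container, and app-setting names. The 5-minute CRON schedule is likewise just an example.

```csharp
using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Files.Shares;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class FileShareToBlobFunction
{
    // Runs every 5 minutes (CRON: second minute hour day month day-of-week).
    [FunctionName("FileShareToBlobTransfer")]
    public static async Task Run(
        [TimerTrigger("0 */5 * * * *")] TimerInfo timer, ILogger log)
    {
        // Placeholder app setting holding the storage connection string.
        string connectionString =
            Environment.GetEnvironmentVariable("StorageConnectionString");

        var share = new ShareClient(connectionString, "source-share");
        var container = new BlobContainerClient(connectionString, "target-container");
        await container.CreateIfNotExistsAsync();

        // Same copy loop sketched under the migration article earlier
        // on this page: enumerate, upload, then delete on success.
        var rootDir = share.GetRootDirectoryClient();
        await foreach (var item in rootDir.GetFilesAndDirectoriesAsync())
        {
            if (item.IsDirectory) continue;

            var file = rootDir.GetFileClient(item.Name);
            var blob = container.GetBlobClient(item.Name);

            using var stream = await file.OpenReadAsync();
            await blob.UploadAsync(stream, overwrite: true);
            await file.DeleteAsync();

            log.LogInformation("Transferred {Name}", item.Name);
        }
    }
}
```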


From Commit to Inbox: Automating Change Summaries with Azure AI

In our small development team, we usually merge code without formal pull requests. Instead, changes are committed directly by the developer responsible for the project, and while I don't need to approve every change in my role as the senior developer, I still need to stay aware of what's being merged. Manually reviewing each commit was becoming too time-consuming, so I built an automated process using Power Automate, Azure DevOps, and Azure AI. Now, whenever a commit is made, it triggers a workflow that summarizes the changes and sends me an email. This simple system keeps me informed without slowing down the team's work. Although I kept the automation straightforward, it could easily be extended further. For example, it could be improved to allow me to reply directly to the committer from the email, or even display file changes in detail using a text comparison feature in Outlook. We didn't need that level of detail, but it's a good option if deeper insights are ever required.

Journey

We start with the Azure DevOps trigger "When a code is pushed", where we specify the organization name, project name, and repository name. We can also specify a particular branch if we want to limit tracking to just that branch; otherwise, it tracks all branches available to the user. Then we have a for-each loop that iterates over the "Ref Updates" object array, which contains a list of all the changes but not the exact details. This action pops up automatically when we configure the next action. Then we set up an "Azure DevOps REST API request to invoke" action; it connects to Azure DevOps directly, so it is preferable to a generic REST API action. We specify the relative URL as {Repository Name}/_apis/git/repositories/{Repository ID}/commits/{Commit ID}/changes?api-version=6.0. The Commit ID shows up as newObjectId in the "When a code is pushed" trigger. (A C# equivalent of this REST call appears at the end of this post.) Then we pass the output of this action to a "Create text with GPT using a prompt" action under the AI Builder group; it took several trials and errors to get the prompt to produce exactly what I wanted. The last action is a simple "Send an email" one, where I've kept myself as the recipient and added a subject and a body. Putting it all together and running it produces the final output: an email summarizing the commit. When the hyperlinks in the email are clicked, they take me straight to Azure DevOps, pointing to the file referred to — for instance, the Events Codeunit.

Conclusion

Summarizing commit changes is just one way automation can make life easier. This same idea can be applied to other tasks, like summarizing meeting notes, project updates, or customer feedback. With a bit of creativity, we can use tools like this to cut down on repetitive work and free up time to focus on learning new skills or tackling more challenging projects. By finding smart ways to streamline our workflows, we can work more efficiently and open up more time for growth and development. If you need further assistance or have specific questions about your ERP setup, feel free to reach out for personalized guidance. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
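For reference, the flow's "Azure DevOps REST API request to invoke" action corresponds to a plain GET against the Git commits/changes endpoint. Here is a hedged C# equivalent, assuming a personal access token and placeholder organization, project, and repository values.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class CommitChangesClient
{
    // C# equivalent of the flow's REST action: list the file changes
    // for a single commit. Organization, project, repository ID,
    // commit ID, and the PAT are placeholders.
    public static async Task<string> GetCommitChangesAsync(
        string organization, string project, string repositoryId,
        string commitId, string personalAccessToken)
    {
        using var http = new HttpClient();
        // Azure DevOps accepts a PAT as the password of a basic-auth pair.
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(
                Encoding.ASCII.GetBytes($":{personalAccessToken}")));

        var url = $"https://dev.azure.com/{organization}/{project}/_apis/git/" +
                  $"repositories/{repositoryId}/commits/{commitId}/changes?api-version=6.0";

        var response = await http.GetAsync(url);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync(); // JSON list of changed files
    }
}
```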


How to Recover Azure Function App Code

Azure Function Apps are a powerful tool for creating serverless applications, but losing the underlying code can be a stressful experience. Whether due to a missing backup, accidental deletion, or unclear deployment pipelines, the need to recover code becomes critical. Thankfully, even without backups, there are ways to retrieve and reconstruct your Azure Function App code using the right tools and techniques. In this blog, we'll guide you through a step-by-step process to recover your code, explore the use of decompilation tools, and share preventive tips to help you avoid similar challenges in the future.

Step 1: Understand Your Function App Configuration

Step 2: Retrieve the DLL File. To recover your code, you need access to the compiled assembly file (DLL). From Kudu (Advanced Tools), navigate to the site/wwwroot/bin directory where the YourFunctionApp.dll file resides and download it. (A scripted alternative appears at the end of this post.)

Step 3: Decompile the DLL File. Once you have the DLL file, use a .NET decompiler to extract the source code by opening the .dll file in the decompiler and exporting the decompiled sources. The decompiler I have used here is dotPeek, which is a free .NET decompiler.

To conclude, recovering a Function App without backups might seem daunting, but by understanding its configuration, retrieving the compiled DLL, and using decompilation tools, you can successfully reconstruct your code. To prevent such situations in the future, enable source control by integrating your Function App with GitHub or Azure DevOps, or configure backups. Please refer to our customer success story Customer Success Story – BUCHI | CloudFronts to learn more about how we used a Function App and other Azure Integration Services (AIS) to deliver seamless integration. We hope you found this blog post helpful! If you have any questions or want to discuss further, please contact us at transform@cloudfronts.com.
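If you prefer scripting step 2 instead of clicking through Kudu, the same file can be fetched over the Kudu VFS API. Below is a small sketch, assuming the app's deployment credentials from its publish profile; the app name and DLL path are placeholders.

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class KuduDllDownloader
{
    // Download the compiled assembly from the function app's file system
    // via the Kudu VFS API. The app name, file path, and the deployment
    // user/password (from the publish profile) are placeholders.
    public static async Task DownloadFunctionDllAsync(
        string appName, string deployUser, string deployPassword)
    {
        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(
                Encoding.ASCII.GetBytes($"{deployUser}:{deployPassword}")));

        var url = $"https://{appName}.scm.azurewebsites.net" +
                  "/api/vfs/site/wwwroot/bin/YourFunctionApp.dll";

        var bytes = await http.GetByteArrayAsync(url);
        await File.WriteAllBytesAsync("YourFunctionApp.dll", bytes);
    }
}
```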


Real-Time Monitoring with Azure Live Metrics

In modern cloud-based applications, real-time monitoring is crucial for detecting performance bottlenecks, identifying failures, and maintaining application health. Azure Live Metrics is a powerful feature of Application Insights that allows developers and operations teams to monitor application telemetry with minimal latency. Unlike traditional logging and telemetry solutions that rely on post-processing, Live Metrics enables real-time diagnostics, reducing the time to identify and resolve issues.

What is Azure Live Metrics?

Azure Live Metrics is a real-time monitoring tool within Azure Application Insights. It provides instant visibility into application performance without the overhead of traditional logging.

Benefits of Azure Live Metrics

1. Instant Issue Detection: With real-time telemetry, developers can detect failed requests, exceptions, and performance issues instantly rather than waiting for logs to be processed.
2. Optimized Performance: Traditional logging solutions can slow down applications by writing large amounts of telemetry data. Live Metrics minimizes overhead by using adaptive sampling and streaming only essential data.
3. Customizable Dashboards: Developers can filter and customize Live Metrics dashboards to track specific KPIs, making it easier to diagnose performance trends and anomalies.
4. No Data Persistence Overhead: Unlike standard telemetry logging, Live Metrics does not require data to be persisted in storage, reducing storage costs and improving performance.

How to Enable Azure Live Metrics

To use Azure Live Metrics in your application, follow these steps:

Step 1: Install the Application Insights SDK. For .NET applications, install the required NuGet package (for ASP.NET Core, Microsoft.ApplicationInsights.AspNetCore); for Java applications, include the Application Insights Java agent.
Step 2: Enable the Live Metrics Stream. In your Application Insights resource, navigate to Live Metrics Stream and ensure it is enabled.
Step 3: Configure Application Insights. Modify your appsettings.json (for .NET) to include the Application Insights connection string; for Azure Functions, set APPLICATIONINSIGHTS_CONNECTION_STRING in Application Settings. (A minimal setup sketch appears at the end of this post.)
Step 4: Start Monitoring in the Azure Portal. Go to the Application Insights resource in the Azure Portal, navigate to Live Metrics, and start observing real-time telemetry from your application.

Key Metrics to Monitor

Best Practices for Using Live Metrics

To conclude, Azure Live Metrics is an essential tool for real-time application monitoring, providing instant insights into application health, failures, and performance. By leveraging Live Metrics in Application Insights, developers can reduce troubleshooting time and improve system reliability. If you're managing an Azure-based application, enabling Live Metrics can significantly enhance your monitoring capabilities. Ready to implement Live Metrics? Start monitoring your Azure application today and gain real-time visibility into its performance! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
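As a minimal sketch of steps 1 and 3 for an ASP.NET Core app: install the Microsoft.ApplicationInsights.AspNetCore package and register telemetry at startup. The connection string is read from configuration, and the sample endpoint is illustrative.

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

// Minimal ASP.NET Core setup. AddApplicationInsightsTelemetry reads the
// connection string from configuration (e.g. ApplicationInsights:
// ConnectionString in appsettings.json, or the
// APPLICATIONINSIGHTS_CONNECTION_STRING environment variable) and
// enables the Live Metrics (QuickPulse) module by default.
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddApplicationInsightsTelemetry();

var app = builder.Build();
app.MapGet("/", () => "Hello from a Live Metrics-enabled app!");
app.Run();
```

Once the app is running and receiving traffic, the Live Metrics blade in the portal should light up within seconds, since QuickPulse streams telemetry directly rather than waiting for ingestion.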

