From Legacy Middleware Debt to AI Innovation: Rebuilding the Digital Backbone of a 150-Year-Old Manufacturer
Why This Story Matters to Today’s CIOs and CTOs

For many large manufacturing organizations, integration platforms quietly become one of the most expensive and least visible parts of the IT landscape. Licensing renewals happen in the background, operational risks remain hidden, and innovation conversations get delayed because the digital backbone is simply not ready. This is the story of a 150-year-old global manufacturer that reached that exact inflection point, and of how rethinking integration architecture helped them reduce costs dramatically while laying the foundation for AI-driven decision-making.

The Breaking Point: When Middleware Became a Business Risk

The manufacturer had relied on traditional middleware platforms for years to connect Dynamics 365 Field Service, Finance & Operations, Sales, Shopify, and SQL-based systems. Over time, the middleware layer grew complex, opaque, and expensive. The wake-up call came during a contract renewal discussion.

a. Middleware licensing had increased from $20,000 to $50,000 per year.
b. A mandatory three-year commitment pushed the proposal to $160,000.
c. Despite the cost, the platform still behaved like a black box: failures were hard to trace, and teams often learned about issues only after business users raised concerns.

For leadership, this was no longer just an IT problem. It was a structural constraint on scalability, transparency, and future AI initiatives.

CloudFronts’ Perspective: Cost Is a Symptom, Not the Root Cause

When CloudFronts assessed the environment, the issue was clear: the organization was paying enterprise-level licensing fees for integration workloads that ran only a handful of times per day. From an architectural standpoint, this created two forms of debt:

1. Financial debt – High fixed costs with limited flexibility
2.
Technical debt – Opaque integrations with no real-time visibility or standardized transformation logic

Our recommendation was not a like-for-like migration, but a fundamental shift to a cloud-native, consumption-based model using Azure Integration Services (AIS).

Rebuilding the Backbone with Azure Integration Services

The new architecture replaced legacy middleware and Scribe with:

1. Azure Logic Apps for orchestration
2. Azure Functions for transformation and reusable logic
3. Azure Blob Storage for configuration, templates, and checkpoints

Designed for Global Complexity

The manufacturer operates across multiple legal entities and regions:

a. United States (TOUS)
b. United Kingdom (TOUK)
c. India (TOIN)
d. China (TOCN)

Each entity has unique account number formats, compliance rules, and data behaviors. The solution introduced branching logic and region-specific mappings while maintaining a single, governed integration framework.

Eliminating the Black Box: Visibility by Design

One of the most impactful changes was not technical; it was operational. Legacy middleware offered limited insight into what was running, failing, or slowing down. CloudFronts replaced this with first-class monitoring and observability.

What Changed

a. A Power BI dashboard built on Azure Log Analytics provides real-time visibility into integration health
b. Automated alerts notify teams within one hour of failures
c. Integration teams can now proactively resolve issues before they impact order-to-cash or service operations

This shift alone reduced firefighting and restored confidence in the integration layer.

From Cost Optimization to AI Readiness

While the immediate outcome was cost reduction, the strategic impact went far beyond savings. By standardizing transformations and ensuring clean, reliable data flows, the organization created the foundation required for:

a. Databricks-based analytics
b. Unity Catalog for governance and lineage
c.
Future Generative AI use cases across operations

For example, leadership can now envision scenarios where users ask:

“Is raw material available for this production order?”
“Which service orders are likely to breach SLA next week?”

These are not AI experiments; they depend entirely on trusted, unified data. As an early validation step, 32 fragmented reports were consolidated into a governed catalog, proving the readiness of the new backbone.

The Integration Framework Behind the Scenes

The solution follows a modular, scalable framework:

a. Liquid templates (JSON-to-JSON) decouple transformations from orchestration
b. Templates are stored in Azure Blob Storage, allowing updates without redeploying Logic Apps
c. Incremental synchronization ensures only changed data is processed every five minutes

This approach balances performance, maintainability, and governance, which is critical for long-term sustainability.

Results That Matter to Leadership

Business and Technology Outcomes

a. Annual integration cost reduced by ~95%
b. Spend dropped from $50,000 to approximately $2,500–$4,000 per year
c. Estimated annual savings: ~$140,000
d. Systems connected: D365 Field Service, Sales, Finance & Operations, Shopify, SQL Server
e. Scalability: Designed to modernize over 600 legacy reports

More importantly, integration is no longer a blocker; it is an enabler.

A Practical Playbook for CIOs Facing Similar Challenges

1. Start with transparency: if you can’t see failures, you can’t fix them
2. Challenge fixed-cost licensing models for low-frequency workloads
3. Standardize transformations before investing in AI platforms
4. Treat integration as a product, not plumbing

To conclude, for this 150-year-old manufacturer, modernization was not about replacing tools; it was about reclaiming control of their digital backbone. By moving away from legacy middleware and embracing Azure Integration Services, they reduced cost, eliminated blind spots, and unlocked a clear path toward AI-driven operations.
At CloudFronts, we see this pattern repeatedly. The organizations that succeed with AI are not the ones experimenting first but the ones fixing their foundations first. Read full story here: A practical case study on modernizing legacy integration. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Create records in Dynamics CRM using Microsoft Excel Online
Importing customer data into Dynamics 365 doesn’t have to be complicated. Whether you’re migrating from another system or onboarding a large volume of new customers, Microsoft Excel Online provides a quick, user-friendly, and efficient way to create multiple records at once, without any technical setup. In this blog, I’ll walk you through a simple step-by-step process to import customer (or any entity) records directly into your Dynamics 365 environment using Excel Online, ensuring clean, fast, and accurate data entry.

Step 1: Go to the home page of the entity whose records you want to create (in my case, it is the Customer entity).

Step 2: On the Active Accounts view (or any view), click Edit columns and add the columns for the data you want to fill in. (Don’t forget to hit the Apply button at the bottom.)

Step 3: Once your view is ready, click the Export to Excel button on the top left and select Open in Excel Online.

Step 4: If you are using a system view, as in this example, you will see existing records in Excel Online. You can clear those records or keep them as is. If you change any existing record, that record’s data will be updated, so you can also use this approach to update existing records in bulk. (I will write a separate blog post on updating records; for now, let’s focus on creating records.)

Step 5: Add the data you want to create to the online sheet. In this example, I am transferring data from a local Excel sheet to Excel Online.

Step 6: Once you have added your data in Excel Online, hit the Apply button.

Step 7: You will get a popup about your data being submitted for import; hit Track Progress.

Step 8: You will see that your data has been submitted and is parsing. (This can take anywhere from a couple of minutes to hours depending on the amount of data submitted; keep refreshing to see the progress of the records.)

Step 9: Once the import job is completed, you will see how many records were created successfully and how many failed or partially failed. You can open the import job, check the failed entries, correct them, and re-import the failed records. All successfully parsed records will be created in your system.

Importing customer records in Dynamics 365 becomes incredibly seamless with Excel Online. With just a few steps (preparing your view, exporting to Excel, adding your data, and submitting the import) you can create hundreds or even thousands of records in a fraction of the time. This approach not only speeds up data entry but also ensures consistency and reduces manual errors. Hope this helps! 😊 I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Update Any Number of Entity Records in Dynamics CRM Using Microsoft Excel Online
There are many ways to update multiple records of a Dynamics CRM entity. In this blog, let’s look at one of the easiest and fastest: using Excel Online. Consider an example: say you have a fixed number of account records and you want to manually update the account number on each.

Step 1: Go to the home page of the entity whose records you want to update.

Step 2: On the All Accounts view (or any view), click Edit columns and add the columns you want to update; in my case it is Account Number.

Step 3: Once your view is ready, click the Export to Excel button on the top left and select Open in Excel Online.

Step 4: This will open all your accounts in an Excel sheet in a pop-up window.

Step 5: Now you just need to update the columns you want to change and hit Save (I am adding all the account numbers).

Step 6: You will get a popup about your data being submitted for import; hit Track Progress.

Step 7: You will see that your data has been submitted for updating and is parsing. (This can take anywhere from a couple of minutes to hours depending on the amount of data submitted; keep refreshing to see the progress of the records.)

Step 8: Once the import job is completed, you will see how many records were updated successfully and how many failed or partially failed. You can open the import job, check the failed entries, correct them, and re-import. (All my records were successfully updated.) Failed records (sample from some older imports). All successfully parsed records will be updated in your system.

Before Update:

After Update:

Hope this helps! 😊 I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Submit Attachments Over 1GB Through MS Forms
One limitation when working with MS Forms is the 1 GB cap on files submitted through a form. Many of you must be using Forms to collect files from users or clients outside your organization, and those files can be over 1 GB. In this blog, I will show you how to let users submit files over 1 GB through MS Forms and store the response in a SharePoint list. So let’s begin.

Approach: MS Forms stores all uploaded files in your OneDrive. OneDrive also offers a feature called “Request files”, which lets you create a shareable link to a OneDrive folder; anyone with the link can upload files to it, with no limit on file size. So instead of using the Forms file-upload feature, we will place a shareable link from the Request files feature on the form, through which users can submit documents of any size. Let’s see how to do this.

1. Create a shareable link to a OneDrive folder using the Request files feature.
2. Copy this link and save it; we will use it in our MS Form.
3. Create the MS Form. You can add the link wherever you want on the form; you can also add it in a section subtitle. (Both of these are just examples of how you can show users this link.)
4. Submitted attachments are stored in OneDrive.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Post Microsoft Form submissions response in Teams Channel
Teams is one of the best ways to notify users about a form submission. In this blog, let’s see how we can post new Microsoft Forms responses to a Teams channel.

Step 1: Go to https://make.powerautomate.com/ -> click Environments at the top left and select the environment you want to create your flow in; if you don’t have any environments, you can select the default environment.

Step 2: Click My flows -> New flow and select Automated cloud flow.

Step 3: Name your flow and search for the “When a new response is submitted” trigger.

Step 4: Select the form for which you want to send the notification.

Step 5: Click New step and search for Forms; under Actions, select “Get response details”.

Step 6: Reselect the same form in the first column of the Get response details action, and in the second column add the Response Id coming from the first step; you will get it through the dynamic content just by clicking on the column.

Step 7: Now add a new step and search for the Send an email (V2) action. (We are using this action so that we can compose our post content in Rich Text Format.)

Step 8: You will get all the form fields coming from the Get response details step; you can add them using the dynamic content.

Step 9: In the Send an email (V2) action, create your message and style it using the rich text editor. Once you are done styling your message, click the code view button as shown in the image below.

Step 10: In the code view you will get the rich text message as HTML. Copy this code and delete the Send an email (V2) step.

Step 11: Click New step and search for Compose.

Step 12: Rename this Compose to “Message body” and paste in the HTML code of the message body from Step 10.

Step 13: Now click New step and search for the “Post message in a chat or channel” action.

Step 14: Fill in the details as shown below. You can post as a user or as the Flow bot; select the team and the channel, and paste the output of the Compose into the message.

Output

Hope this helps 😊!
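For reference, the HTML copied out of code view in Step 10 and pasted into the Message body compose might look something like the sketch below. This is a hypothetical example: the exact markup depends on how you styled your message, and the @{outputs('Get_response_details')?['body/...']} tokens stand in for the dynamic content of your form’s answers (Forms generates opaque question IDs, shown here as the placeholder r1a2b3c4):

```html
<p><strong>New form response received</strong></p>
<p>Submitted by: @{outputs('Get_response_details')?['body/responder']}</p>
<p>Answer to question 1: @{outputs('Get_response_details')?['body/r1a2b3c4']}</p>
```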
We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
How to Enable Recycle Bin in Dynamics 365 CRM
When working with Dynamics 365 CRM, one common request from users and admins is: “How do we get a Recycle Bin to recover accidentally deleted records?” Unlike SharePoint or Windows, Dynamics 365 doesn’t come with a native Recycle Bin. But that doesn’t mean you’re out of luck! There are a few smart ways to implement soft-delete or restore capabilities depending on your organization’s needs. In this blog, we’ll explore the available options, from built-in Power Platform features to custom approaches, to simulate or enable Recycle Bin-like functionality in Dynamics 365 CRM.

Option 1: Use the Built-in Dataverse Recycle Bin (Preview/GA in Some Regions)

Microsoft is gradually rolling out a Recycle Bin feature for Dataverse environments. How to enable it (the exact path may vary as the feature rolls out): in the Power Platform admin center, open your environment, go to Settings > Product > Features, turn on the Recycle Bin, and set the retention period.

Option 2: Implement a Custom Recycle Bin (Recommended for Full Control)

The idea here is a soft delete: instead of physically deleting records, mark them with a custom status (or move them to a holding entity) so they can be browsed and restored later. You can then schedule a bulk delete after 15-30 days to actually clear these records from Dataverse.

Option 3: Restore from Environment Backups

If a record is permanently deleted, your last line of defence is a full environment restore. Not ideal for frequent recovery, but lifesaving in major accidents.

Tips and Tools You Can Use

If you also want to track who deleted what and when, Auditing might be helpful. You cannot restore deleted records with it; it is useful only for traceability and compliance, not recovery. XrmToolBox plugins like Recycle Bin Manager simulate soft delete and allow browsing deleted records.

While Dynamics 365 CRM doesn’t provide a built-in Recycle Bin like other Microsoft products, there are several reliable ways to implement soft-delete or recovery mechanisms that fit your organization’s needs. Whether you leverage Dataverse’s native capabilities, create a custom status-based Recycle Bin, or track deletions through auditing and backups, it’s essential to plan ahead for data protection and user experience.
By proactively enabling recovery options, you not only safeguard critical business data but also empower users with confidence and control over their CRM operations. What’s Your Approach? Have you built your own Recycle Bin experience in Dynamics 365? Share your thoughts or tips in the comments below! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Add or Remove Sample Data in a Dynamics 365 CRM Environment
Let’s say you configured a Dynamics 365 Sales, Project Operations, or Field Service trial for a client demo. To save you the effort of creating sample data, Dynamics gives you an option to add sample data to any Dynamics 365 environment. You can choose to install the sample data while creating the environment; however, if you forgot to do so, here is how you can add it to an existing environment.

Step 1 – Go to https://admin.powerplatform.microsoft.com/environments, select your Dynamics 365 environment, and click View details.

Step 2 – On the details page, click Settings.

Step 3 – On the Settings page, under Data management, you will see an option named Sample data; click on it.

Step 4 – Click Install, and after a few minutes sample data will be added to your Dynamics 365 environment.

Similarly, if sample data is already installed and you wish to remove it, you will see a Remove sample data button instead of Install sample data.

Hope this helps! 😊 I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
QA Made Easy with KQL in Azure Application Insights
In today’s world of modern DevOps and continuous delivery, the ability to analyze application behavior quickly and efficiently is key to Quality Assurance (QA). Azure Application Insights offers powerful telemetry collection, but what makes it truly shine is the Kusto Query Language (KQL): a rich, expressive query language that enables deep-dive analytics into your application’s performance, usage, and errors. Whether you’re testing a web app, monitoring API failures, or validating load test results, KQL can become your best QA companion.

What is KQL?

KQL stands for Kusto Query Language, and it’s used to query telemetry data collected by Azure Monitor, Application Insights, and Log Analytics. It’s designed to read like English, with SQL-style expressions, yet it is much more powerful for telemetry analysis.

Challenges Faced with Application Insights in QA

1. Telemetry data doesn’t always show up immediately after execution, causing delays in debugging and test validation.
2. When testing involves thousands of records, isolating failed requests or exceptions becomes tedious and time-consuming.
3. The default portal experience lacks intuitive filters for QA-specific needs like test case IDs, custom payloads, or user roles.
4. Repeated logs from expected failures (e.g., negative test cases) can clutter insights, making it hard to focus on actual issues.
5. Out-of-the-box telemetry doesn’t group actions by test scenario or user session unless explicitly configured, making traceability difficult during test case validation.

To overcome these limitations, QA teams need more than just default dashboards: they need flexibility, precision, and speed in analyzing telemetry. This is where Kusto Query Language (KQL) becomes invaluable. With KQL, testers can write custom queries to filter, group, and visualize telemetry exactly the way they need, allowing them to focus on real issues, validate test scenarios, and make data-driven decisions faster and more efficiently.
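As a taste of what this looks like in practice, here is a small sampler of Application Insights queries of the kind discussed below. The tables used (requests, exceptions, customEvents) are standard Application Insights tables; the specific operation name, event name, and time windows are illustrative placeholders you would replace with your own:

```kusto
// Did the latest deployment introduce new exceptions?
exceptions
| where timestamp > ago(1d)
| summarize occurrences = count() by type, outerMessage
| order by occurrences desc

// All failed requests in the last 24 hours
requests
| where timestamp > ago(24h) and success == false
| project timestamp, name, resultCode, duration

// Performance of a specific page or operation (placeholder name)
requests
| where name == "GET /orders"
| summarize avgDuration = avg(duration), p95 = percentile(duration, 95)
    by bin(timestamp, 1h)

// Correlate failed requests with their exceptions via operation_Id
requests
| where success == false
| join kind=inner exceptions on operation_Id
| project timestamp, name, resultCode, type, outerMessage

// Validate custom event tracking (placeholder event name)
customEvents
| where name == "SubmitButtonClicked"
| summarize clicks = count() by bin(timestamp, 1h)
```

Each of these can be pinned to an Azure Dashboard or a workbook once it proves useful.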
Let’s look at some common scenarios where KQL proves very effective:

a. Check whether the latest deployment introduced new exceptions
b. Find all failed requests
c. Analyse performance of a specific page or operation
d. Correlate requests with exceptions
e. Validate custom event tracking (like button clicks)
f. Track specific user sessions for end-to-end QA testing
g. Test API performance under load

All of this can be visualized too: you can pin your KQL queries to Azure Dashboards or even Power BI for real-time tracking during QA sprints.

To conclude, KQL is not just for developers or DevOps. QA engineers can significantly reduce manual log-hunting and accelerate issue detection by writing powerful queries in Application Insights. By incorporating KQL into your testing lifecycle, you add an analytical edge to your QA process, making quality not just a gate but a continuous insight loop. Start with a few basic queries, and soon you’ll be building powerful dashboards that QA, Dev, and Product can all share!

Hope this helps! I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Merging Unmanaged Solutions in Power Platform with XRMToolBox
Let’s say you are developing a model-driven app or doing some custom app development in CRM, and multiple teams have created multiple solutions containing customizations for the development. It is best to have all the customizations in a single solution before moving them to UAT or Production. In this blog, I will show you how to move components from multiple solutions into a single main solution using the Solution Component Mover tool in XrmToolBox. So let’s begin.

Step 1: Download XrmToolBox from this link – https://www.xrmtoolbox.com/

Step 2: Create a connection to your Dynamics 365 environment inside XrmToolBox by clicking Create a new connection.

Step 3: Click Microsoft Login Control.

Step 4: Click Open Microsoft Login Control.

Step 5: Select “Display list of available organizations” and “Show advanced”, enter your username and password, and after successful authentication, name your connection.

Step 6: In the Tool Library, search for “Solution Component Mover” and hit Install.

Step 7: Once the tool is installed, it will appear in your tool list; click on it.

Step 8: Once you are in the Solution Component Mover tool, click Load Solutions. You will get a list of all managed and unmanaged solutions. Select the solutions you want to merge in the Source Solutions section and select the target solution into which you want to move the components (selected solutions are highlighted in light grey colour). Once you have selected the source and target solutions, hit Copy Components, and we are done: all the elements from the source solutions will be moved to the target solution.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Azure Integration Services (AIS): The Key to Scalable Enterprise Integrations
In today’s dynamic business environment, organizations rely on multiple applications, systems, and cloud services to drive operations, making scalable enterprise integrations essential. As businesses grow, their data flow and process complexity increase, demanding integrations that can handle expanding workloads without performance bottlenecks. Scalable integrations ensure seamless data exchange, real-time process automation, and interoperability between diverse platforms like CRM, ERP, and third-party services. They also provide the flexibility to adapt to evolving business needs, supporting digital transformation and innovation. Without scalable integration frameworks, enterprises risk inefficiencies, data silos, and high maintenance costs, limiting their ability to scale operations effectively.

Are you finding it challenging to scale your business operations efficiently? In this blog, we’ll look into key Azure Integration Services that can help overcome common integration hurdles. Before we get into AIS, let’s start with some business numbers; after all, money is what matters most to any business. Several organizations have reported significant cost savings and operational efficiencies after implementing Azure Integration Services (AIS). Here are some notable examples:

Measurable Business Benefits with AIS

A financial study evaluating the impact of deploying AIS found that organizations experienced benefits totalling $868,700 over three years.

Modernizing Legacy Integration: BizTalk to AIS

A financial institution struggling with outdated integration adapters transitioned to Azure Integration Services. By leveraging Service Bus for reliable message delivery and API Management for secure external API access, they reduced operational costs by 25% and improved system scalability.
These examples demonstrate the substantial cost reductions and efficiency improvements that businesses can achieve by leveraging Azure Integration Services. To put this into perspective, we’ll explore real-world industry challenges and how Azure’s integration solutions can effectively resolve them.

Example 1: Secure & Scalable API Management for a Manufacturing Company

Scenario: A global auto parts manufacturer supplies components to multiple automobile brands and exposes APIs to its suppliers.

Challenges: They face serious challenges around API security, versioning, and visibility. These are just top-level issues; there can be many more complexities.

Solution: Azure API Management (APIM)

The manufacturer deploys Azure API Management (APIM) to secure, manage, and monitor their APIs.

Step 1: Secure APIs – APIM enforces OAuth-based authentication so only authorized suppliers can access APIs. Rate limiting prevents overuse.
Step 2: API Versioning – Different suppliers use v1 and v2 of the APIs. APIM ensures smooth version transitions without breaking old integrations.
Step 3: Analytics & Monitoring – The company gets real-time insights on API usage, detecting slow queries and bottlenecks.

Result:

Example 2: Reliable Order Processing with Azure Service Bus for an E-commerce Company

Scenario: A fast-growing e-commerce company processes over 50,000 orders daily across multiple sales channels (website, mobile app, and third-party marketplaces). Orders are routed to inventory, payment, and fulfilment systems.

Challenges: Direct point-to-point connections between the sales channels and downstream systems risk data loss and overload whenever any system goes down.

Solution: Azure Service Bus (Message Queueing)

Instead of direct connections, the company decouples services using Azure Service Bus.

Step 1: Queue-Based Processing – Orders are sent to an Azure Service Bus queue, ensuring no data loss even if systems go down.
Step 2: Asynchronous Processing – Inventory, payment, and fulfilment consume messages independently, avoiding system overload.
Step 3: Dead Letter Queue (DLQ) Handling – Failed orders are sent to a DLQ for retry instead of getting lost.
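The decoupling pattern in the steps above can be sketched in a few lines. This is a language-agnostic illustration using plain in-memory queues, not the Azure Service Bus SDK; in production the two queue objects would be a Service Bus queue and its dead-letter subqueue:

```python
import queue

# In production these would be an Azure Service Bus queue and its
# dead-letter subqueue; plain in-memory queues illustrate the pattern.
orders = queue.Queue()       # main order queue
dead_letter = queue.Queue()  # failed orders parked for retry

def submit_order(order):
    """Producer side: sales channels enqueue orders and return immediately,
    so a slow or unavailable consumer never blocks order capture."""
    orders.put(order)

def process_orders(handle):
    """Consumer side: inventory/payment/fulfilment drain the queue
    independently; orders whose handler fails are dead-lettered
    instead of being lost."""
    processed = []
    while not orders.empty():
        order = orders.get()
        try:
            handle(order)
            processed.append(order)
        except Exception:
            dead_letter.put(order)
    return processed
```

With three submitted orders and a handler that rejects one of them, two are processed and the bad one lands in the dead-letter queue for later inspection, mirroring Steps 1-3 above.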
Result:

Example 3: Automating Invoice Processing with Logic Apps for a Logistics Company

Scenario: A global shipping company receives thousands of invoices from suppliers every month. These invoices must be extracted, validated, approved, and logged in the ERP.

Challenges: At this volume, manual invoice handling is slow and error-prone.

Solution: Azure Logic Apps for End-to-End Automation

The company automates the entire invoice workflow using Azure Logic Apps.

Step 1: Extract Invoice Data – Logic Apps connects to Office 365 & Outlook, extracts PDFs, and uses AI-powered OCR to read invoice details.
Step 2: Validate Data – The system cross-checks invoice amounts and supplier details against purchase orders in the ERP.
Step 3: Approval Workflow – If all details match, the invoice is auto-approved. If there’s a discrepancy, it’s sent to finance via Teams for review.
Step 4: Update SAP & Notify Suppliers – Once approved, the invoice is automatically logged in SAP, and the supplier gets a payment confirmation email.

Result:

With Azure API Management, Service Bus, and Logic Apps, businesses can streamline supplier connections, optimize order processing, and orchestrate workflows at scale. Many organizations are also shifting towards no-code solutions like Logic Apps for faster integrations. Whether you’re looking for API security, event-driven automation, or workflow orchestration, Azure Integration Services has a solution for you.

Azure Integration Services (AIS) is not just a collection of tools; it’s a game-changer for businesses looking to modernize their integrations, reduce operational costs, and improve scalability. From secure API management to reliable messaging and automation, AIS provides the flexibility and efficiency needed to handle complex business workflows seamlessly. The numbers speak for themselves: organizations have saved hundreds of thousands of dollars while improving their integration capabilities. Whether you’re looking to streamline supplier connections, optimize order processing, or migrate from legacy systems, AIS has a solution for you.

What’s Next?
In our next article, we’ll take a deep dive into a real-world scenario, showcasing how we helped our customer Buchi transform their integration landscape with Azure Integration Services. Next up: Why AIS? How Easily Azure Integration Services Can Adapt to Your EDI Needs. We would love to hear your thoughts! How are you handling enterprise integrations today? Comment below or contact us at transform@cloudfronts.com
