Struggling with Siloed Systems? Here’s How CloudFronts Gets You Connected
In today's world, we use many different applications for our daily work. No single application can handle everything, because some apps are designed for specific tasks. That's why organizations use multiple applications, which often leads to data being stored separately, in isolation. In this blog, we'll take you on a journey from siloed systems to connected systems through a customer success story.

About BÜCHI

Büchi Labortechnik AG is a Swiss company renowned for providing laboratory and industrial solutions for R&D, quality control, and production. Founded in 1939, Büchi specializes in a range of laboratory technologies. Their equipment is widely used in pharmaceuticals, chemicals, food & beverage, and academia for sample preparation, formulation, and analysis. Büchi is known for its precision, innovation, and strong customer support worldwide.

Systems Used by BÜCHI

To streamline operations and ensure seamless collaboration, BÜCHI leverages a variety of enterprise systems: Infor and SAP Business One are used for managing critical business functions such as finance, supply chain, manufacturing, and inventory.

Reporting Challenges Due to Siloed Systems

Organizations often rely on multiple disconnected systems across departments — such as ERP, CRM, marketing platforms, spreadsheets, and legacy tools. These siloed systems result in fragmented, inconsistent reporting and data that is hard to trust.

The Need for a Single Source of Truth

To solve these challenges, it's critical to establish a Single Source of Truth (SSOT) — a central, trusted data platform where all key business data is consolidated, consistent, accurate, and up to date.

How We Helped Büchi Connect Their Systems

To build a seamless and scalable integration framework, we leveraged the following Azure services:

- Azure Logic Apps – Enabled no-code/low-code automation for integrating applications quickly and efficiently.
- Azure Functions – Provided serverless computing for lightweight data transformations and custom logic execution.
- Azure Service Bus – Ensured reliable, asynchronous communication between systems, with FIFO message processing and decoupling of sender/receiver availability.
- Azure API Management (APIM) – Secured and simplified access to backend services by exposing only the required APIs, enforcing policies like authentication and rate limiting, and unifying multiple APIs under a single endpoint.

BÜCHI's case study was published on the Microsoft website, highlighting how CloudFronts helped connect their systems and prepare their data for insights and AI-driven solutions.

Why a Single Source of Truth (SSOT) Is Important

A Single Source of Truth means having one trusted location where your business stores consistent, accurate, and up-to-date data.

How We Did This

We used Azure Function Apps, Service Bus, and Logic Apps to seamlessly connect the systems. Databricks was implemented to build a Unity Catalog, establishing a Single Source of Truth (SSOT). On top of this unified data layer, we enabled advanced analytics and reporting using Power BI.

In May, we hosted an event with BÜCHI at the Microsoft office in Zurich. During the session, one of the attending customers remarked, "We are five years behind BÜCHI." Another added, "If we don't start now, we'll be out of the race in the future." This clearly reflects the urgent need for businesses to evolve. Today, Connected Systems, a Single Source of Truth (SSOT), Advanced Analytics, and AI are not optional — they are essential for sustainable growth and improved human efficiency.
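To give a feel for the integration pattern, here is a minimal sketch of how one system can hand a change event to another over Azure Service Bus in C#. The queue name, payload, and session key are illustrative assumptions, not Büchi's actual configuration; in the real project, Logic Apps and Functions orchestrate these messages end to end.

using System;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

class OrderEventPublisher
{
    static async Task Main()
    {
        // Connection string and queue name are illustrative placeholders.
        string connection = Environment.GetEnvironmentVariable("SERVICEBUS_CONNECTION");
        await using var client = new ServiceBusClient(connection);
        ServiceBusSender sender = client.CreateSender("erp-sync");

        // A change event from the source system (illustrative payload).
        string body = JsonSerializer.Serialize(new { OrderNo = "SO-1001", Status = "Shipped" });

        // SessionId enables FIFO (in-order) processing per order on a session-enabled queue.
        var message = new ServiceBusMessage(body) { SessionId = "SO-1001" };
        await sender.SendMessageAsync(message);
    }
}

Because the sender and receiver only share the queue, either side can be temporarily unavailable without losing data, which is exactly the decoupling described above.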
The pace of transformation has accelerated: tasks that once took months can now be achieved in days, and soon, perhaps, with just a prompt. To conclude, if you're operating with multiple disconnected systems and relying heavily on manual processes, it's time to rethink your approach. System integration and automation free your teams from repetitive work and empower them to focus on high-impact, strategic activities. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
CTEs vs Subqueries in SQL: What’s the Difference and When to Use Them?
What happens when a SQL query becomes too long or hard to follow?

- It gets confusing
- Difficult to debug
- Hard to maintain or extend

Use Subqueries or Common Table Expressions (CTEs) to break down the logic and improve readability.

What is a Subquery?

A subquery is a query inside another query. Below is a query that shows customers whose remaining amount is above the average (table and column names are illustrative):

SELECT CustomerName, RemainingAmount
FROM Customers
WHERE RemainingAmount > (SELECT AVG(RemainingAmount) FROM Customers);

What is a CTE (Common Table Expression)?

A CTE is a temporary result set you can reference in a main query. It starts with the WITH keyword and improves readability, especially with multi-step logic. The same logic written with a CTE:

WITH AvgRemaining AS (
    SELECT AVG(RemainingAmount) AS AvgAmount
    FROM Customers
)
SELECT c.CustomerName, c.RemainingAmount
FROM Customers c
CROSS JOIN AvgRemaining a
WHERE c.RemainingAmount > a.AvgAmount;

To Conclude:

Subqueries
- Advantages: Quick and easy for simple filtering. Good for one-off checks.
- Disadvantages: Harder to read when nested. Redundancy if used multiple times (no reuse).

CTEs (Common Table Expressions)
- Advantages: Clean, readable SQL for complex queries. Can be recursive.
- Disadvantages: May be slightly slower in some databases. Not supported in old SQL engines.

Both subqueries and CTEs help you write better SQL, but choosing the right one depends on your needs. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
How to Implement Incremental Refresh in Power BI
Refreshing large datasets in Power BI can become time-consuming and resource-intensive as data volume grows. If your reports are based on millions of rows of historical data, refreshing everything daily is neither efficient nor necessary. This is where Incremental Refresh comes in. It allows Power BI to refresh only new or changed data, drastically improving performance and reducing load on your data source. In this blog, you'll learn how to set up incremental refresh step by step, so your Power BI reports stay fast and efficient even with big data.

What Is Incremental Refresh in Power BI?

Incremental Refresh enables Power BI to load data in partitions, refreshing only the latest ones (e.g., the past 7 days) while keeping the older data static. Why use it? It cuts refresh times, reduces load on the source, and keeps large models manageable.

Step 1: Define Parameters in Power Query

- Open your report in Power BI Desktop (Pro or Premium workspace)
- Go to Transform Data (Power Query Editor)
- Create two parameters named exactly RangeStart and RangeEnd, both of type Date/Time (Power BI requires these names for incremental refresh)
- Set default values (e.g., RangeStart = 01/01/2020, RangeEnd = 01/01/2021)

Step 2: Filter Your Data with These Parameters

Filter your table's date column so it is greater than or equal to RangeStart and less than RangeEnd. For example, in Power Query (assuming a hypothetical OrderDate column):

= Table.SelectRows(Source, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)

Keep the comparison inclusive on only one boundary so rows are not loaded into two partitions. This tells Power BI what time range to load and, eventually, what to refresh incrementally.

Step 3: Enable Incremental Refresh in Data Model

Right-click the table in Power BI Desktop and choose Incremental refresh to define the policy. 📝 Example: archive data for, say, two years and incrementally refresh the last 7 days. This configuration refreshes only the recent week of data each time, while keeping the rest intact.

Step 4: Publish to Power BI Service

✅ Done! You've now implemented incremental refresh.

Best Practices

Make sure the RangeStart/RangeEnd filter folds back to the data source (query folding); otherwise Power BI may still pull the full dataset before filtering. Keep the refresh window as small as practical, and verify the generated partitions after the first refresh in the Service.

To conclude, Incremental Refresh is a game-changer when it comes to handling large datasets in Power BI. It not only saves refresh time but also optimizes resource usage. By learning how to configure it properly, you can scale your reports with confidence and efficiency. Got a large dataset slowing down your Power BI refresh? Implement Incremental Refresh today and see the difference. Explore more Power BI performance tips in our blog series, or reach out for help setting up enterprise-grade models. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Seamlessly Generating and Downloading SSRS Reports in MFA-Enabled Power Pages Environments
Generating SSRS (SQL Server Reporting Services) reports from within Power Pages becomes more complex in environments secured by Multi-Factor Authentication (MFA). Traditional approaches that work in basic environments tend to fail silently or inconsistently when MFA, session tokens, or cookie-based authentication are involved. In this blog, I'll share a real-world solution where a project-based SSRS report was generated securely, sent via email, and optionally downloaded — all within the constraints of a Power Pages + Power Automate architecture in an MFA-protected Dynamics 365 environment.

Scenario Overview and Problem Statement

Standard HTTP-based retrieval of SSRS reports using the Reserved.ReportViewerWebControl.axd endpoint fails in MFA-protected environments due to missing browser session cookies. This often results in 302 redirects or HTML-based error messages that cannot be processed by Power Automate.

Initial Approach and Issue

A flow was constructed to:

- Capture the Project ID from Power Pages and pass it to the Power Automate flow via an HTTP trigger, initiated when a user clicks a button on the portal through embedded JavaScript (a sketch of this trigger call appears at the end of this post).
- Build the SSRS report URL dynamically through a chain of Compose actions: PDF Download Start Index -> PDF Download String Length -> PDF Download URL. Here we also replaced the PrintOnOpen=true parameter with PrintOnOpen=false in the report export URL, to prevent the print dialog from automatically appearing when the PDF is opened.
- Perform an HTTP GET request to download the report.

This failed consistently on the first try because the report session page was not fully ready or authenticated, especially in an MFA environment.

Working Solution: Retry with Delay in Power Automate

To overcome the session-based delay, we implemented a retry pattern inside Power Automate. Outcome: the flow fails the first time (as expected) but succeeds on the second or third retry, once the session becomes valid and the SSRS report is available.

Power Automate configuration highlights:

- Added a Scope block after the first HTTP request and set its Configure run after to Skipped and Failed.
- If needed, you can add a third delay-and-retry in case the second one fails.

To conclude, sometimes achieving reliability in secure environments isn't about complex code—it's about using the right orchestration patterns. By strategically delaying and retrying the HTTP request to SSRS within Power Automate, we achieved consistent, secure report generation that works even under MFA constraints. 🔗 Need help implementing this retry-based flow in your environment? Reach out to CloudFronts—we help businesses implement scalable, reliable solutions every day. You can contact us directly at transform@cloudfronts.com.
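As referenced above, here is a minimal sketch of the portal-side JavaScript that starts the flow. The flow URL and payload shape are assumptions; when using an HTTP-triggered flow, copy the real HTTP POST URL from the trigger itself.

// Hypothetical HTTP POST URL of the Power Automate trigger; replace with your flow's URL.
var FLOW_URL = "https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke?api-version=2016-06-01";

function requestProjectReport(projectId) {
    // Send the Project ID captured on the Power Pages form to the flow.
    fetch(FLOW_URL, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ projectId: projectId })
    })
    .then(function (response) {
        if (!response.ok) {
            throw new Error("Report request failed: " + response.status);
        }
        alert("Report generation started. You will receive it by email shortly.");
    })
    .catch(function (err) {
        console.error(err);
    });
}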
How to Send Emails with CC and BCC in Business Central Using AL Language
Sending emails programmatically in Microsoft Dynamics 365 Business Central (BC) is a common requirement for customizations, be it sending invoices, reminders, or notifications. With the enhanced email capabilities in recent versions, AL developers can include CC (Carbon Copy) and BCC (Blind Carbon Copy) recipients when sending emails. In this blog, we'll walk through how to send an email in Business Central using AL and how to include CC and BCC fields effectively.

Steps to achieve the goal, with example code:

procedure SendInvoiceEmail(var PurchaseInvHeader: Record "Purch. Inv. Header"): Boolean
var
    EmailAccount: Codeunit "Email Account";
    EmailMessage: Codeunit "Email Message";
    Email: Codeunit Email;
    PurchaseInvLine: Record "Purch. Inv. Line";
    Vendor: Record Vendor;
    Item: Record Item;
    Currency: Record Currency;
    CurrencySymbol: Text;
    ItemDescription: Text[2048];
    ItemNumber: Code[30];
    TotalAmount: Decimal;
    Link: Text[2048];
    EmailType: Enum "Email Recipient Type";
begin
    // Get vendor information
    Vendor.SetRange("No.", PurchaseInvHeader."Buy-from Vendor No.");
    if Vendor.FindFirst() then;

    // Process purchase invoice lines
    PurchaseInvLine.SetRange("Document No.", PurchaseInvHeader."No.");
    PurchaseInvLine.SetFilter(Quantity, '<>%1', 0);
    if PurchaseInvLine.FindSet() then
        repeat
            Item.Reset();
            Item.SetRange("No.", PurchaseInvLine."No.");
            Item.SetRange("Second Hand Goods", true); // Example filter
            if Item.FindFirst() then begin
                ItemDescription := PurchaseInvLine.Description;
                ItemNumber := PurchaseInvLine."No.";
                TotalAmount += PurchaseInvLine."Amount Including VAT";
                Currency.SetRange(Code, PurchaseInvHeader."Currency Code");
                if Currency.FindFirst() then
                    CurrencySymbol := Currency.Symbol
                else
                    CurrencySymbol := '€'; // Default symbol
            end;
        until PurchaseInvLine.Next() = 0;

    // Generate a form link with dynamic query parameters (example only)
    Link := 'https://your-form-url.com?vendor=' + Vendor."No." + '&invoice=' + PurchaseInvHeader."No.";

    // Create and configure the email
    EmailMessage.Create(
        Vendor."E-Mail",
        'Subject: Review Your Invoice',
        'Hello ' + Vendor.Name + ',<br><br>' +
        'Please review the invoice details for item: ' + ItemDescription + '<br>' +
        'Total: ' + Format(TotalAmount) + ' ' + CurrencySymbol + '<br>' +
        'Form Link: <a href="' + Link + '">Click here</a><br><br>' +
        'Best regards,<br>Your Company Name',
        true); // IsBodyHtml

    // Add CC and BCC recipients via the "Email Recipient Type" enum (addresses are placeholders)
    EmailMessage.AddRecipient(EmailType::Cc, 'manager@yourdomain.com');
    EmailMessage.AddRecipient(EmailType::Bcc, 'example@yourdomain.com');

    // Update invoice status or flags (optional business logic)
    PurchaseInvHeader."Custom Status Field" := PurchaseInvHeader."Custom Status Field"::Started;
    PurchaseInvHeader.Modify();

    // Send the email
    exit(Email.Send(EmailMessage));
end;

To conclude, sending emails directly from AL in Business Central is a powerful way to streamline communication with vendors, customers, and internal users. By leveraging the Email Message and Email codeunits, developers can easily customize the subject, body, and recipients, including support for CC and BCC fields. This flexibility makes it easy to automate notifications, document sharing, or approval requests directly from your business logic. Whether you're integrating forms, sending invoices, or just keeping stakeholders in the loop, this approach ensures your extensions are both professional and user-friendly. With just a few lines of code, you can improve efficiency and enhance communication across your organization. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
How We Built Smart Pitch — and What We Learned Along the Way
In today's world, AI is no longer a luxury — it's a necessity for driving smarter decisions, faster innovation, and personalized experiences. We came up with our requirements for AI to support the conversion from MQL (Marketing Qualified Lead) to SQL (Sales Qualified Lead).

How did the idea originate?

In our organization, whenever a prospect reaches out to us, we search for company information like company size, revenue, location, industry type, contact person details, designation, decision-maker, and LinkedIn profile. This information helps the Sales team prepare better and deliver a stronger pitch by understanding the customer before the call. During the MQL to SQL stage, we also look for further details about the opportunity. This information helps the Sales team convert the prospect into a client and increases our chances of winning the deal. Earlier, this entire process was manual and time-consuming. So, we decided to automate it with an AI agent that can gather this information for us in just a few minutes.

Implementation approach

After the project was approved internally, we started exploring how to make it happen. Initially, we didn't know where or how to start. During our research, we came across Copilot Studio, which allows us to build custom agents from scratch based on our needs. We learned about Copilot Studio's capabilities and began building our agent. We named it Elevator Pitch.

Version 1 gathered this company and contact information for the team on demand. Feedback on it led to the idea for Version 2, which would automate more steps and also pull information from the internet. With Version 2, a single click on MQL to SQL brings up company and contact information, and the agent now generates the document within minutes — something that earlier used to take hours or even a full day.

Live demo in Zurich & New York

On 22 May 2025, we had an event scheduled at the Microsoft office in Zurich with one of our clients, where we shared the Büchi journey with CloudFronts. We discussed how we collaborated to connect their multiple systems and prepared their data for insights and AI initiatives. At the same event, we had the opportunity to demonstrate our Smart Pitch product, which caught the audience's attention. It was a proud moment for us to showcase our first AI product at the Microsoft office — delivered within just a few months of hard work. Our second opportunity came on 6 June 2025 in New York, at the AI Community Conference, where we presented again in front of a global audience.

What's next in Version 3

So far, we have built this solution using Microsoft's inbuilt Knowledge Center, the ChatGPT API, SharePoint, company websites, and Dataverse. Since we were working with both structured and unstructured data, we faced some inconsistencies and performance issues. This led us to reflect and identify the need for Version 3 (V3). The development of Smart Pitch V3 is currently in progress; we'll share our thoughts once it goes live. A demo video has also been shared, so you can see how smart and fast our agent is at delivering useful insights.

Delivering Answers in Minutes — Thanks to Smart Pitch

I'd also like to share a quick story. One day, our Practice Manager was on leave, and we received a prospect inquiry about Project Operations to Business Central (PO-BC) pricing. I wasn't sure where that information was stored, and suddenly our CEO asked me for the details. I was a bit stressed, unsure where to search or how to respond. Then I decided to ask our Smart Pitch agent the same question.
To my surprise, the agent quickly gave me the exact information I needed. It was a big relief, and I was able to share the details with our CEO in just a few minutes — without even knowing where the document was uploaded. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
When Physical Inventory and Financial Inventory Don’t Match in Dynamics 365 Finance & Operations
In any organization, maintaining accurate inventory records is critical — not only for operational efficiency but also for financial accuracy, reporting, and compliance. In Dynamics 365 Finance and Operations (D365 F&O), inventory is tracked from two perspectives: physical inventory and financial inventory. While these two should ideally be aligned at all times, mismatches are common in practice. Whether caused by pending invoices, misconfigured settings, or improper transaction handling, discrepancies between physical and financial inventory can create confusion, misstatements in financials, and operational bottlenecks. This blog explains why these mismatches occur, how to detect and resolve them, and what best practices you can adopt to ensure alignment between physical and financial inventory in D365 F&O.

Before diving in, it is essential to understand the distinction between the two inventory layers in D365 F&O: physical inventory reflects quantities recorded by physical transactions such as product receipts and packing slips, while financial inventory reflects the value recorded once transactions are financially updated, for example when the invoice is posted. A mismatch occurs when the physical quantity and the financial value or quantity of an item do not align, leading to inconsistencies between what's physically available and what's financially accounted for.

Reasons for Mismatch

Common causes include physically posted receipts whose invoices are still pending, misconfigured posting settings, and improper transaction handling.

How to Detect the Mismatch

To identify mismatches between physical and financial inventory in D365 F&O, compare physical and financial quantities and values on the on-hand inventory inquiry, review the Inventory value report, and check for transactions that are received but not yet invoiced.

Tips to Resolve the Mismatch

Post the pending financial transactions (such as vendor invoices), then run an inventory recalculation or inventory close so that financial values are brought in line with physical quantities.

Let's take an example for better understanding: suppose a business receives 100 units of an item on a purchase order. The receipt is physically posted, making the stock available in inventory. However, if the invoice is not posted, no financial value is recorded. This results in a positive physical quantity but zero financial value. Once the invoice is posted and inventory is closed or recalculated, the financial value is updated, resolving the mismatch.

Best Practices to Prevent Inventory Mismatches

Post financial transactions promptly, schedule regular inventory closes or recalculations, and reconcile inventory against the ledger on a recurring basis.

To conclude, inventory mismatches between physical and financial layers in D365 F&O are more than just system issues — they are business-critical challenges. These discrepancies can distort financial reporting, mislead operational planning, and expose the organization to audit risks. The good news is that they are entirely preventable. By understanding the causes, implementing regular checks, and following best practices such as prompt financial posting and scheduled inventory closes, you can maintain accurate, reliable inventory data. Achieving alignment between your physical and financial inventory ensures operational clarity and financial integrity — foundations that are essential for confident decision-making and long-term success. Hope this helps, and thanks for reading! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
How to Use Webhooks for Real-Time CRM Integrations in Dynamics 365
Are you looking for a reliable way to integrate Dynamics 365 CRM with external systems in real time? Polling APIs or scheduling batch jobs can delay updates and increase complexity. What if you could instantly notify external services when key events happen inside Dynamics 365? This blog will show you how to use webhooks — an event-driven mechanism to trigger real-time updates and data exchange with external services, making your integrations faster and more efficient.

A webhook is a user-defined HTTP callback that is triggered by specific events. Instead of your external system repeatedly asking Dynamics 365 for updates, Dynamics 365 pushes updates to your service instantly. Dynamics 365 supports registering webhooks through plugin steps that execute when specific messages (create, update, delete, etc.) occur. This approach provides low latency and ensures that your external systems always have fresh data. This walkthrough covers the end-to-end process of configuring webhooks in Dynamics 365, registering them via plugins, and securely triggering external services.

What You Need to Get Started

Step 1: Create Your Webhook Endpoint (a minimal endpoint sketch, under stated assumptions, appears at the end of this post)

Step 2: Register Your Webhook in Dynamics 365

Step 3: Create a Plugin to Trigger the Webhook

public void Execute(IServiceProvider serviceProvider)
{
    // Obtain the execution context so we can read the target record of the event.
    var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

    // The notification service delivers the payload to the registered service endpoint (webhook).
    var notificationService = (IServiceEndpointNotificationService)serviceProvider.GetService(typeof(IServiceEndpointNotificationService));

    Entity entity = (Entity)context.InputParameters["Target"];

    // Notify expects an EntityReference to your webhook's service endpoint registration;
    // replace the placeholder GUID with the ID of your own registration.
    notificationService.Notify(
        new EntityReference("serviceendpoint", new Guid("00000000-0000-0000-0000-000000000000")),
        entity.ToEntityReference().Id.ToString());
}

Register this plugin step for your message (Create, Update, Delete) on the entity you want to monitor.

Step 4: Test Your Webhook Integration

Step 5: Security and Best Practices

Real-World Use Case

A company wants to notify its external billing system immediately when a new invoice is created in Dynamics 365. By registering a webhook triggered by the Invoice creation event, the billing system receives data instantly and processes payment without delays or manual intervention.

To conclude, webhooks offer a powerful way to build real-time, event-driven integrations with Dynamics 365, reducing latency and complexity in your integration solutions. We encourage you to start by creating a simple webhook endpoint and registering it with Dynamics 365 using plugins. This can transform how your CRM communicates with external systems. For deeper technical support and advanced integration patterns, explore CloudFronts' resources or get in touch with our experts to accelerate your Dynamics 365 integration project. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
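To complement Step 1 above, here is one minimal sketch of a receiving endpoint, written as an HTTP-triggered Azure Function. This is an assumption for illustration: any HTTPS endpoint works, and the header name and key check shown here depend on how you configure authentication during webhook registration (Dynamics 365 supports HttpHeader, WebhookKey, and HttpQueryString options).

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class CrmWebhookReceiver
{
    [FunctionName("CrmWebhookReceiver")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        // Validate the shared key configured during webhook registration
        // (the header name "x-webhook-key" is an assumption for this sketch).
        string providedKey = req.Headers["x-webhook-key"];
        if (providedKey != Environment.GetEnvironmentVariable("WEBHOOK_KEY"))
            return new UnauthorizedResult();

        // With the plugin above, the body carries the record ID; a step registered
        // directly on a webhook receives the full execution context as JSON.
        string body = await new StreamReader(req.Body).ReadToEndAsync();
        log.LogInformation("Received CRM notification: {Body}", body);

        return new OkResult();
    }
}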
Migrating Data from Azure Files Share to Azure Blob Storage Using C#
For growing businesses, efficient data management is as critical as streamlined processes and actionable reporting. As organizations scale, the volume and complexity of data stored in systems like Azure Files increase, necessitating robust, scalable storage solutions like Azure Blob Storage. Are you struggling to manage your file storage efficiently? If you're looking to automate data migration from an Azure Files share to Azure Blob Storage using C#, this article is for you.

Research shows that 70% of customers value seamless experiences with efficient systems, impacting brand loyalty. Businesses automating data management processes can reduce retrieval times by up to 90%, while organizations leveraging cloud storage solutions like Azure Blob Storage report a 25% increase in operational productivity and 60% improved satisfaction in data workflows. This article provides a structured guide to migrating data using C#, drawing on practical implementation insights to help Team Leads, CTOs, and CEOs optimize their data storage for scalability and efficiency.

Why Migrate to Azure Blob Storage?

Azure Files offers managed file shares via the Server Message Block (SMB) protocol, suitable for traditional file system needs. However, Azure Blob Storage excels in scalability, cost efficiency, and integration with advanced Azure services like Azure Data Lake and AI/ML workloads.

Migrating Data Using C#: A Step-by-Step Approach

To migrate data from an Azure Files share to Azure Blob Storage programmatically, you can leverage C# with the Azure SDKs. Below is a structured approach, referencing a C# implementation that uses a timer-triggered Azure Function to automate the process.

Step 1: Set Up Your Environment

Step 2: Design the Migration Logic. The C# code uses an Azure Function triggered on a schedule (e.g., every 5 seconds) to process files.

Step 3: Execute the Migration

Step 4: Optimize and Automate

Step 5: Validate and Test

A Glimpse of the C# Implementation

The C# code leverages an Azure Function to automate migration. It connects to the file share, enumerates files, uploads them to a blob container, and deletes them from the source upon successful transfer (a simplified sketch of this logic appears at the end of this post). This approach ensures minimal manual intervention and robust error handling, aligning with the needs of growing businesses.

Benefits of Programmatic Migration

Using C# for migration offers automated, scalable, and cost-efficient data handling.

To conclude, migrating data from an Azure Files share to Azure Blob Storage using C# empowers growing businesses to achieve scalable, cost-efficient, and automated data management. By implementing a structured approach with Azure Functions, you can streamline operations and unlock advanced analytics capabilities. Evaluate your current data management processes and identify one area for improvement, such as automating file transfers with C#. Start today to enhance efficiency and customer satisfaction. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
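As promised above, here is a simplified sketch of the migration logic, assuming hypothetical share and container names and a connection string in an environment variable; the production implementation adds scheduling, logging, retries, and error handling.

using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Files.Shares;
using Azure.Storage.Files.Shares.Models;

class FileShareToBlobMigrator
{
    static async Task Main()
    {
        // Hypothetical names; replace with your storage account's share and container.
        string connection = Environment.GetEnvironmentVariable("STORAGE_CONNECTION");
        var share = new ShareClient(connection, "source-share");
        var container = new BlobContainerClient(connection, "archive");
        await container.CreateIfNotExistsAsync();

        ShareDirectoryClient root = share.GetRootDirectoryClient();

        // Enumerate files at the share root (production code would recurse into subdirectories).
        await foreach (ShareFileItem item in root.GetFilesAndDirectoriesAsync())
        {
            if (item.IsDirectory)
                continue;

            ShareFileClient file = root.GetFileClient(item.Name);

            // Stream the file into Blob Storage, overwriting if it already exists.
            using Stream source = await file.OpenReadAsync();
            await container.GetBlobClient(item.Name).UploadAsync(source, overwrite: true);

            // Delete from the source share only after a successful upload.
            await file.DeleteAsync();
        }
    }
}

In the actual implementation, this method body runs inside the timer-triggered Azure Function described above, so the migration repeats on a schedule without manual intervention.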
US-Based Non-Profit Organization Partners with CloudFronts for a Managed Services Agreement
We are pleased to announce that a leading US-based non-profit organization has partnered with CloudFronts for Dynamics 365 support and maintenance under a Managed Services Agreement (MSA).

Founded in 2010, the organization is headquartered in San Francisco, California, with additional offices in Amsterdam, Venlo, and Raleigh, North Carolina. It is dedicated to advancing sustainable product design through its Certified™ program, which emphasizes material health, product circularity, renewable energy, water stewardship, and social fairness. By supporting global organizations, the non-profit plays a key role in creating safer, recyclable, and more circular products that contribute to a sustainable future.

On this occasion, Priyesh Wagh, Practice Manager at CloudFronts, stated: "Our first project with our client established a great way of working together, and we saw how we could take this implementation ahead and generate value through our work. We look forward to building systems that ease their customer service efforts."

"Discover How We've Enabled Businesses Like Yours – Explore Our Client Testimonials!"

About CloudFronts

CloudFronts is a global AI-First Microsoft Solutions Partner for Business Applications, Data & AI, helping teams and organizations worldwide solve their complex business challenges with Microsoft Cloud, AI, and Azure Integration Services. We have a global presence with offices in the U.S., Singapore, and India. Since its inception in 2012, CloudFronts has successfully served 200+ small and medium-sized clients across North America, Europe, Australia, MENA, the Maldives, and India, with diverse experience in sectors ranging from Professional Services and Financial Services to Manufacturing, Retail, Logistics/SCM, and Non-profits.

Please feel free to connect with us at transform@cloudfronts.com
