Blog Archives - Page 3 of 171

Category Archives: Blog

Post Microsoft Forms Submission Responses in a Teams Channel

Teams is one of the best ways of notifying users about a form submission. In this blog, let's see how we can post new Microsoft Forms responses into a Teams channel.

Step 1: Go to https://make.powerautomate.com/, click on Environments on the top left, and select the environment you want to create your flow in. If you don't have any environments, you can select the default environment.

Step 2: Click on My flows -> New flow and select Automated cloud flow.

Step 3: Name your flow and search for the "When a new response is submitted" trigger.

Step 4: Select the form for which you want to send the notification.

Step 5: Click on New step, search for Forms, and under Actions select "Get response details".

Step 6: Reselect the same form in the first column of the Get response details action. In the second column, add the Response Id coming from the first step; you will get it through the dynamic content just by clicking on the column.

Step 7: Now add a new step and search for the Send an email (V2) action. (We are using this action so that we can compose our post content in Rich Text Format.)

Step 8: You will get all the form fields coming from the Get response details step; you can add them using the dynamic content.

Step 9: In the Send an email (V2) action, create your message and style it using the rich text editor. Once you are done styling your message, click on the code view button as shown in the image below.

Step 10: In code view you will get the rich text message as HTML. Copy this code and delete the Send an email (V2) step.

Step 11: Click on New step and search for Compose.

Step 12: Rename this Compose to "Message body" and paste the HTML code of the message body from Step 10.

Step 13: Now click New step and search for the "Post message in a chat or channel" action.

Step 14: Fill in the details as shown below. You can post this as a user or a flow bot; select the team and the team's channel, and paste the output of the Compose action into the message.
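For reference, the HTML copied in Step 10 looks roughly like the snippet below. This is a hypothetical example: the form name, the question key r1a2b3c4d5, and the exact dynamic-content expressions depend on your own form's fields.

```html
<p><strong>New response received for:</strong> Customer Feedback Form</p>
<p><strong>Submitted by:</strong> @{outputs('Get_response_details')?['body/responder']}</p>
<p><strong>Feedback:</strong> @{outputs('Get_response_details')?['body/r1a2b3c4d5']}</p>
```

When this HTML is placed in the Compose step and posted to the channel, the @{...} expressions are resolved by Power Automate at run time.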
Output:

Hope this helps 😊! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Choosing Between Synchronous and Asynchronous Integration for Dynamics 365

When working with Dynamics 365, one of the key decisions during integration design is whether to implement synchronous or asynchronous communication. Understanding the differences and use cases for each approach is critical to building reliable, efficient, and scalable integrations.

Understanding the Difference

When to Use Synchronous Integration

Synchronous integration is appropriate when:

Advantages: Immediate confirmation, straightforward error detection.
Considerations: Can slow down the system if the target application experiences latency; less scalable for high-volume scenarios.

When to Use Asynchronous Integration

Asynchronous integration is better suited for scenarios where:

Advantages: Highly scalable, non-blocking operations, suitable for batch processing.
Considerations: Errors may not be detected immediately, and tracking processing status requires additional monitoring.

Real-World Examples

Decision-Making Approach

When evaluating which approach to use, consider these questions:

To conclude, both synchronous and asynchronous integrations have distinct advantages and trade-offs. Synchronous workflows provide real-time feedback and simpler error handling, while asynchronous workflows offer scalability and efficiency for high-volume or non-urgent processes. Selecting the right approach for your Dynamics 365 integration requires careful consideration of business requirements, data volume, and system performance. By aligning the integration method with these factors, you can ensure reliable, efficient, and maintainable integrations.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
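As a language-neutral illustration (plain Python, not Dynamics 365 code), the two patterns can be sketched like this: a synchronous call blocks until the target system responds and surfaces errors immediately, while an asynchronous call enqueues a message and returns at once, leaving processing for later.

```python
import queue
import time

def target_system(payload):
    """Stand-in for the downstream application (e.g., an ERP endpoint)."""
    time.sleep(0.01)  # simulated processing latency
    return {"status": "created", "payload": payload}

# Synchronous: the caller waits for the result and sees errors immediately.
def send_synchronously(payload):
    return target_system(payload)  # blocks until the target responds

# Asynchronous: the caller only enqueues; a worker drains the queue later.
outbox = queue.Queue()

def send_asynchronously(payload):
    outbox.put(payload)  # returns immediately; no confirmation yet

def process_outbox():
    results = []
    while not outbox.empty():
        results.append(target_system(outbox.get()))
    return results

sync_result = send_synchronously({"order": 1})  # immediate confirmation
send_asynchronously({"order": 2})               # deferred
send_asynchronously({"order": 3})
batch_results = process_outbox()                # batch processing later
print(sync_result["status"], len(batch_results))
```

The trade-offs from the post are visible directly: the synchronous path's latency is paid by the caller, while the asynchronous path scales but needs separate monitoring of the queue.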


How to Change Posting Number Series in Business Central: Configuration and Customization Guide

In many businesses, especially those involved in international trade, it's common to handle both import and export transactions. To keep records clean and compliant, companies often want to assign separate posting number series for exports and imports in Microsoft Dynamics 365 Business Central. In this guide, we'll walk you through how to configure and automate posting number series selection based on the sales order type (Export or Import), helping your business maintain accurate and organized documentation.

Business Scenario

A customer requires that:

When a Sales Order is created for EXPORT, a specific Export Posting No. Series should be applied.
When a Sales Order is created for IMPORT, a different Import Posting No. Series should be used.

This allows for easy tracking, filtering, and compliance with customs or internal auditing processes.

Steps to Achieve the Goal

Step 1: Create two posting number series. Go to "Number Series" and create two new series: SO-EXP-2025 for Export and SO-IMP-2025 for Import. Set appropriate starting numbers, prefixes, and increment-by values. Then create another number series, S-ORD-R, for the Sales Order relationship, add the two series above to its relationships, and select the new S-ORD-R in Sales & Receivables Setup.

Step 2: Create a field. Add a field in the page extension and table extension of the No. Series Line.

Step 3: Add logic in the Sales Order page extension. In your Sales Order page extension, implement logic to check if the selected No. Series is tagged as "Export". If so, automatically assign the corresponding value from the "Posted No. Series Code" to the "Posting No. Series" field on the Sales Order. This ensures that when an Export-related number series is used, the correct posting series is set without manual intervention.

To conclude, setting different posting number series based on whether a Sales Order is for Export or Import is a simple yet powerful customization in Business Central. With a small extension or logic-based workflow, you can automate this process to enhance control and compliance.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
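The post does not include the AL source, so here is the Step 3 selection logic sketched in Python for illustration only. The tag field name and the mapping of tags to series codes are assumptions based on the series created in Step 1.

```python
# Map each number-series tag to its posting number series (from Step 1).
POSTING_SERIES = {
    "EXPORT": "SO-EXP-2025",
    "IMPORT": "SO-IMP-2025",
}

def assign_posting_series(sales_order):
    """Mimic the page-extension trigger: when the order's number series is
    tagged Export or Import, set the Posting No. Series automatically."""
    tag = sales_order.get("No. Series Type")
    if tag in POSTING_SERIES:
        sales_order["Posting No. Series"] = POSTING_SERIES[tag]
    return sales_order

export_order = assign_posting_series(
    {"No.": "S-ORD-0001", "No. Series Type": "EXPORT"}
)
print(export_order["Posting No. Series"])
```

In real AL, the equivalent check would live in an OnValidate/OnAfterGetRecord trigger of the page extension, reading the custom field added in Step 2.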


Work Smarter, Not Harder: Use Copilot to Summarize Records in Dynamics 365 BC

With the 2025 Wave 1 release, Dynamics 365 Business Central becomes even more user-friendly thanks to a smart new feature powered by Copilot: Summarize a Record. This feature allows users to instantly view a plain-language summary of key records, such as customers, vendors, items, and sales or purchase documents. Instead of clicking through multiple tabs or analyzing raw data, you now get a clear, AI-generated overview in seconds. The Summarize with Copilot action reviews all relevant data from a record and provides a quick summary in natural language.

Steps to use this feature:

Open a supported record (e.g., a customer or sales order).
Click on "Summarize with Copilot" on the right side.
Copilot instantly generates a readable summary based on the available data.

This works seamlessly across environments where Copilot is enabled and enhances the way you interact with Business Central data. To conclude, Summarize a Record with Copilot is a perfect example of working smarter, not harder. Whether you're preparing for a customer call, reviewing a vendor, or checking on an item, this feature gives you quick context without the clicks. It's one more step toward making Business Central faster, simpler, and more intelligent, just like modern business software should be. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


From Manual Orders to Global Growth: How E-Commerce + ERP Integration Transformed this company 

In today's global manufacturing landscape, businesses need more than just strong products to stay competitive. They need digital operations that connect customers, distributors, and internal teams in different regions. One powerful way to achieve this is by integrating e-commerce platforms with enterprise resource planning (ERP) systems. This is the story of a 140-year-old global leader in materials testing machine manufacturing that transformed its order-taking process through a Shopify–Dynamics 365 Finance & Operations integration.

The Challenge

With offices in five countries and sales across the UK, Europe, China, India, and multiple U.S. territories, this manufacturer had a truly global footprint. Yet order-taking remained manual and inefficient:

In short: their legacy setup couldn't keep up with modern customer expectations or their own ambitions for global growth.

The Solution

Over the course of a decade-long partnership, we helped the company modernize and digitize its business processes. The centrepiece was a seamless integration between Shopify and Dynamics 365 Finance & Operations (F&O), built natively within F&O (no recurring middleware costs).

Key integrations included:

This solution ensured that high data volumes and complex processing demands could be handled efficiently within F&O.

The Results

The change has reshaped how the company works:

Lessons for Other Global Manufacturers

This journey highlights critical lessons for manufacturers, distributors, and global businesses alike:

The Road Ahead

After integrating Shopify with Dynamics 365 F&O, the company has launched a dedicated distributor website where approved distributors can place orders directly on behalf of customers. This portal creates a new revenue stream, strengthens the distribution network, and ensures orders flow into F&O with the same automation, inventory sync, and reporting as direct sales. By extending digital integration to distributors, the company is simplifying order-taking while expanding its business model for global growth.

Ending Thoughts

The journey of this global manufacturer shows that true digital transformation isn't about adding more tools; it's about connecting the right ones. By integrating Shopify with Dynamics 365 F&O, they moved from fragmented, manual processes to a scalable, automated ecosystem that empowers customers, distributors, and internal teams alike. For any organization operating across regions, the lesson is clear: e-commerce and ERP should not live in silos. When they work together, they create a foundation that not only accelerates order-taking but also unlocks new revenue streams, sharper insights, and stronger global relationships. In a world where speed, accuracy, and customer experience define competitiveness, the question isn't whether you can afford to integrate; it's whether you can afford not to.

What's Next

Don't let manual processes slow you down. Connect with us at transform@cloudfronts.com and let's design an integration roadmap tailored for your business.


Smarter Data Integrations Across Regions with Dynamic Templates

At CloudFronts Technologies, we understand that growing organizations often operate across multiple geographies and business units. Whether you're working with Dynamics 365 CRM or Finance & Operations (F&O), syncing data between systems can quickly become complex, especially when different legal entities follow different formats, rules, or structures. To solve this, our team developed a powerful yet simple approach: Dynamic Templates for Multi-Entity Integration.

The Business Challenge

When a global business operates in multiple regions (like India, the US, or Europe), each location may have different formats for project codes, financial categories, customer naming, or compliance requirements. Traditional integrations hardcode these rules, making them expensive to maintain and difficult to scale as your business grows.

Our Solution: Dynamic Liquid Templates

We built a flexible, reusable template system that automatically adjusts to each legal entity's specific rules, without the need to rebuild integrations for each one. Here's how it works:

Why This Matters for Your Business

Real-World Success Story

One of our clients needed to integrate project data from CRM to F&O across three different regions. Instead of building three separate integrations, we implemented a single solution with dynamic templates. The result?

What Makes CloudFronts Different

At CloudFronts, we build future-ready integration frameworks. Our approach ensures you don't just solve today's problems but prepare your business for tomorrow's growth. We specialize in Microsoft Dynamics 365, Azure, and enterprise-grade automation solutions. "Smart integrations are the key to global growth. Let's build yours."

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
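The production system uses Liquid templates inside the integration; as a simplified sketch of the same idea (one integration, per-entity rules resolved at runtime), here is the pattern in Python. The entity codes, field names, and formatting rules below are hypothetical.

```python
# One rule set per legal entity, looked up at runtime instead of hardcoded.
ENTITY_RULES = {
    "IN": {"project_prefix": "PRJ-IN-", "currency": "INR"},
    "US": {"project_prefix": "PRJ-US-", "currency": "USD"},
    "EU": {"project_prefix": "PRJ-EU-", "currency": "EUR"},
}

def transform_project(record, entity):
    """Apply the entity-specific template to a CRM record before sending to F&O."""
    rules = ENTITY_RULES[entity]
    return {
        "ProjectId": rules["project_prefix"] + str(record["id"]),
        "Currency": rules["currency"],
        "Name": record["name"],
    }

# The same integration code serves every region; only the rule set differs.
print(transform_project({"id": 42, "name": "Rollout"}, "IN")["ProjectId"])
```

Adding a new legal entity then means adding one entry to the rule set (or one template), not rebuilding the integration.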


Struggling to Bulk Upload “Item Revaluation Entries”? Here’s What Could Be Going Wrong

One of our clients had made some small mistakes while providing the data for their opening balances, which caused the item costs to be wrong. After a lot of back-and-forth, we finally got a list of 100+ items with the correct costs. We thought it would be easy to fix using Edit in Excel, but then we ran into an error: "Quantity must have a value in Item Journal Line."

This is odd, because when we create the entries manually, we don't need to set the Quantity anywhere. In fact, the Quantity field isn't even editable; it is populated when the "Applies-to Entry" field is updated. We tried using a Configuration Package: same thing! We tried to create an Excel import that uploads data into the journal: same thing! So what's going on?

Details

After a bit of debugging we found this piece of code to be the problem –

And this ->

When you update the "Unit Cost (Revalued)" field in Business Central, it also updates the "Inventory Value (Revalued)" field automatically. This part is simple. But the system also tries to update "Unit Cost (Revalued)" again based on the value you just changed, almost like it's going in circles. To avoid this, the system checks which field is currently being updated. If it's not "Unit Cost (Revalued)", the update is allowed. When you make changes from the Business Central screen, the system knows which field you're changing, thanks to something called CurrFieldNo. But when you use Edit in Excel, Config Packages, or AL code, this information is missing. That confuses the system and can cause it to divide by zero, which leads to an error. Also, there's a rule that checks the quantity in the "Applies-to Entry" field; this check only happens if the "Value Entry Type" is not set to "Revaluation". This was raised back in 2018 on GitHub as a bug, but it was closed as intended system design. In the end, we had to bypass the validation and assign the values directly to the fields.

To conclude, what seemed like a simple task (updating revalued costs) turned into a deep dive into Business Central's internal logic. The issue stemmed from how the system handles field updates differently depending on the entry method. While the manual interface sets background values like CurrFieldNo to help Business Central track changes properly, external methods like Edit in Excel or Config Packages don't provide that context. If you need further assistance or have specific questions about your ERP setup, feel free to reach out for personalized guidance. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
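To make the circular-update guard concrete, here is a toy model of the behaviour described above. This is not the actual Business Central AL; the field names and the guard are paraphrased, but it shows why the UI path works (the equivalent of CurrFieldNo is set, so the back-calculation is skipped) while the Edit in Excel/API path divides by a quantity that was never populated.

```python
class RevaluationLine:
    """Toy model of a revaluation journal line."""
    def __init__(self, quantity=0):
        self.quantity = quantity
        self.unit_cost_revalued = 0.0
        self.inventory_value_revalued = 0.0
        self.curr_field = None  # analogue of AL's CurrFieldNo

    def set_unit_cost_revalued(self, value, from_ui):
        # The UI records which field the user is editing; external
        # channels (Edit in Excel, Config Packages, AL) leave it unset.
        self.curr_field = "unit_cost_revalued" if from_ui else None
        self.unit_cost_revalued = value
        self.inventory_value_revalued = value * self.quantity
        # Guard: skip the circular back-calculation only when this very
        # field is the one currently being edited.
        if self.curr_field != "unit_cost_revalued":
            # Divides by zero when Quantity was never populated.
            self.unit_cost_revalued = (
                self.inventory_value_revalued / self.quantity
            )

line = RevaluationLine(quantity=0)
line.set_unit_cost_revalued(10.0, from_ui=True)  # UI path: guard skips recalc
try:
    line.set_unit_cost_revalued(10.0, from_ui=False)  # Edit in Excel path
except ZeroDivisionError:
    print("error: missing quantity context")
```

The fix described in the post (bypassing validation and assigning the fields directly) corresponds to writing the values without triggering this recalculation at all.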


Enhancing Workflow Observability with OpenTelemetry in Azure Logic Apps

Struggling to monitor your Logic App workflows end-to-end? Azure Logic Apps are a powerful tool for automating business workflows across services. But as these workflows grow in size and complexity, so do the challenges in tracking, debugging, and optimizing them. The built-in monitoring options, while helpful, often don't provide full visibility. This leaves teams scrambling to understand failures, bottlenecks, or performance issues. Here's the good news: OpenTelemetry can change that. In this post, you'll learn how to gain complete observability into your Logic Apps workflows using OpenTelemetry, the industry-standard framework for telemetry data.

Why Observability Matters in Azure Logic Apps

Logic Apps connect multiple services: APIs, databases, emails, on-prem systems, and more. But as you stitch these workflows together, it becomes harder to:

While Azure provides diagnostics via Monitor and Application Insights, they often produce fragmented data. These tools lack native support for distributed tracing, which is essential when workflows span many components. That's where OpenTelemetry helps. With it, you can gather:

Together, these three "pillars of observability" give you actionable insights into your Logic App's behavior.

What is OpenTelemetry?

OpenTelemetry is an open-source standard for collecting and exporting telemetry data. It supports multiple platforms (Azure, AWS, GCP) and can export data to tools like Application Insights, Jaeger, or Prometheus. With OpenTelemetry, you can:

It ensures a consistent observability strategy across your cloud-native systems, including Logic Apps.

How to Integrate OpenTelemetry with Azure Logic Apps

Azure Logic Apps don't yet support OpenTelemetry out of the box. But with a smart setup, you can still plug them into an OpenTelemetry pipeline.

🛠️ Step-by-Step Guide:

Real Example: Order Processing with Observability

Imagine this:

Without OpenTelemetry:

With OpenTelemetry:

This means faster resolution, less guesswork, and a better customer experience.

✅ Use correlation IDs across services
✅ Add custom dimensions to enrich telemetry
✅ Configure sampling to control trace volume
✅ Monitor latency thresholds for each Logic App step
✅ Log business-critical metadata (e.g., Order ID, region)

Start Small, See Big Results

Observability is no longer optional. It's a must-have for teams building scalable, resilient workflows. Here's your action plan:
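The "use correlation IDs across services" practice from the checklist above can be illustrated with a standard-library-only Python sketch (the service step names and order ID are hypothetical): one ID is generated at the workflow entry point and threaded through every step, so log entries and traces from different components can be joined later.

```python
import logging
import uuid

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("workflow")

def validate(order, correlation_id):
    log.info("validate order=%s corr=%s", order["id"], correlation_id)

def charge(order, correlation_id):
    log.info("charge order=%s corr=%s", order["id"], correlation_id)

def notify(order, correlation_id):
    log.info("notify order=%s corr=%s", order["id"], correlation_id)

def handle_order(order):
    # Generate one correlation ID at the workflow entry point...
    correlation_id = str(uuid.uuid4())
    # ...and pass it through every downstream step, so telemetry from
    # all components can be correlated into a single trace.
    validate(order, correlation_id)
    charge(order, correlation_id)
    notify(order, correlation_id)
    return correlation_id

corr = handle_order({"id": "ORD-1001"})
```

In a Logic Apps setting, the same role is played by a tracking/correlation property carried in HTTP headers or message metadata between workflow runs and downstream services.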


The Hidden Power BI Feature That Will Transform Your Data Automation

Are you tired of manually writing complex DAX queries for your Power Automate flows? What if Power BI has been secretly recording every optimized query for you all along?

The Challenge Every Power BI Developer Faces

For growing businesses, as important as their dashboards and reports are, automating data workflows becomes equally crucial. As organizations scale, the need to extract Power BI insights programmatically increases, making efficient query extraction essential to maintaining operational flow and development productivity. If you're considering streamlining your Power BI to Power Automate integration process, this article is for you. I'm confident this article will guide you in mastering a Power BI technique that helps you achieve these impressive productivity gains.

Key Takeaways

What Exactly is Performance Analyzer?

Performance Analyzer is Power BI's built-in diagnostic tool that captures every single operation happening behind the scenes when you interact with your reports. Think of it as a detailed activity log that records not just what happened, but exactly how Power BI executed each query. Most developers use it for performance troubleshooting, but here's the secret: it's actually your gateway to extracting production-ready DAX queries for automation.

Step 1: Unleashing the Performance Analyzer

Accessing Your Hidden Toolkit

The Performance Analyzer isn't hidden in some obscure menu; it's right there in your Power BI Desktop ribbon, waiting to revolutionize your workflow. To activate Performance Analyzer:

Starting Your Query Capture Session

Think of this as putting Power BI under a microscope. Every interaction you make will be recorded and analyzed. The capture process:

Step 2: Extracting the Golden DAX Queries

Decoding the Performance Data

When you expand any visual event in the Performance Analyzer, you'll see several components. Here's where it gets exciting: click on "Copy query" next to the DAX Query section.

Real-World Example: Sales Dashboard Automation

Let's say you have a sales dashboard with a card showing total revenue. After recording and expanding the performance data, you might extract a DAX query like this:

This is pure gold: it's the exact query Power BI uses internally, optimized and ready for reuse! The DAX queries can be used in the following areas:

To conclude, I encourage you to take a close look at your current Power BI automation processes. Identify one manual reporting task that you perform weekly: perhaps a sales summary, performance dashboard update, or data quality check. Start with this simple action today: open one of your existing Power BI reports, activate Performance Analyzer, and extract just one DAX query. Then build a basic Power Automate flow using that query. This single step will demonstrate the power of this technique and will likely save you hours in your next automation project. Need practical guidance on implementing this in your organization? Feel free to connect at transform@cloudfronts.com for specific solutions that can help you develop more effective Power BI automation workflows. Taking action now will lead to significant time savings and more robust automated reporting for your business.
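Once copied, an extracted query can also be reused programmatically, for example via the Power BI REST API's executeQueries endpoint. The sketch below only builds the request URL and JSON body; the dataset ID placeholder, the Sales[Revenue] column, and the sample query are illustrative, and authentication is omitted.

```python
import json

def build_execute_queries_request(dataset_id, dax_query):
    """Build the URL and JSON body for a Power BI executeQueries REST call."""
    url = (
        "https://api.powerbi.com/v1.0/myorg/datasets/"
        f"{dataset_id}/executeQueries"
    )
    body = {"queries": [{"query": dax_query}]}
    return url, json.dumps(body)

# A query of the general shape Performance Analyzer emits (illustrative):
dax = 'EVALUATE ROW("Total_Revenue", SUM(Sales[Revenue]))'
url, body = build_execute_queries_request("<dataset-id>", dax)
print(url)
```

The same JSON body can be pasted into a Power Automate HTTP action, which is the usual way to wire an extracted query into a flow.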


From Clean Data to Insights: Integrating Azure Databricks with Power BI and MLflow

Cleaning data is only half the journey. The real value comes when that clean, reliable data powers dashboards for decision-makers and machine learning models for prediction. In this post, we'll explore two powerful integrations of Azure Databricks:

Why These Integrations Matter

For growing businesses:

Together, they create a bridge from cleaned data → insights → action.

Practical Example 1: Databricks + Power BI

👉 Result: Executives can open Power BI and instantly see up-to-date sales performance across geographies.

Practical Example 2: Databricks + MLflow

👉 Result: Your business can predict customer trends, forecast sales, or identify churn risk directly from cleaned Databricks data.

To conclude, with these integrations:

Together, they help organizations move from cleaned data → insights → intelligent action.

✅ Already cleaning data in Databricks? Try connecting your first Power BI dashboard today.
✅ Want to explore AI? Start logging experiments with MLflow to track and deploy models seamlessly.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.

