Latest Microsoft Dynamics 365 Blogs | CloudFronts - Page 6

Getting Started with the Event Recorder in Business Central

When developing customizations or extensions in Microsoft Dynamics 365 Business Central, working with events is a best practice. Events help ensure your code is upgrade-safe and cleanly decoupled from the standard application code. However, one common challenge developers face is figuring out which events are triggered during certain actions in the system. That’s where the Event Recorder comes in.

What Is the Event Recorder?

The Event Recorder is a built-in tool in Business Central that allows developers to monitor and log all published and subscribed events during a user session. Think of it as a “black box” recorder for event-driven development. It helps you identify which events are raised while you perform an action, which object publishes each event, and the AL snippet you need to subscribe to it. This tool is extremely helpful when you’re customizing functionality using event subscriptions (as per AL best practices) but aren’t sure which event to subscribe to.

Why Use the Event Recorder?

Traditionally, developers had to dig through AL code or documentation to find the right event to subscribe to. With the Event Recorder, this becomes faster and more efficient.

How to Use the Event Recorder

Here’s a step-by-step guide:

Step 1: Open the Event Recorder page.
Step 2: Start a new recording, then perform the action you want to analyze.
Step 3: Stop the recording.
Step 4: Review the results and copy the AL snippet for the event you need.

Sample Use Case

Suppose you’re trying to add custom logic every time a Sales Invoice is posted, but you’re not sure which event gets triggered at that point. Record a posting session with the Event Recorder and review the captured events. You can then write a subscriber in your AL code like this (this is the snippet format the recorder generates; a sketch targeting the posting routine itself follows at the end of this post):

[EventSubscriber(ObjectType::Page, Page::"Customer List", 'OnAfterGetRecordEvent', '', true, true)]
local procedure MyProcedure()
begin
    // Your custom logic here
end;

Limitations

While powerful, the Event Recorder does have some limitations to keep in mind, such as capturing only the events raised during your own recording session.

To conclude, the Event Recorder is an indispensable tool for any AL developer working in Business Central. It simplifies the discovery of relevant events, helps maintain clean and upgrade-safe extensions, and boosts overall development efficiency. Whether you’re new to AL or a seasoned developer, incorporating the Event Recorder into your workflow will save you time. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
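For the sales-posting scenario specifically, the recorder typically surfaces events from the posting codeunit rather than from a page. Below is a minimal sketch assuming the OnAfterPostSalesDoc integration event on the "Sales-Post" codeunit; verify the event name against what the recorder captures in your environment (AL subscribers may declare only the publisher parameters they actually need, as done here):

codeunit 50100 "Sales Posting Subscriber"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnAfterPostSalesDoc', '', false, false)]
    local procedure HandleSalesInvoicePosted(var SalesHeader: Record "Sales Header")
    begin
        // Runs after the sales document is posted.
        // Add your custom logic here, e.g. write to a log table or notify an external system.
    end;
}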


How to Build a Scorecard in Power BI

What Is a Scorecard in Power BI?

A Scorecard is a visual performance monitoring tool that allows you to track key metrics (goals) against predefined targets. Power BI’s Metrics feature (formerly Goals) is what makes this possible.

Why Use Scorecards?

Here’s why Scorecards are powerful for any team:

- Goal Alignment: Track KPIs aligned to strategic objectives.
- Accountability: Assign owners and collaborators for each goal.
- Real-time Tracking: Monitor progress with live metrics.
- Visual Reporting: Easy-to-read dashboards and history tracking.

Step-by-Step: How to Build a Scorecard in Power BI

Step 1: Navigate to Power BI Service. Go to Power BI Service and choose the workspace where you want to create your Scorecard (Premium or Pro workspaces only).

Step 2: Create a New Scorecard. You’ll now land on a blank Scorecard canvas.

Step 3: Add Metrics to the Scorecard. You can connect each metric to an existing Power BI dataset or manually input values.

Step 4: Link Metrics to Data (Optional but Recommended). Connecting metrics to report data automates tracking and ensures your Scorecard updates automatically with data refreshes; a hedged example of the kind of measure a connected metric tracks follows at the end of this post.

Step 5: Customize the Scorecard. You can also create hierarchies — group related goals under broader objectives.

Step 6: Share & Collaborate once your Scorecard is built.

To conclude, Power BI Scorecards turn your data into action. They help track goals in real time, assign ownership, and keep teams focused on what matters most. Whether you’re managing a sales team, a project, or company-wide objectives, Power BI Scorecards are a game-changer for performance tracking. Want to bring visibility and accountability to your team goals? Head to Power BI Service and start building your first Scorecard today! Need help connecting metrics to your datasets? Reach out, and we’ll guide you step by step. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
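When a metric is connected to a report, its current value tracks a measure in the underlying dataset. A minimal DAX sketch, with table, column, and target values that are purely illustrative assumptions:

-- Current value for a hypothetical "Total Sales" goal
Total Sales = SUM ( Sales[Amount] )

-- A static target the metric's progress can be compared against
Sales Target = 1000000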


Mastering Multithreaded Batch Jobs in Dynamics 365 Finance & Operations

In the world of finance and operations, efficiency and accuracy are critical. Batch jobs play a vital role in automating repetitive tasks, processing large volumes of data, and ensuring seamless business operations. For organizations using Microsoft Dynamics 365 Finance and Operations (D365 F&O), the X++ programming language provides a powerful way to design, schedule, and execute batch jobs effectively. This blog explores how batch jobs function in D365 F&O, their importance in financial and operational workflows, and best practices for implementing them using X++.

What Are Batch Jobs in D365 F&O?

Batch jobs in Dynamics 365 Finance and Operations are automated processes that run in the background without user intervention. They are ideal for long-running, repetitive, or scheduled work, and they help reduce manual effort, minimize errors, and improve efficiency.

Example: A Simple X++ Batch Job

class MyBatchJobTask extends RunBaseBatch
{
    // Define variables
    str description;

    // Main execution logic
    public void run()
    {
        info("Batch job started: " + description);

        // Business logic here (e.g., update records, process transactions)
        ttsbegin;

        // Example: Update ledger entries
        LedgerJournalTable journalTable;
        journalTable.Description = description;
        journalTable.insert();

        ttscommit;

        info("Batch job completed successfully.");
    }

    // Constructor
    public static MyBatchJobTask construct()
    {
        return new MyBatchJobTask();
    }

    // Main method to run the job
    public static void main(Args _args)
    {
        MyBatchJobTask batchJob = MyBatchJobTask::construct();
        batchJob.description = "End-of-Day Reconciliation";

        // Run the batch job
        if (batchJob.prompt())
        {
            batchJob.run();
        }
    }
}

Best Practices for Batch Jobs in X++

1. Split Work into Multiple Batch Tasks – Example:

BatchHeader batchHeader = BatchHeader::construct();
batchHeader.addTask(this);
batchHeader.addRuntimeTask(MyOtherBatchTask::construct(), 1);
batchHeader.save();

2. Implement Error Handling & Logging (a sketch follows at the end of this post)
3. Optimize Performance
4. Schedule Jobs Efficiently
5. Test in a Non-Production Environment

Real-World Use Cases

1. Automated Invoice Posting
2. Inventory Revaluation
3. Bank Reconciliation Matching

To conclude, batch jobs in Dynamics 365 Finance and Operations (using X++) remain a cornerstone of financial and operational automation. By following best practices—such as optimizing performance, implementing error handling, and leveraging batch groups—organizations can maximize efficiency while reducing manual effort. As Dynamics 365 Finance & Operations continues to evolve, integrating AI and cloud-based batch processing will further enhance speed and reliability. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
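For best practice 2, a common pattern is to catch errors inside the task so that one failing unit of work does not abort its sibling tasks. A minimal sketch, illustrative rather than the post’s original code:

public void run()
{
    try
    {
        ttsbegin;

        // ... process one unit of work here ...

        ttscommit;
        info("Unit of work completed.");
    }
    catch (Exception::Error)
    {
        // Swallow the error so other batch tasks keep running;
        // the warning still appears in the batch job history log.
        warning(strFmt("Task %1 failed and was skipped.", this.caption()));
    }
}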


Time & Expense Management in Dynamics 365 Project Operations

In a project-driven organization, time and expense tracking is not just about administrative accuracy—it’s essential for billing, cost control, compliance, and project profitability. Dynamics 365 Project Operations (D365 PO) offers a seamless and integrated module to manage employee time entries, expense submissions, and approval workflows with real-time visibility into project performance. This article explains the complete lifecycle of time and expense management in D365 PO, from entry to approval, validation, and integration with billing and costing.

1. Time Tracking in D365 PO

D365 PO allows team members to enter time against project tasks directly. Time can be entered daily or weekly, based on organizational preference, and entries are validated and tied back to the project plan.

2. Approval Process

Time entries follow a configurable approval workflow:

- Project Manager: Reviews accuracy and relevance of effort.
- Resource Manager: Optional; verifies allocation validity.
- Finance Team: Optional; validates for the billing cycle.

Approval settings can be defined per project, customer, or legal entity. Approved entries flow into downstream billing and costing.

3. Expense Management

D365 PO supports tracking billable and non-billable expenses incurred during project delivery.

Expense Policies: Administrators can define Expense Policies to control spending:

- Limits: Max per diem, lodging cap, airfare budget.
- Category Rules: Travel allowed only if the project is longer than X days.
- Receipt Requirements: Mandatory for amounts above X.
- Currency Controls: Only specified currencies allowed.

Violations can trigger warnings, hard stops, or workflow escalations.

Integration & Automation

After approval, time and expense entries feed billing and costing, and they can also be captured on the go. Entries can be created programmatically as well; see the sketch at the end of this post.

Reporting & Compliance

Auditors and finance teams can rely on historical logs, comments, and attachments for audit trails and regulatory compliance.

To conclude, effective Time and Expense Management in Dynamics 365 Project Operations enables accurate billing, real-time cost tracking, and employee accountability. With intuitive entry interfaces, approval workflows, and policy enforcement, D365 PO ensures both operational efficiency and financial compliance. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
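Although most users enter time through the UI, time entries can also be created through the Dataverse Web API, which is useful for migrations or integrations. A hedged sketch: the msdyn_timeentry entity and its field names follow the standard Project Operations schema as I understand it, and the org URL, token, and project GUID are placeholders you must replace:

// Create a time entry via the Dataverse Web API (hypothetical org URL and IDs)
async function createTimeEntry(accessToken) {
    const response = await fetch(
        "https://yourorg.crm.dynamics.com/api/data/v9.2/msdyn_timeentries",
        {
            method: "POST",
            headers: {
                "Content-Type": "application/json",
                "Authorization": "Bearer " + accessToken  // acquired beforehand via OAuth/MSAL
            },
            body: JSON.stringify({
                "msdyn_date": "2025-07-01",
                "msdyn_duration": 480,  // duration in minutes, i.e. an 8-hour day
                "msdyn_project@odata.bind": "/msdyn_projects(00000000-0000-0000-0000-000000000000)"
            })
        }
    );
    if (!response.ok) {
        throw new Error("Time entry creation failed: " + response.status);
    }
}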


When to Use Azure Data Factory vs Logic Apps in Dynamics 365 Integrations

You’re integrating Dynamics 365 CRM with other systems—but you’re confused: should I use Azure Data Factory or Logic Apps? Both support connectors, data transformation, and scheduling—but they serve different purposes. When you’re working on integrating Dynamics 365 with other systems, two Azure tools often come up: Azure Logic Apps and Azure Data Factory (ADF). I’ve been asked many times — “Which one should I use?” — and honestly, there’s no one-size-fits-all answer. Based on real-world experience integrating D365 CRM and Finance, here’s how I approach choosing between Logic Apps and ADF.

When to Use Logic Apps

Azure Logic Apps is ideal when your integration involves:

1. Event-Driven / Real-Time Integration
2. REST APIs and Lightweight Automation
3. Business Process Workflows
4. Quick and Visual Flow Creation

When to Use Azure Data Factory

Azure Data Factory is better for:

1. Large Volume, Batch Data Movement
2. ETL / ELT Scenarios
3. Integration with Data Lakes and Warehouses
4. Advanced Data Flow Transformation

Feature Comparison Table

Feature | Logic Apps | Data Factory
Trigger on record creation/update | Yes | No (batch only)
Handles APIs (HTTP, REST, OData) | Excellent | Limited
Real-time integration | Yes | No
Large data volumes (batch) | Limited | Excellent
Data lake / warehouse integration | Basic (via connectors) | Deep support
Visual workflow | Visual designer | Visual (for Data Flows)
Custom code / transformation | Limited (use Azure Functions) | Strong via Data Flows
Cost for high volume | Higher (per run) | Cost-efficient for batch

Real-World Scenarios

1. Use Logic Apps when the integration is real-time, event-driven, and API-based.
2. Use ADF when the integration is batch-oriented, high-volume, and data-heavy.

To conclude, choose Logic Apps for real-time, low-volume, API-based workflows. Use Data Factory for batch ETL pipelines, high-volume exports, and reporting pipelines. Integrations in Dynamics 365 CRM aren’t one-size-fits-all—pick the right tool based on the data size, speed, and transformation needs. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Create No Code Powerful AI Agents – Azure AI Foundry

An AI agent is a smart program that can think, make decisions, and do tasks. Sometimes it works alone, and sometimes it works with people or other agents. The main difference between an agent and a regular assistant is that agents can do things on their own. They don’t just help—you can give them a goal, and they’ll try to reach it. Every AI agent has three main parts: a model that does the reasoning, instructions that define its behavior, and the knowledge or tools it can use to act.

Agents can take input like a message or a prompt and respond with answers or actions. For example, they might look something up or start a process based on what you asked. Azure AI Foundry is a platform that brings all these things together, so you can build, train, and manage AI agents easily.

References

What is Azure AI Foundry Agent Service? – Azure AI Foundry | Microsoft Learn
Understanding deployment types in Azure AI Foundry Models – Azure AI Foundry | Microsoft Learn
https://learn.microsoft.com/en-us/azure/ai-foundry/how-to/index-add

Usage

Firstly, we create a project in Azure AI Foundry. Click on Next and give a name to your project, then wait till the setup finishes. Once the project creation finishes, we land on the project home screen. Click on the Agents tab, then click on Next to choose the model. I’m currently using GPT-4o Mini; the picker also includes descriptions for all the available models.

Then we configure the deployment details. There are multiple deployment types available, such as Global, Data Zone, and Standard deployments.

Standard deployments [Standard] follow a pay-per-use model, perfect for getting started quickly. They’re best for low to medium usage with occasional traffic spikes. However, for high and steady loads, performance may vary.

Provisioned deployments [ProvisionedManaged] let you pre-allocate the amount of processing power you need. This is measured using Provisioned Throughput Units (PTUs). Each model and version requires a different number of PTUs and offers different performance levels. Provisioned deployments ensure predictable and stable performance for large or mission-critical workloads.

This is how the deployment details look for the Global Standard type. I’ll be choosing the Standard deployment for our use case. Click on Deploy and wait for a few seconds.

Once the deployment is completed, you can give your agent a name and some instructions for its behavior. You should specify the tone, end goal, verbosity, and so on. You can also specify the Temperature and Top P values, which are both controls on the randomness or creativity of the model.

Temperature controls how bold or cautious the model is. Lower temperature = safer, more predictable answers (factual Q&A, code summarization). Higher temperature = more creative or surprising answers (poetry, creative writing).

Top P (nucleus sampling) controls how wide the model’s word choices are. Lower Top P = only picks from the most likely words (legal or financial writing). Higher Top P = includes less likely, more diverse words (brainstorming names).

Next, I’ll add a knowledge base to my bot. For this example, I’ll just upload a single file. However, you have the option to add a SharePoint folder or files, or connect it to Bing Search, MS Fabric, Azure AI Search, etc., as required.

A vector store in Azure AI Foundry helps your AI agent retrieve relevant information based on meaning rather than just keywords. It works by breaking your content (like a PDF) into smaller parts, converting them into numerical representations (embeddings), and storing them. When a user asks a question, the AI finds the most semantically similar parts from the vector store and uses them to generate accurate, context-aware responses.

Once you select the file, click on Upload and save. At this point, you can start to interact with your model. To “play around” with your model, click on the “Try in Playground” button. And here, we can see the output based on our provided knowledge base. One more example, just because it is kind of fun.

Every input that you provide to the agent is called a “message”. Every time the agent is invoked to process the provided input, it is called a “run”. Every interaction session with the agent is called a “thread”. We can see all the open threads in the Threads section.

To conclude, Azure AI Foundry makes it easy to build and use AI agents without writing any code. You can choose models, set how they behave, and connect your data, all through a simple interface. Whether you’re testing ideas, automating tasks, or building custom bots, Foundry gives you the tools to do it. If you’re curious about AI or want to try building your agent, Foundry is a great place to begin. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Struggling with Siloed Systems? Here’s How CloudFronts Gets You Connected

In today’s world, we use many different applications for our daily work. One single application can’t handle everything because some apps are designed for specific tasks. That’s why organizations use multiple applications, which often leads to data being stored separately or in isolation. In this blog, we’ll take you on a journey from siloed systems to connected systems through a customer success story.

About BÜCHI

Büchi Labortechnik AG is a Swiss company renowned for providing laboratory and industrial solutions for R&D, quality control, and production. Founded in 1939, Büchi specializes in a range of laboratory technologies, and its equipment is widely used in pharmaceuticals, chemicals, food & beverage, and academia for sample preparation, formulation, and analysis. Büchi is known for its precision, innovation, and strong customer support worldwide.

Systems Used by BÜCHI

To streamline operations and ensure seamless collaboration, BÜCHI leverages a variety of enterprise systems: Infor and SAP Business One are utilized for managing critical business functions such as finance, supply chain, manufacturing, and inventory.

Reporting Challenges Due to Siloed Systems

Organizations often rely on multiple disconnected systems across departments — such as ERP, CRM, marketing platforms, spreadsheets, and legacy tools. These siloed systems result in fragmented data, duplicated effort, and inconsistent reporting.

The Need for a Single Source of Truth

To solve these challenges, it’s critical to establish a Single Source of Truth (SSOT) — a central, trusted data platform where all key business data is stored consistently, accurately, and kept up to date.

How We Helped Büchi Connect Their Systems

To build a seamless and scalable integration framework, we leveraged the following Azure services:

- Azure Logic Apps – Enabled no-code/low-code automation for integrating applications quickly and efficiently.
- Azure Functions – Provided serverless computing for lightweight data transformations and custom logic execution.
- Azure Service Bus – Ensured reliable, asynchronous communication between systems with FIFO message processing and decoupling of sender/receiver availability.
- Azure API Management (APIM) – Secured and simplified access to backend services by exposing only required APIs, enforcing policies like authentication and rate limiting, and unifying multiple APIs under a single endpoint.

BÜCHI’s case study was published on the Microsoft website, highlighting how CloudFronts helped connect their systems and prepare their data for insights and AI-driven solutions.

Why a Single Source of Truth (SSOT) Is Important

A Single Source of Truth means having one trusted location where your business stores consistent, accurate, and up-to-date data.

How We Did This

We used Azure Function Apps, Service Bus, and Logic Apps to seamlessly connect the systems. Databricks was implemented to build a Unity Catalog, establishing a Single Source of Truth (SSOT). On top of this unified data layer, we enabled advanced analytics and reporting using Power BI.

In May, we hosted an event with BÜCHI at the Microsoft Office in Zurich. During the session, one of the attending customers remarked, “We are five years behind BÜCHI.” Another added, “If we don’t start now, we’ll be out of the race in the future.” This clearly reflects the urgent need for businesses to evolve. Today, Connected Systems, a Single Source of Truth (SSOT), Advanced Analytics, and AI are not optional — they are essential for sustainable growth and improved human efficiency. The pace of transformation has accelerated: tasks that once took months can now be achieved in days — and soon, perhaps, with just a prompt.

To conclude, if you’re operating with multiple disconnected systems and relying heavily on manual processes, it’s time to rethink your approach. System integration and automation free your teams from repetitive work and empower them to focus on high-impact, strategic activities. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


CTEs vs Subqueries in SQL: What’s the Difference and When to Use Them?

Posted on July 15, 2025 by Rahul Bansode

What happens when a SQL query becomes too long or hard to follow?

- It gets confusing
- It becomes difficult to debug
- It is hard to maintain or extend

Use Subqueries or Common Table Expressions (CTEs) to break down the logic and improve readability.

What is a Subquery?

A subquery is a query inside another query. A typical example is a query that shows customers whose remaining amount is above the average (see the sketch at the end of this post).

What is a CTE (Common Table Expression)?

A CTE is a temporary result set you can reference in a main query. It starts with the WITH keyword and improves readability, especially with multi-step logic. The same customer query rewritten as a CTE is also shown below.

To Conclude:

Subqueries
- Advantages: Quick and easy for simple filtering; good for one-off checks.
- Disadvantages: Harder to read when nested; redundant if used multiple times (no reuse).

CTEs (Common Table Expressions)
- Advantages: Clean, readable SQL for complex queries; can be recursive.
- Disadvantages: May be slightly slower in some databases; not supported in old SQL engines.

Both subqueries and CTEs help you write better SQL, but choosing the right one depends on your needs. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
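Here are the two examples referenced above, written both ways. The Customers table and its RemainingAmount column are assumed names for illustration:

-- Subquery version: customers whose remaining amount is above the average
SELECT CustomerName, RemainingAmount
FROM Customers
WHERE RemainingAmount > (SELECT AVG(RemainingAmount) FROM Customers);

-- CTE version: the intermediate step gets a readable name
WITH AvgBalance AS (
    SELECT AVG(RemainingAmount) AS AvgRemaining
    FROM Customers
)
SELECT c.CustomerName, c.RemainingAmount
FROM Customers AS c
CROSS JOIN AvgBalance AS a
WHERE c.RemainingAmount > a.AvgRemaining;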


How to Implement Incremental Refresh in Power BI

Refreshing large datasets in Power BI can become time-consuming and resource-intensive as data volume grows. If your reports are based on millions of rows of historical data, refreshing everything daily is neither efficient nor necessary. This is where Incremental Refresh comes in. It allows Power BI to refresh only new or changed data, drastically improving performance and reducing load on your data source. In this blog, you’ll learn how to set up incremental refresh step by step, so your Power BI reports stay fast and efficient even with big data.

What Is Incremental Refresh in Power BI?

Incremental Refresh enables Power BI to load data in partitions, refreshing only the latest ones (e.g., the past 7 days) while keeping the older data static.

Step 1: Define Parameters in Power Query

- Open your report in Power BI Desktop (Pro or Premium workspace).
- Go to Transform Data (Power Query Editor).
- Create two date/time parameters named RangeStart and RangeEnd (these exact names are required).
- Set default values (e.g., RangeStart = 01/01/2020, RangeEnd = 01/01/2021).

Step 2: Filter Your Data with These Parameters

This tells Power BI what time range to load and eventually refresh incrementally; a sketch of the filter step follows at the end of this post.

Step 3: Enable Incremental Refresh in the Data Model

📝 Example: a configuration that refreshes only the most recent week of data each time, while keeping the rest intact.

Step 4: Publish to Power BI Service

✅ Done! You’ve now implemented incremental refresh.

To conclude, Incremental Refresh is a game-changer when it comes to handling large datasets in Power BI. It not only saves refresh time but also optimizes resource usage. By learning how to configure it properly, you can scale your reports with confidence and efficiency. Got a large dataset slowing down your Power BI refresh? Implement Incremental Refresh today and see the difference. Explore more Power BI performance tips in our blog series—or reach out for help setting up enterprise-grade models. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
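For Step 2, the standard pattern is to filter your table’s date column between the two parameters in Power Query, keeping one boundary exclusive so no row is loaded into two partitions. A minimal M sketch, where the server, database, Sales table, and OrderDate column are assumed names:

let
    // Hypothetical source; replace with your own connection
    Source = Sql.Database("myserver", "mydb"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // RangeStart inclusive, RangeEnd exclusive, so partition boundaries don't overlap
    Filtered = Table.SelectRows(Sales, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    Filtered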


Seamlessly Generating and Downloading SSRS Reports in MFA-Enabled Power Pages Environments

Generating SSRS (SQL Server Reporting Services) reports from within Power Pages becomes more complex in environments secured by Multi-Factor Authentication (MFA). Traditional approaches that work in basic environments tend to fail silently or inconsistently when MFA, session tokens, or cookie-based auth are involved. In this blog, I’ll share a real-world solution where a Project-based SSRS report was generated securely, sent via email, and optionally downloaded — all within the constraints of a Power Pages + Power Automate architecture in a Dynamics 365 MFA-protected environment.

Problem Statement

Standard HTTP-based retrieval of SSRS reports using the Reserved.ReportViewerWebControl.axd endpoint fails in MFA-protected environments due to missing browser session cookies. This often results in 302 redirects or HTML-based error messages that cannot be processed by Power Automate.

Initial Approach and Issue

The Project ID is captured from Power Pages and passed to a Power Automate flow using an HTTP trigger, which is initiated when a user clicks a button on the portal—triggered via embedded JavaScript (a sketch of such a handler follows at the end of this post). The flow was constructed to:

1. Build the SSRS report URL dynamically (Compose -> PDF Download Start – Index -> Compose -> PDF Download String Length -> Compose -> PDF Download URL). We replaced the PrintOnOpen=true parameter with PrintOnOpen=false in the report export URL to prevent the print dialog from automatically appearing when the PDF is opened.
2. Perform an HTTP GET request to download the report.

This failed consistently on the first try because the report session page was not fully ready or authenticated, especially in an MFA environment.

Working Solution: Retry with Delay in Power Automate

To overcome the session-based delay, we implemented a retry pattern inside Power Automate. The flow fails the first time (as expected) but succeeds on the second or third retry, once the session becomes valid and the SSRS report is available.

Power Automate Configuration Highlights:

- Added a Scope block after the first HTTP request and set its Configure run after to Skipped and Failed.
- If needed, you can add a third delay in case the second attempt fails.

To conclude, sometimes achieving reliability in secure environments isn’t about complex code—it’s about using the right orchestration patterns. By strategically delaying and retrying the HTTP request to SSRS within Power Automate, we achieved consistent, secure report generation that works even under MFA constraints. Need help implementing this retry-based flow in your environment? Reach out to CloudFronts—we help businesses implement scalable, reliable solutions every day. You can contact us directly at transform@cloudfronts.com.
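For reference, the portal-side button handler that kicks off the flow can be as small as the sketch below. The trigger URL and payload shape are assumptions; use the POST URL generated by your own flow’s “When an HTTP request is received” trigger:

// Hypothetical Power Pages handler: send the Project ID to the flow's HTTP trigger
async function requestProjectReport(projectId) {
    // Replace with the URL generated by the flow's HTTP trigger
    const flowUrl = "https://prod-00.westeurope.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke?...";

    const response = await fetch(flowUrl, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ projectId: projectId })
    });

    if (!response.ok) {
        alert("Report generation failed. Please try again in a few minutes.");
        return;
    }
    alert("Your report is being generated and will be emailed to you shortly.");
}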

