The Future of Financial Reporting: How SSRS in Dynamics 365 is Transforming Finance Teams
In Microsoft Dynamics 365 Finance and Operations (D365 F&O), reporting is a critical part of delivering insights, decision-making data, and compliance information. While standard reports are available out of the box, many organizations require customized reporting tailored to specific business needs. This is where X++ and SSRS (SQL Server Reporting Services) come into play. In this blog, we'll explore how reporting works in D365 F&O, the role of X++, and how developers can create powerful, customized reports using standard tools.

Overview: Reporting in D365 F&O

Dynamics 365 F&O offers multiple reporting options. Among these, SSRS reports backed by an X++ Report Data Provider (RDP) are the most common choice for developers who need to generate transaction-based, formatted reports, such as invoices, purchase orders, and audit summaries.

Key Components of an SSRS Report Using X++

To create a custom SSRS report using X++ in D365 F&O, you typically work through the following components: a temporary table, a data contract class, a report data provider (RDP) class, the report design, and the menu items that expose the report to users.

Step-by-Step: Building a Report with X++

1. Create a Temporary Table

Create a temporary table that stores the data used for the report. Use InMemory or TempDB depending on your performance and persistence requirements.

```xpp
TmpCustReport tmpCustReport; // Example TempDB table
```

2. Build a Contract Class

This class defines the parameters users will input when running the report.

```xpp
[DataContractAttribute]
class CustReportContract
{
    private CustAccount custAccount;

    [DataMemberAttribute("CustomerAccount")]
    public CustAccount parmCustAccount(CustAccount _custAccount = custAccount)
    {
        custAccount = _custAccount;
        return custAccount;
    }
}
```

3. Write a Report Data Provider (RDP) Class

This is where you write the business logic and data extraction in X++. The class extends SRSReportDataProviderBase.

```xpp
[SRSReportParameterAttribute(classStr(CustReportContract))]
class CustReportDP extends SRSReportDataProviderBase
{
    TmpCustReport tmpCustReport;

    public void processReport()
    {
        CustReportContract contract = this.parmDataContract() as CustReportContract;
        CustAccount custAccount = contract.parmCustAccount();
        CustTable custTable;

        while select custTable
            where custTable.AccountNum == custAccount
        {
            tmpCustReport.AccountNum = custTable.AccountNum;
            tmpCustReport.Name = custTable.name();
            tmpCustReport.insert();
        }
    }

    [SRSReportDataSetAttribute(tableStr(TmpCustReport))]
    public TmpCustReport getTmpCustReport()
    {
        select tmpCustReport;
        return tmpCustReport;
    }
}
```

Note that the class declares its own CustTable buffer and reads the customer name through custTable.name(), since the name is stored on the related party record rather than directly on CustTable.

4. Design the Report in Visual Studio

Add an SSRS report to your Visual Studio project, bind its dataset to the RDP class, and build the layout (auto design or precision design) as required.

5. Create Menu Items and Add to Navigation

To allow users to access the report, create an output menu item that points to the report (or to a controller class, sketched at the end of this post) and add it to the appropriate menu.

Security Considerations

Always create a new privilege and assign it to a duty and role if this report needs to be secured. This ensures proper compliance with security best practices.

To conclude, creating reports using X++ in Dynamics 365 Finance and Operations is a powerful way to deliver customized business documents and analytical reports. With the structured approach of Contract → RDP → SSRS, developers can build maintainable and scalable reporting solutions. Whether you're generating a sales summary, customer ledger, or compliance documentation, understanding how to use X++ for reporting gives you full control over data and design.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
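To round out step 5, the output menu item usually points at a controller class that launches the report. Here is a minimal sketch, assuming the report was deployed as CustReport with a design named Report; both names are placeholders for illustration.

```xpp
// Minimal controller sketch; report and design names are assumed placeholders.
class CustReportController extends SrsReportRunController
{
    public static void main(Args _args)
    {
        CustReportController controller = new CustReportController();

        // Bind the controller to the deployed report design.
        controller.parmReportName(ssrsReportStr(CustReport, Report));
        controller.parmArgs(_args);

        // Shows the parameter dialog, then runs the report.
        controller.startOperation();
    }
}
```

Pointing the output menu item at this class (rather than directly at the report) gives you a place to default parameters or pre-process arguments before the report runs.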
Automatically Update Lookup Fields in Dynamics 365 Using Power Automate: From Custom Tables to Standard Entities
Imagine this: you update a product's purchase date in a registration record and, boom, a related case automatically gets refreshed with the accurate "Purchased From" lookup. It saves time, reduces errors, and keeps everything in sync without you lifting a finger. Let's walk through how to make that happen using Power Automate.

The goal: when a Product Registration's cri_purchasedat field is changed, the system retrieves the related "Purchased From" record and updates any linked Case(s) with the appropriate lookup reference. Let's break down the step-by-step process of how this is done in Power Automate.

Step 1: Trigger the Flow When the Purchase Date Changes

Flow trigger: When a row is added, modified, or deleted (Dataverse). This setup ensures that our flow only fires when that specific date field is modified.

Step 2: Pull in the "Purchased From" Record

Next, use List rows on the "Purchased From" table with a FetchXML query. We're searching for a record whose name matches the updated cri_purchasedat. Set Row Count to 1, since we expect only one match.

Step 3: Identify Any Linked Case Records

Add another List rows action, this time on the Cases table (internally named incident in Dataverse), and use a FetchXML query to match records where cri_productregistrationid equals the ID of the product registration being modified. This step is critical because it gives us the list of Case records we need to update, based on their link to the modified product registration.

```xml
<fetch>
  <entity name="incident">
    <attribute name="incidentid" />
    <attribute name="title" />
    <attribute name="cf_actualpurchasedfrom" />
    <filter>
      <condition attribute="cri_productregistrationid" operator="eq"
                 value="@{triggerOutputs()?['body/cri_productregistrationid']}" />
    </filter>
  </entity>
</fetch>
```

Step 4: Validate the Match Before Updating

Before updating anything, we add a Condition control to ensure that the previously fetched Purchased From record exists and is unique. Why? Because if there's no match (or multiple matches), we don't want to update the Cases blindly. We check whether the length of the returned rows equals 1 (a sample expression follows at the end of this post). If true, we move forward with the updates; if false, we stop or handle the exception accordingly. This kind of validation builds guardrails into your automation, making it more robust and preventing incorrect data from being applied across multiple records.

Step 5: Update the Linked Cases

After confirming a valid match, the flow loops through each related Case and updates the "Actual Purchased From" field with the correct value from the matched record, ensuring accurate linkage based on the latest update.

Once this step runs, your automation is complete, with Cases now intelligently updated in real time based on Product Registration changes.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
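For reference, the uniqueness check in Step 4 can be written as flow expressions. The action name List_Purchased_From and the ID column cri_purchasedfromid are assumptions for this flow; substitute the names from your own environment.

```
Condition expression:  length(outputs('List_Purchased_From')?['body/value'])  is equal to  1
Matched record's ID:   first(outputs('List_Purchased_From')?['body/value'])?['cri_purchasedfromid']
```

In the Update a row action on each Case, that second expression is what you supply to the "Actual Purchased From" lookup column.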
Merging Unmanaged Solutions in Power Platform with XrmToolBox
Let's say you are developing a model-driven app or doing some custom app development in CRM, and multiple teams have created several different solutions containing customizations. It is best to consolidate all the customizations into a single solution before moving them to UAT or Production. In this blog, I will show you how to move components from multiple solutions into a single main solution using the Solution Component Mover tool in XrmToolBox. So let's begin.

Step 1: Download XrmToolBox from this link – https://www.xrmtoolbox.com/

Step 2: Make a connection to your Dynamics 365 environment inside XrmToolBox by clicking on Create a new connection.

Step 3: Click on Microsoft Login Control.

Step 4: Click on Open Microsoft Login Control.

Step 5: Select "Display list of available organizations" and "Show advanced", enter your username and password, and after successful authentication, name your connection.

Step 6: In the Tool Library, search for "Solution Component Mover" and hit Install.

Step 7: Once the tool is installed, it will appear in your tool list; click on it.

Step 8: Once you are in the Solution Component Mover tool, click on Load Solutions. You will get a list of all managed and unmanaged solutions. Select the solutions you want to merge in the Source Solutions section and select the target solution into which you want to move the components (selected solutions are highlighted in light grey). Once you have selected the source and target solutions, hit Copy Components, and we are done.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Building the AI Bridge: How CloudFronts Helps You Connect Systems That Talk to Each Other
When we say we are building a bridge, does that imply something isn't connected? It does: it's AI itself and your systems that are not connected. What this means is that although your AI can access your systems to derive information, it remains unreliable and slow.

What is needed for AI to be successful? There are several pitfalls to avoid, and to eliminate them we must have a "catalog" layer that houses all business data together so that a common vocabulary is established between systems. AI then pulls from this data catalog to perform agentic actions. The diagram below explains, at a high level, how this looks. And all of this is defined by how well the integrations between these systems are established.

How Can CloudFronts Help?

CloudFronts has deep integration expertise in connecting cloud-based applications with one another. Oftentimes, we find ready-made, plug-and-play, cloud-based integration solutions that come with their own hefty licensing, which keeps going up every few years. Using such integration tools not only affects cash flow but also adds a layer of opaqueness, as we don't control the flow of the integration and cannot granularize it beyond what's offered. Custom integration gives you better control and analytics, which ready-made solutions can't. Here's a CloudFronts case study published by Microsoft, wherein we connected multiple systems for our customer, driving data and insights.

To conclude, AI agents aren't optimized to work for your organization right away. This disconnect needs to be engineered, just like any other implementation project today. The gap is real and must be filled by a unified data catalog (such as Unity Catalog) and well-built integrations. CloudFronts can help bridge this gap and make AI work for your organization, helping optimize cash flow against rising costs.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Getting Started with the Event Recorder in Business Central
When developing customizations or extensions in Microsoft Dynamics 365 Business Central, working with events is a best practice. Events help ensure your code is upgrade-safe and cleanly decoupled from the standard application code. However, one common challenge developers face is figuring out which events are triggered during certain actions in the system. That's where the Event Recorder comes in.

What Is the Event Recorder?

The Event Recorder is a built-in tool in Business Central that allows developers to monitor and log all published and subscribed events during a user session. Think of it as a "black box" recorder for event-driven development: it helps you identify exactly which events fire while you perform an action. This tool is extremely helpful when you're customizing functionality using event subscriptions (as per AL best practices) but aren't sure which event to subscribe to.

Why Use the Event Recorder?

Traditionally, developers had to dig through AL code or documentation to find the right event to subscribe to. With the Event Recorder, this becomes faster and more efficient.

How to Use the Event Recorder

Here's a step-by-step guide:

Step 1: Open the Event Recorder
Step 2: Start a New Recording
Step 3: Stop Recording
Step 4: Review the Results

Sample Use Case

Suppose you're trying to add custom logic every time a Sales Invoice is posted, but you're not sure which event gets triggered at that point. Record a posting session with the Event Recorder, and it will list the publishers that fired, such as the OnAfterPostSalesDoc event in codeunit "Sales-Post". You can then write a subscriber in your AL code like this:

```al
[EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnAfterPostSalesDoc', '', true, true)]
local procedure MyProcedure(var SalesHeader: Record "Sales Header")
begin
    // Your custom logic here
end;
```

Limitations

While powerful, the Event Recorder does have some limitations.

To conclude, the Event Recorder is an indispensable tool for any AL developer working in Business Central. It simplifies the discovery of relevant events, helps maintain clean and upgrade-safe extensions, and boosts overall development efficiency. Whether you're new to AL or a seasoned developer, incorporating the Event Recorder into your workflow will save you time.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
How to Build a Scorecard in Power BI
What Is a Scorecard in Power BI?

A Scorecard is a visual performance-monitoring tool that allows you to track key metrics (goals) against predefined targets. Power BI's Metrics feature (formerly Goals) helps you do exactly that.

Why Use Scorecards?

Here's why Scorecards are powerful for any team:

| Benefit | Description |
| --- | --- |
| Goal Alignment | Track KPIs aligned to strategic objectives. |
| Accountability | Assign owners and collaborators for each goal. |
| Real-time Tracking | Monitor progress with live metrics. |
| Visual Reporting | Easy-to-read dashboards and history tracking. |

Step-by-Step: How to Build a Scorecard in Power BI

Step 1: Navigate to the Power BI Service
Go to the Power BI Service and choose the workspace where you want to create your Scorecard (Premium or Pro workspaces only).

Step 2: Create a New Scorecard
You'll now land on a blank Scorecard canvas.

Step 3: Add Metrics to the Scorecard
You can connect each metric to an existing Power BI dataset or manually input values.

Step 4: Link Metrics to Data (Optional but Recommended)
Linking metrics to your data automates tracking and ensures your Scorecard updates automatically with data refreshes.

Step 5: Customize the Scorecard
You can also create hierarchies to group related goals under broader objectives.

Step 6: Share & Collaborate
Once your Scorecard is built, you can share it with your team.

To conclude, Power BI Scorecards turn your data into action. They help track goals in real time, assign ownership, and keep teams focused on what matters most. Whether you're managing a sales team, a project, or company-wide objectives, Power BI Scorecards are a game-changer for performance tracking. Want to bring visibility and accountability to your team goals? Head to the Power BI Service and start building your first Scorecard today! Need help connecting metrics to your datasets? Reach out, and we'll guide you step by step.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Mastering Multithreaded Batch Jobs in Dynamics 365 Finance & Operations
In the world of finance and operations, efficiency and accuracy are critical. Batch jobs play a vital role in automating repetitive tasks, processing large volumes of data, and ensuring seamless business operations. For organizations using Microsoft Dynamics 365 Finance and Operations (D365 F&O), the X++ programming language provides a powerful way to design, schedule, and execute batch jobs effectively. This blog explores how batch jobs function in D365 F&O, their importance in financial and operational workflows, and best practices for implementing them using X++.

What Are Batch Jobs in D365 F&O?

Batch jobs in Dynamics 365 Finance and Operations are automated processes that run in the background without user intervention. They help reduce manual effort, minimize errors, and improve efficiency.

Example: A Simple X++ Batch Job

```xpp
class MyBatchJobTask extends RunBaseBatch
{
    // Define variables
    str description;

    // Main execution logic
    public void run()
    {
        info("Batch job started: " + description);

        // Business logic here (e.g., update records, process transactions)
        ttsbegin;

        // Illustrative example: create a ledger journal record
        LedgerJournalTable journalTable;
        journalTable.Name = description;
        journalTable.insert();

        ttscommit;

        info("Batch job completed successfully.");
    }

    // RunBaseBatch requires pack/unpack so the batch server can persist
    // and rehydrate the task's state between dialog and execution.
    public container pack()
    {
        return [1, description];
    }

    public boolean unpack(container _packedClass)
    {
        int version;
        [version, description] = _packedClass;
        return version == 1;
    }

    // Constructor
    public static MyBatchJobTask construct()
    {
        return new MyBatchJobTask();
    }

    // Main method to run the job
    public static void main(Args _args)
    {
        MyBatchJobTask batchJob = MyBatchJobTask::construct();
        batchJob.description = "End-of-Day Reconciliation";

        // Run the batch job
        if (batchJob.prompt())
        {
            batchJob.run();
        }
    }
}
```

Best Practices for Batch Jobs in X++

1. Split work into multiple batch tasks (see the multithreading sketch at the end of this post). Example:

```xpp
BatchHeader batchHeader = BatchHeader::construct();
batchHeader.addTask(this);
batchHeader.addRuntimeTask(MyOtherBatchTask::construct(), 1);
batchHeader.save();
```

2. Implement Error Handling & Logging
3. Optimize Performance
4. Schedule Jobs Efficiently
5. Test in a Non-Production Environment

Real-World Use Cases

1. Automated Invoice Posting
2. Inventory Revaluation
3. Bank Reconciliation Matching

To conclude, batch jobs in Dynamics 365 Finance and Operations (using X++) remain a cornerstone of financial and operational automation. By following best practices, such as optimizing performance, implementing error handling, and leveraging batch groups, organizations can maximize efficiency while reducing manual effort. As Dynamics 365 Finance & Operations continues to evolve, integrating AI and cloud-based batch processing will further enhance speed and reliability.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
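Building on best practice 1, here is a minimal sketch of the fan-out pattern that makes a batch job truly multithreaded: a parent task slices the workload and adds one runtime task per slice, which the batch server then executes in parallel. The worker class CustProcessWorker and its parmCustAccount method are hypothetical names for illustration.

```xpp
// Hypothetical parent task that fans out one worker per customer account.
class CustProcessStarter extends RunBaseBatch
{
    public void run()
    {
        // Attach runtime tasks to the batch job this task is running under.
        BatchHeader batchHeader = BatchHeader::getCurrentBatchHeader();
        CustTable   custTable;

        while select AccountNum from custTable
        {
            // Each worker carries its own slice of the workload.
            CustProcessWorker worker = CustProcessWorker::construct();
            worker.parmCustAccount(custTable.AccountNum);

            // Runtime tasks run as separate batch threads under this master task.
            batchHeader.addRuntimeTask(worker, this.parmCurrentBatch().RecId);
        }

        batchHeader.save();
    }

    // pack/unpack omitted for brevity; a real RunBaseBatch class must implement both.
}
```

In practice you would slice by ranges of accounts rather than one task per customer, so the number of tasks stays proportional to the available batch threads.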
Time & Expense Management in Dynamics 365 Project Operations
In a project-driven organization, time and expense tracking is not just about administrative accuracy; it's essential for billing, cost control, compliance, and project profitability. Dynamics 365 Project Operations (D365 PO) offers a seamless, integrated module to manage employee time entries, expense submissions, and approval workflows, with real-time visibility into project performance. This article explains the complete lifecycle of time and expense management in D365 PO, from entry to approval, validation, and integration with billing and costing.

1. Time Tracking in D365 PO

D365 PO allows team members to enter time against project tasks directly. Time can be entered daily or weekly, based on organizational preference, and entries are validated and integrated with the project plan.

2. Approval Process

Time entries follow a configurable approval workflow:

| Approver | Typical Role |
| --- | --- |
| Project Manager | Reviews accuracy and relevance of effort |
| Resource Manager | Optional; verifies allocation validity |
| Finance Team | Optional; validates for the billing cycle |

Approval settings can be defined per project, customer, or legal entity. Approved entries then feed project billing and costing.

3. Expense Management

D365 PO supports tracking billable and non-billable expenses incurred during project delivery. Administrators can define expense policies to control spending:

| Policy Area | Examples |
| --- | --- |
| Limits | Max per diem, lodging cap, airfare budget |
| Category Rules | Travel allowed only if the project is longer than X days |
| Receipt Requirements | Mandatory for amounts above X |
| Currency Controls | Only specified currencies allowed |

Violations can trigger warnings, hard stops, or workflow escalations.

Integration & Automation

After approval, time and expenses flow into billing and costing, and both can be submitted on the go from mobile devices.

Reporting & Compliance

Auditors and finance teams can rely on historical logs, comments, and attachments for audit trails and regulatory compliance.

To conclude, effective time and expense management in Dynamics 365 Project Operations enables accurate billing, real-time cost tracking, and employee accountability. With intuitive entry interfaces, approval workflows, and policy enforcement, D365 PO ensures both operational efficiency and financial compliance.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
When to Use Azure Data Factory vs Logic Apps in Dynamics 365 Integrations
You're integrating Dynamics 365 CRM with other systems, but you're confused: should I use Azure Data Factory or Logic Apps? Both support connectors, data transformation, and scheduling, yet they serve different purposes. When you're working on integrating Dynamics 365 with other systems, two Azure tools often come up: Azure Logic Apps and Azure Data Factory (ADF). I've been asked many times, "Which one should I use?", and honestly, there's no one-size-fits-all answer. Based on real-world experience integrating D365 CRM and Finance, here's how I approach choosing between Logic Apps and ADF.

When to Use Logic Apps

Azure Logic Apps is ideal when your integration involves:

1. Event-Driven / Real-Time Integration
2. REST APIs and Lightweight Automation
3. Business Process Workflows
4. Quick and Visual Flow Creation

When to Use Data Factory

Azure Data Factory is better for:

1. Large-Volume, Batch Data Movement
2. ETL / ELT Scenarios
3. Integration with Data Lakes and Warehouses
4. Advanced Data Flow Transformation

Feature Comparison

| Feature | Logic Apps | Data Factory |
| --- | --- | --- |
| Trigger on record creation/update | Yes | No (batch only) |
| Handles APIs (HTTP, REST, OData) | Excellent | Limited |
| Real-time integration | Yes | No |
| Large data volumes (batch) | Limited | Excellent |
| Data lake / warehouse integration | Basic (via connectors) | Deep support |
| Visual workflow | Visual designer | Visual (for Data Flows) |
| Custom code / transformation | Limited (use Azure Functions) | Strong via Data Flows |
| Cost at high volume | Higher (per run) | Cost-efficient for batch |

To conclude, choose Logic Apps for real-time, low-volume, API-based workflows, and use Data Factory for batch ETL pipelines, high-volume exports, and reporting pipelines. Integrations in Dynamics 365 CRM aren't one-size-fits-all; pick the right tool based on data size, speed, and transformation needs.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Create Powerful No-Code AI Agents with Azure AI Foundry
An AI agent is a smart program that can think, make decisions, and perform tasks. Sometimes it works alone, and sometimes it works with people or other agents. The main difference between an agent and a regular assistant is that agents can act on their own: you give them a goal, and they try to reach it. Every AI agent has three main parts. Agents take input, such as a message or a prompt, and respond with answers or actions; for example, they might look something up or start a process based on what you asked. Azure AI Foundry is a platform that brings all these pieces together so you can build, train, and manage AI agents easily.

References

What is Azure AI Foundry Agent Service? – Azure AI Foundry | Microsoft Learn
Understanding deployment types in Azure AI Foundry Models – Azure AI Foundry | Microsoft Learn
https://learn.microsoft.com/en-us/azure/ai-foundry/how-to/index-add

Usage

First, we create a project in Azure AI Foundry. Click Next, give your project a name, and wait for the setup to finish. Once the project creation finishes, we are greeted with the project home screen. Click on the Agents tab and click Next to choose the model. I'm currently using GPT-4o Mini; the picker also includes descriptions of all the available models.

Then we configure the deployment details. There are multiple deployment types available, such as global deployments, data zone deployments, and standard deployments.

Standard deployments [Standard] follow a pay-per-use model, perfect for getting started quickly. They're best for low to medium usage with occasional traffic spikes; for high and steady loads, however, performance may vary.

Provisioned deployments [ProvisionedManaged] let you pre-allocate the amount of processing power you need. This is measured in Provisioned Throughput Units (PTUs); each model and version requires a different number of PTUs and offers different performance levels. Provisioned deployments ensure predictable and stable performance for large or mission-critical workloads.

This is how the deployment details look for Global Standard. I'll be choosing a Standard deployment for our use case. Click Deploy and wait a few seconds.

Once the deployment is complete, you can give your agent a name and some instructions for its behavior. You should specify the tone, end goal, verbosity, and so on. You can also set the Temperature and Top P values, which both control the randomness or creativity of the model (a short API sketch at the end of this post shows how these parameters look in code).

Temperature controls how bold or cautious the model is. Lower temperature = safer, more predictable answers (factual Q&A, code summarization). Higher temperature = more creative or surprising answers (poetry, creative writing).

Top P (nucleus sampling) controls how wide the model's word choices are. Lower Top P = only picks from the most likely words (legal or financial writing). Higher Top P = includes less likely, more diverse words (brainstorming names).

Next, I'll add a knowledge base to my bot. For this example, I'll just upload a single file. However, you also have the option to add a SharePoint folder or files, or connect it to Bing Search, Microsoft Fabric, Azure AI Search, and so on, as required.
A vector store in Azure AI Foundry helps your AI agent retrieve relevant information based on meaning rather than just keywords. It works by breaking your content (like a PDF) into smaller parts, converting them into numerical representations (embeddings), and storing them. When a user asks a question, the AI finds the most semantically similar parts from the vector store and uses them to generate accurate, context-aware responses.

Once you select the file, click on Upload and save. At this point, you can start to interact with your model: click the "Try in Playground" button. And here, we can see the output based on our provided knowledge base. One more example, just because it is kind of fun.

Every input you provide to the agent is called a "message". Each time the agent is invoked to process the provided input, that is called a "run". Every interaction session with the agent is called a "thread". We can see all the open threads in the Threads section.

To conclude, Azure AI Foundry makes it easy to build and use AI agents without writing any code. You can choose models, set how they behave, and connect your data, all through a simple interface. Whether you're testing ideas, automating tasks, or building custom bots, Foundry gives you the tools to do it. If you're curious about AI or want to try building your own agent, Foundry is a great place to begin.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
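As promised above, here is a minimal sketch of how Temperature and Top P appear when you call a deployed model programmatically, using the openai Python package against an Azure endpoint. The endpoint, key, API version, and deployment name are placeholders, not values from this walkthrough.

```python
from openai import AzureOpenAI

# Placeholders: use your own Azure endpoint, key, and deployment name.
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-api-key>",
    api_version="2024-06-01",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # your deployment name
    messages=[{"role": "user", "content": "Summarize our return policy."}],
    temperature=0.2,      # low = safer, more predictable answers
    top_p=0.9,            # narrows word choices to the most likely tokens
)

print(response.choices[0].message.content)
```

Dropping temperature toward 0 keeps the agent close to its knowledge base, while raising it (or Top P) suits brainstorming-style prompts.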
