Blog Archives – Page 7 of 171

Category Archives: Blog

Why Project-Based Firms Should Embrace AI Now (Not Later)

In project-based businesses, reporting is the final word. It tells you what was planned, what happened, where you made money, and where you lost it. But ask any project manager or CEO what they really think about project reporting today, and you'll hear this: "It's late. It's manual. It's siloed. And by the time I see it, it's too late to act." This is exactly why AI is no longer optional; it's essential. Whether you're in construction, consulting, IT services, or professional engineering, AI can elevate your project reporting from a reactive chore to a strategic asset. Here's how.

The Problem with Traditional Reporting

Most reporting today is manual, delayed, and siloed: data is pulled from disconnected systems, stitched together by hand, and reviewed only after it is too late to act on.

Enter AI: The Game-Changer for Project Reporting

AI isn't about replacing humans; it's about augmenting your decision-making. When embedded in platforms like Dynamics 365 Project Operations and Power BI, AI becomes the project manager's smartest analyst and the CEO's most trusted advisor. Here's what that looks like.

Predictive warnings. Imagine your system telling you: "Project Alpha is likely to overrun budget by 12% based on current burn rate and resource allocation trends." AI models analyse historical patterns, resource velocity, and task progress to predict issues weeks in advance. That's no longer science fiction; it's happening today with AI-enhanced Power BI and Copilot in Dynamics 365.

Natural-language answers. Instead of navigating dashboards, just ask: "Show me projects likely to miss deadlines this month." With Copilot in Dynamics 365, you get answers in seconds, with charts and supporting data. No need to wait for your analyst or export 10 spreadsheets.

Clean, trusted data. AI can clean, match, and validate data coming from your source systems. No more mismatched formats or chasing someone to update a spreadsheet. AI ensures your reports are built on clean, real-time data, not assumptions.

Intelligent alerts. You don't need to check 12 dashboards daily. With AI, you can set intelligent alerts that flag problems as they emerge. These alerts are not static rules; they are learned over time from project patterns and exceptions.

To conclude, for CEOs and PMs alike: we can show you how AI and Copilot in Dynamics 365 can simplify reporting, uncover risks, and help your team act with confidence. Start small, maybe with reporting or forecasting, but start now.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Building a Scalable Integration Architecture for Dynamics 365 Using Logic Apps and Azure Functions

If you've worked with Dynamics 365 CRM for any serious integration project, you've probably used Azure Logic Apps. They're great: visual, no-code, and fast to deploy. But as your integration needs grow, you quickly hit complexity: multiple entities, large volumes, branching logic, error handling, and reusability. That's when architecture becomes critical.

In this blog, I'll share how we built a modular, scalable, and reusable integration architecture using Logic Apps + Azure Functions + Azure Blob Storage, with a config-driven approach. Whether you're syncing data between D365 and Finance & Operations, or automating CRM workflows with external APIs, this post will help you avoid bottlenecks and stay maintainable.

Architecture Components

- Parent Logic App: entry point; reads the config from blob and iterates over entities.
- Child Logic App(s): handles each entity sync (Project, Task, Team, etc.).
- Azure Blob Storage: hosts configuration files, Liquid templates, and checkpoint data.
- Azure Function: performs advanced transformation via Liquid templates.
- CRM & F&O APIs: source and target systems.

Step-by-Step Breakdown

1. Configuration-Driven Logic

We didn't hardcode URLs, fields, or entities. Everything lives in a central config.json in Blob Storage:

{
  "integrationName": "ProjectToFNO",
  "sourceEntity": "msdyn_project",
  "targetEntity": "ProjectsV2",
  "liquidTemplate": "projectToFno.liquid",
  "primaryKey": "msdyn_projectid"
}

2. Parent-Child Logic App Model

Instead of one massive workflow, we created a parent Logic App that reads the config from Blob Storage, iterates over the configured entities, and invokes a child Logic App for each one. Each child handles the sync for its own entity: fetching source records, calling the transformation, and writing to the target.

3. Azure Function for Transformation

Why not use Logic App's Compose or Data Operations? Because complex mapping (especially D365 to F&O) quickly becomes unreadable. Instead, we pass the source record to an Azure Function that renders a Liquid template such as the following (a worked example appears at the end of this post):

{
  "ProjectName": "{{ msdyn_subject }}",
  "Customer": "{{ customerid.name }}"
}

4. Handling Checkpoints

For batch integration (daily/hourly), we store the last run timestamp in Blob:

{
  "entity": "msdyn_project",
  "modifiedon": "2025-07-28T22:00:00Z"
}

This allows delta fetches like:

?$filter=modifiedon gt 2025-07-28T22:00:00Z

After each run, we update the checkpoint blob.

5. Centralized Logging & Alerts

We configured centralized logging to Blob Storage and alerting through Azure Monitor. This helped us track down integration mismatches fast.

Why This Architecture Works

- Reusability: config-based logic + modular templates.
- Maintainability: each Logic App has one job.
- Scalability: add new entities via config, not code.
- Monitoring: Blob + Monitor integration.
- Transformation complexity: handled via Azure Functions + Liquid.

To conclude, this architecture has helped us deliver scalable Dynamics 365 integrations, including syncing Projects, Tasks, Teams, and Time Entries to F&O, all without rewriting Logic Apps every time a client asks for a tweak. If you're working on medium to complex D365 integrations, consider going config-driven and breaking your workflows into modular components. It keeps things clean, reusable, and much easier to maintain in the long run.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
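For reference, here is a worked example of the Liquid transformation from step 3, using a hypothetical CRM payload. The function receives the raw record, applies the template shown above, and returns the reshaped body for F&O.

Input from CRM:

{
  "msdyn_subject": "Website Revamp",
  "customerid": { "name": "Contoso Ltd" }
}

Rendered output sent to F&O:

{
  "ProjectName": "Website Revamp",
  "Customer": "Contoso Ltd"
}

Because the template, not the workflow, owns the mapping, a field change becomes a one-line edit to the .liquid file in Blob Storage rather than a Logic App redeployment.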


The Future of Financial Reporting: How SSRS in Dynamics 365 is Transforming Finance Teams

In Microsoft Dynamics 365 Finance and Operations (D365 F&O), reporting is a critical aspect of delivering insights, decision-making data, and compliance information. While standard reports are available out of the box, many organizations require customized reporting tailored to specific business needs. This is where X++ and SSRS (SQL Server Reporting Services) come into play. In this blog, we'll explore how reporting works in D365 F&O, the role of X++, and how developers can create powerful, customized reports using standard tools.

Overview: Reporting in D365 F&O

Dynamics 365 F&O offers multiple reporting options. Among these, SSRS reports with X++ (RDP) are the most common choice for developers who need to generate transaction-based, formatted reports, like invoices, purchase orders, and audit summaries.

Key Components of an SSRS Report Using X++

To create a custom SSRS report using X++ in D365 F&O, you typically work through these components: a temporary table, a data contract class, a report data provider (RDP) class, the report design, and the menu items that expose it.

Step-by-Step: Building a Report with X++

1. Create a Temporary Table

Create a temporary table that stores the data used for the report. Use InMemory or TempDB depending on your performance and persistence requirements.

TmpCustReport tmpCustReport; // Example TempDB table

2. Build a Contract Class

This class defines the parameters users will input when running the report.

[DataContractAttribute]
class CustReportContract
{
    private CustAccount custAccount;

    [DataMemberAttribute("CustomerAccount")]
    public CustAccount parmCustAccount(CustAccount _custAccount = custAccount)
    {
        custAccount = _custAccount;
        return custAccount;
    }
}

3. Write a Report Data Provider (RDP) Class

This is where you write the business logic and data extraction in X++. This class extends SRSReportDataProviderBase.

[SRSReportParameterAttribute(classStr(CustReportContract))]
class CustReportDP extends SRSReportDataProviderBase
{
    TmpCustReport tmpCustReport;

    public void processReport()
    {
        CustTable custTable;
        CustReportContract contract = this.parmDataContract();
        CustAccount custAccount = contract.parmCustAccount();

        while select custTable
            where custTable.AccountNum == custAccount
        {
            tmpCustReport.AccountNum = custTable.AccountNum;
            tmpCustReport.Name = custTable.Name;
            tmpCustReport.insert();
        }
    }

    [SRSReportDataSetAttribute(tableStr(TmpCustReport))]
    public TmpCustReport getTmpCustReport()
    {
        return tmpCustReport;
    }
}

4. Design the Report in Visual Studio

Add a new Report object to your model, point its dataset at the RDP class, and lay out a design (precision or auto design) bound to the dataset fields.

5. Create Menu Items and Add to Navigation

To allow users to access the report, create an Output menu item that points to the report and add it to the appropriate menu in the navigation.

Security Considerations

Always create a new Privilege and assign it to a Duty and Role if this report needs to be secured. This ensures proper compliance with security best practices.

To conclude, creating reports using X++ in Dynamics 365 Finance and Operations is a powerful way to deliver customized business documents and analytical reports. With the structured approach of Contract → RDP → SSRS, developers can build maintainable and scalable reporting solutions. Whether you're generating a sales summary, customer ledger, or compliance documentation, understanding how to use X++ for reporting gives you full control over data and design.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
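To round out the pattern, here is a minimal controller sketch for launching the report from the menu item in step 5. The class, report, and design names are hypothetical (it assumes the report above is named CustReport with a design called Report):

class CustReportController extends SrsReportRunController
{
    public static void main(Args _args)
    {
        CustReportController controller = new CustReportController();

        // Point the controller at the report design built in step 4
        controller.parmReportName(ssrsReportStr(CustReport, Report));
        controller.parmArgs(_args);

        // Shows the parameter dialog (from the contract) and runs the report
        controller.startOperation();
    }
}

If you need custom pre-processing or parameter defaulting, point the Output menu item at this controller class instead of directly at the report.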


Automatically Update Lookup Fields in Dynamics 365 Using Power Automate: From Custom Tables to Standard Entities

Imagine this: you update a product's purchase details on a registration record and, boom, a related case automatically gets refreshed with the accurate "Purchased From" lookup. It saves time, reduces errors, and keeps everything in sync without you lifting a finger. Let's walk through how to make that happen using Power Automate.

The goal: when a Product Registration's cri_purchasedat field is changed, the system will retrieve the related "Purchased From" record and update any linked Case(s) with the appropriate lookup reference. Let's break down the step-by-step process of how this is done in Power Automate.

Step 1: Trigger the Flow When the Purchase Field Changes

Flow trigger: When a row is added, modified, or deleted (Dataverse), scoped to the Product Registration table with cri_purchasedat in the trigger's column filter. This setup ensures that our flow only fires when that specific field is modified.

Step 2: Pull in the "Purchased From" Record

Next, use List rows on the "Purchased From" table with a FetchXML query. We're searching for a record whose name matches the updated cri_purchasedat. Set Row Count to 1, since we expect only one match. (A sample sketch of this query appears at the end of this post.)

Step 3: Identify Any Linked Case Records

Add another List rows action, this time on the Cases table (internally incident in Dataverse), using a FetchXML query to match records where cri_productregistrationid equals the ID of the record being modified. This step is critical because it gives us the list of Case records we need to update, based on the link with the modified product registration.

<fetch>
  <entity name="incident">
    <attribute name="incidentid" />
    <attribute name="title" />
    <attribute name="cf_actualpurchasedfrom" />
    <filter>
      <condition attribute="cri_productregistrationid" operator="eq" value="@{triggerOutputs()?['body/cri_productregistrationid']}" />
    </filter>
  </entity>
</fetch>

Step 4: Validate the "Purchased From" Match

Before updating anything, we add a Condition control to ensure that the previously fetched Purchased From record exists and is unique. Why? Because if there's no match (or multiple matches), we don't want to update the Cases blindly. We take the length of the returned rows and check whether it equals 1. If true, we move forward with the updates; if false, we stop or handle the exception accordingly. This kind of validation builds guardrails into your automation, making it more robust and preventing incorrect data from being applied across multiple records.

Step 5: Update the Linked Cases

After confirming a valid match, the flow loops through each related Case and updates the "Actual Purchased From" field with the correct value from the matched record, ensuring accurate linkage based on the latest update.

To conclude, once this last step runs, your automation is complete, with Cases now intelligently updated in real time based on Product Registration changes.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
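For reference, here is a minimal sketch of the step 2 query. The table's logical name (cri_purchasedfrom) and its primary name column (cri_name) are assumptions for illustration; adjust them to your schema:

<fetch top="1">
  <entity name="cri_purchasedfrom">
    <attribute name="cri_purchasedfromid" />
    <attribute name="cri_name" />
    <filter>
      <condition attribute="cri_name" operator="eq" value="@{triggerOutputs()?['body/cri_purchasedat']}" />
    </filter>
  </entity>
</fetch>

The step 4 uniqueness check can then compare the row count to 1 with an expression such as length(outputs('List_rows_Purchased_From')?['body/value']), where List_rows_Purchased_From stands in for whatever your List rows action is named.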


Merging Unmanaged Solutions in Power Platform with XRMToolBox

Let's say you are developing a model-driven app or some custom app development in CRM, and multiple teams have created multiple different solutions containing customizations for the same project. It is best to consolidate all the customizations into a single solution before moving them to UAT or Production. In this blog, I will show you how you can move components of multiple solutions into a single main solution using the Solution Component Mover tool in XrmToolBox. So let's begin.

Step 1: Download XrmToolBox from this link – https://www.xrmtoolbox.com/

Step 2: Make a connection to your Dynamics 365 environment inside XrmToolBox by clicking on Create a new connection.

Step 3: Click on Microsoft Login Control.

Step 4: Click on Open Microsoft Login Control.

Step 5: Select Display list of available organizations and Show Advanced, enter your username and password, and, after successful authentication, name your connection.

Step 6: In the Tool Library, search for "Solution Component Mover" and hit Install.

Step 7: Once the tool is installed, it will appear in your tool list; click on it.

Step 8: Once you are in the Solution Component Mover tool, click on Load Solutions.

You will now get a list of all Managed and Unmanaged solutions. Select the solutions you want to merge in the Source Solutions section and select the target solution into which you want to move the components (selected solutions are highlighted in light grey colour). Once you have selected the source and target solutions, hit Copy Components, and all the components from the source solutions will be moved to the target solution. And we are done.

To conclude, merging solutions this way keeps your deployments clean: a single consolidated solution is far easier to move to UAT or Production than several overlapping ones.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Building the AI Bridge: How CloudFronts Helps You Connect Systems That Talk to Each Other

When we say "building a bridge," does that mean something isn't connected? Yes: it's AI itself and your systems that are not connected. In practice, this means that although your AI can access your systems to derive information, it remains unreliable and slow.

What is needed for AI to be successful?

For AI to be successful, we must avoid feeding it fragmented, inconsistent data scattered across disconnected systems. To eliminate this, we need a 'catalog' layer that houses all business data together, so that a common vocabulary is established between systems. AI then pulls from this data catalog to perform agentic actions. The diagram below explains, at a high level, how this looks. And all of this is defined by how well the integrations between these systems are established.

How Can CloudFronts Help?

CloudFronts has deep integration expertise, connecting cloud-based applications with cost, control, and transparency in mind. Oftentimes, we find ready-made, plug-and-play, cloud-based integration solutions that come with their own hefty licensing that keeps going up every few years. Using such integration tools not only affects cash flow but also adds a layer of opaqueness, as we don't control the flow of integration and cannot granularize it beyond what's offered. Custom integration gives you better control and analytics, which ready-made solutions can't. Here's a CloudFronts case study published by Microsoft, wherein we connected multiple systems for our customer, driving data and insights.

To conclude, AI agents aren't optimized to work for your organization right away. This disconnect needs to be engineered, just like any other implementation project today. The gap is real, and it must be filled by a data catalog (such as Unity Catalog) and well-built integrations. CloudFronts can help bridge this gap and make AI work for your organization, so you can continue to optimize cash flow against rising costs.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Getting Started with the Event Recorder in Business Central

When developing customizations or extensions in Microsoft Dynamics 365 Business Central, working with events is a best practice. Events help ensure your code is upgrade-safe and cleanly decoupled from the standard application code. However, one common challenge developers face is figuring out which events are triggered during certain actions in the system. That's where the Event Recorder comes in.

What Is the Event Recorder?

The Event Recorder is a built-in tool in Business Central that allows developers to monitor and log all published and subscribed events during a user session. Think of it as a "black box" recorder for event-driven development. It helps you identify which events fire during a given action, in what order, and from which objects. This tool is extremely helpful when you're customizing functionality using event subscriptions (as per AL best practices) but aren't sure which event to subscribe to.

Why Use the Event Recorder?

Traditionally, developers had to dig through AL code or documentation to find the right event to subscribe to. With the Event Recorder, this becomes faster and more efficient: you reproduce the scenario once and get the exact list of candidate events, without guesswork.

How to Use the Event Recorder

Here's a step-by-step guide:

Step 1: Open Event Recorder. Search for "Event Recorder" using the Tell Me search in Business Central.

Step 2: Start a New Recording. Choose Start, then perform the business action you want to analyse (for example, posting a document).

Step 3: Stop Recording. Return to the Event Recorder page and choose Stop.

Step 4: Review the Results. The page lists every event that fired, including the object type, object name, and event name, and lets you copy a ready-made subscriber via the Get AL Snippet action.

Sample Use Case

Suppose you're trying to add custom logic every time a Sales Invoice is posted, but you're not sure which event gets triggered at that point. Record a posting session with the Event Recorder, locate the relevant event in the results, and write a subscriber in your AL code. The general shape looks like this (shown here for a page event):

[EventSubscriber(ObjectType::Page, Page::"Customer List", 'OnAfterGetRecordEvent', '', true, true)]
local procedure MyProcedure()
begin
    // Your custom logic here
end;

Limitations

While powerful, the Event Recorder does have some limitations: it only shows events that actually fired during your recording session, so an event you need will not appear unless you reproduce the right scenario, and long recordings can produce very large lists, so keep sessions short and focused.

To conclude, the Event Recorder is an indispensable tool for any AL developer working in Business Central. It simplifies the discovery of relevant events, helps maintain clean and upgrade-safe extensions, and boosts overall development efficiency. Whether you're new to AL or a seasoned developer, incorporating the Event Recorder into your workflow will save you time.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
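Following the sales invoice use case above, a posting-specific subscriber might look like this minimal sketch. It targets the OnAfterPostSalesDoc event of the standard Sales-Post codeunit and declares only the parameters it needs; verify the exact event and signature against your own recording results:

codeunit 50100 "Sales Posting Notification"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnAfterPostSalesDoc', '', false, false)]
    local procedure HandleAfterPostSalesDoc(var SalesHeader: Record "Sales Header"; SalesInvHdrNo: Code[20])
    begin
        // Runs after posting completes; SalesInvHdrNo holds the posted invoice number
        if SalesInvHdrNo <> '' then
            Message('Posted sales invoice %1 for customer %2.', SalesInvHdrNo, SalesHeader."Sell-to Customer No.");
    end;
}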


How to Build a Scorecard in Power BI

What Is a Scorecard in Power BI?

A Scorecard is a visual performance monitoring tool that allows you to track key metrics (goals) against predefined targets. Power BI's Metrics feature (formerly Goals) helps you define goals, connect them to live data, and follow their progress over time.

Why Use Scorecards?

Here's why Scorecards are powerful for any team:

- Goal Alignment: track KPIs aligned to strategic objectives.
- Accountability: assign owners and collaborators for each goal.
- Real-time Tracking: monitor progress with live metrics.
- Visual Reporting: easy-to-read dashboards and history tracking.

Step-by-Step: How to Build a Scorecard in Power BI

Step 1: Navigate to Power BI Service. Go to Power BI Service and choose the workspace where you want to create your Scorecard (Premium or Pro workspaces only).

Step 2: Create a New Scorecard. Select New and choose Scorecard. You'll now land on a blank Scorecard canvas.

Step 3: Add Metrics to the Scorecard. Add a metric with a name, owner, current value, and target. You can connect it to an existing Power BI dataset or manually input values.

Step 4: Link Metrics to Data (Optional but Recommended). To automate tracking, connect the metric's current value and target to data in existing report visuals. This ensures your Scorecard updates automatically with data refreshes.

Step 5: Customize the Scorecard. Adjust statuses, notes, and formatting to suit your team. You can also create hierarchies to group related goals under broader objectives.

Step 6: Share & Collaborate. Once your Scorecard is built, share it with your team like any other Power BI item and let owners check in progress on their own metrics.

To conclude, Power BI Scorecards turn your data into action. They help track goals in real time, assign ownership, and keep teams focused on what matters most. Whether you're managing a sales team, a project, or company-wide objectives, Power BI Scorecards are a game-changer for performance tracking. Want to bring visibility and accountability to your team goals? Head to Power BI Service and start building your first Scorecard today! Need help connecting metrics to your datasets? Reach out, and we'll guide you step by step.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Mastering Multithreaded Batch Jobs in Dynamics 365 Finance & Operations

In the world of finance and operations, efficiency and accuracy are critical. Batch jobs play a vital role in automating repetitive tasks, processing large volumes of data, and ensuring seamless business operations. For organizations using Microsoft Dynamics 365 Finance and Operations (D365 F&O), the X++ programming language provides a powerful way to design, schedule, and execute batch jobs effectively. This blog explores how batch jobs function in D365 F&O, their importance in financial and operational workflows, and best practices for implementing them using X++.

What Are Batch Jobs in D365 F&O?

Batch jobs in Dynamics 365 Finance and Operations are automated processes that run in the background without user intervention. They are ideal for long-running, repetitive, or high-volume work, and they help reduce manual effort, minimize errors, and improve efficiency.

Example: A Simple X++ Batch Job

class MyBatchJobTask extends RunBaseBatch
{
    // Define variables
    str description;

    // Main execution logic
    public void run()
    {
        info("Batch job started: " + description);

        // Business logic here (e.g., update records, process transactions)
        ttsbegin;

        // Example: Update ledger entries
        LedgerJournalTable journalTable;
        journalTable.Description = description;
        journalTable.insert();

        ttscommit;

        info("Batch job completed successfully.");
    }

    // Constructor
    public static MyBatchJobTask construct()
    {
        return new MyBatchJobTask();
    }

    // Main method to run the job
    public static void main(Args _args)
    {
        MyBatchJobTask batchJob = MyBatchJobTask::construct();
        batchJob.description = "End-of-Day Reconciliation";

        // Run the batch job
        if (batchJob.prompt())
        {
            batchJob.run();
        }
    }
}

Key Benefits of Batch Jobs in Finance & Operations

Batch jobs move heavy processing off user sessions, can be scheduled for off-peak hours, and scale out across batch servers.

Best Practices for Batch Jobs in X++

1. Split work into parallel tasks for multithreaded execution. Example (a fuller sketch appears at the end of this post):

BatchHeader batchHeader = BatchHeader::construct();
batchHeader.addTask(this);
batchHeader.addRuntimeTask(MyOtherBatchTask::construct(), 1);
batchHeader.save();

2. Implement error handling and logging.
3. Optimize performance.
4. Schedule jobs efficiently.
5. Test in a non-production environment.

Real-World Use Cases

1. Automated invoice posting.
2. Inventory revaluation.
3. Bank reconciliation matching.

To conclude, batch jobs in Dynamics 365 Finance and Operations (using X++) remain a cornerstone of financial and operational automation. By following best practices, such as optimizing performance, implementing error handling, and leveraging batch groups, organizations can maximize efficiency while reducing manual effort. As Dynamics 365 Finance & Operations continues to evolve, integrating AI and cloud-based batch processing will further enhance speed and reliability.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
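To make the multithreaded pattern in best practice 1 concrete, here is a minimal sketch, reusing the hypothetical MyBatchJobTask class from the example above, that fans work out into one runtime task per customer group when the job runs in batch. It assumes the standard BatchHeader::getCurrentBatchHeader and getCurrentBatchTask helpers to resolve the batch job currently executing:

public void scheduleWorkerTasks()
{
    CustGroup custGroup;

    // Only fan out when executing inside the batch framework
    if (this.isInBatch())
    {
        BatchHeader batchHeader = BatchHeader::getCurrentBatchHeader();

        // One worker task per customer group; the batch server can run these in parallel
        while select custGroup
        {
            MyBatchJobTask workerTask = MyBatchJobTask::construct();
            workerTask.description = strFmt("Process customer group %1", custGroup.CustGroup);
            batchHeader.addRuntimeTask(workerTask, BatchHeader::getCurrentBatchTask().RecId);
        }

        batchHeader.save();
    }
}

Each runtime task then appears as its own line under the batch job, so retries and failures are visible per chunk rather than for the job as a whole.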


Time & Expense Management in Dynamics 365 Project Operations

In a project-driven organization, time and expense tracking is not just about administrative accuracy; it is essential for billing, cost control, compliance, and project profitability. Dynamics 365 Project Operations (D365 PO) offers a seamless and integrated module to manage employee time entries, expense submissions, and approval workflows, with real-time visibility into project performance. This article explains the complete lifecycle of time and expense management in D365 PO, from entry to approval, validation, and integration with billing and costing.

1. Time Tracking in D365 PO

D365 PO allows team members to enter time against project tasks directly. Time can be entered daily or weekly, based on organizational preference, and entries are validated against the project plan and the resource's assignments before submission.

2. Approval Process

Time entries follow a configurable approval workflow:

- Project Manager: reviews accuracy and relevance of effort.
- Resource Manager: optional; verifies allocation validity.
- Finance Team: optional; validates for the billing cycle.

Approval settings can be defined per project, customer, or legal entity. Approved entries become part of the project's actuals, which feed billing and costing.

3. Expense Management

D365 PO supports tracking billable and non-billable expenses incurred during project delivery. Users capture each expense, categorize it against a project, attach receipts, and submit it for approval.

Expense Policies

Administrators can define Expense Policies to control spending:

- Limits: max per diem, lodging cap, airfare budget.
- Category Rules: travel allowed only if the project is longer than X days.
- Receipt Requirements: mandatory for amounts above X.
- Currency Controls: only specified currencies allowed.

Violations can trigger warnings, hard stops, or workflow escalations.

Integration & Automation

After approval, time and expense entries flow into actuals for billing and costing, and entries can also be captured on the go through the mobile experience.

Reporting & Compliance

Auditors and finance teams can rely on historical logs, comments, and attachments for audit trails and regulatory compliance.

To conclude, effective time and expense management in Dynamics 365 Project Operations enables accurate billing, real-time cost tracking, and employee accountability. With intuitive entry interfaces, approval workflows, and policy enforcement, D365 PO ensures both operational efficiency and financial compliance.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.

