Latest Microsoft Dynamics 365 Blogs | CloudFronts

Seamlessly Switching Lead-Based BPFs After Qualification in Dynamics 365 CRM

In Microsoft Dynamics 365 CRM, Business Process Flows (BPFs) are powerful tools that guide users through defined business stages. However, when working with Lead-based BPFs that persist into the Opportunity, certain platform limitations surface, especially when multiple Lead-rooted BPFs are involved. This blog walks through a real-world problem I encountered with Lead → Opportunity BPF switching, why the out-of-the-box behavior falls short, and how I designed a robust client-side plus server-side solution to safely and reliably switch BPFs, even after the Lead has already been qualified.

How BPFs Work (Quick Recap)

Once a Lead-rooted BPF has progressed past qualification, the platform will not normally allow a switch, whether you try to force it from the client side or the server side.

The Problem

In my scenario, multiple Lead-rooted BPFs were in use, and the active process had to be switched after the Lead had already been qualified into an Opportunity.

The Challenge

Once a Lead is qualified, the out-of-the-box behavior offers no clean way to switch the Lead-based BPF. The manual workaround is non-intuitive, error-prone, and inefficient, especially considering the effort that goes into it.

Solution Overview

I implemented a guided, safe, and reversible BPF switching mechanism.

High-Level Architecture

The solution combines a custom ribbon button on the Opportunity, client-side scripts on the Lead form, Xrm.WebApi calls, and a server-side plugin.

Step-by-Step Methodology

1. Entry Point: Opportunity Ribbon Button

A custom ribbon button on the Opportunity form stamps trigger fields on the related Lead. These fields act as a controlled handshake between Opportunity and Lead.

2. Lead OnLoad: Controlled Trigger Execution

On Lead form load, the script checks how recently the handshake was raised and resets the trigger flag before doing anything else:

    // Ignore stale triggers: only continue if the handshake is recent
    if (diffSeconds > 20) {
        return;
    }

    // Reset the trigger flag so the switch logic runs only once
    Xrm.WebApi.updateRecord("lead", formContext.data.entity.getId(), {
        cf_shouldtrigger: false
    });

3. Identifying and Aborting the Existing BPF

Before switching, the script identifies the currently active BPF instance and aborts it:

    var activeProcess = formContext.data.process.getActiveProcess();

    // result holds the Lead's active BPF instance, retrieved via Xrm.WebApi.retrieveMultipleRecords
    Xrm.WebApi.updateRecord(bpfEntityName, result.entities[0].businessprocessflowinstanceid, {
        statecode: 1,  // Inactive
        statuscode: 3  // Aborted
    });

This is a critical step: without aborting the old instance, Dynamics can behave unpredictably.

4. Switching the UI BPF

After aborting the old instance, the client script switches the form to the target BPF.

5. Handling BPF Instance Creation (First-Time Switch Logic)

The solution explicitly checks whether an instance of the target BPF already exists for the record. If it exists, that instance is reused; if it does not exist (the first switch), a new instance is created. This dual-path logic makes the solution idempotent and reusable.

6. Server-Side Plugin: Persisting the Truth

A plugin persists the switch on the server so the data stays consistent:

    // Identify BPF type
    bool isNewBpf = (context.PrimaryEntityName == "new_bpf_entity");

    // Resolve related Lead
    Guid leadId = isNewBpf
        ? ((EntityReference)target["bpf_leadid"]).Id
        : ((EntityReference)target["leadid"]).Id;

    // Retrieve related Opportunity via Lead
    Entity opportunity = GetOpportunityByLead(service, leadId);

    // Determine stages and path
    string qualifyStageId = isNewBpf ? NEW_QUALIFY_STAGE : OLD_QUALIFY_STAGE;
    string finalStageId = isNewBpf ? NEW_FINAL_STAGE : OLD_FINAL_STAGE;
    string traversedPath = START_STAGE + "," + qualifyStageId + "," + finalStageId;

    // PATCH 1 – Qualify stage
    service.Update(new Entity(target.LogicalName, target.Id)
    {
        ["activestageid"] = new EntityReference("processstage", new Guid(qualifyStageId)),
        ["traversedpath"] = START_STAGE + "," + qualifyStageId
    });

    // PATCH 2 – Final stage + Opportunity bind
    service.Update(new Entity(target.LogicalName, target.Id)
    {
        ["activestageid"] = new EntityReference("processstage", new Guid(finalStageId)),
        ["traversedpath"] = traversedPath,
        [isNewBpf ? "bpf_opportunityid" : "opportunityid"] =
            new EntityReference("opportunity", opportunity.Id)
    });

    // Mark Lead as successfully processed
    service.Update(new Entity("lead", leadId)
    {
        ["cf_pluginsuccess"] = new OptionSetValue(1)  // Yes
    });

This guarantees data consistency and auditability.
Final UI Sync & Redirect After successful completion:   Xrm.Navigation.openForm({ entityName: “opportunity”, entityId: opportunityId }); From the user’s perspective: “I clicked a button, confirmed the switch, and landed back in my Opportunity—done.” Why This Solution Works ✔ Respects Dynamics 365 BPF constraints✔ Prevents orphaned or conflicting BPF instances✔ Handles first-time and repeat switches✔ Ensures server-side persistence✔ Minimal user disruption✔ Fully reversible Most importantly, it bridges the gap between platform limitations and real business needs. Final Thoughts Dynamics 365 BPFs are powerful—but when multiple Lead-rooted processes coexist, manual switching is not enough. This solution demonstrates how: can be combined to deliver a seamless, enterprise-grade experience without unsupported hacks. If you’re facing similar challenges with Lead → Opportunity BPF transitions, this pattern can be adapted and reused with confidence. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Triggering Power Automate Flows Directly from Power BI Reports

Power BI is excellent at visualizing insights, but insights often need action. That's where the Power Automate visual comes in. With this visual, report consumers can trigger instant Power Automate flows directly from a Power BI report, using the data and filters already applied on the page. No switching tools. No exporting data. Just click and act. This blog walks through how the Power Automate visual works, how to configure it, and what to consider before rolling it out.

Understanding Power Automate Visuals

The Power Automate visual adds a button to your Power BI report. When clicked, it runs an instant cloud flow. From a user's perspective, it feels like a native action button inside Power BI.

Adding the Power Automate Visual

In Power BI Desktop: One can add the visual in two ways. Once added, the visual appears on the report page with built-in instructions.

In Power BI Service: The process is identical. One can resize or reposition the button like any other visual.

Choosing the Flow Environment

Before creating or attaching a flow, select the environment where the flow will live. Choosing the right environment upfront avoids permission and governance issues later.

Making the Flow Data-Contextual

One of the most powerful features of the Power Automate visual is data context: the data fields added to the visual, together with the filters applied on the page, are passed to the flow when the button is clicked. This makes flows responsive to how users are interacting with the report.

Creating or Editing the Flow

Editing works the same from Power BI Desktop or Service. With the flow selected, add any data fields to the Power Automate Data region to use as dynamic inputs for the flow. Select More options (…) > Edit to configure the button. In the visual's edit mode, either select an existing flow to apply to the button or create a new flow. One can start from scratch or begin with one of the built-in templates as an example. To start from scratch, select New > Instant cloud flow, then select New step. Here, one can choose a subsequent action or specify a Control to add more logic that determines the subsequent action. Optionally, reference the data fields as dynamic content to make the flow data-contextual. This example uses the Region data field to create an item in a SharePoint list; based on the end user's selection, Region could have multiple values or just one. After configuring the flow logic, name the flow and select Save. Select the arrow button to go to the Details page of the flow you created, then select the Apply button to attach the flow to your button.

Formatting the Button

The Power Automate button is fully customizable, so it can match your report's design and UX standards.

Test the Flow

After the flow is applied to the button, test it before sharing it with others. These Power BI flows can only run in the context of a Power BI report, so they cannot be run from the Power Automate web app or anywhere else. If the flow is data-contextual, make sure to test how the filter selections in the report affect the flow outcome.

Sharing the Flow with Report Users

When the flow runs successfully, it can be shared with the report users who need to run it. Give users edit access: alternatively, you can give users edit access to the flow, not just run permissions.

Considerations and Limitations

Before adopting the Power Automate visual, keep its constraints in mind. These constraints help maintain performance, security, and governance.
When to Use the Power Automate Visual

This pattern works best when you want to act on insights without leaving the report. In short, it bridges the gap between analysis and execution.

Final Thoughts

The Power Automate visual transforms Power BI from a read-only analytics tool into an interactive action surface: Analyze → Filter → Click → Automate. When used thoughtfully, it empowers users to act on insights at the exact moment they discover them, without breaking their flow.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Transforming Lessor Reporting with Dynamics 365 Finance & Operations + Power BI

For global lessors, reporting is more than just a compliance requirement; it is a strategic capability. Investors, regulators, and executives all expect real-time insights into lease performance, profitability, and funding structures. Traditional spreadsheets and disconnected tools can be replaced with Dynamics 365 Finance & Operations + Power BI. With Microsoft Dynamics 365 Finance & Operations (F&O) combined with Power BI, lessors can achieve compliance-ready reporting while unlocking deep financial and operational insights.

The Reporting Challenges Lessors Face

Lessor reporting must answer demanding questions from investors, regulators, and executives. Without automation, finance teams would need manual reconciliations to keep up.

Dynamics 365 Finance & Operations + Power BI: The Reporting Engine

Compliance Reporting: Power BI dashboard with a lease liabilities trend line and a revenue recognition chart.

Funding & ROI Transparency: Stacked bar chart showing the funding mix, with KPI cards for ROI by source.

Billing & Revenue Recognition: Column chart comparing recurring vs usage-based revenue streams.

Profitability Analysis: Heatmap of profitability by customer with a margin % KPI.

Renewal & Churn Insights

To conclude, with Dynamics 365 F&O and Power BI, lessors achieve compliance-ready, insight-rich reporting. Reporting is no longer a back-office activity. With Dynamics 365 Finance & Operations and Power BI, lessors can transform reporting into a strategic driver, ensuring compliance while delivering actionable insights that improve investor confidence and portfolio profitability.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Why Growing Businesses Are Replacing Custom ERPs with Business Central

For many small and medium-sized organizations, the ERP that once powered early growth is now slowing progress. Custom-built systems, often implemented long before the cloud era, were developed for a different time: smaller product catalogs, simpler compliance requirements, and fewer integration demands. Today's businesses need more: more visibility, more agility, and more operational resilience. That is where Microsoft Dynamics 365 Business Central stands out. Its cloud-native architecture, rich financial and operational capabilities, and strong talent availability make it an ideal next step for organizations evolving from aging, home-grown systems.

When "It Still Works" Is Not Enough

Leaders often tell us their legacy ERP is still functioning. But "functioning" is not the same as "fit for the future." Common challenges we hear include:

1) Systems Built for a Smaller Business. Custom ERPs often cannot scale with new product lines, acquisitions, or international expansion. What once felt tailored now feels restrictive.

2) Rising Skill Gaps. The original developers and architects are long gone. Each new change requires specialized workarounds, creating dependency on limited IT support and extending delivery timelines.

3) Infrastructure and Security Risks. On-premises systems demand constant upkeep: servers, backups, security patches, disaster recovery, and more. Maintaining all this diverts attention from core business priorities and increases risk exposure.

4) Limited Audit and Compliance Capabilities. Regulatory expectations have evolved. Many legacy ERPs lack traceability, standardized reporting, and audit-ready controls, making compliance costly and inefficient.

These challenges create operational drag. Instead of enabling efficiency, the ERP becomes a barrier to progress. That is why many organizations are accelerating their move to the cloud, and Business Central has become the preferred direction.

Why Business Central Is the Right Upgrade Path

Modern Skills and Easier Adoption: Business Central aligns with competencies already familiar to finance and IT teams. Talent is more widely available compared to niche ERP platforms, lowering hiring and training efforts.

The Right Size for SMB Growth: It offers robust ERP capabilities without the cost and complexity associated with larger enterprise systems.

Cloud as a Differentiator: With Microsoft handling security, performance, and updates, organizations free up resources for innovation instead of infrastructure maintenance.

Designed for Integration: CloudFronts has helped many organizations successfully transition from custom ERPs to Business Central Online. To further simplify operations, we have developed the PO BC Integration Module 2.0. This connects Dynamics 365 Project Operations and Business Central, delivering process continuity that is missing in standard connectors.

A Foundation for the Future

Migrating to Business Central is not just a technology upgrade. It is a strategic shift. It builds the foundation for advanced reporting, AI-driven insights, automation, and scalable growth. Businesses that make this move gain a system that:

✔ Supports today's operations
✔ Adapts to future changes
✔ Reduces risk and complexity
✔ Strengthens competitiveness

Ready to Modernize Your ERP?

CloudFronts helps organizations move from custom, outdated systems to Business Central with a structured, low-risk transformation approach. If you are considering your next ERP move, we are here to support you at every step.
Connect with our experts: transform@cloudfronts.com


How Unity Catalog Improves Data Governance for Power BI and Databricks Projects

As organizations scale their analytics platforms, governance often becomes the hardest problem to solve. Data may be accurate, pipelines may run on time, and reports may look correct, but without proper governance, the platform becomes fragile. We see this pattern frequently in environments where Power BI reporting has grown around a mix of SQL Server databases, direct Dataverse connections, shared storage accounts, and manually managed permissions. Over time, access control becomes inconsistent, ownership is unclear, and even small changes introduce risk. Unity Catalog addresses this problem by introducing a centralized, consistent governance layer across Databricks and downstream analytics tools like Power BI.

The Governance Problem Most Teams Face

In many data platforms, governance evolves as an afterthought. Access is granted at different layers depending on urgency rather than design. Common symptoms include inconsistent access control, unclear ownership, and changes that introduce risk. As reporting expands across departments like Finance, HR, PMO, and Operations, this fragmented governance model becomes difficult to control and audit.

Why Unity Catalog Changes the Governance Model

Unity Catalog introduces a unified governance layer that sits above storage and compute. Instead of managing permissions at the file or database level, governance is applied directly to data assets in a structured way. At its core, Unity Catalog provides a central place to define access, ownership, and data organization. This shifts governance from an operational task to an architectural capability.

A Structured Data Hierarchy That Scales

Unity Catalog organizes data into a simple, predictable hierarchy: Catalog → Schema → Table. This structure brings clarity to large analytics environments. Business domains can be separated cleanly, such as CRM, Finance, HR, or Projects, while still being governed centrally. For Power BI teams, this means datasets are easier to discover, understand, and trust. There is no ambiguity about where data lives or who owns it.

Centralized Access Control Without Storage Exposure

One of the biggest advantages of Unity Catalog is that access is granted at the data object level, not the storage level. Instead of giving Power BI users or service principals direct access to storage accounts, permissions are granted on catalogs, schemas, or tables. This significantly reduces security risk and simplifies access management. From a governance perspective, Power BI connects only to governed datasets, never raw storage paths. (A short, illustrative sketch of this grant model appears later in this post.)

Cleaner Integration with Power BI

When Power BI connects to Delta tables governed by Unity Catalog, the reporting layer becomes simpler and more secure. This model works especially well when combined with curated Gold-layer tables designed specifically for reporting.

Governance at Scale, Not Just Control

Unity Catalog is not only about restricting access. It is about enabling teams to scale responsibly. By defining ownership, standardizing naming, and centralizing permissions, teams can onboard new data sources and reports without reworking governance rules each time. This is particularly valuable in environments where multiple teams build and consume analytics simultaneously.

Why This Matters for Decision Makers

For leaders responsible for data, analytics, or security, Unity Catalog offers a way to balance speed and control. It allows teams to move quickly without sacrificing governance. Reporting platforms become easier to manage, easier to audit, and easier to extend as the organization grows.
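To make the grant model above concrete, here is a minimal sketch of Unity Catalog grants run from a Databricks notebook. The catalog, schema, table, and group names (analytics, finance, gold_sales_summary, powerbi_readers) are illustrative assumptions, not names from any specific environment.

    # Illustrative Unity Catalog grants; all object and group names are assumptions.
    spark.sql("CREATE CATALOG IF NOT EXISTS analytics")
    spark.sql("CREATE SCHEMA IF NOT EXISTS analytics.finance")

    # Grant a Power BI reader group access to a curated Gold table,
    # without exposing the underlying storage account.
    spark.sql("GRANT USE CATALOG ON CATALOG analytics TO `powerbi_readers`")
    spark.sql("GRANT USE SCHEMA ON SCHEMA analytics.finance TO `powerbi_readers`")
    spark.sql("GRANT SELECT ON TABLE analytics.finance.gold_sales_summary TO `powerbi_readers`")

Because the grant is on the table rather than the storage path, Power BI reaches the data only through governed objects, which is exactly the separation described above.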
More importantly, Unity Catalog reduces long-term operational risk by replacing ad-hoc permission models with a consistent governance framework.

To conclude, strong governance is not about slowing teams down. It is about creating a structure that allows analytics platforms to grow safely and sustainably. Unity Catalog provides that structure for Databricks and Power BI environments. By centralizing access control, standardizing data organization, and removing the need for direct storage exposure, it enables a cleaner, more secure analytics foundation. For organizations modernizing their reporting platforms or planning large-scale analytics initiatives, Unity Catalog is not optional. It is foundational.

If your Power BI and Databricks environment is becoming difficult to govern as it scales, it may be time to rethink how access, ownership, and data structure are managed. We have implemented Unity Catalog–based governance in real enterprise environments and have seen the impact it can make. If you are exploring similar initiatives or evaluating how to strengthen governance across your analytics platform, we are always open to sharing insights from real-world implementations.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


How Project Operations – Business Central Integration Impacts Financial Posting

Project Operations and Business Central are designed to work together, one managing project execution, the other ensuring financial accuracy. When integrated thoughtfully, they create a clean and reliable flow from project activity to financial reporting.

Clear Ownership of Responsibilities

In a PO–BC integration, Project Operations owns project execution while Business Central owns the accounting outcome. This separation allows project teams to focus on delivery while finance maintains full control over accounting outcomes.

Smooth Cost Flow from Projects to Finance

Costs captured in Project Operations (time, expenses, and materials) are transferred to Business Central as project journals. Business Central then handles the posting, ensuring project activity is reflected accurately in financial statements.

Consistent Project, Task, and Dimension Mapping

A well-designed mapping between Project Operations projects, tasks, and dimensions and their Business Central counterparts ensures costs and revenue are visible consistently on both sides. This makes both project reviews and financial reporting easier and more reliable.

Period Control and Financial Accuracy

Project Operations captures real-world project activity. Business Central applies the accounting rules and period controls. Together, they ensure project data flows into the correct accounting periods without compromising financial governance.

Strong Visibility into Commitments and Actuals

With the right setup, this combination provides management with a clear view of commitments and actuals across the project portfolio.

To conclude, Project Operations tells the story of the project. Business Central tells the story of the business. When aligned, both stories match, and decision-making becomes easier.

Final Thought

Project Operations and Business Central integration works best when designed as a financial process, not just a system connection. With the right structure, it delivers clarity for project teams and confidence for finance. We have packaged our Project Operations–Business Central integration to help organizations achieve this alignment with minimal complexity. You can explore our PO–BC integration on Microsoft AppSource here: PO-BC Integration.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


How to Enable Attachment Functionality in Dynamics 365 Finance and Operations

Efficient document management is vital for seamless business operations. Are you looking to enable and customize the attachment functionality in Dynamics 365 Finance and Operations (D365FO)? This guide will walk you through the steps to activate this feature and enhance your document-handling capabilities.

Understanding the Business Need

Businesses often handle scenarios where attachments, such as invoices, purchase orders, or additional documents, are tied to forms like Sales Orders or Journals. These attachments streamline communication, improve transparency, and provide essential references. For example, attaching specific documents to Sales Order lines ensures clarity and supports collaboration.

Steps to Enable the Attachment Functionality

Bonus: Enabling Attachment Counts

Attachment counts provide a quick overview of the number of documents linked to a record. This feature offers instant visibility into attachment volumes, supporting better decision-making.

Why Use the Built-in Functionality?

While attachments can be enabled via backend configurations, the platform's built-in tools are more efficient and align with best practices. Most forms already support this functionality by default, emphasizing its importance in D365FO's design.

To conclude, by enabling the attachment functionality in D365FO, businesses can effectively manage critical documents, streamlining operations and communication. Don't forget to implement the attachment count feature for quick insights. Explore this functionality today to enhance your document management process.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Real-Time vs Batch Integration in Dynamics 365: How to Choose

When integrating Dynamics 365 with external systems, one of the first decisions you'll face is real-time vs batch (scheduled) integration. It might sound simple, but choosing the wrong approach can lead to performance issues, unhappy users, or even data inconsistency. In this blog, I'll walk through the key differences, when to use each, and lessons we've learned from real projects across Dynamics 365 CRM and F&O.

The Basics: What's the Difference?

Real-Time: Data syncs immediately after an event (record created/updated, API call).
Batch: Data syncs periodically (every 5 mins, hourly, nightly, etc.) on a schedule.

Think of real-time like WhatsApp: you send a message and it arrives instantly. Batch is like checking your email every hour: you get all the updates at once.

When to Use Real-Time Integration

Example: When a Sales Order is created in D365 CRM, we trigger a Logic App instantly to create the corresponding Project Contract in F&O.

Key Considerations

When to Use Batch Integration

Example: We batch sync Time Entries from CRM to F&O every night using Azure Logic Apps and Azure Blob checkpointing (a small sketch of this checkpoint pattern is included at the end of this post).

Key Considerations

Our Experience from the Field

On one recent project, the result was a system that was stable, scalable, and cost-effective.

To conclude, you don't have to pick just one. Many of our D365 projects use a hybrid model that mixes both approaches. Start by analysing your data volume, user expectations, and system limits, then pick what fits best.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
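As promised above, here is a rough Python sketch of the blob-checkpoint idea behind the nightly batch example. The post itself implements this with Azure Logic Apps; this version only illustrates the pattern, and the storage URL plus the fetch/push callables are placeholders standing in for real Dataverse and F&O calls.

    # Nightly batch sync with a blob checkpoint (illustrative sketch only).
    from datetime import datetime, timezone

    from azure.core.exceptions import ResourceNotFoundError
    from azure.storage.blob import BlobClient

    # Placeholder URL: replace <storageaccount> with a real account name.
    CHECKPOINT_URL = "https://<storageaccount>.blob.core.windows.net/sync/time-entries-checkpoint.txt"


    def read_checkpoint(blob: BlobClient) -> str:
        """Return the timestamp of the last successful run, or a distant past on first run."""
        try:
            return blob.download_blob().readall().decode("utf-8").strip()
        except ResourceNotFoundError:
            return "1900-01-01T00:00:00Z"


    def run_batch_sync(fetch_time_entries, push_to_fno, credential) -> None:
        blob = BlobClient.from_blob_url(CHECKPOINT_URL, credential=credential)
        last_sync = read_checkpoint(blob)
        run_started = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

        # Only records modified since the last successful run are picked up,
        # so a failed night can simply be re-run without duplicating data.
        for entry in fetch_time_entries(modified_after=last_sync):
            push_to_fno(entry)

        # Persist the new checkpoint only after all records were pushed.
        blob.upload_blob(run_started, overwrite=True)

The key design choice is that the checkpoint is written only after the whole batch succeeds, which keeps reruns safe.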


Designing Event-Driven Integrations Between Dynamics 365 and Azure Services

When integrating Dynamics 365 (D365) with other systems, most teams traditionally rely on scheduled or API-driven integrations. While effective for simple use cases, these approaches often introduce delays, unnecessary API calls, and scalability issues. That's where event-driven architecture comes in. By designing integrations that react to business events in real time, organizations can build faster, more scalable, and more reliable systems. In this blog, we'll explore how to design event-driven integrations between D365 and Azure services, and walk through the key building blocks that make it possible.

1. What Is Event-Driven Architecture (EDA)?

Example in D365: Instead of running a scheduled job every hour to check for new accounts, an event is raised whenever a new account is created, and downstream systems are notified immediately.

2. How Events Work in Dynamics 365

Dynamics 365 doesn't publish events directly, but it provides mechanisms to capture them. By connecting these with Azure services, we can push events to the cloud in near real time.

3. Azure Services for Event-Driven D365 Integrations

Once D365 emits an event, Azure provides services to process and route it.

4. Designing an Event-Driven Integration Pattern

The recommended architecture is simple: D365 emits the event, Azure routes it, and a lightweight consumer processes it (a minimal Azure Functions sketch of the consumer side is included at the end of this post).

5. Best Practices for Event-Driven D365 Integrations

6. Common Pitfalls to Avoid

To conclude, moving from batch-driven to event-driven integrations with Dynamics 365 unlocks real-time responsiveness, scalability, and efficiency. With Azure services like Event Grid, Service Bus, Functions, and Logic Apps, you can design integrations that are robust, cost-efficient, and future-proof. If you're still relying on scheduled D365 integrations, start experimenting with event-driven patterns. Even small wins (like real-time customer syncs) can drastically improve system responsiveness and business agility.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
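As referenced above, here is a minimal sketch of the consumer side of this pattern: an Azure Function (Python v2 programming model) triggered by a Service Bus queue that carries D365 events. The queue name, connection setting name, and payload fields are assumptions for illustration, not values from a specific environment.

    # Minimal consumer sketch: a Service Bus-triggered Azure Function (Python v2 model).
    # "d365-account-events", "ServiceBusConnection", and the payload shape are assumed.
    import json
    import logging

    import azure.functions as func

    app = func.FunctionApp()


    @app.service_bus_queue_trigger(
        arg_name="msg",
        queue_name="d365-account-events",
        connection="ServiceBusConnection",
    )
    def process_account_event(msg: func.ServiceBusMessage) -> None:
        # Decode the event payload published when the account was created/updated.
        event = json.loads(msg.get_body().decode("utf-8"))
        account_id = event.get("accountid")

        logging.info("Received account event for %s", account_id)

        # Hand the event to the downstream system here (API call, another queue, etc.).
        # Keeping this handler small and idempotent is what makes the pattern scale.

Service Bus absorbs bursts and retries for you, so the function only has to process one message at a time.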


Databricks Notebooks Explained – Your First Steps in Data Engineering

If you're new to Databricks, chances are someone told you, "Everything starts with a Notebook." They weren't wrong. In Databricks, a Notebook is where your entire data engineering workflow begins: reading raw data, transforming it, visualizing trends, and even deploying jobs. It's your coding lab, dashboard, and documentation space all in one.

What Is a Databricks Notebook?

A Databricks Notebook is an interactive environment that supports multiple programming languages such as Python, SQL, R, and Scala. Each Notebook is divided into cells in which you can write code, add text (Markdown), and visualize data. Unlike local scripts, Notebooks in Databricks run on distributed Spark clusters, which means even a 100 GB dataset can be processed in seconds using parallel computation. So Notebooks are more than just code editors; they are collaborative data workspaces for building, testing, and documenting pipelines.

How Databricks Notebooks Work

Under the hood, every Notebook connects to a cluster, a group of virtual machines managed by Databricks. When you run code in a cell, it is sent to Spark on the cluster, processed there, and the results are returned to your Notebook. This gives you the scalability of big data without worrying about servers or configurations.

Setting Up Your First Cluster

Before running a Notebook, you must create a cluster; think of it as starting the engine of your car.

Step-by-Step: Creating a Cluster in a Standard Databricks Workspace

Once the cluster is active, you'll see a green light next to its name, which means it's ready to process your code.

Creating Your First Notebook

Now, let's build your first Databricks Notebook. Once created, your Notebook is live and ready to connect to data and start executing.

Loading and Exploring Data

Let's say you have a sales dataset in Azure Blob Storage or Data Lake. You can easily read it into Databricks using Spark:

    df = spark.read.csv("/mnt/data/sales_data.csv", header=True, inferSchema=True)
    display(df.limit(5))

Databricks automatically recognizes your file's schema and displays a tabular preview. Now you can transform the data:

    from pyspark.sql.functions import col, sum

    summary = df.groupBy("Region").agg(sum("Revenue").alias("Total_Revenue"))
    display(summary)

    # Register the DataFrame as a temporary view so SQL cells can query it
    df.createOrReplaceTempView("sales_data")

Or switch to SQL instantly:

    %sql
    SELECT Region, SUM(Revenue) AS Total_Revenue
    FROM sales_data
    GROUP BY Region
    ORDER BY Total_Revenue DESC

Visualizing Data

Databricks Notebooks include built-in charting tools. After running your SQL query, click + → Visualization and choose Bar Chart, then assign Region to the X-axis and Total_Revenue to the Y-axis. Congratulations, you've just built your first mini-dashboard!

Real-World Example: ETL Pipeline in a Notebook

In many projects, Databricks Notebooks are used to build ETL pipelines. Each stage is often written in a separate cell, making debugging and testing easier. Once tested, you can schedule the Notebook as a Job running daily, weekly, or on demand. (A minimal ETL sketch is included at the end of this post.)

Best Practices

To conclude, Databricks Notebooks are not just a beginner's playground; they're the backbone of real data engineering in the cloud. They combine flexibility, scalability, and collaboration into a single workspace where ideas turn into production pipelines. If you're starting your data journey, learning Notebooks is the best first step. They help you understand data movement, Spark transformations, and the Databricks workflow, everything a data engineer needs.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
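To round out the ETL section above, here is a minimal sketch of the read → transform → write pattern as it might look across Notebook cells, assuming the built-in spark session of a Databricks notebook. The paths, column names, and table name are illustrative assumptions.

    # Extract: read raw files landed in the lake (path and schema are illustrative)
    from pyspark.sql.functions import col, to_date

    raw = spark.read.csv("/mnt/raw/sales/", header=True, inferSchema=True)

    # Transform: fix types and drop obviously invalid rows
    clean = (
        raw
        .withColumn("OrderDate", to_date(col("OrderDate")))
        .filter(col("Revenue") > 0)
    )

    # Load: write a curated Delta table that downstream reports can query
    clean.write.format("delta").mode("overwrite").saveAsTable("sales_curated")

Each of these steps would typically live in its own cell, matching the stage-per-cell layout described above, and the finished Notebook can then be scheduled as a Job.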

