Real-Time PDF Report Generation on Power Pages: Replacing SSRS with Azure Function Apps for a US-Based Cybersecurity Firm

Summary

A Houston-based cybersecurity firm eliminated report generation failures (previously a ~65% failure rate) by replacing SSRS with an Azure Function App pipeline. A Dynamics 365 bound action kept authentication inside the platform, bypassing Defender-related token failures. The solution integrates Power Pages, Power Automate, Dynamics 365, and Azure Functions for real-time PDF generation. Report generation time dropped from 3–8 minutes to under 15 seconds with zero infrastructure overhead.

Table of Contents
1. About the Customer
2. The Challenge
3. The Solution
4. Technical Implementation
5. Business Impact
6. FAQs
7. Conclusion

1. About the Customer
The client is a technology consulting and cybersecurity services firm based in Houston, Texas. They manage multiple concurrent client engagements using Dynamics 365 Project Operations as their core platform. Project managers and clients access live project data through a customer-facing portal built on Microsoft Power Pages.

2. The Challenge
The organization needed one-click downloadable Project Status Reports from their Power Pages portal covering risks, issues, logs, and timelines. Their SSRS-based solution failed frequently due to authentication breakdowns caused by Microsoft Defender for Cloud Apps across multiple service boundaries.

Key pain points:
- Silent authentication failures with no clear errors
- Retry delays of 60–90 seconds per attempt
- A separate SSRS infrastructure dependency
- A slow report customization cycle

Project managers avoided generating reports during live meetings due to reliability concerns.

3. The Solution
At CloudFronts, while working on this project, I replaced the SSRS pipeline entirely with a synchronous, serverless architecture that keeps the authentication context inside the Dynamics 365 service layer.

Technologies used: Dynamics 365 Project Operations, Power Pages, Power Automate, plugins, and Azure Function Apps.

The solution generates fully formatted PDFs in real time from structured JSON payloads. This eliminated authentication failures while significantly improving speed and reliability.

4. Technical Implementation
1] Power Pages button triggers the flow
A "Download Report" button captures the project GUID and triggers a Power Automate flow with real-time progress feedback.

2] Dynamics 365 plugin prepares the JSON payload
A bound action plugin retrieves all project data and converts it into a clean JSON payload for PDF generation.

3] Azure Function generates the PDF
The Azure Function processes the JSON and generates a formatted PDF, returning it as a Base64 string.

4] SharePoint integration
The generated PDF is automatically stored in the SharePoint document location linked to the project. This ensures centralized document management, version control, and easy access for stakeholders directly within the project workspace.

5] Portal PDF preview
The Base64 PDF is rendered directly in the portal using an iframe, allowing instant preview and download.

Video: End-to-end implementation of real-time PDF report generation.
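To make the Azure Function step concrete, here is a minimal sketch of what such an HTTP-triggered function could look like. This is illustrative only: the function name, the payload shape, and the BuildPdf helper are assumptions, and the PDF library actually used in the project is not shown.

```csharp
// Minimal sketch (assumptions: .NET isolated-worker Azure Function; BuildPdf is a
// hypothetical helper standing in for whichever PDF library the project uses).
using System;
using System.Net;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;

public class GenerateReport
{
    [Function("GenerateReport")]
    public async Task<HttpResponseData> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestData req)
    {
        // The bound-action plugin posts a structured JSON payload describing the project.
        var payload = await JsonSerializer.DeserializeAsync<JsonElement>(req.Body);

        // Hypothetical helper: turns the payload into formatted PDF bytes.
        byte[] pdfBytes = BuildPdf(payload);

        // Return the document as a Base64 string so the flow can hand it to the portal.
        var response = req.CreateResponse(HttpStatusCode.OK);
        await response.WriteAsJsonAsync(new
        {
            fileName = "ProjectStatusReport.pdf",
            content = Convert.ToBase64String(pdfBytes)
        });
        return response;
    }

    private static byte[] BuildPdf(JsonElement payload)
    {
        // Placeholder: render tables for risks, issues, logs, and timelines here
        // with the PDF library of your choice.
        throw new NotImplementedException();
    }
}
```

The flow can then pass the returned Base64 string to the portal, where it is rendered in an iframe and stored in SharePoint as described above.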
5. Business Impact
- 100% success rate — zero failures post deployment
- Report generation time under 15 seconds
- No infrastructure to maintain — fully serverless
- Zero authentication failures
- Faster iteration for report updates

Project managers can now confidently generate reports during live client meetings.

6. FAQs
Why not fix the SSRS authentication issue instead of replacing SSRS entirely?
The authentication failures were a structural consequence of traversing multiple service boundaries in an environment with strict Defender for Cloud Apps session policies. Fixing them would have required either relaxing those policies — which the client's security posture did not permit — or re-architecting the data retrieval to stay inside the platform, which is exactly what the bound action approach achieves. Replacing SSRS also removed a separate infrastructure dependency and gave the client full control over report formatting in code.

Can this pattern be reused for other document types in Dynamics 365?
Yes. The Azure Function App's renderer is data-driven — it consumes a JSON payload and builds tables from whatever keys are present. The Dynamics 365 plugin can be adapted to query any entity and produce an equivalent payload. CloudFronts has applied the same pattern to inspection records, summary reports, and client-facing status documents across Professional Services and Manufacturing implementations.

Does this work for environments without Microsoft Defender for Cloud Apps?
Yes. The architectural benefits — synchronous generation, serverless PDF rendering, no SSRS infrastructure, and in-browser preview — apply regardless of the security layer on the environment.

7. Conclusion
Replacing SSRS with an Azure Function App-based PDF renderer resolved both the reliability and authentication problems in a single architectural shift, delivering instant, professional-quality Project Status Reports from a Microsoft Power Pages portal with no legacy reporting infrastructure to maintain. The key lesson from this project is that keeping authentication within the Dynamics 365 service layer — rather than bridging to external systems — eliminates an entire category of environment-specific failures that are otherwise very difficult to diagnose and fix. Combined with a serverless architecture, this approach delivers instant, high-quality reports without infrastructure overhead and shows how modern cloud-native patterns can eliminate entire classes of system failures while dramatically improving user experience.

Ready to modernize document generation in your Dynamics 365 environment? CloudFronts builds scalable Power Platform and Dynamics 365 solutions that replace legacy reporting infrastructure and automate document workflows. Reach out at transform@cloudfronts.com.

Shashank Keny
Associate Consultant · CloudFronts
Shashank Keny is an Associate Consultant at CloudFronts with 1.5+ years of experience in cloud, data, and business applications. He specializes in building scalable, API-driven architectures and integrating enterprise systems across the Microsoft ecosystem. He is a Certified Databricks Data Engineer with hands-on experience in Dynamics 365 Project Operations and Dynamics 365 Sales, along with delivering business intelligence solutions using Power BI. His expertise also extends to modern AI solutions, including building custom copilots and implementing intelligent applications using Azure AI Foundry. Passionate about solving real-world business challenges through data and AI, he focuses on delivering efficient, scalable, and production-ready solutions.
Experience: 1.5+ years
Certification: Databricks Certified Data Engineer
Specialization: Dynamics 365 Project Operations, Power BI, Azure Integrations, AI Solutions
View LinkedIn Profile


How We Built & Deployed a Mobile-Based Canvas App for Unified Time, Expense (with Receipts) & Material Submission with Project-Based Approvals for a US Cybersecurity Firm

Summary

A US-based cybersecurity firm serving the oil & gas sector implemented a mobile-first Canvas App integrated with Dynamics 365 Project Operations to unify time, expense, and material submission, tracking, and approval. The solution enabled project-specific approval workflows in which only assigned approvers could validate submitted records. CloudFronts introduced a dual-mode interface (Day Mode and Week Mode) to improve usability for both field engineers and managers. Submission and approval cycle time dropped from hours or days to near real time.

Table of Contents
1. Customer Scenario
2. Solution Overview
3. Key UX Features
4. Functional Implementation
5. Solution Walkthrough
6. Architecture & Integration Approach
7. Business Impact
8. FAQs
9. Conclusion

Customer Scenario
A Texas-based cybersecurity firm specializing in operational technology (OT) security for oil rigs manages multiple concurrent field projects using Dynamics 365 Project Operations. Employees and resources were responsible for logging:
- Time entries
- Expense entries (travel, accommodation, airfare, etc.)
- Material usage logs (equipment, parts, consumables, etc.)

However, the system was not designed for mobile-first usage, and processes were fragmented across multiple interfaces.

Key Challenges
- Field engineers and other resources could not efficiently submit entries from mobile devices
- Time, expense, and material tracking existed in separate workflows
- Approval processes had to be restricted to project-specific stakeholders
- Project managers lacked real-time visibility into resource usage
- Delays in submission caused downstream billing and reporting issues

Project tracking accuracy suffered, and reporting delays directly affected client communication and billing cycles.

Solution Overview
CloudFronts designed and deployed a unified mobile application using Power Apps (Canvas Apps) integrated with Dynamics 365 Project Operations.

Objective: One app → All submissions → Controlled approvals → Real-time visibility

What the App Enables
For field users:
- Submit time entries (daily or weekly)
- Create expense entries with receipt validation
- Log material consumption against projects
- Track submission status instantly

For project approvers:
- View only entries related to assigned projects
- Approve or reject submissions directly from mobile
- Maintain audit-ready approval workflows

Key UX Features
The application is designed with a strong focus on usability for both resources and project approvers, ensuring a seamless mobile experience across submission and approval workflows.

1. Day Mode / Week Mode Toggle
The app provides a flexible entry experience through a dual-mode interface:
- Day Mode: enables detailed entry for a single day, ideal for precise logging and corrections.
- Week Mode: allows bulk entry across multiple days, reducing effort for repetitive data entry.
This flexibility significantly improves usability across different working styles and scenarios.

2. Calendar-Based Swipe Navigation
The application introduces a Dynamics-style calendar navigation with swipe support, allowing users to:
- Traverse multiple days or weeks effortlessly
- View and manage multiple submission records in sequence
- Navigate between historical and current entries with minimal effort
This mobile-first interaction design reduces friction in high-frequency data entry scenarios.
3. Unified Submission & Approval Experience
The UI/UX is intentionally designed to mirror the complete lifecycle of a record, ensuring consistency between submission and approval stages. Each record follows a structured lifecycle aligned with Dynamics 365 stages:
- Submitted
- Pending
- Approved
- Rejected
- Recall Requested
- Recall Request Approved
- Recall Request Rejected

The interface dynamically adapts based on the current stage:
- Action buttons (Approve, Reject, Recall, etc.) are conditionally visible
- Status indicators are clearly displayed
- Users experience the same structured flow from creation to closure
This ensures clarity, reduces errors, and improves user confidence in the system.

4. Dynamic Action-Based UI (Smart Button Behavior)
The app intelligently modifies UI controls based on record state:
- The Submit button appears only for draft entries
- Approve/Reject buttons are visible only to project approvers
- The Recall option is available only after submission
- Post-approval states restrict further edits
This enforces role-based and state-based control, preventing invalid actions and maintaining process integrity.

5. Conditional Receipt Upload for Expense Entries
Expense submission logic is enhanced with category-driven validation:
- Mandatory: airline tickets, OT hardware purchases
- Optional: meals, local travel
This balances compliance requirements with user convenience, avoiding unnecessary friction.

6. On-Demand Data Refresh
Users can manually refresh data within the app to:
- Fetch the latest submission and approval statuses
- Sync newly created or updated records
- Ensure real-time visibility without relying solely on background refresh
This is especially useful in environments with intermittent connectivity.

7. Mobile-First Interaction Design
- Touch-friendly controls
- Swipe navigation
- Lightweight screens for faster performance
- Minimal navigation depth
This ensures field engineers working in remote or on-site environments can operate efficiently.

Functional Implementation
This section outlines how the solution was implemented within Dynamics 365 Project Operations and the Power Platform to enable end-to-end submission and approval management.

1. Unified Data Model in Dataverse
All three entry types — Time, Expense, and Material — are structured within Dataverse and linked to:
- Project
- Resource (user)
- Approval records
- Supporting documents (for expenses)
Each submission creates a corresponding record with a defined lifecycle stage, ensuring consistency across all entry types.

2. Submission Logic from the Canvas App
Each submission type follows a structured flow:
- The user selects a project and entry type (Time / Expense / Material)
- Required fields are validated based on entry type
- Conditional logic enforces the receipt requirement (for specific expense categories) and mandatory fields (based on business rules)
- The record is created in Dataverse
- Submission triggers the backend approval workflow
This ensures that all records entering the system are complete, validated, and ready for approval processing.

3. Approval Record Creation & Routing
Upon submission:
- A corresponding approval record is automatically created
- The system identifies project-specific approvers
Key behavior:
- Only assigned project approvers can view and act on records
- Approval actions update the main record status
A simplified sketch of how such routing could be implemented in a plugin is shown below.
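The following is a minimal sketch of what a routing plugin could look like when registered post-operation on Create of a submission record. All table and column names (cf_timeentry, cf_project, cf_approval, cf_projectapproverid, and so on) are hypothetical placeholders, not the customer's actual schema.

```csharp
// Minimal sketch, not the actual implementation. Entity and column names are hypothetical.
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public class CreateApprovalOnSubmission : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        if (!context.InputParameters.Contains("Target")) return;
        var entry = (Entity)context.InputParameters["Target"];

        // Look up the approver configured on the related project (hypothetical column).
        var projectRef = entry.GetAttributeValue<EntityReference>("cf_projectid");
        var project = service.Retrieve("cf_project", projectRef.Id, new ColumnSet("cf_projectapproverid"));
        var approverRef = project.GetAttributeValue<EntityReference>("cf_projectapproverid");

        // Create the approval record and route it to the project-specific approver.
        var approval = new Entity("cf_approval")
        {
            ["cf_name"] = "Approval for " + entry.GetAttributeValue<string>("cf_name"),
            ["cf_relatedentryid"] = entry.ToEntityReference(),
            ["ownerid"] = approverRef
        };
        service.Create(approval);
    }
}
```

Because ownership drives visibility, assigning the approval record to the project approver is one way to ensure only that approver can act on it; the real solution may use a different routing mechanism.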
4. Record Lifecycle Management (Status-Driven System)
Lifecycle: Draft → Submitted → Pending → Approved / Rejected → Recall Flow
- Users submit records → the record moves to Submitted
- Approvers review → Approved or Rejected
- Users request recall → Recall Requested
- Approvers respond → Recall Approved or Rejected

Controlled through:
- Power Apps UI logic
- Bound actions for submission and approval handling
- Dataverse status fields

5. Expense Receipt Handling (Integrated from Previous Solution)
- Receipt upload is enforced conditionally
- Files are stored as Notes (Annotations) in Dataverse
- Receipts are linked to their expense records
This eliminates manual document handling and ensures compliance.

Solution Walkthrough
The following walkthrough … Continue reading How We Built & Deployed a Mobile-Based Canvas App for Unified Time, Expense (with Receipts) & Material Submission with Project-Based Approvals for a US Cybersecurity Firm


Transforming Return Logistics for a USA Manufacturer: Automating Shipment Processing with Dynamics 365 Customer Service

Summary

This blog highlights the integration of Microsoft Dynamics 365 Customer Service Hub with FedEx Shipping Manager to handle automated email return shipments for a consumer electronic appliances company based in Massachusetts, USA.

In the original process, customer service representatives had to manually register each return shipment through the FedEx Shipping Manager portal. This involved copying customer details, creating shipments, generating labels, and capturing tracking numbers — a workflow that typically required 20–30 minutes per request.

The integration project automated the entire return shipment process directly within the Dynamics 365 Customer Service Hub. With a single click, the system now registers the shipment using the FedEx Shipment APIs, generates a return label, captures the tracking number, and updates the case record automatically. This eliminated the need for agents to switch between systems and reduced shipment registration time from 20–30 minutes to just a few seconds, significantly improving operational efficiency and the overall customer service experience.

This blog explains:
1] The operational challenges caused by manual shipment registration.
2] How Dynamics 365 Customer Service Hub was integrated with FedEx Shipping Manager.
3] The functional workflow used to automate shipment creation.
4] How customer service representatives trigger shipments directly from CRM.
5] The business impact achieved through automation and system integration.

Table of Contents
1. Customer Scenario
2. Solution Overview
3. Functional Implementation Approach
4. Email Return Label Experience
5. Handling Complex Data Automatically
6. Business Impact
7. Preview Video
8. Final Thoughts

Customer Scenario
A Massachusetts-based consumer appliance manufacturer known for building innovative kitchen technology was experiencing a growing operational challenge in its customer service operations. As demand for its products increased across major retail channels, the number of customer support cases related to product returns and replacements also grew significantly.

The company's customer support team handled all service requests through Microsoft Dynamics 365 Customer Service. However, when a product needed to be returned for inspection, replacement, or warranty evaluation, agents were required to manually create a shipment in FedEx Ship Manager. This manual process involved several steps:
1] Opening the customer case in the CRM system
2] Copying customer information and shipping details
3] Logging into the FedEx portal
4] Registering the shipment manually
5] Generating a return label
6] Capturing the tracking number
7] Returning to CRM to update the case

Each shipment registration typically took 20–30 minutes. With hundreds of return requests processed weekly, this created several operational challenges:
1] Agents constantly switched between multiple systems
2] Manual data entry increased the risk of errors
3] Customer response times increased, leading to customer frustration
4] Tracking information was not always immediately available in the case record

The organization needed a more efficient way to handle returns while keeping the entire process inside their CRM platform.

Solution Overview
To streamline the returns process, I implemented an integration between Microsoft Dynamics 365 Customer Service and the FedEx Shipping Manager services.
The goal was simple: allow customer service representatives to generate a return shipment directly from the case record with a single click. Instead of navigating to a separate external shipping portal, agents can now initiate a return shipment directly from the CRM case page. Once triggered, the system automatically handles the entire shipment (email/return/label) registration process.

With this solution in place, the workflow now looks like this:
1] A customer contacts support regarding a product return via the website, which registers an associated Case record in D365 Case Management (via existing case automation).
2] The support agent opens the case in Dynamics 365.
3] A "Create Return Shipment" button becomes available when the case meets the required conditions (e.g., case stage, RMA availability, customer region), validating and restricting shipment privileges.
4] With one click, the system registers the shipment with FedEx (via the appropriate FedEx Shipment APIs, per customer requirements).
5] The shipment tracking number is automatically captured and stored in the case record. This tracking number lets both the customer support team and the customer check the progress of the shipment on the FedEx Shipping Manager portal.
6] The customer receives an email return label that they can print and attach to their package.

FedEx Email Return Shipment Process Flow

This transformation reduced a 20–30 minute process to just a few seconds.

Functional Implementation Approach
The implementation focused on simplifying the experience for customer service agents while maintaining strict control over when and how shipments could be created.

Intelligent Shipment Trigger Visibility
Within the CRM case interface, the return shipment button appears only when specific conditions are met. This ensures that shipments are created only for valid return scenarios. Examples of conditions include:
- The case must have an approved return authorization
- The case must be in an appropriate service stage
- The customer address must be eligible for shipment
- Required customer information must be available

Example: Return Shipment Trigger inside Dynamics 365 Customer Service Hub

By embedding these conditions into the CRM interface, agents are guided through the correct service workflow without needing to remember complex procedures.

Automated Shipment Creation
Once the button is clicked, the system automatically gathers key information from the case record, such as:
- Customer details
- Shipping address
- Product description
- Return authorization number
- Contact phone number

This information is then used to register the shipment through the FedEx shipping system. The system generates:
- A unique shipment tracking number
- A return shipment registration
- A digital return label
- The destination warehouse for the shipment, based on the product and end-consumer requirement (e.g., return, replacement, or repair of the product)

Example: A successful return shipment to a specific warehouse.
Example: Tracking a return shipment using the tracking number updated on the D365 Customer Service Hub.
Example: The FedEx Shipping Manager for tracking the integrated shipments.

The tracking number is immediately written back to the case record in Microsoft Dynamics 365 Customer Service, ensuring that support agents can track the return shipment without leaving the case.
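As a rough illustration of the shipment registration call described above, the sketch below shows how a backend component might post the gathered case data to a FedEx shipping endpoint and read back a tracking number. The endpoint path, payload fields, and response property are placeholders, not the exact FedEx Ship API contract used in the project; consult the FedEx API documentation for the real schema.

```csharp
// Illustrative sketch only: endpoint URL, payload fields, and response shape are
// placeholders standing in for the actual FedEx Ship API contract.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

public class FedExShipmentClient
{
    private static readonly HttpClient Http = new HttpClient();

    public async Task<string> CreateReturnShipmentAsync(
        string baseUrl, string accessToken, object shipmentRequest)
    {
        // shipmentRequest carries the customer details, shipping address,
        // product description, and RMA number gathered from the case record.
        var request = new HttpRequestMessage(HttpMethod.Post, $"{baseUrl}/ship/v1/shipments")
        {
            Content = new StringContent(JsonSerializer.Serialize(shipmentRequest),
                                        Encoding.UTF8, "application/json")
        };
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        var response = await Http.SendAsync(request);
        response.EnsureSuccessStatusCode();

        // Pull the tracking number out of the response (placeholder property path)
        // so it can be written back to the Dynamics 365 case record.
        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        return doc.RootElement.GetProperty("trackingNumber").GetString();
    }
}
```

The returned tracking number is then stamped onto the case, which is what keeps agents inside CRM for the whole interaction.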
Email Return Label Experience
After the shipment is registered, the customer automatically receives an email containing their return label. … Continue reading Transforming Return Logistics for a USA Manufacturer: Automating Shipment Processing with Dynamics 365 Customer Service


How to use Dynamics 365 CRM Field-Level Security to maintain confidentiality of Intra-Organizational Data

Summary

In most CRM implementations, data exposure must be controlled both inside and outside the organization. Sales, Finance, Operations, HR: everyone works in the same system. Collaboration increases. Visibility increases. But so does risk. This blog is based on real-world project experience, drawn from a practical example I implemented for a technology consulting and cybersecurity services firm based in Houston, Texas, USA, specializing in modern digital transformation and enterprise security solutions.

This blog explains:
1] Why Security Roles alone are not enough.
2] How users can still access data through Advanced Find, etc.
3] What Field-Level Security offers beyond entity-level restriction.
4] Step-by-step implementation.
5] The business advantages you gain.

Table of Contents
The Real Problem: Intra-Organizational Data Exposure
Implementation of Field-Level Security
Results
Why Was This Solution Required?
Business Impact

The Real Problem: Intra-Organizational Data Exposure
Let's take a practical cross-department scenario. Both X Department and Y Department work in the same CRM system built on Microsoft Dynamics 365.

Entities Involved
1] Entity 1
2] Entity 2

Working Model
X Department:
- Fully owns and manages Entity 1
- Occasionally needs to refer to specific information in Entity 2
Y Department:
- Fully owns and manages Entity 2
- Occasionally needs to refer to specific information in Entity 1

This is collaborative work; you cannot isolate departments completely. But here's the challenge: each entity contains sensitive fields that should not be editable — or sometimes not even visible — to the other department. Security Roles in Microsoft Dynamics 365 operate at the entity (table) level, not at the field (column) level.

Approach → Result
- Remove Write access to Entity 2 for X Dept → X Dept cannot update anything in Entity 2, even non-sensitive fields
- Remove Read access to sensitive fields in Entity 2 → Not possible at the field level using Security Roles
- Restrict Entity 2 entirely from X Dept → X Dept loses visibility; collaboration breaks
- Hide fields from the form only → Data is still accessible via Advanced Find or exports

This is the core limitation. Security Roles answer: "Can the user access this record?" They do NOT answer: "Which specific data inside this record can the user access?"

Implementation of Field-Level Security
Step 1: Go to your solution and identify the sensitive fields, usually personal information, facts and figures, etc. (e.g., cf_proficiencyrating).
Step 2: Select the field and enable it for Field-Level Security. (This is not possible for Microsoft out-of-the-box fields.)
Step 3: Go to Settings and select "Security".
Step 4: Under "Security", select "Field Security Profiles".
Step 5: Either create a new Field Security Profile or use an existing one, as required.
Step 6: Within the profile you can see all fields across Dataverse that are enabled for Field Security. Select your field and set the create/read/update privileges (Yes/No).
Step 7: Then select the system users, or the team containing the stakeholder users, and save.
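The same configuration can also be scripted through the Dataverse SDK, which is handy when promoting the setup across environments. The sketch below is illustrative only: the table and column names are hypothetical, and the privilege option values (4 = Allowed, 0 = Not Allowed) should be verified against the fieldpermission metadata in your environment.

```csharp
// Illustrative sketch of creating a field permission via the Dataverse SDK.
// Verify option values and logical names against your environment before use.
using System;
using Microsoft.Xrm.Sdk;

public static class FieldSecuritySetup
{
    public static void GrantReadOnly(IOrganizationService service, Guid profileId)
    {
        // One fieldpermission row per secured column, linked to a Field Security Profile.
        var permission = new Entity("fieldpermission")
        {
            ["fieldsecurityprofileid"] = new EntityReference("fieldsecurityprofile", profileId),
            ["entityname"] = "cf_characteristicrating",   // hypothetical table name
            ["attributelogicalname"] = "cf_proficiencyrating",
            ["canread"] = new OptionSetValue(4),   // Allowed
            ["cancreate"] = new OptionSetValue(0), // Not Allowed
            ["canupdate"] = new OptionSetValue(0)  // Not Allowed
        };
        service.Create(permission);
    }
}
```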
Results
Assume you are a user from X Department who wants to access an Entity 2 record, and you need to see only the Proficiency Rating and Characteristic Name, but not the Effective Date and Expiration Date. Since all of these fields have Field-Level Security enabled, they show a key icon; the fields for which you or your team lack read/write access show the key icon along with a "—" in place of the value. The same behavior applies in views, subgrids, and Advanced Find.

Why Was This Solution Required?
The organization needed:
1] Cross-functional collaboration
2] Protection of confidential internal data
3] Clear separation of duties
4] No disruption to operational workflows

They required a solution that:
1] Did not block entity access
2] Did not require custom development
3] Enforced true data-level protection

Business Impact
1. Confidential Data Protection
Sensitive internal data was secured without restricting overall entity access, enabling controlled collaboration.
2. Reduced Internal Data Exposure Risk
Unauthorized users could no longer retrieve protected fields via Advanced Find, significantly lowering governance risk.
3. Clear Separation of Duties
Departmental ownership of sensitive fields was enforced without disrupting cross-functional visibility.
4. Improved Audit Readiness
Every modification to protected fields became traceable, strengthening accountability and compliance posture.
5. Reduced Operational Friction
System-enforced field restrictions eliminated the need for entity blocking, duplicate records, and manual approval workarounds.
6. Efficiency Gains
The solution was delivered through configuration alone: no custom code, no complex business rules, and minimal maintenance overhead.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Plugin Class Code Recovery Using XrmToolBox & JetBrains dotPeek

In an ideal Dynamics 365 (Dataverse) project, plugin source code lives safely in a version-controlled repository, flows cleanly through Azure DevOps pipelines, and is always recoverable. In reality, many of us inherit environments where that discipline didn't exist. I recently worked with a customer where this was the case, which created a common but uncomfortable challenge in the Dynamics 365 world: how do you maintain, debug, or enhance plugins when the source code is lost?

Rewriting everything from scratch was risky and time-consuming. Guessing behavior based on runtime results wasn't reliable. Fortunately, Dynamics 365 and the .NET ecosystem give us a practical and effective alternative. Using XrmToolBox and JetBrains dotPeek, it is possible to recover readable C# plugin code directly from the deployed assembly. (The recovered class code won't be 100% exact, as variable names will be different and generic; it is suitable for recovering the logic, structure, and functionality rather than the original source.)

The Practical Solution
The approach consists of two main steps. It does not magically restore comments or original formatting, but it does give you working, understandable code that closely reflects the original plugin logic.

Tools Used

Step 1: Extract the Plugin Assembly from Dataverse
1. Connect to the environment
2. Load the Assembly Recovery Tool
3. Download the DLL
At this point, you have successfully recovered the compiled plugin assembly exactly as it exists in the environment.

Step 2: Decompile the DLL Using JetBrains dotPeek
1. Open dotPeek
2. Explore the decompiled code
This is usually more than enough to understand how the plugin works.
3. Export to a Visual Studio project (optional but recommended)
One of dotPeek's most powerful features is Export to Project. This gives you a proper .csproj with class files that you can open, build, and extend.

Possibilities with the Recovered Code
Once you have the decompiled C# code, several options open up:
1. Rebuild the plugin assembly
2. Re-register the plugin
3. Maintain or enhance functionality

Important Considerations

Key Takeaway
Losing plugin source code does not mean losing control of your Dynamics 365 solution. With XrmToolBox's Assembly Recovery Tool and JetBrains dotPeek, you can recover, rebuild, and continue supporting the system. Developers working with Dynamics 365 may well face this situation at some point; knowing this technique can save days, or even weeks, of effort and give your customer confidence that their system remains fully supportable.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Browser-Level State Retention in Dynamics 365 CRM: Improving Performance & UX with Session Storage

Dynamics 365 model-driven apps are excellent at storing business data, but not every piece of information belongs in Dataverse. A common design flaw is using Dataverse fields to store temporary UI state: things like selected views, filters, or user navigation preferences. While this works technically, it introduces unnecessary performance overhead and can create incorrect behavior in multi-user environments.

In this blog, I'll focus on browser-level retention of CRM UI data using sessionStorage, with subgrid view retention as a practical example, implemented for a technology consulting and cybersecurity services firm based in Houston, Texas, USA, specializing in modern digital transformation and enterprise security solutions.

The Real Problem: UI State vs Business Data
Let's separate the concerns clearly:
- Business data (status, owner, amounts) should live in Dataverse.
- UI state (selected view, filter, scroll position) should live in the browser.
Subgrid views fall squarely into the UI state category.

Scenario: Subgrid View Resetting on Navigation
Users reported that their selected subgrid view was lost whenever they navigated away from the record and came back. This breaks the user's workflow, especially for records that users revisit frequently.

Possible Solution: Persisting UI State in Dataverse (Original Approach)
This approach attempts to fix the problem by storing the selected subgrid view GUID in a Dataverse field on the parent record.

How It Works

Why this might look reasonable

The Hidden Problems
1] Slower form execution
2] Data model pollution
3] Incorrect multi-user behavior
4] Scalability issues
In short, Dataverse was doing work it should never have been asked to do.

Workaround: Keep UI State in the Browser for the Session
Practically, the selected subgrid view belongs to the user's session, not the record. Once that boundary is respected, the solution becomes much cleaner.

Practical Solution: Browser Session Storage (Improved Approach)
Instead of persisting the view selection in Dataverse, we store it locally in the browser using sessionStorage. sessionStorage is part of the Web Storage API, which provides a way to store key-value pairs in a web browser. Unlike localStorage, which persists data even after the browser is closed, sessionStorage is designed to store data only for the duration of a single session: the data is available as long as the tab or window is open, and it is cleared when the tab or window is closed.

Why Session Storage?

How the Improved Solution Works
1. Store the view locally on subgrid change
2. Restore the view on form/grid load
This ensures that when the user revisits the form, the subgrid opens exactly where they left off.

Why This Approach Is Superior
1] Faster execution
2] Correct user experience
3] Clean architecture
4] Zero backend impact

When to Use Browser-Level Retention
Use this pattern when the data is purely presentational and scoped to the user's session.

To conclude, not all data deserves to live in Dataverse. When you store UI state in the browser instead of the database, you gain faster forms, correct per-user behavior, and a cleaner data model. Subgrid view retention is just one example, but the principle applies broadly across Dynamics 365 customizations.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Exposing Plugins as Bound Actions for Power Automate Flows: A Practical Procedure for Efficiently Processing Multiple Records

In complex business processes, like calculating commissions or validating data across multiple records, applying the same logic repeatedly in a Power Automate flow can quickly become inefficient and difficult to maintain. A more scalable approach is to encapsulate the logic in a Dataverse plugin, expose it as a bound action, and then call this action from a flow. This method centralizes business rules, reduces redundancy, and improves maintainability. In this post, we'll walk through the steps to implement this approach and examine its advantages over applying the same logic directly within a flow for each individual record. We'll illustrate this with a practical example from a Houston-based technology consulting and cybersecurity services firm that specializes in modern digital transformation and enterprise security solutions.

Flow Diagram

Step 1: Create the Plugin
The first step is to write a plugin that contains the logic you want to apply to each record. Example: DuplicateCommissionsCounter. A minimal sketch of such a plugin appears after the steps below.

Step 2: Expose the Plugin as a Bound Action
Instead of running plugin logic manually for each record, you can register it as a bound action in Dataverse.
Procedure:
E.g.
2. Attach your plugin to this action.
Outcome: this exposes your plugin logic as a reusable, callable bound action. Any process or flow can now invoke it for a specific invoice record.

Step 3: Use Power Automate to Call the Bound Action
Once the plugin is exposed, you can loop through multiple records in a flow and call the action.
Procedure in Power Automate:
This approach ensures that all complex logic resides in the plugin, while the flow orchestrates which records need processing.
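To make the pattern concrete, here is a minimal sketch of what the plugin behind such a bound action could look like. The table, column, and output parameter names (cf_commission, cf_invoiceid, DuplicateCount) are assumptions for illustration, and the custom action would need to declare the matching integer output parameter.

```csharp
// Minimal sketch of a plugin behind a bound action (hypothetical action and column
// names; the real DuplicateCommissionsCounter logic is not reproduced here).
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public class DuplicateCommissionsCounter : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        // For a bound action, the bound record arrives as the Target entity reference.
        var invoiceRef = (EntityReference)context.InputParameters["Target"];

        // Example logic: count commission rows pointing at this invoice (hypothetical table).
        var query = new QueryExpression("cf_commission") { ColumnSet = new ColumnSet(false) };
        query.Criteria.AddCondition("cf_invoiceid", ConditionOperator.Equal, invoiceRef.Id);
        int count = service.RetrieveMultiple(query).Entities.Count;

        // Return the result as an output parameter so the flow can consume it.
        context.OutputParameters["DuplicateCount"] = count;
    }
}
```

In the flow, a Dataverse "Perform a bound action" step inside an Apply to each loop can then invoke the action for every selected record and read the output parameter from the step's response.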
Advantages Over Logic Directly in the Flow

To conclude, exposing plugins as bound actions is a robust, maintainable way to apply complex logic across multiple records in Dataverse. It allows Power Automate flows to focus on orchestration rather than logic execution, leading to cleaner, faster, and easier-to-manage solutions. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com

Seamlessly Switching Lead-Based BPFs After Qualification in Dynamics 365 CRM

In Microsoft Dynamics 365 CRM, Business Process Flows (BPFs) are powerful tools that guide users through defined business stages. However, when working with Lead-based BPFs that persist into Opportunity, certain platform limitations surface, especially when multiple Lead-rooted BPFs are involved.

This blog walks through a real-world challenge I encountered while working with a Houston-based technology consulting and cybersecurity services firm. The firm specializes in modern digital transformation and enterprise security solutions. I explore issues with Lead → Opportunity Business Process Flow (BPF) switching, explain why the out-of-the-box behavior falls short, and detail how I designed a robust client-side and server-side solution to safely and reliably switch BPFs, even after a Lead has already been qualified.

How BPFs Work (Quick Recap)
The platform ideally won't allow such a switch, whether attempted from the client side or the server side.

The Problem
In my scenario:

The Challenge
Once a Lead is qualified:
This is non-intuitive, error-prone, and inefficient, especially considering the manual effort that goes into it.

Solution Overview
I implemented a guided, safe, and reversible BPF switching mechanism.

High-Level Architecture
This solution uses:

Step-by-Step Methodology

1. Entry Point: Opportunity Ribbon Button
A custom ribbon button on the Opportunity form sets a pair of trigger fields. These fields act as a controlled handshake between Opportunity and Lead.

2. Lead OnLoad: Controlled Trigger Execution
On Lead form load:

```javascript
if (diffSeconds > 20) { return; }

Xrm.WebApi.updateRecord("lead", formContext.data.entity.getId(), {
    cf_shouldtrigger: false
});
```

This ensures the switch only runs for a freshly requested trigger and the flag is reset immediately.

3. Identifying and Aborting the Existing BPF
Before switching:

```javascript
var activeProcess = formContext.data.process.getActiveProcess();

Xrm.WebApi.updateRecord(bpfEntityName, result.entities[0].businessprocessflowinstanceid, {
    statecode: 1,  // Inactive
    statuscode: 3  // Aborted
});
```

This is a critical step — without aborting the old instance, Dynamics can behave unpredictably.

4. Switching the UI BPF
After aborting:

5. Handling BPF Instance Creation (First-Time Switch Logic)
The solution explicitly checks whether an instance of the target BPF already exists. If it exists, it is reused; if it does not exist (first switch), a new instance is created. This dual-path logic makes the solution idempotent and reusable.

6. Server-Side Plugin: Persisting the Truth
A plugin ensures that the stage path and the Opportunity binding are persisted correctly:

```csharp
// Identify BPF type
bool isNewBpf = (context.PrimaryEntityName == "new_bpf_entity");

// Resolve related Lead
Guid leadId = isNewBpf
    ? ((EntityReference)target["bpf_leadid"]).Id
    : ((EntityReference)target["leadid"]).Id;

// Retrieve related Opportunity via Lead
Entity opportunity = GetOpportunityByLead(service, leadId);

// Determine stages and path
string qualifyStageId = isNewBpf ? NEW_QUALIFY_STAGE : OLD_QUALIFY_STAGE;
string finalStageId = isNewBpf ? NEW_FINAL_STAGE : OLD_FINAL_STAGE;
string traversedPath = START_STAGE + "," + qualifyStageId + "," + finalStageId;

// PATCH 1 – Qualify stage
service.Update(new Entity(target.LogicalName, target.Id)
{
    ["activestageid"] = new EntityReference("processstage", new Guid(qualifyStageId)),
    ["traversedpath"] = START_STAGE + "," + qualifyStageId
});

// PATCH 2 – Final stage + Opportunity bind
service.Update(new Entity(target.LogicalName, target.Id)
{
    ["activestageid"] = new EntityReference("processstage", new Guid(finalStageId)),
    ["traversedpath"] = traversedPath,
    [isNewBpf ? "bpf_opportunityid" : "opportunityid"] =
        new EntityReference("opportunity", opportunity.Id)
});

// Mark Lead as successfully processed
service.Update(new Entity("lead", leadId)
{
    ["cf_pluginsuccess"] = new OptionSetValue(1) // Yes
});
```

This guarantees data consistency and auditability.

7. Final UI Sync & Redirect
After successful completion:

```javascript
Xrm.Navigation.openForm({ entityName: "opportunity", entityId: opportunityId });
```

From the user's perspective: "I clicked a button, confirmed the switch, and landed back in my Opportunity — done."

Why This Solution Works
✔ Respects Dynamics 365 BPF constraints
✔ Prevents orphaned or conflicting BPF instances
✔ Handles first-time and repeat switches
✔ Ensures server-side persistence
✔ Minimal user disruption
✔ Fully reversible

Most importantly, it bridges the gap between platform limitations and real business needs. Dynamics 365 BPFs are powerful, but when multiple Lead-rooted processes coexist, manual switching is not enough. This solution demonstrates how client-side scripting and a server-side plugin can be combined to deliver a seamless, enterprise-grade experience without unsupported hacks. If you're facing similar challenges with Lead → Opportunity BPF transitions, this pattern can be adapted and reused with confidence.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Triggering Power Automate Flows Directly from Power BI Reports

Power BI is excellent at visualizing insights, but insights often need action. That's where the Power Automate visual comes in. With this visual, report consumers can trigger instant Power Automate flows directly from a Power BI report, using the data and filters already applied on the page. No switching tools. No exporting data. Just click and act. This blog walks through how the Power Automate visual works, how to configure it, and what to consider before rolling it out.

Understanding Power Automate Visuals
The Power Automate visual adds a button to your Power BI report. When clicked, it runs an instant cloud flow. From a user's perspective, it feels like a native action button inside Power BI.

Adding the Power Automate Visual
In Power BI Desktop, one can add the visual in two ways. Once added, the visual appears on the report page with built-in instructions. In Power BI Service, the process is identical. One can resize or reposition the button like any other visual.

Choosing the Flow Environment
Before creating or attaching a flow, select the environment where the flow will live. Choosing the right environment upfront avoids permission and governance issues later.

Making the Flow Data-Contextual
One of the most powerful features of the Power Automate visual is data context: the data fields added to the visual are passed to the flow according to the filters and selections the user has applied. This makes flows responsive to how users are interacting with the report.

Creating or Editing the Flow
Editing from Power BI Desktop or Service:
1. With the visual selected, add any data fields to the Power Automate Data region to use as dynamic inputs for the flow.
2. Select More options (…) > Edit to configure the button.
3. In edit mode of the visual, either select an existing flow to apply to the button, or create a new flow. One can start from scratch or start with one of the built-in templates as an example. To start from scratch, select New > Instant cloud flow.
4. Select New step. Here, one can choose a subsequent action or specify a Control to add more logic that determines the subsequent action.
5. Optionally, reference the data fields as dynamic content if the flow should be data-contextual. This example uses the Region data field to create an item in a SharePoint list. Based on the end user's selection, Region could have multiple values or just one.
6. After configuring the flow logic, name the flow and select Save.
7. Select the arrow button to go to the Details page of the flow you created. Here's the Details page for a saved flow.
8. Select the Apply button to attach the flow you created to your button.

Formatting the Button
The Power Automate button is fully customizable, which allows it to match your report's design and UX standards.

Test the Flow
After the flow is applied to the button, test it before sharing it with others. These Power BI flows can only run in the context of a Power BI report, so one can't run them from the Power Automate web app or anywhere else. If the flow is data-contextual, make sure to test how the filter selections in the report affect the flow outcome.

Sharing the Flow with Report Users
When the flow runs successfully, it can be shared with the relevant personas consuming the report. Alternatively, you can give users edit access to the flow, not just run permissions.

Considerations and Limitations
Before adopting the Power Automate visual, keep in mind that some constraints exist to help maintain performance, security, and governance.
When to Use the Power Automate Visual
This pattern works best when you want to act on insights without leaving the report. In short, it bridges the gap between analysis and execution.

Final Thoughts
The Power Automate visual transforms Power BI from a read-only analytics tool into an interactive action surface: Analyze → Filter → Click → Automate. When used thoughtfully, it empowers users to act on insights at the exact moment they discover them — without breaking their flow.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Automating Data Cleaning and Storage in Azure Using Databricks, PySpark, and SQL.

Managing and processing large datasets efficiently is a key requirement in modern data engineering. Azure Databricks, an optimized Apache Spark-based analytics platform, provides a seamless way to handle such workflows. This blog explores how PySpark and SQL can be combined to dynamically process and clean data using the medallion architecture (Raw → Silver only) and store the results in Azure Blob Storage as PDFs.

Understanding the Medallion Architecture
The medallion architecture follows a structured approach to data transformation:
Aggregated Layer (Gold): optimized for analytics, reports, and machine learning.
In our use case, we extract raw tables from Databricks, clean them dynamically, and store the refined data in the silver schema.

Key Technologies / Dependencies Used

Step-by-Step Code Breakdown
1. Setting Up the Environment
Install and import the necessary libraries. The install command brings in reportlab, which is used to generate PDFs, and the imports cover the essential libraries for data handling, visualization, and storage.

2. Connecting to Azure Blob Storage
This snippet authenticates the Databricks notebook with Azure Blob Storage, prepares a connection to upload the final PDFs, and initiates the Spark session.

3. Cleaning Data: Raw to Silver Layer
Fetch all raw tables. This step dynamically removes NULL values from the raw data and creates a cleaned table in the silver layer.

4. Verifying and Comparing the Raw and Cleaned (Silver) Tables

5. Converting Cleaned Data to PDFs
Output lands in the Azure Storage container. This process reads the cleaned tables, converts them into PDFs with structured formatting, and uploads them to Azure Blob Storage.

6. Automating Cleaning in Databricks on a Fixed Schedule
This is automated by scheduling the notebook and its associated compute instance to run at fixed intervals and timestamps.

Further Actions

Why Store Data in Azure Blob Storage?

To conclude, by leveraging Databricks, PySpark, SQL, ReportLab, and Azure Blob Storage, we have automated the pipeline from raw data ingestion to cleaned and formatted PDF reports. This approach ensures:
a. Efficient, dynamic data cleansing using SQL queries.
b. Structured data transformation within the medallion architecture.
c. Seamless storage and accessibility through Azure Blob Storage.
This methodology can be extended to include Gold Layer processing for advanced analytics and reporting.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com

