Category Archives: Power Automate
Auto Refresh Subgrid in Dynamics 365 CRM Based on Changes in Another Subgrid
In Dynamics 365 CRM implementations, subgrids are used extensively to show related records within the main form. But what if you want Subgrid B to automatically refresh whenever a new record is added to Subgrid A — especially when that record triggers automation, such as a Power Automate flow or a plugin, that creates or updates related data? In this blog, I'll walk you through how to make one subgrid refresh when another subgrid is updated — a common real-world scenario that improves the user experience without requiring a full form refresh.

Let's say you have two subgrids on your form: Chargeable Categories (Subgrid A) and Order Line Categories (Subgrid B). Whenever a new record is added in the Chargeable Categories subgrid, a Power Automate flow or backend logic creates corresponding records in Order Line Categories. However, these new records are not immediately visible in the second subgrid unless the user manually refreshes the entire form or clicks the refresh icon. This can be confusing and frustrating for end users.

Solution Overview
To solve this, we'll use JavaScript to listen for changes in Subgrid A and automatically refresh Subgrid B once something is added. The high-level approach: register a handler on Subgrid A's grid OnLoad event, and whenever its data reloads, call refresh() on Subgrid B.

Implementation Steps
1. Create the JavaScript Web Resource
Create a new JS web resource with the refresh logic and register it on the form's OnLoad event (a minimal sketch is included at the end of this post).

How It Works
The subgrid OnLoad event fires whenever Subgrid A's data loads or reloads — for example, right after a new Chargeable Categories record is added — so the handler simply refreshes Subgrid B, and the records created by the flow or plugin appear without a full form reload.

To conclude, this simple yet effective approach ensures a smoother user experience by reflecting backend changes instantly without needing to manually refresh the entire form. It's particularly helpful when automations or plugins create or update related records that must appear in real time. By combining JavaScript with Dynamics' form controls, you can add polish and usability to your applications without heavy customization.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
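For reference, here is a minimal sketch of what the web resource mentioned above could look like. The subgrid control names ("Chargeable_Categories" and "Order_Line_Categories") and the two-second delay are assumptions — use the actual control names from your form designer and tune the delay to your automation. Register SubgridRefresh.onFormLoad on the form's OnLoad event and enable "Pass execution context as first parameter".

// Hypothetical web resource: refreshes Subgrid B whenever Subgrid A reloads.
// Control names below are placeholders - replace them with your actual subgrid names.
var SubgridRefresh = SubgridRefresh || {};

SubgridRefresh.onFormLoad = function (executionContext) {
    var formContext = executionContext.getFormContext();
    var subgridA = formContext.getControl("Chargeable_Categories");

    if (subgridA) {
        // Fires each time Subgrid A's data loads or reloads (e.g. after a record is added).
        subgridA.addOnLoad(function () {
            var subgridB = formContext.getControl("Order_Line_Categories");
            if (subgridB) {
                // Short delay gives the async flow/plugin time to create the related records.
                setTimeout(function () {
                    subgridB.refresh();
                }, 2000);
            }
        });
    }
};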
Automatically Update Lookup Fields in Dynamics 365 Using Power Automate: From Custom Tables to Standard Entities
Imagine this: you update a product's purchase date in a registration record and — boom — a related case automatically gets refreshed with the accurate "Purchased From" lookup. It saves time, reduces errors, and keeps everything in sync without you lifting a finger. Let's walk through how to make that happen using Power Automate.

The goal: when a Product Registration's cri_purchasedat field is changed, the system retrieves the related "Purchased From" record and updates any linked Case(s) with the appropriate lookup reference. Let's break down the step-by-step process of how this is done in Power Automate.

Step 1: Trigger the Flow When the Purchase Date Changes
Flow trigger: When a row is added, modified, or deleted (Dataverse). This setup ensures that our flow only fires when that specific date field is modified.

Step 2: Pull in the "Purchased From" Record
Next, use List rows on the "Purchased From" table with a FetchXML query. We're searching for a record whose name matches the updated cri_purchasedat. Set Row Count to 1, since we expect only one match.

Step 3: Identify Any Linked Case Records
Add another List rows action, this time on the Cases table (internally incident in Dataverse), and use a FetchXML query to fetch all Case records where cri_productregistrationid equals the ID of the Product Registration being modified. This step is critical because it gives us the list of Case records we need to update, based on their link to the modified product registration.

<fetch>
  <entity name="incident">
    <attribute name="incidentid" />
    <attribute name="title" />
    <attribute name="cf_actualpurchasedfrom" />
    <filter>
      <condition attribute="cri_productregistrationid" operator="eq" value="@{triggerOutputs()?['body/cri_productregistrationid']}" />
    </filter>
  </entity>
</fetch>

Step 4: Validate the "Purchased From" Match
Before updating anything, we add a Condition control to ensure that the previously fetched Purchased From record exists and is unique. Why? Because if there is no match (or multiple matches), we don't want to update the Cases blindly. We check whether the length of the returned rows equals 1 (see the expression sketch at the end of this post). If true, move forward with the updates; if false, stop or handle the exception accordingly. This kind of validation builds guardrails into your automation, making it more robust and preventing incorrect data from being applied across multiple records.

Step 5: Update the Linked Cases
After confirming a valid match, the flow loops through each related Case and updates the "Actual Purchased From" field with the correct value from the matched record, ensuring accurate linkage based on the latest update. Once this step runs, the automation is complete — with Cases now intelligently updated in real time based on Product Registration changes.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
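For reference, here is a minimal sketch of the expressions involved. The action name List_rows_Purchased_From and the column name cf_purchasedfromid are placeholders — substitute the actual action and schema names from your own flow and tables.

Condition (Expression side), checking that exactly one Purchased From record came back:
length(outputs('List_rows_Purchased_From')?['body/value'])
is equal to
1

Expression used in the update step to read the matched record's primary key:
first(outputs('List_rows_Purchased_From')?['body/value'])?['cf_purchasedfromid']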
How to Implement Incremental Refresh in Power BI
Refreshing large datasets in Power BI can become time-consuming and resource-intensive as data volume grows. If your reports are based on millions of rows of historical data, refreshing everything daily is neither efficient nor necessary. This is where Incremental Refresh comes in. It allows Power BI to refresh only new or changed data, drastically improving performance and reducing load on your data source. In this blog, you'll learn how to set up incremental refresh step by step—so your Power BI reports stay fast and efficient even with big data.

What Is Incremental Refresh in Power BI?
Incremental Refresh enables Power BI to load data in partitions, refreshing only the latest ones (e.g., the past 7 days) while keeping the older data static.

Why use it? It cuts refresh times dramatically, reduces the load on your data source, and avoids reprocessing history that never changes.

Step 1: Define Parameters in Power Query
· Open your report in Power BI Desktop (Pro or Premium workspace)
· Go to Transform Data (Power Query Editor)
· Create two Date/Time parameters named RangeStart and RangeEnd (these exact names are required)
· Set default values (e.g., RangeStart = 01/01/2020, RangeEnd = 01/01/2021)

Step 2: Filter Your Data with These Parameters
Filter your date column so that only rows between RangeStart and RangeEnd are loaded (a sample Power Query filter is sketched at the end of this post). This tells Power BI what time range to load and eventually refresh incrementally.

Step 3: Enable Incremental Refresh in the Data Model
Right-click the table, choose Incremental refresh, and define how much history to archive and how recent a window to refresh.
📝 Example: archive several years of history but refresh only the last 7 days. This configuration refreshes only the most recent week of data each time, while keeping the rest intact.

Step 4: Publish to Power BI Service
✅ Done! You've now implemented incremental refresh.

Best Practices
· Filter on a date column that doesn't change retroactively, and use >= on one boundary and < on the other so each row falls into exactly one partition.
· Keep RangeStart and RangeEnd as Date/Time parameters and don't rename them.
· Verify the refresh in the Power BI Service, since the incremental partitions are created only after the first refresh there.

To conclude, Incremental Refresh is a game-changer when it comes to handling large datasets in Power BI. It not only saves refresh time but also optimizes resource usage. By learning how to configure it properly, you can scale your reports with confidence and efficiency.

Got a large dataset slowing down your Power BI refresh? Implement Incremental Refresh today and see the difference. Explore more Power BI performance tips in our blog series—or reach out for help setting up enterprise-grade models.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
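For reference, here is a minimal Power Query (M) sketch of the Step 2 filter, assuming the table has a datetime column named OrderDate — swap in your own date column:

// Filter the table by the RangeStart / RangeEnd parameters.
// Use >= on one boundary and < on the other so each row lands in exactly one partition.
= Table.SelectRows(Source, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)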
Seamlessly Generating and Downloading SSRS Reports in MFA-Enabled Power Pages Environments
Generating SSRS (SQL Server Reporting Services) reports from within Power Pages becomes more complex in environments secured by Multi-Factor Authentication (MFA). Traditional approaches that work in basic environments tend to fail silently or inconsistently when MFA, session tokens, or cookie-based authentication are involved. In this blog, I'll share a real-world solution where a Project-based SSRS report was generated securely, sent via email, and optionally downloaded — all within the constraints of a Power Pages + Power Automate architecture in a Dynamics 365 MFA-protected environment.

Scenario Overview

Problem Statement
Standard HTTP-based retrieval of SSRS reports using the Reserved.ReportViewerWebControl.axd endpoint fails in MFA-protected environments due to missing browser session cookies. This often results in 302 redirects or HTML-based error messages that cannot be processed by Power Automate.

Initial Approach and Issue
A flow was constructed to:
1. Capture the Project ID from Power Pages and pass it to a Power Automate flow via an HTTP trigger, initiated when a user clicks a button on the portal (via embedded JavaScript).
2. Build the SSRS report URL dynamically using a chain of Compose actions (PDF Download Start Index -> PDF Download String Length -> PDF Download URL), replacing the PrintOnOpen=true parameter with PrintOnOpen=false in the report export URL so the print dialog does not automatically appear when the PDF is opened (a sample expression is sketched at the end of this post).
3. Perform an HTTP GET request to download the report.

This failed consistently on the first try because the report session page was not fully ready or authenticated, especially in an MFA environment.

Working Solution: Retry with Delay in Power Automate
To overcome the session-based delay, we implemented a retry pattern inside Power Automate: the first HTTP request is followed by a Delay and a second HTTP request that run only when the first attempt fails.

Outcome: The flow fails the first time (as expected), but succeeds on the second or third retry as the session becomes valid and the SSRS report is available.

Power Automate Configuration Highlights:
Added a Scope block after the first HTTP request and set its Configure run after to Skipped and Failed. If needed, you can add a third delay and retry in case the second attempt also fails.

Benefits of This Approach:
No custom plugins or complex code, consistent report generation even under MFA, and flow logic that is easy to maintain and extend.

To conclude, sometimes achieving reliability in secure environments isn't about complex code—it's about using the right orchestration patterns. By strategically delaying and retrying the HTTP request to SSRS within Power Automate, we achieved consistent, secure report generation that works even under MFA constraints.

🔗 Need help implementing this retry-based flow in your environment? Reach out to CloudFronts—we help businesses implement scalable, reliable solutions every day. You can contact us directly at transform@cloudfronts.com.
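For illustration, here is a minimal sketch of the kind of expression used while building the download URL. The Compose action name is a placeholder for whatever action in your flow holds the raw export URL:

// Swap the print-on-open flag so the PDF doesn't pop a print dialog when opened
replace(outputs('Compose_PDF_Download_URL'), 'PrintOnOpen=true', 'PrintOnOpen=false')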
Power Pages + Power Automate: Retrieve SharePoint Files via HTTP Trigger Flow
When building a Power Pages site to fetch SharePoint files, I initially relied on the official Power Pages flow trigger—"When Power Pages calls a flow." However, the flow didn't trigger reliably when initiated from the site. Despite proper configuration, the trigger wouldn't execute consistently, leading to broken file fetch operations. To overcome this, I replaced the unreliable trigger with a Power Automate flow using an HTTP request trigger. This method allowed me to invoke the flow through a JavaScript function executed on page load, passing the required record ID dynamically. The HTTP approach not only worked reliably but also gave me more control over the request and response. This blog post outlines that workaround, from setting up the HTTP-triggered flow to integrating it seamlessly with Power Pages.

Background and the Problem
Power Pages provides a native trigger to call Power Automate flows. While ideal in theory, this approach often fails in practice: despite proper configuration, the trigger simply would not fire consistently when called from the site. After spending time investigating these issues without consistent results, I decided to switch to a more controlled and universally reliable method using an HTTP trigger.

My Workaround – HTTP Trigger Flow

Power Automate Flow Setup:
Trigger: Start with the "When an HTTP request is received" trigger. Define the request schema to accept a recordId — in this case, an orderId.
Compose (Request Variables): Add a Compose action to extract the incoming ID.
List Rows – Document Locations: Use Dataverse → List rows to retrieve the SharePoint Document Location related to the Order (based on the passed orderId). This assumes your files are stored in folders linked to Dataverse records.
Condition – Check If Folder Exists: Use a Condition to check whether any document location was found. If a record exists → proceed; if no records are found → terminate the flow (the folder doesn't exist).
Compose – Relative URL: Extract the relative URL of the document location.
Compose – Folder Path: Combine the relative URL into the full folder path (DocumentPath).
Get Files (SharePoint): Use the SharePoint Get files (properties only) action with the dynamic path set to the DocumentPath value.
Return Response: Format the SharePoint file metadata (Name, Link, Type) and send it back using the Response action.

JavaScript (Executed on Page Load)
On page load, a small script posts the record ID to the flow's HTTP URL and renders the returned file list (a minimal sketch is included at the end of this post).

Why This Works: The HTTP trigger is independent of the Power Pages–Dataverse flow integration, so it fires every time the endpoint is called, and the page script has full control over the request payload and the response handling.

Pros: Reliable, callable from any page or script, and it returns exactly the file metadata the portal needs.

Cons: The HTTP trigger URL can be called by anyone who has it, so it should be protected (for example with bearer token authentication, as planned below), and each page load makes a round trip to the flow.

To conclude, if you've faced reliability issues with native Power Pages flow triggers, the HTTP request method can be a game-changer. It enabled me to deliver a seamless experience for retrieving SharePoint files, and it can do the same for your project. In future iterations, I plan to enhance this by adding bearer token authentication and caching metadata for even faster performance.

Want to integrate Power Automate flows reliably with Power Pages? Reach out to CloudFronts—we help businesses implement scalable, reliable Power Platform solutions every day. You can contact us directly at transform@cloudfronts.com.
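The page-load script itself isn't shown above, so here is a minimal sketch of what it could look like. The flow URL, the element ID used to read the order ID, and the response shape are placeholders based on this flow's design:

// Runs on page load: calls the HTTP-triggered flow and logs the returned file list.
document.addEventListener("DOMContentLoaded", function () {
    var flowUrl = "https://prod-00.westus.logic.azure.com/workflows/.../invoke?...";  // placeholder: HTTP POST URL from the trigger
    var orderId = document.getElementById("orderIdField").value;                      // placeholder: however the page exposes the record ID

    fetch(flowUrl, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ recordId: orderId })
    })
    .then(function (response) { return response.json(); })
    .then(function (files) {
        // The flow's Response action returns an array of { Name, Link, Type } objects.
        files.forEach(function (file) {
            console.log(file.Name + " -> " + file.Link);
        });
    })
    .catch(function (error) { console.error("Flow call failed", error); });
});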
Using OpenAI and Logic Apps to develop a Copilot agent for Elevator Pitches & Lead Qualification
In today's competitive landscape, the ability to prepare quickly and deliver relevant, high-impact sales conversations is more critical than ever. Sales teams often spend valuable time gathering case studies, reviewing past opportunities, and preparing client-specific messaging — time that could be better spent engaging prospects. To address this, we developed "Smart Pitch" — a Microsoft Teams-integrated AI Copilot designed to equip our sales professionals with instant, contextual access to case studies, opportunity data, and procedural documentation.

Challenge
Sales professionals routinely face challenges such as manually gathering case studies, reviewing past opportunities, and preparing client-specific messaging under time pressure. These hurdles not only slow down the sales cycle but also affect the consistency and quality of conversations with prospects.

How It Works

Platform
Smart Pitch is built as a custom copilot surfaced inside Microsoft Teams, with Azure Logic Apps handling calls to external services.

Data Sources
CloudFronts Smart Pitch pulls information from the following knowledge sources: case studies on the CloudFronts website, the opportunity table in Dataverse, and SOP documents in SharePoint.

AI Integration
Generative answers and GPT prompts handle summarization and formatting, while the ChatGPT API — called from a Logic App — enriches company and contact information.

Key Features

MQL – SQL Summary Generator
Users can request an MQL – SQL document containing the prospect's company and contact details, similar case studies, and past work with clients in a similar region and industry. The copilot prompts the user to provide the prospect name, contact person name, and client requirement. This is achieved via an adaptive card for better UX.

HTTP Request to Logic App
In the Logic App we used the ChatGPT API to fetch company and client information (a sample request body is sketched at the end of this post). From the response we extract the company location and, similarly, the industry, and return the results to the custom copilot via the Logic App's response. A Generative answers node then displays the results with proper formatting, driven by the prompt/agent instructions. The generative AI can also be instructed to produce formatted JSON directly from the parsed values; this output is parsed into actual JSON and used to populate a Liquid template for the MQL-SQL file, dynamically creating an MQL-SQL document for every searched company and contact person. This returns an HTML file with dynamically populated company and contact details, similar case studies, and past work with clients in a similar region and industry, and triggers an automatic download of the generated MQL-SQL as a PDF file on your system.

Content Search
Users can ask questions related to:
1. Case Study FAQ: Helps users ask questions about client success stories and project case studies, retrieves relevant information from a knowledge source, and offers follow-up FAQs before ending the conversation. The CloudFronts official website is used for fetching case study information.
2. Opportunities: Helps users inquire about past projects or opportunities, detailing client names, roles, estimated revenue, and outcomes.
3. SOPs: Provides quick answers and summaries for frequently asked questions related to organizational processes and SOPs.

Whatever the question, "Smart Pitch" searches SharePoint documents, public case studies, and the opportunity table to return relevant results — structured and easy to consume.

Security & Governance
The agent is integrated into Microsoft Teams, so it uses the same authentication as Teams. Access to Dataverse and SharePoint is read-only and scoped to organizational permissions.

To conclude, Smart Pitch reflects our commitment to leveraging AI to drive business outcomes. By combining Microsoft's AI ecosystem with our internal data strategy, we've created a practical and impactful sales assistant that improves productivity, accelerates deal cycles, and enhances client engagement.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
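For illustration, here is a rough sketch of the HTTP action a Logic App could use to call the Chat Completions API for company enrichment. The endpoint shape is the public OpenAI API; the model name, prompt wording, and parameter names are placeholders, not the exact configuration used in Smart Pitch:

POST https://api.openai.com/v1/chat/completions
Headers:
  Content-Type: application/json
  Authorization: Bearer @{parameters('openai_api_key')}
Body:
{
  "model": "gpt-4o-mini",
  "messages": [
    { "role": "system", "content": "Return a short company profile including headquarters location and industry." },
    { "role": "user", "content": "Company: @{triggerBody()?['companyName']}, Contact: @{triggerBody()?['contactName']}" }
  ]
}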
Bridge Your Database and Dataverse: Complete Integration Guide
Modern applications demand seamless, real-time data access. Microsoft Dataverse—the data backbone of the Power Platform—makes it easier to build and scale low-code apps, but often your enterprise data resides in legacy databases. Connecting a database to Dataverse enables automation, reporting, and app-building capabilities using the Power Platform's ecosystem. In this blog, we'll walk you through how to connect a traditional SQL database (Azure SQL or on-premises) to Microsoft Dataverse.

What is Dataverse?
Dataverse is Microsoft's cloud-based data platform, designed to securely store and manage data used by business applications. It's highly integrated with Power Apps, Power Automate, and Dynamics 365.

Key Features: secure, managed storage; built-in integration with Power Apps, Power Automate, and Dynamics 365; and support for both scheduled data sync (dataflows) and real-time access (virtual tables).

Why Connect Your Database to Dataverse? It brings data that currently lives outside the Power Platform into a place where you can build apps, automate processes, and report on it within a unified, secure ecosystem.

Step-by-Step Guide: Connecting a Database to Dataverse
Step 1: Open Power Apps and select the proper environment.
Step 2: Open Dataflows in Power Apps and create a new dataflow.
Step 3: Connect to the database using the SQL Server database connector (a rough sketch of the resulting query is shown at the end of this post).
Step 4: Add the required credentials to establish the connection between the database and Dataverse.
Step 5: Map the columns and identify the unique ID (key column) of the table.
Step 6: Set the scheduled refresh and publish the dataflow.
Step 7: Once the dataflow is published, the table is visible in Power Apps.

To conclude, connecting your database to Dataverse amplifies the power of your data, enabling app development, automation, and reporting within a unified ecosystem. Whether you need real-time access or periodic data sync, Microsoft offers flexible and secure methods to integrate databases with Dataverse. Start exploring virtual tables or dataflows today to bridge the gap between your existing databases and the Power Platform. Want to learn more? Check out our related guides on Dataverse best practices and virtual table optimization.

We hope you found this blog useful. If you would like to discuss anything further, please reach out to us at transform@cloudfronts.com.
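As a rough illustration, the query a dataflow generates for a SQL source looks something like this Power Query (M) sketch — the server, database, schema, table, and column names are placeholders:

let
    // Connect to the SQL Server / Azure SQL database (credentials are supplied on the dataflow connection)
    Source = Sql.Database("myserver.database.windows.net", "SalesDb"),
    // Navigate to the table to be loaded into Dataverse
    Customers = Source{[Schema = "dbo", Item = "Customer"]}[Data],
    // Keep only the columns you plan to map, including the unique ID used as the key column
    SelectedColumns = Table.SelectColumns(Customers, {"CustomerId", "Name", "Email"})
in
    SelectedColumns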
From Commit to Inbox: Automating Change Summaries with Azure AI
In our small development team, we usually merge code without formal pull requests. Instead, changes are committed directly by the developer responsible for the project, and while I don't need to approve every change in my role as the senior developer, I still need to stay aware of what's being merged. Manually reviewing each commit was becoming too time-consuming, so I built an automated process using Power Automate, Azure DevOps, and Azure AI. Now, whenever a commit is made, it triggers a workflow that summarizes the changes and sends me an email. This simple system keeps me informed without slowing down the team's work.

Although I kept the automation straightforward, it could easily be extended further. For example, it could be improved to allow me to reply directly to the committer from the email, or even display file changes in detail using a text comparison feature in Outlook. We didn't need that level of detail, but it's a good option if deeper insights are ever required.

Journey
We start with the Azure DevOps trigger "When a code is pushed". Here we specify the organization name, project name, and repository name. We can also specify a particular branch if we want to limit tracking to just that branch; otherwise it tracks all the branches available to the user.

Then we have a for-each loop that iterates over the "Ref Updates" object array. It contains a list of all the changes, but not the exact details. This loop is added automatically when we configure the next action.

Then we set up an "Azure DevOps REST API request to invoke" action. It connects to Azure DevOps directly, so it is a better fit than a plain HTTP action. We specify the relative URL as {Repository Name}/_apis/git/repositories/{Repository ID}/commits/{Commit ID}/changes?api-version=6.0. The Commit ID shows up as newObjectId in the "When a code is pushed" trigger (a trimmed example of the response this endpoint returns is shown at the end of this post).

Then we pass the output of this action to a "Create text with GPT using a prompt" action from the AI Builder group. It took several trials and errors with the prompt to get exactly the summary format I wanted.

The last action is a simple "Send an email" one, where I've kept myself as the recipient and added a subject and a body.

Putting it all together and running it produces the final output: an email summarizing the commit. When the hyperlinks in the email are clicked, they take me straight to Azure DevOps, pointing to the file referred to — for instance, the Events Codeunit.

Conclusion
Summarizing commit changes is just one way automation can make life easier. This same idea can be applied to other tasks, like summarizing meeting notes, project updates, or customer feedback. With a bit of creativity, we can use tools like this to cut down on repetitive work and free up time to focus on learning new skills or tackling more challenging projects. By finding smart ways to streamline our workflows, we can work more efficiently and open up more time for growth and development.

If you need further assistance or have specific questions about your ERP setup, feel free to reach out for personalized guidance. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
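For context, the commit changes endpoint returns a JSON payload along these lines — trimmed here, with placeholder file paths — and this is what gets handed to the GPT prompt for summarization:

{
  "changeCounts": { "Edit": 1, "Add": 1 },
  "changes": [
    {
      "item": {
        "gitObjectType": "blob",
        "path": "/src/Codeunits/EventsCodeunit.al",
        "url": "https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/items?path=..."
      },
      "changeType": "edit"
    },
    {
      "item": {
        "gitObjectType": "blob",
        "path": "/src/Tables/CustomerExt.al",
        "url": "https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/items?path=..."
      },
      "changeType": "add"
    }
  ]
}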
Data Flow with Array Filtering in Power Automate
When working with arrays in Power Automate, it's common to need to filter or select a specific item based on certain attributes. Whether you're handling JSON data from an API, processing records from a list, or managing dynamic content within a flow, efficiently identifying the right item is key. In this blog, we'll explore a simple yet effective method to extract the desired item from an array using expressions in Power Automate. By the end, you'll have a clear strategy to streamline your workflows and enhance the intelligence of your automation.

Scenario
You have an array of objects, and each object has a specific attribute. You want to efficiently select the object(s) where this attribute matches a particular value. The objects in the array may have different structures, but all of them share an attribute called "key" — and that's the one you want to filter on before processing the item further. Let's see how we do it.

Filter Array
Instead of looping through all the items and matching them one by one, use the Filter Array action to select the item based on the value of the "key" attribute, and then pull the single matching object out of the filtered result (a minimal configuration sketch is included at the end of this post).

To conclude, by using this approach you can efficiently select specific items from an array based on the value of a particular attribute, making your Power Automate flows more dynamic and tailored to your specific needs.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
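Here is a minimal sketch of how the Filter Array action could be configured for this scenario. The source array, the attribute name "key", and the value 'Invoice' are placeholders for your own data:

Filter array
  From:  body('Parse_JSON')                     // any array-producing output
  Where: item()?['key']   is equal to   'Invoice'

// Filter array always returns an array, so pull out the single matching object afterwards:
first(body('Filter_array'))

// ...and read any attribute from it:
first(body('Filter_array'))?['someOtherAttribute']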
Mastering Concurrency in Power Automate: An Essential Guide for Optimized Workflows
Introduction
Power Automate has revolutionized process automation by offering a low-code platform for building efficient workflows. However, when dealing with large-scale data or simultaneous operations, concurrency becomes a critical concept. Understanding and managing concurrency ensures that workflows run smoothly without performance bottlenecks or data integrity issues. In this blog, we'll explore the concept of concurrency in Power Automate, its implications, and how to configure it effectively. Along the way, we'll illustrate the topic with a practical example to help you grasp its real-world application.

1. What Is Concurrency in Power Automate?
Concurrency refers to the ability of a workflow to execute multiple iterations or steps simultaneously. While concurrency can significantly speed up workflows, it must be handled carefully to avoid conflicts, particularly when working with shared resources or sequential processes.

2. Why Concurrency Matters
Managing concurrency effectively can shorten run times for bulk operations, keep long-running loops from becoming bottlenecks, and make better use of connector throughput. However, improper configuration can lead to issues like data overwrites, skipped steps, or exceeding service limits.

3. Configuring Concurrency in Power Automate
a) Setting Concurrency in Loop Actions: Loop actions (e.g., "Apply to each") in Power Automate have a concurrency control setting that determines how many items can be processed in parallel.
b) Default Setting: By default, loops run sequentially.

4. Practical Example: Parallel Processing for Email Notifications
a) Scenario: Your organization frequently sends mass email notifications to users based on CRM data. Using sequential processing causes delays, especially for large datasets.
b) Solution: Implement a Power Automate workflow with concurrency enabled:
Trigger: The workflow starts with a scheduled recurrence trigger or a Dataverse event.
Data Retrieval: Fetch user data from Dataverse or SharePoint.
Apply to Each: Enable concurrency control for the "Apply to Each" loop and set a degree of parallelism of 5 to process 5 emails simultaneously.
Send Email: Each iteration sends an email notification to a user.
Error Handling: Use retry policies or error-handling branches to manage failures.
Outcome: The workflow completes email notifications significantly faster, improving operational efficiency while maintaining reliability.
The concurrency settings are configured from the "Apply to Each" action's settings pane in Power Automate (the equivalent peek-code definition is sketched at the end of this post).

5. Key Considerations and Best Practices
a) Identify Dependencies: Avoid enabling concurrency for workflows with interdependent steps.
b) Service Limits: Check Power Automate's limits to prevent throttling.
c) Monitor Performance: Use Power Automate analytics to monitor workflow performance and adjust settings as needed.
d) Test Before Deployment: Ensure workflows behave as expected under concurrent execution.

Conclusion
Concurrency in Power Automate is a powerful tool for optimizing workflows, especially when handling bulk operations or parallel tasks. By understanding its settings and best practices, you can design workflows that are both efficient and reliable.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
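For reference, this is roughly what the "Apply to each" definition looks like in peek code once concurrency control is turned on with a degree of parallelism of 5 — the action names are placeholders:

"Apply_to_each": {
  "type": "Foreach",
  "foreach": "@outputs('List_rows')?['body/value']",
  "actions": {
    "Send_an_email": { }
  },
  "runtimeConfiguration": {
    "concurrency": {
      "repetitions": 5
    }
  }
}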