Category Archives: PowerApps

Implementing a Plugin for Automated Lead Creation in Dynamics 365

Dynamics 365 CRM plugins are a powerful way to enforce business logic on the server, but choosing when and how to use them is just as important as writing the code itself. In one implementation for a Netherlands-based sustainability certification organization, the client needed their certification journey to begin with a custom application entity while still ensuring that applicant and company details were captured as leads for downstream sales and engagement processes. This blog explores how a server-side plugin was used to bridge that gap, reliably creating and associating lead records at runtime while keeping the solution independent of UI behavior and future integrations.

In this scenario, the certification application was the starting point of the business process, but sales and engagement still needed to operate on leads. Simply storing the same information in one place wasn't enough; the system needed a reliable way to translate an application into a lead at the right moment, every time. That transformation logic is neither data storage nor UI behavior; it's core business process logic, which is why it was implemented using a Dynamics 365 plugin.

Scenario: Certification Application Not Flowing into Sales

Users reported the following challenge:

a. A user submits or creates a Certification Application
b. Applicant and company details are captured on a custom entity
c. Sales teams expect a Lead to be created for follow-up
d. No Lead exists unless created manually or through inconsistent automation
e. Application and sales data become disconnected

This breaks the intended business flow, as certification teams and sales teams end up working in parallel systems without a reliable link between applications and leads.

Possible Solution: Handling Lead Creation Through Manual Processes (Original Approach)

Before implementing the plugin, the organization attempted to manage lead creation manually or through disconnected automation.

How It Worked (Initially)

a. A Certification Application was submitted
b. Users reviewed the application
c. The sales team manually created a Lead with applicant/company details
d. They tried to match accounts/contacts manually
e. Both records remained loosely connected

Why This Might Look Reasonable

a. Simple to explain operationally
b. No development effort
c. Works as long as users follow the steps perfectly

The Hidden Problems

1] Inconsistent Data Entry

a. Users forgot to create leads
b. Leads were created with missing fields
c. Duplicate accounts/contacts were created
d. Sales lost visibility into new certification inquiries

2] Broken Cross-Department Workflow

a. Certification team worked in the custom entity
b. Sales team worked in Leads
c. No structural linkage existed between the two
d. Downstream reporting (pipeline, conversion, source tracking) became unreliable

Workaround to This Approach: Use Server-Side Logic Instead of Manual Steps

Practically, the transformation of an application into a lead is business logic, not user behavior. Once that boundary is respected, the solution becomes stable, predictable, and automation-friendly.

Practical Solution: A Server-Side Plugin (Improved Approach)

Instead of depending on people or scattered automation, the lead is created centrally and automatically through a plugin registered on the Certification Application entity.

Why a Plugin?

a. Executes consistently regardless of data source
b. Not tied to form events or UI interactions
c. Can safely check for existing accounts/contacts
d. Ensures one source of truth for lead and application linkage
e. Works for portal submissions, integrations, and bulk imports

This is essential for a client where applications may originate from multiple channels and must feed accurately into the sales funnel.

How the Plugin-Based Solution Works

The solution was implemented using a server-side plugin registered on the Certification Application entity. The plugin executes when a new application is created, retrieves the necessary applicant and organization details, performs basic checks for existing accounts and contacts, creates a Lead using the extracted data, and finally links the Lead back to the originating application record. This ensures that every certification application automatically enters the sales pipeline in a consistent and reliable manner. A sketch of that logic is shown below.
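As a minimal sketch, the plugin below runs on Create of the application record, builds a Lead from a few of its columns, and writes the Lead reference back to the application. The schema names (the new_ application columns and the new_leadid lookup) are hypothetical placeholders, and the account/contact duplicate checks described above are omitted for brevity:

using System;
using Microsoft.Xrm.Sdk;

public class CreateLeadFromApplicationPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        // Run only on Create of the custom application entity.
        if (context.MessageName != "Create" ||
            !context.InputParameters.Contains("Target") ||
            !(context.InputParameters["Target"] is Entity application))
        {
            return;
        }

        // Map applicant and company details onto a new Lead (hypothetical column names).
        var lead = new Entity("lead")
        {
            ["subject"] = "Certification Application",
            ["firstname"] = application.GetAttributeValue<string>("new_applicantfirstname"),
            ["lastname"] = application.GetAttributeValue<string>("new_applicantlastname"),
            ["companyname"] = application.GetAttributeValue<string>("new_companyname")
        };
        Guid leadId = service.Create(lead);

        // Link the new Lead back to the originating application record.
        var update = new Entity(application.LogicalName, application.Id)
        {
            ["new_leadid"] = new EntityReference("lead", leadId)
        };
        service.Update(update);
    }
}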
Implementation Steps (For Developers New to Plugins)

If you're new to Dynamics 365 plugins, the implementation followed these core steps. Once the plugin logic is implemented, build the project to generate the signed assembly. After a successful build:

a. Open the Plugin Registration Tool
b. Connect to the target Dynamics 365 environment
c. Register the compiled assembly
d. Register the plugin step on the Create message of the Certification Application entity
e. Configure the execution stage (typically post-operation) and execution mode (synchronous or asynchronous, depending on business needs)

After registration, every new Certification Application will automatically trigger the plugin, ensuring that a Lead is created and linked without any manual intervention.

To encapsulate, this solution shows why server-side plugins are the right place for core business logic in Dynamics 365. By automatically creating and linking a Lead when a Certification Application is created, the organization removed manual steps, prevented data inconsistencies, and ensured that every application reliably flowed into the sales pipeline.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Browser-Level State Retention in Dynamics 365 CRM: Improving Performance & UX with Session Storage

Dynamics 365 model-driven apps are excellent at storing business data, but not every piece of information belongs in Dataverse. A common design flaw is using Dataverse fields to store temporary UI state: things like selected views, filters, or user navigation preferences. While this works technically, it introduces unnecessary performance overhead and can create incorrect behavior in multi-user environments. In this blog, I'll focus on browser-level retention of CRM UI data using sessionStorage, using subgrid view retention as a practical example for a technology consulting and cybersecurity services firm based in Houston, Texas, USA, specializing in modern digital transformation and enterprise security solutions.

The Real Problem: UI State vs Business Data

Let's separate concerns clearly:

Type          | Example                                | Where it should live
Business data | Status, owner, amounts                 | Dataverse
UI state      | Selected view, filter, scroll position | Browser

Subgrid views fall squarely into the UI state category.

Scenario: Subgrid View Resetting on Navigation

Users reported that a subgrid reverted to its default view whenever they navigated away from a record and returned to it. This breaks user workflow, especially for records that users revisit frequently.

Possible Solution: Persisting UI State in Dataverse (Original Approach)

This approach would attempt to fix the problem by storing the selected subgrid view GUID in a Dataverse field on the parent record. It might look reasonable at first, but it comes with hidden problems:

1] Slower Form Execution
2] Data Model Pollution
3] Incorrect Multi-User Behavior
4] Scalability Issues

In short, Dataverse was doing work it should never have been asked to do.

Workaround to This Approach: Keep UI State in the Browser for That Session

Practically, the selected subgrid view belongs to the user's session, not the record. Once that boundary is respected, the solution becomes much cleaner.

Practical Solution: Browser Session Storage (Improved Approach)

Instead of persisting view selection in Dataverse, we store it locally in the browser using sessionStorage.

Why Session Storage?

sessionStorage is part of the Web Storage API, which provides a way to store key-value pairs in a web browser. Unlike localStorage, which persists data even after the browser is closed, sessionStorage is designed to store data only for the duration of a single session. This means that the data is available as long as the tab or window is open, and it is cleared when the tab or window is closed.

How the Improved Solution Works

1. Store the view locally on subgrid change
2. Restore the view on form/grid load

This ensures that when the user revisits the form, the subgrid opens exactly where they left off. A sketch of both steps follows.
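A minimal sketch of both handlers, assuming a standard model-driven form script and a subgrid with its view selector enabled; the subgrid control name ("Contacts") and the storage key format are hypothetical:

// Register onFormLoad in the form's OnLoad event.

// 1. Store the selected view whenever the subgrid loads (i.e., after a view change).
function onSubgridLoad(executionContext) {
    var formContext = executionContext.getFormContext();
    var grid = formContext.getControl("Contacts"); // hypothetical subgrid name
    // Key the stored view by record id so each record remembers its own view.
    var key = "subgridView_" + formContext.data.entity.getId();
    var view = grid.getViewSelector().getCurrentView(); // { id, name, entityType }
    sessionStorage.setItem(key, JSON.stringify(view));
}

// 2. Restore the stored view when the form loads.
function onFormLoad(executionContext) {
    var formContext = executionContext.getFormContext();
    var grid = formContext.getControl("Contacts");
    if (!grid) { return; }

    var key = "subgridView_" + formContext.data.entity.getId();
    var saved = sessionStorage.getItem(key);
    if (saved) {
        grid.getViewSelector().setCurrentView(JSON.parse(saved));
    }
    // Capture future view changes for this session.
    grid.addOnLoad(onSubgridLoad);
}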
Why This Approach Is Superior

1] Faster Execution
2] Correct User Experience
3] Clean Architecture
4] Zero Backend Impact

When to Use Browser-Level Retention

Use this pattern whenever the information describes how a user is viewing data rather than the data itself: selected views, filters, scroll positions, and similar session-scoped preferences.

To conclude, not all data deserves to live in Dataverse. When you store UI state in the browser instead of the database, you gain speed, correctness, and a cleaner data model. Subgrid view retention is just one example, but the principle applies broadly across Dynamics 365 customizations.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com

Exposing Plugins as Bound Actions for Power Automate Flows: A Practical Procedure for Efficiently Processing Multiple Records

In complex business processes, like calculating commissions or validating data across multiple records, applying the same logic repeatedly in a Power Automate flow can quickly become inefficient and difficult to maintain. A more scalable approach is to encapsulate the logic in a Dataverse plugin, expose it as a bound action, and then call this action from a flow. This method centralizes business rules, reduces redundancy, and improves maintainability. In this post, we'll walk through the steps to implement this approach and examine its advantages over applying the same logic directly within a flow for each individual record. We'll illustrate this with a practical example from a Houston-based technology consulting and cybersecurity services firm that specializes in modern digital transformation and enterprise security solutions.

Step 1: Create the Plugin

The first step is to write a plugin that contains the logic you want to apply to each record. Example: DuplicateCommissionsCounter, sketched below.
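A minimal sketch of what such a plugin could look like, assuming it is registered on a custom bound action on the Invoice table; the commission table (new_commission), its lookup column (new_invoiceid), and the output parameter (DuplicateCount) are hypothetical names:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public class DuplicateCommissionsCounter : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        // For a bound action, Target is a reference to the record the action was invoked on.
        var target = (EntityReference)context.InputParameters["Target"];

        // Count commission rows that reference this invoice (hypothetical table/column names).
        var query = new QueryExpression("new_commission") { ColumnSet = new ColumnSet(false) };
        query.Criteria.AddCondition("new_invoiceid", ConditionOperator.Equal, target.Id);
        int count = service.RetrieveMultiple(query).Entities.Count;

        // Surface the result through a declared output parameter of the action.
        context.OutputParameters["DuplicateCount"] = count;
    }
}

The flow can then read DuplicateCount from the action's outputs instead of re-implementing the counting logic per record.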
Step 2: Expose the Plugin as a Bound Action

Instead of running plugin logic manually for each record, you can register it as a bound action in Dataverse.

Procedure:

1. Create a new custom action bound to the target table (e.g., Invoice).
2. Attach your plugin to this action.

Outcome: This exposes your plugin logic as a reusable, callable bound action. Any process or flow can now invoke it for a specific invoice record.

Step 3: Use Power Automate to Call the Bound Action

Once the plugin is exposed, you can loop through multiple records in a flow and invoke the action for each one (the Dataverse connector's "Perform a bound action" action is the natural fit here). This approach ensures that all complex logic resides in the plugin, while the flow orchestrates which records need processing.

Advantages Over Logic Directly in the Flow

Centralizing the logic in a plugin means the business rules live in one place, the flow stays small, and changes don't require editing every flow that uses the logic.

To conclude, exposing plugins as bound actions is a robust, maintainable way to apply complex logic across multiple records in Dataverse. It allows Power Automate flows to focus on orchestration rather than logic execution, leading to cleaner, faster, and easier-to-manage solutions.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com

Overcoming Dataverse Connector Limitations: The Power Automate Approach to Exporting Hidden Tables

Working with the Microsoft Dataverse connector in Power BI is usually straightforward, until you encounter a table that simply refuses to load any rows, even though the data clearly exists in the environment. This happens especially with hidden, virtual, or system-driven tables (e.g. msdyn_businessclosure, msdyn_scheduleboardsetting), which are commonly used in Field Service and Scheduling scenarios. Before jumping to a workaround, it's important to understand why certain Dataverse tables don't load in Power BI, what causes this behavior, and why the standard Dataverse connector may legitimately return zero rows.

Causes

1] The Table Is a Virtual or System Table with Restricted Access
System-managed Dataverse tables like msdyn_businessclosure are not exposed to the Dataverse connector because they support internal scheduling and platform functions.

2] No Records Exist in the Root Business Unit
Data owned by child business units is not visible to Power BI accounts associated with a different BU, resulting in zero rows returned.

3] The Table Is Not Included in the Standard Dataverse Connector
Some solution-driven or non-standard tables are omitted from the Dataverse connector's supported list, so Power BI cannot load them.

Solution: Export Dataverse Data Using Power Automate + Excel Sync

Since Power BI can read:

-> OneDrive-hosted files
-> Excel files
-> SharePoint-hosted spreadsheets

…a suitable workaround is to extract the restricted Dataverse table into Excel using a Power Automate flow, either scheduled (when the records are few) or Dataverse-triggered (when there are many records and you only want a single one, to avoid pagination).

What it can do:

-> Power Automate can access system-driven tables.
-> Excel files in SharePoint can be refreshed by the Power BI Service.
-> We can bypass connector restrictions entirely.
-> The method works even if entities have hidden metadata or internal platform logic.

This ensures:

-> Consistent refresh cycles
-> Full visibility of all table rows
-> No dependency on Dataverse connector limitations

Use case

I needed to use the Business Closures table (Dataverse entity: msdyn_businessclosure) for a few calculations and visuals in a Power BI report. However, when I imported it through the Dataverse connector, the table consistently showed zero records, even though the data was clearly present inside Dynamics 365. There are two possible reasons for this:

1] It Is a System/Platform Table
msdyn_businessclosure is a system-managed scheduling table, and system tables are often hidden from external connectors, causing Power BI to return no data.

2] The Table Is Not Included in the "Standard Tables" Exposed to Power BI
Many internal Field Service and scheduling entities are excluded from the Dataverse connector's metadata, so Power BI cannot retrieve their rows even if they exist.

So here, we fetch the records via "List rows" in Power Automate and write them to an Excel file to bypass the limitations that hinder the exposure of that data, without compromising user privileges or system roles. We can also control or filter the rows directly at the source, before they reach the Power BI report.

Automation steps

1] Select a suitable trigger to fetch the rows of that entity (recurring or Dataverse, whichever is suitable).
2] List the rows from the entity (sort/filter/select/expand as necessary).
3] Perform any preparatory logic (e.g. clearing the existing rows) on the Excel file where the data will be written.
4] For each row in the Dataverse entity, select a primary key (e.g. the GUID), provide the path to the particular Excel file (e.g. SharePoint -> Location -> Document Library -> File Name -> Sheet or Table in the Excel File), and assign the dynamic values of each row to the columns in the Excel file.
5] Once this is done, import it into the Power BI report using suitable Power Query logic in the Advanced Editor, as follows.

a) Loading an Excel file from SharePoint using Web.Contents():

Source = Excel.Workbook(Web.Contents("https://<domain>.sharepoint.com/sites/<Location>/Business%20Closures/msdyn_businessclosures.xlsx"), null, true),

What this step does:

-> Uses Web.Contents() to access an Excel file stored in SharePoint Online.
-> The URL points directly to the Excel file msdyn_businessclosures.xlsx inside the SharePoint site.
-> Excel.Workbook() then reads the file and returns a structured object containing all sheets, tables, and named ranges.

Parameters used: the second argument (null) leaves Excel.Workbook's optional useHeaders setting unset (the named table carries its own headers), and the third argument (true) sets the optional delayTypes parameter, deferring type conversion.

b) Extracting a table named "Table1" from the workbook:

msdyn_businessclosures_Sheet = Source{[Item="Table1", Kind="Table"]}[Data],

This searches inside the Source object (which includes all workbook elements) and looks specifically for an element where Item = "Table1" (the name of the table in the Excel file) and Kind = "Table" (ensuring it selects a table, not a sheet with the same name), and extracts only the Data portion of that table. As a result, we get a Power Query table containing the exact contents of Table1 inside the Excel workbook, to which we can further apply our logic: filter, clean, and so on. Put together, the full query looks like the sketch below.
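For reference, the two steps combined into a complete query (the file URL and table name are taken from the example above; the final step name is arbitrary):

let
    Source = Excel.Workbook(
        Web.Contents("https://<domain>.sharepoint.com/sites/<Location>/Business%20Closures/msdyn_businessclosures.xlsx"),
        null,
        true
    ),
    msdyn_businessclosures_Sheet = Source{[Item = "Table1", Kind = "Table"]}[Data]
in
    msdyn_businessclosures_Sheet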
To conclude, when Dataverse tables refuse to load through the Power BI Dataverse connector, especially system-driven entities like msdyn_businessclosure, the issue is usually rooted in platform-level restrictions, connector limitations, or hidden metadata. Instead of modifying these constraints, offloading the data through Power Automate → Excel → Power BI provides a controlled, reliable, and connector-independent integration path. By automating the extraction of Dataverse rows into an Excel file stored in SharePoint or OneDrive, you ensure consistent refresh cycles, full visibility of all rows, and no dependency on connector limitations.

This method is simple to build, stable to maintain, and flexible enough to adapt to any Dataverse table, whether standard, custom, or system-managed. For scenarios where Power BI needs insights from hidden or restricted Dataverse tables, this approach remains one of the most practical and dependable solutions.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Managing Post-Delivery Service and Repairs Using Cases in Dynamics 365 CRM

Why This Matters

Imagine you've just delivered an order, and now there's a service issue or repair request from the customer. What's the best way to track and resolve that? That's where Cases come in. This blog walks you through how your company can use Cases in Dynamics 365 CRM to efficiently handle post-delivery service and repair requests, directly linked to the order fulfillment process for better visibility and control. Let's break it down step by step.

Step 1: Navigate to Cases from an Order Fulfillment Record

Start by opening the Order Fulfillment record. Click on the "Related" dropdown and select "Cases" from the list. This takes you directly to all service cases related to that order.

Step 2: Create a New Case

Click on the "New Case" button in the Cases tab. A Quick Create: Case form appears, where you fill in the key details. Optional fields like Contact, Origin, Entitlement, and others can be filled in if needed. You can also include details such as First Response By, Resolve By, and Description, depending on your business requirements. Once done, hit Save and Close.

Step 3: View All Related Cases

After saving, you'll see a list of all Cases associated with the order under the Case Associated View. This makes it easy to monitor all service activity related to an order at a glance.

Step 4: Manage Case Details

Click on any Case Title to open the full Case record and work the case from there.

Step 5: Monitor Service Performance

Navigate to Dashboards > Service and Repair to track ongoing Case performance. This allows your company's service team to monitor progress, manage workload, and identify recurring product or fulfillment issues.

To conclude, by following this process, your company ensures that every post-delivery service or repair request is captured, tracked, and resolved, while keeping everything connected to the original order. It's simple, efficient, and fully integrated into Dynamics 365 CRM.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Automatically Update Lookup Fields in Dynamics 365 Using Power Automate: From Custom Tables to Standard Entities

Imagine this: you update a product's purchase date in a registration record and, boom, a related case automatically gets refreshed with the accurate "Purchased From" lookup. Saves time, reduces errors, and keeps everything in sync without you lifting a finger. Let's walk through how to make that happen using Power Automate.

The goal: when a Product Registration's cri_purchasedat field is changed, the system will retrieve the related "Purchased From" record and update any linked Case(s) with the appropriate lookup reference. Let's break down the step-by-step process of how this is done in Power Automate.

Step 1: Trigger the Flow When Purchase Date Changes

Flow trigger: When a row is added, modified, or deleted (Dataverse). This setup ensures that our flow only fires when that specific date field is modified.

Step 2: Pull in the "Purchased From" Record

Next, use List rows on the "Purchased From" table with a FetchXML query. We're searching for a record whose name matches the updated cri_purchasedat. Set Row Count to 1, since we expect only one match.

Step 3: Identify Any Linked Case Records

Add another List rows action, this time on the Cases table (internally the incident table in Dataverse), using a FetchXML query to match records where cri_productregistrationid equals the ID of the Product Registration being modified. This step is critical because it gives us the list of Case records we need to update, based on the link with the modified product registration.

<fetch>
  <entity name="incident">
    <attribute name="incidentid" />
    <attribute name="title" />
    <attribute name="cf_actualpurchasedfrom" />
    <filter>
      <condition attribute="cri_productregistrationid" operator="eq" value="@{triggerOutputs()?['body/cri_productregistrationid']}" />
    </filter>
  </entity>
</fetch>

Step 4: Validate the "Purchased From" Match

Before updating anything, we add a Condition control to ensure that our previously fetched Purchased From record exists and is unique. Why? Because if there's no match (or multiple matches), we don't want to update the Cases blindly. We check whether the length of the returned rows equals 1. If true → move forward with updates. If false → stop or handle the exception accordingly. A sketch of the condition expression is shown below.
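The length check can be written as a single expression over the List rows output; the action name 'List_Purchased_From_rows' below is a hypothetical placeholder for whatever your List rows step is called:

length(outputs('List_Purchased_From_rows')?['body/value'])

In the Condition control, compare this expression against 1.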
This kind of validation builds guardrails into your automation, making it more robust and preventing incorrect data from being applied across multiple records.

Step 5: Update the Linked Cases

After confirming a valid match, the flow loops through each related Case and updates the "Actual Purchased From" field with the correct value from the matched record, ensuring accurate linkage based on the latest update. Once this step runs, your automation is complete, with Cases now intelligently updated in real time based on Product Registration changes.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.

How We Built Smart Pitch — and What We Learned Along the Way

In today's world, AI is no longer a luxury; it's a necessity for driving smarter decisions, faster innovation, and personalized experiences. We came up with our requirements for AI to support the conversion from MQL (Marketing Qualified Lead) to SQL (Sales Qualified Lead).

How did the idea originate?

In our organization, whenever a prospect reaches out to us, we search for company information like company size, revenue, location, industry type, contact person details, designation, decision-maker, and LinkedIn profile. This information helps the Sales team prepare better and deliver a stronger pitch by understanding the customer before the call. The same kind of research happens again during the MQL to SQL stage; it helps the Sales team convert the prospect into a client and increases our chances of winning the deal. Earlier, this entire process was manual and time-consuming. So, we decided to automate it with an AI agent that can gather this information for us in just a few minutes.

Implementation approach

After the project was approved internally, we started exploring how to make it happen. Initially, we didn't know where or how to start. During our research, we came across Copilot Studio, which allows us to build custom agents from scratch based on our needs. We learned about Copilot Studio's capabilities and began building our agent. We named it Elevator Pitch.

Feedback on Version 1 led to the idea for Version 2, which would automate more steps and also pull information from the internet. With Version 2, the agent generates the company and contact information document with a single click on MQL to SQL, within minutes, something that earlier used to take hours or even a full day.

Live demo in Zurich & New York

On 22nd May 2025, we had an event scheduled at the Microsoft office in Zurich with one of our clients, where we shared the Buchi journey with CloudFronts. We discussed how we collaborated to connect their multiple systems and prepared their data for insights and AI initiatives. At the same event, we had the opportunity to demonstrate our Smart Pitch product, which caught the audience's attention. It was a proud moment for us to showcase our first AI product at the Microsoft office, delivered within just a few months of hard work. Our second opportunity came on 6th June 2025 in New York, at the AI Community Conference, where we presented again in front of a global audience.

What's Next in Version 3

So far, we have built this solution using Microsoft's inbuilt Knowledge Center, the ChatGPT API, SharePoint, company websites, and Dataverse. Since we were working with both structured and unstructured data, we faced some inconsistencies and performance issues. This led us to reflect and identify the need for Version 3 (V3). The development of Smart Pitch V3 is currently in progress; we'll share our thoughts once it goes live. A demo video has also been shared, so you can see how smart and fast our agent is at delivering useful insights.

Delivering Answers in Minutes, Thanks to Smart Pitch

I'd also like to share a quick story. One day, our Practice Manager was on leave, and we received a prospect inquiry about Project Operations to Business Central (PO-BC) pricing. I wasn't sure where that information was stored, and suddenly our CEO asked me for the details. I was a bit stressed, unsure where to search or how to respond. Then I decided to ask our Smart Pitch agent the same question. To my surprise, the agent quickly gave me the exact information I needed. It was a big relief, and I was able to share the details with our CEO in just a few minutes, without even knowing where the document was uploaded.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Automated Email Reminders Based on Date Fields in Power Automate for Dynamics 365 CRM

Managing reminders and deadlines can be tricky, especially when you're juggling multiple tasks in Dynamics 365. But what if you could set up automatic email reminders based on specific dates? In this guide, I'll show you how to use Power Automate with D365 CRM to send automatic email reminders when certain dates are entered, and follow up at 7, 14, 21, and 28-day intervals if another related date field isn't filled. By the end of this post, you'll learn how to create a simple workflow that keeps your team on track by sending timely reminders when needed.

The Use Case: Automatic Email Reminders for Unfilled Dates

Imagine this: your Project Manager fills in a date for a project milestone. But if the next milestone isn't updated after a set period, your system will automatically send reminder emails to the right people. This saves your team from having to manually follow up and ensures that important dates are never overlooked. Here's how you can set it up using Power Automate, step by step:

1. Select the "When a row is added, modified or deleted" trigger from the Dataverse connector, set the Change Type to Modified, choose the Order Fulfillments table, set the Scope to Organization, specify the columns cf_submittalprotocolprocess and cf_initialshopdrawingssubmitted to trigger only on changes to those fields, and optionally use the Filter Rows field to apply additional conditions if needed.

2. Click the ellipsis (three dots) on the top-right corner of the trigger card, select Settings, scroll to the Trigger Conditions section, and enter the following expression to ensure the flow only triggers when cf_initialshopdrawingssubmitted is not empty and the Submittal/Protocol Process equals the required option:

@and(
    not(empty(triggerOutputs()?['body/cf_initialshopdrawingssubmitted'])),
    equals(triggerOutputs()?['body/cf_submittalprotocolprocess'], 979570001)
)

3. Add the "Get a row by ID" action from the Dataverse connector, set the Table Name to Opportunities, and use the dynamic value Opportunity (Value) for the Row ID to retrieve the corresponding Opportunity record related to the modified Order Fulfillment.

4. Add another "Get a row by ID" action, set the Table Name to Order Fulfillments, and use the dynamic value Order Fulfillment for the Row ID to retrieve the full details of the modified Order Fulfillment record.

5. Add a "Compose" action named Comments for Email, and provide a formatted list including Project Manager (Order Fulfillment), Opportunity Contact, Secondary Contact, a static email (e.g., testblog@gmail.com), and again Project Manager (Order Fulfillment) as an example.

6. Add a "Filter array" action, set the From field to a coalesce(…) expression generating a list of participants, and in Basic Mode set the condition to "Item is not equal to null" to remove null entries. The coalesce(createArray(…)) expression below conditionally constructs an array of activity parties based on field availability (Opportunity Contact, Secondary Contact, Owner ID, Project Manager), falling back to a default address (CRMAdmin@gmail.com) if the Project Manager is null.
coalesce(
    createArray(
        if(
            not(equals(outputs('Get_Opportunity_by_ID')?['body/_cf_opportunitycontact_value'], null)),
            json(concat(
                '{"participationtypemask": 2, "partyid@odata.bind": "contacts(', outputs('Get_Opportunity_by_ID')?['body/_cf_opportunitycontact_value'], ')"}'
            )),
            null
        ),
        if(
            not(equals(triggerOutputs()?['body/_ow_secondarycontact_value'], null)),
            json(concat(
                '{"participationtypemask": 2, "partyid@odata.bind": "contacts(', triggerOutputs()?['body/_ow_secondarycontact_value'], ')"}'
            )),
            null
        ),
        if(
            not(equals(triggerOutputs()?['body/_ownerid_value'], null)),
            json('{"participationtypemask": 3, "addressused": "testblog@gmail.com"}'),
            null
        ),
        json(concat(
            '{"participationtypemask": 4, "partyid@odata.bind": "systemusers(', outputs('Getting_Order_Fulfillment_by_ID')?['body/_cf_projectmanager_value'], ')"}'
        )),
        if(
            not(equals(outputs('Getting_Order_Fulfillment_by_ID')?['body/_cf_projectmanager_value'], null)),
            json(concat(
                '{"participationtypemask": 1, "partyid@odata.bind": "systemusers(', outputs('Getting_Order_Fulfillment_by_ID')?['body/_cf_projectmanager_value'], ')"}'
            )),
            json('{"participationtypemask": 1, "addressused": "CRMAdmin@gmail.com"}')
        )
    )
)

7. Switch to Advanced Mode in the Filter array and use the expression @not(equals(item(), null)) for better control over null filtering of the dynamic participant list. I also used a "Compose" action to extract the Project Manager's ID, which is used in the Filter array step, and saved the body of the Filter array in a new "Compose" action named "Email Participants."

8. Add a "Compose" action to generate the email body using HTML formatting, apply an if() expression to dynamically insert the recipient's name, and use concat() to list the required items for fabrication. Below is the expression I used to render the recipient's name:

if(
    not(equals(outputs('Get_Opportunity_by_ID')?['body/_cf_opportunitycontact_value@OData.Community.Display.V1.FormattedValue'], null)),
    concat('<strong>', outputs('Get_Opportunity_by_ID')?['body/_cf_opportunitycontact_value@OData.Community.Display.V1.FormattedValue'], '</strong>'),
    null
)

9. After completing the previous steps, add a parallel branch with four parallel actions, each configured to send the email after a delay of 7, 14, 21, and 28 days, respectively. After each delay (for example, the 7-day one), retrieve the corresponding Order Fulfillment record by ID and check whether the Drawing Approval Date and Redline Issued Date fields have been updated since the initial trigger.

10. Add the "Add a new row" action from the Dataverse connector, set the Table Name to Email Messages, populate the Activity Parties field using the dynamic output outputs('Email_Participants'), and map the Description field similarly with the output value containing the email body content. Set the Regarding (Order Fulfillments) field to the dynamic value Order Fulfilment (cf_orderfulfillment) to associate the email with the corresponding Order Fulfillment record.
This is how the final Power Automate flow will look. You can repeat the same pattern for the 14-, 21-, and 28-day branches.

In CRM, the trigger starts when the Initial Shop Drawings field contains a date and the Submittal/Protocol Process equals the specified option. If both the Redline Int Received and Drawing Approved Date fields remain empty after 7, 14, 21, and 28 days, CRM will automatically send a follow-up email on each of those days.

To conclude, setting up this automatic reminder system in Power Automate for D365 CRM will help your team stay on top of project milestones, reduce manual follow-ups, and make sure nothing gets overlooked. It's a simple yet effective way to automate reminders and keep everyone informed without any extra effort.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


How to Perform Data Transformation in Microsoft Dataverse

Microsoft Dataverse is a powerful data platform that supports secure and scalable data storage for business applications. However, raw data imported into Dataverse often needs transformation (cleaning, reshaping, filtering, or merging) to make it useful and reliable for apps and analytics. In this blog, we'll show you how to apply transformations to data before or after it reaches Dataverse using tools like Power Query, Dataflows, and business rules, ensuring you always work with clean, structured, and actionable data.

What Is Data Transformation in Dataverse, and Why Does It Matter?

Data transformation refers to modifying data's structure, content, or format before or after it's stored in Dataverse. This includes cleaning, reshaping, filtering, and merging data so that applications and reports can rely on it.

Step-by-Step Guide: Connecting a Database to Dataverse

Step 1: Open Power Apps and select the proper environment.
Step 2: Open Dataflows in Power Apps and create a new Dataflow.
Step 3: Connect to the database using the SQL Server Database connector.
Step 4: Add the required credentials to make the connection between the database and Dataverse.
Step 5: Add the transformations in Power Query (a small sketch follows these steps).
Step 6: Map the columns properly and identify the unique ID of the table.
Step 7: Set the scheduled refresh and publish the Dataflow.
Step 8: Once the Dataflow is published, the table is visible in Power Apps.
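As an example of Step 5, here is a small Power Query script of the kind you might write in the Dataflow's editor; the server, database, table, and column names are hypothetical:

let
    // Hypothetical SQL source; replace with your own server and database.
    Source = Sql.Database("myserver.database.windows.net", "SalesDb"),
    Customers = Source{[Schema = "dbo", Item = "Customer"]}[Data],
    // Keep only rows that have an email address.
    Filtered = Table.SelectRows(Customers, each [Email] <> null and [Email] <> ""),
    // Trim stray whitespace from names before mapping to Dataverse columns.
    Cleaned = Table.TransformColumns(Filtered, {{"FullName", Text.Trim, type text}})
in
    Cleaned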
To conclude, transforming data in Dataverse is key to building reliable and high-performing applications. Whether using Power Query, calculated columns, or Power Automate, you can ensure your data is clean, structured, and actionable.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com. Ready to improve your Dataverse data quality? Start with a simple dataflow or calculated column today, and empower your business applications with better, transformed data.



