Category Archives: Blog
Deploying AI Agents with Agent Bricks: A Modular Approach
In today’s rapidly evolving AI landscape, organizations are seeking scalable, secure, and efficient ways to deploy intelligent agents. Agent Bricks offers a modular, low-code approach to building AI agents that are reusable, compliant, and production-ready. This blog post explores the evolution of AI leading to Agentic AI, the prerequisites for deploying Agent Bricks, a real-world HR use case, and a glimpse into the future with the ‘Ask Me Anything’ enterprise AI assistant.

Prerequisites to Deploy Agent Bricks

Use Case: HR Knowledge Assistant
HR departments often manage numerous SOPs scattered across documents and portals. Employees struggle to find accurate answers, leading to inefficiencies and inconsistent responses. Agent Bricks enables the deployment of a Knowledge Assistant that reads HR SOPs and answers employee queries like ‘How many casual leaves do I get?’ or ‘Can I carry forward sick leave?’.

Business Impact:

Agent Bricks in Action: Deployment Steps
Figure 1: Add data to the volumes
Figure 2: Select the Agent Bricks module
Figure 3: Click on the Create Agent option to deploy your agent
Figure 4: Click on the Update Agent option to update your deployed agent

Agent Bricks in Action: Demo
Figure 1: Response to a question based on data present in the dataset
Figure 2: Response to a question asked outside the data present in the dataset

To conclude, Agent Bricks empowers organizations to build intelligent, modular AI agents that are secure, scalable, and impactful. Whether you’re starting with a small HR assistant or scaling to enterprise-wide AI agents, the time to act is now. AI is no longer just a tool – it’s your next teammate. Start building your AI workforce today with Agent Bricks.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com. Start your AI journey today!
Databricks vs Azure Data Factory: When to Use Which in ETL Pipelines
Introduction: Two Powerful Tools, One Common Question
If you work in data engineering, you’ve probably faced this question: should I use Azure Data Factory or Databricks for my ETL pipeline? Both tools can move and transform data, but they serve very different purposes. Understanding where each tool fits can help you design cleaner, faster, and more cost-effective data pipelines. Let’s explore how these two Azure services complement each other rather than compete.

What Is Azure Data Factory (ADF)?
Azure Data Factory is a data orchestration service. It’s designed to move, schedule, and automate data workflows between systems. Think of ADF as the “conductor of your data orchestra” — it doesn’t play the instruments itself, but it ensures everything runs in sync.
Key Capabilities of ADF:
Best For:

What Is Azure Databricks?
Azure Databricks is a data processing and analytics platform built on Apache Spark. It’s designed for complex transformations, data modeling, and machine learning on large-scale data. Think of Databricks as the “engine” that processes and transforms the data your ADF pipelines deliver.
Key Capabilities of Databricks:
Best For:

ADF vs Databricks: A Detailed Comparison

| Feature | Azure Data Factory (ADF) | Azure Databricks |
| --- | --- | --- |
| Primary Purpose | Orchestration and data movement | Data processing and advanced transformations |
| Core Engine | Integration Runtime | Apache Spark |
| Interface Type | Low-code (GUI-based) | Code-based (Python, SQL, Scala) |
| Performance | Limited by Data Flow engine | Distributed and scalable Spark clusters |
| Transformations | Basic mapping and joins | Complex joins, ML models, and aggregations |
| Data Handling | Batch-based | Batch and streaming |
| Cost Model | Pay per pipeline run and Data Flow activity | Pay per cluster usage (compute time) |
| Versioning and Debugging | Visual monitoring and alerts | Notebook history and logging |
| Integration | Best for orchestrating multiple systems | Best for building scalable ETL within pipelines |

In simple terms, ADF moves the data, while Databricks transforms it deeply.

When to Use ADF
Use Azure Data Factory when:
Example: Copying data daily from Salesforce and SQL Server into Azure Data Lake.

When to Use Databricks
Use Databricks when:
Example: Transforming millions of sales records into curated Delta tables with customer segmentation logic.

When to Use Both Together
In most enterprise data platforms, ADF and Databricks work together.
Typical Flow:
This hybrid approach combines the automation of ADF with the computing power of Databricks.
Example Architecture: ADF → Databricks → Delta Lake → Synapse → Power BI
This is a standard enterprise pattern for modern data engineering.
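To make the hand-off concrete, here is a minimal PySpark sketch of the kind of notebook an ADF pipeline might trigger in this pattern. The lake path, table names, and column names are illustrative assumptions, not references to a specific project.

```python
# Minimal sketch of a Databricks notebook that an ADF pipeline could trigger.
# The lake path, column names, and target table below are illustrative assumptions.
from pyspark.sql import functions as F

# Read the raw files that ADF copied into the data lake (hypothetical path)
raw_sales = spark.read.format("parquet").load("/mnt/datalake/raw/sales/")

# Apply the heavier transformation work that basic ADF mappings are not suited for
curated_sales = (
    raw_sales
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("net_amount", F.col("quantity") * F.col("unit_price"))
)

# Persist the result as a Delta table for downstream Synapse / Power BI consumption
curated_sales.write.format("delta").mode("overwrite").saveAsTable("curated.sales")
```

In the hybrid pattern above, ADF would typically invoke a notebook like this through its Databricks Notebook activity, keeping orchestration in ADF and the heavy transformation work in Spark.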
Cost Considerations
Using ADF for orchestration and Databricks for processing ensures you only pay for what you need.

Best Practices
Azure Data Factory and Azure Databricks are not competitors. They are complementary tools that together form a complete ETL solution. Understanding their strengths helps you design data pipelines that are reliable, scalable, and cost-efficient.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.
Designing a Clean Medallion Architecture in Databricks for Real Reporting Needs
Most reporting problems do not come from Power BI or visualization tools. They come from how the data is organized before it reaches the reporting layer. A lot of teams try to push raw CRM tables, ERP extracts, finance dumps, and timesheet files directly into Power BI models. This usually leads to slow refreshes, constant model changes, broken relationships, and inconsistent metrics across teams.

A clean Medallion Architecture solves these issues by giving your data a predictable, layered structure inside Databricks. It gives reporting teams clarity, improves performance, and reduces rework across projects. Below is a senior-level view of how to design and implement it in a way that supports long-term reporting needs.

Why the Medallion Architecture Matters
The Medallion model gets discussed often, but in practice the value comes from discipline and consistency. The real benefit is not the three layers. It is the separation of responsibilities between the layers. This separation ensures data engineers, analysts, and reporting teams do not step on each other’s work. You avoid the common trap of mixing raw, cleaned, and aggregated data in the same folder or the same table, which eventually turns the lake into a “large folder with files,” not a structured ecosystem.

Bronze Layer: The Record of What Actually Arrived
The Bronze layer should be the most predictable part of your data platform. It contains raw data as received from CRM, ERP, HR, finance, or external systems. From a senior perspective, the Bronze layer has two primary responsibilities: capturing the data exactly as it arrived, and recording where it came from. This means storing load timestamps, file names, and source identifiers. The Bronze layer is not the place for business logic; any adjustment here will compromise traceability. A good Bronze table lets you answer questions like: “What exactly did we receive from Business Central on the 7th of this month?” If your Bronze layer cannot answer this, it needs improvement.

Silver Layer: Apply Business Logic Once, Use It Everywhere
The Silver layer transforms raw data into standardized, trusted datasets. A senior approach focuses on solving root issues here, not patching them later. Typical responsibilities include cleaning, standardizing types and keys, and de-duplicating data from different sources. This is where you remove all the “noise” that Power BI models should never see. Silver is also where cross-functional logic goes – for example, aligning customer or project identifiers across CRM, finance, and timesheet data so they can be joined reliably. Once the Silver layer is stable, the Gold layer becomes significantly simpler.
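To illustrate the Bronze and Silver responsibilities described above, here is a minimal PySpark sketch. The landing path, schema names, and columns are assumptions for the example, not a prescribed naming standard.

```python
# Minimal sketch of Bronze and Silver processing in Databricks.
# Schema names, the landing path, and column names are illustrative assumptions.
from pyspark.sql import functions as F

# Bronze: land the data exactly as received, plus lineage metadata
bronze_df = (
    spark.read.format("csv")
    .option("header", "true")
    .load("/mnt/raw/business_central/projects/")
    .withColumn("_load_timestamp", F.current_timestamp())
    .withColumn("_source_file", F.input_file_name())
)
bronze_df.write.format("delta").mode("append").saveAsTable("bronze.bc_projects")

# Silver: apply business rules once - typing, de-duplication, standardized keys
silver_df = (
    spark.table("bronze.bc_projects")
    .dropDuplicates(["project_no"])
    .withColumn("start_date", F.to_date("start_date"))
    .withColumn("project_no", F.upper(F.trim("project_no")))
)
silver_df.write.format("delta").mode("overwrite").saveAsTable("silver.projects")
```

The Gold layer, covered next, would then aggregate these Silver tables into the business-facing metrics that reports actually consume.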
Gold Layer: Data Structured for Reporting and Performance
The Gold layer represents the presentation layer of the Lakehouse. It contains curated datasets designed around reporting and analytics use cases, rather than reflecting how data is stored in source systems. A senior-level Gold layer focuses on how the business consumes the data: Gold tables should reflect business definitions, not technical ones. If your teams rely on metrics like utilization, revenue recognition, resource cost rates, or customer lifetime value, those calculations should live here. Gold is also where performance tuning matters. Partitioning, Z-ordering, and optimizing Delta tables significantly improve refresh times and Power BI performance.

A Real-World Example
In projects where CRM, Finance, HR, and Project data come from different systems, reporting becomes difficult when each department pulls data separately. A Medallion architecture simplifies this: each source lands in Bronze, is standardized in Silver, and is combined into shared Gold tables. The reporting team consumes these Gold tables directly in Power BI with minimal transformations.

Why This Architecture Works for Reporting Teams
To conclude, a clean Medallion Architecture is not about technology – it’s about structure, discipline, and clarity. When implemented well, it removes daily friction between engineering and reporting teams. It also creates a strong foundation for governance, performance, and future scalability.

Databricks makes the Medallion approach easier to maintain, especially when paired with Delta Lake and Unity Catalog. Together, these pieces create a data platform that can support both operational reporting and executive analytics at scale.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.
Gouge Linen and Garment Services Partners with CloudFronts for Dynamics 365 Sales and Customer Service
We are delighted to announce that Gouge Linen and Garment Services has partnered with CloudFronts for the implementation of Dynamics 365 Sales and Customer Service.

Founded in May 1945, Gouge Linen and Garment Services is a leading 100% Australian-owned industrial laundry provider, strengthening its role as a trusted partner across the healthcare, aged care, hospitality, manufacturing, and food production sectors. Backed by state-of-the-art facilities and a dedicated logistics network, the company delivers high-quality linen, garment, mat, and towel services with exceptional reliability and efficiency. Gouge remains committed to sustainability through advanced water-saving systems, energy-efficient operations, and responsible business practices, reinforcing its strong reputation for service excellence and community impact. Learn more about them at https://www.gouge.com.au/

Gouge Linen currently relies on Excel-based quoting, manual freight costing, and fragmented data stored across ABS/Oracle BI. This lack of integration limits visibility across departments, creating inefficiencies, delays, and scalability challenges. CloudFronts will deploy Microsoft Dynamics 365 Sales with a CPQ (Configure, Price, Quote) process to automate quote generation and help the team streamline sales, service, and route costing workflows. The solution will deliver real-time visibility into inventory, production costs, and freight data, enabling faster, data-driven decisions across the organization. The implementation will reduce quote turnaround time from days to under an hour, improve coordination between sales, operations, and logistics teams, and enhance the overall customer experience.

On this occasion, Priyesh Wagh, Practice Manager at CloudFronts, stated: “We look forward to partnering with the Gouge team as we initiate the first phase of their Dynamics 365 Sales and Customer Service implementation, which will evolve into a comprehensive quotation engine. This initial rollout will establish a strong foundation for an integrated sales-to-service platform, with the potential to extend into billing capabilities in the future. We’re excited to collaborate with the Gouge Linen and Garment Services team on this transformation journey.”

About CloudFronts
CloudFronts is a global AI-First Microsoft Solutions & Databricks Partner for Business Applications, Data & AI, helping teams and organizations worldwide solve their complex business challenges with Microsoft Cloud, AI, and Azure Integration Services. We have a global presence with offices in the U.S., Singapore & India. Since its inception in 2012, CloudFronts has successfully served 200+ small and medium-sized clients across North America, Europe, Australia, MENA, the Maldives & India, with diverse experience in sectors ranging from Professional Services, Financial Services, Manufacturing, Retail, and Logistics/SCM to Non-profits.

Please feel free to connect with us at transform@cloudFronts.com
Optimizing Inventory Operations with Microsoft Dynamics 365 Business Central
Managing inventory effectively is essential for any organization aiming to balance stock levels, minimize excess inventory costs, and ensure timely order fulfillment. Microsoft Dynamics 365 Business Central provides a range of tools that simplify and automate inventory control – helping businesses maintain the right stock at the right time. In this post, we’ll walk through the key features and planning tools available in Business Central’s Inventory Management module.

Pre-requisite:

1. Access the Item List Page
Start by opening the Item List page. This page offers a complete overview of all active items, including quantities on hand, reorder points, and categories. It serves as the foundation for any inventory planning activity.

2. Open an Item Card
Select an item from the list to view its Item Card, where you configure how the system manages, replenishes, and forecasts that product. The setup on this page directly affects how purchase or production orders are generated.

a. Configure Replenishment Method and Reordering Policy
Under the Replenishment tab, you can define how stock for each item should be refilled when levels drop below a specific threshold. Replenishment methods include Purchase, Prod. Order, and Assembly.
Lead Time: Set the expected number of days it takes to receive, produce, or assemble an item. This ensures the system plans replenishment activities in advance.
Reordering Policies: Fixed Reorder Qty., Maximum Qty., Order, and Lot-for-Lot.

b. Using Stock Keeping Units (SKUs) for Location-Specific Planning
SKUs allow tracking of an item by individual location or variant, enabling businesses to manage stock independently across warehouses or stores. This approach ensures accurate availability data, reduces fulfillment errors, and supports better demand analysis for each location.

c. Demand Forecasting
The Demand Forecast feature in Business Central helps predict future requirements by analyzing past sales and usage patterns. Forecasts can be system-generated or manually adjusted to reflect upcoming promotions, seasonal variations, or expected demand spikes.

d. Requisition (MRP/MPS) Planning
The Requisition Worksheet supports Material Requirements Planning (MRP) and Master Production Scheduling (MPS). It automatically reviews forecasts, current stock, and open orders to suggest what needs to be purchased or produced. The system lists recommendations such as item names, quantities, and suppliers. Once reviewed, click Carry Out Action Messages to create purchase or production orders directly — saving time and minimizing manual work.

e. Aligning with Sales Orders
When a Sales Order is entered, Business Central dynamically recalculates availability. If demand exceeds what was forecasted, the system proposes additional purchase or production orders to prevent shortages and maintain customer satisfaction.

To conclude, Dynamics 365 Business Central simplifies inventory control by automating procurement, forecasting demand, and synchronizing stock levels with actual sales. By using replenishment rules, SKUs, and requisition planning, businesses can improve inventory accuracy, reduce costs, and deliver orders faster – all within a single integrated ERP system.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com
Master Guide: Team Foundation Server (TFVC) & Azure DevOps Configuration for Dynamics 365 Finance & Operations
In the world of Dynamics 365 Finance & Operations (D365 F&O), efficient code management isn’t just a luxury – it’s a critical requirement. Whether you are a seasoned developer or just setting up your first Virtual Machine (VM), correctly configuring Visual Studio with Azure DevOps (Team Foundation Server/TFVC) is the bedrock of a stable development lifecycle. This guide will walk you through the step-by-step configuration to ensure your environment is ready for enterprise-grade development.

1. Why TFVC and Not Git?
While Git is widely adopted across modern software development, Team Foundation Version Control (TFVC) continues to be the preferred version control system for Dynamics 365 Finance & Operations due to its architectural fit.

2. Prerequisites
Before you dive into Visual Studio, ensure you have the following ready:

3. Step-by-Step Configuration

Step A: Connect Visual Studio to Azure DevOps

Step B: The “Golden” Folder Structure
Before mapping, you must define a clean folder structure in your Azure DevOps repository (Source Control Explorer). A standard structure looks like this:

Step C: Workspace Mapping (The Critical Step)
This is where most errors occur. You must map the server folders (Azure DevOps) to the specific local directories where the D365 runtime looks for code. Note: on some local VHDs or older VMs, the drive letter might be C: or J: instead of K:. Verify your AOSService location before mapping. A typical folder structure and mapping are sketched below.
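To make Steps B and C concrete, here is one commonly used layout, shown purely as an illustrative sketch. The project and branch names are assumptions, and the drive letter and user folder must be verified against your own VM before mapping.

```text
Azure DevOps (server path)                   Local folder on the development VM
$/D365FO-Project/Trunk/Main/Metadata   ->    K:\AosService\PackagesLocalDirectory
$/D365FO-Project/Trunk/Main/Projects   ->    C:\Users\<your-user>\Documents\Visual Studio Projects
```

The key point, as the conclusion below reiterates, is that the Metadata folder maps to the AOS service directory (PackagesLocalDirectory), while the Projects folder maps to a directory under your user profile.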
Step D: Configuring Dynamics 365 Options
Once mapped, you need to tell Visual Studio to organize new projects correctly.

4. Best Practices for the Development Lifecycle
To conclude, configuring Visual Studio for D365 F&O is a one-time setup that pays dividends in stability. By ensuring your Metadata maps to the AOS service directory and your Projects map to your user directory, you create a seamless bridge between your IDE and the D365 runtime.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com
Sending Emails With Report Attachments via API in Business Central
In many integrations, external systems need to trigger Business Central (BC) to email documents—such as sales order confirmations, invoices, or custom reports—directly to customers. With the BC API page shown below, you can expose an endpoint that receives a Sales Order No. and Customer No., validates both, and then triggers a custom codeunit (SendCustomerEmails) that sends all required reports as email attachments. This approach allows external applications (ERP integrations, e-commerce systems, automation tools) to call BC and initiate document delivery without user interaction.

Steps to Achieve the Goal

```al
page 50131 "Custom Sales Order API"
{
    ApplicationArea = All;
    APIGroup = 'APIGroup';
    APIPublisher = 'VJ';
    APIVersion = 'v2.0';
    Caption = 'SendAllReportFromCustom';
    DelayedInsert = true;
    EntityName = 'SendAllReportFromCustom';
    EntitySetName = 'SendAllReportFromCustom';
    PageType = API;
    SourceTable = "Sales Header";
    Permissions = tabledata "Sales Header" = rimd;
    ODataKeyFields = "No.";

    layout
    {
        area(Content)
        {
            repeater(General)
            {
                field("No"; DocumentNOL)
                {
                    ApplicationArea = All;

                    trigger OnValidate()
                    var
                        Rec_SO: Record "Sales Header";
                        Rec_SO1: Record "Sales Header";
                    begin
                        // Validate that the sales order exists before accepting the value
                        if DocumentNOL = '' then
                            Error('"No." cannot be empty.');

                        Clear(Rec_SO);
                        Rec_SO.Reset();
                        Rec_SO.SetRange("Document Type", Rec_SO1."Document Type"::Order);
                        Rec_SO.SetRange("No.", DocumentNOL);
                        if not Rec_SO.FindFirst() then
                            Error('Sales order does not exist in BC');
                    end;
                }
                field("BilltoCustomerNo"; BillToCustomerNo)
                {
                    ApplicationArea = All;

                    trigger OnValidate()
                    var
                        Rec_Customer: Record Customer;
                        Rec_SHG: Record "Sales Header";
                    begin
                        Clear(Rec_Customer);
                        Rec_Customer.Reset();
                        Rec_Customer.SetRange("No.", BillToCustomerNo);
                        if not Rec_Customer.FindFirst() then
                            Error('The customer does not exist in BC')
                        else begin
                            // When both keys are present and match an order, trigger the email codeunit
                            if (DocumentNOL <> '') and (BillToCustomerNo <> '') then begin
                                Clear(Rec_SHG);
                                Rec_SHG.Reset();
                                Rec_SHG.SetRange("Document Type", Rec_SHG."Document Type"::Order);
                                Rec_SHG.SetRange("Bill-to Customer No.", BillToCustomerNo);
                                Rec_SHG.SetRange("No.", DocumentNOL);
                                if Rec_SHG.FindFirst() then
                                    SendEmail.SendAllReports(Rec_SHG)
                                else
                                    Error('No sales order found for the given bill-to customer number %1 and order number %2.', BillToCustomerNo, DocumentNOL);
                            end;
                        end;
                    end;
                }
            }
        }
    }

    var
        DocumentNOL: Code[30];
        BillToCustomerNo: Code[30];
        SendEmail: Codeunit SendCustomerEmails;
}
```
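Before looking at the codeunit, here is a sketch of how an external system might call this custom API page over OData. The environment name, company ID, access token, and order values are placeholders, and the example assumes an OAuth token has already been obtained.

```python
# Hypothetical call to the custom API page defined above.
# Environment, company ID, token, and order values are placeholders.
import requests

BASE_URL = (
    "https://api.businesscentral.dynamics.com/v2.0/"
    "<environment-name>/api/VJ/APIGroup/v2.0"
)
COMPANY_ID = "<company-guid>"

payload = {
    "No": "S-ORD101001",           # Sales Order No. to send
    "BilltoCustomerNo": "10000",   # Bill-to customer expected on that order
}

response = requests.post(
    f"{BASE_URL}/companies({COMPANY_ID})/SendAllReportFromCustom",
    json=payload,
    headers={"Authorization": "Bearer <access-token>"},
    timeout=60,
)
response.raise_for_status()  # surfaces the validation errors raised in OnValidate
print("Email triggered for sales order", payload["No"])
```

The URL segments api/VJ/APIGroup/v2.0 and the SendAllReportFromCustom entity set come directly from the APIPublisher, APIGroup, APIVersion, and EntitySetName properties on the page above.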
Codeunit to send the email and attach the PDFs

```al
codeunit 50016 SendCustomerEmails
{
    Permissions = tabledata "Sales Header" = rimd,
                  tabledata "Sales Invoice Header" = rimd;

    procedure SendAllReports(var Rec_SH: Record "Sales Header"): Boolean
    var
        TempBlob: Codeunit "Temp Blob";
        outStream: OutStream;
        inStreamVar: InStream;
        EmailCU: Codeunit Email;
        EmailMsg: Codeunit "Email Message";
        Rec_Customer: Record Customer;
        Ref: RecordRef;
    begin
        Rec_Customer.Reset();
        Rec_Customer.SetRange("No.", Rec_SH."Bill-to Customer No.");
        if not Rec_Customer.FindFirst() then
            Error('Customer not found: %1', Rec_SH."Bill-to Customer No.");

        if Rec_Customer."E-Mail" = '' then
            Error('No email address found for customer %1', Rec_Customer."No.");

        // Create the email message (English only)
        EmailMsg.Create(
            Rec_Customer."E-Mail",
            StrSubstNo('Your order confirmation – %1', Rec_SH."No."),
            StrSubstNo('Dear %1, <br><br>Thank you for your order. Attached you will find your order confirmation and related documents.<br><br>Best regards,', Rec_Customer."Name"),
            true);

        // Prepare a record reference for report generation
        Ref.Get(Rec_SH.RecordId);
        Ref.SetRecFilter();

        // Generate first report (e.g. Order Confirmation)
        TempBlob.CreateOutStream(outStream);
        Report.SaveAs(50100, '', ReportFormat::Pdf, outStream, Ref);
        TempBlob.CreateInStream(inStreamVar);
        EmailMsg.AddAttachment('OrderConfirmation_' + Rec_SH."No." + '.pdf', 'application/pdf', inStreamVar);

        // Generate second report (e.g. Invoice or any other report you want)
        TempBlob.CreateOutStream(outStream);
        Report.SaveAs(1306, '', ReportFormat::Pdf, outStream, Ref);
        TempBlob.CreateInStream(inStreamVar);
        EmailMsg.AddAttachment('Invoice_' + Rec_SH."No." + '.pdf', 'application/pdf', inStreamVar);

        // Send the email
        EmailCU.Send(EmailMsg);
        Message('Email with PDF report(s) sent for document No %1', Rec_SH."No.");
        exit(true);
    end;
}
```

To conclude, this API lets external systems initiate automatic emailing of sales order reports from Business Central. With just two inputs, you can trigger any complex reporting logic encapsulated inside your custom codeunit.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.
Create records in Dynamics CRM using Microsoft Excel Online
Importing customer data into Dynamics 365 doesn’t have to be complicated. Whether you’re migrating from another system or onboarding a large volume of new customers, using Microsoft Excel Online provides a quick, user-friendly, and efficient way to create multiple records at once – without any technical setup. In this blog, I’ll walk you through a simple step-by-step process to import customer (or any entity) records directly into your Dynamics 365 environment using Excel Online, ensuring clean, fast, and accurate data entry.

Let’s say you want to import customer records, or records for any other entity, into Dynamics CRM. In this blog I will show you how you can import multiple customer records into your Dynamics 365 environment simply by using Microsoft Excel Online.

Step 1: Go to the home page of the entity whose records you want to create (in my case, it is the Customer entity).

Step 2: On the Active Accounts view (or any view), click on Edit Columns and add the columns for the data you want to fill in. (Don’t forget to hit the Apply button at the bottom.)

Step 3: Once your view is ready, click on the Export to Excel button at the top left and select Open in Excel Online.

Step 4: If you are using a system view, as in this example, you will see existing records in the online Excel sheet; you can clear those records or keep them as is. If you change any existing record, it will update the data of that record, so you can also use this approach to update existing records in bulk. (I will write a separate blog post about updating records; for now, let’s focus on creating records.)

Step 5: You can then add the data you want to create to the online Excel sheet. In this example, I am transferring data from a local Excel sheet to the online one.

Step 6: Once you have added your data to the online Excel sheet, hit the Apply button.

Step 7: You will get a popup about your data being submitted for import; hit Track Progress.

Step 8: You will see that your data has been submitted and is parsing. (It can take anywhere from a couple of minutes to a few hours depending on the amount of data you have submitted – keep refreshing to see the progress of the records.)

Step 9: Once the import job is completed, you will see how many records were created successfully and how many failed or partially failed. You can open the import job, check the failed entries, correct them, and re-import the failed records. All successfully parsed records will be created in your system.

Importing customer records in Dynamics 365 becomes incredibly seamless with Excel Online. With just a few steps – preparing your view, exporting to Excel, adding your data, and submitting the import – you can create hundreds or even thousands of records in a fraction of the time. This approach not only speeds up data entry but also ensures consistency and reduces manual errors. Hope this helps! 😊

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.
Filtering Dynamics 365 Subgrids Without Relationships: A JavaScript-Only Approach Using setFilterXml
In Microsoft Dynamics 365, subgrids are a powerful way to display related records on a form. But what happens when the subgrid’s entity has no relationship to the form’s entity, or the filter criteria you need don’t exist in the view? Out of the box, Dynamics 365 doesn’t give us many options here. We can select a view, but we cannot apply dynamic filters unless the entities are directly related or the criteria already exist in the view’s FetchXML. This is where the JavaScript setFilterXml() API becomes a life-saver. In this article, I’ll show you how to filter a subgrid dynamically using JavaScript — even when the subgrid’s entity is completely unrelated to the main form entity.

Use Case
Imagine your form has a field called Name, and you want to filter the subgrid so that it shows only records whose Name begins with the same prefix. But the subgrid’s entity is not related to the form’s entity, and there are also records where the lookup column must intentionally remain empty, which would break relationship-based filtering in the subgrid. OOB? Impossible. With JavaScript? Totally doable.

How the JS-Based Subgrid Filtering Works
In Dynamics 365, subgrids are rendered as independent UI components inside the form. Even though the form loads first, subgrids load asynchronously in the background. The form and its fields may already be available, but the subgrid control might not yet exist, so trying to apply a filter immediately on form load will fail.

Here is the basic structure of a JS function to perform the subgrid filtering:

```javascript
// Namespace object so the function does not pollute the global scope
var oAnnualTCVTargetGridFilter = oAnnualTCVTargetGridFilter || {};

// Registered on the form's OnLoad event; receives the execution context
oAnnualTCVTargetGridFilter.filterSubgrid = function (executionContext) {
    var formContext = executionContext.getFormContext();
};
```

To make sure the filter is applied correctly, we follow a three-step workflow:

1. Retry until the subgrid control is loaded (setTimeout) – When the script runs, we attempt to retrieve the subgrid control using:

var subgrid = formContext.getControl("tcvtargets");

This control represents the interactive UI component that displays the records for the view. It gives you programmatic access to:
-> Set filters
-> Refresh the grid
-> Access its view ID
-> Handle events (in some versions)

However, because subgrids load later than the form, this call may return null the first several times. If you proceed at that point, your script will break. So we implement a retry pattern: if the subgrid is not ready, wait 100 ms -> try again -> repeat until the control becomes available. This guarantees that our next steps run only when the subgrid is fully loaded.

2. Apply the filter (setFilterXml()) – Once the subgrid control is found, we can safely apply a filter. We then build our filtering logic and use it in the FetchXML query:
-> Read the field Name (cf_name) from the main form and design the logic
-> Construct a FetchXML <filter> element
-> Pass this XML to the subgrid using setFilterXml()

This tells Dynamics 365 to apply an additional filter on top of the existing view used by the subgrid. A few important things to note: if the cf_name field is empty, we instead apply a special filter that returns no rows. This ensures the grid displays only relevant and context-driven data.

3. Refresh the subgrid (subgrid.refresh()) – After applying the filter XML, the subgrid must be refreshed. Without this call, Dynamics will not re-run the query, meaning your filter won’t take effect until the user manually refreshes the subgrid.
Refreshing forces the system to:
-> Re-query data using the combined view FetchXML + your custom filter
-> Re-render the grid
-> Display the filtered results immediately

This gives the user a seamless, dynamic experience where the subgrid shows exactly the records that match the context.

JS + FetchXML-based filtering in action: screenshots compare the subgrid without filtering and with filtering applied.

Key Advantages of This Approach

Works Even When No Relationship Exists
You can filter a subgrid even if the target entity has no direct link to the form’s main entity. This is extremely useful in cases where the relationship must remain optional or intentionally unpopulated.

Enables Dynamic, Contextual Filtering
We can design filtering logic based on form field values, user selections, or business rules.

Filtering on Fields Not Included in the View
Since the filtering logic is applied client-side, there is no need to modify or clone the system view just to include filterable fields.

Bypasses Limitations of Lookup-Based Relationship Filtering
This method works even when the lookup column is intentionally left empty, which is a scenario where OOB relationship-based filtering fails.

More Flexible Than Traditional View Editing
You can apply advanced logic such as prefix matching, conditional filters, or dynamic ranges—things not possible using standard UI-only configuration.

To conclude, filtering subgrids dynamically in Dynamics 365 is not something the platform supports out of the box – especially when the entities are unrelated or when the filter criteria don’t exist in the subgrid’s original view. However, with a small amount of JavaScript and the setFilterXml() API, you gain complete control over what data appears inside a subgrid, purely based on the context passed from the main form.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.
Power BI Workspace Security: How to Choose the Right Roles for Your Team
Workspace security is one of the most important parts of managing Power BI in any organization. You might have great reports, well-designed datasets, and a smooth refresh pipeline – but if the wrong people get access to your workspace, things can break quickly. Reports can be overwritten, datasets modified, or confidential information exposed.

Power BI uses a clear role-based access model to control who can do what inside a workspace. The only challenge is understanding which role to assign to which user. In this guide, we’ll break down the roles in simple terms, explain what they allow, and help you decide which one is appropriate in real situations. The goal is to make workspace security easy, predictable, and mistake-free.

Understanding Power BI Workspace Roles
Power BI provides four primary workspace roles: Admin, Member, Contributor, and Viewer. Each role controls the level of access a person has across datasets, reports, refreshes, and workspace settings. Below is a clear explanation of what each role does.

1. Admin
The Admin role has full control over the workspace. Admins can add or remove users, assign roles, update reports, delete datasets, change refresh settings, and modify workspace configurations. Admins should be limited to your BI team or IT administrators. Giving Admin access to business users often leads to accidental changes or loss of content.

2. Member
Members have high-level access, but not full control. They can publish content, edit reports, modify datasets, schedule refreshes, and share content with others. However, they cannot manage workspace users or update security settings. This role is usually assigned to internal report developers or analysts who actively maintain reports.

3. Contributor
Contributors can create and publish content, refresh datasets, and edit reports they own. They cannot modify or delete items created by others and cannot add or remove users. This role is ideal for team-level contributors, temporary developers, or department users who build reports only for their group.

4. Viewer
Viewers can access and interact with reports but cannot edit or publish anything. They cannot access datasets or modify visuals. This is the safest role and should be assigned to most end-users and leadership teams. Viewers can explore content, use filters and drill features, and export data if the dataset allows it.

Quick Comparison Table

| Role | View Reports | Edit Reports | Publish | Modify Datasets | Add Users | Typical Use |
| --- | --- | --- | --- | --- | --- | --- |
| Admin | Yes | Yes | Yes | Yes | Yes | BI Admins |
| Member | Yes | Yes | Yes | Yes | No | Report Developers |
| Contributor | Yes | Their own | Yes | Their own | No | Team Contributors |
| Viewer | Yes | No | No | No | No | Consumers |

Examples
Finance Department
Sales Team
External Clients
Always use Viewer to avoid accidental edits or exposure of internal configurations. If you manage role assignments across many workspaces, they can also be automated through the Power BI REST API; a short sketch follows at the end of this post.

To conclude, Power BI workspace security is simple once you understand how each role works. The key is to assign access based on responsibility, not convenience. Viewers should consume content, Contributors should create their own content, Members should manage reports, and Admins should oversee the entire workspace. Using the right roles helps you protect your data, maintain clean workspaces, and ensure that only the right people can make changes. A well-managed workspace makes your Power BI environment more reliable and easier to scale as your reporting needs grow.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.
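As referenced above, here is a minimal sketch of assigning a workspace role programmatically with the Power BI REST API. The workspace ID, user email, and access token are placeholders, and your tenant must allow the calling identity to use the Power BI API.

```python
# Minimal sketch: assign a Power BI workspace (group) role via the REST API.
# Workspace ID, user email, and token below are placeholders.
import requests

WORKSPACE_ID = "<workspace-guid>"
ACCESS_TOKEN = "<aad-access-token>"   # token with Power BI API permissions

payload = {
    "identifier": "analyst@contoso.com",   # user to add (hypothetical)
    "principalType": "User",
    "groupUserAccessRight": "Viewer",      # Admin | Member | Contributor | Viewer
}

response = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/users",
    json=payload,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
response.raise_for_status()
print("Viewer access granted to", payload["identifier"])
```

Scripting assignments this way keeps role choices consistent with the table above when you manage many workspaces, rather than relying on ad hoc manual changes.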
