Latest Microsoft Dynamics 365 Blogs | CloudFronts

Optimizing Inventory Operations with Microsoft Dynamics 365 Business Central

Managing inventory effectively is essential for any organization aiming to balance stock levels, minimize excess inventory costs, and ensure timely order fulfillment. Microsoft Dynamics 365 Business Central provides a range of tools that simplify and automate inventory control, helping businesses maintain the right stock at the right time. In this post, we'll walk through the key features and planning tools available in Business Central's Inventory Management module.

Pre-requisite:

1. Access the Item List Page
Start by opening the Item List page. This page offers a complete overview of all active items, including quantities on hand, reorder points, and categories. It serves as the foundation for any inventory planning activity.

2. Open an Item Card
Select an item from the list to view its Item Card, where you configure how the system manages, replenishes, and forecasts that product. The setup on this page directly affects how purchase or production orders are generated.

a. Configure Replenishment Method and Reordering Policy
Under the Replenishment tab, you can define how stock for each item should be refilled when levels drop below a specific threshold. Replenishment Methods include:
Lead Time: Set the expected number of days it takes to receive, produce, or assemble an item. This ensures the system plans replenishment activities in advance.
Reordering Policies:

b. Using Stock Keeping Units (SKUs) for Location-Specific Planning
SKUs allow tracking of an item by individual location or variant, enabling businesses to manage stock independently across warehouses or stores. This approach ensures accurate availability data, reduces fulfillment errors, and supports better demand analysis for each location.

c. Demand Forecasting
The Demand Forecast feature in Business Central helps predict future requirements by analyzing past sales and usage patterns. Forecasts can be system-generated or manually adjusted to reflect upcoming promotions, seasonal variations, or expected demand spikes.

d. Requisition (MRP/MPS) Planning
The Requisition Worksheet supports Material Requirements Planning (MRP) and Master Production Scheduling (MPS). It automatically reviews forecasts, current stock, and open orders to suggest what needs to be purchased or produced. The system lists recommendations such as item names, quantities, and suppliers. Once reviewed, click Carry Out Action Messages to create purchase or production orders directly, saving time and minimizing manual work.

e. Aligning with Sales Orders
When a Sales Order is entered, Business Central dynamically recalculates availability. If demand exceeds what was forecasted, the system proposes additional purchase or production orders to prevent shortages and maintain customer satisfaction.

To conclude, Dynamics 365 Business Central simplifies inventory control by automating procurement, forecasting demand, and synchronizing stock levels with actual sales. By using replenishment rules, SKUs, and requisition planning, businesses can improve inventory accuracy, reduce costs, and deliver orders faster, all within a single integrated ERP system (a short AL sketch of the replenishment fields discussed in section a follows at the end of this post). We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.
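For readers who prefer to see the same setup in code, here is a minimal AL sketch (not part of the original post) that sets the replenishment fields discussed above on an item. The codeunit number, item number, and values are illustrative assumptions.

codeunit 50140 "Replenishment Setup Sketch"
{
    // Hypothetical example: configures the replenishment fields discussed above
    // for item '1000'. Object number, item no., and values are assumptions.
    procedure ConfigureReplenishment()
    var
        Item: Record Item;
        LeadTime: DateFormula;
    begin
        if not Item.Get('1000') then
            exit;
        Evaluate(LeadTime, '<5D>'); // five-day lead time
        Item.Validate("Replenishment System", Item."Replenishment System"::Purchase);
        Item.Validate("Reordering Policy", Item."Reordering Policy"::"Fixed Reorder Qty.");
        Item.Validate("Reorder Point", 50);
        Item.Validate("Reorder Quantity", 100);
        Item.Validate("Lead Time Calculation", LeadTime);
        Item.Modify(true);
    end;
}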


Master Guide: Team Foundation Server (TFVC) & Azure DevOps Configuration for Dynamics 365 Finance & Operations

In the world of Dynamics 365 Finance & Operations (D365 F&O), efficient code management isn't just a luxury; it's a critical requirement. Whether you are a seasoned developer or just setting up your first Virtual Machine (VM), correctly configuring Visual Studio with Azure DevOps (Team Foundation Server/TFVC) is the bedrock of a stable development lifecycle. This guide will walk you through the step-by-step configuration to ensure your environment is ready for enterprise-grade development.

1. Why TFVC and Not Git?
While Git is widely adopted across modern software development, Team Foundation Version Control (TFVC) continues to be the preferred version control system for Dynamics 365 Finance & Operations due to its architectural fit.

2. Prerequisites
Before you dive into Visual Studio, ensure you have the following ready:

3. Step-by-Step Configuration

Step A: Connect Visual Studio to Azure DevOps

Step B: The "Golden" Folder Structure
Before mapping, you must define a clean folder structure in your Azure DevOps repository (Source Control Explorer). A standard structure looks like this: (a commonly used layout is sketched at the end of this post).

Step C: Workspace Mapping (The Critical Step)
This is where most errors occur. You must map the server folders (Azure DevOps) to the specific local directories where the D365 runtime looks for code. Note: On some local VHDs or older VMs, the drive letter might be C: or J: instead of K:. Verify your AOSService location before mapping.

Step D: Configuring Dynamics 365 Options
Once mapped, you need to tell Visual Studio to organize new projects correctly.

4. Best Practices for the Development Lifecycle

To conclude, configuring Visual Studio for D365 F&O is a one-time setup that pays dividends in stability. By ensuring your Metadata folder maps to the AOS service directory and your Projects folder maps to your user directory, you create a seamless bridge between your IDE and the D365 runtime. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.
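Because the original folder diagram and mapping table did not survive formatting, here is a hedged sketch of one commonly used layout and workspace mapping for a D365 F&O development VM. The project name, branch names, and local user folder are placeholders and may differ in your environment; the K:\AOSService path matches the drive-letter note in Step C.

$/<ProjectName>
    /Trunk
        /Main
            /Metadata     (X++ models and packages)
            /Projects     (Visual Studio solution and project files)
    /Releases             (optional release branches, same substructure)

Typical workspace mapping on the development VM:
    $/<ProjectName>/Trunk/Main/Metadata   ->  K:\AOSService\PackagesLocalDirectory
    $/<ProjectName>/Trunk/Main/Projects   ->  C:\Users\<YourUser>\Documents\D365Projects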


Sending Emails With Report Attachments via API in Business Central

In many integrations, external systems need to trigger Business Central (BC) to email documents, such as sales order confirmations, invoices, or custom reports, directly to customers. With the BC API page shown below, you can expose an endpoint that receives a Sales Order No. and Customer No., validates both, and then triggers a custom codeunit (SendCustomerEmails) that sends all required reports as email attachments. This approach allows external applications (ERP integrations, e-commerce systems, automation tools) to call BC and initiate document delivery without user interaction.

Steps to achieve the goal

page 50131 "Custom Sales Order API"
{
    ApplicationArea = All;
    APIGroup = 'APIGroup';
    APIPublisher = 'VJ';
    APIVersion = 'v2.0';
    Caption = 'SendAllReportFromCustom';
    DelayedInsert = true;
    EntityName = 'SendAllReportFromCustom';
    EntitySetName = 'SendAllReportFromCustom';
    PageType = API;
    SourceTable = "Sales Header";
    Permissions = tabledata "Sales Header" = rimd;
    ODataKeyFields = "No.";

    layout
    {
        area(Content)
        {
            repeater(General)
            {
                field("No"; DocumentNOL)
                {
                    ApplicationArea = All;

                    trigger OnValidate()
                    var
                        Rec_SO: Record "Sales Header";
                        Rec_SO1: Record "Sales Header";
                    begin
                        if DocumentNOL = '' then
                            Error('"No." cannot be empty.');
                        Clear(Rec_SO);
                        Rec_SO.Reset();
                        Rec_SO.SetRange("Document Type", Rec_SO1."Document Type"::Order);
                        Rec_SO.SetRange("No.", DocumentNOL);
                        if not Rec_SO.FindFirst() then
                            Error('Sales order does not exist in BC');
                    end;
                }
                field("BilltoCustomerNo"; BillToCustomerNo)
                {
                    ApplicationArea = All;

                    trigger OnValidate()
                    var
                        Rec_Customer: Record Customer;
                        Rec_SHG: Record "Sales Header";
                    begin
                        Clear(Rec_Customer);
                        Rec_Customer.Reset();
                        Rec_Customer.SetRange("No.", BillToCustomerNo);
                        if not Rec_Customer.FindFirst() then
                            Error('The customer does not exist in BC')
                        else begin
                            if (DocumentNOL <> '') and (BillToCustomerNo <> '') then begin
                                Clear(Rec_SHG);
                                Rec_SHG.Reset();
                                Rec_SHG.SetRange("Document Type", Rec_SHG."Document Type"::Order);
                                Rec_SHG.SetRange("Bill-to Customer No.", BillToCustomerNo);
                                Rec_SHG.SetRange("No.", DocumentNOL);
                                if Rec_SHG.FindFirst() then
                                    SendEmail.SendAllReports(Rec_SHG)
                                else
                                    Error('No sales order found for the given bill-to customer number %1 and order number %2.', BillToCustomerNo, DocumentNOL);
                            end;
                        end;
                    end;
                }
            }
        }
    }

    var
        DocumentNOL: Code[30];
        BillToCustomerNo: Code[30];
        SendEmail: Codeunit SendCustomerEmails;
}
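Once the page is published, an external system can create a record through Business Central's standard custom-API endpoint pattern, which fires the validation and email logic above. The sketch below is a hedged example call: the tenant, environment, and company IDs and the sample order/customer numbers are placeholders, and the JSON property names assume BC's default camel-casing of the page field names ("No" and "BilltoCustomerNo").

POST https://api.businesscentral.dynamics.com/v2.0/{tenant-id}/{environment-name}/api/VJ/APIGroup/v2.0/companies({company-id})/SendAllReportFromCustom
Content-Type: application/json
Authorization: Bearer {access-token}

{
  "no": "SO-101001",
  "billtoCustomerNo": "10000"
}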
Codeunit to send the email and attach the PDFs

codeunit 50016 SendCustomerEmails
{
    Permissions = tabledata "Sales Header" = rimd, tabledata "Sales Invoice Header" = rimd;

    procedure SendAllReports(var Rec_SH: Record "Sales Header"): Boolean
    var
        TempBlob: Codeunit "Temp Blob";
        outStream: OutStream;
        inStreamVar: InStream;
        EmailCU: Codeunit Email;
        EmailMsg: Codeunit "Email Message";
        Rec_Customer: Record Customer;
        Ref: RecordRef;
    begin
        Rec_Customer.Reset();
        Rec_Customer.SetRange("No.", Rec_SH."Bill-to Customer No.");
        if not Rec_Customer.FindFirst() then
            Error('Customer not found: %1', Rec_SH."Bill-to Customer No.");
        if Rec_Customer."E-Mail" = '' then
            Error('No email address found for customer %1', Rec_Customer."No.");

        // Create the email message (English only)
        EmailMsg.Create(
            Rec_Customer."E-Mail",
            StrSubstNo('Your order confirmation – %1', Rec_SH."No."),
            StrSubstNo('Dear %1, <br><br>Thank you for your order. Attached you will find your order confirmation and related documents.<br><br>Best regards,', Rec_Customer."Name"),
            true);

        // Prepare a record reference for report generation
        Ref.Get(Rec_SH.RecordId);
        Ref.SetRecFilter();

        // Generate first report (e.g. Order Confirmation)
        TempBlob.CreateOutStream(outStream);
        Report.SaveAs(50100, '', ReportFormat::Pdf, outStream, Ref);
        TempBlob.CreateInStream(inStreamVar);
        EmailMsg.AddAttachment('OrderConfirmation_' + Rec_SH."No." + '.pdf', 'application/pdf', inStreamVar);

        // Generate second report (e.g. Invoice or any other report you want)
        TempBlob.CreateOutStream(outStream);
        Report.SaveAs(1306, '', ReportFormat::Pdf, outStream, Ref);
        TempBlob.CreateInStream(inStreamVar);
        EmailMsg.AddAttachment('Invoice_' + Rec_SH."No." + '.pdf', 'application/pdf', inStreamVar);

        // Send the email
        EmailCU.Send(EmailMsg);
        Message('Email with PDF report(s) sent for document No %1', Rec_SH."No.");
        exit(true);
    end;
}

To conclude, this API lets external systems initiate automatic emailing of sales order reports from Business Central. With just two inputs, you can trigger any complex reporting logic encapsulated inside your custom codeunit. I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.


Create records in Dynamics CRM using Microsoft Excel Online 

Importing customer data into Dynamics 365 doesn't have to be complicated. Whether you're migrating from another system or onboarding a large volume of new customers, Microsoft Excel Online provides a quick, user-friendly, and efficient way to create multiple records at once, without any technical setup. In this blog, I'll walk you through a simple step-by-step process to import customer (or any entity) records directly into your Dynamics 365 environment using Excel Online, ensuring clean, fast, and accurate data entry.

Step 1: Go to the home page of the entity whose records you want to create (in my case, it is the Customer entity).

Step 2: On the Active Accounts view (or any view), click Edit Columns and add the columns for the data you want to fill in. (Don't forget to hit the Apply button at the bottom.)

Step 3: Once your view is ready, click the Export to Excel button on the top left and select Open in Excel Online.

Step 4: If you are using a system view, as in this example, you will see existing records in the online Excel sheet. You can clear those records or keep them as is. If you change any existing record, its data will be updated, so you can also use this approach to update existing records in bulk (I will write a separate blog post on updating records; for now, let's focus on creating records).

Step 5: Add the data you want to create to the online Excel sheet. In this example, I am transferring data from a local Excel sheet to the online one.

Step 6: Once you have added your data to the online Excel sheet, hit the Apply button.

Step 7: You will get a popup about your data being submitted for import; hit Track Progress.

Step 8: You will see that your data has been submitted and is parsing. (This can take anywhere from a couple of minutes to hours depending on the amount of data submitted; keep refreshing to see the progress of the records.)

Step 9: Once the import job is completed, you will see how many records were created successfully and how many failed or partially failed. You can open the import job, check the failed entries, correct them, and re-import.

Failed records

All the successfully parsed records will be created in your system.

Importing customer records in Dynamics 365 becomes incredibly seamless with Excel Online. With just a few steps, preparing your view, exporting to Excel, adding your data, and submitting the import, you can create hundreds or even thousands of records in a fraction of the time. This approach not only speeds up data entry but also ensures consistency and reduces manual errors. Hope this helps! 😊

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.


Filtering Dynamics 365 Subgrids Without Relationships: A JavaScript-Only Approach Using setFilterXml

In Microsoft Dynamics 365, subgrids are a powerful way to display related records on a form. But what happens when:
Out-of-the-box, Dynamics 365 doesn't give us many options here. We can select a view, but we cannot apply dynamic filters unless the entities are directly related or the criteria already exist in the view's FetchXML. This is where the JavaScript setFilterXml() API becomes a life-saver. In this article, I'll show you how to filter a subgrid dynamically using JavaScript, even when the subgrid's entity is completely unrelated to the main form entity.

Use Case
Imagine your form has a field called Name, and you want to filter the subgrid so that it shows only records whose Name begins with the same prefix. But there are also records where the lookup column must intentionally be left empty, which would break relationship-based filtering in the subgrid. OOB? Impossible. With JavaScript? Totally doable.

How the JS-based subgrid filtering works
In Dynamics 365, subgrids are rendered as independent UI components inside the form. Even though the form loads first, subgrids load asynchronously in the background. This means the form and its fields may already be available, but the subgrid control might not yet exist, so trying to apply a filter immediately on form load will fail.

Here is the basic structure of a JS function to perform subgrid filtering:

var oAnnualTCVTargetGridFilter = oAnnualTCVTargetGridFilter || {};
oAnnualTCVTargetGridFilter.filterSubgrid = function (executionContext) {
    var formContext = executionContext.getFormContext();
};

To make sure the filter is applied correctly, we follow a three-step workflow:

1. Retry until the subgrid control is loaded (setTimeout): When the script runs, we attempt to retrieve the subgrid control using:

var subgrid = formContext.getControl("tcvtargets");

This control represents the interactive UI component that displays the records for the view. It gives you programmatic access to set filters, refresh the grid, access its view ID, and handle events (in some versions). However, because subgrids load later than the form, this line may return null the first several times. If you proceed at that point, your script will break. So we implement a retry pattern: if the subgrid is not ready, wait 100 ms, try again, and repeat until the control becomes available. This guarantees that the next steps run only when the subgrid is fully loaded.

2. Apply the filter (setFilterXml()): Once the subgrid control is found, we can safely apply a filter. We read the Name field (cf_name) from the main form, apply our filtering logic, construct a FetchXML <filter> element, and pass this XML to the subgrid. This tells Dynamics 365 to apply an additional filter on top of the existing view used by the subgrid. One important thing to note: if the cf_name field is empty, we instead apply a special filter that returns no rows. This ensures the grid displays only relevant and context-driven data.

3. Refresh the subgrid (subgrid.refresh()): After applying the filter XML, the subgrid must be refreshed. Without this call, Dynamics will not re-run the query, meaning your filter won't take effect until the user manually refreshes the subgrid. Refreshing forces the system to re-query the data using the combined view FetchXML plus your custom filter, re-render the grid, and display the filtered results immediately. This gives the user a seamless, dynamic experience where the subgrid shows exactly the records that match the context. A consolidated sketch of the full handler is shown below.
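Putting the three steps together, here is a minimal sketch of the complete handler. It assumes the control name tcvtargets and the field cf_name used in this post, and it uses a simple three-character prefix match purely as example filter logic; adapt the FetchXML condition to your own requirement.

var oAnnualTCVTargetGridFilter = oAnnualTCVTargetGridFilter || {};

oAnnualTCVTargetGridFilter.filterSubgrid = function (executionContext) {
    var formContext = executionContext.getFormContext();

    function applyFilter() {
        // Step 1: the subgrid loads asynchronously, so retry until the control exists.
        var subgrid = formContext.getControl("tcvtargets");
        if (!subgrid) {
            setTimeout(applyFilter, 100);
            return;
        }

        // Step 2: build a FetchXML <filter> from the form's cf_name value.
        var nameAttr = formContext.getAttribute("cf_name");
        var name = nameAttr ? nameAttr.getValue() : null;
        var filterXml;

        if (name) {
            var prefix = name.substring(0, 3); // example logic: match on a three-character prefix
            filterXml =
                "<filter type='and'>" +
                "<condition attribute='cf_name' operator='like' value='" + prefix + "%' />" +
                "</filter>";
        } else {
            // If cf_name is empty, apply a filter that can never match so the grid shows no rows.
            filterXml =
                "<filter type='and'>" +
                "<condition attribute='cf_name' operator='null' />" +
                "<condition attribute='cf_name' operator='not-null' />" +
                "</filter>";
        }

        subgrid.setFilterXml(filterXml); // applied on top of the subgrid's existing view
        subgrid.refresh();               // Step 3: re-run the query so the filter takes effect
    }

    applyFilter();
};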
JS + FetchXML based filtering in action:

Without filtering:

With filtering:

Key Advantages of This Approach

Works even when no relationship exists: you can filter a subgrid even if the target entity has no direct link to the form's main entity. This is extremely useful in cases where the relationship must remain optional or intentionally unpopulated.

Enables dynamic, contextual filtering: filtering logic can be based on form field values, user selections, or business rules.

Filtering on fields not included in the view: since the filtering logic is applied client-side, there is no need to modify or clone the system view just to include filterable fields.

Bypasses limitations of lookup-based relationship filtering: this method works even when the lookup column is intentionally left empty, a scenario where OOB relationship-based filtering fails.

More flexible than traditional view editing: you can apply advanced logic such as prefix matching, conditional filters, or dynamic ranges, things not possible using standard UI-only configuration.

To conclude, filtering subgrids dynamically in Dynamics 365 is not something the platform supports out of the box, especially when the entities are unrelated or when the filter criteria don't exist in the subgrid's original view. However, with a small amount of JavaScript and the setFilterXml() API, you gain complete control over what data appears inside a subgrid, purely based on the context passed from the main form. I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.


Power BI Workspace Security: How to Choose the Right Roles for Your Team

Workspace security is one of the most important parts of managing Power BI in any organization. You might have great reports, well-designed datasets, and a smooth refresh pipeline, but if the wrong people get access to your workspace, things can break quickly. Reports can be overwritten, datasets modified, or confidential information exposed. Power BI uses a clear role-based access model to control who can do what inside a workspace. The only challenge is understanding which role to assign to which user. In this guide, we'll break down the roles in simple terms, explain what they allow, and help you decide which one is appropriate in real situations. The goal is to make workspace security easy, predictable, and mistake-free.

Understanding Power BI Workspace Roles
Power BI provides four primary workspace roles: Admin, Member, Contributor, and Viewer. Each role controls the level of access a person has across datasets, reports, refreshes, and workspace settings. Below is a clear explanation of what each role does.

1. Admin
The Admin role has full control over the workspace. Admins can add or remove users, assign roles, update reports, delete datasets, change refresh settings, and modify workspace configurations. Admins should be limited to your BI team or IT administrators. Giving Admin access to business users often leads to accidental changes or loss of content.

2. Member
Members have high-level access, but not full control. They can publish content, edit reports, modify datasets, schedule refreshes, and share content with others. However, they cannot manage workspace users or update security settings. This role is usually assigned to internal report developers or analysts who actively maintain reports.

3. Contributor
Contributors can create and publish content, refresh datasets, and edit reports they own. They cannot modify or delete items created by others and cannot add or remove users. This role is ideal for team-level contributors, temporary developers, or department users who build reports only for their group.

4. Viewer
Viewers can access and interact with reports but cannot edit or publish anything. They cannot access datasets or modify visuals. This is the safest role and should be assigned to most end-users and leadership teams. Viewers can explore content, use filters and drill features, and export data if the dataset allows it.

Quick Comparison Table

Role        | View Reports | Edit Reports | Publish | Modify Datasets | Add Users | Typical Use
Admin       | Yes          | Yes          | Yes     | Yes             | Yes       | BI Admins
Member      | Yes          | Yes          | Yes     | Yes             | No        | Report Developers
Contributor | Yes          | Their own    | Yes     | Their own       | No        | Team Contributors
Viewer      | Yes          | No           | No      | No              | No        | Consumers

Examples (Finance Department, Sales Team, External Clients): always use Viewer to avoid accidental edits or exposure of internal configurations.

To conclude, Power BI workspace security is simple once you understand how each role works. The key is to assign access based on responsibility, not convenience. Viewers should consume content, Contributors should create their own content, Members should manage reports, and Admins should oversee the entire workspace. Using the right roles helps you protect your data, maintain clean workspaces, and ensure that only the right people can make changes. A well-managed workspace makes your Power BI environment more reliable and easier to scale as your reporting needs grow. I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.


Resource Management in Dynamics 365 Project Operations

Resource Management is at the heart of delivering successful project-based services. In Dynamics 365 Project Operations (D365 PO), it ensures that the right people with the right skills are assigned to the right projects at the right time. Effective resource management boosts utilization, enhances profitability, and drives customer satisfaction. This article explores how D365 PO supports resource allocation, capacity planning, and skills-based matching.

1. Resource Types in D365 PO
In D365 PO, resources can be:

Resource Type | Description
User          | Licensed individual within the system
Contact       | External personnel (e.g., subcontractors)
Generic       | Placeholder resource for planning purposes

Resources are linked to roles, organizations, cost rates, and sales prices.

2. Resource Allocation
Process:
Booking Types:
Each booking is visible in the Team Members section of the project and feeds into utilization reports.

3. Capacity Planning
Capacity planning in D365 PO is about balancing project demands with available workforce capacity.
Key Capabilities:
Project Managers and Resource Managers can proactively manage staffing levels, avoiding burnout or bench time.

4. Skills-Based Matching
Matching resources based on skills, proficiency, and certifications ensures project quality and client satisfaction.
Skill Matching Features:
This structured approach supports fair allocation, talent development, and project success.

5. Impact on Project Execution
Good resource management:
All bookings and allocations are tightly integrated with the WBS tasks, time entry, and financial tracking modules in Dynamics 365 PO.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.


Update Any Number of Entity Records in Dynamics CRM Using Microsoft Excel Online

Posted on December 5, 2025 by Vidit Gholam

There are many ways to update multiple records of a Dynamics CRM entity. In this blog, let's look at one of the easiest and fastest of them: using Excel Online. Let's consider an example: say you have a fixed number of account records and you want to manually update the account number on each of them.

Step 1: Go to the home page of the entity whose records you want to update.

Step 2: On the All Accounts view (or any view), click Edit Columns and add the columns you want to update; in my case, it is Account Number.

Step 3: Once your view is ready, click the Export to Excel button on the top left and select Open in Excel Online.

Step 4: This will open all your accounts in an Excel sheet in a pop-up window.

Step 5: Update the columns you want to change and hit Save (I am adding all the account numbers).

Step 6: You will get a popup about your data being submitted for import; hit Track Progress.

Step 7: You will see that your data has been submitted for updating and is parsing. (This can take anywhere from a couple of minutes to hours depending on the amount of data submitted; keep refreshing to see the progress of the records.)

Step 8: Once the import job is completed, you will see how many records were updated successfully and how many failed or partially failed. You can open the import job, check the failed entries, correct them, and re-import (all my records were successfully updated).

Failed records (sample from some older imports)

All the successfully parsed records will be updated in your system.

Before Update:

After Update:

Hope this helps! 😊

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.


Overcoming Dataverse Connector Limitations: The Power Automate Approach to Export Hidden Tables

Working with the Microsoft Dataverse connector in Power BI is usually straightforward, until you encounter a table that simply refuses to load any rows, even though the data clearly exists in the environment. This happens especially with hidden, virtual, or system-driven tables (e.g. msdyn_businessclosure, msdyn_scheduleboardsetting), which are commonly used in Field Service and Scheduling scenarios. Before jumping to a workaround, it's important to understand why certain Dataverse tables don't load in Power BI, what causes this behavior, and why the standard Dataverse connector may legitimately return zero rows.

Causes

1] The table is a virtual or system table with restricted access: system-managed Dataverse tables like msdyn_businessclosure are not exposed to the Dataverse connector because they support internal scheduling and platform functions.

2] No records exist in the root business unit: data owned by child business units is not visible to Power BI accounts associated with a different BU, resulting in zero rows returned.

3] The table is not included in the standard Dataverse connector: some solution-driven or non-standard tables are omitted from the Dataverse connector's supported list, so Power BI cannot load them.

Solution: Export Dataverse Data Using Power Automate + Excel Sync

Since Power BI can read OneDrive-hosted files, Excel files, and SharePoint-hosted spreadsheets, a suitable workaround is to extract the restricted Dataverse table into Excel using a scheduled Power Automate flow (when the records are few) or a Dataverse-triggered flow (when there are many records and you only want a single one, to avoid pagination).

What it can do:
-> Power Automate can access system-driven tables.
-> Excel files in SharePoint can be refreshed by the Power BI Service.
-> We can bypass connector restrictions entirely.
-> The method works even if entities have hidden metadata or internal platform logic.

This ensures consistent refresh cycles, full visibility of all table rows, and no dependency on Dataverse connector limitations.

Use case

I needed to use the Business Closures table (Dataverse entity: msdyn_businessclosure) for a few calculations and visuals in a Power BI report. However, when I imported it through the Dataverse connector, the table consistently showed zero records, even though the data was clearly present inside Dynamics 365. There are two possible reasons for this:

1] It is a system/platform table: msdyn_businessclosure is a system-managed scheduling table, and system tables are often hidden from external connectors, causing Power BI to return no data.

2] The table is not included in the "standard tables" exposed to Power BI: many internal Field Service and scheduling entities are excluded from the Dataverse connector's metadata, so Power BI cannot retrieve their rows even if they exist.

So here, we fetch the records with a "List rows" action in Power Automate and write them to an Excel file, bypassing the limitations that hide that data without compromising user privileges or security roles; we can also control or filter the rows directly at the source before they reach the Power BI report.

Automation steps

1] Select a suitable trigger to fetch the rows of that entity (Recurrence or Dataverse, whichever is suitable).

2] List the rows from the entity (sort/filter/select/expand as necessary).

3] Perform any preparatory logic (e.g. clearing the existing rows) on the Excel file the data will be written to.

4] For each row in the Dataverse entity, select a primary key (e.g. the GUID), provide the path to the particular Excel file (e.g. SharePoint -> Location -> Document Library -> File Name -> Sheet or Table in the Excel file), and assign the dynamic values of each row to the columns in the Excel file.

5] Once this is done, import it into the Power BI report using suitable Power Query logic in the Advanced Editor, as follows.

a) Loading an Excel file from SharePoint using Web.Contents():

Source = Excel.Workbook(Web.Contents("https://<domain>.sharepoint.com/sites/<Location>/Business%20Closures/msdyn_businessclosures.xlsx"), null, true),

What this step does:
-> Uses Web.Contents() to access an Excel file stored in SharePoint Online.
-> The URL points directly to the Excel file msdyn_businessclosures.xlsx inside the SharePoint site.
-> Excel.Workbook() then reads the file and returns a structured object containing all sheets, tables, and named ranges.

Parameters used: null means no custom options (e.g. column detection rules); true indicates the file has headers (the first row contains column names).

b) Extracting a table named "Table1" from the workbook:

msdyn_businessclosures_Sheet = Source{[Item="Table1", Kind="Table"]}[Data],

This searches inside the Source object (which includes all workbook elements) and looks specifically for the element where Item = "Table1" (the name of the table in the Excel file) and Kind = "Table" (ensuring it selects a table, not a sheet with the same name), then extracts only the Data portion of that table. As a result, we get a Power Query table containing the exact contents of Table1 inside the Excel workbook, to which we can apply further logic (filtering, cleaning, etc.). A consolidated query built from these two steps is sketched below.
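For reference, here is the same logic assembled into a single minimal query. The SharePoint URL placeholders and the table name Table1 are taken from the snippets above; adjust them to your own file and add your own transformation steps afterwards.

let
    // Read the Excel file that the Power Automate flow writes to in SharePoint
    Source = Excel.Workbook(
        Web.Contents("https://<domain>.sharepoint.com/sites/<Location>/Business%20Closures/msdyn_businessclosures.xlsx"),
        null,
        true
    ),
    // Pick the table named "Table1" from the workbook and keep only its data
    BusinessClosures = Source{[Item = "Table1", Kind = "Table"]}[Data]
in
    BusinessClosures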
To conclude, when Dataverse tables refuse to load through the Power BI Dataverse connector, especially system-driven entities like msdyn_businessclosure, the issue is usually rooted in platform-level restrictions, connector limitations, or hidden metadata. Instead of fighting these constraints, offloading the data through Power Automate -> Excel -> Power BI provides a controlled, reliable, and connector-independent integration path. By automating the extraction of Dataverse rows into an Excel file stored in SharePoint or OneDrive, you ensure consistent refresh cycles, full visibility of all table rows, and no dependency on Dataverse connector limitations. This method is simple to build, stable to maintain, and flexible enough to adapt to any Dataverse table, whether standard, custom, or system-managed. For scenarios where Power BI needs insights from hidden or restricted Dataverse tables, this approach remains one of the most practical and dependable solutions. I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.


The New Digital Backbone: How Azure Is Replacing Legacy Middleware Across Global Enterprises

The Integration Shift No Enterprise Can Ignore

For more than a decade, legacy third-party integration platforms served as the backbone of enterprise operations. But in a world being redefined by AI, cloud-native systems, and real-time data, these platforms are no longer keeping pace. Across industries, CIOs and digital leaders are facing the same reality: what was once a dependable foundation has now become a barrier to modern transformation. This is why enterprises around the world are accelerating the shift to Azure Integration Services (AIS), a cloud-native, modular, and future-ready integration backbone. From our field experience, including the recent large-scale migration from TIBCO for Tinius Olsen, one message is clear: modernizing integration is not an IT upgrade. It is a business modernization initiative.

1. Why Integration Modernization Is Now a Business Imperative

Digital systems are more distributed than ever. AI and automation are accelerating. Data volumes have exploded. Customers expect real-time experiences. Yet legacy middleware platforms were built for a world before all of this. The challenges CIOs consistently report include:

• Escalating licensing and maintenance costs: annual renewals, hardware provisioning, and forced upgrades drain budgets.
• Limited elasticity: legacy platforms require you to over-provision capacity "just in case," increasing cost and reducing efficiency.
• Rigid, code-heavy orchestration: every enhancement takes longer, requiring specialized skills.
• Poor monitoring and operational visibility: teams struggle to troubleshoot issues quickly due to decentralized logs.
• Slow deployment cycles: innovation slows down because integration becomes the bottleneck.

This is why the modernization conversation has moved from "Should we?" to "How soon can we?".

2. Why Azure Is Becoming the Digital Backbone for Modern Enterprises

Azure Integration Services brings together a powerful suite of cloud-native capabilities. This is not a one-to-one replacement for middleware. It is an entirely new integration architecture built for the future.

3. What We Learned from the TIBCO → Azure Migration Journey

Across the Tinius Olsen modernization project and similar enterprise engagements, six clear lessons emerged.

1. Cost Optimization Is Real and Immediate
Moving to Azure shifts integration from a heavy fixed-cost model to a lightweight consumption model. Clients consistently see the difference: integration becomes a value driver, not a cost burden.

2. Elastic Scalability Gives Confidence During Peak Loads
Legacy platforms require expensive over-provisioning. Azure scales automatically depending on demand. The result: scalability stops being a constraint and becomes an advantage.

3. Observability Becomes a Competitive Advantage
Azure's built-in monitoring ecosystem dramatically changes operational visibility. Tasks that once required hours of log investigation now take minutes. Root-cause analysis speeds up, uptime improves, and teams can proactively govern critical workflows.

4. Developer Experience Improves Significantly
Modern integration requires both low-code orchestration and pro-code extensibility. Azure enables both through Logic Apps + Functions, letting teams build integrations faster. Developers can finally innovate instead of wrestling with legacy tooling.

5. The Platform Becomes AI- and Data-Ready
Migration to Azure doesn't just replace middleware. It unlocks new modernization pathways. The integration layer becomes a strategic enabler for enterprise-wide transformation.
6. The Strategic Message for CIOs and Digital Leaders
Modernizing integration is not simply about technology replacement. In short, it is about building a future-ready enterprise.

Modernizing Integration Is No Longer Optional

The next decade will be defined by AI-driven systems, composable applications, and hyperautomation. Legacy integration platforms were not built for this future; Azure is. Enterprises that modernize their integration layer today will be the ones that innovate faster, scale smarter, and operate more efficiently tomorrow.

Read the Microsoft-published case study: CloudFronts Modernizes Tinius Olsen with Microsoft Dynamics 365

Talk to a Cloud Architect: discuss your integration modernization roadmap in a 1:1 strategy session.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.

