Category Archives: D365 General
Filtering Dynamics 365 Subgrids Without Relationships: A JavaScript-Only Approach Using setFilterXml
In Microsoft Dynamics 365, subgrids are a powerful way to display related records on a form. But what happens when the subgrid's entity has no relationship to the form's entity, or the filter criteria don't already exist in the view? Out-of-the-box, Dynamics 365 doesn't give us many options here. We can select a view, but we cannot apply dynamic filters unless the entities are directly related or the criteria already exist in the view's FetchXML. This is where the JavaScript setFilterXml() API becomes a life-saver. In this article, I'll show you how to filter a subgrid dynamically using JavaScript, even when the subgrid's entity is completely unrelated to the main form entity.

Use Case
Imagine your form has a field called Name, and you want to filter the subgrid so that it shows only records whose Name begins with the same prefix. There are also records where the lookup column needs to be empty on purpose, which breaks relationship-based filtering in the subgrid. OOB? Impossible. With JavaScript? Totally doable.

How the JS based subgrid filtering works
In Dynamics 365, subgrids are rendered as independent UI components inside the form. Even though the form loads first, subgrids load asynchronously in the background. The form and its fields may already be available, but the subgrid control might not yet exist, so trying to apply a filter immediately on form load will fail.

Here is the basic structure of a JS function to perform subgrid filtering:

```javascript
var oAnnualTCVTargetGridFilter = oAnnualTCVTargetGridFilter || {};
oAnnualTCVTargetGridFilter.filterSubgrid = function (executionContext) {
    var formContext = executionContext.getFormContext();
};
```

To make sure the filter is applied correctly, we follow a three-step workflow:

1. Retry until the subgrid control is loaded (setTimeout) – When the script runs, we attempt to retrieve the subgrid control using:

```javascript
var subgrid = formContext.getControl("tcvtargets");
```

This control represents the interactive UI component that displays the records for the view. It gives you programmatic access to:
-> Set filters
-> Refresh the grid
-> Access its view ID
-> Handle events (in some versions)
However, because subgrids load later than the form, this line may return null the first several times. If you proceed at that point, your script will break. So we implement a retry pattern: if the subgrid is not ready, wait 100 ms -> try again -> repeat until the control becomes available. This guarantees that the next steps run only when the subgrid is fully loaded.

2. Apply the filter (setFilterXml()) – Once the subgrid control is found, we can safely apply a filter. We then build the filtering logic and use it in the FetchXML query:
-> Read the field Name (cf_name) from the main form and design the logic
-> Construct a FetchXML <filter> element
-> Pass this XML to the subgrid using setFilterXml()
This tells Dynamics 365 to apply an additional filter on top of the existing view used by the subgrid. One important thing to note: if the cf_name field is empty, we instead apply a special filter that returns no rows. This ensures the grid displays only relevant and context-driven data.

3. Refresh the subgrid (subgrid.refresh()) – After applying the filter XML, the subgrid must be refreshed. Without this call, Dynamics will not re-run the query, meaning your filter won't take effect until the user manually refreshes the subgrid. A minimal end-to-end sketch of the handler is shown below.
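Putting the three steps together, here is a minimal sketch of the handler. The subgrid name "tcvtargets" and the field cf_name come from the example above; the primary key column cf_tcvtargetid and the "no rows" trick are assumptions for illustration, so replace the names with your own schema and add proper XML encoding of the value before using this in production.

```javascript
var oAnnualTCVTargetGridFilter = oAnnualTCVTargetGridFilter || {};

oAnnualTCVTargetGridFilter.filterSubgrid = function (executionContext) {
    var formContext = executionContext.getFormContext();
    var subgrid = formContext.getControl("tcvtargets");

    // Step 1: the subgrid loads asynchronously, so retry every 100 ms until the control exists.
    if (!subgrid) {
        setTimeout(function () {
            oAnnualTCVTargetGridFilter.filterSubgrid(executionContext);
        }, 100);
        return;
    }

    var name = formContext.getAttribute("cf_name").getValue();
    var filterXml;

    if (name) {
        // Step 2: prefix match on the subgrid entity's cf_name column.
        // (Encode the value before embedding it in XML if it can contain special characters.)
        filterXml = "<filter type='and'>" +
                        "<condition attribute='cf_name' operator='begins-with' value='" + name + "' />" +
                    "</filter>";
    } else {
        // If the form field is empty, use a condition that can never match (assumed primary key
        // column) so the grid shows no rows.
        filterXml = "<filter type='and'>" +
                        "<condition attribute='cf_tcvtargetid' operator='eq' value='00000000-0000-0000-0000-000000000000' />" +
                    "</filter>";
    }

    subgrid.setFilterXml(filterXml);

    // Step 3: refresh so the grid re-runs the query with the extra filter applied.
    subgrid.refresh();
};
```

Register the function on the form's OnLoad event (and on the OnChange event of cf_name if the filter should react to edits), remembering to pass the execution context as the first parameter.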
Refreshing forces the system to:
-> Re-query data using the combined view FetchXML + your custom filter
-> Re-render the grid
-> Display the filtered results immediately
This gives the user a seamless, dynamic experience where the subgrid shows exactly the records that match the context.

JS + FetchXML based filtering in action – the screenshots compare the subgrid without filtering and with filtering applied.

Key Advantages of This Approach
Works Even When No Relationship Exists – You can filter a subgrid even if the target entity has no direct link to the form's main entity. This is extremely useful in cases where the relationship must remain optional or intentionally unpopulated.
Enables Dynamic, Contextual Filtering – We can design filtering logic based on form field values, user selections, or business rules.
Filtering on Fields Not Included in the View – Since the filtering logic is applied client-side, there is no need to modify or clone the system view just to include filterable fields.
Bypasses Limitations of Lookup-Based Relationship Filtering – This method works even when the lookup column is intentionally left empty, which is a scenario where OOB relationship-based filtering fails.
More Flexible Than Traditional View Editing – You can apply advanced logic such as prefix matching, conditional filters, or dynamic ranges, things that are not possible using standard UI-only configuration.

To conclude, filtering subgrids dynamically in Dynamics 365 is not something the platform supports out-of-the-box, especially when the entities are unrelated or when the filter criteria don't exist in the subgrid's original view. However, with a small amount of JavaScript and the setFilterXml() API, you gain complete control over what data appears inside a subgrid, purely based on the context passed from the main form. I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.
Overcoming Dataverse Connector Limitations: The Power Automate Approach to Export Hidden
Working with the Microsoft Dataverse connector in Power BI is usually straightforward, until you encounter a table that simply refuses to load any rows, even though the data clearly exists in the environment. This happens especially with hidden, virtual, or system-driven tables (e.g. msdyn_businessclosure, msdyn_scheduleboardsetting) which are commonly used in Field Service and Scheduling scenarios. Before jumping to a workaround, it's important to understand why certain Dataverse tables don't load in Power BI, what causes this behavior, and why the standard Dataverse connector may legitimately return zero rows.

Causes –
1] The Table Is a Virtual or System Table with Restricted Access – System-managed Dataverse tables like msdyn_businessclosure are not exposed to the Dataverse connector because they support internal scheduling and platform functions.
2] No Records Exist in the Root Business Unit – Data owned by child business units is not visible to Power BI accounts associated with a different BU, resulting in zero rows returned.
3] The Table Is Not Included in the Standard Dataverse Connector – Some solution-driven or non-standard tables are omitted from the Dataverse connector's supported list, so Power BI cannot load them.

Solution: Export Dataverse Data Using Power Automate + Excel Sync
Since Power BI can read:
-> OneDrive-hosted files
-> Excel files
-> SharePoint-hosted spreadsheets
…a suitable workaround is to extract the restricted Dataverse table into Excel using a Power Automate flow, either on a recurring schedule (when the records are few) or triggered from Dataverse (when there are many records and you only want a single one, to avoid pagination).

What it can do –
-> Power Automate can access system-driven tables.
-> Excel files in SharePoint can be refreshed by the Power BI Service.
-> We can bypass connector restrictions entirely.
-> The method works even if entities have hidden metadata or internal platform logic.
This ensures:
-> Consistent refresh cycles
-> Full visibility of all table rows
-> No dependency on Dataverse connector limitations

Use case
I needed to use the Business Closures table (Dataverse entity: msdyn_businessclosure) for a few calculations and visuals in a Power BI report. However, when I imported it through the Dataverse connector, the table consistently showed zero records, even though the data was clearly present inside Dynamics 365. There are two possible reasons for this –
1] It is a System/Platform Table – msdyn_businessclosure is a system-managed scheduling table, and system tables are often hidden from external connectors, causing Power BI to return no data.
2] The Table Is Not Included in the "Standard Tables" Exposed to Power BI – Many internal Field Service and scheduling entities are excluded from the Dataverse connector's metadata, so Power BI cannot retrieve their rows even if they exist.
So here, we fetch the records via a "List rows" action in Power Automate and write them to an Excel file to bypass the limitations that hide that data, without compromising on user privileges or security roles; we can also control or filter the rows directly at the source before they reach the Power BI report.

Automation steps –
1] Select a suitable trigger to fetch the rows of that entity (Recurrence or Dataverse, whichever is suitable).
2] List the rows from the entity (sort/filter/select/expand as necessary).
3] Perform any required logic (e.g. clearing the existing rows) on the Excel file that the data will be written to.
4] For each row in the Dataverse entity, select a primary key (e.g. the GUID), provide the path to the particular Excel file (e.g. SharePoint -> Location -> Document Library -> File Name -> Sheet or Table in the Excel File), and assign the dynamic values of each row to the columns in the Excel file.
5] Once this is done, import it into the Power BI report by using suitable Power Query logic in the Advanced Editor as follows –

a) Loading an Excel File from SharePoint Using Web.Contents() –
Source = Excel.Workbook(Web.Contents("https://<domain>.sharepoint.com/sites/<Location>/Business%20Closures/msdyn_businessclosures.xlsx"), null, true),
What this step does:
-> Uses Web.Contents() to access an Excel file stored in SharePoint Online.
-> The URL points directly to the Excel file msdyn_businessclosures.xlsx inside the SharePoint site.
-> Excel.Workbook() then reads the file and returns a structured object containing all sheets, tables, and named ranges.
Parameters used: null → the optional useHeaders argument is left unset (header promotion is not needed here because we extract a named table, which keeps its own column names); true → the delayTypes argument, which defers column type detection at this stage.

b) Extracting a Table Named "Table1" from the Workbook –
msdyn_businessclosures_Sheet = Source{[Item="Table1", Kind="Table"]}[Data],
This searches inside the Source object (which includes all workbook elements) and looks specifically for an element where:
Item = "Table1" → the name of the table in the Excel file
Kind = "Table" → ensures it selects a table, not a sheet with the same name
and extracts only the Data portion of that table. As a result, we get a Power Query table containing the exact contents of Table1 inside the Excel workbook, to which we can further apply our own logic (filter, clean, etc.).

To conclude, when Dataverse tables refuse to load through the Power BI Dataverse connector, especially system-driven entities like msdyn_businessclosure, the issue is usually rooted in platform-level restrictions, connector limitations, or hidden metadata. Instead of trying to modify these constraints, offloading the data through Power Automate → Excel → Power BI provides a controlled, reliable, and connector-independent integration path. By automating the extraction of Dataverse rows into an Excel file stored in SharePoint or OneDrive, you ensure consistent refresh cycles, full visibility of all table rows, and no dependency on Dataverse connector limitations. This method is simple to build, stable to maintain, and flexible enough to adapt to any Dataverse table, whether standard, custom, or system-managed. For scenarios where Power BI needs insights from hidden or restricted Dataverse tables, this approach remains one of the most practical and dependable solutions. I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudFronts.com.
Sales Return process in Dynamics 365 Finance and Operations Part 2
In the previous part of this blog, I explained the Credit Only process. In this part, I will go through the Physical Return process. The Physical Return process is determined by the Disposition Code that is assigned to the Return Order.

Disposition Codes in D365 Finance and Operations: Disposition codes in Dynamics 365 Finance and Operations (D365FO) are essential tools used to categorize and manage returned items. These codes help businesses decide what to do with products that customers send back, whether it's restocking, repairing, or scrapping them. By using disposition codes, companies can streamline their return processes, maintain accurate inventory records, and ensure that returned items are handled efficiently and appropriately. This not only enhances operational efficiency but also helps in improving customer satisfaction by managing returns in a clear and organized manner.

Below is the list of Disposition Codes that are available in D365 FNO. These Disposition Codes are available as standard functionality in D365 FNO. You can also create new codes based on business requirements. In this part of the blog, I will walk you through the Replace Item and Credit Customer scenario.

Let's take a scenario where we have sold 5 items to the customer, and after delivery the customer performs a quality check in which 2 products fail due to quality issues. The customer has scrapped those products on our behalf, and now we will provide the customer with replacement items. For that:

Go to Sales and Marketing > Sales Returns > All Return Orders. On the All return orders page, click New to create a new Sales Return Order. Select the Customer for which the Return Order is to be created. Enter the Site, Warehouse, RMA number and other details and click OK. In the first part of the blog I created the Return Order using the Find Sales Order function, so in this part I will directly add the line with a negative quantity. In the screenshot below you can see that I have added a line for the product P-000015 with a negative quantity.

The next step is to register the line with the Replace and Credit Customer Disposition Code. For that, click the Update Line option in the Lines tab, then from the drop-down click the Registration option. Then, from the Disposition Code drop-down, select the Replace and Credit Customer option. Then add the registration line and click Confirm Registration. In the screenshot below you can see that the line status has changed to Registered and the Return Order status has changed to Open.

Now if you go to the All Sales Orders page you can see that a new Sales Order has been created with the Order type as Returned Order and the status as Open Order. If you open the Sales Order and check the lines, the quantity of the line will be exactly the same as that of the Return Order.

The next step is to create a Replacement Order, as we have selected the Disposition Code of Replace and Credit. For that, click Update Line and click Registration, which will change the line status from Registered to Expected. As you do this you will notice that the Post Packing Slip button is now disabled, and the Replacement Order button is now available. As our Disposition Code is Replace and Credit Customer, the next step is to create a Replacement Order. For that, click the New Replacement Order button. Add the same Site and Warehouse as the Return Order and click OK; this will create a Replacement Order.
After the Replacement Order is created, go back to the Return Order, click Registration again and select the Credit disposition code, which will credit the amount back into the customer's account. After that, post the Packing Slip for the Return Order, which will change the Return Order status to Received. Then go ahead and invoice the Return Order from the All Sales Orders page, which will change the Return Order status to Closed. Then go ahead and process the Replacement Sales Order.

If you go to the customer transactions and check, you can see that the amount is credited back into the customer's account. So, this completes the Sales Return process for the Replace and Credit Customer scenario. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Create a New Environment in LCS for D365 Finance and Operations
Introduction
In this blog, we'll be looking into creating a new environment for D365 Finance and Operations or D365 Commerce.

Pre-requisites

References

Configuration
Go to Microsoft Dynamics Lifecycle Services and log in with your account. If you select D365 Commerce, you get the following screen. If you select D365 Finance and Operations, you get another screen where you have to specify whether the project is an actual implementation or just for evaluation, after which you get the same screen as below.

Once the project is created, we get the following screen. From here, we click on the hamburger menu at the top and then click on Cloud Hosted Environments. Click on Add to create a new environment. If you get the pop-up below asking to configure an Azure Connector, please refer to my blog – "Configure an Azure Connector in LCS". Once you have an Azure Connector configured, you can click on Add again and you will get the following pop-up.

After selecting the Application and Platform version, you'll get the option to select the environment topology.
DEMO – A demo environment includes only Microsoft demo data. You can use a demo environment to explore default features and functionality.
DEVTEST – A DevTest environment is meant for development or build purposes.
Then we get another pop-up to select the environment topology. After that is selected, we decide the environment name and the size of the VM that is to be used for this environment. You can read more about VM sizes here – VM sizes – Azure Virtual Machines | Microsoft Learn.

Once we click on Next we get the last pop-up, after which the environment gets deployed. Once we click on Deploy, it takes about 6-8 hours to deploy the environment, after which it'll be available in the Cloud Hosted Environments section. If, for some reason, you try to create an environment with the latest platform and application version and that deployment fails, you can try to create an environment one platform/application version below that.

Conclusion
Thus we saw how to create an environment in LCS for D365 Finance and Operations or D365 Commerce. Happy Coding!
Switching of Forms using Option Set field in JavaScript
Use Case – To switch the form based on the value selected in an option set field.

Steps –
1. We have an Option Set field – "Lead Type". Based on the option selected, the form will switch. For example:
"PMO Member" – on change of the option, it will switch to the PMO Member form.
"HR" – on change of the option, it will switch to the HR form.
2. To get the current form GUID, use the formSelector API in the browser console (see the note before the sketch at the end of this post).
3. Below is the JavaScript code – a minimal sketch is included at the end of this post.

Hope this helps!!
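A common way to read the current form's GUID (step 2) is to run Xrm.Page.ui.formSelector.getCurrentItem().getId() in the browser console while that form is open. For step 3, here is a minimal sketch of the switching logic; the field name cf_leadtype, the option set values, and the form GUIDs below are placeholders and must be replaced with the real values from your environment.

```javascript
var formSwitch = formSwitch || {};

formSwitch.onLeadTypeChange = function (executionContext) {
    var formContext = executionContext.getFormContext();

    // Map each option set value to the GUID of the form that should be shown.
    var formsByOption = {
        100000000: "00000000-0000-0000-0000-000000000001", // PMO Member form
        100000001: "00000000-0000-0000-0000-000000000002"  // HR form
    };

    var leadType = formContext.getAttribute("cf_leadtype").getValue();
    var targetFormId = formsByOption[leadType];
    if (!targetFormId) { return; }

    var currentItem = formContext.ui.formSelector.getCurrentItem();
    var currentFormId = currentItem ? currentItem.getId() : null;

    // Navigate only if we are not already on the target form (avoids a reload loop).
    if (currentFormId && currentFormId.toLowerCase() === targetFormId.toLowerCase()) { return; }

    formContext.ui.formSelector.items.forEach(function (form) {
        if (form.getId().toLowerCase() === targetFormId.toLowerCase()) {
            form.navigate();
        }
    });
};
```

Register formSwitch.onLeadTypeChange on the OnChange event of the Lead Type field and pass the execution context as the first parameter.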
Disable field on change of tab in D365 CE
Use case – Our requirement is to enable the Description field on the Invoice Line form when the General tab is clicked. Let's see how we can achieve this.

Solution –
Step 1 – Create a web resource with the below function –

```javascript
var invoiceLineCustomization = {
    unlockField: function (executionContext) {
        var formContext = executionContext.getFormContext();
        formContext.getControl("description").setDisabled(false);
    }
};
```

Step 2 – Add this web resource to the tab's TabStateChange event and try it. (Path to the event tab – click on the tab -> Change Properties -> Events.)

Output –

Hope this helps !
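As a possible refinement (not taken from the original post), TabStateChange fires whenever the tab's display state changes, so you can check whether the General tab is actually expanded before unlocking the field and lock it again otherwise. A minimal sketch follows; the tab name "general" is an assumption, so use the tab's name from the form designer.

```javascript
var invoiceLineCustomization = invoiceLineCustomization || {};

invoiceLineCustomization.onTabStateChange = function (executionContext) {
    var formContext = executionContext.getFormContext();

    // "general" is an assumed tab name; replace with the actual name from the form designer.
    var generalTab = formContext.ui.tabs.get("general");
    if (!generalTab) { return; }

    // TabStateChange fires on expand/collapse, so look at the tab's current display state.
    var isExpanded = generalTab.getDisplayState() === "expanded";

    // Unlock the field only while the General tab is expanded; lock it again otherwise.
    formContext.getControl("description").setDisabled(!isExpanded);
};
```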
Let’s get started with Azure Function for Dynamics 365 CRM: Part 1
In this blog, we will learn how to create an Azure Function App to connect it with Dynamics 365 CRM and perform CRUD operations in Dynamics 365 CRM. First, we will create an Azure Function project for Dynamics 365 CRM from Visual Studio; below are the steps for the same.

Open Visual Studio and create a new Azure Function project. Filter the platform as Azure, select Azure Function, and click Next. Name your project and click the Create button. After clicking Create, you will get the screen below where you need to select the trigger and Azure Function v1 (.NET Framework), and click Create. In my case, the Azure Function trigger will be an HTTP Trigger, which means the Azure Function will run whenever it receives an HTTP request. Your Azure project will be created, and the following will be the structure of the project.

Now, to connect the Azure Function with Dynamics 365 CRM, we need to add the required NuGet package [Microsoft.CrmSdk.CoreAssemblies]. To add the NuGet package, right-click on the project and click Manage NuGet Packages. Add Microsoft.CrmSdk.CoreAssemblies to your project.

Following is the code to connect to Dynamics 365 CRM. For now, we are going to specify the credentials directly in the C# code, or you can use them as constants. Later in the series, we will learn how to use environment variables and pass the credentials in a more secure way in Azure Functions using Azure Key Vault. [Stay tuned..!!] Now, select the code and refactor it into a connection function that returns the IOrganizationService if the connection is established, else returns null. Now, the final code will be as below.

Now, we will create a customer with a hardcoded name when the function is triggered by an HTTP request. Update the existing code with the code below.

Testing: We will require an API testing tool; here I am using Postman, and the following is the link to download Postman: https://www.postman.com/downloads/

To test the application, click on the Start button at the top of the navbar as shown in the screenshot below [the button will have the project name]. It will take a few minutes to load the Azure emulator. The following is the screen you will be able to see; copy the URL highlighted in red below and paste that URL into Postman. Open Postman, create a new tab, select the request type as POST and paste the URL. After pasting the URL, click Send. You will get the following response in the Azure Function tool and Postman. If there is any error or issue with the Azure Function code, the request will fail and the failure will be displayed in both the Azure Function tool and Postman [the status will be "4**" or "5**"]. A scripted alternative to the Postman test is sketched at the end of this post.

Now, we will take a look at the Dynamics 365 CRM environment and check whether the account has been created or not. We are just getting started with Azure Functions for Dynamics 365 CRM, so stay tuned for more in this series.

Upcoming blogs
1. How to use Dynamics 365 credentials securely in Azure Functions.
2. How to create Dynamics 365 integrations with third-party applications.
Many more……
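As an aside to the Testing section above: if you prefer a scripted check over Postman, here is a minimal sketch of the same POST test in JavaScript (Node.js 18+ with the built-in fetch). The URL below is only a placeholder for whatever function URL the local Azure Functions emulator prints in your console window.

```javascript
// Placeholder URL – replace with the function URL shown by the local Azure Functions emulator.
const functionUrl = "http://localhost:7071/api/Function1";

async function testFunction() {
    // Same idea as the Postman test: send a POST request and inspect the status and body.
    const response = await fetch(functionUrl, { method: "POST" });
    console.log("Status:", response.status);
    console.log("Body:", await response.text());
}

testFunction().catch(console.error);
```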
Autosave a quick create form record on field change and open the created record
Requirement: The user will create the record from the Quick Create form and, on change of a specific field, the record should automatically save. Whenever the record is saved, it should open the created record. In our scenario, we are going to use the Account entity. When the user enters the Main Phone data on the Account Quick Create form, the record will automatically save and the created account record will open.

Steps:
1. Go to the Default Solution or your customization solution and open the Quick Create form of the Account entity.
2. Click on Form Properties to edit the form properties of the Account Quick Create form and add the script to the form.
3. Click on the + New button to add the JavaScript web resource to the form so that you can use the script function on form events.
4. If you want to use an existing web resource you can add it, or else create a new one by clicking New.
5. Create a new script and add it to the form.
6. Now, we need to add the event listener and bind the calling function to it. To do this, click on + New in the Event Listener section of Form Properties. Note: Please don't forget to pass the execution context as the first parameter to the function.
7. We will use the formContext.data.save() function to get the created record's GUID and open the entity record using that GUID. The code is sketched at the end of this post for your reference.
8. Code explanation:
a. quickCreateonSave is a function that is triggered on change of the Main Phone field of the Account Quick Create form.
b. When the function is triggered, we use formContext.data.save() to save the record so that we can get the GUID of the recently created record.
c. On formContext.data.save() we can assign two functions, one for success and one for failure, as parameters to .then() [Reference].
d. On success, the function returns a response from which you can get the entityType, entity id, and name of the record.
e. After getting the entity id and entity type, we can use Xrm.Navigation.openForm to open the record as shown in the code.
9. Publish all the customizations that you have done on the Account entity.

Demo:

I hope it helps you guys!!
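For step 7 above, here is a minimal sketch of the handler. The handler name quickCreateonSave and the overall flow come from the post; the exact shape of the save response can vary between versions, so the sketch reads the saved entity reference defensively and falls back to the form context, and should be adapted to your environment.

```javascript
// Register on the OnChange event of the Main Phone field, passing the execution context.
function quickCreateonSave(executionContext) {
    var formContext = executionContext.getFormContext();

    // Save the quick create record so that we get the GUID of the newly created record.
    formContext.data.save().then(
        function (result) {
            // The save response carries the saved record's reference (entityType, id, name).
            var ref = result && result.savedEntityReference;
            if (ref && typeof ref.length === "number") { ref = ref[0]; } // some versions return an array

            var entityName = (ref && ref.entityType) || formContext.data.entity.getEntityName();
            var entityId = (ref && ref.id) || formContext.data.entity.getId();

            // Open the main form of the record that was just created.
            Xrm.Navigation.openForm({ entityName: entityName, entityId: entityId });
        },
        function (error) {
            console.log("Save failed: " + (error && error.message));
        }
    );
}
```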
Importing Notes in Dynamics 365 correctly
Preparing an Excel template for the Notes entity is a little tricky. It doesn't work when you just export it directly as a template from the Templates Wizard, include all columns, and import as is. Why? Because there's no Regarding field exported when you export/import that template. Here's what you can do as a workaround.

Scenario
Let's assume you want to export a standard Excel template so that you can re-import data into Dynamics for the Notes entity against a regarding entity.

Exporting the Excel Template
Your Document Templates are where all your Excel templates can be exported from. Now, follow the steps below –
Select the Notes entity and click Edit Columns to export the Excel file with your required columns. Select the columns you need. Observe that you don't get the Regarding column to export. Then download the file.

Modifying the Excel
Since you don't have the Regarding field in the Excel you exported from Templates, here's what you need to do – add a column yourself and give it a proper name based on what the Notes' parent entity should be. In this example, I'm importing Notes for Account, so I'm adding a column called 'Account Name'. A new column will be created as shown below. Now, populate your data based on how the Notes should be imported and which records they should be tagged to. By default, this template is exported in Microsoft Excel Worksheet (xlsx) format; you'll need to Save As in CSV format.

Re-Importing
At this point, your file is ready to be imported. Let's begin – import the file as a usual Excel import in Dynamics 365 CRM. Since this is not a direct template imported as is, but a CSV, you'll get to map this file manually. You'll need to manually select the Note entity from the drop-down and then proceed. Whatever can be mapped automatically will be mapped. For the newly created Account Name field, you'll need to expand the Not Mapped dropdown and select Regarding (Lookup). Then, you'll need to select the entity you want the Notes to appear under; in my case, this is Accounts, so I have it right there. Since the Regarding field supports several entities, scroll all the way down to confirm your selection. Now, your Regarding field is set and you are ready to confirm and import. My import here is completed. (You'll need to make sure the import is successful.)

Imported Notes
If you look at the records now, the Notes will be attached to the respective Regarding records.

Hope this helps!!
Why Custom Filter JS code doesn’t work on Lookup field? [Fixed]
One of the major pet peeves is not understanding why your code isn't working, even though you know for sure you've written the correct code. But things just don't work. One such tricky situation is that of applying a custom filter to fields using JavaScript in Dynamics 365 Customer Engagement apps.

Scenario
Let's say you have a custom filter to be applied to a lookup field and you've written your JS code on Load to apply the filter and everything (you know what you need to do!).
Example: (a sketch of this kind of code is included at the end of this post)
But the above is just not working. Why???

Reason
The reason is pretty simple! Because the lookup field is still using the filtering set on the field itself. Check that – the related records filtering configured in the field's own properties should be turned off to make your code work, since the field's default OOB filtering takes precedence. And now, your code should work (provided everything in it is correct).

Hope this quick tip helps!
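For reference, here is the kind of code the Example above refers to – a minimal sketch using addPreSearch and addCustomFilter, registered on Load. The lookup name cf_contact and the filter on parentcustomerid are illustrative assumptions, not taken from the original post.

```javascript
var lookupFilter = lookupFilter || {};

lookupFilter.onLoad = function (executionContext) {
    var formContext = executionContext.getFormContext();

    // addPreSearch runs just before the lookup's search executes, which is where a custom
    // filter must be added.
    formContext.getControl("cf_contact").addPreSearch(function () {
        var account = formContext.getAttribute("parentcustomerid").getValue();
        if (!account) { return; }

        // Only show contacts whose parent customer is the account selected on the form.
        var filterXml = "<filter type='and'>" +
                            "<condition attribute='parentcustomerid' operator='eq' value='" +
                            account[0].id + "' />" +
                        "</filter>";
        formContext.getControl("cf_contact").addCustomFilter(filterXml, "contact");
    });
};
```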
