Filtering Entity Lookups based on field value
I have a Contact record with a Role option set field that has the options Billing, Decision Maker, Client, and Influencer. The requirement is to show only billing contacts (i.e., contacts whose Role is Billing) related to the account selected on the Opportunity, in the contact lookup on the Opportunity form. Here is how I filtered the contact lookup to show only billing contacts:

Step 1: Create a contacts view in which Role equals Billing (or whichever field and value you want to filter records by).

Step 2: Set the filter criteria of the contact lookup field on the Opportunity as shown below. In Additional Properties, turn off the view selector and set the default view to the view you created in Step 1. Under Field Behavior, tick "Disable most recently used items for this field".

Now the contact lookup on the Opportunity form will only show contacts with the Billing role. Hope this helps! Thank you.
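Under the hood, the filtered view from Step 1 is simply a query with a condition on the role field. As a rough illustration (not the exact view definition from this post), the FetchXML behind such a view would look something like the sketch below, where the field name new_role and the Billing option value 100000000 are placeholders for your own schema:

<fetch>
  <entity name="contact">
    <attribute name="fullname" />
    <attribute name="emailaddress1" />
    <!-- Replace new_role and 100000000 with your role field's logical name and your Billing option value -->
    <filter type="and">
      <condition attribute="new_role" operator="eq" value="100000000" />
    </filter>
  </entity>
</fetch>

You never have to write this by hand for the approach above; the view designer builds it for you when you add the "Role equals Billing" filter.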
How to create a table using Azure Logic Apps with proper formatting
In this blog, we will see how to create an HTML table using an Azure Logic App. There is a built-in "Create HTML table" action, but it offers very little formatting flexibility, so I will explain how to use a Compose action to build the table with your own HTML markup.

Step 1: Start the Logic App with a Recurrence trigger set to run once a day; you can use any trigger that suits your requirement.

Step 2: The Compose action is the key step, as this is where we write the HTML that formats the table. For the demo I used sample data in the highlighted section, but you can insert dynamic fields as well; a minimal example of the HTML is shown below.

Step 3: The output of the Compose action is then sent as the body of the email, as shown in the screenshot below.

Output: Email

Hope this blog helps you. Thank you!
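The screenshot with the exact markup is not reproduced here, so the following is only a minimal sketch of the kind of HTML you could put inside the Compose action; the table, column names, and inline styles are illustrative placeholders:

<table style="border-collapse: collapse; width: 100%;">
  <!-- Header row with shading; adjust colours and column names to your data -->
  <tr style="background-color: #4472C4; color: #ffffff;">
    <th style="border: 1px solid #999999; padding: 6px;">Account</th>
    <th style="border: 1px solid #999999; padding: 6px;">City</th>
  </tr>
  <!-- Replace the sample row with dynamic content from earlier actions -->
  <tr>
    <td style="border: 1px solid #999999; padding: 6px;">Contoso Ltd.</td>
    <td style="border: 1px solid #999999; padding: 6px;">Mumbai</td>
  </tr>
</table>

When this output is passed to the email action, make sure the action treats the body as HTML so the styles are rendered instead of being shown as plain text.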
How to create a Dynamic Option-set/List based on values from other fields in Canvas PowerApps
Hi Everyone,

Consider a scenario where we want to filter a dropdown/combo box choice field based on the value of another field or dropdown. Since we cannot use scripts in a Canvas App, here is how we can show specific choices based on values elsewhere on the screen.

For this example, I've considered a handful of basic items belonging to Fruits, Vegetables, and Dairy products. For demonstration purposes I'll build a collection to represent the data; this step can be skipped when the choices come from a choice field on a table/entity. Below is the 'Master Record' collection.

Step 1: Create an indexed collection (referred to as MasterList in this example).

Step 2: Let's say we would like to classify the Master List into three categories (Fruits, Vegetables, and Dairy). In this example, the main dropdown field is used to filter the data. I'll add a hidden button containing code along the lines of the sketch at the end of this post. In that code, Type 1 refers to "Fruits", Type 2 refers to "Vegetables", and Type 3 refers to "Dairy".

Step 3: Add a dropdown which will hold the classification values.

Step 4: The main dropdown choice field is then filtered based on the category selected in the dropdown above.

That's it. We have successfully implemented a dynamic choice list whose values depend on another field. Hope this helps you.
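The original post shows the button's formulas in screenshots; the following Power Fx is only a minimal sketch of the same idea, with assumed control names (btnLoadData, ddCategory, ddItems) and column names (Item, Type) that you would replace with your own:

// OnSelect of the hidden button (btnLoadData): build the indexed master collection.
// Type 1 = Fruits, Type 2 = Vegetables, Type 3 = Dairy.
ClearCollect(
    MasterList,
    { Item: "Apple",   Type: 1 },
    { Item: "Banana",  Type: 1 },
    { Item: "Carrot",  Type: 2 },
    { Item: "Spinach", Type: 2 },
    { Item: "Milk",    Type: 3 },
    { Item: "Cheese",  Type: 3 }
);

// Items property of the category dropdown (ddCategory).
["Fruits", "Vegetables", "Dairy"]

// Items property of the dependent dropdown (ddItems):
// translate the selected category back to its Type number and filter the collection.
Filter(
    MasterList,
    Type = Switch(ddCategory.Selected.Value, "Fruits", 1, "Vegetables", 2, "Dairy", 3)
)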
Update records in Dynamics CRM using Azure Logic Apps
In this blog, we will see how to update records in CRM with the help of a Logic App workflow.

Step 1: Add the Recurrence trigger to the Logic App and set it to run at a one-day interval (you can choose any interval). Without a trigger, you cannot create a Logic App.

Step 2: Add a new step after the Recurrence trigger.

Step 3: Add the List records action from Dynamics 365, connect to CRM with your credentials, and select the Accounts entity.

Step 4: For testing purposes, I created a test account (account number = 1001) in the UAT environment, as shown below.

Step 5: Initialize a variable with the account number 1001, i.e. the account whose data you want to change/update.

Step 6: Filter the list of accounts down to the one whose account number equals 1001, as set in the previous step; one way to express this filter is shown below.

Step 7: Once the record is found in the account list, update it. Here I updated the account name and city. (Note: the account number should be unique.)

Result

Hope this blog helps you. Thank you!
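As a side note, instead of listing every account and filtering afterwards, the List records action can filter on the server through its Filter Query field, which accepts an OData expression. A minimal sketch, assuming the standard accountnumber attribute and a variable named AccountNumber (adjust both to your own setup):

accountnumber eq '1001'

or, referencing the variable from Step 5 instead of a literal:

accountnumber eq '@{variables('AccountNumber')}'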
How to use the Standard Journal Feature in Business Central?
Introduction: There might be a need to copy data from one journal batch to another. In Business Central there are multiple ways to do so: the copy & paste method, configuration packages, or the Standard Journal feature. Let's understand the Standard Journal feature in detail. Please note that the Standard Journal function is available only in the General Journal, not in the other journals.

Steps to achieve the goal: Go to the batch you want to copy the data from. There is an action to save it as a Standard Journal. Once saved, you will be able to see the Standard Journal you have created; click OK.

Please note that if you change the amount of a line after copying it from the Standard Journal, the change is not reflected back in the Standard Journal, since it is only used as a template.

Conclusion: Thus we saw how we can use the Standard Journal feature in Business Central. Thank you very much for reading my work. Hope you enjoyed the article!
Load JSON data from Azure Blob Storage to Microsoft Finance and Operations
In this blog we will see how we can integrate data from Azure Blob Storage into Microsoft Finance and Operations. In this use case we are updating data in the Finance and Operations destination.

Prerequisites: Azure Blob Storage, Microsoft Finance and Operations

Step 1: Create an HTTP trigger workflow; you can select any trigger based on your requirement.

Step 2: The Azure Logic App reads the data stored in Azure Blob Storage in JSON format. Below is a sample of that JSON:

[
  { "MeterId": "A001", "MeterRead": "100" },
  { "MeterId": "A003", "MeterRead": "300" }
]

Step 3: Workflow logic: the workflow reads the JSON data, which contains the Meter ID and Meter Reading. Based on the Meter ID it fetches the record ID, and using that record ID the Meter Reading is updated in F&O (see the filter sketch below).

Destination (Finance and Operations):

Hope this helps!
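To fetch the record ID for each meter, one common pattern is to loop over the parsed JSON array and query the relevant F&O data entity with an OData filter on the meter number. A minimal sketch of such a filter, assuming a hypothetical MeterId field on your data entity and the designer's default loop name Apply_to_each (replace both with your actual names):

MeterId eq '@{items('Apply_to_each')?['MeterId']}'

The identifier returned by this query is then used in the subsequent update action to write the MeterRead value for that record.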
Configuring NAS in LS Central for automating Data Director jobs
Introduction

LS Central Scheduler Jobs are used for automatic background processing. These jobs use the NAS Service under the hood. We are going to see how to configure the NAS Service for LS Central.

References

https://help.lscentral.lsretail.com/Content/LS-Insight/Setup/LS-Central-In-Cloud-LS-Insight-In-Azure/3-Machine-Or-VM.htm

Pre-requisites

LS Central v16.4 – OnPrem
Data Director

Configuration

Create a new Server Instance and name it appropriately. Ensure that the account for this new Server Instance is set to User and that the User has Administrator privileges.

In the General tab, update the "Service Default Company" and "Service Default Time Zone".

In the NAS Services tab, set the following fields:
Run NAS Services with Admin Rights: True
Startup Argument: NASID,TYPEFILTER=,LOG=1,REPEAT=1
Startup Codeunit: 99001468
Startup Method: LSRSCHEDULER

Restart the Server Instance. Open the Scheduler Setup in LS Central and set "Enable NAS Scheduler" to true. Refresh the page.

Conclusion

Thus we saw how to configure NAS Services in LS Central. Happy Coding!
Using the Power BI Report Builder to create and publish paginated reports.
Power BI Report Builder is a great tool for creating paginated reports that can be printed in a proper page layout. If you have worked with SSRS Report Builder, the whole environment will look familiar. Power BI Report Builder is also a very light tool and offers additional features such as importing a data source directly from an existing Power BI report in any workspace, publishing these RDL reports to an existing Power BI workspace, and embedding existing paginated reports into Power BI dashboards. In this blog we will see how to create a report using the Power BI Report Builder.

First, download Power BI Report Builder: go to app.powerbi.com, click the ellipsis beside your profile, and after clicking the Download option select Paginated Report Builder. Install the downloaded setup file and sign into the Report Builder.

You can open the tool from the Start Menu and start creating reports right away by adding data from the supported data sources. For this blog I am using the dataset of an existing Power BI report. Navigate to the dataset of the desired report in the Power BI service, click the ellipsis, and select Create Paginated Report. Wait for the report to be processed, then open the downloaded RDL file.

Since we imported our data source directly from an existing Power BI report, we don't have to add a data source again. However, we do have to configure the dataset table. Right-click on Datasets and select Add Dataset, then choose the data source from which the dataset should get its data. Click on the Query Designer and wait for it to load. In the left pane you can see all the fields from the data source; drag the fields you'll be using in the report and execute the query. After previewing the dataset, click OK. You can now view the dataset and its fields in the left pane.

You can insert various visuals from the Insert tab on the ribbon and populate them with fields from the dataset. After finalizing the design and features of the report, preview it by clicking the Run button in the top left corner of the window. You'll now see your paginated report; to exit this view, click the Design button in the top left corner.

You can also publish this report to your Power BI service workspaces; however, publishing requires a Power BI Premium license.

Thank you for reading, hope this blog helped.
Send a message/notification on Microsoft Teams as soon as an Opportunity is created in Dynamics 365 via Azure Logic Apps.
In this blog we will see the steps to send an automated message via Teams as soon as an Opportunity is created in Microsoft Dynamics 365.

Step 1: Go to portal.azure.com and select the Azure Logic App resource.

Step 2: Enter all the details required while creating a Logic App, such as the Name, Resource Group, Subscription, Region, etc.

Step 3: Select the Dynamics 365 trigger "When a record is updated".

Step 4: Select the Opportunities entity after setting up the Dynamics 365 CRM connection.

Step 5: Set the data refresh time as required.

Step 6: Select the IF action in the next step; the condition evaluates to true when status_label equals won (the underlying expression is sketched at the end of this post).

Step 7: Inside the True block, select the "Post a message in a chat or channel" option. You can also handle the False block, but in this case we can leave it blank.

Step 8: You can post this message in a group, a channel, or send it as a personal chat.

Step 10: Wait for the trigger-successful notification.

Step 11: Go to Dynamics 365 CRM and navigate to the Opportunities entity.

Step 12: Open a test opportunity (or create one if none exists) and close the opportunity as won.

Step 13: As soon as you close this opportunity, you should receive a message like the following.

Hence in this blog we saw how we can send messages on MS Teams using Azure Logic Apps when certain conditions are met. Hope this helped!
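For reference, the condition from Step 6 can also be written as an expression in code view. A rough sketch, using the status_label field name mentioned in this post (the exact output property depends on your Dynamics 365 connector version, and equals() compares strings case-sensitively, so match the label exactly as the connector returns it):

@equals(triggerBody()?['status_label'], 'Won')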
Quick Tip – Duplicate Fields from a Table to another Table
Hello,

Consider a scenario where we need a table (entity) that is a replica of another table, or where we need some fields replicated into another table. The XrmToolBox tool described below will save a lot of precious time.

Let's say I have a table called 'Scoping' and I want its fields replicated into another table called 'ScopingClone'. Here are the steps to do this in a time-efficient manner.

Step 1: Create a new table/entity. (In my case, I created ScopingClone.)

Step 2: Open XrmToolBox, install 'Clone Field Definitions', connect to your environment, and open the tool.

As you can see, the fields that I selected were successfully cloned. I hope you found this blog useful.