How to Use the Standard Journal Feature in Business Central?
Introduction: There may be a need to copy the data from one journal batch to another. In Business Central there are multiple ways to do so: the copy-and-paste method, configuration packages, or the Standard Journal feature. Let's understand the Standard Journal feature in detail. Please note that the Standard Journal function is not present in journals other than the General Journal.

Steps to achieve the goal:
1. Go to the batch you want to copy the data from.
2. Use the "Save as Standard Journal" action to save the lines.
3. In the batch you want to copy the data to, use the "Get Standard Journals" action. You will be able to see the Standard Journal you have created; select it and click OK.

Please note that if you change the amount of a line after copying it from the Standard Journal, the change will not be reflected in the Standard Journal, as the Standard Journal is only used as a template.

Conclusion: Thus we saw how we can use the Standard Journal feature in Business Central. Thank you very much for reading my work. Hope you enjoyed the article!
Load JSON data from Azure Blob Storage to Microsoft Dynamics 365 Finance and Operations
In this blog we will see how we can integrate data from Azure Blob Storage into Microsoft Dynamics 365 Finance and Operations. In this use case we are updating data in the Finance and Operations destination.

Prerequisites:
- Azure Blob Storage
- Dynamics 365 Finance and Operations

Step 1: Create an HTTP trigger workflow, or select any other trigger based on your requirement.

Step 2: The Azure Logic App will read the data stored in Azure Blob Storage in JSON format. Below is the sample JSON format:

    [
      {
        "MeterId": "A001",
        "MeterRead": "100"
      },
      {
        "MeterId": "A003",
        "MeterRead": "300"
      }
    ]

Step 3: Workflow logic – the workflow reads the JSON-formatted data, which contains the Meter ID and Meter Reading. Based on the Meter ID it fetches the record ID, and using the record ID the Meter Reading data is updated in F&O (a JavaScript sketch of this per-record logic appears at the end of this post).

Destination in Finance and Operations:

Hope this helps!
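For illustration, here is a minimal JavaScript sketch of the per-record logic the workflow performs, expressed as direct calls to the F&O OData API. The entity name MeterReadings, the field names, the environment URL, and the token handling are assumptions for the example, not the actual names from this environment; the Logic App performs the equivalent steps through its built-in connectors.

    // Sketch: for each JSON record, look up the record ID by MeterId,
    // then PATCH the Meter Reading onto that record in F&O.
    const baseUrl = "https://yourenv.operations.dynamics.com/data"; // hypothetical environment URL

    async function updateMeterReads(records, token) {
      for (const rec of records) {
        // 1. Fetch the record's identifier using the MeterId from the payload.
        const lookup = await fetch(
          `${baseUrl}/MeterReadings?$filter=MeterId eq '${rec.MeterId}'&$select=RecordId`,
          { headers: { Authorization: `Bearer ${token}` } }
        );
        const { value } = await lookup.json();
        if (value.length === 0) continue; // no matching meter, skip this record

        // 2. Update the Meter Reading on the matched record.
        await fetch(`${baseUrl}/MeterReadings('${value[0].RecordId}')`, {
          method: "PATCH",
          headers: {
            Authorization: `Bearer ${token}`,
            "Content-Type": "application/json",
          },
          body: JSON.stringify({ MeterRead: rec.MeterRead }),
        });
      }
    }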
Configuring NAS in LS Central for automating Data Director jobs
Introduction: LS Central Scheduler Jobs are used for automatic background processing. These jobs use the NAS service under the hood. We are going to see how to configure the NAS service for LS Central.

References: https://help.lscentral.lsretail.com/Content/LS-Insight/Setup/LS-Central-In-Cloud-LS-Insight-In-Azure/3-Machine-Or-VM.htm

Pre-requisites:
- LS Central v16.4 – OnPrem
- Data Director

Configuration:
1. Create a new Server Instance and name it appropriately.
2. Ensure that the account for this new Server Instance is set to User and that the user has Administrator privileges.
3. In the General tab, update the "Service Default Company" and "Service Default Time Zone."
4. In the NAS Services tab, set the following fields:
   - Run NAS Services with Admin Rights: True
   - Startup Argument: NASID,TYPEFILTER=,LOG=1,REPEAT=1
   - Startup Codeunit: 99001468
   - Startup Method: LSRSCHEDULER
5. Restart the Server Instance.
6. Open the Scheduler Setup in LS Central and set "Enable NAS Scheduler" to true.
7. Refresh the page.

Conclusion: Thus we saw how to configure NAS services in LS Central. Happy Coding!
Using the Power BI Report Builder to create and publish paginated reports.
Power BI Report Builder is a great tool for creating paginated reports which can be easily printed in a proper page layout. If you have worked with SSRS Report Builder, the whole environment will look familiar. Power BI Report Builder is also a very light tool and has additional features such as directly importing a data source from an existing Power BI report in any workspace, publishing RDL reports to an existing Power BI workspace, and embedding existing paginated reports into Power BI dashboards. In this blog we will see how to create a report using the Power BI Report Builder.

1. You need to download Power BI Report Builder first. Go to app.powerbi.com, click on the ellipsis beside your profile, and after clicking on the Download option select the Paginated Report Builder option.
2. Install the downloaded setup file and sign into the Report Builder.
3. You can open the tool from the Start Menu and right away start creating reports by adding data from the supported data sources. For this blog I am using the dataset of an existing Power BI report: navigate to the dataset of the desired report in the Power BI service, click on the ellipsis, and select Create Paginated Report. Wait for the report to get processed, then open the downloaded RDL file.
4. Since we directly imported our data source from an existing Power BI report, we don't have to add a data source again. However, we do have to configure the dataset table. Right-click on Datasets and select Add Dataset, then choose the data source from which the dataset should get its data.
5. Click on the Query Designer and wait for it to load. On the left pane you can see all the fields from the data source. Drag in the fields that you'll be using in the report and execute the query. After previewing the dataset, click on OK. You can now view the dataset and its fields in the left pane.
6. Insert visuals from the Insert tab on the ribbon and populate them with fields from the dataset.
7. After finalizing the design and features of the report, preview it by clicking on the Run button in the top-left corner of the window. You'll now see your paginated report; to exit this view, click on the Design button in the top-left corner.
8. You can also publish this report to your Power BI service workspaces; however, publishing requires a Power BI Premium license.

Thank you for reading, hope this blog helped!
Send a message/notification on Microsoft Teams as soon as an Opportunity is won in Dynamics 365 via Azure Logic Apps.
In this blog we will see the steps to send an automated message via Teams as soon as an Opportunity is won in Microsoft Dynamics 365.

Step 1: Go to portal.azure.com and select the Azure Logic App resource.
Step 2: Enter all the details required while creating a Logic App, such as the Name, Resource Group, Subscription, Region, etc.
Step 3: Select the Dynamics 365 trigger "When a record is updated."
Step 4: Select the Opportunities entity after setting up the Dynamics 365 CRM connection.
Step 5: Set the data refresh interval as required.
Step 6: Add a Condition action in the next step; the condition for the True branch is status_label = Won.
Step 7: Inside the True block, select the "Post a message in a chat or channel" action. You can also handle the False block, but in this case we can leave it blank.
Step 8: You can post this message in a group chat or a channel, or send it as a personal chat.
Step 9: Save the Logic App and wait for the trigger successful notification.
Step 10: Go to Dynamics 365 CRM and navigate to the Opportunities entity.
Step 11: Open a test Opportunity, or create one if none exists, and close the Opportunity as won.
Step 12: As soon as you close this Opportunity, you should receive the message in Teams.

Hence in this blog we saw how we can send messages on MS Teams using Azure Logic Apps when certain conditions are met (a conceptual JavaScript sketch of the condition-and-post logic follows below). Hope this helped!
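To make the flow concrete, here is a minimal sketch of what the Logic App does conceptually, written in JavaScript: check whether the updated Opportunity is won and, if so, post a channel message via the Microsoft Graph API. The teamId, channelId, and token parameters are hypothetical placeholders; the Logic App configures the equivalent of this declaratively in the designer, with no code required.

    // Conceptual sketch of the Logic App's Condition block and Teams action.
    async function notifyIfWon(opportunity, teamId, channelId, token) {
      // Condition block: only proceed when the Opportunity's status is "Won".
      if (opportunity.status_label !== "Won") return;

      // "Post a message in a chat or channel" via Microsoft Graph.
      await fetch(
        `https://graph.microsoft.com/v1.0/teams/${teamId}/channels/${channelId}/messages`,
        {
          method: "POST",
          headers: {
            Authorization: `Bearer ${token}`,
            "Content-Type": "application/json",
          },
          body: JSON.stringify({
            body: { content: `Opportunity "${opportunity.name}" was closed as Won!` },
          }),
        }
      );
    }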
Quick Tip – Duplicate Fields from One Table to Another
Hello! Consider a scenario where we need a table (entity) which is a replica of another table, or where we need some fields replicated into another table. This tool will save a lot of precious time. Let's say I have a table called 'Scoping' and I want fields from that table replicated into another table called 'ScopingClone'. Here are the steps to do this in a time-efficient manner.

Step 1: Create a new table/entity. (In my case, I created ScopingClone.)
Step 2: Navigate to XrmToolBox, install 'Clone Field Definitions', connect to your environment, and open the tool.
Step 3: Select the source table, pick the fields you want to copy, choose the target table, and run the clone.

As you can see, the fields that I selected were successfully cloned. I hope you found this blog useful.
Open a document on click of a button in D365 CRM using JavaScript
In this blog we will see how we can open a PDF document on click of a button from a record in CRM. Let's say we have a User Guide button on the Lead entity, and on click of the User Guide button a PDF document (the User Guide) should open in a new tab.

Solution:
1. Create a solution and add the Lead entity only.
2. Open the same solution in XrmToolBox – Ribbon Workbench.
3. Add a JavaScript web resource containing the following function:

    var openUserGuide = {
        // Called from the ribbon command as openUserGuide.userGuide
        userGuide: function () {
            "use strict";
            Xrm.Navigation.openUrl("https://sinerleak.sharepoint.com/:b:/s/SingerLewak/EaQO2OWjWA1BnHFCCENV-6EBDkILbg3EfPSFLEu-KCeraw?e=ofVyVB");
        }
    };

4. Add the action to the command and publish the solution from XrmToolBox.

Output –
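As a side note, Xrm.Navigation.openUrl also accepts an optional second parameter controlling the size of the window the URL opens in; a small sketch (the pixel values here are arbitrary examples):

    // Optionally control the window size via openUrlOptions (values in pixels).
    Xrm.Navigation.openUrl(
        "https://sinerleak.sharepoint.com/:b:/s/SingerLewak/EaQO2OWjWA1BnHFCCENV-6EBDkILbg3EfPSFLEu-KCeraw?e=ofVyVB",
        { height: 800, width: 1000 }
    );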
Using Automated Testing in POSTMAN for Business Central Web Services
Introduction: While using Business Central web services or APIs, we often use Postman for testing the requests and the responses. Today we'll see how we can automate this testing to a certain extent using the built-in features of Postman. We can have testing logic that runs before every request, after every request, or logic that tests only a particular request. In the demonstration below, we'll write automated tests for the GET, PUT, POST, and DELETE operations on a single record of a custom API (a combined sketch of these test scripts appears at the end of this post). Postman itself provides a bunch of standard procedures, or boilerplate code, which we can modify as per our requirements. As this uses JavaScript, we can also use additional JS features here.

Pre-requisites:
- Postman
- Business Central OnCloud or OnPremise

References:
- Writing tests | Postman Learning Center
- Announcing Postman for the Web, Now in Open Beta | Postman Blog

Configuration:

POST request – First we are going to create a record in the Customer table with a few fields. The common things to test with custom APIs are verifying that the record is created successfully (1) and that what we are sending matches what is stored in the record (2). As we are using JavaScript, the response is stored in the jsonData variable and we can access any of the fields of the response as a property on the jsonData variable. As the rest of our automated tests are going to be performed on this same record, we need to store the identifier for this record in a variable which exists outside the scope of this request; here we are using a variable with the Collection scope. If you want to use the same variable outside of this collection, you can also define Global variables.

GET request – In a simple GET request, the only thing we are concerned with is whether the request executed successfully or not. For this we simply check the status code.

PUT request – In a PUT request, we modify the record that we previously created; here I'm going to update the name of the record. A common test case for PUT requests is to ensure that (a) the request completed successfully and (b) what was sent in the request is what is updated on the record and available in the response.

DELETE request – In a simple DELETE request, the only thing we are concerned with is whether the request executed successfully, and here we simply check the status code returned.

Once all the automated tests are written, you can execute them either from the collection level or from a folder level; here we will execute our tests from the folder level. We can also define the run order for the requests. Once the tests have run, we get a summary of the results as well as a detailed version of the results.

Conclusion: Thus we saw how to use automated testing in Postman to reduce re-work and increase efficiency while testing. A bonus tip – you can now use the Postman web version to create requests instead of downloading the Postman app; the entire blog above was written using the Postman web app. Do note that not everything that can be done in the Windows app can be done in the web app. Happy Coding!
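As promised, here is a minimal sketch of the kind of test scripts described above, in Postman's JavaScript test syntax. The field names (displayName, id), the variable name customerId, and the expected status codes are assumptions based on a typical Business Central custom API; adjust them to match your own API page.

    // POST request – Tests tab: verify creation and capture the record's id.
    pm.test("Record created successfully", function () {
        pm.response.to.have.status(201);
    });

    var jsonData = pm.response.json();
    pm.test("Stored name matches what was sent", function () {
        // "displayName" is an assumed field name on the custom API.
        pm.expect(jsonData.displayName).to.eql("Test Customer");
    });

    // Store the identifier in a Collection-scoped variable for later requests.
    pm.collectionVariables.set("customerId", jsonData.id);

    // GET and DELETE requests – Tests tab: the status code is all we check.
    pm.test("Request executed successfully", function () {
        pm.response.to.have.status(200);
    });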
Enable language translation on Custom solutions in Dynamics CRM
In this blog, we'll see how to apply language translation to custom entities, model-driven apps, and business process flows in Dynamics CRM.

Step 1: Go to Settings -> Administration -> Languages. In the language settings, select the required language and click Apply.

Step 2: Include all the required components in the solution. For example:
- Custom entities
- Model-driven app – for example, Sales Hub
- Business process flow (BPF) – include the BPF entity as well as the process

Step 3: Select the solution and click on Export Translations to export the translations to an archived file.

Step 4: Extract the contents of the downloaded CrmTranslations_<solutionname>_1_0.zip. This will extract two files.

Step 5: After extracting, open the CrmTranslations.xml file in Excel. You will find three sheets in it. Open the Localized Labels sheet; you will find columns for each of the languages you have deployed. Fill in the translations for each of the available language options.

Step 6: Zip the file again and re-import the translations into the same solution using the Import Translations button.

Step 7: After a successful import, click on Publish All Customizations.

Step 8: Go to Settings -> Personalization Settings -> Languages and select the language you want to translate into.

Output:

Hope this helps!!
SharePoint Integration with Dynamics 365
In this blog we'll see how to integrate SharePoint with Dynamics 365.

Step 1: Configure the SharePoint option in the Dynamics 365 Document Management settings. Go to Advanced Settings -> Document Management.

Step 2: In Document Management, select "Enable Server-based SharePoint Integration."

Step 3: In the pop-up screen, set the SharePoint site location to "Online" and proceed to the next step.

Step 4: Provide a valid SharePoint URL and click Finish.

Step 5: Enable SharePoint documents for entities using Document Management Settings.

Step 6: In the pop-up screen, select the entities for which you want to manage SharePoint documents. You will find that some entities, like Account, are already enabled, and you can enable other entities if you want. We can also add custom entities if required.

Step 7: Based on the selected entities, document libraries and folders (for example, for the Account entity) are automatically created on the SharePoint site.

Step 8: We can now store a document for an Account, or for any other enabled entity, in SharePoint. Open an Account and click on the Related tab to choose the Documents option.

Hope this helps!!
