Category Archives: D365 Finance and Operations
RSAT (Regression Suite Automation Tool) implementation and configuration for Finance and Operations
Purpose

The Regression suite automation tool (RSAT) significantly reduces the time and cost of user acceptance testing. It enables functional power users to record business tasks using the Finance and Operations Task recorder and convert those recordings into a suite of automated tests without writing source code. Test libraries are stored and distributed in Lifecycle Services (LCS) through Business Process Modeler (BPM) libraries, and they are fully integrated with Azure DevOps Services for test execution, reporting, and investigation. Test parameters are decoupled from test steps and stored in Microsoft Excel files.

Prerequisites

1. A Dynamics 365 for Finance and Operations test environment (demo or tier 2 UAT environment).
2. Microsoft Excel.
3. Azure DevOps: you will need an Azure DevOps Test Manager or Test Plans license. For example, if you have a Visual Studio Enterprise subscription, you already have a license to Test Plans. Pricing: https://azure.microsoft.com/en-us/pricing/details/devops/azure-devops-services/. For a demo environment, you don't need to buy any license.
4. Authentication certificate: to enable secure authentication, RSAT requires a certificate to be installed on the RSAT client computer. The RSAT settings dialog box allows you to automatically create and install the authentication certificate.

Installation

Download Regression Suite Automation Tool.msi and install it on your machine. RSAT requires Selenium and web browser driver libraries; RSAT will prompt you if the needed libraries are missing and will automatically install them for you.

Configuration for RSAT

1. Open the RSAT application and select the Settings button in the upper right to configure RSAT. The next steps will help you find the input for the required fields.
2. Go to the LCS project settings for your project and open Visual Studio Team Services. Here you need to mention the Azure DevOps project in the Azure DevOps site URL field.
3. To do that, open Azure DevOps (https://www.visualstudio.com) and create a new organization if there is not an existing one, then create a new project.
4. Set up a personal access token by clicking on account info > Security. Once you create the token, save it, as you will not be able to access it again when you want to use it.
5. Go back to the main page and create a new test plan. Name it RSAT-TT (or you can use any name). Right-click on RSAT-TT and create a new suite; you can name it 'Demo'. The Azure DevOps setup is now done.
6. Back in LCS, in the Azure DevOps site URL field mention the organization name that you set up in Azure DevOps, and in the Personal access token field paste the token that you saved earlier. Click Continue to select the project, continue, and save. Now deploy it to the environment.
7. Open the Regression Suite Automation Tool and go to Settings. In the Azure DevOps URL field, copy the value from LCS; the access token should be the security token you copied. Click Test connection so that the Project name and Test plan fields populate.
8. Now run the VM. You will find the Hostname and SOAP Hostname by going to IIS and right-clicking AOSService > Edit bindings. Copy both values and paste them into the Hostname and SOAP Hostname fields. The admin username should be the username you use to log in to your environment.
To generate the thumbprint, click New in RSAT settings and save the certificate at any location, then copy the generated certificate to the VM. Open the copied certificate and install it on the local machine in both the Personal and Trusted Root Certification Authorities stores (a PowerShell sketch for this step appears at the end of this post). Now open the wif file in Notepad in admin mode from the given location on the VM. In the wif file, check whether an authority with name CN=127.0.0.1 exists. If not, copy the selected portion, paste it below the same authority block, and modify those lines as follows:

<authority name="CN=127.0.0.1">
  <keys>
    <add thumbprint="F46D2F16C0FA0EEB5FD414AEC43962AF939BD89A"/>
  </keys>
  <validIssuers>
    <add name="127.0.0.1" />
  </validIssuers>
</authority>

(Note: add the thumbprint of the installed certificate in the wif file as shown.)

The final steps are:

1. Copy the thumbprint from RSAT settings (generated when you clicked New) and paste it into the wif file on your VM.
2. Mention the company name and the working directory, set the default browser to Internet Explorer, then Save as and OK.
3. Go to LCS, open Business Process Modeler, and create a new library. Name it RSAT, go to Edit, and rename the process as required; you may add a child node to it by clicking Add process.
4. Go to Finance and Operations and open the Task recorder. Create a recording by clicking Create recording, perform the operation, and then click the Stop button. Name it as per your need, then use Save to Lifecycle Services or the Save to this PC option and click OK.
5. Go back to LCS, and in the project library click on the Requirements tab and check that it is syncing. Then run Sync test cases and VSTS sync.
6. Go to Azure DevOps, open the test cases, and click Add existing. Then click Run query and click Add test cases.
7. Go to the Regression Suite Automation Tool, load the test plan, and download the test cases. Select a test, click New, and generate the test execution parameter files.
8. Click the Edit option to edit the values in Excel (the dialog differs slightly between older and newer versions of RSAT). Edit the metadata for the test in the Excel file, then save and close.
9. Now run the test. After this step, an automatic session for the test is handled by Selenium, and the browser performs the steps of the test cases. After the run completes successfully, click Upload (and note the result as Passed).
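As promised above, here is a minimal PowerShell sketch for installing the authentication certificate on the VM. Run it as administrator; the file path and name are examples, so point them at wherever you copied the generated certificate:

# Install the RSAT authentication certificate into the Personal store
Import-Certificate -FilePath "C:\Temp\RSATCertificate.cer" -CertStoreLocation Cert:\LocalMachine\My

# ...and into the Trusted Root Certification Authorities store
Import-Certificate -FilePath "C:\Temp\RSATCertificate.cer" -CertStoreLocation Cert:\LocalMachine\Root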
How to Enable/Disable Maintenance Mode Using a SQL Query in Dynamics 365 for Finance & Operations
In Finance and Operations, to make any changes on the License configuration form you need to enable maintenance mode, and after making the desired changes you need to disable this mode again. You can enable or disable maintenance mode using a SQL query as well as the command prompt; in this blog, we are performing this operation using the SQL query. The following are the steps to enable/disable maintenance mode:

1. Open SSMS (Microsoft SQL Server Management Studio) on your server.
2. Click on New Query and enter the following query to enable maintenance mode:

UPDATE dbo.SQLSYSTEMVARIABLES SET VALUE = 1 WHERE PARM = 'CONFIGURATIONMODE'

3. You can verify the status using the following command:

SELECT * FROM [AxDB].[dbo].[SQLSYSTEMVARIABLES]

4. After this, restart the IIS services; in some cases, you need to restart the server.
5. Perform your changes on the License configuration form.
6. After the desired changes are made, you can disable the mode using the following command:

UPDATE dbo.SQLSYSTEMVARIABLES SET VALUE = 0 WHERE PARM = 'CONFIGURATIONMODE'

7. Now verify the status again using the same query from step 3, and repeat step 4.

I hope this blog will help you.
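As an addendum, if you prefer to script these steps instead of running them in SSMS, here is a minimal PowerShell sketch; it assumes the database is named AxDB on the local default instance and that the SqlServer module (which provides Invoke-Sqlcmd) is installed:

# Enable maintenance mode (set VALUE = 0 in the same query to disable it again)
Invoke-Sqlcmd -ServerInstance "localhost" -Database "AxDB" -Query "UPDATE dbo.SQLSYSTEMVARIABLES SET VALUE = 1 WHERE PARM = 'CONFIGURATIONMODE'"

# Verify the current status
Invoke-Sqlcmd -ServerInstance "localhost" -Database "AxDB" -Query "SELECT PARM, VALUE FROM dbo.SQLSYSTEMVARIABLES WHERE PARM = 'CONFIGURATIONMODE'"

# Restart IIS so the AOS picks up the change
iisreset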
Move database from sandbox to development in D365 Finance and Operations
Hello, in this blog I am going to demonstrate how to move a database from a sandbox to a development environment. In some cases there might be a situation where you need to debug code with production data. For this, first we need to move the database from production to the sandbox with the Refresh database operation in LCS, as shown in the screenshot below. Then we need to move the database from the sandbox to development as follows.

Steps to move the database from sandbox to dev:

1. Log in to LCS and click on the sandbox environment's full details. On the Maintain tab, click Move database.
2. To export the sandbox database, click on Export database.
3. You can find the .bacpac file in the Database backup section of the asset library after the export command has executed successfully. Download the .bacpac file to the development VM.
4. Open SSMS on the development server. Before importing the new database, you must rename the existing AxDB, for example with the following script (which renames it to AxDB_Original):

USE master;
GO
ALTER DATABASE AxDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
GO
ALTER DATABASE AxDB MODIFY NAME = AxDB_Original;
GO
ALTER DATABASE AxDB_Original SET MULTI_USER;
GO

5. Right-click on Databases and select Import Data-tier Application, then click Next.
6. Change the new database name to AxDB and click Next.
7. Click Next and browse to the folder where the .bacpac was downloaded.
8. Click Finish to import the database; you can watch the progress of the import steps (a command-line alternative using SqlPackage is sketched at the end of this post).
9. Once the import is done, open Visual Studio and do a full database synchronization.

I hope this blog will help you.
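For reference, the .bacpac can also be imported from PowerShell with SqlPackage instead of the Import Data-tier Application wizard. This is a minimal sketch under a few assumptions: the SqlPackage path below matches a typical SQL Server 2017 DAC install, the file was downloaded to C:\Temp, and the old AxDB has already been renamed as in step 4:

# Import the downloaded .bacpac as a new AxDB on the local instance
Set-Location "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin"
.\SqlPackage.exe /a:import /sf:"C:\Temp\AxDB.bacpac" /tsn:localhost /tdn:AxDB /p:CommandTimeout=1200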
How to resolve workflow editor error "Application cannot be started. Contact the application vendor"
Sometimes when you try to open the workflow editor you receive the error "Application cannot be started. Contact the application vendor", as shown in the screenshot. This problem can be caused by multiple versions of the application being present on your system. Let's see how to solve this problem:

1. First things first, make sure you are using the Internet Explorer browser for the workflow editor. If you are already using Internet Explorer, go to Settings and open Internet options, then try to connect to the application again.
2. If the above step is not working for you, then there must be multiple versions of the application on your system. To resolve this, visit C:\Users\*YourUserFolder*\AppData\Local\Apps\2.0, search for "workflow", select the second application file, and open its file location. (AppData may be hidden in some cases; see the PowerShell sketch below for a quick way to find these files.)
3. In the opened location, delete all the files other than the folders.
4. After this, try to download the latest workflow editor; it should work now.
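To find those cached files quickly, you can use a small PowerShell sketch like this one; it assumes the default ClickOnce cache location under AppData and merely lists matching entries so you can open and clean the location by hand:

# List ClickOnce cache entries related to the workflow editor
$cache = Join-Path $env:LOCALAPPDATA "Apps\2.0"
Get-ChildItem -Path $cache -Recurse -Filter "*workflow*" -ErrorAction SilentlyContinue |
    Select-Object FullName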
How to create and apply workflow for purchase order in D365 Finance and Operations
In this blog we will learn how to create workflows in D365 Finance and Operations. As an example we are taking the purchase order workflow, which allows us to create a purchase order that is allocated to different people for the review and approval process.

Navigate to Procurement and sourcing >> Setup >> Procurement and sourcing workflows, click New, and select the purchase order workflow. This opens the workflow editor, where you first need to provide login details, the same as those of the environment. Here we need to arrange various components and set their properties to resolve the errors the editor reports.

Components for this type of workflow:

Start: indicates the start of the workflow design.
End: indicates the end of the workflow design.
Review purchase order: assigns reviewers (complete/return the PO).
Approve purchase order: assigns the users who need to approve the purchase order.

To design the workflow, follow these steps:

1. In the designer, create the design as shown in the screenshot.
2. Select the Review element, right-click it, open Properties, and configure the basic settings. In the assignment, make sure you have set the assignment type (in our case, User) and the user name. You can also escalate to other roles after a certain time (we are not considering this setup for this blog).
3. Go back to Approve purchase order, open its properties, and set an automatic action that approves purchase orders below 10,000 USD.
4. Set the notifications for the people who should receive a notification when a particular operation is performed (for example, approved or rejected).
5. Click on Step 1 to enter the Step 1 section and open its properties. First assign the user who will approve the purchase order, as the screenshot suggests; you can also set a time limit for approval and a completion policy. You can also add a condition to Step 1 that decides whether this step runs or not. Now close the step and get back to the main design in the designer.
6. Click Save and close, mention the version notes, and activate the workflow. You can now see the new workflow in Procurement and sourcing workflows.
7. Create a new purchase order, click the Workflow button, and click Submit; you can also check the workflow history.
8. Another user will now complete the purchase order review and mention a comment. Finally, a user with approval authority will approve it from Common >> Work items assigned to me.
Model import and export in D365 Finance and Operations using PowerShell
When we want to move customization done in a specific model from one environment to another development environment, we need to export and import the model file.

Steps for model export and import using PowerShell:

1. Open PowerShell in administrator mode.
2. Change the directory to the path of the packages bin folder.
3. Export command:

.\ModelUtil.exe -export -metadatastorepath=C:\AOSService\PackagesLocalDirectory -modelname="name of model" -outputpath=<path to store the exported model>

For example, if the model name is TOUpgradeModel and I want to store the model file in C:\Temp\ModelFile, the command is as follows:

.\ModelUtil.exe -export -metadatastorepath=K:\AosService\PackagesLocalDirectory -modelname="TOUpgradeModel" -outputpath=C:\Temp\ModelFile

You can see the output file at the specified path.

4. Import command:

.\ModelUtil.exe -import -metadatastorepath=C:\AOSService\PackagesLocalDirectory -file=<path of the .axmodel file to import>

For example:

.\ModelUtil.exe -import -metadatastorepath=C:\AOSService\PackagesLocalDirectory -file=C:\Temp\ModelFile\TOUpgradeModel-Cloudfront.axmodel

(Note: if the model already exists in your environment, trying to import the same model gives the error "Model already exists". In that case, delete the existing model with the command

.\ModelUtil.exe -delete -metadatastorepath=C:\AOSService\PackagesLocalDirectory -modelname="TOUpgradeModel"

and then try to import the model again. A consolidated sketch of this flow appears below.)
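Putting the commands together, here is a small parameterized sketch of the import flow on the target VM; the store path, model name, and file path are the example values from this post and should be adjusted to your environment:

# Example values from this post; adjust to your environment
$store = "C:\AOSService\PackagesLocalDirectory"
$model = "TOUpgradeModel"
$file  = "C:\Temp\ModelFile\TOUpgradeModel-Cloudfront.axmodel"

# Run ModelUtil from the packages bin folder
Set-Location "$store\Bin"

# Remove the model if it already exists, then import the new .axmodel
.\ModelUtil.exe -delete -metadatastorepath=$store -modelname="$model"
.\ModelUtil.exe -import -metadatastorepath=$store -file=$file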
How to resolve error "Exception from HRESULT: 0xC0202009" during data export
While exporting data using a data entity in D365 FO, the data project sometimes fails to export with the error "Exception from HRESULT: 0xC0202009". The event log displays:

methodName: DMFGenerateSSISPackage.generateFileDataV2()
diagnosticsMessage: System.Exception: Exception from HRESULT: 0xC0202009
   at Microsoft.Dynamics.AX.Framework.Tools.DMF.ServiceProxy.DmfEntitySharedTypesProxy.DoWork[T](Func`1 work)
   at Dynamics.AX.Application.DMFGenerateSSISPackage.generateFileDataV2(DMFDefinitionGroupExecution _dmfDefinitionGroupExecution, String _defGroupName, DMFFileFormat _fileFormat, DMFDelimiter _rowDelimiter, DMFDelimiter _columnDelimiter, String _codePage, String _locale, NoYes _isFirstRowHeader, NoYes _unicode, String _source, String _textQualifier, DMFXMLStyle _style, String _rootElement, String _filePath, Map _entitySyncVersion, Int32 _previewCount, Boolean @_entitySyncVersion_IsDefaultSet, Boolean @_previewCount_IsDefaultSet) in xppSource://Source/ApplicationFoundation\AxClass_DMFGenerateSSISPackage.xpp:line 1273

In such cases, the usual reason is that some target fields are disabled or are otherwise causing the mapping to fail. To resolve this problem, you need to perform the following steps:

1. Refresh the entity list: go to Data management >> Framework parameters >> Entity settings, and wait until all entities are refreshed.
2. Regenerate the mapping: select the data entity with the above issue (in our case, the Sales order header V2 entity) from the data entities list in Data management, click Generate mapping, and select Yes on the generation warning.

Note: you can try to disable/remove fields from the mapping until the export starts working; this way you can at least find the problematic field. To be more efficient, disable the first half of the field list. If the export works, the problem is in one of the disabled fields.

I hope this will help you.
Error "A reference to 'xyz' is required to compile this module" solution
Many times while building a project or solution we come across the error "a reference is required to compile this module". The reason behind this error is that your module's reference packages are missing a required package. The missing module can be identified from the error itself; in the following screenshot, for example, the reference to "SourceDocumentationTypes" is missing. Now you have to add the missing reference to your module as follows:

1. Select Update model parameters from Dynamics 365 >> Model Management >> Update model parameters.
2. Select the required model name from the model list and click Next.
3. Make sure to select the checkbox in front of the required reference in the reference packages list (in our case, the SourceDocumentationTypes reference was missing) and click Next.
4. Click Finish.

After attaching the reference to the package, build the package/solution, and your error is solved.
Create Azure Connector with ARM (Azure Resource Manager) Configuration
While creating any cloud-hosted environment in LCS, it is necessary to create an Azure connector, for which ARM (Azure Resource Manager) configuration is required. This article will help you create the Azure connector. Steps to follow:

1. Role assignment in the Azure portal: for the Azure connector to work properly, make sure you have set up the role assignment in your Azure portal. Visit the Azure portal with the same credentials as those of LCS and go to the Subscriptions section. Select Access control (IAM), click the Add button, and select Add role assignment. Configure the Add role assignment fields as required and save the configuration. (A scripted alternative is sketched at the end of this post.)
2. Authorize the link: navigate back to LCS, go to Project settings >> Azure connectors, and make sure to authorize the link by clicking the Authorize button.
3. Add the connector: click the Add button in the Azure connectors section, fill in the name and Azure subscription ID, and toggle the Configure to use Azure Resource Manager (ARM) option to Yes. Click Next, check the following page, and click Next again to move to the next step.
4. Upload the management certificate: download the management certificate, then upload the downloaded certificate in the Azure portal.
5. Select a region for the connector: navigate back to the previous LCS session and complete the setup by selecting the required Azure region. Click Confirm, and your Azure connector is created.
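If you would rather script the role assignment from step 1, here is a minimal sketch using the Az PowerShell module. The sign-in name and subscription ID are placeholders, and the Contributor role is only an example; assign whichever role your LCS setup requires:

# Sign in with the same account used for LCS
Connect-AzAccount

# Assign a role on the subscription to the LCS user
# (placeholders: replace the sign-in name, role, and subscription ID)
New-AzRoleAssignment -SignInName "user@contoso.com" -RoleDefinitionName "Contributor" -Scope "/subscriptions/<your-subscription-id>"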
Data fidelity checker in Microsoft Dynamics 365 Retail
Customer service, sales, marketing, finance, and operations are the five main pillars of any organization. When you take care of these aspects well, you can easily beat the competition out there. Unfortunately, not many companies focus on these elements, as they do not know exactly what they need to do to streamline their processes. Dynamics 365 for Finance and Operations is the answer you are looking for to scale up your business: it helps you stay on top of issues in finance and operations with ease, your staff becomes a lot more efficient when they use it, and installing and using the software is pretty easy.

In this blog, I am going to shed some light on a new feature that is already available in the public preview of Microsoft Dynamics 365 Retail; general availability will be sometime in October 2019.

The statement posting process is used to account for the transactions that occur in Cloud point of sale (POS) or Modern POS (MPOS). This process uses the distribution schedule to pull a set of POS transactions into headquarters (HQ). The parameters defined on the Retail parameters and Stores pages are used to select the transactions that are pulled into individual statements. A retail transaction consistency checker (validation checker) identifies and isolates inconsistent transactions before they are picked up by the statement posting process.

We faced a lot of issues with statement posting: the process often asks you to run the Validate store transactions job, but the statement still won't post, and raising a Microsoft support ticket is the only way to fix it. I believe the data fidelity checker should make the statement posting process more efficient.

The fidelity checker does the following:

1. Checks the data for omissions and anomalies, so that only transactions that pass validation are included in the statement process.
2. Validates records, gift cards, and the tax table.
3. Lets users fix the transactions that are not in line with auditing requirements.

Also, currently the statement posting process first reserves inventory throughout the day as the transactions are carried out; this reservation is temporary. Then, at the end of the day during statement posting, the system removes these inventory reservations and creates sales orders, payment journals, and ledger journals. This process is not very efficient, because when all the transactions are processed together at the end of the day, it can result in overloading and failure. To address this, in future this process will be removed altogether, which means no temporary inventory job will be created. A new job that runs on a predefined schedule will create sales orders, invoice them, and create, post, and apply payments. The statement document created at the end of the day will only be used to calculate and post any counting variances.
