
Tag Archives: Finance and Operations

Shopify Meets Dynamics 365 Finance and Operations: A Guide to Integration [Part 1]

Introduction

The integration of Shopify with Dynamics 365 Finance and Operations (FnO) starts by creating a secure link. The first step in this process is generating an API token within Shopify, which serves as the credential for verified communication between the two systems. In this blog, I will walk you through the steps to create the API token, giving your integration a seamless start.

Pre-requisites

Access to the Shopify Admin account, with appropriate permissions to create private apps or access custom apps.
API access enabled in your Shopify store.
A basic understanding of API concepts and authentication methods. [Available in References]
The URL or endpoint details where the API calls will be directed. [Available in References]

References

Shopify – How to generate an API token
RestfulAPI.net – Basics of REST APIs
Shopify.dev – REST API documentation

Configuration

Step 1: Access the Shopify Admin Portal
Log in to your Shopify store's Admin account and navigate to Apps from the main menu.

Step 2: Create a Custom App
Click on Develop Apps (available under Apps). Select Create an App and provide a name (e.g., "Dynamics365_Integration"). Assign a developer or admin as the app owner.

Step 3: Configure API Scopes
After creating the app, click on it to open the configuration page. Under the Configuration section, define the API scopes required for your integration; you can change these later if required. Click Save to save the changes.

Step 4: Generate the API Token
Once the scopes are set, click on the API credentials tab, then click Install App to generate the credentials. A unique Access Token will be displayed. Copy and securely store this token, as it will not be shown again. If you scroll down, you will also see the API Key and API Secret; store these values as well.

Step 5: Test the Token
Use a tool like Postman to test the API token. Set up a GET request to an API endpoint (e.g., https://<Store Name>.myshopify.com/admin/api/2023-07/products.json) and include the token in the header as X-Shopify-Access-Token. Alternatively, embed the API key and secret in the URL (e.g., https://<API Key>:<API Secret>@<Store Name>.myshopify.com/admin/api/2023-07/products.json). Verify the response to confirm the token is working correctly. A PowerShell sketch of this test appears at the end of this post.

Conclusion

The API token is your gateway to integrating Shopify with Dynamics 365 Finance and Operations. By following this guide, you have taken the first critical step toward seamless data flow between your e-commerce platform and back-office operations. In the next blog, we will explore how to configure Dynamics 365 Finance and Operations to connect with Shopify and start synchronizing data. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
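If you prefer the command line to Postman, the same test can be run from PowerShell. A minimal sketch, assuming placeholder values for the store name and access token (replace both with your own):

    # Test a Shopify Admin API token from PowerShell.
    $storeName = "your-store"           # placeholder: your Shopify store name
    $token     = "shpat_xxxxxxxxxxxx"   # placeholder: the Access Token copied in Step 4

    $headers = @{ "X-Shopify-Access-Token" = $token }
    $uri = "https://$storeName.myshopify.com/admin/api/2023-07/products.json"

    # A successful call returns a JSON body with a "products" array.
    $response = Invoke-RestMethod -Uri $uri -Method Get -Headers $headers
    $response.products | Select-Object id, title

A 200 response listing product IDs and titles confirms the token and scopes are working.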

Your connection is not private | NET::ERR_CERT_DATE_INVALID Error in D365 Finance and Operations

As the title says, the error "Your connection is not private | NET::ERR_CERT_DATE_INVALID" appears (as shown in the screenshot) when you try to open the environment link in a browser. In my case, the reason for this error was that the SSL certificate had expired, as you can see in the following screenshot. (A quick PowerShell check of the certificate expiry date appears at the end of this post.) To solve this issue, go to lcs.dynamics.com, sign in with your credentials, and open your project. Select the affected environment, visit its See details / Full details page, click the Maintain button, and select Rotate secrets. Now click Rotate SSL certificates; a dialog box will appear, and you should click Yes. Wait until the rotating-secrets operation finishes loading, as shown in the following screenshot, after which you will be able to access your D365 Finance and Operations link again. Hope this blog helps you. Thank you.
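To confirm that the certificate has actually expired before rotating it, you can read its expiry date from PowerShell. A minimal sketch, assuming a placeholder host name for your environment:

    # Read the SSL certificate expiry date of a D365 FnO environment.
    $envHost = "yourenv.cloudax.dynamics.com"   # placeholder: your environment host

    $tcp = New-Object System.Net.Sockets.TcpClient($envHost, 443)
    # The { $true } callback accepts any certificate, so even an expired one can be inspected.
    $ssl = New-Object System.Net.Security.SslStream($tcp.GetStream(), $false, { $true })
    $ssl.AuthenticateAsClient($envHost)
    $cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($ssl.RemoteCertificate)
    "Certificate expires: $($cert.NotAfter)"
    $ssl.Dispose(); $tcp.Close()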

RSAT (Regression Suite Automation Tool) implementation and configuration for Finance and Operations

Purpose

The Regression Suite Automation Tool (RSAT) significantly reduces the time and cost of user acceptance testing. It enables functional power users to record business tasks using the Finance and Operations Task recorder and convert these recordings into a suite of automated tests without writing source code. Test libraries are stored and distributed in Lifecycle Services (LCS) using Business Process Modeler (BPM) libraries, which are also fully integrated with Azure DevOps Services for test execution, reporting, and investigation. Test parameters are decoupled from test steps and stored in Microsoft Excel files.

Prerequisites

Dynamics 365 for Finance and Operations test environment (demo or Tier 2 UAT environment).
Microsoft Excel.
Azure DevOps: you will need an Azure DevOps Test Manager or Test Plans license. For example, if you have a Visual Studio Enterprise subscription, you already have a license to Test Plans. Pricing: https://azure.microsoft.com/en-us/pricing/details/devops/azure-devops-services/. For a demo environment, you don't need to buy any license.
Authentication certificate: to enable secure authentication, RSAT requires a certificate to be installed on the RSAT client computer. The RSAT settings dialog box allows you to automatically create and install the authentication certificate.

Installation

Download Regression Suite Automation Tool.msi to your machine. RSAT requires Selenium and web browser driver libraries; RSAT will prompt you if the needed libraries are missing and will automatically install them for you.

Configuration of RSAT

Open the RSAT application and select the Settings button in the upper right to configure RSAT. The next steps will help you find the inputs for the required fields.

Go to the LCS project settings for your project, then go to Visual Studio Team Services. Here you need to mention the Azure DevOps project in the Azure DevOps site URL field. To do that, go to https://www.visualstudio.com, open Azure DevOps, and create a new organization if an existing one is not available. Then create a new project as shown below.

Next, set up a personal access token by clicking Account info > Security. Once you create the token, save it, as you will not be able to access it again when you want to use it.

Once that is done, go back to the main page and create a new test plan. Name it RSAT-TT (or use any name). Right-click RSAT-TT and create a new suite; you can name it 'Demo'. The Azure DevOps setup is done.

In the Azure DevOps site URL field, mention the organization name you set up in Azure DevOps, and in the Personal access token field, paste the token you saved earlier. Click Continue to select the project, continue, and save. Now you need to deploy it to the environment.

Next, open the Regression Suite Automation Tool and go to Settings. Copy the Azure DevOps URL value from LCS, and the access token should be the security token you copied earlier. Click Test connection so that the Project name and Test plan fields populate.

Now run the VM. You will find the Hostname and SOAP Hostname by opening IIS and right-clicking AOSService > Edit Bindings. Copy both values and paste them into the Hostname and SOAP Hostname fields. The Admin username should be the username you use to log in to your environment.
To generate the Thumbprint, click New, save the certificate at any location, and then copy the generated certificate to the VM. Open the copied certificate and install it on the local machine into both the Personal and Trusted Root Certification Authorities stores. (A PowerShell sketch for this install step appears at the end of this post.) Now open the wif file in Notepad in admin mode from the given location on the VM. In the wif file, check whether an authority with name CN=127.0.0.1 exists. If not, copy the selected portion, paste it below the same authority block, and modify those lines as follows:

<authority name="CN=127.0.0.1">
  <keys>
    <add thumbprint="F46D2F16C0FA0EEB5FD414AEC43962AF939BD89A"/>
  </keys>
  <validIssuers>
    <add name="127.0.0.1"/>
  </validIssuers>
</authority>

(Note: add the thumbprint of the installed certificate in the wif file as shown.)

The final steps: copy the thumbprint from the RSAT settings (generated when you clicked New) and paste it into the wif file on your VM. Then mention the company name and the working directory, set the default browser to Internet Explorer, save, and click OK.

Next, go to LCS, open Business Process Modeler, and create a new library. Name it RSAT, go to Edit, and rename the process as required; you may add a child node to it by clicking Add process.

Now go to Finance and Operations and open the Task recorder. Create a recording by clicking Create recording, perform the operation, and then click the Stop button. Name it as per your need, then choose Save to Lifecycle Services or Save to this PC, and click OK.

Now go back to LCS, open the project library, click the Requirements tab, and check that it is syncing. Then run Sync test cases and VSTS sync.

Next, go to Azure DevOps, open Test Cases, and click Add existing. Then click Run query and click Add test cases.

Now go to the Regression Suite Automation Tool, load the test plan, and download the test cases. Select a test, click New, and generate the test execution parameter files. Then click the Edit option to edit the values in Excel (the dialog differs slightly between older and newer versions). Edit the metadata for the test in the Excel file, then save and close.

Now run the test. After this step, an automated session for the test is handled by Selenium, in which the browser performs the steps from the test case. After the test completes successfully, click Upload (note the result as Passed).
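For the certificate-install step above, here is a minimal PowerShell sketch. The certificate path is a placeholder, and the commands assume the certificate was copied to the VM as a .cer file; run them in an elevated session (RSAT's own Settings dialog can also create and install the certificate for you):

    # Install the RSAT authentication certificate into the local machine's
    # Personal and Trusted Root Certification Authorities stores.
    $certPath = "C:\Temp\RSATCertificate.cer"   # placeholder: wherever you copied the certificate

    Import-Certificate -FilePath $certPath -CertStoreLocation Cert:\LocalMachine\My
    Import-Certificate -FilePath $certPath -CertStoreLocation Cert:\LocalMachine\Root

    # Print the thumbprint so it can be pasted into the wif authority block.
    (Get-PfxCertificate -FilePath $certPath).Thumbprint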

How to resolve the workflow editor error "Application cannot be started. Contact the application vendor"

Sometimes when you try to open the workflow editor, you receive the error "Application cannot be started. Contact the application vendor", as shown in the screenshot. This problem can be caused by multiple versions of the application being present on your system. Let's see how to solve it. First things first: make sure you are using the Internet Explorer browser for the workflow editor. If you are using Internet Explorer, go to Settings, then Internet Options, and try to connect to the application again. If the above step does not work for you, then there are probably multiple versions of the application on your system. To resolve this, visit C:\Users\*YourUserFolder*\AppData\Local\Apps\2.0, search for "workflow", select the second application file, and open its file location. (AppData may be hidden in some cases.) A PowerShell sketch for locating these cached copies appears at the end of this post. In the opened location, delete all the files other than the folders. After this, download the latest workflow editor, and it should work.
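If you prefer to locate the cached copies from PowerShell rather than browsing manually, a minimal sketch (it simply searches the ClickOnce cache folder mentioned above for items whose names contain "workflow"):

    # List ClickOnce cache items related to the workflow editor.
    $cache = Join-Path $env:LOCALAPPDATA "Apps\2.0"

    Get-ChildItem -Path $cache -Recurse |
        Where-Object { $_.Name -match "workflow" } |
        Select-Object FullName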

How to create and apply a workflow for purchase orders in D365 Finance and Operations

In this blog, we will learn how to create workflows in D365 Finance and Operations, taking a purchase order workflow as an example. This will allow us to create a purchase order that is allocated to different persons for the approval and review process.

Navigate to Procurement and sourcing >> Setup >> Procurement and sourcing workflows, click New, and select Purchase order workflow as follows. This opens the workflow editor, in which you first need to provide login details, the same as those of the environment. Here we need to arrange various components and set their properties to resolve the validation errors. The components for this type of workflow are:

Start: indicates the start of the workflow design.
End: indicates the end of the workflow design.
Review purchase order: assigns reviewers (Complete/Return PO).
Approve purchase order: assigns the users who need to approve the purchase order.

To design the workflow, follow these steps. In the designer, create the design as shown in the screenshot. Select the Review element, right-click, open Properties, and configure the basic settings as follows. In Assignment, make sure you have assigned the type (in our case, User) and the user name. You can also escalate to other roles after a certain time (we are not considering this setup for this blog).

Now go back to Approve purchase order, open its properties, and set an automatic action as follows, which will auto-approve purchase orders below 10,000 USD. Set the notifications for the people who should receive a notification when a particular operation is performed (e.g., approved, rejected).

Now click on Step 1 to enter the Step 1 section and open its properties. First, assign the user who will approve the purchase order, as the screenshot suggests; you can also set a time limit for approval and a completion policy. You can also add a condition to Step 1 that decides whether this step runs or not. Now close the step and get back to the main design of the designer.

Click Save and close, mention the version notes, and activate the workflow. You can now see the new workflow in Procurement and sourcing workflows. Now create a new purchase order; after that, click the Workflow button and click Submit. You can also check its history. Another user will then complete the purchase order review and add a comment, and the user with approval authority will approve it from Common >> Work items assigned to me.

Model import and export in D365 Finance and Operations using PowerShell

When we want to move customization done in a specific model from one environment to another development environment, we need to export and import the model file.

Steps for model export and import using PowerShell:

Open PowerShell in administrator mode and change the directory to the path of the package bin folder.

Export command:

.\ModelUtil.exe -export -metadatastorepath=C:\AOSService\PackagesLocalDirectory -modelname="name of model" -outputpath=<path to store the exported model>

For example, if the model name is TOUpgradeModel and I want to store the model file at C:\Temp\ModelFile, the command will be as follows:

.\ModelUtil.exe -export -metadatastorepath=K:\AosService\PackagesLocalDirectory -modelname="TOUpgradeModel" -outputpath=C:\Temp\ModelFile

You can see the output file at the specified path.

Import command:

.\ModelUtil.exe -import -metadatastorepath=C:\AOSService\PackagesLocalDirectory -file=<path of the .axmodel file to import>

For example:

.\ModelUtil.exe -import -metadatastorepath=C:\AOSService\PackagesLocalDirectory -file=C:\Temp\ModelFile\TOUpgradeModel-Cloudfront.axmodel

(Note: if the model already exists in your environment, trying to import the same model will produce the error message "Model already exists". In that case, delete the existing model with the command

.\ModelUtil.exe -delete -metadatastorepath=C:\AOSService\PackagesLocalDirectory -modelname="TOUpgradeModel"

and then try to import the model again.)

Create Azure Connector with ARM (Azure Resource Manager) Configuration

While creating any cloud-hosted environment in LCS, it is necessary to create an Azure connector, for which ARM (Azure Resource Manager) configuration is required. This article will help you create the Azure connector.

Steps to follow:

Role assignment in the Azure portal. For the Azure connector to work properly, make sure you have set up the role assignment in your Azure portal. Visit the Azure portal with the same credentials as those of LCS and go to the Subscriptions section. Select Access control (IAM), click the Add button, and select Add role assignment. Configure the Add role assignment fields as follows and save the configuration.

Authorize the link. Navigate back to LCS, go to Project settings >> Azure connectors, and make sure to authorize the link by clicking the Authorize button.

Add the connector. Click the Add button in the Azure connectors section, add the Name and Azure subscription ID, and toggle the Configure to use Azure Resource Manager (ARM) option to Yes. Click Next and check the following page, then click Next again to move to the next step.

Upload the management certificate. Download the management certificate, then upload the downloaded certificate in the Azure portal as follows.

Select a region for the connector. Navigate back to the previous LCS session and complete the setup by selecting the required Azure region. Click Confirm, and your Azure connector is created; the screen looks as follows.

Commands to Import .bacpac file to D3FOE SQL Server

Introduction: This blog article explains how to import a .bacpac file, created from a Finance and Operations database based on Azure SQL Server, into Microsoft SQL Server. You can refer to the steps here for creating the .bacpac file.

The following points are recommended for smooth and secure importing of the .bacpac file:

Take a backup of the existing database so you can revert if required.
Import the database with a new name and rename it later, once the entire process has completed error-free.
Copy the .bacpac file to the local computer where you want to import the database, for better performance.

Steps: Run the command prompt as an administrator and run the commands below to import the database:

cd C:\Program Files (x86)\Microsoft SQL Server\130\DAC\bin
SqlPackage.exe /a:import /sf:D:\Exportedbacpac\SSProd.bacpac /tsn:localhost /tdn:SSProd /p:CommandTimeout=1200

where:
tsn (target server name) – the name of the SQL Server to import into.
tdn (target database name) – the name of the database to import into. The database should not already exist.
sf (source file) – the path and name of the file to import from.

Update the database: Run the following script against your database to re-add the users you deleted while creating the .bacpac file. Update your database name in the ALTER DATABASE command.

CREATE USER axdeployuser FROM LOGIN axdeployuser
EXEC sp_addrolemember 'db_owner', 'axdeployuser'

CREATE USER axdbadmin FROM LOGIN axdbadmin
EXEC sp_addrolemember 'db_owner', 'axdbadmin'

CREATE USER axmrruntimeuser FROM LOGIN axmrruntimeuser
EXEC sp_addrolemember 'db_datareader', 'axmrruntimeuser'
EXEC sp_addrolemember 'db_datawriter', 'axmrruntimeuser'

CREATE USER axretaildatasyncuser FROM LOGIN axretaildatasyncuser
EXEC sp_addrolemember 'DataSyncUsersRole', 'axretaildatasyncuser'

CREATE USER axretailruntimeuser FROM LOGIN axretailruntimeuser
EXEC sp_addrolemember 'UsersRole', 'axretailruntimeuser'
EXEC sp_addrolemember 'ReportUsersRole', 'axretailruntimeuser'

CREATE USER axdeployextuser WITH PASSWORD = '<password from LCS>'
EXEC sp_addrolemember 'DeployExtensibilityRole', 'axdeployextuser'

CREATE USER [NT AUTHORITY\NETWORK SERVICE] FROM LOGIN [NT AUTHORITY\NETWORK SERVICE]
EXEC sp_addrolemember 'db_owner', 'NT AUTHORITY\NETWORK SERVICE'

UPDATE T1
SET T1.storageproviderid = 0,
    T1.accessinformation = '',
    T1.modifiedby = 'Admin',
    T1.modifieddatetime = getdate()
FROM docuvalue T1
WHERE T1.storageproviderid = 1 --Azure storage

ALTER DATABASE [<your AX database name>] SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 6 DAYS, AUTO_CLEANUP = ON)
GO

-- Begin Refresh Retail FullText Catalogs
DECLARE @RFTXNAME NVARCHAR(MAX);
DECLARE @RFTXSQL NVARCHAR(MAX);
DECLARE retail_ftx CURSOR FOR
SELECT OBJECT_SCHEMA_NAME(object_id) + '.' + OBJECT_NAME(object_id) fullname
FROM SYS.FULLTEXT_INDEXES
WHERE FULLTEXT_CATALOG_ID = (SELECT TOP 1 FULLTEXT_CATALOG_ID FROM SYS.FULLTEXT_CATALOGS WHERE NAME = 'COMMERCEFULLTEXTCATALOG');
OPEN retail_ftx;
FETCH NEXT FROM retail_ftx INTO @RFTXNAME;

BEGIN TRY
    WHILE @@FETCH_STATUS = 0
    BEGIN
        PRINT 'Refreshing Full Text Index ' + @RFTXNAME;
        EXEC SP_FULLTEXT_TABLE @RFTXNAME, 'activate';
        SET @RFTXSQL = 'ALTER FULLTEXT INDEX ON ' + @RFTXNAME + ' START FULL POPULATION';
        EXEC SP_EXECUTESQL @RFTXSQL;
        FETCH NEXT FROM retail_ftx INTO @RFTXNAME;
    END
END TRY
BEGIN CATCH
    PRINT error_message()
END CATCH

CLOSE retail_ftx;
DEALLOCATE retail_ftx;
-- End Refresh Retail FullText Catalogs

Run the re-provision tool: To ensure the Retail components are functional, run the re-provision tool. Find the steps here on how to run the tool.
Start using the new database: To switch over to the new database, stop the services below (a PowerShell sketch of this switch-over appears at the end of this post):

World Wide Web Publishing Service
Finance and Operations Batch Management Service
Management Reporter 2012 Process Service

Once the services are stopped, rename your original database to 'AxDB_orig' and the new database to 'AxDB', then restart the services.

Conclusion: This is how you can create a new database and import the data from a .bacpac file. If you face any issue, you can switch back to the original database or restore the copy of the original database created prior to the import.
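Here is a minimal PowerShell sketch of the switch-over, assuming the SqlServer module is installed and that the imported database kept the name SSProd used in the import command above; the service display names are taken from the list above, so verify the exact names in services.msc before running:

    # Stop the services, swap the database names, and restart the services.
    $services = @(
        "World Wide Web Publishing Service",
        "Finance and Operations Batch Management Service",   # verify the exact display name in services.msc
        "Management Reporter 2012 Process Service"
    )
    $services | ForEach-Object { Stop-Service -DisplayName $_ -ErrorAction SilentlyContinue }

    # Rename the original database out of the way, then promote the imported one.
    # If a rename fails because the database is in use, set it to SINGLE_USER first.
    Invoke-Sqlcmd -ServerInstance "localhost" -Query "ALTER DATABASE [AxDB] MODIFY NAME = [AxDB_orig];"
    Invoke-Sqlcmd -ServerInstance "localhost" -Query "ALTER DATABASE [SSProd] MODIFY NAME = [AxDB];"

    $services | ForEach-Object { Start-Service -DisplayName $_ -ErrorAction SilentlyContinue }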
