D365 Finance and Operations Archives - Page 11 of 12

Tag Archives: D365 Finance and Operations

Create a security role in D365 Finance and Operations

In D365 Finance and Operations, when you need to grant or restrict a user's access to a certain operation, you can make use of security roles. You can create security roles from the Finance and Operations environment itself or from its development tool, i.e., Visual Studio. In this blog, we are going to create a security role in Visual Studio as follows.

Create a privilege

First of all, we need to create a privilege. Add a new entry point to it and set the Object Type property (in our case, Display menu item). Then set the Object Name property to the name of the display menu item.

Create a role

Next, create the security role in which the privilege created above will be used. Add a new privilege node to the role and, in its properties, select the privilege created in the previous step.

Create a duty

Now create a new duty and assign the previously created privilege in its properties.

After the objects are built, you can see the security role in the Finance and Operations environment. Go to System administration >> Users, select any user, click Assign roles, search for the previously created role, and click OK. The security role is now assigned to the user; only users with this role (and system administrators) will be able to see the secured objects such as forms and reports.
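As a side note, the role assignment done above through the Users form can also be scripted. The following runnable job is only a minimal sketch, not part of the original post, assuming the standard SecurityRole and SecurityUserRole tables; the role AOT name CFSTestRole and the user id TestUser are hypothetical placeholders, and field and enum names should be verified against your version.

    // Minimal sketch: assign an existing security role to a user from X++.
    // The role AOT name and user id below are illustrative only.
    final class CFSAssignRoleJob
    {
        public static void main(Args _args)
        {
            SecurityRole     role;
            SecurityUserRole userRole;
            UserId           userId = 'TestUser';   // hypothetical user id

            // Look up the role by its AOT name.
            select firstonly role
                where role.AotName == 'CFSTestRole';

            // Check whether the assignment already exists.
            select firstonly userRole
                where userRole.User         == userId
                   && userRole.SecurityRole == role.RecId;

            if (role && !userRole)
            {
                ttsbegin;
                userRole.clear();
                userRole.User             = userId;
                userRole.SecurityRole     = role.RecId;
                userRole.AssignmentMode   = RoleAssignmentMode::Manual;
                userRole.AssignmentStatus = RoleAssignmentStatus::Enabled;
                userRole.insert();
                ttscommit;
            }
        }
    }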

Customize Purchase order approval status

In D365 Finance and Operations, when you approve a purchase requisition, by default the system creates the purchase order with the approval status “Approved”, as shown in the screenshot.

To change this default behavior so that the purchase order is created with the approval status Draft once the purchase requisition is approved, you can use the following class:

    class CFSPOStatus
    {
        /// <summary>
        /// Resets the document state of the purchase order created from an approved
        /// purchase requisition back to Draft.
        /// </summary>
        /// <param name="args">The post-handler arguments.</param>
        [PostHandlerFor(classStr(PurchAutoCreate_PurchReq), methodStr(PurchAutoCreate_PurchReq, endUpdate))]
        public static void PurchAutoCreate_PurchReq_Post_endUpdate(XppPrePostArgs args)
        {
            PurchAutoCreate_PurchReq purchReq = args.getThis() as PurchAutoCreate_PurchReq;
            PurchTable purchTable, purchTableNew;

            purchTable = purchReq.parmPurchTable();

            ttsbegin;

            select forupdate purchTableNew
                where purchTableNew.PurchId == purchTable.PurchId;

            if (purchTableNew && purchTableNew.DocumentState == VersioningDocumentState::Approved)
            {
                purchTableNew.DocumentState = VersioningDocumentState::Draft;
                purchTableNew.update();
            }

            ttscommit;
        }
    }

Your final result looks like the screenshot above. After changing the status, you can apply your own purchase order workflow to it. For the purchase order workflow, you can refer to my blog.

Develop Custom Workflow: Counting Journals workflow

In this blog, you will learn how to develop a custom workflow that is not present out of the box in D365 Finance. For this blog, I am creating a workflow for the Counting journal (Inventory management >> Journal entries >> Item counting >> Counting), because there is no such workflow for the Inventory management module. The following steps are to be followed:

1. Create a base enum for the document status
2. Create a table extension and add the enum to the table
3. Add the document status field to the form extension
4. Create a query
5. Create a workflow category
6. Create a workflow type
7. Add the canSubmitToWorkflow and updateWorkflowStatus methods to the table's class extension
8. Update the SubmitManager class
9. Update the EventHandler class
10. Create the workflow approval
11. Add the approval element to the workflow type
12. Add the workflow setup form to the Inventory management module

Now we will perform the steps in detail.

1. Create a base enum for the document status
Make sure you have created a solution and a project, and assigned the model to the project. Add a new item, select Base Enum, and name it CFS_InventoryCountingWorkflow.

2. Create a table extension and add the enum to the table
We are creating a workflow for inventory management, so we need to create an extension of InventJournalTable and drag the base enum created above onto the table, which creates a new field of type enum.

3. Add the document status field to the form extension
Create an extension of the form InventJournalCount and add the newly created table field to the form's grid by dragging it from the data source to the grid control, and assign the label "Approval status".

4. Create a query
Add a new query to the project, name it CFS_InventoryCountingWorkflow, add InventJournalTable to it, and set the properties as follows.

5. Create a workflow category
Add a workflow category to the project, name it CFS_InventJournalCounting, and set the properties as follows.

6. Create a workflow type
After adding the workflow category, add a workflow type, name it CFS_InventoryJournalCounting, and set its properties as follows. This creates new elements such as classes and action menu items for the submit and other workflow actions.

7. Add the canSubmitToWorkflow and updateWorkflowStatus methods to the table
To add methods to the table, we need a table class extension. Add a new class, name it InventJournalTable_CFS_Extension, and add the updateWorkflowStatus and canSubmitToWorkflow methods. You can use the following code:

    [ExtensionOf(tableStr(InventJournalTable))]
    final class InventJournalTable_CFS_Extension
    {
        public boolean canSubmitToWorkflow(str _workflowType)
        {
            boolean ret = next canSubmitToWorkflow(_workflowType);

            ret = this.CFS_InventoryCountingWorkflow == CFS_InventoryCountingWorkflow::Draft;
            return ret;
        }

        public static void updateWorkflowStatus(RecId _documentRecId, CFS_InventoryCountingWorkflow _status)
        {
            ttsbegin;

            InventJournalTable document;

            update_recordset document
                setting CFS_InventoryCountingWorkflow = _status
                where document.RecId == _documentRecId;

            ttscommit;
        }
    }
8. Update the SubmitManager class
Make sure the submit manager class keeps the name generated by the workflow type wizard, or rename it as follows, and add the following code to it:

    public class CFS_InventoryJournalCountingSubmitManager
    {
        private InventJournalTable document;
        private WorkflowVersionTable versionTable;
        private WorkflowComment comment;
        private WorkflowWorkItemTable workItem;
        private SysUserId userId;
        private boolean isSubmission;
        private WorkflowTypeName workflowType;

        public static void main(Args args)
        {
            // TODO: Write code to execute once a work item is submitted.
            if (args.record().TableId != tableNum(InventJournalTable))
            {
                throw error('Error attempting to submit document');
            }

            InventJournalTable document = args.record();
            FormRun caller = args.caller() as FormRun;
            boolean isSubmission = args.parmEnum();
            MenuItemName menuItem = args.menuItemName();

            CFS_InventoryJournalCountingSubmitManager manager = CFS_InventoryJournalCountingSubmitManager::construct();
            manager.init(document, isSubmission, caller.getActiveWorkflowConfiguration(), caller.getActiveWorkflowWorkItem());

            if (manager.openSubmitDialog(menuItem))
            {
                manager.performSubmit(menuItem);
            }

            caller.updateWorkflowControls();
        }

        /// <summary>
        /// Construct method
        /// </summary>
        /// <returns>new instance of submission manager</returns>
        public static CFS_InventoryJournalCountingSubmitManager construct()
        {
            return new CFS_InventoryJournalCountingSubmitManager();
        }

        /// <summary>
        /// parameter method for document
        /// </summary>
        /// <param name = "_document">new document value</param>
        /// <returns>current document</returns>
        public InventJournalTable parmDocument(InventJournalTable _document = document)
        {
            document = _document;
            return document;
        }

        /// <summary>
        /// parameter method for version
        /// </summary>
        /// <param name = "_versionTable">new version table value</param>
        /// <returns>current version table</returns>
        public WorkflowVersionTable parmVersionTable(WorkflowVersionTable _versionTable = versionTable)
        {
            versionTable = _versionTable;
            return versionTable;
        }

        /// <summary>
        /// parameter method for comment
        /// </summary>
        /// <param name = "_comment">new comment value</param>
        /// <returns>current comment value</returns>
        public WorkflowComment parmComment(WorkflowComment _comment = comment)
        {
            comment = _comment;
            return comment;
        }

        /// <summary>
        /// parameter method for work item
        /// </summary>
        /// <param name = "_workItem">new work item value</param>
        /// <returns>current work item value</returns>
        public WorkflowWorkItemTable parmWorkItem(WorkflowWorkItemTable _workItem = workItem)
        {
            workItem = _workItem;
            return workItem;
        }

        /// <summary>
        /// parameter method for user
        /// </summary>
        /// <param name = "_userId">new user value</param>
        /// <returns>current user value</returns>
        public SysUserId parmUserId(SysUserId _userId = userId)
        {
            userId = _userId;
            return userId;
        }

        /// <summary>
        /// parameter method for isSubmission flag
        /// </summary>
        /// <param name = "_isSubmission">flag value</param>
        /// <returns>current flag value</returns>
        public boolean parmIsSubmission(boolean _isSubmission = isSubmission)
        {
            isSubmission = _isSubmission;
            return isSubmission;
        }

        /// <summary>
        /// parameter method for workflow type
        /// </summary>
        /// <param name = "_workflowType">new workflow type value</param>
        /// <returns>current workflow type</returns>
        public WorkflowTypeName parmWorkflowType(WorkflowTypeName _workflowType = workflowType)
        {
            workflowType = _workflowType;
            return workflowType;
        }

        /// <summary>
        /// Opens the submit dialog and returns result
        /// </summary>
        /// <returns>true if dialog closed okay</returns>
        protected boolean openSubmitDialog(MenuItemName _menuItemName)
        {
            if (isSubmission)
            {
                return …

Continue reading Develop Custom Workflow: Counting Journals workflow
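The excerpt is truncated before step 9 (Update the EventHandler class), so the original handler code is not shown here. As a hedged sketch only: the workflow type wizard generates an event handler class (assumed here to be named CFS_InventoryJournalCountingEventHandler, based on the type name used in this post) whose started, canceled, and completed methods can update the document status through the static method added in step 7. The Submitted and Approved values are assumptions about the CFS_InventoryCountingWorkflow base enum; adjust all names to match your project.

    // Minimal sketch of the workflow type event handler (step 9).
    // Class name and enum values are assumptions, not the original post's code.
    public final class CFS_InventoryJournalCountingEventHandler implements
        WorkflowStartedEventHandler,
        WorkflowCanceledEventHandler,
        WorkflowCompletedEventHandler
    {
        public void started(WorkflowEventArgs _workflowEventArgs)
        {
            // Mark the journal as submitted when the workflow instance starts.
            InventJournalTable::updateWorkflowStatus(
                _workflowEventArgs.parmWorkflowContext().parmRecId(),
                CFS_InventoryCountingWorkflow::Submitted);
        }

        public void canceled(WorkflowEventArgs _workflowEventArgs)
        {
            // Return the journal to Draft if the workflow is canceled.
            InventJournalTable::updateWorkflowStatus(
                _workflowEventArgs.parmWorkflowContext().parmRecId(),
                CFS_InventoryCountingWorkflow::Draft);
        }

        public void completed(WorkflowEventArgs _workflowEventArgs)
        {
            // Mark the journal as approved once the workflow completes.
            InventJournalTable::updateWorkflowStatus(
                _workflowEventArgs.parmWorkflowContext().parmRecId(),
                CFS_InventoryCountingWorkflow::Approved);
        }
    }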

Overview of Modern POS with Retail Store Scale Unit (RSSU)

Retail Store Scale Unit allows retailers to keep selling products in store locations that have internet connectivity issues and cannot connect to headquarters (HQ). Retail Store Scale Unit supports both Modern POS and Cloud POS within the store. MPOS with Retail Store Scale Unit allows users to perform cross-terminal scenarios across multiple POS devices, such as:

Suspend shift
Close shift
Blind close shift
Manage shift
Inventory lookup
Stock count
Print X-report
Print Z-report

Cloud-based MPOS in offline mode cannot perform these operations. MPOS with Retail Store Scale Unit, however, cannot perform real-time operations such as:

Issue/pay gift cards
Issue loyalty cards
Picking and receiving
Pay by customer account
Credit card transactions
Order fulfillment
View/create time clock entries

unless there is internet connectivity to HQ or a payment provider. If most of your transactions are real-time transactions, your Store Scale Unit will always need internet connectivity to HQ or the payment provider.

RSAT (Regression Suite Automation Tool) implementation and configuration for Finance and Operations

Purpose

The Regression suite automation tool (RSAT) significantly reduces the time and cost of user acceptance testing. It enables functional power users to record business tasks using the Finance and Operations Task recorder and convert these recordings into a suite of automated tests without the need to write source code. Test libraries are stored and distributed in Lifecycle Services (LCS) using Business Process Modeler (BPM) libraries. These libraries are also fully integrated with Azure DevOps Services for test execution, reporting, and investigation. Test parameters are decoupled from test steps and stored in Microsoft Excel files.

Prerequisites

Dynamics 365 for Finance and Operations test environment (demo or tier 2 UAT environment)
Microsoft Excel
Azure DevOps: You will need an Azure DevOps Test Manager or Test Plans license. For example, if you have a Visual Studio Enterprise subscription, you already have a license to Test Plans. Pricing: https://azure.microsoft.com/en-us/pricing/details/devops/azure-devops-services/ For a demo environment, you don't need to buy any license.
Authentication certificate: To enable secure authentication, RSAT requires a certificate to be installed on the RSAT client computer. The RSAT settings dialog box allows you to automatically create and install the authentication certificate.

Installation

Download Regression Suite Automation Tool.msi to your machine. RSAT requires Selenium and web browser driver libraries; RSAT will prompt you if the needed libraries are missing and will automatically install them for you.

Configuration of RSAT

Open the RSAT application and select the Settings button in the upper right to configure RSAT. The next steps will help you find the values for the required fields.

Go to the LCS project settings for your project, then go to Visual Studio Team Services. Here you need to mention the Azure DevOps project in the Azure DevOps site URL field. To do that, open https://www.visualstudio.com.
Open Azure DevOps and create a new organization if there is not an existing one, then create a new project as shown below.
Now set up a personal access token by clicking on account info > Security. Once you create the token, save it, because you will not be able to access it again when you want to use it.
Once that is done, go back to the main page and create a new test plan. Name it RSAT-TT (or any other name). Right-click RSAT-TT and create a new suite; you can name it 'Demo'. The Azure DevOps setup is done.
In the Azure DevOps site URL field, mention the organization name that you set up in Azure DevOps, and in the Personal access token field paste the token that you saved earlier. Click Continue to select the project, continue, and Save. Now you need to deploy this to the environment.
Next, open the Regression Suite Automation Tool and go to Settings. Copy the Azure DevOps URL from LCS, and use the security token you copied as the access token. Click Test connection so the Project name and Test plan fields populate.
Now run the VM. You will find the Hostname and SOAP Hostname by going to IIS and right-clicking AOSService > Edit bindings. Copy both values and paste them into the Hostname and SOAP Hostname fields. The admin username should be the username you use to log in to your environment.
To generate the thumbprint, click New, save the certificate at any location, and then copy the generated certificate to the VM. Open the copied certificate and install it on the local machine in both the Personal and Trusted Root Certification Authorities stores. Now open the wif file in Notepad in admin mode from the given location on the VM. In the wif file, check whether an authority with the name CN=127.0.0.1 exists. If not, copy the selected portion, paste it below the same authority block, and modify those lines as follows:

    <authority name="CN=127.0.0.1">
      <keys>
        <add thumbprint="F46D2F16C0FA0EEB5FD414AEC43962AF939BD89A"/>
      </keys>
      <validIssuers>
        <add name="127.0.0.1" />
      </validIssuers>
    </authority>

(Note: add the thumbprint of the installed certificate in the wif file as shown.)

The final steps include:

Copy the thumbprint from the RSAT settings (which was generated when you clicked New) and paste it into the wif file on your VM.
Mention the company name and the working directory, set the default browser to Internet Explorer, then Save as and OK.
Next, go to LCS, open Business Process Modeler, and create a new library. Name it RSAT, go to edit, rename the process as required, and add a child node to it by clicking Add process if needed.
Now go to Finance and Operations and open the Task recorder. Create a recording by clicking Create recording, perform the operation, and then click the Stop button. Name it as per your need, choose Save to Lifecycle Services or Save this to PC, and click OK.
Go back to LCS, open the project library, click the Requirements tab, and check that it is syncing. Then sync test cases and run VSTS sync.
Next, go to Azure DevOps, open the test cases, click Add existing, then click Run query and Add test case.
Now go to the Regression Suite Automation Tool, load the test plan, and download the test cases. Select a test, click New, and generate the test execution parameter files.
Then click the Edit option to edit the values in Excel (the layout differs slightly between the older and newer versions). Edit the metadata for the test in the Excel file, then save and close.
Now run the test. After this step, an automated session for the test is handled by Selenium, and the browser performs the steps from the test case.
After the test completes successfully, click Upload (note the result as Passed).

How To Enable / Disable Maintenance mode Using SQL query in Dynamics 365 for Finance & Operations

In Finance and Operations, to make any changes in the License configuration form you need to enable maintenance mode, and after making the desired changes you need to disable this mode again. You can enable or disable maintenance mode using a SQL query as well as the command prompt. In this blog, we perform this operation using a SQL query. The steps to enable/disable maintenance mode are as follows:

1. Open SSMS (Microsoft SQL Server Management Studio) on your server.
2. Click New Query and run the following query to enable maintenance mode:

    update dbo.SQLSYSTEMVARIABLES
    set dbo.SQLSYSTEMVARIABLES.VALUE = 1
    where dbo.SQLSYSTEMVARIABLES.PARM = 'CONFIGURATIONMODE'

3. You can verify the status using the following command:

    SELECT * FROM [AxDB].[dbo].[SQLSYSTEMVARIABLES]

4. After this, restart IIS; in some cases you need to restart the server. Then perform your changes in the License configuration form.
5. After the desired changes are made, disable maintenance mode using the following command:

    update dbo.SQLSYSTEMVARIABLES
    set dbo.SQLSYSTEMVARIABLES.VALUE = 0
    where dbo.SQLSYSTEMVARIABLES.PARM = 'CONFIGURATIONMODE'

6. Verify the status again using the same query as in step 3, and repeat step 4.

I hope this blog will help you.

Move database from sandbox to development in D365 Finance and Operations

Hello, in this blog I am going to demonstrate how to move a database from a sandbox environment to a development environment. In some cases you may need to debug code against production data. For this, first move the database from production to sandbox using the Refresh database option in LCS, as shown in the screenshot below. Then move the database from sandbox to development as follows.

Steps to move the database from sandbox to development:

1. Log in to LCS and open the sandbox environment's full details page.
2. On the Maintain tab, click Move database.
3. To export the sandbox database, click Export database.
4. After the export command completes successfully, you can find the .bacpac file under Database backup in the asset library. Download the .bacpac file to the development VM.
5. Open SSMS on the development server. Before importing the database, you must rename the existing AxDB using the following script:

    USE master;
    GO
    ALTER DATABASE AxDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE
    GO
    ALTER DATABASE AxDB MODIFY NAME = AxDB_Original;
    GO
    ALTER DATABASE AxDB_Original SET MULTI_USER
    GO

6. Right-click Databases and select Import Data-tier Application.
7. Click Next.
8. Browse to the folder where the .bacpac file was downloaded, select it, and click Next.
9. Change the new database name to AxDB and click Next.
10. Click Finish to import the database.
11. You can follow the import steps as they run.
12. Once the import is done, open Visual Studio and do a full database synchronization.

I hope this blog will help you.

How to resolve workflow editor error “Application cannot be started. Contact the application vendor”

Sometimes when you try to open the workflow editor you receive the error “Application cannot be started. Contact the application vendor”, as shown in the screenshot. This problem can be caused by multiple versions of the application being present on your system. Let's see how to solve this problem:

First of all, make sure you are using the Internet Explorer browser for the workflow editor. If you are already using Internet Explorer, go to Settings > Internet options, and then try to open the application again.
If the above step does not work for you, there are probably multiple versions of the application on your system. To resolve this, go to C:\Users\*YourUserFolder*\AppData\Local\Apps\2.0, search for "workflow", select the second application file, and open its file location (the AppData folder may be hidden in some cases).
In the opened location, delete all the files other than the folder in that location.
After this, download the latest workflow editor again, and it should work now.

How to create and apply a workflow for purchase orders in D365 Finance and Operations

In this blog we will learn how to create workflows in D365 Finance and Operations, using the purchase order workflow as an example. This workflow allows a purchase order to be routed to different people for review and approval. Navigate to Procurement and sourcing >> Setup >> Procurement and sourcing workflows, click New, and select Purchase order workflow.

This opens the workflow editor, where you first need to provide login details, the same as those of the environment. Here we need to arrange the various components and set their properties to resolve the validation errors. The components for this workflow are:

Start: indicates the start of the workflow design.
End: indicates the end of the workflow design.
Review purchase order: assigns the reviewers (Complete/Return PO).
Approve purchase order: assigns the users who need to approve the purchase order.

To design the workflow, follow these steps:

In the designer, create the design as shown in the screenshot.
Select the Review element, right-click it, open Properties, and set the basic settings. In the assignment, make sure you have set the assignment type (in our case, User) and the user name. You can also escalate to other roles after a certain time (we are not considering this setup for this blog).
Now go back to Approve purchase order, open its properties, and set an automatic action that will approve purchase orders below 10,000 USD.
Set the notifications for the people who should receive a notification when a particular operation is performed (for example, approved or rejected).
Click Step 1 to enter the Step 1 section and open its properties. First, assign the user who will approve the purchase order, as the screenshot suggests; you can also set a time limit for approval and a completion policy. You can also add a condition to Step 1 that decides whether this step runs or not. Then close the step and get back to the main design in the designer.
Click Save and close, mention the version notes, and activate the workflow. You can now see the new workflow in Procurement and sourcing workflows.
Now create a new purchase order, click the Workflow button, and click Submit. You can also check its history.
Another user will then complete the purchase order review and add a comment. Finally, a user with approval authority will approve it from Common >> Work items assigned to me.
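As a side note, the Submit action performed above from the purchase order form can also be triggered from code. The following is only a minimal sketch, not the logic of the standard Submit button, assuming the out-of-the-box PurchTableDocument workflow type and an active workflow configuration; the class name, the purchase order id, and the document state handling are illustrative and should be verified against your environment.

    // Minimal sketch: submit a purchase order to the active PurchTableDocument
    // workflow configuration from X++. Error handling and workflow validation
    // are omitted; the purchase order id below is a hypothetical example.
    final class CFSPurchOrderWorkflowSubmit
    {
        public static void main(Args _args)
        {
            CFSPurchOrderWorkflowSubmit::submitPurchTableToWorkflow('PO-000123');
        }

        public static void submitPurchTableToWorkflow(PurchId _purchId)
        {
            PurchTable purchTable = PurchTable::find(_purchId, true);

            ttsbegin;

            // Start a workflow instance for this purchase order record.
            Workflow::activateFromWorkflowType(
                workflowTypeStr(PurchTableDocument),   // standard purchase order workflow type
                purchTable.RecId,
                'Submitted from code',                 // submission comment
                NoYes::No);

            // Reflect the submission on the document itself.
            purchTable.DocumentState = VersioningDocumentState::InReview;
            purchTable.update();

            ttscommit;
        }
    }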

Model import and export in D365 Finance and Operations using PowerShell

When we want to move customizations done in a specific model from one development environment to another, we need to export and import the model file. The steps for model export and import using PowerShell are:

1. Open PowerShell in administrator mode.
2. Change the directory to the path of the packages bin folder.
3. Export command:

    .\ModelUtil.exe -export -metadatastorepath=C:\AOSService\PackagesLocalDirectory -modelname="name of model" -outputpath=<path to store the exported model>

For example, if the model name is TOUpgradeModel and I want to store the model file in C:\Temp\ModelFile, the command will be:

    .\ModelUtil.exe -export -metadatastorepath=K:\AosService\PackagesLocalDirectory -modelname="TOUpgradeModel" -outputpath=C:\Temp\ModelFile

You can see the output file at the specified path.

4. Import command:

    .\ModelUtil.exe -import -metadatastorepath=C:\AOSService\PackagesLocalDirectory -file=<path of the .axmodel file to import>

For example:

    .\ModelUtil.exe -import -metadatastorepath=C:\AOSService\PackagesLocalDirectory -file=C:\Temp\ModelFile\TOUpgradeModel-Cloudfront.axmodel

(Note: if the model already exists in your environment and you try to import the same model, you will receive the error message "Model already exists". In that case, delete the existing model with the command

    .\ModelUtil.exe -delete -metadatastorepath=C:\AOSService\PackagesLocalDirectory -modelname="TOUpgradeModel"

and then try the import again.)
