D365 Finance and Operations Archives - Page 7 of 24

Category Archives: D365 Finance and Operations

Integration with Finance and Operations – From Basics (Part 1)

Introduction

Finance and Operations provides two major ways for external systems to interact with tables (or data entities) through APIs: Data Entities and Custom Services.

Data entities in D365 Finance and Operations simplify data management by grouping data from multiple tables. They make it easier to import, export, and integrate data with other systems. Custom services in D365 Finance and Operations allow developers to create web services for specific business needs. They enable external systems to interact with D365 F&O by exposing custom logic and operations, which helps in integrating and automating processes with other applications.

Feature comparison:
- Purpose: Data Entities simplify data management tasks like import, export, and integration; Custom Services expose custom business logic and operations as web services.
- Functionality: Data Entities provide structured access to data from multiple tables in a unified format; Custom Services allow external systems to perform actions or retrieve data via API calls.
- Usage: Data Entities are used for bulk data operations, data migration, and integration with external systems; Custom Services are used for real-time integration, extending functionality, and custom business process automation.
- Typical use cases: Data Entities cover data import/export, data synchronization, and data migration; Custom Services cover integration with external applications, custom business processes, and real-time data access.
- Data handling: Data Entities focus on data in bulk; Custom Services focus on specific operations or business logic.

Pre-requisites

References
- Data Entities Overview – Finance and Operations
- Build and consume data entities – Finance and Operations
- Exposing an X++ class as a Data Contract

Configuration

Here, to understand the creation of APIs in either case, we'll expose the same table using both Data Entities and Custom Services.
Data Entity

1. Right-click on the project, click "Add", and then "New Item".
2. Click on Finance and Operations > Dynamics 365 Items > Data Model, and then select "Data Entity".
3. Select the table that you want to expose in the "Primary Data Source" field, an appropriate "Entity Category", the "Public Entity Name" and "Public Entity Set Name" (which is what the endpoint will be), and the staging table name.
4. Select the necessary fields from the primary data source. You can add related tables by clicking on the small arrow next to the table name, which displays the list of all associated tables; then select the relevant fields from the associated tables.

Once done, you'll get one data entity, two security privileges, and one staging table. If you want to add new data sources, right-click on the Primary Data Source's "Data Sources" node and add a new data source. You can drag fields from any of the data sources into the "Fields" section of the data entity to make them available on the API.

Calling the Data Entity

You can call the <base url>/data URL to get a list of all the data entities available in the system. From here, if I call a "GET" request on my data entity (the "Public Collection Name" property of the data entity, which we set in the Data Entity wizard), I'll get the following response. Please note that this "Public Collection Name" is case sensitive. Now, if I need to create a "Customer" record, I can simply pass the same keys into a "POST" request, and we can see the result in FnO. If we want to update a record, we make a PUT request with the syntax:

{{base_url}}/data/TestCustomers(dataAreaId='<Company Name>',CustomerId='<Customer Id>')

The URL must include all the Entity Keys defined on the data entity; since we have only one key field here, we are simply passing that along with dataAreaId. Passing the request without the dataAreaId will throw errors. You can delete a record using the same syntax with a "DELETE" request.
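The request patterns above can be sketched in code. This is a minimal illustration, assuming the hypothetical TestCustomers entity from this walkthrough; the base URL and key values are placeholders, so substitute your own environment's values.

```python
# Sketch of the OData URL patterns used against the data entity above.
# BASE_URL and the key values are placeholders, not real endpoints.

BASE_URL = "https://yourenv.operations.dynamics.com"  # placeholder

def entity_url(entity_set: str, **keys) -> str:
    """Build an OData URL: with no keys it targets the collection (GET/POST);
    with keys it targets a single record (PUT/DELETE). The entity set name
    (Public Collection Name) is case sensitive."""
    url = f"{BASE_URL}/data/{entity_set}"
    if keys:
        # All entity keys, including dataAreaId, must be supplied.
        key_part = ",".join(f"{k}='{v}'" for k, v in keys.items())
        url += f"({key_part})"
    return url

# GET all records / POST a new record:
collection = entity_url("TestCustomers")

# PUT (update) / DELETE a single record:
record = entity_url("TestCustomers", dataAreaId="usmf", CustomerId="C0001")
```

Omitting dataAreaId from the key list reproduces the error mentioned above, since the record address is then incomplete.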
Conclusion: In this blog, we explored how to create APIs using Data Entities in Dynamics 365 Finance and Operations, simplifying data management and external system integrations. Data Entities offer an efficient way to handle bulk data operations, while Custom Services provide flexibility for exposing specific business logic. We'll see how to create APIs using Custom Services in the next blog. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


How to create user specific Purchase Requisition in Multi Entity Environment

Introduction Multinational companies have different hierarchies for purchasing: in some companies, purchases are made at the regional level, while in others they are made at the headquarters level by consolidating all regional requirements, to achieve higher buying power, economies of scale, and consistent quality of goods across all regions. In a multi-entity environment with a separate set of employees for each legal entity, only headquarters employees have purchasing authority for all regional entities, while inventory decisions are taken by regional entity employees. Each region therefore submits its requirements to headquarters. In this case, to post and report on what was purchased, and how much, for each regional entity, we need to create the purchase requisition against the respective buying legal entity. In this example, USMF is the headquarters entity and PM is the regional entity. Problem statement When a purchase requisition is created from a headquarters employee's login, it is created with the headquarters entity as the buying legal entity. For example, Julia is an employee of the headquarters entity USMF who will issue the purchase requisition, and Mahesh is an employee of the regional entity PM. When we log in to the PM entity from Julia's account and create a purchase requisition, the entity automatically changes to USMF. That is, a purchase requisition made for the PM entity through Julia's login, with details given by Mahesh, should remain on the PM entity, but the entity switches to USMF and the requisition is registered under USMF instead. Follow the steps below to create a purchase requisition with the buying legal entity set according to the information given by the respective regional entity employee, so that all details are maintained on that entity (i.e., details given by Mahesh for the purchase requisition are maintained on the PM entity). We need to add Mahesh as a requester for the PM entity in Julia's account.
1. Go to Procurement & Sourcing > Setup > Policies > Purchase Requisition Permissions.
2. On the screen that appears, choose Julia as the preparer. Requester: add all required employees of the PM entity, i.e., Mahesh, Jodi, Charlie, Ramesh, etc.
3. Go to Procurement & Sourcing > Purchase Requisitions > All Purchase Requisitions, create a new purchase requisition from the PM entity, and click on the requester field; all the names added above will be available for selection.
4. If we now add Mahesh, or any other name from the list, as the requester, the purchase requisition will be created for the PM entity, and the window will not switch back to USMF. All items added to this purchase requisition will be ordered and maintained for the PM entity.

Conclusion For businesses with multiple legal entities, appropriately configuring purchase requisition permissions in Dynamics 365 guarantees that purchases are correctly attributed to the appropriate legal entities. This improves reporting accuracy at the regional level while also streamlining the procurement process. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


CI/CD with TFS for Finance & Operations

Introduction There are 100 million developers around the globe who use Git. However, if you want to work with customizations in Finance and Operations, you need to learn how to use TFS. Initially, I was frustrated and confused about this requirement, but as I learned more about how projects are structured both locally and in TFS, things started to make sense. TFS (Team Foundation Server) is Microsoft's take on source control management, continuously released and improved since 2005. TFS keeps the source code centralized and tightly integrated with the Microsoft ecosystem. Despite the differences, if you are familiar with Git, transitioning to TFS shouldn't be too difficult. TFS shares similar concepts with Git, such as checking in, branching, merging, version tracking, and other standard features of a source control management system. Understanding these similarities can make learning TFS easier and help you leverage its full potential in Finance and Operations projects. Pre-requisites Configuration So, here we'll be starting with a new project. (If you are working with an already created repository, you can skip ahead.) Now, I'll be adding two folders here, "Main" and "Released". Later, we'll convert them into branches from Visual Studio. In TFS, we have the concept of branches, and in addition, branches can contain folders as well. Folders are used for organizing files and do not impact the version control flow directly; they are simply a way to keep the repository structured and manageable. Branches (similar to Git) are used to manage different versions or lines of development in the repository; they allow for parallel development and keep separate histories until changes are merged. Inside Main, I've added a trunk folder as well. Now, let's head into our Development Environment and connect this project to Visual Studio. I've clicked on "Continue without Code" for now.
I'll click on View and "Team Explorer". Here, it says "Offline" as there is currently no Azure DevOps project connected to it. So, let's do that! I'll click on "Manage Connections" and "Connect to a Project". As the project is hosted in my own organization on Azure DevOps, I'll use the same credentials to log in. Here, we can see all the different projects I have created within my organization in Azure DevOps. I'll click on the relevant one and click on Connect. We now see the three sections in the Team Explorer view:
1 – Which credentials are being used to connect.
2 – The name of the root of the project and where it is planning to download the content from TFS.
3 – The different components where we'll be doing most of our work once the initial setup is completed.
For now, I'll just click on "Map & Get". Here, we can see that the mapping was successful. Next, we click on the Source Control Explorer to see the actual content of the TFS server. Now, we can convert the "Main" and "Release" folders into branches. We can do this by right-clicking on the folder -> Branching and Merging -> Convert to Branch. After converting them to branches, the icon next to them changes. Next, I'll right-click on my "Main" branch and add two new folders here: "Metadata" and "Projects". Now, before we can use these folders anywhere, we need to "push" these changes to the TFS. For that, we right-click on the "Trunk" folder and click on "Check in Pending Changes". We then add a comment describing what changes have been done (similar to a commit message); at the bottom, we can see the files that have been created or modified. Once the check-in is done, we can see that the "+" icon next to the folders disappears and we get a notification that the check-in has been completed successfully. Now, this is where TFS shines through as better source control management for Finance and Operations: in FnO, models and projects are stored in separate folders.
Using Git for this setup can be tricky, as it would either mean managing two different repositories or dealing with a huge .gitignore file. TFS makes it easier by letting you map local folders directly to TFS folders, simplifying the management process. Here, we can see that our current mapping is a bit different from what we need; this is because of the "Map & Get" we did initially. To change that mapping, click on "Workspaces", then click on "Edit". Now, we click on a new line to create a new mapping. Here, I'm creating a mapping between the "Metadata" folder in the "Main" branch of the TFS and the "PackageLocalDirectory", the place where all the models are stored on my system. Next, I'll create another mapping between the Projects folder and the local folder where my projects are stored. Once I click on "OK", it'll ask if I want to load the changes. Click on "Yes" and move forward. But nothing changes here in Source Control Explorer. That's because the Source Control Explorer shows what is stored in the TFS, and right now, nothing is; so we'll have to add some models or projects here. Either we can add existing ones or we can create a new one. Let's try to create a new model. Now that the model is created, we'll need to add it to our source control. Click on the blank space within the "Metadata" folder and select "Add Items to Folder". In the window, we can see that because of the mapping, we are taken to the local directory "PackageLocalDirectory", and we can see our model inside it. Select that and click on "Next". In the next view, we can see all the files and folders contained within the selected folder. Out of these, we can exclude the "Delta" folders. After this, we are left with these folders for the different elements. We can remove the content from the "XppMetadata" folders as well, which leaves us with just the descriptor XML file.
**Please do not exclude the descriptor file, as without it Visual Studio will not be able to refer to your model or its …


Leverage Postman for Streamlined API Testing in Finance and Operations

Introduction Postman is an essential tool for developers and IT professionals, offering a robust platform for testing APIs, automating test processes, and collaborating efficiently across teams. In this blog, we're going to connect Postman to our Finance and Operations environment so we can test standard or custom APIs. This connection is a crucial step in ensuring that your APIs function as expected, and it helps streamline the integration of various business processes within your organization. Whether you're new to API testing or looking to optimize your current setup, this guide will walk you through the process with clear, actionable steps. I've already covered automating the testing in Postman in my blog here, so once the connections are in place you'll be good to go! Pre-requisites Configuration We'll start with creating an App Registration in the Azure Portal. Go to the Azure Portal (on the same tenant as your FnO environment). Search for "App Registration" and click on "New Registration". Add a name for your new app and click on "Register." Once that is completed, you'll be taken to the Overview of the app. Here, click on "Add a certificate or secret" under "Client Credentials." Add an appropriate name and select the expiration date of the secret as necessary. Once you click on Add, you'll get a confirmation message that the client credential has been created, and you'll be able to see the value of the secret. ** Be sure to copy this value and keep it secure, as once we leave this page, the value will not be available again. ** Now that everything is done on the Azure side, open your FnO environment and search for "Microsoft Entra Applications." Click on "New." Paste the "Application (Client) ID" into the "Client ID" field, then assign it a suitable name and a User ID. The permissions given to the User ID will determine the permissions for the app. For now, I've assigned the "Admin" user. That's all the configuration required on the FnO side.
Now, let's jump back into Postman. We'll start with a blank workspace and create a simple collection. The first thing I like to do is create different environments. In FnO, we have a Production, a Sandbox, and possibly multiple development environments, so different environments may be using different apps. To represent these, I like to create different environments in Postman as well. This is done by going to "Environments", clicking on "+" to create a new environment, and giving it an appropriate name. Now, in this environment, I'll add my app details as environment variables. The values for these can be found as follows. "grant_type" can be hard-coded to "client_credentials", and we can leave "Curr_Token" blank for now. So, at the end we get the list below. We can also change the variable type to "Secret" so that no one else can see these values. Now that the necessary credentials have been added to Postman, we'll set up our collection for generating the auth token. For that, we'll copy the "OAuth 2.0 token endpoint (v2)" from the "Endpoints" list on our Azure App Registration's Overview screen. In Postman, click on your Collection, then "Authorization", then select "OAuth 2.0". I'll paste the URL we copied into the "Access Token URL" field and add my variables as defined in the environment variables. If you get the error "Unresolved Variable" even after defining them, it could mean that the environment isn't set as active; go back to the Environments list and mark it as active. This way we can easily swap between environments. Once the environment is marked as active, we can see that the variable is found correctly. I'll also ensure that my Access Token URL refers to the tenant ID as per my variable by embedding my "tenant_id" variable in it.
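For reference, the token call that Postman makes on our behalf can be sketched as follows. This is an illustrative sketch only: the tenant ID, client ID, secret, and resource URL are placeholders standing in for the environment variables defined above.

```python
# Sketch of a client-credentials token request against the v2.0 endpoint
# (the "OAuth 2.0 token endpoint (v2)" copied from the App Registration).
# All identifiers below are placeholders.

from urllib.parse import urlencode

TENANT_ID = "00000000-0000-0000-0000-000000000000"  # placeholder tenant_id

def build_token_request(client_id: str, client_secret: str, resource: str):
    """Build the URL and form-encoded body for a client-credentials grant."""
    url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # For the v2 endpoint, the scope is the FnO base URL plus /.default
        "scope": f"{resource}/.default",
    })
    return url, body
```

POSTing this body to the URL returns a JSON payload whose "access_token" field is the bearer token used in the requests that follow.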
Next, I'll click on "Get New Access Token." If everything has gone well, you'll be able to generate the token successfully. After that, you can give it a name and click on "Use Token" to use it. I'll now create a simple request which we can use to test this token. Right-click on the "Collection", click on "Add Request", and give it an appropriate name. The "{{base_url}}/data" endpoint returns a list of APIs available in the system. I've set the Authentication to "Inherit Auth from parent", which means it relies on the authentication set on the "Collection", as shown on the right side of the screen. Here we see that the request was executed successfully. If for some reason you cannot use the standard Postman way of generating the token, you can create a separate request responsible for generating the auth token, store the token as a variable, and use it in your requests. From here, you can use the generated "access_token" and pass it as a "Bearer Token" to your requests. Or you can select the entire token, set it to your "Curr_Token" variable, and then pass that variable to the requests. From Postman, we can then share these collections (which contain API data) and environments (which contain credentials) separately as needed. All Data Entities follow the pattern {{base_url}}/data/{{endpoint}}, and all services follow the pattern {{base_url}}/api/services/{{service_group_name}}/{{service_name}}/{{method_name}}. If I call a "GET" request on a service, I get its details; for instance, here I'm getting the type of data I have to send in and the response I'll be getting back, in the form of object names. Moving back one step, I get the names of the operations (or methods) within this service object. Moving back another step, I get the services within this service group. Moving back once more, I can see all the service groups available in the system.
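The two URL patterns above can be captured in a small sketch. The base URL is a placeholder, and the service group, service, and method names below are hypothetical examples, not real FnO objects.

```python
# Sketch of the two endpoint patterns: data entities vs. custom services.
# BASE_URL and the service names are placeholders.

BASE_URL = "https://yourenv.operations.dynamics.com"  # placeholder base_url

def data_entity_url(endpoint: str) -> str:
    """Data entity pattern: {base_url}/data/{endpoint}"""
    return f"{BASE_URL}/data/{endpoint}"

def custom_service_url(group: str, service: str, method: str) -> str:
    """Custom service pattern:
    {base_url}/api/services/{group}/{service}/{method}"""
    return f"{BASE_URL}/api/services/{group}/{service}/{method}"

# Both are called with the bearer token in the Authorization header;
# a GET returns metadata, while a POST on a service URL runs the method.
entity = data_entity_url("TestCustomers")
service = custom_service_url("MyServiceGroup", "MyService", "getCustomer")
```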
To actually run the logic behind the services …


How to Consolidate Balances of Multiple Entities and Set Up Elimination Rules in D365 F&O

Introduction Accurate reporting and analysis require the consolidation of balances from various entities in complex financial environments. For this reason, Microsoft Dynamics 365 Finance and Operations provides strong tools that let businesses expedite their financial consolidation procedures. This blog will show you how to properly handle currency translation, set up elimination rules, and create a new entity for consolidation. By understanding these procedures, you can make sure that your consolidated financial statements give a true and accurate picture of the financial health of your company. To consolidate the balances of multiple entities, a new entity is created in which the balances of the selected entities are consolidated and eliminated as per the requirement. In Organization administration > Organizations > Legal entities, select "Use for financial consolidation process" and "Use for elimination process". Another part of the setup is to create an elimination rule; create the elimination journal using the screen below. To run the consolidation process, navigate from the main menu to Consolidation -> Consolidate Online. The consolidation window opens, where the user can select the options explained below. Go to the Legal entities tab; here the user can select the entities to consolidate and the percentage of balances to be consolidated. Go to the Elimination tab. In the proposal options, keep the option as "Proposal only". This will run the elimination of balances, but it will not post the amounts; the amounts will be posted to the ledger separately by the user. Add the elimination rule in the line. The elimination rule will eliminate balances based on two methods. Select the GL posting date as the date on which the elimination of the balances will be posted; ideally this is the last date of the fiscal period. Go to the Currency translation tab. The system will display the selected legal entities along with their base currency.
At the bottom, select the exchange rate type. The exchange rate type will automatically convert the base currency of each entity to the base currency of the consolidation entity; in the above example, the exchange rates convert INR and BRL to SGD. Note: this will work only if the exchange rates are defined first. Lastly, click on OK. The system will run the consolidation process as a batch job and will provide the results in the trial balance after a few minutes. To verify the balances, open the trial balance for the fiscal period used in the consolidation. The trial balance will display the consolidated amounts of all entities in SGD only. For updating opening balances, general journals are to be used. For currency exchange rates, a separate currency exchange rate type, Consolidation, is to be used. When doing the currency translation, a distinction should be made between monetary and non-monetary items in the balance sheet; normally, the latter should be part of Other Comprehensive Income (OCI). In the consolidation process, we can map different currency rates to different accounts through this screen. For the equity method, where only profit or loss has to be accounted for in the consolidated entity, a journal entry has to be passed. Conclusion Maintaining financial accuracy and transparency in Finance and Operations requires successfully consolidating balances and establishing elimination rules. You can handle currency translation, properly apply elimination rules, and efficiently oversee the consolidation process by following the steps outlined in this blog. This strategy strengthens the accuracy of your financial reports as well as overall financial management within your company. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Managing Task Limits per Batch Group in Microsoft Dynamics 365 for Finance and Operations

Effective task management is essential in the field of enterprise resource planning. Before Microsoft Dynamics 365 for Finance and Operations (D365FO) introduced "Priority Based Scheduling", administrators could flexibly designate particular batch servers to a batch group. This capability was not merely about enhancing capacity but also about controlling and limiting it. The Problem of Overutilization Recently, our team encountered a significant challenge. We had 98 tasks, all marked with a "normal" scheduling priority, that were able to execute simultaneously. Given that capacity for more than 98 tasks was available, all of them entered the executing state at the same time. This situation led to 100% Database Transaction Unit (DTU) utilization over a prolonged period, which is far from ideal. Such a high utilization rate can strain the system, leading to performance issues and potentially impacting other operations. In the past, this kind of issue could have been mitigated: the older batch group mechanism allowed us to limit the number of batch servers assigned to a batch group, thereby controlling the number of parallel tasks. Unfortunately, with the shift to "Priority Based Scheduling", this direct control seemed to have been lost, leading to the problems we recently faced. Discovery of Batch Concurrency Control With the release of version 10.0.38 PU63, a new feature called "Batch Concurrency Control" caught my attention. This feature reintroduces the ability to limit or throttle the number of parallel tasks in a specific batch group. Had we been aware of this feature earlier, and had the users selected the correct batch group in their request forms, we could have limited the number of parallel tasks to a manageable number, such as 10. This would have prevented the processing from adversely affecting other users and maintained overall system performance.
Activating and Utilizing the Feature After activating the “Batch Concurrency Control” feature, you will notice a new field in the batch group settings. This field is crucial for managing task concurrency effectively. Understanding the Help Text The maximum number of tasks that can run in parallel at a time for Batch Jobs in this Batch Group. This setting should be set to zero if concurrency control is not required. To completely stop all batch jobs in this Batch Group, set the value to -1. It’s important to remember that using this feature on batch jobs with more than 5000 concurrent tasks that are prepared to run could have a negative effect on batch scheduling performance. This explanation is vital. Setting the value to zero means no concurrency control, while setting it to -1 halts all batch jobs in the group. However, caution is advised against using this feature for batch jobs with more than 5000 concurrent tasks, as it could degrade the performance of batch scheduling. Implementing the Feature in Our Workflow In our operations, we now actively use this feature to manage the number of available tasks per batch group. This approach mirrors our previous strategy, where we selected only a few batch servers for a specific batch group. By doing so, we can effectively throttle the tasks and ensure a balanced load across the system. Conclusion The introduction of “Batch Concurrency Control” in Microsoft Dynamics 365 for Finance and Operations has provided us with a much-needed tool to manage and control task execution within batch groups. By setting appropriate limits, we can prevent system overloads, maintain performance, and ensure a smoother operation. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com 
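Based on the help text above, the setting's semantics can be summarized in a small sketch. The function below is our own illustration of the documented behavior (0 = no concurrency control, -1 = halt the batch group, any positive n = cap at n parallel tasks), not a D365 API.

```python
# Sketch of how the batch group's maximum-tasks setting is interpreted,
# per the help text quoted above. Function name is ours, for illustration.

def allowed_parallel_tasks(max_tasks_setting: int, ready_tasks: int) -> int:
    """How many of the ready tasks may run at once under this setting."""
    if max_tasks_setting == -1:
        return 0            # all batch jobs in this group are stopped
    if max_tasks_setting == 0:
        return ready_tasks  # concurrency control not required: no throttle
    return min(max_tasks_setting, ready_tasks)

# With the cap of 10 suggested earlier, only 10 of the 98 tasks would run:
runnable_now = allowed_parallel_tasks(10, 98)
```

This mirrors the scenario from the post: a cap of 10 would have kept 88 of the 98 tasks waiting instead of saturating the database.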


Security Roles in D365 Finance and Operations

Introduction: Ensuring user security is a crucial aspect of Dynamics 365 for Finance and Operations. To access or utilize the features of Dynamics 365 F&O, a user must have a role assigned to them. Without a role, the user will be unable to perform any actions within the system. Access levels and business processes for a particular role are determined by the duties and privileges associated with that role. In this blog post, we will explore two recently introduced features that simplify the process of understanding and setting up customized security roles within Dynamics 365: the security diagnostic and security configuration tools. Defining terms used in security: Security Roles: – Security roles in Dynamics 365 define how users can access different modules. – The system comes with pre-defined security roles that can be assigned to users. A user can possess multiple security roles. – Data security policies can be applied only by the administrator to limit user access to data. – To gain access to Finance and Operations, users must be assigned to at least one security role. – Security roles correspond to company responsibilities and contain a set of duties required to carry out functions. Duties: – Duties correspond to the tasks of a role and are part of a business process. – They are composed of the privileges necessary for performing an action. – Duties can be assigned to multiple security roles and help reduce fraud and detect errors. – Segregating duties is important for regulatory compliance such as SOX and IFRS. – Default duties are provided, and administrators can modify or create new duties. Privileges: – Privileges are unit action sets that correspond to system functions. – They specify the level of access required to perform a job or complete an assignment. – Privileges refer to specific permissions granted to application objects such as UI elements and tables.
– Default privileges are provided, and administrators can modify or create new privileges. Permissions: – Permissions are required for accessing functions in Dynamics 365. – Access levels are grouped for permissions to tables, fields, forms, or server-side methods. – Permissions include any tables, fields, forms, or server-side methods accessed through the entry point. Security Configuration Tool: The Security Configuration Tool is useful for administrators, as it enables them to create and manage security roles, duties, and privileges. It offers various benefits to users, including: Display Entry Point Permissions: The tool enables administrators to display entry point permissions for a given role, duty, or privilege. Test Security Role: This feature allows users to test a newly created or modified security role, duty, or privilege without having to create a separate test user account. Non-Permanent Changes: Changes made in the Security Configuration Tool are not permanent and must be published to take effect. Data Export/Import: Changes can be saved as a data export file that can be imported into the desired environments. Full Hierarchy View: Users can access the tool by going to System Administration > Security > Security Configuration and get a full hierarchy view of roles, duties, privileges, and entry point security assignments. Duplicate Existing Roles: Users can duplicate existing roles, duties, and privileges. Various Options: The tool offers several options for the currently selected role/duty/privilege, including undoing/redoing customizations, creating new roles, showing all levels, deleting roles, duplicating roles, copying roles, viewing permissions, and displaying the audit trail.
To add a new role in Dynamics 365 with the Security Configuration Tool, follow these easy steps: Step 1: Select the 'Roles' tab and click 'Create new' to create a new role in Dynamics 365. Step 2: Enter the name of the new role using a distinct naming convention so that it is easily identifiable. Step 3: To add a new duty to a role, highlight the role, go to the Duties column, and select Add references. All duties (and custom ones, if created) will be available in the list. Step 4: You can select certain duties, and their corresponding privileges, to be available in a role. If needed, you can remove certain duties from the role. Step 5: To modify object permissions, go to the Privileges section. Dynamics 365 has different access levels, such as Read, Update, Create, and Delete, that determine a user's level of access to a particular record or record type. Security roles have three types of access levels: Unset, Grant, and Deny. Step 6: Any modifications made in the user interface must be published before they take effect; a list shows all the changes that are not yet published. Security Diagnostic Tool: The Security Diagnostic Tool is a feature of Dynamics 365 that empowers individuals with a security administrator or system administrator role to analyze any form and identify the roles, duties, and privileges required to accomplish a task. The tool provides numerous advantages to its users. To use the Security Diagnostic Tool, go to the Options tab, select Page Options, and then click on Security Diagnostics; this will automatically run the tool for you. Remember that the Security Diagnostic Tool is available on any form. After the tool runs, it will generate a comprehensive list of all the roles, duties, and privileges related to that particular form.
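The three access levels mentioned in Step 5 (Unset, Grant, Deny) combine across everything assigned to a user. The sketch below models the usual precedence, where an explicit Deny overrides Grant and Unset contributes nothing; this is an illustration of the concept under that assumption, not the product's exact resolution engine.

```python
# Sketch of combining Unset/Grant/Deny access levels for one object,
# assuming Deny takes precedence over Grant. Names are ours.

from enum import Enum

class Access(Enum):
    UNSET = 0
    GRANT = 1
    DENY = 2

def effective_access(levels: list) -> Access:
    """Combine the access levels a user holds for a single object."""
    if Access.DENY in levels:
        return Access.DENY   # an explicit Deny always wins
    if Access.GRANT in levels:
        return Access.GRANT  # otherwise any Grant applies
    return Access.UNSET      # nothing set: no access contributed
```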
This enables administrators to quickly identify any gaps in security and make the adjustments needed to protect the system and its data. Conclusion: Dynamics 365 Finance offers a reliable, role-based security system that ensures users have access only to the data necessary to carry out their tasks. Security roles, duties, privileges, and permissions work together to form a comprehensive and effective security model, and the Security Configuration and Security Diagnostic tools make it simpler to understand and customize security roles in Dynamics 365. We hope you found this article useful.
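Role and duty assignments managed through the tools above can also be inspected programmatically through the environment's OData endpoint. The sketch below is illustrative only — the entity name SecurityUserRoleAssociations and the UserId field are assumptions, so check the names exposed by your environment's /data/$metadata before relying on them:

```python
# Illustrative sketch: build the OData query URL that lists the security
# roles assigned to one user. The entity name "SecurityUserRoleAssociations"
# and the field "UserId" are assumptions -- verify them in /data/$metadata.

def security_role_query(base_url: str, user_id: str) -> str:
    """Return the OData URL that filters role associations by user."""
    entity = "SecurityUserRoleAssociations"  # hypothetical public collection name
    return f"{base_url}/data/{entity}?$filter=UserId eq '{user_id}'"

# The URL an authenticated GET request would be sent to:
url = security_role_query("https://contoso.operations.dynamics.com", "Admin")
```

A GET on this URL with a bearer token returns the assignments as JSON, which is handy for auditing roles across environments.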


Sales Return Process in Dynamics 365 Finance and Operations Part 1

In the world of retail and commerce, managing sales returns efficiently is critical to customer satisfaction and operational excellence. In this blog, let's explore how the sales return process works in Dynamics 365 Finance and Operations (D365FO) and how businesses can leverage the capabilities of this robust ERP system to streamline and enhance their return management. In this part, I will walk you through the standard process; please keep in mind that the steps and setups may vary based on business requirements.
Let's consider a scenario in which we sold 10 units to a customer and 3 of them were damaged in transit, so the customer now wants to return those items to us.
First, create a Return Order: go to Sales and Marketing > Sales Return > All Return Orders and create a new return order. The RMA number is generated automatically based on the number sequence setup. Tip: to enter the RMA number manually, enable the Manual parameter on the RMA number sequence; I have enabled it for this scenario.
Next, enter the customer details, site, warehouse, return reason code, and RMA number, then click OK.
There are two ways to add the line item on the return order lines; for now, I will use the Find Sales Order function. On the Return Order fast tab, under the Return tab, click the Find Sales Order button, then select the sales order invoice for which the return order should be created. Based on my scenario, the system has automatically set the quantity to 3, and in the screenshot above you can see that a new return order is created with a negative line quantity.
Return order processing: there are two ways to process a sales order return:
1. Credit Only: in the credit-only process, the customer's account is credited without the need to replace or return the item.
Here, the sales price, less any charges, is credited to the customer.
2. Physical Return: the physical return process involves the customer physically returning the item. During registration, a disposition code is assigned that determines the sales return process for that particular item.
To keep this simple and easy to understand, I will go ahead with the credit-only process in this part. For this we first need to create a credit-only disposition code: go to Sales and Marketing > Set Up > Returns > Disposition Codes, click New, and create a new credit-only disposition code.
The next step is to register the return order: on the return order lines, click Update Line and then Registration, select the credit-only disposition code, and click OK. Then confirm the registration. When I click the Confirm Registration button, a new sales return order is created with the same quantity and the same customer on the All Sales Orders page.
The final step is to invoice the sales return order. After invoicing, you can see in the screenshot below that the return status of the order has changed to Invoiced. This completes the credit-only sales return order process.
Maintaining customer satisfaction and operational efficiency involves effectively managing sales returns, and Dynamics 365 Finance and Operations (D365FO) simplifies this task, whether you opt for a credit-only strategy or handle physical returns. By leveraging D365FO's powerful features, businesses can ensure precise and efficient return management, enhancing both customer relations and operational excellence. Stay tuned for the next part, where we'll dive into the physical return process. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
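As a companion to the UI steps above, a return order could also be created through the OData API covered in the integration post on this page. This is only a sketch: the entity name ReturnOrderHeaders and every field name below are assumptions for illustration — confirm the actual public names in your environment's /data/$metadata before using them.

```python
import json

# Sketch: JSON body for POSTing a return order header to the OData API.
# The entity "ReturnOrderHeaders" and all field names are assumptions for
# illustration -- check /data/$metadata for your environment's names.

def build_return_order_body(customer: str, site: str, warehouse: str,
                            reason_code: str, rma_number: str) -> str:
    body = {
        "dataAreaId": "usmf",            # target legal entity (placeholder)
        "CustomerAccount": customer,
        "Site": site,
        "Warehouse": warehouse,
        "ReturnReasonCode": reason_code,
        "RMANumber": rma_number,         # manual RMA number, as in the tip above
    }
    return json.dumps(body)

payload = build_return_order_body("US-001", "1", "13", "Damaged", "RMA-000123")
```

A POST of this body (with a bearer token) to <base url>/data/ReturnOrderHeaders would mirror the create step shown in the UI.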


What’s the right platform for your company: D365 Business Central or D365 Finance & Supply Chain?

Introduction: As a business owner, you might have come across a situation where you want to upgrade your current systems to a renowned ERP solution available on the market. One of the ERPs you would consider is Microsoft Dynamics. However, even this choice requires a lot of brainstorming from all decision-makers, because Microsoft Dynamics comes with two ERP platforms: Dynamics 365 Business Central and Dynamics 365 Finance & Supply Chain. I assume this brainstorming itself has probably landed you on this article. Let's dive into the key differences and use cases for each platform; hopefully, by the end of the article, you will be able to choose the right platform for your business.
Below are the key factors differentiating Dynamics 365 Business Central and Dynamics 365 Finance & Supply Chain:
Company Size: When determining company size, the usual factors considered are revenue and employee count, the definitions of which can change based on the country you are located in. Here, for reference, we will consider the following:
– Revenue: a. SME: revenue between 0 and 1 billion USD — Business Central is ideal for companies of this size. b. Large companies: revenue above 1 billion USD — Finance & Supply Chain is the ideal platform for large organizations.
– Employees: a. SME: between 0 and 500 employees. b. Large companies: more than 500 employees.
Number of Entities: Does your company have multiple legal entities in multiple geographical locations across the world? a. Business Central is ideal for companies with a single legal entity, or multiple legal entities in the same country. Business Central allows you to create and manage products and accounts individually for each legal entity; however, they cannot be managed centrally. b. Finance & Supply Chain is ideal for companies with multiple legal entities across the world.
Finance & Supply Chain allows you to manage products and accounts centrally and release them centrally to each legal entity across the world.
Business Operations: Does your company have streamlined and simple operations? a. Business Central can handle operations for companies with streamlined, simplified processes that do not require very detailed data capture or sophisticated reporting. b. Finance & Supply Chain captures detailed data, covers many more processes than Business Central, and can therefore provide robust, detailed reporting.
Future Growth: It is also important to consider your company's growth plans. If you currently have 2–3 legal entities, you may be tempted to go with Business Central, as it comes with lower implementation and operating costs, ease of use, and faster implementation timelines. However, ERP projects are not undertaken frequently, so it is important to consider future organizational plans. If you intend to expand into multiple different geographical areas over the next three to five years, you should think about Finance & Supply Chain as your organization's go-to platform, since this will be a big, long-term investment.
You might also want to consider the following factors while making the decision:
– Licensing: a. Since Finance & Supply Chain targets large companies, it comes with a minimum requirement of 20 licenses. b. Business Central is perfect for small and medium-sized businesses (SMEs) because it has no minimum licensing requirement — a single license is enough.
– Implementation timelines: a. Finance & Supply Chain has a typical implementation timeline of 6 months or more, considering the size of the implementation and global rollout. b. Business Central can be up and running in 3–6 months.
Conclusion: Choose Business Central if: you're an SME seeking an easy-to-use, all-in-one solution with lower upfront and operating costs.
Choose D365 Finance & Operations if: You’re a large enterprise requiring extensive functionalities, deep customization, and global capabilities.  We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Azure Integration with Dynamics 365 Finance & Operations

Introduction: Businesses in the digital age depend on seamless integration between cloud platforms and ERP systems, and the integration of Azure with Dynamics 365 Finance & Operations (F&O) is one such potent combination. It enables numerous advantages, such as improved scalability, agility, and data-driven decision-making. This blog provides step-by-step instructions for connecting Azure with Dynamics 365 F&O.
Steps to achieve the goal:
Step 1: Set up Azure services. a. Create an Azure account: sign up for an Azure account if you don't already have one. b. Provision Azure resources: set up the required Azure resources, such as virtual machines, databases, storage accounts, and other services, according to your needs. Below are a few links for creating an Azure account:
https://learn.microsoft.com/en-us/answers/questions/433827/how-to-get-an-azure-account-without-credit-card
https://azure.microsoft.com/en-in/free/students
Step 2: Configure Azure Active Directory (Microsoft Entra ID). a. On the App Registration page, click New and set the name and supported account type as shown in the screenshots below. b. Once you click OK, you will see a confirmation notification like the one below. c. Go to API Permissions and click Add permission. d. Select Dynamics ERP. e. Select Delegated Permissions. f. Select all permissions and click Add Permissions. g. Repeat Add permission, this time selecting Application Permissions. h. Now generate a client secret: select Certificates and secrets. i. On the screen below you can generate a new client secret. j. When you click New, you can set the date until which the secret is valid; the maximum validity is 2 years. k. Copy the secret's Value as soon as it is shown. l. Also copy the Directory (tenant) ID and Application (client) ID.
Step 3: Connect Azure services to F&O. a.
Go to Finance and Operations and search globally for Azure Active Directory applications (Microsoft Entra ID applications). b. Click New, add your client ID here, and set the User ID to Admin. Please note that you must have admin access rights; otherwise this won't work.
Conclusion: Azure integration with Dynamics 365 Finance & Operations empowers businesses to streamline processes, unlock data insights, and achieve operational excellence. The next blog will cover how to call the standard APIs from Postman and perform GET and POST operations. Stay tuned! We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
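The Directory (tenant) ID, Application (client) ID, and secret Value gathered in Step 2 are what an external client uses to obtain a bearer token for F&O's APIs. Below is a minimal Python sketch of the OAuth client-credentials token request against the v1 endpoint; the tenant, client, and F&O URL values are placeholders you replace with your own.

```python
import json
from urllib import parse, request

AUTHORITY = "https://login.microsoftonline.com"

def token_request(tenant_id: str, client_id: str, client_secret: str,
                  fo_url: str):
    """Build the URL and form-encoded body for the client-credentials grant."""
    url = f"{AUTHORITY}/{tenant_id}/oauth2/token"
    body = parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,          # Application (client) ID from step 2.l
        "client_secret": client_secret,  # secret Value from step 2.k
        "resource": fo_url,              # the F&O base URL registered in step 3
    })
    return url, body

def get_token(tenant_id, client_id, client_secret, fo_url) -> str:
    """Perform the actual HTTP call (not executed in this sketch)."""
    url, body = token_request(tenant_id, client_id, client_secret, fo_url)
    with request.urlopen(request.Request(url, data=body.encode())) as resp:
        return json.load(resp)["access_token"]

url, body = token_request("<tenant-id>", "<client-id>", "<secret>",
                          "https://contoso.operations.dynamics.com")
```

The returned access_token then goes into the Authorization: Bearer header of every /data request.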

