Automating Opportunity Timeline Updates for Owners and Sales Teams in Dynamics 365 Using Power Automate
What is an Opportunity in D365 CRM?
The opportunity table represents a potential sale to new or established customers. It helps you forecast future business demand and sales revenue. You can create a new opportunity in Dynamics 365 for Customer Engagement to monitor an inquiry from an existing customer or to convert a qualified lead into an opportunity. Salespeople frequently use opportunities to track the sales engagements they are currently working on. For more details, please follow the link https://learn.microsoft.com/en-us/dynamics365/sales/developer/opportunity-entities

What are Notes in the Timeline Section of D365 CRM?
The timeline makes the entire history of activities visible to app users. The timeline control captures activities such as notes, appointments, emails, phone calls, and tasks, ensuring that all interactions with the related table are tracked and visible over time. Use the timeline to quickly catch up on the latest activity details. For more details, please follow the link https://learn.microsoft.com/en-us/power-apps/maker/model-driven-apps/set-up-timeline-control

Use Case:
Using Power Automate, whenever a note is added to an Opportunity's timeline, an automated email is sent to the Opportunity Owner and the Sales Team associated with that Opportunity.

Steps:
– Log in to make.powerautomate.com with your CRM credentials and you will land on this page.
– Once you have landed on the Power Automate page, click on Create and select Automated Cloud Flow.
– Set your trigger as shown below, since the flow should run only when notes are added in the timeline of an Opportunity.
– I have also set a trigger condition on this flow. In other words, it checks whether there is a non-empty value in _objectid_value and that this value's type is 'opportunities'. The expression returns true if both requirements are satisfied and false otherwise.
– The expression is @and(not(empty(triggerOutputs()?['body/_objectid_value'])), equals(triggerOutputs()?['body/_objectid_type'], 'opportunities'))
– Initialize a variable for the email addresses.
– Initialize a variable for the notes table.
– Retrieve the Owner's email address. This step is necessary to obtain the Opportunity Owner's email address so that we can send the initial notification email to them.
– List all notes in the Opportunity. We use the List rows action to retrieve the Note records associated with the Opportunity. This gives us access to all the notes within the Opportunity's timeline.
– Get the Opportunity by ID. Here we fetch the complete details of the Opportunity to access all the fields and data associated with the specific record, filtering on the name and the Opportunity ID. The row ID is typically obtained from another step in your flow, such as a List rows action or a trigger that provides record details; in this case, the record details come from the Opportunity.
– After retrieving the record, we need to fetch the details of the associated Sales Team. This ensures that whenever a record is linked to the Sales Team, all members of the Sales Team receive an email notification. We therefore connect the Sales Team to the Opportunity to include them in the notification process. The FetchXML needs to be taken from Advanced Find in CRM.
– In this step, we need to store the email address of the sender (i.e., the "From" user). We initialized this variable in step 3, so here we save the GUID of the sender into that variable.
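For readers who want to see what the two retrieval steps above do behind the scenes, here is a minimal sketch of the equivalent calls made directly against the Dataverse Web API, outside of Power Automate. The environment URL, token, and opportunity GUID are placeholders, not values from this flow.

```python
import requests

# Minimal sketch (not the flow itself): the same "List rows" / "Get a row by ID" lookups
# made directly against the Dataverse Web API. All values below are placeholders.
ENV_URL = "https://yourorg.crm.dynamics.com"            # hypothetical environment URL
TOKEN = "<bearer-token-from-azure-ad>"                  # acquired separately via OAuth
OPPORTUNITY_ID = "00000000-0000-0000-0000-000000000000" # placeholder GUID

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}

# "List rows": every note (annotation) attached to the Opportunity's timeline.
notes = requests.get(
    f"{ENV_URL}/api/data/v9.2/annotations",
    headers=headers,
    params={
        "$select": "subject,notetext,createdon",
        "$filter": f"_objectid_value eq {OPPORTUNITY_ID}",
        "$orderby": "createdon desc",
    },
).json().get("value", [])

# "Get row by ID": the Opportunity itself, with the owning user's email address expanded.
opportunity = requests.get(
    f"{ENV_URL}/api/data/v9.2/opportunities({OPPORTUNITY_ID})",
    headers=headers,
    params={"$select": "name", "$expand": "owninguser($select=fullname,internalemailaddress)"},
).json()

owner_email = opportunity["owninguser"]["internalemailaddress"]
print(f"{len(notes)} note(s) found; owner email: {owner_email}")
```

In the flow itself, the Dataverse connector performs these calls for you; the sketch is only meant to show which tables and columns are involved.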
– In this step, we need to save the email address of the recipient (i.e., the "To" user). We initialized this variable in step 3, so here we store the GUID of the recipient into that variable. A participationtypemask of 2 designates the "To" recipient on the email's activity party; here these are the Sales Team members associated with the Opportunity.
– Next, we need to ensure that the content is structured within a table. As specified in step 6, I created a variable called `NotesTable` to hold this data. We will use this variable to format the content into an HTML table for the email.
– In this step, we configure the URL link for the Opportunity. Include the base URL of the environment and append the unique identifiers for both the Opportunity and the Topic field (which is a field within the Opportunity).
– Send an Outlook email to the Opportunity Owner and the Sales Team associated with the Opportunity. This Outlook email action runs only if the 'Email Addresses Sales Team' step is skipped.
– In Power Automate, adding a new row typically involves using actions provided by connectors such as Dataverse, SQL Server, SharePoint, or others, depending on where your data is stored. Here we create the email body in this action.
– In this step, we use a bound action to send the email within the CRM system.

Output:
– Once I click on Add Note and wait 5-10 seconds, the email appears within the timeline. Please note that the email is tracked within CRM itself.
– Below is the Opportunity, which contains the Opportunity Owner and the Sales Team associated with that Opportunity. The Owner is CF Admin, and the Sales Team members are Amit Prajapati, Ethan Rebello, and Mithun Varghese.
– The Opportunity Owner and Sales Team will receive notifications about the notes in the timeline. All email interactions are tracked in the Opportunity's timeline, where you can also view all previous notes associated with it.

Conclusion:
Automating Opportunity timeline notifications using Power Automate in Dynamics 365 CRM offers an efficient way to streamline your workflows and reduce manual errors. By setting up automated email notifications for updates in the Opportunity's timeline, your sales team and opportunity owners stay informed, ensuring smoother communication and faster response times. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Understanding Additional Reporting Currency in Microsoft Dynamics 365 Business Central
Introduction
The Additional Reporting Currency feature in Business Central allows a company to maintain its financial records in a secondary currency, in addition to the primary currency. This secondary currency is used for reporting and analytical purposes, providing a clearer picture of the company's financial health in the context of different economic environments. Additional Reporting Currency is typically used for regulatory compliance, simplified financial reporting, and enhanced decision making.

Steps to achieve the goal:
1. Defining the Additional Reporting Currency
Navigate to the "General Ledger Setup" page and specify the additional reporting currency. This can be any currency other than the primary currency of your company's base country/region. Here I am going to set SGD as my Additional Reporting Currency. Before I set SGD as my Additional Reporting Currency, I have to make sure the exchange rates are assigned properly.
2. Specifying Exchange Rates
Define the exchange rates between the primary currency and the additional reporting currency. This can be done through the "Currency Exchange Rates" page. It is crucial to update these rates regularly to reflect current market conditions.
3. Specify the residual gains and losses account on your Currency
This field is not visible on the page by default. You can personalize the page to make it visible: click the Settings icon -> Personalize -> Field, then select the field and drag it onto your screen. Set the G/L account to update its additional-currency value for future transactions. Globally search for Chart of Accounts and open the G/L account whose additional-currency value you want updated whenever you adjust the transactions.
No Adjustment: The default selection. No adjustments are made to the account.
Adjust Amount: Any gain or loss in exchange rates is reflected in the LCY amount field.
Adjust Additional-Currency Amount: Any gains or losses in exchange rates are taken into account when adjusting the additional-currency amount field.
Please note: you cannot set the VAT Purchase or VAT Sales accounts, or the G/L accounts tagged on the Currency page (realized gains and losses, unrealized gains and losses, residual gains and losses), as accounts for the additional reporting currency, as this can throw an error when you perform a revaluation in the system.
4. Final Setup
Go to General Ledger Setup, set the Additional Reporting Currency to SGD, set the Retained Earnings Account, set the Document No. as per the screenshot below, and click OK. This batch job is used to convert LCY transactions to the additional currency. The exchange rate in effect on the work date is used in the job. The entry that is posted to the retained earnings account should be indicated in the Document No. field. On the last day of every closed year, this rounding entry is made to ensure that all income and expense accounts have a zero balance. This is the same account used when running the Close Income Statement batch job.
You will see the message below once the transaction is calculated in the system. Click OK. You can change the Additional Reporting Currency again in the future once it is set. Please note that any analysis created for the previous additional currency must be deleted.

Before and after setting up this configuration
– Before the Additional Currency setup: the Chart of Accounts columns Additional Currency Net Change and Additional Currency Balance to Date are blank, with no values.
– After the Additional Currency setup: the Chart of Accounts columns Additional Currency Net Change and Additional Currency Balance to Date now show values.
Please note: Microsoft has issued a warning on Additional Reporting Currency.

Conclusion
The Additional Reporting Currency feature in Microsoft Dynamics 365 Business Central offers a robust solution for maintaining financial transparency and compliance. By setting up and leveraging this functionality, businesses can streamline their financial reporting processes, enhance decision-making, and ultimately achieve greater financial clarity and control. Whether you are a small business expanding into new markets or a large enterprise with operations in multiple countries, the Additional Reporting Currency feature in Business Central can provide the tools you need to succeed in a complex financial landscape. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
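As a footnote to the exchange-rate step above, here is a small illustrative calculation (plain Python, not Business Central code) of how an additional-currency amount can be derived from an LCY amount. It assumes the SGD rate is entered on the Currency Exchange Rates page as an Exchange Rate Amount in SGD against a Relational Exch. Rate Amount in LCY; the figures are made up for illustration.

```python
# Illustrative only -- not Business Central source code.
# Assumed rate entry: 1.00 SGD (Exchange Rate Amount) = 0.74 LCY (Relational Exch. Rate Amount).
EXCHANGE_RATE_AMOUNT = 1.00          # amount expressed in SGD (the additional currency)
RELATIONAL_EXCH_RATE_AMOUNT = 0.74   # equivalent amount expressed in LCY

def to_additional_currency(amount_lcy: float) -> float:
    """Convert an LCY amount to the additional reporting currency (SGD in this example)."""
    return round(amount_lcy * EXCHANGE_RATE_AMOUNT / RELATIONAL_EXCH_RATE_AMOUNT, 2)

# A 1,000.00 LCY posting would show roughly 1,351.35 in the additional-currency columns.
print(to_additional_currency(1000.00))
```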
How to bulk resolve cases using Power Automate?
Introduction
When dealing with a large volume of cases, manually handling each one can be time-consuming and error-prone. Thankfully, Microsoft Power Automate provides a powerful solution for automating bulk case resolution, streamlining your workflow, and saving valuable time.

Why Automate Bulk Case Resolution?
Bulk case resolution involves addressing multiple cases at once, which can be necessary for various reasons, such as resolving customer complaints, updating statuses, or closing resolved cases. Automating this process can improve efficiency, reduce manual errors, and free up time for more strategic tasks.

Getting Started with Power Automate Flows
Go to Power Automate and sign in with your credentials. Start a new flow: click on 'Create' from the left-hand menu and select 'Automated flow' for a trigger-based flow or 'Instant flow' for a manual trigger. Add an appropriate flow name and select the trigger. Once the trigger has been added to the flow, click on '+ New Step' to add an action to process the cases. We have an Excel sheet that contains the records of the cases to be resolved, so we add a 'List rows present in a table' action. Add an 'Apply to each' step, which iterates through the list of cases in the Excel sheet and retrieves each case using 'Get a row by ID'. Finally, add another step, 'Add a new row', to create a Case Resolution record and pass in the Case GUID, which resolves the case.

Conclusion
Automating bulk case resolution with Microsoft Power Automate can significantly improve your team's efficiency, reduce manual errors, and free up valuable time for more strategic tasks. By setting up flows to handle multiple cases at once, you can streamline your workflow and ensure that cases are resolved quickly and accurately. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
How to Adjust Exchange Rates in Microsoft Business Central: New Preview Posting Feature Explained
Introduction
For companies operating in diverse countries or regions, managing business transactions and financial reporting in multiple currencies is essential. Because exchange rates fluctuate frequently, these rates need to be updated regularly in Business Central. With a feature Microsoft recently released, you can now see how an exchange rate adjustment will affect your records before finalizing it. Just use the "Preview Posting" option on the Exch. Rates Adjustment report (Report 596). You can choose to see either detailed or summarized results and decide how dimensions are managed for gains and losses.

Steps to achieve the goal:
– Enable the New Feature:
– Access Exchange Rate Adjustment:
– Choose Dimension Settings:
– Preview Posting View:
Note: Due to local regulations, it is not recommended to enable the "Enable use of new extensible exchange rate adjustment, including posting review" feature in the Swiss (CH) version.

Conclusion
By following the steps outlined in this blog, you can effectively utilize this feature to maintain accurate records and enhance your organization's financial management capabilities. Whether you're adjusting for a specific period or managing multiple dimensions, this feature streamlines the process and helps you stay compliant with local regulations. Implement these practices to ensure your business remains responsive to currency fluctuations. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Displaying Associated Records on the Main Grid Form Using Power Apps Grid Control
Introduction
Microsoft Power Apps is a pivotal tool for creating custom apps tailored to specific business needs. A powerful feature of Power Apps is the ability to display associated records on the main grid form using grid control. This functionality provides users with a comprehensive view of related data, enhancing user experience and productivity. In this blog, we'll explore how to effectively display associated records on the main grid form using the Power Apps grid control.

Steps to Display Associated Records
Open Power Apps Studio. From the Power Apps homepage, go to the Tables section and select your desired table; in this case, it is the 'Task' entity. Go to the Views section of the 'Task' entity and select the desired view, such as 'All Tasks'. Click on 'Components' from the navigation bar, then click on '+ Add a component' followed by 'Get more components'. This opens a library of available components that you can add to your table. Choose 'Power Apps grid control' and click 'Add' below. This control allows you to customize how data is displayed within the grid, including the ability to show related records. It will appear in the control list as shown; select it. In the Power Apps grid control settings, select the related entity for which you want to display the associated records. For example, if you want to display users assigned to tasks, choose 'Assigned Users'. Click 'Done'. After configuring the grid control, save your changes to the app. Make sure to test the configuration to verify that the associated records are displayed as expected. Once everything looks good, publish the app to make the changes live for all users. Voila, you now see the Assigned Users' associated records on the view.

Conclusion
By effectively utilizing the Microsoft Power Apps grid control, you can significantly enhance the user experience by providing a comprehensive view of associated records directly on the main grid form. This step-by-step guide equips you with the knowledge to configure and display related data, streamlining your app's functionality and improving productivity. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Set up supplementary item(s) on a purchase order
Introduction
In today's competitive market, it is very important to have an edge over competitors for sustainable business and growth. There are various ways to gain market share, such as affordable rates, specialized items, etc. One way is to offer essential supplementary items that are required for the optimal use and functionality of the primary products. Hence, many suppliers offer supplementary items either free or at a minimal price, or offer a certain quantity of the same item on purchase of a bulk quantity, e.g., a luggage bag with a luggage bag cover as a supplementary item, or buy 4 soaps and get 1 free.

Problem Statement
In this scenario we need to define supplementary items and the conditions for adding them to a purchase order, i.e., which supplementary item or items apply for which vendor, for what quantity of the main item, at what rate, and for what time period.

Solution steps
Follow the steps below to create and add supplementary item(s) to a purchase order. Create the main item. Create the supplementary item. Then go to the main item -> Purchase tab -> Supplementary purchase items. A new window will open; click on New. Then add the supplementary item details based on the different scenarios.

Scenario 1 – When we buy 1 "Luggage bag" from Vendor 1002, we get 1 "Luggage cover" free. Define the supplementary item accordingly.
Add vendor details – Select "Account code" as Table if only a specific vendor will provide the supplementary item, and select "Account code" as All if all vendors will provide the supplementary item.
Quantity limit – Select the quantity of the main item, i.e., Luggage bag.
Supplementary item – Select the code of the supplementary item, i.e., Luggage bag cover.
Supplementary quantity – Select the quantity of the supplementary item.
Multiple quantity – Select the incremental quantity. In this case it is 1, meaning if we buy 1 bag we get 1 cover free, and if we buy 2 bags we get 2 covers free.
Date range – If the supplementary item is granted only for a specific period, define the from date and to date.
Free of charge – If this toggle is Yes, the supplementary item will be added to the purchase order without a price. If it is No, the supplementary item will be added with a price. Then save.
Now create a purchase order for vendor 1002 and add the item Luggage bag – go to Procurement and sourcing -> All purchase orders. To add the supplementary item, click on Purchase order line -> Supplementary items. A new window will open; click OK. The supplementary item will then be added to the purchase order. Follow the regular procedure to further process the purchase order.

Scenario 2 – When we buy 5 "Luggage bags" from Vendor 1001, we get 1 "Luggage bag" free. Define the supplementary item accordingly.
Add vendor details – Select "Account code" as Table if only a specific vendor will provide the supplementary item, and select "Account code" as All if all vendors will provide the supplementary item.
Quantity limit – Select the quantity of the main item, i.e., Luggage bag.
Supplementary item – Select the code of the supplementary item, i.e., Luggage bag.
Supplementary quantity – Select the quantity of the supplementary item.
Multiple quantity – Select the incremental quantity. In this case it is 5, meaning if we buy 5 bags we get 1 bag free, and if we buy 10 bags we get 2 bags free (see the small worked example below).
Date range – If the supplementary item is granted only for a specific period, define the from date and to date.
Free of charge – If this toggle is Yes, the supplementary item will be added to the purchase order without a price. If it is No, the supplementary item will be added with a price. Then save.
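Before moving on to the purchase order, here is a small illustrative sketch (plain Python, not D365 F&O logic) of the multiple-quantity arithmetic described above; the function name and values are purely hypothetical.

```python
def supplementary_qty(ordered_qty: int, multiple_qty: int, supp_qty_per_multiple: int) -> int:
    """Free supplementary quantity granted for a purchase order line."""
    if multiple_qty <= 0:
        return 0
    # One "supplementary quantity" is granted for every full multiple of the main item.
    return (ordered_qty // multiple_qty) * supp_qty_per_multiple

# Scenario 1: buy 1 luggage bag, get 1 luggage cover free.
print(supplementary_qty(1, 1, 1))    # -> 1
# Scenario 2: buy 5 luggage bags, get 1 bag free; ordering 10 bags therefore adds 2 free bags.
print(supplementary_qty(10, 5, 1))   # -> 2
```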
Create a PO and add 10 quantity of Luggage bag. Now, to add the supplementary item, click on Purchase order line -> Supplementary items. For 10 luggage bags, 2 bags will be added as supplementary, since we set up "buy 5 bags, get 1 bag free". Follow the regular procedure to further process the purchase order.

Conclusion
In the way described above, we can set up different supplementary item(s) to be used on purchase orders. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
How to implement Azure Blob Lifecycle Management Policy
Introduction
Azure Blob Storage Lifecycle Management allows you to manage and optimize the storage lifecycle of your data. You can define policies that automate the transition of blobs to different access tiers or delete them after a specified period. This can help reduce costs and manage data efficiently. This blog shows how to set up and manage lifecycle policies.

Steps to Create a Lifecycle Management Policy
Access the Azure Portal: Sign in to your Azure account and navigate to the Azure Portal.
Navigate to Your Storage Account:
– Go to "Storage accounts".
– Select the storage account where you want to apply the lifecycle policy.
Configure Lifecycle Management:
– In the storage account menu, under the "Blob service" section, select "Lifecycle management".
Add a Rule:
– Click on "+ Add rule" to create a new lifecycle management rule.
– Provide a name for the rule.
Define Filters: You can specify filters to apply the rule to a subset of blobs. Filters can be based on:
– Blob prefix (to apply the rule to blobs with a specific prefix).
– Blob types (block blobs, append blobs, page blobs).
Set Actions:
– Define the actions for the rule, such as moving blobs to a different storage tier (Hot, Cool, Archive) or deleting them after a certain number of days.
– You can specify the number of days after the blob's last modification date or its creation date to trigger the action.
Review and Save:
– Review the policy settings.
– Save the policy.

Key Points to Remember
– Access Tiers: Azure Blob Storage has different access tiers (Hot, Cool, Archive), and lifecycle policies help optimize costs by moving data to the appropriate tier based on its access patterns.
– JSON Configuration: Policies can be defined using JSON, which provides flexibility and allows for complex rules.
– Automation: Lifecycle management helps automate data management, reducing manual intervention and operational costs.

Conclusion
By setting up these policies, you can ensure that your data is stored cost-effectively while meeting your access and retention requirements. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
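As an addendum to the "JSON Configuration" point above, here is a hedged sketch of what such a policy definition can look like. The rule name, prefix, and day thresholds are assumptions chosen for illustration; the structure mirrors the lifecycle-management JSON schema that the portal exposes in the rule's code view.

```python
import json

# Assumed example policy: move to Cool after 30 days, Archive after 90, delete after 365,
# applied only to block blobs under the hypothetical "logs/" prefix.
lifecycle_policy = {
    "rules": [
        {
            "name": "archive-then-delete-logs",   # hypothetical rule name
            "enabled": True,
            "type": "Lifecycle",
            "definition": {
                "filters": {
                    "blobTypes": ["blockBlob"],
                    "prefixMatch": ["logs/"],
                },
                "actions": {
                    "baseBlob": {
                        "tierToCool": {"daysAfterModificationGreaterThan": 30},
                        "tierToArchive": {"daysAfterModificationGreaterThan": 90},
                        "delete": {"daysAfterModificationGreaterThan": 365},
                    }
                },
            },
        }
    ]
}

# Print the JSON for review; it can then be applied through the portal, CLI, or SDK.
print(json.dumps(lifecycle_policy, indent=2))
```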
How to create a user-specific Purchase Requisition in a Multi-Entity Environment
Introduction
Multinational companies have different hierarchies for purchasing: in some companies, purchasing is done at the regional level, while in others it is done at the headquarters level by consolidating all regional requirements, to gain higher buying power and economies of scale and to maintain the same quality of goods across all regions. In a multi-entity environment with separate sets of employees for each legal entity, where only headquarters employees have purchasing authority for all regional entities, the decision on inventory is taken by the regional entity's employees. In this case, each region submits its requirements to headquarters. To post and report on what and how much was purchased for the respective regional entity, we need to create the purchase requisition with the respective buying legal entity. Here, USMF is the headquarters entity and PM is the regional entity.

Problem statement
While creating a purchase requisition from a headquarters employee's login, it is created with the buying legal entity set to the headquarters entity. For example, Julia is an employee of the headquarters entity USMF who will issue the purchase requisition, and Mahesh is an employee of the regional entity PM. When we log in to the PM entity from Julia's login and create a purchase requisition, the entity automatically changes to USMF. In other words, when a purchase requisition is made for the PM entity through Julia's login with details given by Mahesh, it should remain with the PM entity, but the entity changes to USMF, and hence the purchase requisition is registered under USMF.

Follow the steps below to create a purchase requisition with the buying legal entity as per the information given by the respective regional entity's employees, and to maintain all details on the respective entity; i.e., details given by Mahesh for the purchase requisition will be maintained on the PM entity.
1. We need to add Mahesh as a requester for the PM entity in Julia's account. Go to Procurement & Sourcing > Setup > Policies > Purchase Requisition Permissions.
2. The screen below will appear. For preparer, choose Julia. Requester: add all required employees of the PM entity, i.e., Mahesh, Jodi, Charlie, Ramesh, etc.
3. Go to Procurement & Sourcing -> Purchase Requisitions -> All purchase requisitions, create a new purchase requisition from the PM entity, and click on Requester; all the names added above will be available for selection.
4. Now, if we add the requester as Mahesh or any other name from the list, the purchase requisition will be created for the PM entity, and the window will not switch back to USMF. All items added to this purchase requisition will now be ordered and maintained for the PM entity.

Conclusion
For businesses with multiple legal entities, appropriately configuring purchase requisition permissions in Dynamics 365 guarantees that purchases are correctly attributed to the appropriate legal entities. This method improves reporting accuracy at the regional level while also streamlining the procurement process. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
CI/CD with TFS for Finance & Operations
Introduction
There are 100 million developers around the globe who use Git. However, if you want to work with customizations in Finance and Operations, you need to learn how to use TFS. Initially, I was frustrated and confused about this requirement, but as I learned more about how projects are structured both locally and in TFS, things started to make sense. TFS (Team Foundation Server) is Microsoft's take on source control management, continuously released and improved since 2005. TFS keeps the source code centralized and tightly integrated with the Microsoft ecosystem. Despite the differences, if you are familiar with Git, transitioning to TFS shouldn't be too difficult. TFS shares similar concepts with Git, such as checking in, branching, merging, version tracking, and other standard features of a source control management system. Understanding these similarities can make learning TFS easier and help you leverage its full potential in Finance and Operations projects.

Pre-requisites

Configuration
So, here we'll be starting with a new project. (If you are working with an already created repository, you can skip ahead.) Now, I'll be adding two folders here, "Main" and "Released". Later, we'll convert them into branches from Visual Studio. In TFS, we have the concept of branches, and in addition to that, branches can contain folders as well. Folders are used for organizing files and do not impact the version control flow directly; they are simply a way to keep the repository structured and manageable. Branches (similar to Git) are used to manage different versions or lines of development in the repository; they allow for parallel development and keep separate histories until changes are merged. Inside Main, I've added a trunk folder as well.

Now, let's head into our development environment and connect this project to Visual Studio. I've clicked on "Continue without Code" for now. I'll click on View and "Team Explorer". Here, it says "Offline" as currently there's no Azure DevOps project connected to it. So, let's do that! I'll click on "Manage Connections" and "Connect to a Project". Here, as the project is hosted in my own organization on Azure DevOps, I'll use the same credentials to log in. Here, we can see all the different projects I have created within my organization in Azure DevOps. I'll click on the relevant one and click on Connect.

Here, we see the three sections in the Team Explorer view:
1 – Which credentials are being used to connect.
2 – The name of the root of the project and where it is planning to download the content from TFS.
3 – The different components where we'll be doing most of our work once the initial setup is completed.
For now, I'll just click on "Map & Get". Here, we can see that the mapping was successful. Next, we click on the Source Control Explorer to see the actual content of the TFS server. Now, we can convert the "Main" and "Release" folders into branches. We can do this by right-clicking on the folder -> Branching and Merging -> Convert to Branch. After converting them to branches, the icon next to them changes. Next, I'll right-click on my "Main" branch and add two new folders here: "Metadata" and "Projects". Now, before we can use these folders anywhere, we need to "push" these changes to TFS. For that, we right-click on the "Trunk" folder and click on "Check in Pending Changes". Now, we add a comment here describing what changes have been done (similar to a commit message). At the bottom we can see the files that have been created or modified.
Once the check-in is done, we can see that the "+" icon next to the folders disappears and we get a notification that the check-in has been completed successfully. Now, this is where TFS shines as source control management for Finance and Operations. In FnO, models and projects are stored in separate folders. Using Git for this setup can be tricky, as it would mean either managing two different repositories or dealing with a huge .gitignore file. TFS makes it easier by letting you map local folders directly to TFS folders, simplifying the management process.

Here, we can see that currently our mapping is a bit different from what we need; this is because of the "Map & Get" we did initially. So, to change that mapping, click on "Workspaces", then click on "Edit". Now, we click on a new line to create a new mapping. Here, I'm creating a mapping between the "Metadata" folder in the "Main" branch of TFS and the "PackageLocalDirectory", the place where all the models are stored on my system. Now, I'll create another mapping between the Projects folder and the local folder where my projects are stored. Once I click on "OK", it'll ask whether I want to load the changes. Click on "Yes" and move forward.

But nothing changes here in Source Control Explorer. That's because the Source Control Explorer shows what is stored in TFS, and right now, nothing is; so we'll have to add some models or projects here. Either we can add existing ones or we can create a new one. Let's try to create a new model. Now that the model is created, we'll need to add it to our source control. Click on the blank space within the "Metadata" folder and select "Add Items to Folder". In the window, we can see that because of the mapping, we are sent to the local directory "PackageLocalDirectory", and we can see our model inside it. Select that and click on "Next". In the next view, we can see all the files and folders contained within the selected folder. Out of these, we can exclude the "Delta" folders. After this, we are left with these folders for the different elements. We can remove the content from the "XppMetadata" folders as well, which leaves us with just the Descriptor XML file. **Please do not exclude the descriptor file, as without it Visual Studio will not be able to refer to your model or its …
Leverage Postman for Streamlined API Testing in Finance and Operations
Introduction
Postman is an essential tool for developers and IT professionals, offering a robust platform for testing APIs, automating test processes, and collaborating efficiently across teams. In this blog, we're going to connect Postman to our Finance and Operations environment so we can test out standard or custom APIs. This connection is a crucial step in ensuring that your APIs function as expected, and it helps streamline the integration of various business processes within your organization. Whether you're new to API testing or looking to optimize your current setup, this guide will walk you through the process with clear, actionable steps. I've already covered automating the testing in Postman in my blog here, so once the connections are in place you'll be good to go!

Pre-requisites

Configuration
We'll start by creating an App Registration in the Azure Portal. Go to the Azure Portal (of the same tenant as your FnO environment). Search for "App Registration" and click on "New Registration". Add a name for your new app and click on "Register." Once it is completed, you'll be taken to the Overview of the app. Here, click on "Add a certificate or secret" under "Client Credentials." Add an appropriate name and select the expiration date of the certificate as necessary. Once you click on Add, you'll get the confirmation message that the client credential has been created, and you'll be able to see the value of the secret. ** Be sure to copy this value and keep it secure, as once we leave this page, the value will no longer be available. **

Now that everything is done on the Azure side, open your FnO environment and search for "Microsoft Entra Applications." Click on "New." Paste the "Application (Client) ID" into the "Client ID" field, then assign it a suitable name and a User ID. The permissions given to the User ID will determine the permissions for the app. For now, I've assigned the "Admin" user. That's all the configuration required on the FnO side.

Now, let's jump back into Postman. In Postman, we'll start with a blank workspace and create a simple collection. The first thing I like to do is create different environments. In FnO, we have a Production environment, a Sandbox, and possibly multiple development environments, so different environments may be using different apps. To represent these environments, I like to create different environments in Postman as well. This is done by going to "Environments", clicking on "+" to create a new environment, and giving it an appropriate name. Now, in this environment, I'll add my app details as environment variables. The values for these can be found as follows – "grant_type" can be hard-coded to "client_credentials", and we can leave "Curr_Token" blank for now. So, at the end we get – We can also change the type to "Secret" so that no one else can see these values.

Now the necessary credentials have been added to Postman. Next, we'll set up our collection for generating the auth token. For that, we'll copy the "OAuth 2.0 token endpoint (v2)" from "Endpoints" on our Azure App Registration's Overview screen. In Postman, click on your collection, then "Authorization", then select "OAuth 2.0". I'll paste the URL we copied from the "OAuth 2.0 token endpoint (v2)" into the "Access Token URL" field, and I'll add my variables as defined in the environment variables.
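As a side note, the same client-credentials token flow can be reproduced outside Postman with a short script. The sketch below only mirrors what Postman does under the hood; the tenant, client, secret, and environment URL are placeholders to be replaced with your own values.

```python
import requests

TENANT_ID = "<tenant-guid>"                            # placeholder
CLIENT_ID = "<application-client-id>"                  # placeholder, from the app registration
CLIENT_SECRET = "<client-secret-value>"                # placeholder, the secret copied earlier
BASE_URL = "https://yourenv.operations.dynamics.com"   # placeholder FnO environment URL

# Client-credentials request against the v2.0 token endpoint
# (the same URL used as Postman's "Access Token URL").
token_response = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": f"{BASE_URL}/.default",
    },
)
access_token = token_response.json()["access_token"]

# Equivalent of calling {{base_url}}/data in Postman: lists the OData entities exposed by FnO.
data_response = requests.get(
    f"{BASE_URL}/data",
    headers={"Authorization": f"Bearer {access_token}"},
)
print(data_response.status_code)
```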
If you get the error "Unresolved Variable" even after defining the variables, it could mean that the environment isn't set as active. Go back to the Environments list and mark it as active; this way we can easily swap between environments. Once the environment is marked as active, we can see that the variable is resolved correctly. I'll also ensure that my Access Token URL refers to the tenant ID as per my variable by embedding the "tenant_id" variable in it. Next, I'll click on "Get New Access Token." If everything has gone well, you'll be able to generate the token successfully. After that, you can give it a name and click on "Use Token" to use it.

I'll now create a simple request which we can use to test this token. Right-click on the collection, click on "Add Request", and give it an appropriate name. The "{{base_url}}/data" endpoint returns a list of APIs available in the system. I've set the Authentication to "Inherit Auth from parent", which means the request relies on the authentication set on the collection, as shown on the right side of the screen. Here we see that the request was executed successfully. If for some reason you cannot use the standard Postman way of generating the token, you can create a separate request responsible for generating the auth token, store the token as a variable, and use it in your requests. From there, you can use the generated "access_token" and pass it as a "Bearer Token" to your requests. Or you can select the entire token and set it to your "Curr_Token" variable, and then pass this variable to the requests like – From Postman, we can then share these collections (which contain API data) and environments (which contain credentials) separately as needed.

All data entities follow the pattern – {{base_url}}/data/{{endpoint}}
All services follow the pattern – {{base_url}}/api/services/{{service_group_name}}/{{service_name}}/{{method_name}}
If I call a "GET" request on them, I get the details of the services; for instance, here I'm getting the type of data I have to send in and the response I'll be getting back, in the form of object names. Moving back one step, I'm getting the names of the operations (or methods) within this service object. Moving back one step, I'm getting the services within this service group. Moving back one step, I can see all the service groups available in the system. To actually run the logic behind the services …