
Category Archives: Dynamics 365

User Adoption Strategy for Dynamics 365 Implementation Success 

Posted On October 3, 2024 by Priyesh Wagh

Problem Statement

After an implementation has gone live, do we see user participation taper off and then decline? That is when we feel the need to put an adoption strategy and metrics in place.

Strategizing User Adoption

Here are some thoughts on how to plan for User Adoption –

User Adoption Measurements

Here are some of the measurements and steps to take for User Adoption –

Conclusion

When you measure user adoption through trackable metrics and numbers, you can monitor the implementation's success path. This helps in rethinking the implementation and taking corrective action before the implementation itself, and why it was planned in the first place, is called into question. Partners, peers, and all stakeholders share equal responsibility for making the implementation a resounding success.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Inventory Closing and Recalculation

How do you perform inventory closing and recalculation? In D365 Finance & Operations, inventory closing and recalculation are essential processes and are recommended to be part of the month-end close standard operating procedures. If these practices are not followed, companies may face issues like inventory miscalculations and inconsistent inventory values across the defined dimensions.

– Go to Inventory Management > Periodic Tasks > Closing and Adjustment.
– From the Action Pane, open the Close procedure dropdown and click Close inventory.
– A dialog box opens; select the Closing Period Code.
– Under Post-Closing, enable the Run recalculation after closing parameter, then click OK. Enabling this parameter runs the recalculation immediately after the inventory closing.
– After clicking OK, the system runs the closing and recalculation batch job.

This is how the inventory closing procedure takes place in D365 Finance and Operations.

What happens during inventory month close and inventory recalculation:

– Inventory Month Close: The system generates inventory closing journals and settlement entries for the closed transactions, resulting in adjustments that update inventory accounts such as inventory value and cost of goods sold. It also blocks inactive dimensions from being considered in the valuation process. (A simplified weighted-average illustration of such an adjustment follows this post.)
– Inventory Recalculation: The system revalues inventory to adjust inventory values based on the latest costs, market values, and the inventory valuation method selected (FIFO, LIFO, Weighted Average, Standard Cost).

By including inventory month close and revaluation in the month-end SOPs, companies can achieve efficient inventory management.

Conclusion: Inventory closing and recalculation in D365 Finance & Operations are critical processes for maintaining accurate inventory values and ensuring smooth month-end procedures. By performing these tasks regularly, businesses can prevent discrepancies, update inventory accounts effectively, and reflect true inventory costs based on the chosen valuation method.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
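To make the settlement idea concrete, here is the minimal conceptual sketch in Python referenced above (illustrative only, not D365 F&O code; quantities and costs are invented) showing how closing under the weighted-average method settles issues posted at a provisional cost against actual receipt costs:

```python
# Conceptual illustration only -- not D365 F&O code.
# During the period, issues are posted at a provisional (running) cost;
# inventory closing settles them against the true weighted-average cost
# of the receipts and posts the difference as an adjustment.

receipts = [(100, 10.00), (100, 12.00)]  # (quantity, unit cost) -- assumed data
issued_qty = 150
provisional_unit_cost = 10.00            # cost used when the issues were posted

total_qty = sum(qty for qty, _ in receipts)
total_value = sum(qty * cost for qty, cost in receipts)
weighted_avg_cost = total_value / total_qty        # (1000 + 1200) / 200 = 11.00

posted_issue_value = issued_qty * provisional_unit_cost  # 150 * 10.00 = 1500.00
settled_issue_value = issued_qty * weighted_avg_cost     # 150 * 11.00 = 1650.00
adjustment = settled_issue_value - posted_issue_value    # 150.00 moved to COGS

print(f"Weighted average cost: {weighted_avg_cost:.2f}")
print(f"Closing adjustment posted to COGS: {adjustment:.2f}")
```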


Creating an Application User for Dynamics 365 CRM in the Azure Portal and When to Use It

Introduction

In Dynamics 365 CRM, integrating with external systems, running automated processes, and developing custom applications often requires non-interactive access to CRM data. One of the most secure and efficient ways to achieve this is by creating an Application User via the Azure Portal. In this blog, we'll guide you through the step-by-step process of setting up an Application User and explain when and why you should use it in your CRM environment.

Steps to Create an Application User:

– Navigate to the Azure Portal and log in with your Azure account.
– Search for Azure Active Directory or select it from the left-hand menu.
– Click on "App registrations" in the Azure Active Directory blade and click on "New registration".
– Enter the following details:
– Click "Register".
– Select the newly created application from the App registrations list and click on "API permissions" in the left-hand menu.
– Click on "Add a permission".
– Select "Dynamics CRM".
– Select "Delegated permissions" and check the necessary permissions, such as user_impersonation.
– Click "Add permissions".
– Click on "Grant admin consent for [your organization]" and confirm.
– Go to "Certificates & secrets" in the application settings.
– Click on "New client secret".
– Add a description (e.g., "CRM App Secret") and set an expiry period.
– Click "Add".
– Copy the value of the client secret and store it securely. You will need it later.

Add the Application User in Dynamics 365 CRM:

– Log on to the Microsoft Power Platform admin (D365 admin) center as a system administrator.
– In the navigation pane, go to Environments, and then select an environment from the list.
– On the Settings tab, go to Users + permissions, and then select Application users. The Application users page appears.
– Click + New app user. A side panel will appear. Here you will have to:

When to Use an Application User

Conclusion

Creating an application user in Dynamics 365 CRM via the Azure Portal is a straightforward process that enhances the integration capabilities and automation potential of your CRM environment. By following the steps outlined above, you can set up an application user and leverage it for various integration and automation scenarios. A minimal connection sketch follows this post.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
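As referenced above, here is a minimal sketch of connecting as the application user from Python, using the MSAL library for the client-credentials flow (assumed installed via pip install msal requests). The tenant ID, client ID, secret, and organization URL are placeholders you would substitute with your own values.

```python
# Minimal sketch: authenticate as the application user and call the Dataverse Web API.
# Assumes the app registration and application user created above; all IDs and URLs
# below are placeholders, not real values.
import msal
import requests

TENANT_ID = "<your-tenant-id>"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<client-secret-value>"  # the secret value copied from Certificates & secrets
ORG_URL = "https://<yourorg>.crm.dynamics.com"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)

# Client-credentials flow: no interactive user involved, which is exactly
# the non-interactive access scenario an application user is meant for.
result = app.acquire_token_for_client(scopes=[f"{ORG_URL}/.default"])

response = requests.get(
    f"{ORG_URL}/api/data/v9.2/WhoAmI",
    headers={"Authorization": f"Bearer {result['access_token']}"},
)
print(response.json())  # returns the application user's UserId when everything is wired up
```

If the call fails with 401/403, re-check that admin consent was granted and that the application user has been assigned a security role in the environment.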


Customizing the Opportunity Close Dialog Box in Dynamics 365 CRM

Introduction

When managing Opportunities in Dynamics 365, the default dialog box for closing an Opportunity as either won or lost often lacks specific fields that may be essential for your business process. When you attempt to close an Opportunity as won or lost, you encounter the default dialog box with a fixed set of options. Curious about customizing this dialog box with additional fields? In this blog, I'll guide you through customizing your Opportunity Close dialog box when marking an Opportunity as won or lost.

Steps:

– Sign in to classic Dynamics 365 using your URL, such as abc.dynamics.com, and enter your credentials.
– Create a Solution and include the out-of-the-box 'Opportunity Close' table/entity.
– After adding the table/entity, navigate to the forms section and include the Quick Create form for 'Opportunity Close'.
– Include the fields you need in this Quick Create form. You can also add your own custom fields. Here, I've included the out-of-the-box fields. After making changes, save and then publish.
– After saving and publishing your changes, navigate to the model-driven app and choose the app where you wish to incorporate the entity.
– When customizing your form in Dynamics 365 the classic way, ensure you remove and re-add the Quick Create form for Opportunity Close in the Model-Driven Apps section.
– Ensure that your entity is added and all forms are included. Once saved, remember to publish your changes.
– Navigate to the Dynamics 365 page and refresh it 2-3 times. You will notice that when you attempt to close the Opportunity as won or lost, the default dialog box no longer appears. Instead, the custom Quick Create form you created is displayed.

Conclusion

Customizing the Opportunity Close dialog box in Dynamics 365 allows you to gather more relevant data at critical stages in the sales process. By following these steps, you can easily modify the default form and include additional fields that align with your organization's needs. This not only improves data capture but also ensures a more streamlined experience for your sales team. Hopefully, this guide has helped you understand the customization process and enabled you to take advantage of Dynamics 365's flexibility.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Advanced Warehouse Management – Location Directives in Microsoft D365 F&O – Part 4

Introduction

In this blog, we will learn about the basic setups required for the Advanced Warehouse Management process. These setups may vary depending on the business scenarios. For Location Directives to work in an advanced warehouse scenario, there are some prerequisites we need to complete first. The following are the setups that we need to configure:

The Location Directive plays a significant role in inventory movement in advanced warehouses. Location Directives are sets of rules that define pick and put, counting, license plate building, status change, quality check, and similar operations for an individual warehouse or a group of warehouses.

– For my current scenario, I will create a Location Directive for Sales Order and Transfer Order transactions.
– In the Work Order Type, select "Sales Orders" from the drop-down menu.
– Enter the information below as per business use cases.
– Enter the Location Directive name. Here I have entered "SO Pick".
– Enter the work type as "Pick".
– Enter the scope. Here I have selected "Multiple Items".
– Enter the warehouse.
– Add the From and To quantity fields.
– Add location directive actions. Here I have selected "Fixed and non-fixed locations".
– I have done the same setup for the Sales Order "Put".
– Now we will do the setup for the Transfer Order location directive.

Here, we will have to do separate setups for Transfer Issue and Transfer Receipt.

Transfer issue:
– Enter the name.
– Enter the work type as "Pick".
– Enter the scope. I have selected "Multiple Items".
– Select the warehouse. I have selected "All warehouses".
– Enter the quantity.
– Enter the fixed location usage.

Transfer receipt:
– Enter the name.
– Enter the work type as "Put".
– Enter the scope. I have selected "All Items".
– Select the warehouse. I have selected "All warehouses".
– Enter the quantity.
– Enter the fixed location usage. Here I have selected "Fixed and non-fixed locations".

Now the location directives are ready to use in the advanced warehouse process. That's it for this blog! How to use these location directives in actual transactions will be discussed later in the blog series.

Conclusion

We've explored the essential setups required for implementing the Advanced Warehouse Management process, focusing on the creation of Location Directives tailored to specific business scenarios. By configuring these directives, you can effectively manage inventory movements, ensuring a smoother workflow for Sales Orders and Transfer Orders.

Next in the blog series:
– How to create Work Classes and Work Templates in advanced warehouse management in D365.
– How to set up a Worker in advanced warehouse management in D365.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


The Importance of Physical Tracking and Negative Inventory Control to Prevent Stock Outs in D365 F&O

Introduction

In the field of stock management, keeping precise stock levels is essential to the success of your business. Two key components in achieving this are physical tracking and negative inventory control. Physical tracking records inventory movements in real time, and negative inventory control makes sure that stock levels never fall below zero, which keeps operations from being disrupted. In this first part of the blog, I will explore the theory behind these concepts, highlighting their importance in preventing stockouts. Understanding these principles is essential for effectively managing inventory in Dynamics 365 Finance & Operations.

First, let me explain physical inventory. Physical inventory involves continuously monitoring the movement of goods, including receiving, storing, and shipping. This tracking helps businesses keep an up-to-date record of their stock levels and avoid discrepancies. It is the practice of recording the actual quantities of products on hand at various locations, such as warehouses, distribution centers, and retail stores. The goal is to ensure that the recorded inventory levels match the actual quantities available, which is essential for accurate stock management and financial reporting. So, physical inventory is a critical component of inventory management that ensures businesses maintain accurate records of their actual stock levels. By implementing effective physical inventory practices, companies can improve inventory accuracy, prevent stockouts, and enhance overall operational performance.

How does physical tracking work in D365 Finance and Operations? In D365 F&O, physical tracking works by leveraging inventory dimensions, tracking codes, and item model groups to monitor and manage inventory accurately. Physical tracking helps you monitor and manage the actual stock of items in real time. You can define what you want to track (e.g., batch numbers, serial numbers) and then assign these dimensions to item model groups to specify how tracking is applied. In short, physical tracking ensures that you always have an accurate view of your inventory by recording and updating item details as they move through your supply chain.

Physical negative inventory occurs when the recorded quantity of items in your inventory system drops below zero. This situation arises when more items are issued or sold than are actually available in stock. For example, if your system shows you have 10 items in stock but you issue 15 items, your inventory record will show a negative quantity of -5 items. In Dynamics 365 Finance & Operations, you can control this by setting parameters that prevent negative inventory from being recorded. If you disable the option for allowing physical negative inventory, the system will only permit transactions if there is enough stock on hand. This helps ensure that your inventory records are accurate and reflect the true quantity of items available, preventing potential issues such as stockouts or discrepancies between physical stock and system records.

How does physical negative inventory work in D365 Finance and Operations? It refers to how the system manages inventory levels when they fall below zero. If you allow negative inventory, the system permits transactions even if stock levels go below zero, which can happen if more items are shipped or adjusted out than are available.
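To illustrate the behavior that the physical negative inventory setting governs, here is a small conceptual sketch in Python (illustrative logic only, not D365 F&O code; the class and quantities are invented):

```python
# Conceptual sketch of the negative-inventory guard -- not actual D365 F&O code.

class OnHand:
    """Toy model of on-hand stock for a single item/dimension combination."""

    def __init__(self, qty: float, allow_physical_negative: bool):
        self.qty = qty
        self.allow_physical_negative = allow_physical_negative

    def issue(self, qty: float) -> None:
        """Post an issue (shipment or adjustment out) against on-hand stock."""
        if not self.allow_physical_negative and self.qty - qty < 0:
            # Mirrors the behavior when the item model group disallows physical
            # negative inventory: the transaction is blocked up front.
            raise ValueError(f"Insufficient stock: on hand {self.qty}, requested {qty}")
        self.qty -= qty  # with the flag enabled, this can go below zero

stock = OnHand(qty=10, allow_physical_negative=False)
stock.issue(5)   # OK: 5 remain
stock.issue(15)  # raises ValueError instead of recording -10
```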
To control this, you can configure settings in the system: by navigating to Inventory Parameters and Item Model Groups, you can choose whether to permit or prevent negative inventory. When negative inventory is disabled, the system ensures that transactions only occur if there is sufficient stock, preventing inventory records from showing negative amounts. This helps maintain accurate inventory records and avoids potential issues like stockouts. Regular cycle counts and inventory adjustments are also important to keep the system aligned with actual stock levels and address any discrepancies.

Conclusion

Dynamics 365 Finance & Operations (D365 F&O) helps businesses keep track of their inventory and prevent stock issues in a simple and effective way. For physical tracking, D365 F&O uses inventory dimensions like site, warehouse, batch number, and serial number. This means you can always see where your items are and how many you have in real time, which helps avoid mistakes and keeps your operations running smoothly.

To control negative inventory, D365 F&O lets you set rules to stop inventory levels from dropping below zero. You can find these settings in Inventory Management and Item Model Groups. If you choose to prevent negative inventory, the system will block any transactions that would cause your stock to go below zero. This ensures your inventory records stay accurate, and you don't run into issues like running out of stock or having financial discrepancies.

Additionally, D365 F&O supports regular cycle counts and inventory adjustments. These regular checks help ensure that the actual physical stock matches what's recorded in the system, allowing you to correct any differences quickly. With these features, D365 F&O makes it easy for businesses to manage their inventory accurately and efficiently, supporting better decisions and smoother operations.

That's it for this part of the blog. In the next part, I will walk you through the process with examples, including products tracked by serial numbers and batch numbers. I will also explain how these features impact inventory transactions.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Integration with Finance and Operations – From Basics (Part 2)

Introduction

Finance and Operations provides two major ways for external systems to interact with tables (or data entities) using APIs: Custom Services and Data Entities. Data entities in D365 Finance and Operations simplify data management by grouping data from multiple tables. They make it easier to import, export, and integrate data with other systems. Custom services in D365 Finance and Operations allow developers to create web services for specific business needs. They enable external systems to interact with D365 F&O by exposing custom logic and operations. This helps in integrating and automating processes with other applications. In the previous blog, we saw how we can use Data Entities to create APIs. In this blog, we'll see how we can use Custom Services to create APIs.

References
– Custom Service Development
– Exposing an X++ Class as a Data Contract
– Using Data Contracts

Pre-requisites

Configuration

Right-click on the project, click on "Add" and then "New Item". Click on Services and select the "Service Group". Add an appropriate name for your Service Group. Do note that this will be a part of your endpoint URL. Once that is done, we'll need to create a new Service as well. Repeat the same steps, but this time select the "Service" object and add an appropriate name.

Once both the Service Group and Service objects are created, we'll need to create request, response, and request-processing objects. For that, right-click on Project > Add > New Item > Code > Class. Add an appropriate name and click on "Add".

In the Request object, set the [DataContract] attribute at the class level and add global variables which will be used to send data to the processing object. In the Response object, set the [DataContract] attribute at the class level and add global variables which will be used to return data from the processing object.

In the processing object, write the necessary logic. Here, I'm writing logic that pulls the data from the request object into local variables and then creates a Customer record along with an address entry for that customer. If everything completes successfully, I return a "Success" status along with the Customer ID; otherwise, a "Failed" status along with the Customer ID. If there is any logic for logging, that can be added to our processing class after the main operation has completed. You can do that in the following way –

Once this is done, we can add our processing class to our Service object. Open the "Service" object and set the "Class" field to the processing class you have created. Right-click on the Service object in the designer and click on "New Service Operation". In the new Service Operation that is created, set the method from the processing class that you want to call in the "Method" field. Set an appropriate name for that method (this will be part of the endpoint). Set the operational domain, i.e., whether it will only work for a particular company or across companies. Set the Access Level (access increases as you go down the list).

Now, we'll assign our Service object to the Service Group. Open the Service Group in the designer, right-click it, and then click on "New Service". In the newly created "ServiceGroupService" entry, set your "Service".

Then, after a rebuild, database sync, and deployment, open Postman and use the following URL template:

<base_url>/api/services/<ServiceGroup>/<Service>/<Method>
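As an illustration, here is a hedged sketch of calling such a custom service from Python with the requests library. The base URL and bearer token are placeholders, and the payload field names are hypothetical; they must match the [DataMember] properties defined on your request contract:

```python
# Sketch: invoking an F&O custom service endpoint from Python.
# BASE_URL, TOKEN, the service names, and the payload fields are placeholders.
import requests

BASE_URL = "https://<your-fno-environment>.dynamics.com"
TOKEN = "<bearer-token-from-azure-ad-client-credentials>"

endpoint = f"{BASE_URL}/api/services/<ServiceGroup>/<Service>/<Method>"

payload = {
    # Keys must match the [DataMember] names on the request data contract.
    "customerName": "Contoso Retail",  # hypothetical contract field
    "customerGroup": "30",             # hypothetical contract field
}

response = requests.post(
    endpoint,
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
)
response.raise_for_status()
print(response.json())  # e.g. {"Status": "Success", "CustomerId": "..."} per the contract
```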
Now, if I trigger the POST request, I'll get a "Success" status along with the CustomerId. If I try to recreate the same customer, I'll get a "Failed" status along with the CustomerId.

If you are not sure whether your API exists, you can simply call a GET request on the URL <base_url>/api/services. This returns a list of all the "Service Groups" present in the system. We can then call a GET request including this "Service Group" in our URL; this returns a list of all the "Services" present in the system for that "Service Group". We can then call a GET request including this "Service" in our URL; this returns a list of all the "Operations" present for that "Service". We can then call a GET request including this "Operation" in our URL; this returns the Request and Response objects for this Service Operation.

Conclusion

Thus, we saw how to create APIs using Custom Services in Finance and Operations. In the next blog, we'll see some advanced API functionalities that are present in Finance and Operations.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Integration with Finance and Operations – From Basics (Part 1)

Introduction

Finance and Operations provides two major ways for external systems to interact with tables (or data entities) using APIs: Custom Services and Data Entities. Data entities in D365 Finance and Operations simplify data management by grouping data from multiple tables. They make it easier to import, export, and integrate data with other systems. Custom services in D365 Finance and Operations allow developers to create web services for specific business needs. They enable external systems to interact with D365 F&O by exposing custom logic and operations. This helps in integrating and automating processes with other applications.

– Purpose: Data Entities simplify data management tasks like import, export, and integration; Custom Services expose custom business logic and operations as web services.
– Functionality: Data Entities provide structured access to data from multiple tables in a unified format; Custom Services allow external systems to perform actions or retrieve data via API calls.
– Usage: Data Entities are used for bulk data operations, data migration, and integration with external systems; Custom Services are used for real-time integration, extending functionality, and custom business process automation.
– Typical use cases: Data Entities cover data import/export, data synchronization, and data migration; Custom Services cover integration with external applications, custom business processes, and real-time data access.
– Data handling: Data Entities focus on data in bulk; Custom Services focus on specific operations or business logic.

Pre-requisites

References
– Data Entities Overview – Finance and Operations
– Build and consume data entities – Finance and Operations
– Exposing an X++ class as a Data Contract

Configuration

Here, to understand the creation of APIs in either case, we'll expose the same table using both Data Entities and Custom Services.

Data Entity: Right-click on the project, click on "Add" and then "New Item". Click on Finance and Operations > Dynamics 365 Items > Data Model and then select "Data Entity". Select the table that you want to expose in the "Primary Data Source" field, an appropriate "Entity Category", "Public Entity Name" and "Public Entity Set Name" (which is what the endpoint will be), and the Staging Table name. Select the necessary fields from the primary data source. You can add related tables by clicking on the small arrow next to the table name, which displays the list of all associated tables. Then you can select the relevant fields from the associated tables. Once done, you'll get one data entity, two security privileges, and one staging table created. If you want to add new data sources, you can right-click on the Primary Data Source's "Data Sources" tab and add a new data source. You can drag fields from any of the data sources into the "Fields" section of the data entity to make them available on the API.

Calling the Data Entity: You can call the <base url>/data URL to get a list of all the data entities available in the system. From here, if I call a GET request on my Data Entity (the "Public Collection Name" property of the data entity, which we set in the Data Entity wizard), I'll get the list of records. Please note that this "Public Collection Name" is case sensitive. Now, if I need to create a "Customer" record, I can simply pass the same keys into a POST request, and we can see the created record in FnO.
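For illustration, here is a hedged Python sketch of those GET and POST calls against the TestCustomers entity from this example. The base URL, token, and field names are placeholder assumptions:

```python
# Sketch: reading and creating records through a data entity's OData endpoint.
# BASE_URL, the token, and the field names are placeholders, not real values.
import requests

BASE_URL = "https://<your-fno-environment>.dynamics.com"
HEADERS = {
    "Authorization": "Bearer <token>",
    "Content-Type": "application/json",
}

# GET: list records exposed by the entity.
# Note: the Public Collection Name ("TestCustomers") is case sensitive.
response = requests.get(f"{BASE_URL}/data/TestCustomers", headers=HEADERS)
response.raise_for_status()
print(response.json()["value"])

# POST: create a record by passing the same keys seen in the GET response.
new_customer = {
    "dataAreaId": "usmf",             # assumed company
    "CustomerId": "CUST-001",         # hypothetical entity fields
    "CustomerName": "Contoso Retail",
}
response = requests.post(f"{BASE_URL}/data/TestCustomers", json=new_customer, headers=HEADERS)
response.raise_for_status()
```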
If we want to update a record, we make a PUT request with the syntax:

{{base_url}}/data/TestCustomers(dataAreaId='<Company Name>',CustomerId='<Customer Id>')

The URL must include all the Entity Keys defined on the Data Entity; since we have only one key field here, we simply pass that. Passing the request without the dataAreaId will throw errors. You can delete the record using the same syntax but with a DELETE request.

Conclusion: In this blog, we explored how to create APIs using Data Entities in Dynamics 365 Finance and Operations, simplifying data management and external system integrations. Data Entities offer an efficient way to handle bulk data operations, while Custom Services provide flexibility for exposing specific business logic. We'll see how to create APIs using Custom Services in the next blog.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


How to Create a User-Specific Purchase Requisition in a Multi-Entity Environment

Introduction

Multinational companies have different purchasing hierarchies: in some companies, purchasing is done at the regional level, while in others it is done at the headquarters level by consolidating all regional requirements to gain higher buying power and economies of scale and to maintain the same quality of goods across all regions. In a multi-entity environment with separate sets of employees for each legal entity, where only headquarters employees have purchasing authority for all regional entities, inventory decisions are taken by regional entity employees. In this case, each region submits a requirement to headquarters. To post and report on what and how much was purchased for the respective regional entity, we need to create a purchase requisition for the respective buying legal entity. Here, USMF is the headquarters entity, and PM is the regional entity.

Problem statement

When creating a purchase requisition from a headquarters employee's login, it is created with the buying legal entity set to the headquarters entity. For example, Julia is an employee of the headquarters entity USMF who will issue the purchase requisition, and Mahesh is an employee of the regional entity PM. When we log in to the PM entity from Julia's login and create a purchase requisition, the entity automatically changes to USMF. That is, when a purchase requisition is made for the PM entity through Julia's login with details given by Mahesh, it should remain on the PM entity, but the entity changes to USMF, and hence the purchase requisition is registered at USMF.

Follow the steps below to create a purchase requisition with the buying legal entity as per the information given by the respective regional entity employees and to maintain all details on the respective entity, i.e., details given by Mahesh for the purchase requisition will be maintained on the PM entity. Mahesh needs to be added as a requester for the PM entity under Julia's account.

1. Go to Procurement and sourcing > Setup > Policies > Purchase requisition permissions.
2. On the screen that appears, choose Julia as the preparer. For requesters, add all required employees of the PM entity, i.e., Mahesh, Jodi, Charlie, Ramesh, etc.
3. Go to Procurement and sourcing > Purchase requisitions > All purchase requisitions and create a new purchase requisition from the PM entity. Click on Requester; all the names added above will be available for selection.
4. Now, if we add Mahesh or any other name from the list as the requester, the purchase requisition will be created for the PM entity, and the window will not switch back to USMF. All items added to this purchase requisition will be ordered and maintained for the PM entity.

Conclusion

For businesses with multiple legal entities, appropriately configuring purchase requisition permissions in Dynamics 365 guarantees that purchases are correctly attributed to the appropriate legal entities. This approach improves reporting accuracy at the regional level while also streamlining the procurement process.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


CI/CD with TFS for Finance & Operations

Introduction

There are 100 million developers around the globe who use Git. However, if you want to work with customizations in Finance and Operations, you need to learn how to use TFS. Initially, I was frustrated and confused about this requirement, but as I learned more about how projects are structured both locally and in TFS, things started to make sense. TFS (Team Foundation Server) is Microsoft's take on source control management, continuously released and improved since 2005. TFS keeps the source code centralized and tightly integrated with the Microsoft ecosystem. Despite the differences, if you are familiar with Git, transitioning to TFS shouldn't be too difficult. TFS shares similar concepts with Git, such as checking in, branching, merging, version tracking, and other standard features of a source control management system. Understanding these similarities can make learning TFS easier and help you leverage its full potential in Finance and Operations projects.

Pre-requisites

Configuration

So, here we'll be starting with a new project. (If you are working with an already created repository, you can skip ahead.) Now, I'll be adding two folders here, "Main" and "Released". Later, we'll convert them into branches from Visual Studio.

In TFS, we have the concept of branches, and in addition, branches can contain folders as well. Folders are used for organizing files and do not impact the version control flow directly. They are simply a way to keep the repository structured and manageable. Branches (similar to Git) are used to manage different versions or lines of development in the repository. They allow for parallel development and keep separate histories until changes are merged. Inside Main, I've added a trunk folder as well.

Now, let's head into our development environment and connect this project to Visual Studio. I've clicked on "Continue without Code" for now. I'll click on View and "Team Explorer". Here, it says "Offline", as currently there's no Azure DevOps project connected. So, let's do that! I'll click on "Manage Connections" and "Connect to a Project". As the project is hosted in my own organization on Azure DevOps, I'll use the same credentials to log in. Here, we can see all the different projects I have created within my organization in Azure DevOps. I'll click on the relevant one and click on Connect.

Here, we see the three sections in the Team Explorer view:
1 – Which credentials are being used to connect.
2 – The name of the root of the project and where it plans to download the content from TFS.
3 – The different components where we'll be doing most of our work once the initial setup is completed.

For now, I'll just click on "Map & Get". Here, we can see that the mapping was successful. Next, we click on the Source Control Explorer to see the actual content of the TFS server. Now, we can convert the "Main" and "Release" folders into branches. We can do this by right-clicking on the folder -> Branching and Merging -> Convert to Branch. After converting them to branches, the icon next to them changes.

Next, I'll right-click on my "Main" branch and add two new folders here: "Metadata" and "Projects". Now, before we can use these folders anywhere, we need to "push" these changes to TFS. For that, we right-click on the "Trunk" folder and click on "Check in Pending Changes". Now, we add a comment here describing what changes have been done (similar to a commit message). At the bottom, we can see the files that have been created or modified.
Once the check-in is done, we can see that the "+" icon next to the folders disappears and we get a notification that the check-in has been completed successfully.

Now, this is where TFS shines through as better source control management for Finance and Operations. In FnO, models and projects are stored in separate folders. Using Git for this setup can be tricky, as it would either mean managing two different repositories or dealing with a huge .gitignore file. TFS makes it easier by letting you map local folders directly to TFS folders, simplifying the management process.

Here, we can see that currently our mapping is a bit different from what we need; this is because of the "Map & Get" we did initially. So, to change that mapping, click on "Workspaces", then click on "Edit". Now, we click on a new line to create a new mapping. Here, I'm creating a mapping between the "Metadata" folder in the "Main" branch of the TFS and the "PackageLocalDirectory", the place where all the models are stored on my system. Then I'll create another mapping between the Projects folder and the local folder where my projects are stored. Once I click on "OK", it'll prompt me to load the changes. Click on "Yes" and move forward.

But nothing changes here in Source Control Explorer. That's because the Source Control Explorer shows what is stored in TFS, and right now, nothing is; so we'll have to add some models or projects here. Either we can add existing ones or we can create a new one. Let's try to create a new model.

Now that the model is created, we'll need to add it to our source control. Click on the blank space within the "Metadata" folder and select "Add Items to Folder". In the window, we can see that because of the mapping, we are sent to the local directory "PackageLocalDirectory", and we can see our model inside it. Select that and click on "Next". In the next view, we can see all the files and folders contained within the selected folder. Out of these, we can exclude the "Delta" folders. After this, we are left with the folders for the different elements. We can remove the content from the "XppMetadata" folders as well, which leaves us with just the descriptor XML file. **Please do not exclude the descriptor file, as without it Visual Studio will not be able to refer to your model or its …

