Advanced Warehouse Management – Warehouses and Locations in Microsoft D365 F&O – Part 2
Hello everyone, in this blog series we are going to learn about Advanced Warehouse Management in D365. In this post we will learn about the basic setups required for the warehouse itself. These setups may vary depending on the business scenario. For a warehouse to work in an advanced warehouse scenario, there are some prerequisites we need to complete first. The following are the setups we need to configure.

Path: Inventory Management > Inventory Breakdown > Warehouses.

Click New, then enter the warehouse ID and warehouse name. Most importantly, for the warehouse to work in an advanced warehouse scenario, we have to enable "Use warehouse management processes".

For advanced warehouse management, locations play an important role in inventory visibility and item tracking. For this, we need to complete multiple setups, as below.

Location types: I have created four main location types: Baydoor, Floor, Recv, and User. The number of location types will increase or decrease as per the business scenario.
- Recv: I am using this location primarily for the entry of goods/items into the warehouse.
- Floor: I am using this location for storing goods/items in the warehouse.
- Baydoor: I am using this location for the exit of goods/items from the warehouse.
- User: I am using this location as the default. This user location is referenced in the warehouse management parameters, in the user location profile.

Location formats are the naming system by which we create unique and consistent names for the different locations within the warehouse. Here, I have set each segment length to 1 (a sketch of how such a format composes location IDs follows at the end of this post).

Location profiles connect the location types, location formats, and locations. The location profile is a very important setup because the location capacity is defined here; how inventory is stored in a location, and how it is accessed, also depends on the location profile. Here, I have enabled Use license plate tracking, Allow mixed items, Allow mixed inventory statuses, and Allow mixed inventory batches.

Finally, create the locations: click New, select the warehouse from the drop-down menu, enter the location name as per your requirement, and select the location profile ID from the drop-down menu. We can add as many locations as we want.

Now the warehouse and locations are ready to use in the advanced warehouse process. That's it for this blog!! How to use these warehouses and locations in actual transactions will be discussed later in the blog series. Keep learning!!!!!
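To make the location-format idea concrete, here is a minimal Python sketch of how a format composes a location ID from fixed-length segments. The segment names ("aisle", "rack", "shelf"), the lengths, and the "-" separator are hypothetical examples for illustration, not values taken from a specific D365 environment.

```python
# A minimal sketch of a location format: each segment has a configured length,
# and the segments are joined with a separator to form the location ID.
# Segment names, lengths, and separator are hypothetical examples.

def build_location_id(segments: list[tuple[str, int]], separator: str = "-") -> str:
    """Validate each segment value against its configured length and join them."""
    for value, length in segments:
        if len(value) != length:
            raise ValueError(f"Segment '{value}' must be exactly {length} character(s)")
    return separator.join(value for value, _ in segments)

# Example: aisle "A", rack "1", shelf "2", each with segment length 1
# as mentioned in this post.
print(build_location_id([("A", 1), ("1", 1), ("2", 1)]))  # -> "A-1-2"
```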
Advanced Warehouse Management – Item Creation Process in Microsoft D365 F&O – Part 1
Hello everyone, in this blog series we are going to learn about Advanced Warehouse Management in D365.

What is the advanced warehousing process? The day-to-day transactions in an advanced warehouse are a bit different than in a normal warehouse. Workers in an advanced warehouse use mobile devices to move stock from one location to another; "location" here doesn't just mean a spot within one warehouse, as stock can also be moved from one warehouse to another. So, instead of sitting in one place, workers move around the warehouse to pick or put stock. In advanced warehouse management, the visibility of stock is very clear: workers can see where the stock is in the warehouse and in what quantity. This results in streamlined warehouse operations, reduced inventory carrying costs, and improved customer satisfaction through faster and more accurate order fulfillment.

In this post we will learn about the basic setups required for an item in the Advanced Warehouse Management process. These setups may vary depending on the business scenario. For an item to work in an advanced warehouse scenario, there are some prerequisites we need to complete first. The following are the setups we need to configure:

1. Storage dimension group
Path: Product Information Management > Dimension and Variant Groups > Storage Dimension Groups.
In the storage dimension group, activate "Use warehouse management process." Once you activate it, Location, Inventory Status, and License Plate will be activated automatically.

2. Reservation hierarchy
Path: Warehouse Management > Setup > Inventory > Reservation Hierarchy.
A reservation hierarchy helps delay specific reservation details until after you've placed an order. This hierarchy relies on item-related factors like inventory status and license plate for storage and tracking. While it's essential to have site, warehouse, and inventory status information during the ordering process, certain details like location can be added after making reservations. For instance, the warehouse management system can identify the best locations for picking based on this reservation hierarchy.

3. Unit sequence groups
Path: Warehouse Management > Setup > Warehouse > Unit sequence groups.
The unit sequence group is a mandatory setup that applies when dealing with the movement of material. Take one example: if the unit sequence group contains Each and Box, then while posting a product receipt from a purchase order, the mobile device will show options to receive the material in two different units, Each and Box (see the sketch at the end of this post). Based on my scenario, I have set the default unit for purchase orders and transfer orders to Box.

Now the item is ready to use in the advanced warehouse process. That's it for this blog!! How to use this item in actual transactions will be discussed later in the blog series. Keep learning!!!!!

Next in the blog series: How to create warehouses in advanced warehouse management in D365. How to create locations in advanced warehouse management in D365.
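Here is a minimal Python sketch of the unit-sequence idea from step 3: the same received quantity can be expressed in any unit of the sequence (Each, Box), given a conversion factor. The factor of 10 eaches per box is a hypothetical example; the real factor comes from the item's unit conversions in D365.

```python
# A minimal sketch of a two-unit sequence (Each, Box). The conversion factor
# is a hypothetical example, not a value from a real item setup.

EACHES_PER_BOX = 10

def receive(qty: float, unit: str) -> float:
    """Normalize a received quantity to eaches, the lowest unit in the sequence."""
    if unit == "Each":
        return qty
    if unit == "Box":
        return qty * EACHES_PER_BOX
    raise ValueError(f"Unit '{unit}' is not part of the unit sequence group")

# A worker on the mobile device could register 3 boxes or 30 eaches; both
# result in the same on-hand quantity.
assert receive(3, "Box") == receive(30, "Each") == 30
```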
Transfer Environment in Business Central
Introduction: The Transfer Environment feature in Microsoft Dynamics 365 Business Central allows you to seamlessly move an environment from one Microsoft Entra tenant to another, through a self-service process in the admin center. This streamlined process enhances flexibility and efficiency for managing Business Central environments across different tenants.

Steps:
1. Go to the Business Central Admin Center of the source tenant.
2. We are going to transfer "Shubham" to a different tenant. Click on "Environment Transfers".
3. Go to Transfer Environment.
4. Select the environment that you are going to transfer and enter the destination tenant ID. It is also possible to schedule the transfer up to 2 weeks ahead and set the time when it should happen.
5. Now go to the destination tenant's Admin Center.
6. Click on Environment Transfer > Receive Environment.
7. Enter the source tenant ID (the tenant from which the environment is to be sent) and click Next.
8. After that, the environment will be displayed under Pending Incoming Transfers. Now just click on the environment and confirm.
9. The environment has been successfully transferred.

Note: If the destination tenant already has 1 production environment and 3 sandboxes, the environment transfer is not possible. For a successful transfer, the destination tenant must have no existing production environments (if you are transferring a production environment) or fewer than 3 sandboxes (if you are transferring a sandbox). A sketch for checking this ahead of time follows at the end of this post.

Conclusion: In conclusion, this blog has provided valuable insights into how to transfer an environment in Business Central. Thank you very much for reading. I hope this helps!
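Below is a minimal Python sketch (not an official sample) for checking the destination tenant's environments before scheduling a transfer, using the Business Central admin center API. The API version in the URL and the token acquisition are assumptions; adjust them to your tenant and app registration.

```python
# A minimal sketch: list the destination tenant's environments and count
# production/sandbox instances, mirroring the quota note above.
# The admin center API version and the bearer token are assumptions.
import requests

ADMIN_API = "https://api.businesscentral.dynamics.com/admin/v2.21"  # version may differ
TOKEN = "<access-token-for-destination-tenant>"  # e.g., acquired via MSAL

resp = requests.get(f"{ADMIN_API}/environments",
                    headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
envs = resp.json().get("value", [])

productions = [e for e in envs if e.get("type") == "Production"]
sandboxes = [e for e in envs if e.get("type") == "Sandbox"]

# A production transfer needs no existing production environment in the
# destination, and a sandbox transfer needs fewer than 3 existing sandboxes.
print(f"Production environments: {len(productions)}, Sandboxes: {len(sandboxes)}")
```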
Configure an Azure Connector in LCS
Introduction
In this blog, we'll be looking into configuring the Azure Connector in LCS with the Azure Resource Manager so that LCS can deploy your resources to Azure.

Pre-requisites
An Azure subscription that you are a co-administrator in.

Configuration
Go to Microsoft Dynamics Lifecycle Services and log in with your account. In LCS, when we try to create a cloud-hosted environment for the first time, it prompts us to create an Azure Connector first. You can also access this by going to your Project Settings and then "Azure Connectors."

Once we reach this screen, we have to click on Authorize in the organization where we want to authorize. Please ensure your account has the necessary permissions for these actions. Once this is done, click on Microsoft Azure Portal, as there are a few configurations we need to do in the Azure Portal.

Click on Subscriptions. From the Subscriptions list, note down the Subscription ID, as we will need it while creating the Azure Connector. The Subscription ID is also available in the Overview section of the subscription.

Then go to the Access Control (IAM) tab, click Add, and then Add Role Assignment. Go to Role -> Privileged Administrator Roles, search for "Contributor", click on it, and then click Next. In the Members tab, click on User, group or Service Principal and click Select Members. Then search for and add "Dynamics Deployment Services [wsfed-enabled]" and your own user to this role assignment. Once that is done, we'll get a confirmation message.

After that, we can move back to LCS to configure the Azure Connector. We click on Add to create a new Azure Connector and get the following pop-up. Here we add a name for the connector, the Azure Subscription ID, and the domain name. The name can be anything you want; the domain name in most cases is the part of your email address after the @. For example, it would be "microsoft.com" in the case of rbansode@microsoft.com. We already noted the Azure Subscription ID in the previous steps.

Once we add the necessary values and click Next, we get the following pop-up. We've already completed the necessary steps in the Azure Portal, so we can simply click Next. After that, we get another pop-up. We've completed the steps mentioned in the "Ensure you are a subscription user" section. If for some reason you are facing any difficulties with that, you can also try the steps from the "Apply a subscription tag" section.

Apply a subscription tag
When you click on Get a Code, you'll get a pop-up which includes a unique verification code. Copy this and head to the Azure Portal. Go to your subscription, head to the Tags section, and create a new entry with the name "LifecycleServicesAuthCode" and the value set to the unique verification code from LCS. A scripted alternative for this step is sketched at the end of this post.

If neither of those methods works, there is a soon-to-be-deprecated method as well, where you upload the certificate downloaded from LCS into the "Management Certificates" of your Azure subscription.

Hopefully, one of these three methods works out for you and you'll get the final pop-up. Once you click on Connect, you'll see an entry created in your Azure Connectors. This indicates that your Azure account has been linked and LCS can now use it to create resources in Azure on your behalf.

Side Note
If you see an error message, that means there was a problem with the approach you chose. You can try another approach and start over.
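If you prefer scripting the "Apply a subscription tag" step instead of using the portal, here is a minimal Python sketch using the Azure Resource Manager Tags API. The token acquisition is an assumption; any ARM-scoped bearer token (for example, one from `az account get-access-token`) works.

```python
# A minimal sketch: merge the LifecycleServicesAuthCode tag onto the
# subscription via the ARM Tags API, without touching existing tags.
# Subscription ID, LCS code, and token are placeholders you supply.
import requests

SUBSCRIPTION_ID = "<your-subscription-id>"
LCS_CODE = "<unique-verification-code-from-LCS>"
TOKEN = "<arm-access-token>"

url = (f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
       f"/providers/Microsoft.Resources/tags/default?api-version=2021-04-01")

# PATCH with operation "Merge" adds the tag alongside any existing tags.
resp = requests.patch(url,
                      headers={"Authorization": f"Bearer {TOKEN}"},
                      json={"operation": "Merge",
                            "properties": {"tags": {"LifecycleServicesAuthCode": LCS_CODE}}})
resp.raise_for_status()
print("Tags on subscription:", resp.json()["properties"]["tags"])
```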
Conclusion Thus, we saw how to configure the Azure Connector in LCS. Happy Coding!
Create a New Environment in LCS for D365 Finance and Operations
Introduction
In this blog, we'll be looking into creating a new environment for D365 Finance and Operations or D365 Commerce.

Configuration
Go to Microsoft Dynamics Lifecycle Services and log in with your account. If you select D365 Commerce, you get the corresponding screen. If you select D365 Finance and Operations, you get another screen where you have to specify whether the project is an actual implementation or just for evaluation, after which you get the same screen as below.

Once the project is created, click on the hamburger menu at the top and then on Cloud Hosted Environments. Click Add to create a new environment. If you get a pop-up asking you to configure an Azure Connector, please refer to my blog "Configure an Azure Connector in LCS". Once you have an Azure Connector configured, you can click Add again and get the following pop-up.

After selecting the application and platform version, you'll get the option to select the environment topology:
DEMO – A demo environment includes only Microsoft demo data. You can use it to explore default features and functionality.
DEVTEST – A DevTest environment is for development or build.

Then we get another pop-up to select the environment topology. After that is selected, we decide the environment name and the size of the VM to be used for this environment. You can read more about VM sizes here – VM sizes – Azure Virtual Machines | Microsoft Learn. A sketch for listing the sizes available in your region follows at the end of this post.

Once we click Next, we get the last pop-up, after which the environment gets deployed. Deployment takes about 6-8 hours, after which the environment will be available in the Cloud Hosted Environments section. If, for some reason, you try to create an environment with the latest platform and application version and that deployment fails, you can try to create an environment one platform/application version below that.

Conclusion
Thus we saw how to create an environment in LCS for D365 Finance and Operations or D365 Commerce. Happy Coding!
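Since cloud-hosted environments are deployed into your own Azure subscription, it can help to see which VM sizes are actually available in your region before picking one. Here is a minimal Python sketch using the azure-identity and azure-mgmt-compute packages; the subscription ID and region are placeholders you supply.

```python
# A minimal sketch: list the VM sizes available in a region of your
# subscription. Assumes azure-identity and azure-mgmt-compute are installed
# and a credential (e.g., Azure CLI login) is available.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

client = ComputeManagementClient(DefaultAzureCredential(), "<your-subscription-id>")

for size in client.virtual_machine_sizes.list(location="eastus"):
    print(f"{size.name}: {size.number_of_cores} cores, {size.memory_in_mb} MB RAM")
```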
AS2 using Logic App
High-level steps to start building B2B logic app workflows:

Creating a Key Vault for the certificate and private key
Create an Azure Key Vault. In the next step, select Vault access policy and select the users, then select Review + Create. Add the access policy and assign it to the Azure Logic App.

Create the certificate
Click the certificate and download it. Create a key and attach the file in .pfx format (a sketch for generating such a certificate/key pair follows at the end of this post).

Creating two integration accounts for adding partners, agreements, and certificates
Create two integration accounts, one for the sender and one for the receiver. Add the sender and receiver partners in both integration accounts. Add a public certificate in the sender integration account and a private certificate in the receiver integration account. Now we need to add the agreement in both the sender and receiver integration accounts:
- Sender agreement: Send settings
- Receiver agreement: Receive settings

Creating two Logic Apps, one for sending (encoded message) and one for receiving (decoded message)
Create two logic apps and add the integration account to the respective logic app.
- Logic App for the sender (encoding the message)
- Logic App for the receiver (decoding the message)
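For illustration, here is a minimal Python sketch (using the `cryptography` package) that generates the kind of key/certificate pair used above: a public .cer for the sender integration account and a .pfx bundle for Key Vault and the receiver side. The subject name and password are hypothetical; in production the certificate would usually come from a CA rather than being self-signed.

```python
# A minimal sketch: generate a self-signed certificate and export it as
# a public .cer plus a password-protected .pfx. Subject name and password
# are hypothetical placeholders.
import datetime
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives.serialization import pkcs12
from cryptography.x509.oid import NameOID

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "as2-receiver.example.com")])
cert = (x509.CertificateBuilder()
        .subject_name(name).issuer_name(name)
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(datetime.datetime.utcnow())
        .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
        .sign(key, hashes.SHA256()))

# Public certificate for the sender integration account.
with open("as2_public.cer", "wb") as f:
    f.write(cert.public_bytes(serialization.Encoding.DER))

# .pfx bundle (private key + certificate) for Key Vault / the receiver side.
pfx = pkcs12.serialize_key_and_certificates(
    b"as2-cert", key, cert, None,
    serialization.BestAvailableEncryption(b"<pfx-password>"))
with open("as2_private.pfx", "wb") as f:
    f.write(pfx)
```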
“Expiration Date being past the Required Date” issue for Batch Number in D365 Finance & Operations.
In Dynamics 365 Finance and Operations (D365 F&O), the use of batch numbers is a common practice to manage and trace items with specific characteristics. Batch numbers are typically assigned to groups of items produced or received together, allowing for better control, tracking, and compliance with industry regulations. In this blog I will explain how to solve the expiration date issue while registering a batch- and serial-number-tracked product.

In the screenshot above you can see that batch number 23010-CM-000088 has been assigned to my item P-000014. Here, the expiration date is 08-04-2023. Now, if I try to register this item with the same batch number, I will get the "Expiration Date being past the Required Date" error, because the batch has already expired relative to the transaction date (see the sketch at the end of this post).

To solve this error, go to Inventory Management > Inquiries and Reports > Tracking Dimensions > Batches. On the Batch Number page, go to the Reset tab and click Reset Shelf-Life Dates. The next step is to select the new expiration date and click OK. This will update the expiration date of that batch number. In the screenshot above you can see that the expiration date has been changed. Now we will be able to register the item; here, you can see that I am now able to register the item successfully.

That's it for this blog. Hope this helps you! Thank You!
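Here is a minimal Python sketch of the check behind this error: registration fails when the batch's expiration date falls before the transaction's required date. The dates are the ones from this example.

```python
# A minimal sketch of the validation that raises the error above.
from datetime import date

expiration_date = date(2023, 4, 8)   # batch 23010-CM-000088 (08-04-2023)
required_date = date.today()         # the date of the registration attempt

if expiration_date < required_date:
    print("Error: Expiration Date is past the Required Date - reset shelf-life dates.")
else:
    print("Batch can be registered.")
```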
Restoring an Environment in Business Central.
Introduction: This comprehensive guide provides administrators with a step-by-step process for restoring Microsoft Dynamics 365 Business Central environments to a previous state within the retention period.

Users with restore permissions: Only specific users, such as internal and delegated administrators, can restore environments. These users must also have the D365 BACKUP/RESTORE permission set in the relevant environment.

Considerations and limitations: Environments can be restored up to 10 times per calendar month. Restoration is limited to the same Azure region and country as the original environment (a sketch of these constraints follows at the end of this post).

Preparation before restoration: Before restoring an environment, it's essential to communicate the plan within the organization, restrict user access, and consider renaming the environment to avoid conflicts.

Restoration process:
1. Go to the Business Central Admin Center.
2. Select the environment that you want to restore.
3. Click on Restore.
4. In the Restore Environment window, select the date and time to which you want to restore the environment.
5. Select the type of environment, such as sandbox or production.
6. Enter a name for the restored environment.
7. Click on Restore.

Important point: You can restore your production environment into a new production environment even if doing so results in exceeding your number of environments or database capacity quotas. You can, however, only exceed this quota by one extra production environment, regardless of how many production environments you have available for your subscription. This capability is provided as an exception, to ensure that you can always restore your production environment in critical situations. You must return within your quota within 30 days following the restore, either by removing the original production environment or by purchasing an additional production environment.

Once the data in the restored database meets your expectations, activate the users, initiate the work queues, and notify your organization that the environment is once again available for use and that the restoration procedure is now complete.

Hope this helps!
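As a quick recap of the considerations above, here is a minimal Python sketch of the restore constraints. The counters and flags are hypothetical inputs; in practice the admin center enforces these rules for you.

```python
# A minimal sketch of the restore constraints described in this post.
def can_restore(restores_this_month: int, same_region: bool, same_country: bool) -> bool:
    """An environment can be restored up to 10 times per calendar month,
    and only within the original Azure region and country."""
    return restores_this_month < 10 and same_region and same_country

print(can_restore(restores_this_month=3, same_region=True, same_country=True))   # True
print(can_restore(restores_this_month=10, same_region=True, same_country=True))  # False
```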
Optimizing Project Impact: Continuous Monitoring of Client System Utilization for Enhanced Value Delivery using Business Central
Introduction: It is crucial for the management team to track the client's utilization of the system as a key metric for assessing the project's success and the value it brings to the client. To facilitate this monitoring process, I have developed a utility that can automatically generate and send reports to the management team, detailing the number of records created in specified tables. For example, during the initial master data upload phase, 1,500 records were added to the Customer table. Subsequently, over the following month, this figure increased to 1,750 and then to 1,950. Such a trend signifies that the client is utilizing the system in line with expectations.

Usage Statistics Setup page: This page contains two main fields: Collect Statistics (a Boolean) and Mail Recipients (which contains the email IDs to which the report has to be sent). One more field, the primary key, is added to the table but not to the page; it is of datatype Code, and since the default value of a Code field is blank, the page opens directly on the single setup record, giving us the header we need for a setup page. Email validation is applied to Mail Recipients using a regex for pattern matching; the user can enter multiple email addresses in the format abc@gmail.com;xyz@gmail.com (a sketch of this validation appears below). Processing can continue only if Collect Statistics is enabled and there is at least one mail ID present in Mail Recipients.

Usage Statistics Configuration page: This page contains the configurations from which the data will be filtered and the report generated. The list and card pages are created with the same fields. The "AllObjWithCaption" virtual table is used for viewing all object details in the system, and the OnLookup trigger is used to get the table number and table name at runtime. After fetching the specific table number and table name, records are filtered according to the Filter Field 1 Value, and the same goes for the Filter Field 2 Value (Filter Field 2 is added according to the requirements). The FieldsDisplay procedure is used to retrieve the field number and field name of the corresponding record. Any field can be selected in Filter Field 1 Name, and Filter Field 1 Value must be set according to that field.

Create Statistics report: This report is designed to automate the generation of usage statistics based on configurations specified in the "Usage Statistics Configuration" table. The report is flagged as "ProcessingOnly," indicating it is intended for background processing rather than direct user interaction. The dataset within the report contains a data item with an "OnAfterGetRecord" trigger, which executes after each record is retrieved. This trigger is responsible for processing each configuration record, applying filters, and updating or inserting records into the "UsageStatistics" table. Additionally, the report features an "OnInitReport" trigger that checks the "Usage Statistics Setup" table to ensure that statistics collection is enabled; if this condition is not met, an error message is displayed and the report exits. In essence, this report streamlines the creation of usage statistics in Business Central, adhering to specified configurations and ensuring the necessary setup conditions are satisfied before processing.

Usage Statistics page: After the filters are applied, the generated data is displayed on this page; a list page is also created with the same fields. Record Count holds the number of records that satisfy the filter condition.
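Here is a minimal Python sketch of the Mail Recipients validation described above: one or more email addresses separated by semicolons, e.g. abc@gmail.com;xyz@gmail.com. The pattern is illustrative, not the exact regex used in the AL code.

```python
# A minimal sketch of validating a semicolon-separated list of email
# addresses. The pattern is an illustrative approximation.
import re

RECIPIENTS_PATTERN = re.compile(
    r"^[\w.+-]+@[\w-]+\.[\w.-]+(;[\w.+-]+@[\w-]+\.[\w.-]+)*$")

def valid_recipients(value: str) -> bool:
    return bool(RECIPIENTS_PATTERN.fullmatch(value))

assert valid_recipients("abc@gmail.com;xyz@gmail.com")
assert not valid_recipients("abc@gmail.com;")  # trailing separator is rejected
```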
Send Statistics Report: This report is designed to send the usage statistics via email. Let's break down the code. The report begins with specifications such as its application area, caption, and usage category; notably, it is marked as a "ProcessingOnly" report, indicating it is intended for background processing rather than direct user interaction. The OnInitReport trigger executes when the report is initialized. It checks the settings in the "Usage Statistics Setup" table, ensuring that statistics collection is enabled ("Collect Statistics") and valid mail recipients are specified ("Mail Recipients"); if these conditions are not met, error messages are displayed and the report exits. The main functionality is in the OnPostReport trigger, which executes after the report is processed and emails the generated Excel sheet to the configured recipients (a sketch of this send step follows at the end of this post).

Email Excel sheet: In the example, the report was run on the 22nd after applying the filters, and again on the 24th with the same filters; since no new data had been added to the respective table in between, it shows the same record counts.

Conclusion: In conclusion, the automated reporting tool plays a pivotal role in monitoring client system utilization, revealing encouraging trends such as the gradual increase in customer records. These insights affirm the project's success and underline its value to the client, reinforcing our commitment to proactive monitoring for continual optimization and client satisfaction.
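For illustration, here is a minimal Python sketch of the send step: split Mail Recipients on ";" and email the exported sheet as an attachment. The SMTP server, sender address, and file name are hypothetical placeholders; the actual report uses Business Central's built-in email module rather than raw SMTP.

```python
# A minimal sketch: email an exported Excel sheet to a semicolon-separated
# recipient list. Server, sender, and file names are placeholders.
import smtplib
from email.message import EmailMessage

recipients = "abc@gmail.com;xyz@gmail.com".split(";")

msg = EmailMessage()
msg["Subject"] = "Usage Statistics Report"
msg["From"] = "noreply@example.com"
msg["To"] = ", ".join(recipients)
msg.set_content("Please find the latest usage statistics attached.")

with open("usage_statistics.xlsx", "rb") as f:
    msg.add_attachment(
        f.read(), maintype="application",
        subtype="vnd.openxmlformats-officedocument.spreadsheetml.sheet",
        filename="usage_statistics.xlsx")

with smtplib.SMTP("smtp.example.com", 587) as server:
    server.starttls()
    server.send_message(msg)
```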
Salesforce Integration using Azure Integration Services
This blog shows detailed information for an integration between SAP B1 and Salesforce. The AIS interface is intended to extract, transform, and route data from SAP B1 to Salesforce. The steps for the integration are the same for different entities.

Event scenario
Pre-requisites:
Process steps:

On-demand load scenario
Pre-requisites:
Process steps:

Based on the above integration scenarios, an Azure developer can easily navigate the integration implementation and choose between event-driven or on-demand loads based on the business requirement (a sketch of the routing step into Salesforce follows below). We are just getting started with Azure Integration Services, so stay tuned for more in this series.
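To make the "route" step concrete, here is a minimal Python sketch of upserting a transformed SAP B1 record into Salesforce via its REST API, keyed on an external ID field. The instance URL, API version, token, external ID field name (SAP_B1_Id__c), and record values are all assumptions for illustration.

```python
# A minimal sketch: upsert an Account into Salesforce by external ID.
# PATCH to /sobjects/{type}/{externalIdField}/{value} creates the record if
# the external ID is new, or updates it if it already exists.
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder instance URL
TOKEN = "<oauth-access-token>"                        # placeholder OAuth token

resp = requests.patch(
    f"{INSTANCE}/services/data/v58.0/sobjects/Account/SAP_B1_Id__c/C20000",
    headers={"Authorization": f"Bearer {TOKEN}",
             "Content-Type": "application/json"},
    json={"Name": "Maxi-Teq", "Phone": "+1 555 0100"})
resp.raise_for_status()
print(resp.status_code)  # 201 if created, 204 if updated
```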
