Category Archives: Blog
Optimizing Project Impact: Continuous Monitoring of Client System Utilization for Enhanced Value Delivery using Business Central
Introduction: It is crucial for the management team to track the client's utilization of the system as a key metric for assessing the project's success and the value it brings to the client. To facilitate this monitoring, I have developed a utility that automatically generates and sends reports to the management team, detailing the number of records created in specified tables. For example, during the initial master data upload phase, 1,500 records were added to the Customer table. Over the following months, this figure increased to 1,750 and then to 1,950. Such a trend signifies that the client is utilizing the system in line with expectations.

Pre-requisites:

Configuration:

Usage Statistics Setup Page: This page contains two main fields: Collect Statistics (a Boolean) and Mail Recipients (the email IDs to which the report has to be sent). One more field, the primary key, is added to the table but not to the page; it is of datatype Code, and because a Code field defaults to blank, the setup table holds a single record, which gives the Usage Statistics Setup page the header layout we need. Regex is used for pattern matching, and email validation is applied to Mail Recipients. The user can enter multiple email addresses in this format, e.g. abc@gmail.com;xyz@gmail.com. Processing can continue only if Collect Statistics is enabled and at least one email ID is present in Mail Recipients.

Usage Statistics Configuration Page: This page contains the configuration from which the data will be read and the report generated. List and card pages are also created with the same fields. The "AllObjWithCaption" table is used for viewing the details of all objects in the system, and a lookup trigger fetches the table number and table name at runtime. After the specific table number and table name are fetched, records are filtered according to the Filter Field 1 value, and likewise for the Filter Field 2 value (Filter Field 2 is added according to the requirements). The FieldsDisplay procedure is used to retrieve the field number and field name for the selected table. Any field can be selected in Filter Field 1 Name, and Filter Field 1 Value must be set according to that field.

Create Statistics Report: This report is designed to automate the generation of usage statistics based on the configurations specified in the "Usage Statistics Configuration" table. The report is flagged as "ProcessingOnly," indicating it is intended for background processing rather than direct user interaction. The dataset within the report contains a data item with an "OnAfterGetRecord" trigger, which executes after each record is retrieved. This trigger is responsible for processing each configuration record, applying the filters, and updating or inserting records in the "Usage Statistics" table. Additionally, the report features an "OnInitReport" trigger that checks the "Usage Statistics Setup" table to ensure that statistics collection is enabled. If this condition is not met, an error message is displayed and the report exits. In essence, this report streamlines the creation of usage statistics in Business Central, adhering to the specified configurations and ensuring the necessary setup conditions are satisfied before processing.

Usage Statistics Page: The data generated after filtering and counting the records is displayed in this table. A list page is also created with the same fields. Record Count holds the number of records that satisfy the filter condition.
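As a rough illustration of the counting step described above, a minimal AL sketch might look like the following. The "Usage Statistics Configuration" table name follows this post, but the field names (Table No., Filter Field 1 No., Filter Field 1 Value) and the procedure itself are assumptions for illustration, not the exact code used in the utility:

```al
// Hypothetical sketch: open the configured table at runtime, apply the
// configured field filter, and count the matching records, as the
// Create Statistics Report would do in its OnAfterGetRecord trigger.
procedure CountFilteredRecords(ConfigRec: Record "Usage Statistics Configuration"): Integer
var
    RecRef: RecordRef;
    FldRef: FieldRef;
begin
    // Open the table selected on the configuration line
    RecRef.Open(ConfigRec."Table No.");

    // Apply Filter Field 1 (Filter Field 2 would be applied the same way)
    if ConfigRec."Filter Field 1 No." <> 0 then begin
        FldRef := RecRef.Field(ConfigRec."Filter Field 1 No.");
        FldRef.SetFilter(ConfigRec."Filter Field 1 Value");
    end;

    // This count is what gets written to Record Count in the Usage Statistics table
    exit(RecRef.Count());
end;
```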
Send Statistics Report: This report is designed to send usage statistics via email. Let's break down the code: The report begins with specifications such as its application area, caption, and usage category. Notably, it is marked as a "ProcessingOnly" report, indicating it is intended for background processing rather than direct user interaction. The OnInitReport trigger executes when the report is initialized. It checks the settings in the "Usage Statistics Setup" table, ensuring that statistics collection is enabled (Collect Statistics) and that valid mail recipients are specified (Mail Recipients). If these conditions are not met, error messages are displayed and the report exits (a minimal sketch of this check is shown at the end of this post). The main functionality is in the OnPostReport trigger, which executes after the report is processed. It performs the following steps:

Email Excel Sheet: The report run on the 22nd produces the output after applying the filters; when it runs again on the 24th with the same filters, it shows the same data, because no new records were added to the respective table in between.

Conclusion: In conclusion, the automated reporting tool plays a pivotal role in monitoring client system utilization, revealing encouraging trends such as the gradual increase in customer records. These insights affirm the project's success and underline its value to the client, reinforcing our commitment to proactive monitoring for continual optimization and client satisfaction.
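For reference, here is a minimal sketch of the OnInitReport validation described above, assuming the "Usage Statistics Setup" table with the Collect Statistics and Mail Recipients fields mentioned in this post; the error texts are illustrative:

```al
// Hypothetical sketch of the setup checks performed before the report runs.
trigger OnInitReport()
var
    UsageStatisticsSetup: Record "Usage Statistics Setup";
begin
    UsageStatisticsSetup.Get();

    // Processing is allowed only when statistics collection is switched on
    if not UsageStatisticsSetup."Collect Statistics" then
        Error('Statistics collection is not enabled in the Usage Statistics Setup.');

    // At least one recipient must be configured before the report can be sent
    if UsageStatisticsSetup."Mail Recipients" = '' then
        Error('Specify at least one email ID in Mail Recipients before sending the report.');
end;
```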
Salesforce Integration using Azure Integration Services
This blog shows detailed information for an integration from SAP B1 to Salesforce. The AIS interface is intended to extract, transform, and route the data from SAP B1 to Salesforce. The integration steps would be the same for different entities.

Event Scenario
Pre-Requisites:
Process Steps:

On Demand Load Scenario
Pre-Requisites:
Process Steps:

Based on the above integration scenarios, an Azure developer can easily navigate the integration implementation and choose between Event Driven and On-Demand based on the business requirement. We are just getting started with Azure Integration Services, so stay tuned for more in this series.
Quarantine Management Process in Dynamics 365 FnO Supply Chain Management
Hello Everyone!!! In this blog I will explain the Quarantine Management process in Dynamics 365 Supply Chain Management. Previously, the product was first received in the main warehouse and then moved to the Quarantine Warehouse for further inspection because it failed to clear the quality test. In this blog I will explain a scenario where the products are received directly in the Quarantine Warehouse for inspection; after the final inspection it turns out that all the products fail, so a few products will be scrapped and the rest will be returned to the vendor.

What is Quarantine Management? The quarantine management process in D365 aims to effectively manage and control quality issues, ensuring that only items meeting the required standards are released for use or distribution. It is a systematic approach to handling items that are suspected of having quality issues or non-conformities. When an item is flagged for quarantine, it is physically segregated from the regular inventory and moved to a designated quarantine location. The quarantined item undergoes thorough inspection and testing to assess the extent of the quality issue. Based on the evaluation, decisions are made regarding the item's disposition, which may include repair, return to the vendor, or scrapping. If the item is repairable, the necessary actions are taken to rectify the identified issues. In cases where the vendor is responsible, the item can be returned for resolution. Once the necessary actions are completed and the item meets the required quality standards, it is released from quarantine and reintegrated into the regular inventory for use or distribution. This process ensures that only items meeting quality criteria are allowed for further processing, while mitigating the risk of non-conforming products entering the supply chain.

Let's start with the setups first:

Step 1: Map the Quarantine Warehouse to the Main Warehouse. The path is: Go to Inventory Management > Setup > Inventory Breakdown > Warehouses.

Step 2: The next step is to enable the "Quarantine Management" parameter in the Item Model Group.

Quarantine Management Process: The below Purchase Order CM-PO-0000137 has been received in the Quarantine Warehouse as per the setup; hence a Quarantine Order CM-0000142 with 9 quantities has been created. Now, if I go to the Quarantine Orders page, I can see that a new Quarantine Order has been created. The below screenshot shows the transactions that took place after the Product Receipt was posted. After inspection it was found that 5 quantities were damaged and 4 quantities were to be returned to the vendor, which means that all 9 quantities failed the quality inspection. So, what I will do is scrap the 5 quantities and return the 4 quantities to the vendor by creating a Purchase Return Order.

Now I will split the Quarantine Order CM-0000142 into 2 separate Quarantine Orders. For that I will use the Split function, which is available at the top of the screen under the Functions tab. In the above image you can see that I have split the Quarantine Order CM-0000142 into 2 different Quarantine Orders, that is CM-0000143 and CM-0000144, for further processing. Now let's begin the further processing.

1. Scrap the damaged quantities: To scrap this order, I will use the Scrap functionality. Click on Functions and then select Scrap. Here, you can see that the Quarantine Order CM-0000143 has been ended as it has been scrapped.
2. Return the items back to the vendor: Now, in order to return the items to the vendor, I will first invoice the existing Purchase Order, which is CM-PO-0000137. Then I will receive the items from the Quarantine warehouse into the W3 warehouse. Below you can see that I have invoiced the purchase order CM-PO-0000137. Now let's receive the items from the Quarantine Warehouse into the Central Warehouse; for that I will use the Arrival Journal. As you can see, a new Arrival Journal has been created. Validate and post the journal.

Vendor Return Process: After posting the journal, create a new Purchase Order with the type Return Order. For that, go to Procurement and Sourcing > Purchase Orders > All Purchase Orders. Then select the vendor to whom the items will be returned and select the Purchase Type as Return Order. Enter the site and warehouse from which the items will be returned to the vendor. Then enter the RMA number provided by the vendor and click OK. Click on the Purchase Order line, select Credit Note, choose the specific invoice, input the quantity as a negative value, and then click OK. In my case the invoice number was 311711, so I will select that and then click OK. Then proceed with the normal purchase order processing, which will be Product Receipt and Invoicing. After invoicing the Return Order, a credit note will be created, which will be deducted from the vendor balance. The below screenshot represents the on-hand list before and after posting the Return Order invoice. In the above screenshot you can see that the on-hand quantity changed from 26 to 22, since 4 quantities have been returned to the vendor.

That's it for this blog. Hope this helps you! Thank You!
Get and Post Method in Business Central
Introduction: HTTP (Hypertext Transfer Protocol) is a protocol used for communication between web servers and clients. When a client sends a request to a server, it is called an HTTP request. The request consists of a request line, headers, and an optional message body. The request line contains the HTTP method, URL, and HTTP version. The headers contain additional information about the request, such as the user agent and content type. When the server receives the request, it sends back an HTTP response. The response consists of a status line, headers, and an optional message body. The status line contains the HTTP version, status code, and reason phrase. The headers contain additional information about the response, such as the content type and server type.

Pre-requisites:

Configuration: In this example we use a table extension of the Item table (table ID 27) and extend the "Item List" page by adding a button after the "History" action. The button is defined as an action with the following properties: ApplicationArea, Visible, Image, and the OnAction() trigger. When the button is triggered, it sends an HTTP request using the HTTPRequestFact() and HTTPRequestFactPost() functions.

The HttpClient.Get method is used to send an HTTP GET request to retrieve data from a remote server. The syntax for the HttpClient.Get method is as follows: [Ok := ] HttpClient.Get(Path: Text, var Response: HttpResponseMessage). Here, Path is the path of the resource you want to retrieve, and Response is the response received from the remote endpoint. The HttpClient.Get method returns a Boolean value indicating whether the operation was successful. If the operation was successful, the Response parameter contains the response received from the remote endpoint. This code sends an HTTP GET request to the URL https://ene6g2z8lzf2g.x.pipedream.net/ using the HttpClient.Get method. The response received from the remote endpoint is stored in the HttpResponseMessage variable. If the operation was successful, the response is read as a string and stored in the Response variable. Finally, the Message function is called to display the response on the screen.
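A minimal sketch of that GET call, assuming a local procedure named HTTPRequestFact as mentioned above and the request-bin URL from this example (the exact implementation in your extension may differ):

```al
// Hypothetical sketch: send a GET request and show the response body.
local procedure HTTPRequestFact()
var
    Client: HttpClient;
    ResponseMessage: HttpResponseMessage;
    Response: Text;
begin
    // Send the GET request and make sure the call itself succeeded
    if not Client.Get('https://ene6g2z8lzf2g.x.pipedream.net/', ResponseMessage) then
        Error('The request to the remote endpoint failed.');

    // Read the body as text and display it, as described in the post
    ResponseMessage.Content.ReadAs(Response);
    Message(Response);
end;
```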
The HttpClient.Post method is used to send an HTTP POST request to a remote server. The syntax for the HttpClient.Post method is as follows: [Ok := ] HttpClient.Post(Path: Text, Content: HttpContent, var Response: HttpResponseMessage). Here, Path is the path of the resource you want to post data to, Content is the HTTP request content sent to the server, and Response is the response received from the remote endpoint. The HttpClient.Post method returns a Boolean value indicating whether the operation was successful. If the operation was successful, the Response parameter contains the response received from the remote endpoint.

This code sends an HTTP POST request to the URL https://enni1en7jg0n.x.pipedream.net/ with a JSON payload constructed from the records on the current page. The code first sets the filter for the current page on the "Item" record. It then loops through all the records in the "Item" record and creates a new JSON object for each record. The JSON objects are added to a JSON array. A new JSON object is created for the JSON array, and the JSON array is added to that JSON object. The JSON object is then written to a text variable. A new HTTP content object is created for the JSON payload, and the JSON payload is written to the HTTP content object. The HTTP POST request is then sent to the remote endpoint using the HttpClient.Post method. If the operation was successful, the response is read as a string and stored in the Response variable. Finally, the Message function is called to display the response on the screen.

The RequestWithBasicAuthenticaton procedure takes three parameters: Url, Username, and Password (a sketch of this procedure is shown at the end of this post). It creates an HTTP request message with the given URL and sets the method to GET. It then adds the authorization header by encoding the username and password in Base64 and adding it to the request headers. In the subsequent if condition, it sends an HTTP GET request to the URL specified in the Url parameter with basic authentication using the HttpClient.Send method. The response received from the remote endpoint is stored in the ResponseMessage variable. If the operation was successful, the response is read as a string and stored in the Response variable. Finally, the Message function is called to display the response on the screen.

Output: For multiple records,

Conclusion: In conclusion, this blog has provided valuable insights into the fundamental concepts of HTTP GET and POST methods, along with an exploration of Basic Authentication in the discussed code. Readers now have a clearer understanding of how these key HTTP techniques, combined with authorization mechanisms, play pivotal roles in web communication, facilitating data retrieval, creation, and ensuring secure access.
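For completeness, here is a hedged sketch of the RequestWithBasicAuthenticaton procedure described above, assuming the standard "Base64 Convert" codeunit from the System Application; variable names and error handling are illustrative rather than the author's exact code:

```al
// Hypothetical sketch: GET request with a Base64-encoded Basic Authorization header.
local procedure RequestWithBasicAuthenticaton(Url: Text; Username: Text; Password: Text)
var
    Client: HttpClient;
    RequestMessage: HttpRequestMessage;
    ResponseMessage: HttpResponseMessage;
    RequestHeaders: HttpHeaders;
    Base64Convert: Codeunit "Base64 Convert";
    Response: Text;
begin
    RequestMessage.SetRequestUri(Url);
    RequestMessage.Method := 'GET';

    // Encode "username:password" in Base64 and add it as the Authorization header
    RequestMessage.GetHeaders(RequestHeaders);
    RequestHeaders.Add('Authorization',
        'Basic ' + Base64Convert.ToBase64(Username + ':' + Password));

    // Send the request and show the response body if the call succeeded
    if Client.Send(RequestMessage, ResponseMessage) then begin
        ResponseMessage.Content.ReadAs(Response);
        Message(Response);
    end;
end;
```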
Creating a Custom Connector for retrieving a single record from Shopify
Hello everyone! Shopify has become a powerful platform for companies to build their online presence and run their operations in the e-commerce space. Effectively maintaining consumer data is one of the most important aspects of running an online shop. In this article, we'll show you how to build a custom connector that uses a Postman collection to extract a single customer's information from Shopify. You can easily get customer information using this simplified procedure for better customer service and data analysis. Let's get started!

Click this link 'https://documenter.getpostman.com/view/3800273/SWLk55pF' to get the Postman collection. Click the 'Run in Postman' button, select 'Postman Desktop App' to import, and select 'Open App'. Make sure Postman is installed on your device. Once this is imported, you will see the pre-built libraries. Export the Postman collection and select 'Customer'. Import the exported collection, which is in the swagger.json format. Add an image as the connector icon and a description as shown below. Add the host name and move to the next section, 'Security'. Select a type of authentication; in this case, I selected Basic Authentication, which involves a username and password. Provide appropriate parameter labels. We can see the pre-defined actions because we imported the Postman collection. We will be retrieving the details of a single customer, so select 'RetrievesASingleCustomer'. Enter the verb and the GET request URL to retrieve the single customer's details. Verb: GET, URL: https://{apikey}:{password}@{hostname}/admin/api/2020-10/customers/{customer_id}.json. Paste it like this: https://8b3340489beedff0d96efe51699060e2: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@cassrodshop.myshopify.com/admin/api/2023-01/customers/{customer_id}.json. Here, {customer_id} is the input parameter, so I have edited and pre-defined the Customer ID value. Click on 'Create Custom Connector'.

Add a new connection to test the connector. Enter the API Key as the username and the Admin API Access Token as the password; you will find these in your Shopify environment's developer options. A connection is displayed here; if it doesn't reflect, refresh the connection. Add the Customer ID here and click 'Test Operation'. Voila, the response is successfully generated. Hope this helps!
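If you want to sanity-check the same Shopify endpoint outside the custom connector, a direct HTTP GET against the URL pattern above also works. The following is a minimal, hypothetical AL sketch; the procedure name, placeholder credentials, and shop host name are assumptions, while the customers endpoint, API version, and Basic Authentication scheme follow what is used in this post:

```al
// Hypothetical sketch: call the Shopify "retrieve a single customer" endpoint.
// Replace the parameters with your own API key, Admin API access token,
// shop host name, and customer ID before running.
procedure GetShopifyCustomer(ApiKey: Text; AccessToken: Text; HostName: Text; CustomerId: Text)
var
    Client: HttpClient;
    ResponseMessage: HttpResponseMessage;
    Base64Convert: Codeunit "Base64 Convert";
    Url: Text;
    ResponseText: Text;
begin
    // Same resource as the connector action, without credentials embedded in the URL
    Url := StrSubstNo('https://%1/admin/api/2023-01/customers/%2.json', HostName, CustomerId);

    // Basic authentication: API key as username, Admin API access token as password
    Client.DefaultRequestHeaders.Add('Authorization',
        'Basic ' + Base64Convert.ToBase64(ApiKey + ':' + AccessToken));

    if not Client.Get(Url, ResponseMessage) then
        Error('The request to Shopify failed.');

    ResponseMessage.Content.ReadAs(ResponseText);
    Message(ResponseText);
end;
```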
Streamlining General Ledger Adjustments in Microsoft Dynamics 365 Finance and Operations
In financial management, the accurate recording and reconciliation of transactions within the general ledger stand as paramount tasks. Microsoft Dynamics 365 Finance and Operations, a comprehensive enterprise resource planning (ERP) solution, offers a feature known as General Ledger Adjustments. This feature, often overlooked or misunderstood, plays a pivotal role in addressing critical challenges associated with ledger adjustment entries caused by environmental issues or data corruption.

The recent discovery of the General Ledger Adjustments feature in the feature management workspace prompted a deeper exploration. Upon enabling this feature, it became apparent that Microsoft aims to streamline and systematize a fundamental problem-solving process. Reflecting on past experiences, various challenges related to ledger adjustments surfaced: Traditionally, when encountering such issues, the path forward often involved engaging Microsoft support to delve deeper into the problem. While Microsoft proficiently identifies the root cause and initiates steps to address it in future product updates, the immediate concern revolves around rectifying the existing corrupted or missing data. The suggested workaround typically involves manual journal entries. However, this process traditionally unfolds through email communications, leading to potential discrepancies or misunderstandings regarding the specifics, such as the nature of the journal entries, relevant dates, or designated journals. The introduction of the General Ledger Adjustments feature within Microsoft Dynamics 365 Finance and Operations appears as a structured attempt to address these challenges within the system itself, streamlining and formalizing a process that was previously conducted through ad hoc communications.

Enabling General Ledger Adjustments: In this section, Microsoft Dynamics 365 Finance and Operations provides an interface where recommended journal entries for data correction are populated based on identified discrepancies or issues within the system. Users are presented with suggested adjustments and, based on their discretion and analysis, can decide to create a journal entry directly from this interface. Note: Creating a journal entry from this interface generates a "Daily" type of journal within the General Ledger, facilitating a more organized and systematic approach to handling the necessary corrections or adjustments.

In conclusion, this feature signifies a proactive step towards enhancing the efficiency and accuracy of managing ledger adjustments within the Microsoft Dynamics 365 Finance and Operations ecosystem.
D365 F&O + Out-of-the-Box WMS vs D365 F&O + SATO WMS for manufacturing, retail and distribution businesses
In the dynamic landscape of manufacturing, retail, and distribution, selecting the right Warehouse Management System (WMS) holds paramount importance. It is a pivotal decision that influences operational efficiency by saving time and labor in tracking processes, reducing inventory loss, preventing lost sales opportunities, and improving productivity and stock control, resulting in greater customer satisfaction and, ultimately, business growth. Among the plethora of available options, two prominent choices stand out: the D365 F&O out-of-the-box WMS solution and SATO Global WMS.

The out-of-the-box WMS solution in D365 Finance and Operations offers standardized features designed to suit general warehouse management needs. These solutions are pre-configured and cater to businesses that prioritize simplicity and adherence to standard industry practices. Advantages of D365 F&O Out-of-the-Box WMS:

Conversely, SATO WMS along with the D365 F&O ERP: SATO WMS is third-party software tailored for customization and efficiency. SATO WMS is renowned for its adaptability and tailored functionalities. It offers a high level of customization, empowering businesses to mold the system according to their unique requirements. This flexibility facilitates seamless integration with existing processes, ensuring a smoother transition and optimized operations. Key Features of SATO WMS:

The decision between SATO WMS and the out-of-the-box WMS hinges on understanding the specific needs and priorities of the customer. If a customer prefers standardized processes and does not require extensive customization, the out-of-the-box WMS might be more suitable, offering a cost-effective and efficient solution. However, for businesses seeking a highly customizable solution to align with their unique operational workflows, SATO WMS stands out as the preferred choice, providing adaptability and tailored functionalities that enhance efficiency and productivity.

Conclusion: Selecting the right WMS is a critical decision that directly impacts operational efficiency and business growth in the manufacturing, retail, and distribution sectors. While SATO WMS offers unparalleled customization and flexibility, out-of-the-box solutions excel in providing standardized, cost-effective options. By comprehensively evaluating factors such as customization needs, operational workflows, scalability, and budget constraints, businesses can make an informed decision that maximizes efficiency and drives growth.
Workflow Email Alert Configuration in D365 Finance and Operations
In today's fast-paced business environment, efficient workflow management is essential for organizations to stay competitive. Workflow email alerts help organizations stay on top of important business processes by automatically notifying designated individuals when certain tasks are completed or when specific conditions are met. In this article, we will provide a step-by-step guide for setting up workflow email alerts in D365 FO.

Before getting started, it is important to ensure that email parameters are configured correctly and that you are able to receive test emails from D365 in your inbox. To check this, navigate to System Administration > Setup > Email > Email Parameters.
1. In the configuration, enable the email provider as per your requirement.
2. If you are using the SMTP setting, enter the outgoing server information and SMTP port, and enable the SSL/TLS required option.

Set up an email address for the user who will receive alerts and enable workflow email alerts for that user. To accomplish this, go to the System Administration section, select Users, and then click on Users. Highlight and click on the user you wish to enable email alerts for. If no email address is attached, enter the user's email address. To enable email notifications, go to 'User options' on the ribbon, then click on 'Workflow' on the left side and set the option 'Send notifications in email' to Yes.

Create an email template or email message. Templates are generic and can be assigned while configuring workflows. Navigate to System Administration > Setup > Email > System email templates. Here you can create a template, such as "Workflow emails."

Assign your email template to the specific workflow. To configure the workflow email alerts, navigate to the workflow for which you want to send email notifications, open that workflow, access the basic settings, and choose the relevant email template from the dropdown menu. Configure the necessary notifications for the particular task, control, or approval while setting up the workflow steps. Here, you will specify the notification content that will be sent to the user in the email.

Configure the batch job to distribute the email notifications. Navigate to System administration > Periodic tasks > Email processing > Email distributor batch. Since we are adding a batch job, it can be scheduled to run automatically at specified intervals. To schedule the autorun, adjust the 'Recurrence' settings as per the requirements.

Monitor the email sending status. To access the Email Sending Status page, go to System administration > Periodic tasks > Email processing > Batch email sending status. Here, you can monitor successfully sent, pending, or failed email messages.

In conclusion, configuring workflow email notification alerts can be a simple yet effective way to stay on top of important business processes. By following the above steps, you can set up workflow email alerts in D365 FO and start reaping the benefits of streamlined workflow management. Thank you for reading my blog!
Dual Write (DW) – Challenges & Recommendations
Introduction – A few years ago, Dual Write (DW) was introduced with the objective of providing reliable integration between Dynamics CRM and Dynamics Finance & SCM. The intent is in the right direction, but the execution of this vision has not been reliable. In this blog we talk about some of the challenges we have faced and our recommendations for reliable integration.

Let's look at some of the challenges we have faced –
Initial Sync Limitations –
Unavailability of the solution in a few regions –
Enabling Solution issue and missing File –
Billing Rule Error –
Table Version mismatch –
Other Issues –

Recommendations for reliable integrations –

We hope you found this article useful, and if you would like to discuss anything you can reach out to us at transform@cloudfronts.com
Opportunity to Sales cycle – Part 2
Introduction: A Step-by-Step Guide to Creating Opportunities and Processing Sales in Business Central

Pre-requisites for creating Sales Opportunities:

Create Sales Opportunities: You can create opportunities from the Opportunity List page. Typically, opportunities are created from a specific contact or salesperson.

Globally search "Salesperson" and select the related link. Select the salesperson from the list for whom you want to create an opportunity. On the salesperson card page, select the Opportunities action; the selected salesperson's Opportunities page will open, and the user can create a new one by clicking the New action. If the opportunity is created through a salesperson, the salesperson code is populated automatically.

Globally search "Contact" and select the related link. The Contacts list page will open, and the user can select the contact for which the opportunity has to be created. Click Home –> Create Opportunity for a new opportunity. Click Contact –> Open Opportunity to view the existing opportunities of the selected contact. If the opportunity is created through a contact, the contact name is populated automatically.

No. – This field is auto-generated based on the number series that is set up.
Description – Description of the opportunity.
Contact No. – The user can select an existing contact or create a new one.
Contact Name – Auto-generated based on the contact number.
Phone/Mobile/Email – Auto-generated from the contact card page.
Contact Company Name – This field is auto-generated from the contact card page.
Salesperson Code – If the opportunity is created through a salesperson, this field is auto-populated; if not, the user can manually select the salesperson.
Campaign No. – The user can select a specific campaign to link with the opportunity.
Priority – The default priority is set as Normal; the other priorities are Low and High.
Sales Cycle Code – This is a setup. (To know more, refer to part 1 of this blog.)
Status – The status field is updated automatically.
Closed – Specifies if the opportunity is closed.
Creation Date – Opportunity creation date.
Date Closed – Specifies the date the opportunity was closed.
Segment No. – The user can link a segment to the opportunity (if any).

Sales cycle stages: To start the sales cycle, the user can click "Activate first stage". To move an opportunity through the sales cycle stages, the sales cycle stage is updated automatically. Fill in the rest of the details as necessary.

To Close an Opportunity: When the negotiations are finished, you can close the opportunity. When closing an opportunity, you can specify whether it was won or lost, as well as why it was closed. To specify a reason, you must set up closed opportunity codes.

To Create a Sales Quote

To Create a Sales Order

To Delete Opportunities: After you have deleted an opportunity, it is removed from the Opportunity List page.

Conclusion: Microsoft Dynamics 365 Business Central provides a robust framework for creating opportunities and processing sales seamlessly. By following this step-by-step guide, you can harness the full potential of Business Central to optimize your sales processes, enhance customer relationships, and drive business growth. Hope this helps!