Latest Microsoft Dynamics 365 Blogs | CloudFronts - Page 23

Optimizing Data Management in Business Central using Retention Policies

Introduction

Data retention policies dictate which data should be stored or archived, where it should be stored, and for how long. When the retention period for a data set expires, the data can either be deleted or moved to secondary or tertiary storage as historical data. This approach helps maintain cleaner primary storage and ensures the organization remains compliant with data management regulations. In this blog, we'll be covering how retention policies are defined and applied in Business Central, and how custom tables can be added to them.

Pre-requisites

– Business Central environment

References

– Data Retention Policy
– Clean up Data with Retention Policy – Microsoft Learn

Details

In Business Central, we can define retention policies based on two main parameters – the table that is to be monitored and the retention period.

Retention Period

Retention periods specify how long data is kept in tables under a retention policy, and they determine how often data is deleted. Retention periods can be as long or as short as needed.

Applying a retention policy

Retention policies can be applied automatically or manually. For automatic application, enable the policy, which creates a job queue entry that applies it according to the defined retention period. By default, the job queue entry applies policies daily at 02:00, but this timing can be adjusted, preferably to non-business hours. All retention policies use the same job queue entry. For manual application, use the "Apply Manually" action on the Retention Policies page and turn on the "Manual" toggle to prevent the job queue entry from applying the policy automatically.

We can also exclude or include certain records based on filters. Deselect "Apply to all records" to reveal a new tab where record filters can be defined. Each such filter group can have its own retention period.

By default, only a few selected tables are shown in the table selection on the Retention Policy page. If we want to include a custom table in this list, a small customization is needed. Note that you cannot add tables that belong to other modules – for example, "Purchase Header" cannot be added to this list through your own extension; only Microsoft can register its own tables.

So here I've created a small sample table, along with a codeunit of the Install subtype in which I add my custom table to the allowed tables list (a minimal sketch of this codeunit is shown at the end of this post). After deploying, I can see my custom table in the list.

Developers also have the option to set Mandatory or Default filters on custom tables. Mandatory filters cannot be removed, while default filters can be. When creating such a filter, setting the "Mandatory" parameter to true makes it mandatory; otherwise it is a default filter. When I add the table ID on the Retention Policy page, the corresponding entry is created automatically, and if I then try to remove the filters, I get an error.

Conclusion

Thus, we saw how we can leverage retention policies in Business Central to reduce capacity wastage without heavy customizations.
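For reference, here is a minimal sketch of the kind of install codeunit described above. It assumes a hypothetical custom table called "My Log Entry"; the object IDs and names are placeholders, and the registration relies on the system codeunit "Reten. Pol. Allowed Tables" that ships with the Business Central base application.

```al
codeunit 50110 "My Reten. Pol. Install"
{
    // Runs once per company when the extension is installed.
    // "My Log Entry" and the object IDs are illustrative placeholders.
    Subtype = Install;

    trigger OnInstallAppPerCompany()
    var
        RetenPolAllowedTables: Codeunit "Reten. Pol. Allowed Tables";
        MyLogEntry: Record "My Log Entry";
    begin
        // Register the custom table so it becomes selectable on the Retention Policy page.
        // The second argument is the date field the retention period is evaluated against.
        RetenPolAllowedTables.AddAllowedTable(Database::"My Log Entry", MyLogEntry.FieldNo(SystemCreatedAt));
    end;
}
```

Overloads of AddAllowedTable also accept pre-defined table filters, which is where the mandatory and default filter behaviour described above comes from.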


Understanding OData.Etag in Postman and Related Features

Introduction

Open Data Protocol (OData) is a web protocol for querying and updating data. It simplifies data exchange between clients and servers, allowing for easy integration with RESTful APIs. One important feature of OData is the use of ETags (Entity Tags), which are part of the HTTP protocol and help manage the state of resources. An ETag serves as a version identifier for a resource. When a client retrieves a resource, the server sends an ETag in the response. The client can then use this ETag in subsequent requests to ensure that it is working with the most current version of that resource.

What is OData.ETag?

In Postman, OData.ETag refers specifically to the ETag values used in OData responses. These tags help maintain data integrity during updates. When a client attempts to update a resource, it can include the ETag in the request headers. If the ETag matches the current version on the server, the update proceeds. If not, the server rejects the request, preventing unintended data overwrites.

Using OData.ETag in Postman

Fetching an ETag: When you send a GET request to an OData endpoint, look for the ETag header in the response. For example:

GET https://api.example.com/odata/products

The response might look like this:

HTTP/1.1 200 OK
ETag: W/"123456789"

Updating a Resource with an ETag: When you need to update the resource, include the ETag in the If-Match header of your PUT or PATCH request:

PATCH https://api.example.com/odata/products(1)
If-Match: W/"123456789"
Content-Type: application/json

{ "name": "Updated Product Name" }

If the ETag matches, the update occurs; otherwise, you'll receive a 412 Precondition Failed response.

Related Features in Postman

Conditional Requests: Beyond OData, ETags are useful in REST APIs for conditional requests. You can use If-None-Match to check whether a resource has changed before downloading it again, saving bandwidth and time.

CORS Preflight Requests: When working with cross-origin requests, browsers may perform preflight checks using OPTIONS requests. Understanding ETags can help in managing these requests effectively, ensuring your API can handle them smoothly.

Caching Strategies: Implementing caching with ETags can enhance performance. Postman can simulate caching behavior, allowing you to test how your API behaves when dealing with cached responses.

Error Handling: Testing how your API handles errors, such as a mismatched ETag, is crucial for robustness. Postman's test scripts can validate error responses and ensure that your API behaves as expected.

Conclusion

Understanding OData.ETag in Postman is essential for developers working with RESTful APIs, especially in scenarios where data integrity is critical. By leveraging ETags, you can ensure safe and efficient data updates, manage caching, and improve your overall API interactions.
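Outside Postman, the same conditional-update pattern can be reproduced in code. Below is a minimal AL sketch (Business Central's extension language) that sends a PATCH with an If-Match header and reacts to a 412 response; the endpoint and payload are the illustrative values from the example above, not a real service.

```al
codeunit 50120 "Conditional Update Demo"
{
    procedure PatchProduct(ETag: Text)
    var
        Client: HttpClient;
        Request: HttpRequestMessage;
        Response: HttpResponseMessage;
        Content: HttpContent;
        ContentHeaders: HttpHeaders;
        RequestHeaders: HttpHeaders;
    begin
        // Request body – the same payload as the Postman example above.
        Content.WriteFrom('{"name":"Updated Product Name"}');
        Content.GetHeaders(ContentHeaders);
        ContentHeaders.Clear();
        ContentHeaders.Add('Content-Type', 'application/json');

        Request.Method := 'PATCH';
        Request.SetRequestUri('https://api.example.com/odata/products(1)');
        Request.Content := Content;

        // Send back the ETag captured from the earlier GET; the server only applies
        // the update if the stored version still matches.
        Request.GetHeaders(RequestHeaders);
        RequestHeaders.Add('If-Match', ETag);

        if not Client.Send(Request, Response) then
            Error('The request could not be sent.');

        if Response.HttpStatusCode = 412 then
            Error('Precondition failed: the record changed on the server after it was read.');
    end;
}
```

A caller would pass in the ETag value it read from the earlier GET response, mirroring the Postman flow of fetch first, then update with If-Match.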


Performance Optimization Techniques in Power BI

Posted On November 7, 2024 by Deepak Chauhan

Introduction

Building efficient Power BI reports can be challenging, especially when working with large datasets. One common issue Power BI users encounter is the "stack overflow" error, which can disrupt the report-building process. In this blog, I will share some performance optimization techniques that you can use when building Power BI reports.

When using Power Query or importing data, you might have encountered this error:

"Expression.Error: Evaluation resulted in a stack overflow and cannot continue."

This error occurs when a large amount of data is being imported or there is not enough memory available for Power BI to complete the operation. The issue can be resolved by increasing the memory and the number of CPU cores that Power BI can use during queries and evaluations.

There are two settings to keep in mind. By default, the maximum number of simultaneous evaluations is equal to the number of logical CPU cores on the machine, and the maximum memory used per simultaneous evaluation is 432 MB. Personally, I have kept these values between the defaults and the maximum, depending on my requirements and system.

Also, here is a link to Microsoft's recommendations for managing Power BI workload and evaluation configurations: https://learn.microsoft.com/en-us/power-bi/create-reports/desktop-evaluation-configuration

Conclusion

Optimizing performance in Power BI is crucial for handling large datasets and preventing issues like the "stack overflow" error. By adjusting the settings for simultaneous evaluations and memory allocation, you can significantly improve report processing and responsiveness. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Advance warehouse management – Wave Templates in Microsoft D365 F&O – Part 6 

Introduction

In this blog we will learn about the basic setups required for the Advanced Warehouse Management process. These setups may vary depending on the business scenario. For Wave Templates to work in an advanced warehouse scenario, there are some prerequisites that we need to complete first. The following are the setups that we need to configure.

Wave Templates play a significant role in advanced warehouses. They are used for the shipment of goods for Sales Orders, Transfer Order shipments, or outbound shipment orders, and they are also used for Production Orders and Kanban orders.

For my current scenario, I will create a Wave Template for a Sales Order.

– Select the Wave template type as "Shipping", so that a wave is created when the Sales Order is created.
– There is an option to automatically create the wave. I have enabled the following setups:
– Automate wave creation
– Process wave at release to warehouse
– Process wave automatically at threshold
– Automate wave release
– The following basic methods are needed to complete Sales Order transactions.
– We also need to run the Regenerate methods step to enable the methods on wave templates. Click Regenerate methods.

Now, the Wave Templates are ready to use in the Advanced Warehouse process.

That's it for this blog!

How to use these Wave Templates in actual transactions will be discussed further on in the blog series.

Next in the blog series: How to set up a Worker in Advanced warehouse management in D365.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Unlocking Seamless Financial Operations: The Power of Stripe Integration with Business Central

Introduction

Integrating Stripe's payment gateway with Microsoft Dynamics 365 Business Central can streamline the payment process for businesses, enabling seamless transactions, real-time invoicing, and efficient payment tracking. By doing this, businesses can automate the process of accepting online payments and manage financial data in a single platform. This blog will guide you through the steps involved in integrating Stripe with Business Central, as well as the benefits and essential considerations.

Why Integrate Stripe with Business Central?

Steps to Integrate Stripe with Business Central

1. Set up a Stripe account
To get started, you'll first need a Stripe account if you don't already have one:
– Sign up for a Stripe account on Stripe's website.
– Complete the necessary account verification steps and configure your business information.

2. Create an extension for Business Central
Business Central allows the integration of third-party payment gateways through extensions. We will customize Business Central to capture payments, refunds, and disputes by using a payment journal.

3. Configure the setup in Business Central
With the help of this customization, we will create a Stripe setup in Business Central where we can define balancing accounts and other important values (an illustrative sketch of such a setup table is shown at the end of this post).

4. Link Stripe with Business Central via an Azure app
We will create an Azure app to capture all transactions with the help of Stripe webhooks. When building Stripe integrations, you might want your applications to receive events as they occur in your Stripe accounts, so that your backend systems can execute actions accordingly. To enable webhook events, you need to register webhook endpoints. After you register them, Stripe can push real-time event data to your application's webhook endpoint when events happen in your Stripe account. Stripe uses HTTPS to send webhook events to your app as a JSON payload that includes an Event object. Receiving webhook events is particularly useful for listening to asynchronous events such as when a customer's bank confirms a payment, a customer disputes a charge, a recurring payment succeeds, or subscription payments are collected.

Benefits of Integration

Conclusion

Integrating Stripe with Microsoft Dynamics 365 Business Central simplifies the payment collection process, streamlines accounting tasks, and improves overall business efficiency. By following the steps above, businesses can easily set up this integration and begin accepting payments through Stripe directly within Business Central. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
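As an illustration of step 3, the sketch below shows the kind of setup table such an extension might hold. All object and field names here are hypothetical assumptions – they are not part of Business Central or of a shipped Stripe connector – but they cover the balancing accounts and payment journal information the integration would need when posting payments, refunds, and disputes.

```al
table 50130 "Stripe Setup"
{
    // Hypothetical setup table for the Stripe integration described above.
    Caption = 'Stripe Setup';
    DataClassification = CustomerContent;

    fields
    {
        field(1; "Primary Key"; Code[10]) { Caption = 'Primary Key'; }
        field(2; "Secret API Key"; Text[250]) { Caption = 'Secret API Key'; }
        field(3; "Webhook Signing Secret"; Text[250]) { Caption = 'Webhook Signing Secret'; }
        // G/L accounts used as balancing accounts when posting Stripe payments and fees.
        field(4; "Payment Bal. Account No."; Code[20]) { Caption = 'Payment Bal. Account No.'; TableRelation = "G/L Account"; }
        field(5; "Fee Bal. Account No."; Code[20]) { Caption = 'Fee Bal. Account No.'; TableRelation = "G/L Account"; }
        // Payment journal that payments, refunds, and disputes are written into.
        field(6; "Payment Jnl. Template Name"; Code[10]) { Caption = 'Payment Jnl. Template Name'; TableRelation = "Gen. Journal Template"; }
        field(7; "Payment Jnl. Batch Name"; Code[10]) { Caption = 'Payment Jnl. Batch Name'; TableRelation = "Gen. Journal Batch".Name where("Journal Template Name" = field("Payment Jnl. Template Name")); }
    }

    keys
    {
        key(PK; "Primary Key") { Clustered = true; }
    }
}
```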


How to Send D365 CRM Emails with Attachments Using Power Automate

Introduction

In this guide, we'll walk through the process of sending emails from D365 CRM with attachments using Power Automate. This step-by-step approach will help you understand how to automate your email communications from CRM with attachments efficiently.

Use-Case

Let's say you're working on a project where you need to send emails from D365 CRM that include attachments. In this example, the document is stored in SharePoint, and its URL is linked within the CRM record. This setup is common where files are centrally stored in SharePoint but need to be easily accessible in CRM for email communication and tracking. However, this approach is versatile – whether you want to attach specific documents, generate them dynamically, or handle a range of file types, it can be adapted to meet your needs.

Why this solution?

The main objective of using D365 emails is the ability to track them against the record, keeping all communication visible in the timeline. Also, manually attaching documents to each email is time-consuming and prone to errors. With Power Automate, you can automate this process, ensuring that every email includes the right attachment without extra steps. This solution not only saves time but also reduces the risk of sending incorrect or outdated files, keeping your communications accurate and efficient.

Implementation – Step by Step

As per my use-case, I have added a column in the Accounts table that holds the SharePoint file URL, which I'll use in Power Automate.

Step 1: Trigger the flow when a flag is marked true to send the email report.
Step 2: Get the file content using the SharePoint path.
Step 3: Create a new 'Email Message' record in Dataverse (Add a new row).
Step 4: Add a new row for 'Attachments' and link it to the email message record. Add the custom value as shown below, add the Base64 of your file content, and add the file name.
Step 5: Send the email.

That's it! Let's test it.

Results

Trigger the flag (as per my use-case). The email record is created with the attachment.

Conclusion

By integrating Power Automate to handle attachments from SharePoint, you streamline your email process, save time, and minimize errors. This solution is especially valuable for cases requiring frequent attachments or centralized file storage, as it keeps communication efficient and files up-to-date. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Advance warehouse management – Work Classes and Work Templates in Microsoft D365 F&O – Part 5 

Introduction

In this blog we will learn about the basic setups required for the Advanced Warehouse Management process. These setups may vary depending on the business scenario. For Work Classes and Work Templates to work in an advanced warehouse scenario, there are some prerequisites that we need to complete first. The following are the setups that we need to configure.

Work Classes and Work Templates play a significant role in advanced warehouses. They are sets of rules used to create the work for Purchase Orders, Sales Orders, Transfer Orders, etc.

For my current scenario, I will create work classes for a Sales Order, Purchase Order, and Transfer Order.

– Enter the Work Class ID and Description.
– Select the work order type from the drop-down menu.
– Here, I have created four work classes. We will use these work classes while creating the work templates.

Work Templates:

Work templates are used to create work when there is a related transaction such as a Purchase Order, Transfer Order, or Sales Order. By selecting this work, a warehouse worker can perform the transaction on the mobile device.

– Click New.
– In the Work order type, select Purchase order.
– Enter the Work template name.
– Enter the Work template description.
– Select the work types "Pick" and "Put".
– Select the previously created Work class ID "PurchOrder".

– Click New.
– In the Work order type, select Sales order.
– Enter the Work template name.
– Enter the Work template description.
– Select the work types "Pick" and "Put".
– Select the previously created Work class ID "SalesOrder".

Now, the Work Classes and Work Templates are ready to use in the Advanced Warehouse process.

That's it for this blog!

How to use these Work Classes and Work Templates in actual transactions will be discussed further on in the blog series.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Dynamics 365 Business Central: Setting Up an Approval Workflow with Flexible Approvers

Introduction

In today's busy work environment, having a smooth approval process is important for keeping things running efficiently. Dynamics 365 Business Central makes it easy to set up workflows that help manage approvals effectively. One great feature is the option to allow approval from "one of" several chosen approvers. This was one of our clients' requirements: it avoids delays and allows different team members to take part in the approval process. By letting any of the designated approvers approve requests, your organization can work faster and collaborate better. In this guide, we'll show you how to create a simple approval workflow in Dynamics 365 Business Central that requires just one approver from a group, making your approval process quicker and more efficient.

Scenario: The purchase order has two approvers, but if any one of them approves, the approval workflow is fulfilled. Hence, if one approver approves, all open approval entries are closed.

1. Workflow user group
As shown below, in the workflow user group, sequence no. 1 has been assigned to both users.

2. Open the workflow to which this user group needs to be assigned.

3. Add the Workflow User Group to the workflow
– Open the response "add record restriction".
– Add the workflow user group as the approver type and select the workflow user group created earlier.

4. Modify the events and conditions
– Remove the "on condition to always".
– Remove the third step.

5. Additional response
To close open approval entries after receiving a single approval, you must edit the second step and add the response "Approve the approval request for the record."
– Click on the response of the 2nd line, "remove record restriction".
– Add the response "Approve the approval request for the record."

Conclusion

In conclusion, setting up an approval workflow with flexible approvers in Dynamics 365 Business Central can significantly enhance an organization's efficiency and responsiveness. By allowing any designated approver to handle requests, one can streamline the approval process and reduce potential delays. This approach not only fosters collaboration among team members but also ensures that important decisions are made quickly. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


How to Send a Customer Statement via Email in Microsoft Dynamics 365 Business Central

Introduction

In today's fast-paced business environment, ensuring timely and accurate communication with your customers is critical. One of the most frequent interactions businesses have with customers is providing them with account statements. Microsoft Dynamics 365 Business Central simplifies this process, allowing users to send customer statements directly via email, streamlining communication and helping businesses maintain positive relationships with their clients.

Steps to achieve the goal:

1. Log into Business Central
Start by logging into your Business Central account. Ensure you have the necessary permissions to access customer information and send reports.

2. Access the Customer List
Once logged in:

3. Select the Document Layout before you send an email to customers.

4. Set up the Email Account (optional if already configured).

5. Open the Statement Report
Within the Customer Card:

6. Set Up the Statement Parameters
Before generating the customer statement:

7. Send the Statement via Email
Once the statement is ready:

8. Review and Send
Once you've reviewed everything:

Conclusion

Sending customer statements via email in Business Central is a straightforward process that enhances customer communication while saving time. With just a few clicks, you can generate, customize, and send statements to your clients, ensuring that they stay informed about their account status. This efficient process helps you maintain accurate financial records, avoid payment delays, and ultimately improve your cash flow. By leveraging Business Central's customer statement feature, you can optimize your accounting workflows and focus more on growing your business. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Posting – Document processing – The remote certificate is invalid according to the validation procedure Error in D365 FNO

Introduction

Encountering errors while working with Sales Orders in Dynamics 365 Finance and Operations (D365FO) can disrupt your workflow, especially in development environments. One common issue is a failure to post the packing slip due to an expired SSL certificate in cloud-hosted environments. SSL certificates in D365FO cloud-hosted setups are valid for one year, after which they need to be renewed for continued security and functionality.

I faced this issue while trying to post the packing slip for a Sales Order, on a Dev environment. To resolve it, follow the process below.

To maintain security, these certificates must be renewed through rotation. Credential rotation is a critical aspect of enterprise-level cybersecurity, and this process can be managed via LCS. To resolve the error, log into the LCS environment.

– Select the Implementation Project and then click on the Full details option.
– Click on the Maintain drop-down button and then select Rotate secrets.
– After that, click on the Rotate SSL certificates option.

This process may take a few minutes to complete and will resolve the issue. After completion, you can see that the status changes to Deployed. The next and final step is to click on the Apply updates option; this will apply all the changes and updates.

Conclusion

Rotating SSL certificates in Dynamics 365 Finance and Operations is essential to maintain security and functionality in cloud-hosted environments. By following these steps in LCS, you can ensure that your environment remains secure and that tasks like posting packing slips proceed smoothly. Regularly checking and updating your SSL certificates will help prevent future disruptions and keep your operations running efficiently. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com

