Category Archives: Blog
How to Use Copilot Chat to Supercharge Productivity in Business Central
Interacting with systems in natural language makes technology more accessible and user-friendly for all employees, regardless of technical skill. It lets users complete tasks quickly by eliminating complex commands, improving productivity and reducing the potential for errors. This streamlined access to information leads to faster decision-making, ultimately helping organizations operate more efficiently.

Are your people finding it difficult to navigate Business Central and access important information quickly? If so, consider incorporating Copilot Chat to ease their suffering! Research indicates that call center operators using AI assistance became 14% more productive, with gains exceeding 30% for less experienced workers; this improvement is attributed to AI capturing and conveying organizational knowledge that helps in problem-solving. Specific tasks can see remarkable speed increases: software engineers using tools like Codex can code up to twice as fast, and writing tasks can be completed 10-20% faster with the aid of large language models. In the retail sector, AI-driven chatbots have been shown to increase customer satisfaction by 30%, demonstrating their effectiveness in enhancing customer interactions. Currently, around 35% of businesses leverage AI technology, a share that is expected to grow significantly as organizations recognize its strategic importance. I am confident that this article will highlight the advantages of incorporating Copilot into your daily activities.

Configuration
In Business Central:
– Search for “Copilot and Capabilities.”
– Select “Chat” and click the “Activate” button.
– Click the “Copilot” button near the top right.
– You’ll be presented with the chat pane.
– You can ask it queries like “Show me all the Customers from India.”
– “How many open Sales Quotes do I have?”
– You can also ask context-specific questions.
– You can also ask for guidance on how to achieve certain things in the system.
In my humble opinion, it is far from perfect, but it is absolutely a step in the right direction. In the coming days, the functionality is surely going to blossom, and navigating to different screens may become something that only power users need to think about.

Conclusion
In conclusion, I believe using Copilot can boost users’ productivity and reduce reliance on partners or other experienced users for resolving minor queries. It also reduces the effort needed to move from one piece of information to another. One thing I would love to see incorporated is data summarization, along with the inclusion of all the fields available on an entity in Copilot’s database. If you need further assistance, feel free to reach out to CloudFronts for practical solutions that can help you develop a more effective service request management system. Taking action now will lead to better customer satisfaction and smoother operations for your business.
Understanding Purchase & trade agreements in D365 – Part 4
In the Part 1 and Part 2 blogs of this series we went through an overview of purchase and trade agreements in D365 and how to set up the different types of purchase agreements. In the Part 3 blog we covered the setup of a trade agreement for purchase price and line discount. In this blog we will go through how to set up a trade agreement line discount for a quantity range, and the “Find next” functionality in trade agreements.

Problem statement –
In this scenario we need to set up trade agreements for a quantity range and for the Find next functionality.

Solution steps –
1 – Set up a trade agreement for a line discount over a quantity range
1.1 As discussed in the previous blog (Part 3), points 2.1, 2.2 and 2.3, we have already created the trade agreement journal names and enabled the parameters, so we can reuse them here. If you have not gone through Part 3, follow the steps below.
Create trade agreement journal names –
Go to Procurement and sourcing -> Setup -> Prices & discounts -> Trade agreement journal names
New -> Name -> Pur Disc -> Description -> Purchase discount -> Relation -> Line disc. (purch.) -> Save.
Enable parameters –
Go to Procurement and sourcing -> Setup -> Prices & discounts -> Activate price/discount
Enable all parameters for Price.
– Setting the Item parameter to Yes for Vendor enables a price for a specific vendor for a specific item.
– Setting the Item parameter to Yes for Vendor group is used when the price of an item is the same for a group of suppliers (based on vendor group).
– Setting the Item parameter to Yes for All vendors is used when an item has the same price for all suppliers.
1.2 Create a trade agreement journal.
Go to Procurement and sourcing -> Prices & discounts -> Trade agreement journals
Create a new journal – Name -> Pur Disc (created in step 1.1) -> click Lines to add details.
1.3 Enter the line details.
Party code type -> Table -> Account selection -> VEN-000001 -> Product code type -> Table -> Item relation -> P-000012 -> From -> 1 -> To -> 101 -> Unit -> Pcs -> Discount percentage 1 -> 5.
Kindly note that the discount applies to the range excluding the last number; in this case it covers quantities up to and excluding 101.
1.4 Then add lines for each required range:
Party code type -> Table -> Account selection -> VEN-000001 -> Product code type -> Table -> Item relation -> P-000012 -> From -> 101 -> To -> 501 -> Unit -> Pcs -> Discount percentage 1 -> 10.
Party code type -> Table -> Account selection -> VEN-000001 -> Product code type -> Table -> Item relation -> P-000012 -> From -> 501 -> To -> 1001 -> Unit -> Pcs -> Discount percentage 1 -> 15.
Then validate and post.
1.5 Now create a new purchase order for the respective vendor (in this case VEN-000001) and item (in this case P-000012); the discount % defined in the trade agreement will be applied.
As per the above trade agreement, if the quantity is within 1-100 the discount will be 5%, and if the quantity is within 101-500 the discount will be 10%.
2 – Set up a trade agreement for a line discount with the Find next flag enabled
2.1 When a trade journal has multiple applicable lines for the same item, the “Find next” flag makes the system check all the applicable lines defined in the trade agreement. When “Find next” is disabled, the system uses only the discount with the highest level of detail.
Create a trade agreement journal.
Go to Procurement and sourcing -> Prices & discounts -> Trade agreement journals
Create a new journal – Name -> Pur Disc (created in step 1.1) -> click Lines to add details.
2.2 Enter the line details.
Party code type -> Table -> Account selection -> VEN-000002 -> Product code type -> Table -> Item relation -> P-000009 -> Unit -> Pcs -> Discount percentage 1 -> 5 -> Find next -> Yes
Party code type -> Table -> Account selection -> VEN-000002 -> Product code type -> Table -> Item relation -> P-000009 -> From -> 101 -> To -> 501 -> Unit -> Pcs -> Discount percentage 1 -> 10 -> Find next -> Yes
Party code type -> Table -> Account selection -> VEN-000002 -> Product code type -> Table -> Item relation -> P-000009 -> From -> 501 -> To -> 1001 -> Unit -> Pcs -> Discount percentage 1 -> 15 -> Find next -> Yes
2.3 With the Find next parameter disabled, if we create a PO with quantity 50 the discount will be 5%, and with quantity 150 the discount is also 5% (even though we defined a 10% discount for 101-501). This is because with Find next turned off the system searches for the single best-fit line; since the first line has no quantity range, it is the best fit for all quantities, and only the 5% discount applies.
2.4 With the Find next parameter enabled on all 3 lines, a PO with quantity 50 gets a 5% discount, while quantity 150 gets 15% (the 10% defined for 101-501 plus the 5% defined for all quantities); the system finds both applicable discounts and adds them together. With quantity 700 the discount is 20%, and with quantity 1050 it is back to 5%, as the additional discounts are only defined up to quantity 1000.
In this blog we covered how to set up a trade agreement line discount for a quantity range and the Find next flag. In the next blog (Part 5) we will cover multiline discount and total discount. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
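The two behaviours covered above, the exclusive upper bound on a quantity range and the effect of the Find next flag, can be summarized in a small Python sketch. This is purely an illustration of the logic, not D365 code, and the field names are made up:

```python
# Illustrative model of the journal lines from section 2: the first
# line has no quantity range, so it applies to every quantity.
LINES = [
    {"from_qty": None, "to_qty": None, "pct": 5},   # all quantities
    {"from_qty": 101,  "to_qty": 501,  "pct": 10},
    {"from_qty": 501,  "to_qty": 1001, "pct": 15},
]

def line_matches(line, qty):
    if line["from_qty"] is None:                      # no range: always applies
        return True
    return line["from_qty"] <= qty < line["to_qty"]   # "To" is exclusive

def total_discount(qty, find_next):
    total = 0
    for line in LINES:
        if line_matches(line, qty):
            total += line["pct"]
            if not find_next:                         # stop at the best-fit line
                break
    return total

print(total_discount(150, find_next=False))  # 5  (only the unranged line)
print(total_discount(150, find_next=True))   # 15 (5 + 10)
print(total_discount(700, find_next=True))   # 20 (5 + 15)
print(total_discount(1050, find_next=True))  # 5  (ranges stop at 1000)
```

Note that with Find next disabled the result depends on which line the system treats as the best fit; the sketch simply stops at the first match, which reproduces the behaviour described in step 2.3.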
Sending and Receiving Messages from Azure Service Bus Using Logic Apps
Azure Service Bus, paired with Logic Apps, offers a powerful combination for sending, receiving, and managing messages between different applications and services. In this blog, we’ll walk through the process of sending and receiving messages using Azure Service Bus and Logic Apps.

Steps to send and receive messages from Service Bus using a Logic App

Step 1: Create an Azure Service Bus Namespace
Navigate to the Azure Portal:
– Go to portal.azure.com and log in with your credentials.
Create a Service Bus namespace:
– In the search bar at the top, type “Service Bus” and select Service Bus from the results.
– Click + Create to start the creation process.
– Fill in the required details, then click Review + Create, and then Create to deploy the namespace.

Step 2: Create a Queue or Topic in the Service Bus Namespace
Access the Service Bus namespace:
– After the namespace is deployed, navigate to it by clicking on the resource in the portal.
– Create a Queue or a Topic depending on your use case; I am going to create a Queue.

Step 3: Create a Logic App to Send Messages to the Service Bus
Navigate to Logic Apps:
– In the Azure portal, use the search bar to find and select Logic Apps.
– Click + Create to start a new Logic App.
Configure your Logic App:
– In the Basics tab, provide the required details, then click Review + Create, and then Create.
Design the Logic App:
– Once the Logic App is created, open the Logic Apps Designer and add the trigger “When a HTTP request is received” with the POST method.
– Add a Compose action and pass the input parameters.
– Go to the Service Bus namespace –> Shared access policies –> copy the connection string.
– Add the Service Bus “Send message” action and paste the copied connection string.
– Pass the output of Compose into Content.
– Add a Response action and save the Logic App workflow.
– Now copy the URL from the trigger, paste it into Postman, and hit the URL.
– As soon as you hit the URL, you will get the customer ID back as the response in the Postman body.
– Now go to the Azure portal and check the run history; you will see the date and status have been added for that particular customer ID.
– Next, let’s verify whether the message actually reached the queue.
– Go to the queue (in my case the queue name is “receivingqueue”) –> Service Bus Explorer –> click Peek from start.
– To see the message content, select its sequence number.

Step 4: Create a Logic App to Receive Messages from the Service Bus
– Create a new Logic App: repeat the steps above to create another Logic App.
– Go to the Logic App designer.
– Add the trigger “When a message is received in a queue”.
– Add a Compose action.
– Add a Terminate action set to Succeeded.
– To verify, check the run history of the Logic App; you can see we are getting the content in base64 format.
– You can decode it and check that it’s the same data we were sending.

Conclusion
We’ve successfully set up a messaging system with Logic Apps and Azure Service Bus by following these steps. This configuration makes it possible to automate workflows, integrate apps seamlessly, and create reliable cloud solutions. Whether you’re working with batch processing or real-time data, Azure’s tools give you the strength and flexibility you need to scale your business effectively.
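To make the round trip concrete, here is a small offline Python model of what the two Logic Apps do: the sender takes the HTTP trigger body and drops it on a queue (a plain list stands in for the Service Bus queue), and the receiver decodes the base64 content, just as we verified in the run history. All names and payloads are illustrative:

```python
import base64
import json

queue = []  # stands in for the "receivingqueue" Service Bus queue

def send_logic_app(request_body):
    """Sender: HTTP trigger -> Compose -> Send message -> Response.
    Service Bus stores the message content base64-encoded."""
    payload = json.loads(request_body)
    queue.append(base64.b64encode(json.dumps(payload).encode()).decode())
    return {"customerId": payload["customerId"]}   # the Postman response body

def receive_logic_app():
    """Receiver: the 'When a message is received in a queue' trigger
    hands over base64 content; decode it to get the original JSON."""
    encoded = queue.pop(0)
    return json.loads(base64.b64decode(encoded))

response = send_logic_app('{"customerId": "CUST-001", "status": "Active"}')
print(response)             # {'customerId': 'CUST-001'}
print(receive_logic_app())  # {'customerId': 'CUST-001', 'status': 'Active'}
```

Inside the Logic App itself, the base64ToString() workflow expression performs the same decoding.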
X++ and Excel: A Powerful Partnership
Excel has over 750 million users worldwide, making it one of the most popular software applications in the world. According to recent studies, 89% of companies use Excel for daily operations, financial modeling, data analysis, and other tasks. Excel is so integral to the financial world that many financial analysts and accountants refer to themselves as “Excel jockeys” or “Excel ninjas.” Even NASA has used Excel for various calculations related to space missions. Using Excel for manual data entry is much easier for end users, as it provides a familiar interface and can be navigated much more quickly. It can also be used for quick minor calculations and formulas.

Details:
For businesses generating large volumes of data, it’s essential to have an efficient system for users to input that data smoothly. Are you struggling to keep up with your rapidly growing data? A study by Forrester Consulting shows that companies using Microsoft 365 tools like Excel, Word, Outlook, and PowerPoint see a 15-20% boost in employee productivity due to better collaboration and task management. This article will surely inspire you to start using Excel for your organization’s daily operations too!

Enabling the Developer Tab in Excel
To access advanced features like creating macros, using form controls, or accessing XML commands in Excel, you’ll need to enable the Developer tab. Here’s how:
– In the Developer tab, click on “Add-ins”.
– In the pop-up that follows, click on “Store”, search for “Microsoft Dynamics”, and press Enter.
– Once you get the results shown in the screenshot below, click on “Add”.
– Click on Continue.
– Go to your Finance and Operations environment.
– Go to System Administration -> Setup -> Office App Parameters.
– Go to App Parameters and click on “Initialize app parameters”.
– Go to “Registered applets” and click on “Initialize applet registration”.
– Go to “Registered resources” and click on “Initialize resource registration”.
– To test it out, go to the “All sales orders” list, click on the “Office” icon at the top right, and click on one of the “non-obsolete” options.
– You can either download the file to your own system or save it directly from this screen.
– When you open the downloaded Excel file, after enabling editing, you’ll get the following pop-up and data.
– You can also use this Excel file to create records in the system:
– Open the downloaded Excel sheet.
– Click on “New”.
– Add the necessary fields in the newly created rows.
– Once done, click on Publish.
Back in D365, we can see that we have added new records to the system via Excel.
In conclusion, I firmly believe that using Excel for manual data entry can significantly cut down on unnecessary tasks. If you’re looking to streamline your processes or maximize the potential of your ERP systems, please feel free to reach out.
Optimizing Data Management in Business Central using Retention Policies
Introduction
Data retention policies dictate which data should be stored or archived, where it should be stored, and for how long. When the retention period for a data set expires, the data can either be deleted or moved to secondary or tertiary storage as historical data. This approach helps maintain cleaner primary storage and ensures the organization remains compliant with data management regulations. In this blog, we’ll be covering how to set up and apply retention policies in Business Central.

Pre-requisites
– Business Central environment

References
– Data Retention Policy – Clean up Data with Retention Policy – Microsoft Learn

Details
In Business Central, we can define retention policies based on two main parameters: the table to be monitored and the retention period.

Retention Policy
Retention periods specify how long data is kept in tables under a retention policy. These periods determine how often data is deleted, and can be as long or as short as needed.

Applying a retention policy –
Retention policies can be applied automatically or manually. For automatic application, enable the policy; this creates a job queue entry that applies it according to the defined retention period. By default, the job queue entry applies policies daily at 02:00, but this timing can be adjusted (refer to the screenshot below), preferably to non-business hours. All retention policies use the same job queue entry.
For manual application, use the “Apply Manually” action on the Retention Policies page and turn on the “Manual” toggle to prevent the job queue entry from applying the policy automatically.
We can also exclude or include certain records based on filters. Deselect “Apply to all records”; this shows a new tab where we can define the record filters. Every such group can have its own retention period.
By default, only a few selected tables are shown in the table selection on the Retention Policy page. If we want to include our custom table in this list, we’ll have to do a small customization.
**Note: You cannot add tables that belong to separate modules; for example, “Purchase Header” cannot be added to this list by you. (Unless you work at Microsoft, in which case you already knew this.)**
So here I’ve created a small sample table, and I’ve created a codeunit with the Install subtype where I’m adding my custom table to the allowed tables list. After deploying, I can now see my custom table in the list.
Developers also have the option to set Mandatory or Default filters on the custom tables. Mandatory filters cannot be removed, while Default filters can be removed. To create a mandatory/default filter, set the “Mandatory” parameter: true makes it mandatory; otherwise it’s a default filter.
When I add the table ID on the “Retention Policy” page, the corresponding entry is created automatically. If I try to remove the filters, I get an error.

Conclusion
Thus, we saw how we can leverage retention policies in Business Central to reduce capacity wastage without heavy customizations.
Understanding OData.Etag in Postman and Related Features
Introduction
The Open Data Protocol (OData) is a web protocol for querying and updating data. It simplifies data exchange between clients and servers, allowing for easy integration with RESTful APIs. One important feature of OData is the use of ETags (entity tags), which are part of the HTTP protocol and help manage the state of resources.
ETags serve as a version identifier for a resource. When a client retrieves a resource, the server sends an ETag in the response. The client can then use this ETag in subsequent requests to ensure that it is working with the most current version of that resource.

What is OData.Etag?
In Postman, OData.Etag refers specifically to the ETag values used in OData responses. These tags help maintain data integrity during updates. When a client attempts to update a resource, it can include the ETag in the request headers. If the ETag matches the current version on the server, the update proceeds. If not, the server rejects the request, preventing unintended data overwrites.

Using OData.Etag in Postman
Fetching an ETag: When you send a GET request to an OData endpoint, look for the ETag header in the response. For example:

GET https://api.example.com/odata/products

The response might look like this:

HTTP/1.1 200 OK
ETag: W/"123456789"

Updating a resource with an ETag: When you need to update the resource, include the ETag in the If-Match header of your PUT or PATCH request:

PATCH https://api.example.com/odata/products(1)
If-Match: W/"123456789"
Content-Type: application/json

{ "name": "Updated Product Name" }

If the ETag matches, the update occurs; otherwise, you’ll receive a 412 Precondition Failed response.

Related Features in Postman
Conditional requests: Beyond OData, ETags are useful in REST APIs for conditional requests. You can use If-None-Match to check if a resource has changed before downloading it again, saving bandwidth and time.
CORS preflight requests: When working with cross-origin requests, browsers may perform preflight checks using OPTIONS requests. Understanding ETags can help in managing these requests effectively, ensuring your API can handle them smoothly.
Caching strategies: Implementing caching with ETags can enhance performance. Postman can simulate caching behavior, allowing you to test how your API behaves when dealing with cached responses.
Error handling: Testing how your API handles errors, such as a mismatched ETag, is crucial for robustness. Postman’s test scripts can validate error responses and ensure that your API behaves as expected.

Conclusion
Understanding OData.Etag in Postman is essential for developers working with RESTful APIs, especially in scenarios where data integrity is critical. By leveraging ETags, you can ensure safe and efficient data updates, manage caching, and improve your overall API interactions.
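To tie the sections above together, the server-side If-Match check that produces the 412 response can be sketched in a few lines of Python. This illustrates the mechanism only; it is not any particular server implementation:

```python
# The server compares the client's If-Match header with the resource's
# current ETag: on a match the update proceeds (200), otherwise it is
# rejected with 412 Precondition Failed and the resource is untouched.

current_etag = 'W/"123456789"'

def handle_patch(if_match_header, new_name, resource):
    if if_match_header != current_etag:
        return 412  # Precondition Failed: someone else changed the row
    resource["name"] = new_name
    return 200

product = {"name": "Old Product Name"}
print(handle_patch('W/"123456789"', "Updated Product Name", product))  # 200
print(handle_patch('W/"987654321"', "Another Name", product))          # 412
print(product["name"])  # Updated Product Name (the 412 changed nothing)
```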
Performance Optimization Techniques in Power BI
Introduction
Building efficient Power BI reports can be challenging, especially when working with large datasets. One common issue Power BI users encounter is the “stack overflow” error, which can disrupt the report-building process. In this blog I will share some performance optimization techniques that you can use when building a Power BI report.
When using Power Query or importing data, you might have encountered this error:
“Expression.Error: Evaluation resulted in a stack overflow and cannot continue.”
This error occurs when a large amount of data is being imported or there is not enough memory available for Power BI to complete the operation. The issue can be resolved by increasing the memory and CPU cores that Power BI is allowed to use during queries and evaluations. There are two settings to keep in mind: by default, the maximum number of simultaneous evaluations is equal to the number of logical CPU cores on the machine, and the maximum memory used per simultaneous evaluation is 432 MB. Personally, I have kept these values between the default and the maximum, depending on my requirement and system.
Also, here is a link to Microsoft’s recommendations for managing Power BI workload and evaluation configurations: https://learn.microsoft.com/en-us/power-bi/create-reports/desktop-evaluation-configuration

Conclusion
Optimizing performance in Power BI is crucial for handling large datasets and preventing issues like the “stack overflow” error. By adjusting the settings for simultaneous evaluations and memory allocation, you can significantly improve report processing and responsiveness. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
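As a back-of-the-envelope illustration of how the two settings above combine, here is the arithmetic using the defaults mentioned and a hypothetical 8-core machine:

```python
# Rough sizing of the memory Power Query evaluations can consume:
# simultaneous evaluations (defaults to the logical core count) times
# the per-evaluation cap (432 MB by default). Figures are illustrative.

logical_cores = 8                # hypothetical machine
max_evaluations = logical_cores  # Power BI default
mb_per_evaluation = 432          # Power BI default cap

total_mb = max_evaluations * mb_per_evaluation
print(total_mb)         # 3456 MB
print(total_mb / 1024)  # 3.375, i.e. roughly 3.4 GB of RAM at full load
```

Raising either setting raises this worst-case footprint, which is why it is worth checking against the machine's available RAM before increasing the caps.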
Advanced warehouse management – Wave Templates in Microsoft D365 F&O – Part 6
Introduction
In this blog we will learn about the basic setups required for Wave Templates in the Advanced Warehouse Management process. These setups may vary depending on the business scenario. For wave templates to work in an advanced warehouse scenario, there are some prerequisites that we need to complete first.
Wave templates play a significant role in advanced warehouses. They are used for the shipment of goods for sales orders, transfer order shipments, or outbound shipment orders, and also for production orders and kanban orders.
– For my current scenario, I will create a wave template for a sales order.
– Select the wave template type “Shipping”, so that a wave is created when we create the sales order.
– There is an option to create the wave automatically. I have enabled the following setups:
– Automate wave creation.
– Process wave at release to warehouse.
– Process wave automatically at threshold.
– Automate wave release.
– The basic wave methods are needed to complete sales order transactions, and we need to run the regenerate methods step to enable the methods on wave templates.
– Click Regenerate methods.
Now the wave templates are ready to use in the advanced warehouse process. That’s it for this blog!
How to use these wave templates in actual transactions will be discussed later in this blog series.
Next in the blog series: how to set up a worker in Advanced Warehouse Management in D365. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
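As a conceptual recap, the “Process wave automatically at threshold” option above can be pictured with this toy model: released lines accumulate on an open wave, and the wave is processed once the accumulated quantity reaches the threshold. This is purely an illustration of the idea, not how D365 stores or processes waves:

```python
# Toy model of threshold-based wave processing. The threshold and
# quantities are made-up numbers.

WAVE_THRESHOLD = 100  # e.g. total quantity that triggers processing

open_wave = []
processed_waves = []

def release_to_warehouse(qty):
    open_wave.append(qty)
    if sum(open_wave) >= WAVE_THRESHOLD:    # threshold reached
        processed_waves.append(list(open_wave))
        open_wave.clear()                   # later lines start a new wave

for qty in [40, 30, 50, 20]:
    release_to_warehouse(qty)

print(len(processed_waves))  # 1 wave processed (40 + 30 + 50 = 120)
print(open_wave)             # [20] still waiting on the open wave
```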
Unlocking Seamless Financial Operations: The Power of Stripe Integration with Business Central
Introduction
Integrating Stripe’s payment gateway with Microsoft Dynamics 365 Business Central can streamline the payment process for businesses, enabling seamless transactions, real-time invoicing, and efficient payment tracking. By doing this, businesses can automate the process of accepting online payments and manage financial data in a single platform. This blog will guide you through the steps involved in integrating Stripe with Business Central, as well as the benefits and essential considerations.

Why Integrate Stripe with Business Central?

Steps to Integrate Stripe with Business Central
1. Set up a Stripe account
To get started, you’ll first need a Stripe account if you don’t already have one:
– Sign up for a Stripe account on Stripe’s website.
– Complete the necessary account verification steps and configure your business information.
2. Create an extension for Business Central
Business Central allows the integration of third-party payment gateways through extensions. We will customize Business Central to capture payments, refunds, and disputes using a payment journal.
3. Configure the setup in Business Central
Through this customization, we will create a Stripe setup page in Business Central where we can define balancing accounts and other important values.
4. Link Stripe with Business Central via an Azure app
We will create an Azure app to capture all transactions with the help of Stripe webhooks. When building Stripe integrations, you might want your applications to receive events as they occur in your Stripe accounts, so that your backend systems can execute actions accordingly. To enable webhook events, you need to register webhook endpoints. After you register them, Stripe can push real-time event data to your application’s webhook endpoint when events happen in your Stripe account. Stripe uses HTTPS to send webhook events to your app as a JSON payload that includes an Event object.
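A webhook endpoint that consumes such an Event object can be sketched as below. The event types and the type/data.object shape follow Stripe’s payload format, but the handler and the actions it returns are illustrative; a real endpoint must also verify the Stripe-Signature header before trusting the payload:

```python
import json

# Minimal sketch of a webhook handler: parse the JSON Event object that
# Stripe POSTs to the endpoint and branch on its "type". The returned
# strings stand in for the Business Central postings described above.

def handle_webhook(raw_body):
    event = json.loads(raw_body)
    kind = event["type"]
    obj = event["data"]["object"]
    if kind == "payment_intent.succeeded":
        return f"post payment {obj['id']} to the payment journal"
    if kind == "charge.refunded":
        return f"post refund {obj['id']}"
    if kind == "charge.dispute.created":
        return f"record dispute {obj['id']}"
    return "ignore"  # event types we don't act on

sample = json.dumps({
    "type": "payment_intent.succeeded",
    "data": {"object": {"id": "pi_123"}},
})
print(handle_webhook(sample))  # post payment pi_123 to the payment journal
```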
Receiving webhook events is particularly useful for listening to asynchronous events, such as when a customer’s bank confirms a payment, a customer disputes a charge, a recurring payment succeeds, or when collecting subscription payments.

Benefits of Integration

Conclusion
Integrating Stripe with Microsoft Dynamics 365 Business Central simplifies the payment collection process, streamlines accounting tasks, and improves overall business efficiency. By following the steps above, businesses can easily set up this integration and begin accepting payments through Stripe directly within Business Central. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
How to Send D365 CRM Emails with Attachments Using Power Automate
Introduction
In this guide, we’ll walk through the process of sending emails from D365 CRM with attachments using Power Automate. This step-by-step approach will help you understand how to automate your email communications from CRM with attachments efficiently.

Use-Case
Let’s say you’re working on a project where you need to send emails from D365 CRM that include attachments. In this example, the document is stored in SharePoint, and its URL is linked within the CRM record. This setup is common where files are centrally stored in SharePoint but need to be easily accessible in CRM for email communication and tracking. However, this approach is versatile: whether you want to attach specific documents, generate them dynamically, or handle a range of file types, it can be adapted to meet your needs.

Why this solution?
The main objective of using D365 emails is the ability to track each email against its record, keeping communications visible in the timeline. Also, manually attaching documents to each email is time-consuming and prone to errors. With Power Automate, you can automate this process, ensuring that every email includes the right attachment without extra steps. This solution not only saves time but also reduces the risk of sending incorrect or outdated files, keeping your communications accurate and efficient.

Implementation – Step by Step
As per my use-case, I have added a column to the Accounts table that holds my SharePoint file URL, which I’ll use in Power Automate.
Step 1: Trigger the flow when a flag is marked true to send the email report.
Step 2: Get the file content using the SharePoint path.
Step 3: Create a new ‘Email Message’ record in Dataverse (Add a new row).
Step 4: Add a new row for ‘Attachments’ and link it to the email message record:
– Add the custom value as shown below.
– Add the base64-encoded file content.
– Add the file name.
Step 5: Send the email.
That’s it! Let’s test it.

Results
Trigger the flag (as per my use-case), and the email record is created with the attachment.

Conclusion
By integrating Power Automate to handle attachments from SharePoint, you streamline your email process, save time, and minimize errors. This solution is especially valuable for cases requiring frequent attachments or centralized file storage, as it keeps communication efficient and files up-to-date. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
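For reference, the base64 part of step 4 above is plain base64 encoding of the raw file bytes. A Python sketch, with illustrative field names rather than the exact Dataverse column names:

```python
import base64

# The attachment row expects the file content base64-encoded. The file
# bytes below stand in for the content fetched from SharePoint in
# step 2, and the dict keys are illustrative.

file_bytes = b"%PDF-1.7 sample report content"

attachment_row = {
    "filename": "Report.pdf",
    "body": base64.b64encode(file_bytes).decode("ascii"),
}

# Round-trip check: decoding gives back the original bytes.
print(base64.b64decode(attachment_row["body"]) == file_bytes)  # True
```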
