Latest Microsoft Dynamics 365 Blogs | CloudFronts - Page 18

Error Handling Techniques in Dynamics 365 Plugins

Posted On February 12, 2025 by Vidit Gholam

Have You Ever Struggled with Debugging Errors in Dynamics 365 Plugins?

If you’ve been working with Dynamics 365 plugins, you’ve likely encountered scenarios where a plugin failed unexpectedly. Debugging these failures can be a challenge, especially in production environments where attaching a debugger is not always an option. How do you ensure that errors are logged effectively? How do you prevent a plugin from breaking critical business processes? In this blog, I will walk you through the best error-handling techniques for Dynamics 365 plugins so that you can capture, log, and handle errors gracefully.

Why Trust Me? As a Microsoft Certified Trainer and Dynamics 365 Consultant, I have extensive experience working with Dynamics 365 CRM, Power Platform, and Azure. Over the years, I have encountered and resolved numerous plugin errors in live environments. Through my blogs and speaking engagements, I have shared insights on building robust and scalable solutions in Dynamics 365. This experience allows me to offer practical, effective error-handling strategies that you can implement immediately.

Understanding Plugin Execution and Error Scenarios: Before diving into error-handling techniques, let’s briefly review the plugin execution model. Plugins in Dynamics 365 execute in sandbox (isolated) or full-trust (non-isolated) mode and can run synchronously or asynchronously. Common error scenarios include unhandled exceptions in business logic, failures when calling external services, and infinite loops caused by recursive execution. Now, let’s explore how to handle these errors effectively; a consolidated code sketch of all five techniques appears at the end of this post.

1. Using Try-Catch Blocks for Exception Handling: The simplest and most effective way to handle errors is to wrap your plugin logic in a try-catch block, so unexpected failures can be logged and surfaced to the user as a clear, actionable message rather than a raw stack trace.

2. Using ITracingService for Logging: Dynamics 365 provides the ITracingService to log debug messages, which is particularly useful in sandboxed plugins where direct debugging is not possible.

3. Logging Errors to a Custom Entity: For persistent logging, consider storing error details in a custom entity (e.g., Plugin Error Log), so failures are preserved beyond the trace log and can be reviewed and reported on later.

4. Using Secure Configuration for External API Calls: If your plugin interacts with external APIs, store credentials in the secure configuration rather than hardcoding them.

5. Handling Recursion and Infinite Loops: Dynamics 365 lets you detect recursive plugin execution by checking the Depth property of IPluginExecutionContext and exiting early when the depth exceeds the value you expect.

Conclusion: Error handling in Dynamics 365 plugins is crucial for maintaining stability and ensuring seamless business operations. By implementing try-catch blocks, using the tracing service, logging errors to a custom entity, managing secure configurations, and handling recursion, you can build robust and maintainable plugins. I encourage you to apply these techniques to your plugins and explore additional monitoring tools like Application Insights for even better observability. Have you faced any plugin debugging challenges? Share your experiences in the comments below!

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
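Putting the five techniques together, here is a minimal sketch of what they look like in plugin code. This is an illustrative example, not code from the original post: the class name, configuration handling, and messages are assumptions, and the custom-entity logging step is indicated only as a comment.

```csharp
using System;
using Microsoft.Xrm.Sdk;

public class AccountPostCreatePlugin : IPlugin
{
    private readonly string _secureConfig;

    // Technique 4: credentials arrive via the secure configuration string
    // supplied at registration time instead of being hardcoded.
    public AccountPostCreatePlugin(string unsecureConfig, string secureConfig)
    {
        _secureConfig = secureConfig;
    }

    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        // Technique 2: ITracingService messages show up in the plug-in trace log.
        var tracing = (ITracingService)serviceProvider
            .GetService(typeof(ITracingService));

        // Technique 5: Depth grows each time this plugin re-triggers itself,
        // so exit early to avoid an infinite loop.
        if (context.Depth > 1)
        {
            tracing.Trace("Depth {0} detected; exiting to prevent recursion.", context.Depth);
            return;
        }

        // Technique 1: wrap the business logic in try-catch.
        try
        {
            tracing.Trace("Executing {0} on {1}.", context.MessageName, context.PrimaryEntityName);
            // ... business logic, including any external API call that
            // authenticates using _secureConfig, goes here ...
        }
        catch (InvalidPluginExecutionException)
        {
            throw; // Already a user-facing platform error; let it bubble up.
        }
        catch (Exception ex)
        {
            // Log full details for the developer first.
            tracing.Trace("Unhandled exception: {0}", ex.ToString());

            // Technique 3: optionally persist the details to a custom
            // "Plugin Error Log" entity here via IOrganizationService.

            // Then surface a clean, actionable message to the user.
            throw new InvalidPluginExecutionException(
                "An error occurred in AccountPostCreatePlugin. Please contact your administrator.", ex);
        }
    }
}
```

Registering the plugin with a secure configuration value and reviewing the plug-in trace log after a forced failure is a quick way to confirm the tracing and exception paths behave as expected.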


Bank Integration and Reconciliation using D365 F&O

Timely vendor invoice processing and vendor payments mean good supplier relationships and operational efficiency, while manual processing of vendor invoices and payments carries risks such as amount errors and duplicate payments.

Solution: Bank integration and reconciliation in D365 F&O allows invoice processing to be automated and removes manual intervention from payment processing and record reconciliation.

Conclusion: Apt vendor invoice management is essential to building and sustaining a company’s operational capabilities and financial balance. It translates into streamlined payment operations, fewer expensive delays, and stronger supplier relationships. With the rise of automation and digital solutions, managing procurement and payments has become more efficient and error-free. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


How to Add and Customize Tooltips in Power BI

In Power BI, tooltips are an effective way to provide additional context and details about your data. With just a hover, users can view insights that enhance their understanding of a visualization without overwhelming the main report page. Whether you’re a beginner or an experienced developer, learning how to add and customize tooltips in Power BI can significantly improve your report’s interactivity and user experience. This blog will guide you through the process, offering tips to create tooltips that are both informative and visually appealing.

1. What Are Tooltips in Power BI? Tooltips are pop-up details that appear when users hover over a data point in a visualization. They can display additional information about the data, such as summary statistics, comparisons, or related insights.

2. Why Use Tooltips? They surface supporting detail on demand, keeping the main report page clean while still answering follow-up questions at a glance.

3. Step-by-Step Procedure

Step 1: Open the Power BI report and create a visual.

Step 2: Create a new page in Power BI, then go to Visualizations > Format your report page > Canvas settings and select the Tooltip option.

Step 3: Add the related visual that you want to show as the tooltip to this new page.

Step 4: Click on the visual that should display the tooltip, turn the Tooltip option on, and select the page where you added the tooltip visual.

Step 5: Hover over the visual to see the final look of the visualization.

Conclusion: Tooltips are a powerful feature in Power BI that can elevate the interactivity and usability of your reports. By adding custom tooltips, you can provide deeper insights without compromising the clarity of your main visuals. Following these steps and best practices will help you create tooltips that enhance your report’s overall impact. Ready to enhance your Power BI reports with custom tooltips? Start by experimenting with a simple tooltip page in your existing report. For more Power BI tips and tricks, explore our other blogs or reach out to us at transform@cloudfronts.com.


ClearTax GST integration with D365 F&O

In order to operate and prosper, companies need to meet several compliance requirements. Legal compliance is crucial, as non-compliance attracts financial penalties, interest charges, and additional tax assessments. For businesses, tax compliance is also key to maintaining a good reputation and building trust with customers, suppliers, and investors.

Critical issues: Manual data upload to the GST portal for GST return filing, and manual generation of e-invoices and e-way bills.

Challenges: Risk of errors in manual processing, and delays in data synchronization impacting compliance.

Solution: The ClearTax integration for D365 F&O helps manage e-way bills and e-invoicing through integration with the GSP portal for GST. It automates the following:

- Generating e-invoices and e-way bills.
- Fetching the IRN number, QR code, and e-way bill number.
- Cancelling e-invoices and e-way bills.

Conclusion: Ensuring tax compliance involves understanding your tax obligations, keeping accurate records, and staying informed about changes in tax laws. In addition, by automating compliance, companies can achieve and maintain data accuracy, scalability, enhanced reporting, and real-time updates. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Create a paginated report from a Semantic model in Report Builder

Posted On February 6, 2025 by Siddhesh Pal

In a data-rich business environment, delivering structured, print-ready reports is essential for effective decision-making. Paginated reports excel at providing detailed, scalable outputs such as invoices, financial statements, and operational summaries. For professionals working with a semantic model, Report Builder offers an intuitive platform for creating these reports. This blog will guide you through the process of designing a paginated report from your semantic model, ensuring accuracy and efficiency.

1. What Is a Semantic Model? A semantic model is a blueprint that describes the structure of and relationships between data entities such as tables, columns, and keys. It is used to standardize and optimize data queries for reporting purposes.

2. What Are Paginated Reports? Paginated reports are highly formatted outputs designed for printing or sharing as PDF, Word, or Excel files. Unlike interactive dashboards, these reports are ideal for scenarios requiring precise layouts and large data sets.

Step-by-Step Guide to Creating a Paginated Report in Report Builder

Step 1: Open Report Builder, click Get Data, and get the data from the semantic model.

Step 2: Develop the report and publish it.

Step 3: Review the report in the Power BI service.

Conclusion: Building paginated reports from a semantic model ensures accuracy, scalability, and professionalism. By using Report Builder, you can transform your raw data into actionable, structured reports that meet business requirements. Ready to create your first paginated report? Start by analyzing your semantic model and defining your reporting needs. If you need guidance, feel free to explore more resources or contact our team for expert advice; you can reach out to us at transform@cloudfronts.com.


Creating and Accessing Blob Storage with Azure Data Factory: A Complete Guide

Introduction: This guide will walk you through creating and accessing Azure Blob Storage and integrating it with Azure Data Factory to automate data pipelines. From setting up a storage account and managing containers to configuring pipelines and transferring data to an Azure SQL Database, this step-by-step tutorial ensures you gain a comprehensive understanding of the process.

Steps:

1. Log in to the Azure Portal.
2. Search for and open the Storage accounts service.
3. Click on + Create to initiate the creation of a new storage account.
4. Fill in the required fields like subscription, resource group, and region. Review all the settings before proceeding.
5. Create the storage account.
6. Once the storage account is created, go to the resource by clicking on Go to Resource.
7. In the storage account, navigate to the Containers section and click + Container to create a new container for storing your files.
8. Click on the container you just created to access its contents.
9. Upload the desired JSON file into the container by clicking on Upload and selecting the file from your local system.
10. Ensure that the uploaded file is now listed in the container.
11. Go back to the Azure Portal and search for Azure Data Factory to open the ADF service.
12. From the ADF home screen, go to Author > Datasets. Click + New Dataset to create a new dataset for your Blob Storage.
13. Select the Azure Blob Storage dataset type, as you are working with data stored in Blob Storage.
14. Choose the data format that matches the file you uploaded, such as JSON, and click Continue.
15. Enter the necessary details for your dataset, including the file path and format settings. Select the appropriate Authentication type and specify the Storage account where the Blob Storage resides. Click Create to finalize the dataset creation.
16. Verify the settings and click OK to confirm the dataset configuration.
17. Navigate to the Pipelines section and click + New Pipeline to create a pipeline that will define your data flow.
18. The pipeline is created successfully.
19. In the pipeline, select the dataset type as Azure SQL Database and click Continue to set up the SQL Database dataset.
20. Provide the necessary Linked Service details for your SQL database and click Create.
21. After configuring the source and target datasets and the pipeline, publish all the elements to save your work.
22. Once the pipeline is running successfully, verify its functionality by querying the destination database to ensure data is being transferred properly:
a. Go to the SQL Database service and select the relevant database.
b. Select the database against which to run the query.
c. Log in with your credentials.
d. Write a simple test query to verify data has been transferred from Blob Storage to the SQL Database. Execute the query and confirm that the expected output is returned (see the sketch at the end of this post).

Conclusion: Integrating Azure Blob Storage with Azure Data Factory is a powerful way to manage and automate data workflows in the cloud. This guide walks you through creating a storage account, configuring containers, uploading data, and designing a pipeline to process and transfer data to Azure SQL Database. By following these steps, you can efficiently handle large-scale data integration and ensure seamless communication between your data sources and destinations. Azure Data Factory not only simplifies the process of orchestrating data pipelines but also provides robust options for monitoring and optimizing workflows.

Whether you are managing JSON files, processing transactional data, or setting up complex ETL processes, Azure’s ecosystem offers a reliable and scalable solution. Start exploring these tools today to unlock new possibilities in data-driven operations! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
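For step 22d, if you prefer to verify the transfer from code rather than the portal query editor, here is a minimal C# sketch using Microsoft.Data.SqlClient. The connection-string variable and the table name dbo.ImportedRecords are placeholders, not names from the original walkthrough:

```csharp
using System;
using Microsoft.Data.SqlClient; // NuGet package: Microsoft.Data.SqlClient

class VerifyPipelineOutput
{
    static void Main()
    {
        // Placeholder: read the Azure SQL connection string from an environment variable.
        var connectionString = Environment.GetEnvironmentVariable("SQL_CONNECTION_STRING");

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        // Count the rows the pipeline copied into the (hypothetical) destination table.
        using var command = new SqlCommand("SELECT COUNT(*) FROM dbo.ImportedRecords;", connection);
        var rowCount = (int)command.ExecuteScalar();

        Console.WriteLine($"Destination table contains {rowCount} rows.");
    }
}
```

A non-zero count after a pipeline run is a quick sanity check that data actually landed in the destination.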


Connecting Application Insights Logs and Query Through Logic Apps

Application Insights is a powerful monitoring tool within Azure that provides insights into application performance and diagnostics. Logic Apps, on the other hand, enable workflow automation for integrating various Azure services. By combining these tools, you can automate querying Application Insights logs and take actions based on the results. This blog explains how to set up this connection step by step.

Prerequisites: Before proceeding, ensure you have an Application Insights resource with data flowing into it and permission to create Logic Apps in your Azure subscription.

Step 1: Enable Logs in Application Insights, so that the Application Insights data is accessible for querying.

Step 2: Create a KQL Query. KQL (Kusto Query Language) is used to query Application Insights logs; the example below counts traces by severity level over the last hour.

Step 3: Set Up a Logic App that will query Application Insights.

Step 4: Configure Logic App Actions to execute and process the query:

1. Add an HTTP action that will post the query to Application Insights.
2. Add a Body for the request:

```json
{
  "query": "traces | where timestamp >= ago(1h) | summarize Count=count() by severityLevel"
}
```

3. Add actions to handle the response, such as sending an email or creating an alert based on the query results. (A code-level sketch of the same call appears at the end of this post.)

Step 5: Test the Workflow end to end to confirm the query runs and the follow-up actions fire.

Conclusion: Integrating Application Insights logs with Logic Apps is a straightforward way to automate log queries and responses. By leveraging the power of KQL and Azure’s automation capabilities, you can create robust workflows that monitor and react to your application’s performance metrics in real time. Explore these steps to maximize the synergy between Application Insights and Logic Apps for a more proactive and automated approach to application monitoring and management. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
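For reference, the HTTP action in Step 4 is essentially a POST against the Application Insights REST query API. Here is a minimal C# sketch of the same call; the environment-variable names are placeholders, and the app id and API key come from the API Access blade of your Application Insights resource:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class QueryAppInsights
{
    static async Task Main()
    {
        // Placeholders: supply your own Application Insights app id and API key.
        var appId = Environment.GetEnvironmentVariable("APPINSIGHTS_APP_ID");
        var apiKey = Environment.GetEnvironmentVariable("APPINSIGHTS_API_KEY");

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("x-api-key", apiKey);

        // Same KQL body the Logic App sends: trace counts by severity, last hour.
        var body = "{\"query\":\"traces | where timestamp >= ago(1h) | summarize Count=count() by severityLevel\"}";

        var response = await client.PostAsync(
            $"https://api.applicationinsights.io/v1/apps/{appId}/query",
            new StringContent(body, Encoding.UTF8, "application/json"));

        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```

The JSON that comes back contains a tables array whose rows mirror the query's summarize output, which is what the follow-up Logic App actions would parse.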


Create a paginated report from a Semantic model

In data analytics, paginated reports are essential for creating detailed, print-ready documents like financial statements, invoices, and performance reports. These reports are perfect for scenarios where a clear and well-organized layout is required. So, how can you create these reports from a semantic model? In this blog, we’ll break it down step by step, showing you how to turn raw data into meaningful, easy-to-read reports.

1. What Is a Semantic Model? A semantic model is a structured representation of your data, showing relationships between entities like tables, columns, and keys. It acts as the blueprint for querying and organizing your data efficiently. Tools like Power BI and SQL Server Analysis Services (SSAS) commonly use semantic models to simplify data workflows.

2. Why Paginated Reports Matter: They deliver precisely laid-out, print-ready documents of a kind that interactive dashboards are not designed for.

Step-by-Step Guide to Creating a Paginated Report

Step 1: Open the Power BI service, select the report’s semantic model, and choose the Create paginated report option.

Step 2: This opens the editor page, from which you can develop the report.

Step 3: Design the report as per your requirements. After creating the report, save it; the new paginated report is then visible in the service.

Conclusion: Creating a paginated report from a semantic model is a streamlined process when approached methodically. By leveraging a structured model, you ensure accuracy, scalability, and professional presentation for your business needs. Ready to transform your data into actionable insights? Start exploring your semantic model today and design your first paginated report. For guidance or best practices, explore more resources or reach out to our team of experts. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Building Real-Time Dashboards with Azure Stream Analytics and Power BI

Real-time dashboards are essential for monitoring live data and gaining instant insights into business operations. Azure Stream Analytics and Power BI provide an efficient way to process and visualize streaming data. In this blog, we will walk through the steps to build a real-time dashboard using these tools.

Why Real-Time Dashboards Are Needed: In today’s fast-paced world, businesses need to make decisions quickly based on live data. Real-time dashboards enable organizations to monitor operations as they happen and react the moment issues arise.

Use Cases for Real-Time Dashboards: Real-time dashboards can be applied across various industries, wherever live telemetry or transactional data needs to be watched continuously.

Prerequisites: Before we begin, ensure you have an Azure subscription in which you can create a Stream Analytics job, and a Power BI workspace to receive the output.

Step 1: Set Up Your Data Source
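The excerpt ends at the data-source step, but a common input for a Stream Analytics job is Azure Event Hubs. As an illustration only (Event Hubs as the source, the hub name, and the connection-string variable are assumptions, not details from the original post), here is a minimal C# sketch that sends a test event for the job to pick up:

```csharp
using System;
using System.Text;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs;          // NuGet package: Azure.Messaging.EventHubs
using Azure.Messaging.EventHubs.Producer;

class SendTestTelemetry
{
    static async Task Main()
    {
        // Placeholders: connection string from an environment variable, hub name "telemetry".
        var connectionString = Environment.GetEnvironmentVariable("EVENTHUB_CONNECTION_STRING");

        await using var producer = new EventHubProducerClient(connectionString, "telemetry");

        // Batch up a single JSON event that a Stream Analytics query could aggregate.
        using EventDataBatch batch = await producer.CreateBatchAsync();
        batch.TryAdd(new EventData(Encoding.UTF8.GetBytes(
            "{\"deviceId\":\"sensor-01\",\"temperature\":21.7}")));

        await producer.SendAsync(batch);
        Console.WriteLine("Test event sent.");
    }
}
```

With events flowing, the Stream Analytics job can read them as input, aggregate them in its query, and push the results to a Power BI output for the live dashboard.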


Building Custom Solutions with Low-Code Plugins: Part 1- Overview

Low-code development has revolutionized the way businesses build software applications. By providing a visual, drag-and-drop interface, low-code platforms enable developers to quickly create complex applications without writing much code. However, even with the power of low-code platforms, there may be times when you need to extend their capabilities to meet specific business requirements. This is where low-code plugins come into play. Low-code plugins are small pieces of software that can be added to a low-code platform to extend its functionality. In this blog post, we will discuss the benefits of using low-code plugins, the steps involved in creating them, and some tips for successful development.

Benefits of Using Low-Code Plugins: Low-code plugins offer a number of benefits for businesses.

Steps in Creating a Low-Code Plugin: The process of creating a low-code plugin typically follows a small set of well-defined steps.

Tips for Successful Low-Code Plugin Development: A few practical habits make low-code plugin development go more smoothly.

Example Use Cases: Low-code plugins can be used to solve a variety of business problems.

Conclusion: Low-code plugins offer a powerful way to extend the capabilities of low-code platforms and create custom solutions that meet specific business needs. By following the steps outlined in this blog post and incorporating the tips for successful development, you can effectively leverage low-code plugins to drive innovation and achieve your business objectives. In a later post, we will see a low-code plugin working in Dynamics 365 CRM with an example. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.

