How to Connect a Logic App with APIM
In a cloud-first world, seamless integrations are the backbone of modern applications. Azure Logic Apps and API Management (APIM) are two powerful tools that enable businesses to automate workflows and manage APIs effectively. By connecting Logic Apps to APIM, you can expose your automated workflows as APIs, ensuring they are secure, scalable, and easy to manage. In this blog, we’ll walk you through the process of integrating Logic Apps with APIM to maximize the potential of your Azure ecosystem.

1. What Are Logic Apps and API Management?

Logic Apps: Logic Apps is an Azure service for automating workflows, integrating various systems, and processing data efficiently. Whether it’s connecting SaaS apps, on-premises systems, or cloud services, Logic Apps excels at simplifying complex integrations.

API Management (APIM): APIM is an Azure service that allows you to publish, manage, secure, and monitor APIs. It acts as a gateway for APIs, providing essential features like throttling, caching, and access control.

2. Why Integrate Logic Apps with APIM?

Step-by-Step Guide to Connecting Logic Apps with APIM

Step 1: Open Azure APIM and click on APIs.
Step 2: Click on Add API and choose Logic App under Create from Azure resource.
Step 3: Browse for the Logic App and give it a name in APIM.
Step 4: Click on Test to test the APIM request.
Step 5: Check the URL and send the request. After sending the request from APIM, you can verify that the Logic App was triggered. A client-side example of calling the published endpoint appears at the end of this post.

Conclusion

Integrating Azure Logic Apps with API Management is a game-changer for building secure, scalable, and manageable API-driven solutions. This integration empowers businesses to expose their workflows as reusable APIs, enhance security, and maintain centralized control. Ready to connect your Logic Apps with APIM? Start by designing a simple Logic App workflow and adding it to your API Management instance. If you need expert guidance, explore more Azure integration tips on our blog or reach out to us at transform@cloudfonts.com.
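To show what consuming the published workflow looks like, here is a minimal C# sketch of the request made in Step 5. The gateway URL, API suffix, and subscription key below are illustrative placeholders, not values from this post; substitute the ones shown on your APIM instance’s Test tab.

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class ApimLogicAppClient
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // APIM authenticates callers with a subscription key header.
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<your-subscription-key>");

        // Hypothetical gateway URL: <apim-name>.azure-api.net plus the API URL suffix you chose.
        var url = "https://contoso-apim.azure-api.net/logicapp-demo/manual/paths/invoke";

        // Payload for the Logic App's HTTP Request trigger (assumes a JSON schema on the trigger).
        var payload = new StringContent("{\"message\":\"Hello from APIM\"}", Encoding.UTF8, "application/json");

        var response = await client.PostAsync(url, payload);
        Console.WriteLine($"Status: {(int)response.StatusCode} {response.ReasonPhrase}");
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}

A 200 or 202 response here, together with a new entry in the Logic App’s run history, confirms the end-to-end wiring described above.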
How to Configure a Device License in Business Central
A Device License in Business Central is a type of user license that grants access to the application on a specific device rather than to a named user. This is particularly useful for scenarios where a shared device, such as a point-of-sale (POS) terminal, warehouse scanner, or shared workstation, needs to access Business Central without requiring individual user licenses.

Why Device Licenses are Important

Device licenses are particularly useful for organizations that have multiple employees using the same device at different times. Examples of such use cases include:

Steps to achieve the goal
Correction of Inventory Cost
Inventory valuation is important for any manufacturing or trading business; the stakeholders include cost accountants, CFOs, and investors. Further, inventory cost is a major budget element. Recently, a client raised an issue of the cost price of inventory items not being calculated correctly because of the following factors:

Steps to be followed:

2. Go to Adjustment => Fixed price.
3. It will show inventory transactions matching the selection criteria, with their quantity and cost price.
4. Click on Fixed price, enter the approved cost price per unit for the item variant, and click on the Post button.
5. A posting entry will appear in the Closing & adjustment tab.
6. Run recalculation for the item as of the cost price date.
7. Review the Inventory Aging report or the Inventory transaction report; the updated price must be reflected.
8. Recalculation can be run in one batch after updating each individual item’s cost price.
9. This must be done prior to running the inventory month close.

Conclusion:
This process should be an integral part of inventory valuation. Correct inventory valuation ensures a correct cost of goods sold (COGS), gross profit (GP), and cost value of the asset in the Balance Sheet. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
How to Apply Row-Level Security in Power BI
In today’s data-driven world, security is a top priority. As organizations rely on Power BI for analytics and reporting, ensuring that users only see data relevant to their roles is crucial. This is where Row-Level Security (RLS) comes into play. RLS allows you to restrict access to data at the row level based on user roles. In this blog, we’ll guide you through the process of implementing RLS in Power BI, ensuring your data is both secure and personalized for every user.

What is Row-Level Security (RLS)?

Row-Level Security is a feature in Power BI that enables you to control access to rows of data based on user roles. By applying RLS, you ensure that users see only the data relevant to their responsibilities, preventing unauthorized access.

Why is RLS Important?

Step 1: Open Power BI Desktop, go to the Modeling tab, and click on Manage roles.
Step 2: Add a new role, select the appropriate table, and filter the required data. Here the filter is based on region (in DAX terms, a filter expression such as [Region] = "East"), so the selected user is given access to the East region only.
Step 3: Publish the report to the service, or validate the role first from the Power BI Desktop app using View as.
Step 4: Now remove the View as role in Desktop, publish the report to the service, and give users access as per the requirement.

Conclusion:
Row-Level Security is an indispensable tool for ensuring data security and personalization in Power BI. By restricting access to data based on roles, you can enhance user experiences, improve compliance, and safeguard sensitive information. Ready to secure your Power BI reports with Row-Level Security? Start by identifying your data access requirements and defining roles in Power BI Desktop. If you need expert guidance, feel free to reach out to us at transform@cloudfonts.com, or explore more Power BI tips on our blog.
Error Handling Techniques in Dynamics 365 Plugins
Have You Ever Struggled with Debugging Errors in Dynamics 365 Plugins?

If you’ve been working with Dynamics 365 plugins, you’ve likely encountered scenarios where your plugin failed unexpectedly. Debugging these failures can be a challenge, especially in production environments where attaching a debugger is not always an option. How do you ensure that errors are logged effectively? How do you prevent the plugin from breaking critical business processes? In this blog, I will walk you through the best error-handling techniques for Dynamics 365 plugins, ensuring that you can capture, log, and handle errors gracefully.

Why Trust Me?

As a Microsoft Certified Trainer and Dynamics 365 Consultant, I have extensive experience working with Dynamics 365 CRM, Power Platform, and Azure. Over the years, I have encountered and resolved numerous plugin errors in live environments. Through my blogs and speaking engagements, I have shared valuable insights on building robust and scalable solutions in Dynamics 365. This expertise allows me to provide you with practical and effective error-handling strategies that you can implement immediately.

Understanding Plugin Execution and Error Scenarios

Before diving into error-handling techniques, let’s briefly understand the plugin execution model. Plugins in Dynamics 365 execute in sandbox (isolated) mode or full-trust (non-isolated) mode and can be synchronous or asynchronous. Common error scenarios in plugins include null references on missing attributes, insufficient privileges, timeouts in long-running operations, failed external calls, and unintended recursive execution. Now, let’s explore how to handle these errors effectively. A consolidated code sketch combining the techniques below appears at the end of this post.

1. Using Try-Catch Blocks for Exception Handling

The simplest and most effective way to handle errors is by wrapping your plugin logic inside a try-catch block. This works because a caught exception can be rethrown as an InvalidPluginExecutionException with a meaningful message, so users see an actionable error instead of an unhandled platform failure.

2. Using ITracingService for Logging

Dynamics 365 provides the ITracingService to log debug messages, which is particularly useful in sandboxed plugins where direct debugging is not possible. Trace output is attached to the error details when an exception is thrown and, when plug-in trace logging is enabled, persisted to the Plug-in Trace Log for later review.

3. Logging Errors to a Custom Entity

For persistent logging, consider storing error details in a custom entity (e.g., Plugin Error Log). This helps because logged errors survive beyond the immediate failure, allowing administrators to analyze patterns without reproducing the issue.

4. Using Secure Configuration for External API Calls

If your plugin interacts with external APIs, store credentials in the secure configuration rather than hardcoding them. This keeps secrets out of source control and lets you rotate them without recompiling and redeploying the assembly.

5. Handling Recursion and Infinite Loops

Dynamics 365 allows detecting recursive plugin execution using Depth in IPluginExecutionContext. Each time a plugin’s own operation retriggers the plugin, Depth increments, so checking it lets you exit early instead of looping until the platform aborts the transaction.

Conclusion

Error handling in Dynamics 365 plugins is crucial for maintaining stability and ensuring seamless business operations. By implementing try-catch blocks, using tracing services, logging errors to a custom entity, managing secure configurations, and handling recursion, you can build robust and maintainable plugins. I encourage you to apply these techniques to your plugins and explore additional monitoring tools like Application Insights for even better observability. Have you faced any plugin debugging challenges? Share your experiences in the comments below! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
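As promised, here is a minimal consolidated sketch combining the try-catch, tracing, and Depth-check techniques above. The error-log entity (cf_pluginerrorlog) and its attribute names are illustrative assumptions, not part of the original post.

using System;
using Microsoft.Xrm.Sdk;

public class SafeAccountPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        // Technique 5: stop recursive re-entry; Depth grows with each nested call.
        if (context.Depth > 1)
        {
            tracing.Trace("Exiting early: Depth = {0}", context.Depth);
            return;
        }

        try
        {
            tracing.Trace("Plugin started for message {0} on {1}.", context.MessageName, context.PrimaryEntityName);

            // ... business logic goes here ...
        }
        catch (Exception ex)
        {
            // Technique 2: capture full details for the trace log.
            tracing.Trace("Plugin failed: {0}", ex.ToString());

            // Technique 3 (illustrative): persist the error to a hypothetical custom entity.
            var log = new Entity("cf_pluginerrorlog");
            log["cf_name"] = context.MessageName + " on " + context.PrimaryEntityName;
            log["cf_details"] = ex.ToString();
            service.Create(log);

            // Technique 1: surface a friendly, actionable message to the user.
            throw new InvalidPluginExecutionException("An error occurred in SafeAccountPlugin. Please contact your administrator.", ex);
        }
    }
}

One caveat: in a synchronous plugin, the Create call runs inside the same database transaction, so the log record is rolled back when the exception is rethrown; persistent logging of this kind is therefore usually done from an asynchronous step or an external service outside the transaction.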
Bank Integration and Reconciliation using D365 F&O
Timely vendor invoice processing and vendor payments mean good supplier relationships and operational efficiency. Manual processing of vendor invoices and payments involves risks such as amount errors and duplicate payments.

Challenges:

Solution:
This will allow automation of invoice processing, with no scope for manual intervention in payment processing and record reconciliation.

Conclusion:
Apt vendor invoice management is essential to building and sustaining a company’s operational capabilities and financial balance. This translates into streamlined payment operations, avoidance of expensive delays, and stronger supplier relationships. With the rise of automation and digital solutions, managing procurement and payments has become more efficient and error-free. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
How to Add and Customize Tooltips in Power BI
In Power BI, tooltips are an effective way to provide additional context and details about your data. With just a hover, users can view insights that enhance their understanding of the visualization without overwhelming the main report page. Whether you’re a beginner or an experienced developer, learning how to add and customize tooltips in Power BI can significantly improve your report’s interactivity and user experience. This blog will guide you through the process, offering tips to create tooltips that are both informative and visually appealing.

1. What Are Tooltips in Power BI?
Tooltips are pop-up details that appear when users hover over a data point in a visualization. They can display additional information about the data, such as summary statistics, comparisons, or related insights.

2. Why Use Tooltips?

3. Step-by-Step Procedure

Step 1: Open the Power BI report and create a visual.
Step 2: Create a new page in Power BI, then go to Visualization > Format Your Report > Canvas Settings and select the Tooltip option.
Step 3: Add the related visual that you need to show as a tooltip.
Step 4: Click on the visual where you want to add the tooltip, turn on the Tooltip option, and select the page where you added the tooltip.
Step 5: Review the final look of the visualization.

Conclusion:
Tooltips are a powerful feature in Power BI that can elevate the interactivity and usability of your reports. By adding custom tooltips, you can provide deeper insights without compromising the clarity of your main visuals. Following these steps and best practices will help you create tooltips that enhance your report’s overall impact. Ready to enhance your Power BI reports with custom tooltips? Start by experimenting with a simple tooltip page in your existing report. For more Power BI tips and tricks, explore our other blogs or reach out to us at transform@cloudfonts.com.
ClearTax GST Integration with D365 F&O
In order to operate and prosper, companies need to complete several compliances. Legal compliance is crucial, as non-compliance attracts financial penalties, interest charges, and additional tax assessments. For businesses, tax compliance is crucial for maintaining a good reputation and building trust with customers, suppliers, and investors.

Critical Issue:
Manual data upload to the GST portal for GST return filing, and generating e-invoices and e-way bills manually.

Challenges:
Risk of errors in manual processing. Delays in data synchronization impacting compliance.

Solution:
The ClearTax integration for D365 Finance helps manage e-way bills and e-invoicing through integration with the GSP portal for GST. It automates the following:
• Generate e-invoices and e-way bills.
• Fetch the IRN number, QR code, and e-way bill number.
• Cancel e-invoices and e-way bills.

Conclusion:
Ensuring tax compliance involves understanding your tax obligations, keeping accurate records, and staying informed about changes in tax laws. In addition, by enabling automation in compliance, companies can achieve and maintain data accuracy, scalability, enhanced reporting, and real-time updates. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
Create a Paginated Report from a Semantic Model in Report Builder
In a data-rich business environment, delivering structured, print-ready reports is essential for effective decision-making. Paginated reports excel at providing detailed, scalable outputs such as invoices, financial statements, and operational summaries. For professionals working with a semantic model, Report Builder offers an intuitive platform for creating these reports. This blog will guide you through the process of designing a paginated report from your semantic model, ensuring accuracy and efficiency.

1. What Is a Semantic Model?
A semantic model is the structured data layer behind a Power BI report: it outlines the structure of, and relationships between, data entities like tables, columns, and keys, and is used to standardize and optimize data queries for reporting purposes.

2. What Are Paginated Reports?
Paginated reports are highly formatted outputs designed for printing or sharing as PDF, Word, or Excel files. Unlike interactive dashboards, these reports are ideal for scenarios requiring precise layouts and handling large data sets.

Step-by-Step Guide to Creating a Paginated Report in Report Builder

Step 1: Open Report Builder, select Get Data, and get the data from the semantic model.
Step 2: Develop the report and publish it.
Step 3: Review the report in the Power BI service.

Conclusion:
Building paginated reports from a semantic model ensures accuracy, scalability, and professionalism. By using Report Builder, you can transform your raw data into actionable, structured reports that meet business requirements. Ready to create your first paginated report? Start by analyzing your semantic model and defining your reporting needs. If you need guidance, feel free to explore more resources or contact our team for expert advice; you can reach out to us at transform@cloudfonts.com.
Creating and Accessing Blob Storage with Azure Data Factory: A Complete Guide
Introduction:
This guide will walk you through creating and accessing Azure Blob Storage and integrating it with Azure Data Factory to automate data pipelines. From setting up a storage account and managing containers to configuring pipelines and transferring data to an Azure SQL Database, this step-by-step tutorial ensures you gain a comprehensive understanding of the process.

Steps:

3. Click on + Create to initiate the creation of a new storage account.
4. Fill in the required fields like subscription, resource group, and region. Review all the settings before proceeding.
5. Create the storage account.
6. Once the storage account is created, go to the resource by clicking on Go to Resource.
7. In the storage account, navigate to the Containers section and click + Container to create a new container for storing your files.
8. Click on the container you just created to access its contents.
9. Upload the desired JSON file into the container by clicking on Upload and selecting the file from your local system.
10. Ensure that the uploaded file is now listed in the container.
11. Go back to the Azure Portal and search for Azure Data Factory to open the ADF service.
12. From the ADF home screen, go to Author > Datasets. Click + New Dataset to create a new dataset for your Blob Storage.
13. Select the Azure Blob Storage dataset type, as you are working with data stored in Blob Storage.
14. Choose the data format that matches the file you uploaded, such as JSON, and click Continue.
15. Enter the necessary details for your dataset, including the file path and format settings. Select the appropriate Authentication type and specify the Storage account where the Blob Storage resides. Click Create to finalize the dataset creation.
16. Verify the settings and click OK to confirm the dataset configuration.
17. Navigate to the Pipelines section and click + New Pipeline to create a pipeline that will define your data flow.
18. The pipeline is created successfully.
19. In the pipeline, select the dataset type as Azure SQL Database and click Continue to set up the SQL Database dataset.
20. Provide the necessary Linked Service details for your SQL database and click Create.
21. After configuring both the source and target datasets and the pipeline, publish all the elements to save your work.
22. Once the pipeline runs successfully, you can verify its functionality by querying the destination database to ensure data is being transferred properly:
a. Go to the SQL Database service and select the relevant database.
b. Select the database against which to run the query.
c. Log in with your credentials.
d. Write a simple test query to verify that data has been transferred from Blob Storage to the SQL Database. Execute the query and confirm that the expected output is returned (a small client-side sketch of such a verification query appears at the end of this post).

Conclusion:
Integrating Azure Blob Storage with Azure Data Factory is a powerful way to manage and automate data workflows in the cloud. This guide walked you through creating a storage account, configuring containers, uploading data, and designing a pipeline to process and transfer data to Azure SQL Database. By following these steps, you can efficiently handle large-scale data integration and ensure seamless communication between your data sources and destinations. Azure Data Factory not only simplifies the process of orchestrating data pipelines but also provides robust options for monitoring and optimizing workflows.

Whether you are managing JSON files, processing transactional data, or setting up complex ETL processes, Azure’s ecosystem offers a reliable and scalable solution. Start exploring these tools today to unlock new possibilities in data-driven operations! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
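As a companion to step 22, here is a small C# sketch that runs a verification query against the destination database. The connection string and target table name (dbo.ImportedOrders) are placeholders for illustration, not values from this post.

using System;
using Microsoft.Data.SqlClient; // NuGet package: Microsoft.Data.SqlClient

class VerifyPipelineOutput
{
    static void Main()
    {
        // Hypothetical connection string; substitute your server, database, and credentials.
        var connectionString =
            "Server=tcp:yourserver.database.windows.net,1433;" +
            "Database=yourdatabase;User ID=youruser;Password=yourpassword;Encrypt=True;";

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        // Count the rows the pipeline copied into the (illustrative) target table.
        using var command = new SqlCommand("SELECT COUNT(*) FROM dbo.ImportedOrders;", connection);
        var rowCount = (int)command.ExecuteScalar();

        Console.WriteLine(rowCount > 0
            ? $"Success: {rowCount} rows found in dbo.ImportedOrders."
            : "No rows found - check the pipeline run in the ADF Monitor tab.");
    }
}

If the count comes back zero, the Monitor tab in Azure Data Factory shows per-activity run details that usually pinpoint the failing step.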
