Create a paginated report from a semantic model in Report Builder
In a data-rich business environment, delivering structured, print-ready reports is essential for effective decision-making. Paginated reports excel at providing detailed, scalable outputs such as invoices, financial statements, and operational summaries. For professionals working with a semantic model, Report Builder offers an intuitive platform for creating these reports. This blog will guide you through the process of designing a paginated report from your semantic model, ensuring accuracy and efficiency.

1. What Is a Semantic Model?
A semantic model is a blueprint that outlines the structure of, and relationships between, data entities like tables, columns, and keys. It is used to standardize and optimize data queries for reporting purposes.

2. What Are Paginated Reports?
Paginated reports are highly formatted outputs designed for printing or for sharing as PDF, Word, or Excel files. Unlike interactive dashboards, these reports are ideal for scenarios requiring precise layouts and large data sets.

Step-by-Step Guide to Creating a Paginated Report in Report Builder

Step-1: Open Report Builder, select Get Data, and connect to your semantic model.
Step-2: Develop the report and publish it. (For a sketch of the kind of query a report dataset runs against a semantic model, see the example at the end of this post.)
Step-3: Review the report in the Power BI service.

Conclusion:
Building paginated reports from a semantic model ensures accuracy, scalability, and professionalism. By using Report Builder, you can transform your raw data into actionable, structured reports that meet business requirements. Ready to create your first paginated report? Start by analyzing your semantic model and defining your reporting needs. If you need guidance, feel free to explore more resources or contact our team for expert advice; you can reach out to us at transform@cloudfonts.com.
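To make Step-2 more concrete, here is a minimal Python sketch of the kind of DAX query a paginated report dataset issues against a semantic model, sent through the Power BI REST API's Execute Queries endpoint. The dataset ID, the 'Sales' table and its columns, and the token handling are illustrative assumptions, not part of the walkthrough above.

```python
# Hypothetical sketch: running a DAX query against a Power BI semantic model
# via the REST API (Datasets - Execute Queries). The dataset ID, table and
# column names, and the access token are illustrative placeholders.
import requests

ACCESS_TOKEN = "<your-azure-ad-access-token>"  # assumed to be acquired elsewhere (e.g., MSAL)
DATASET_ID = "<your-semantic-model-id>"        # hypothetical semantic model (dataset) ID

url = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries"
body = {
    "queries": [
        {
            # The same kind of DAX query a paginated report dataset runs
            # against a semantic model; 'Sales' and its columns are made up.
            "query": "EVALUATE SUMMARIZECOLUMNS('Sales'[Region], \"Total Amount\", SUM('Sales'[Amount]))"
        }
    ]
}

response = requests.post(
    url,
    json=body,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
response.raise_for_status()

# Each query returns one result set; print the rows it contains.
for table in response.json()["results"][0]["tables"]:
    for row in table["rows"]:
        print(row)
```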
Creating and Accessing Blob Storage with Azure Data Factory: A Complete Guide
Introduction: This guide will walk you through creating and accessing Azure Blob Storage and integrating it with Azure Data Factory to automate data pipelines. From setting up a storage account and managing containers to configuring pipelines and transferring data to an Azure SQL Database, this step-by-step tutorial ensures you gain a comprehensive understanding of the process.

Steps:
1. In the Azure Portal, open the Storage Accounts service and click + Create to initiate the creation of a new storage account.
2. Fill in the required fields, such as subscription, resource group, and region, and review all the settings before proceeding.
3. Create the storage account.
4. Once the storage account is created, go to the resource by clicking Go to Resource.
5. In the storage account, navigate to the Containers section and click + Container to create a new container for storing your files.
6. Click the container you just created to access its contents.
7. Upload the desired JSON file into the container by clicking Upload and selecting the file from your local system (a scripted alternative is sketched after the conclusion below).
8. Ensure that the uploaded file is now listed in the container.
9. Go back to the Azure Portal and search for Azure Data Factory to open the ADF service.
10. From the ADF home screen, go to Author > Datasets and click + New Dataset to create a new dataset for your Blob Storage.
11. Select the Azure Blob Storage dataset type, since you are working with data stored in Blob Storage.
12. Choose the data format that matches the file you uploaded, such as JSON, and click Continue.
13. Enter the necessary details for your dataset, including the file path and format settings. Select the appropriate Authentication type and specify the Storage account where the Blob Storage resides. Click Create to finalize the dataset creation.
14. Verify the settings and click OK to confirm the dataset configuration.
15. Navigate to the Pipelines section and click + New Pipeline to create a pipeline that will define your data flow.
16. The pipeline is created successfully.
17. In the pipeline, select Azure SQL Database as the destination dataset type and click Continue to set up the SQL Database dataset.
18. Provide the necessary Linked Service details for your SQL database and click Create.
19. After configuring both the source and target datasets and the pipeline, publish all the elements to save your work.
20. Once the pipeline runs successfully, verify its functionality by querying the destination database to ensure data is being transferred properly:
a. Go to the SQL Database service and select the database you want to query.
b. Log in with your credentials.
c. Write a simple test query to verify that data has been transferred from Blob Storage to the SQL Database. Execute the query and confirm that the expected output is returned.

Conclusion: Integrating Azure Blob Storage with Azure Data Factory is a powerful way to manage and automate data workflows in the cloud. This guide walks you through creating a storage account, configuring containers, uploading data, and designing a pipeline to process and transfer data to Azure SQL Database. By following these steps, you can efficiently handle large-scale data integration and ensure seamless communication between your data sources and destinations. Azure Data Factory not only simplifies the process of orchestrating data pipelines but also provides robust options for monitoring and optimizing workflows.
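If you prefer scripting the upload in step 7 rather than using the portal, here is a minimal Python sketch using the azure-storage-blob SDK. The connection string, container name, and file name below are illustrative assumptions.

```python
# Hypothetical sketch: uploading the JSON file to the container from a script
# instead of the portal, using the azure-storage-blob SDK. The connection
# string, container name, and file name are illustrative placeholders.
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<your-storage-account-connection-string>"  # from the Access keys blade
CONTAINER_NAME = "input-files"   # hypothetical container created in step 5
BLOB_NAME = "sample-data.json"   # hypothetical local file and blob name

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client(CONTAINER_NAME)

# Upload the local JSON file; overwrite=True replaces any existing blob
# with the same name.
with open(BLOB_NAME, "rb") as data:
    container.upload_blob(name=BLOB_NAME, data=data, overwrite=True)

# List the container's contents to confirm the upload (step 8's check).
for blob in container.list_blobs():
    print(blob.name)
```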
Whether you are managing JSON files, processing transactional data, or setting up complex ETL processes, Azure’s ecosystem offers a reliable and scalable solution. Start exploring these tools today to unlock new possibilities in data-driven operations! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
Connecting Application Insights Logs and Query Through Logic Apps
Application Insights is a powerful monitoring tool within Azure that provides insights into application performance and diagnostics. Logic Apps, on the other hand, enable workflow automation for integrating various Azure services. By combining these tools, you can automate querying Application Insights logs and take actions based on the results. This blog explains how to set up this connection step by step.

Prerequisites
Before proceeding, ensure you have an Azure subscription with an Application Insights resource and permission to create Logic Apps.

Step 1: Enable Logs in Application Insights
Make sure your Application Insights resource is collecting telemetry so that its logs are accessible for querying.

Step 2: Create a KQL Query
KQL (Kusto Query Language) is used to query Application Insights logs. The query used in this example counts the last hour's traces by severity level; it appears in the request body in Step 4, and a standalone Python sketch for testing it appears at the end of this post.

Step 3: Set Up a Logic App
Create a Logic App that will query Application Insights.

Step 4: Configure Logic App Actions
To execute and process the query:
1. Add an action that sends the query to Application Insights.
2. Add a Body for the request:

```json
{
  "query": "traces | where timestamp >= ago(1h) | summarize Count=count() by severityLevel"
}
```

3. Add actions to handle the response, such as sending an email or creating an alert based on the query results.

Step 5: Test the Workflow
Run the Logic App and confirm that the query executes and the follow-up actions fire as expected.

Use Cases
Typical scenarios include automated alerting on error spikes and scheduled log summaries sent by email.

Conclusion
Integrating Application Insights logs with Logic Apps is a straightforward way to automate log queries and responses. By leveraging the power of KQL and Azure's automation capabilities, you can create robust workflows that monitor and react to your application's performance metrics in real time. Explore these steps to maximize the synergy between Application Insights and Logic Apps for a more proactive and automated approach to application monitoring and management. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
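As promised in Step 2, the same query can be exercised outside the Logic App, which is handy for validating the KQL before wiring it into the workflow. Below is a minimal Python sketch against the Application Insights REST query API; the application ID and API key placeholders are assumptions (both are created under API Access in the Application Insights resource).

```python
# Hypothetical sketch: running the same KQL query via the Application
# Insights REST query API. The application ID and API key are placeholders.
import requests

APP_ID = "<your-application-insights-app-id>"  # assumed placeholder
API_KEY = "<your-api-key>"                     # assumed placeholder

url = f"https://api.applicationinsights.io/v1/apps/{APP_ID}/query"
query = "traces | where timestamp >= ago(1h) | summarize Count=count() by severityLevel"

response = requests.get(
    url,
    params={"query": query},
    headers={"x-api-key": API_KEY},
    timeout=30,
)
response.raise_for_status()

# The API returns tables with 'columns' and 'rows'; print severity counts.
table = response.json()["tables"][0]
for row in table["rows"]:
    print(row)
```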
Create a paginated report from a semantic model
In data analytics, paginated reports are essential for creating detailed, print-ready documents like financial statements, invoices, and performance reports. These reports are perfect for scenarios where a clear and well-organized layout is required. So, how can you create these reports using a semantic model? In this blog, we'll break it down step by step, showing you how to turn raw data into meaningful, easy-to-read reports.

1. What Is a Semantic Model?
A semantic model is a structured representation of your data, showing relationships between entities like tables, columns, and keys. It acts as the blueprint for querying and organizing your data efficiently. Tools like Power BI and SQL Server Analysis Services (SSAS) commonly use semantic models to simplify data workflows.

2. Why Paginated Reports Matter
Paginated reports deliver precise, repeatable layouts that interactive dashboards cannot, which makes them the right choice for formal, print-ready documents.

Step-by-Step Guide to Creating a Paginated Report

Step-1: Open the Power BI service, select the report's semantic model, and choose the Create Paginated Report option.
Step-2: Once it opens, you will find the editor page from which you can develop the report.
Step-3: Design the report as per your requirements.

After creating the report, save it, and you will see the new paginated report in the service. (A scripted way to export the published report is sketched at the end of this post.)

Conclusion:
Creating a paginated report from a semantic model is a streamlined process when approached methodically. By leveraging a structured model, you ensure accuracy, scalability, and professional presentation for your business needs.

Ready to transform your data into actionable insights? Start exploring your semantic model today and design your first paginated report. For guidance or best practices, explore more resources or reach out to our team of experts. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
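Once the report is visible in the service, it can also be fetched programmatically. Here is a minimal Python sketch, under stated assumptions (placeholder workspace and report IDs and an already-acquired Azure AD token), using the Power BI REST API's export-to-file endpoints:

```python
# Hypothetical sketch: exporting the published paginated report to PDF with
# the Power BI REST API (Export To File). Workspace ID, report ID, and the
# access token are illustrative placeholders.
import time
import requests

ACCESS_TOKEN = "<your-azure-ad-access-token>"  # assumed to be acquired elsewhere
GROUP_ID = "<workspace-id>"                    # hypothetical workspace (group) ID
REPORT_ID = "<paginated-report-id>"            # hypothetical report ID

base = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/reports/{REPORT_ID}"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Start the export job, asking for PDF output.
export = requests.post(f"{base}/ExportTo", json={"format": "PDF"}, headers=headers, timeout=30)
export.raise_for_status()
export_id = export.json()["id"]

# Poll until the export job finishes.
while True:
    status = requests.get(f"{base}/exports/{export_id}", headers=headers, timeout=30)
    status.raise_for_status()
    if status.json()["status"] in ("Succeeded", "Failed"):
        break
    time.sleep(5)

# Download the finished file.
if status.json()["status"] == "Succeeded":
    pdf = requests.get(f"{base}/exports/{export_id}/file", headers=headers, timeout=60)
    with open("paginated-report.pdf", "wb") as f:
        f.write(pdf.content)
```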
Building Real-Time Dashboards with Azure Stream Analytics and Power BI
Real-time dashboards are essential for monitoring live data and gaining instant insights into business operations. Azure Stream Analytics and Power BI provide an efficient way to process and visualize streaming data. In this blog, we will walk through the steps to build a real-time dashboard using these tools.

Why Real-Time Dashboards Are Needed
In today's fast-paced world, businesses need to make decisions quickly based on live data. Real-time dashboards give organizations instant visibility into what is happening right now.

Use Cases for Real-Time Dashboards
Real-time dashboards can be applied across various industries.

Prerequisites
Before we begin, ensure you have an Azure subscription, a streaming data source, and access to Power BI.

Step 1: Set Up Your Data Source
A common choice of data source for Stream Analytics is Azure Event Hubs; a minimal sketch of sending test events to an event hub follows below.
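The following Python sketch sends a few JSON events to an event hub that a Stream Analytics job could then aggregate and push to a Power BI streaming dataset. The connection string, hub name, and event payload are illustrative assumptions.

```python
# Hypothetical sketch: sending test JSON events to Azure Event Hubs, a common
# Stream Analytics input. The connection string and hub name are placeholders,
# and the sensor payload is made up for illustration.
import json
import time
from azure.eventhub import EventHubProducerClient, EventData

CONNECTION_STRING = "<your-event-hubs-connection-string>"  # assumed placeholder
EVENT_HUB_NAME = "telemetry"                               # hypothetical hub name

producer = EventHubProducerClient.from_connection_string(
    CONNECTION_STRING, eventhub_name=EVENT_HUB_NAME
)

# Send a small batch of fake sensor readings that a Stream Analytics job
# could aggregate and forward to a Power BI streaming dataset.
with producer:
    batch = producer.create_batch()
    for i in range(5):
        reading = {"deviceId": f"sensor-{i}", "temperature": 20 + i, "ts": time.time()}
        batch.add(EventData(json.dumps(reading)))
    producer.send_batch(batch)
```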
Building Custom Solutions with Low-Code Plugins: Part 1- Overview
Low-code development has revolutionized the way businesses build software applications. By providing a visual, drag-and-drop interface, low-code platforms enable developers to quickly create complex applications without writing much code. However, even with the power of low-code platforms, there may be times when you need to extend their capabilities to meet specific business requirements. This is where low-code plugins come into play. Low-code plugins are small pieces of software that can be added to a low-code platform to extend its functionality. In this blog post, we will discuss the benefits of using low-code plugins, the steps involved in creating them, and some tips for successful development.

Benefits of Using Low-Code Plugins
Low-code plugins offer a number of benefits for businesses.

Steps in Creating a Low-Code Plugin
The process of creating a low-code plugin follows a series of well-defined steps.

Tips for Successful Low-Code Plugin Development
A few development practices go a long way toward building successful low-code plugins.

Example Use Cases
Low-code plugins can be used to solve a variety of business problems.

Conclusion
Low-code plugins offer a powerful way to extend the capabilities of low-code platforms and create custom solutions that meet specific business needs. By following the steps outlined in this blog post and incorporating the tips for successful development, you can effectively leverage low-code plugins to drive innovation and achieve your business objectives. In a later post, we will see a working low-code plugin in Dynamics 365 CRM with an example. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
Building Better Forms: Mastering Form Components in Dynamics 365
In today’s ever-evolving app development landscape, delivering an exceptional user experience is critical. Power Apps offers various tools to help developers create intuitive and efficient applications, and one of the standout features is the Form Component. This feature simplifies the design and usability of forms, making applications more scalable and maintainable.

What Are Form Components?
Form Components in Power Apps are modular elements that can be created once and reused across multiple forms or applications. By utilizing these components, developers can maintain consistency in design, functionality, and behavior. Essentially, they act as reusable building blocks for forms, streamlining the development process and enhancing the user experience. A common use case for Form Components is displaying entity-specific forms, such as a Quote Lookup field. Let’s explore how to implement a Form Component for this scenario.

Implementing a Form Component for the Quote Lookup Field
Imagine you have a requirement to display the form of a specific entity, such as a Quote, using the Quote Lookup field. After selecting the form in the Component, the Lookup field will display the selected form.

Save and Publish: After adding the Form Component, click ‘Save’ and then ‘Publish’ to apply your changes.

Key Considerations
Once the setup is complete, your Quote Lookup field will display the desired form seamlessly. With these steps, you can enhance the functionality of your forms and deliver a better user experience in your Dynamics 365 applications. Happy developing! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
PowerApps Copilot: Transforming Formula Creation with New Features
Introduction
PowerApps continues to evolve with new features that simplify formula creation and make app development more accessible for everyone. The recent updates bring innovative tools like natural language-based Power Fx formula generation and enhanced formula explanations. In this blog, we’ll explore these new features and provide actionable tips and tricks to help you leverage them effectively in your apps.

1. Generate Power Fx Formulas Using Natural Language
One of the standout updates is the ability to create Power Fx formulas using natural language instructions. This feature is perfect for both beginners and experienced developers looking to save time.
How It Works: Describe the behavior you want in plain language, and Copilot drafts the corresponding Power Fx formula in the formula bar.
Practical Tip: Use natural language for complex formulas that are hard to write manually. This approach accelerates formula creation, reduces errors, and lowers the learning curve for new users.

2. Enhanced Formula Explanation for Better Understanding
Have you ever been puzzled by a long or intricate formula? The enhanced formula explanation feature can help by providing plain-language explanations for selected parts of a formula.
How It Works: Select the part of the formula you want explained, and the feature describes what it does in plain language.

3. Multi-Language Support in Formula Generation
With the growing global adoption of PowerApps, formula generation now supports multiple languages. This feature ensures that users can work comfortably in their preferred language.
Practical Tip: Use this feature when collaborating with teams across regions. It allows contributors to describe actions in their native language, making formula generation inclusive and efficient.

4. Speed Up App Development with AI Assistance
AI-based suggestions in the formula bar aren’t just for natural language inputs. They can help optimize existing formulas and suggest best practices as you build.

Conclusion
The latest PowerApps formula updates are game changers for app developers. From generating formulas with natural language to debugging them with enhanced explanations, these features simplify app development and make PowerApps more accessible to users of all skill levels. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
Mastering Concurrency in Power Automate: An Essential Guide for Optimized Workflows
Introduction
Power Automate has revolutionized process automation by offering a low-code platform for building efficient workflows. However, when dealing with large-scale data or simultaneous operations, concurrency becomes a critical concept. Understanding and managing concurrency ensures that workflows run smoothly without performance bottlenecks or data integrity issues. In this blog, we’ll explore the concept of concurrency in Power Automate, its implications, and how to configure it effectively. Along the way, we’ll illustrate the topic with a practical example to help you grasp its real-world application.

1. What Is Concurrency in Power Automate?
Concurrency refers to the ability of a workflow to execute multiple iterations or steps simultaneously. While concurrency can significantly speed up workflows, it must be handled carefully to avoid conflicts, particularly when working with shared resources or sequential processes.

2. Why Concurrency Matters
Managing concurrency effectively can dramatically shorten run times for bulk operations. However, improper configuration can lead to issues like data overwrites, skipped steps, or exceeding service limits.

3. Configuring Concurrency in Power Automate
a) Setting Concurrency in Loop Actions: Loop actions (e.g., “Apply to each”) in Power Automate have a concurrency control setting that determines how many items can be processed in parallel.
b) Default Setting: By default, loops run sequentially.

4. Practical Example: Parallel Processing for Email Notifications
a) Scenario: Your organization frequently sends mass email notifications to users based on CRM data. Using sequential processing causes delays, especially for large datasets.
b) Solution: Implement a Power Automate workflow with concurrency enabled:
Trigger: The workflow starts with a scheduled recurrence trigger or a Dataverse event.
Data Retrieval: Fetch user data from Dataverse or SharePoint.
Apply to Each: Enable concurrency control for the “Apply to Each” loop and set a degree of parallelism of 5 to process 5 emails simultaneously.
Send Email: Each iteration sends an email notification to a user.
Error Handling: Use retry policies or error-handling branches to manage failures.
Outcome: The workflow completes email notifications significantly faster, improving operational efficiency while maintaining reliability. (A small sketch of what a degree of parallelism of 5 means in practice appears at the end of this post.)

5. Key Considerations and Best Practices
a) Identify Dependencies: Avoid enabling concurrency for workflows with interdependent steps.
b) Service Limits: Check Power Automate’s limits to prevent throttling.
c) Monitor Performance: Use Power Automate analytics to monitor workflow performance and adjust settings as needed.
d) Test Before Deployment: Ensure workflows behave as expected under concurrent execution.

Conclusion
Concurrency in Power Automate is a powerful tool for optimizing workflows, especially when handling bulk operations or parallel tasks. By understanding its settings and best practices, you can design workflows that are both efficient and reliable. I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
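To build intuition for the setting, here is a short Python analogy (explicitly not Power Automate itself; the send_email helper and addresses are made up): a thread pool capped at 5 workers behaves like an “Apply to each” loop with its degree of parallelism set to 5.

```python
# Not Power Automate itself: a plain-Python analogy for the "Apply to each"
# concurrency setting. A thread pool capped at 5 workers mirrors a degree of
# parallelism of 5; send_email and the user list are made-up placeholders.
from concurrent.futures import ThreadPoolExecutor

def send_email(user: str) -> str:
    # Placeholder for the real "Send an email" action.
    return f"email sent to {user}"

users = [f"user{i}@example.com" for i in range(20)]

# At most 5 sends run at the same time; the rest queue up, just like a loop
# with concurrency control enabled and its parallelism set to 5.
with ThreadPoolExecutor(max_workers=5) as pool:
    for result in pool.map(send_email, users):
        print(result)
```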
Taking a deep dive into the physical and financial postings in Dynamics 365 F&O.
In Dynamics 365 Finance & Operations (D365F&O), the concepts of physical and financial posting are at the core of inventory and transaction management. Understanding how these two processes work and their impact on inventory valuation and ledger updates is crucial for maintaining accurate financial records and operational efficiency. The Physical posting and Financial posting checkboxes are on the Item Model Group page; the path is Inventory Management > Setup > Inventory > Item Model Group.

So, what is Physical Posting?
Physical posting refers to recording the movement or status change of inventory items without affecting the financial ledger. If this option is cleared, packing slips, product receipts, and production orders that are reported as finished are not posted in the ledger, regardless of the settings in the parameter setup pages. These transactions track physical inventory levels and ensure operational accuracy. Examples of physical postings include product receipts and stock transfers. Physical postings are essential for operational teams to track stock levels and manage logistics effectively. However, they do not impact the financial statements until a corresponding financial posting occurs.

What is Financial Posting?
Financial posting occurs when a transaction affects the company’s general ledger, impacting financial accounts such as Cost of Goods Sold (COGS) and Accounts Payable/Receivable. If this option is cleared, the way accounting entries are handled changes significantly to simplify the process. When a purchase order is invoice-updated, the value of the items is posted only to the item consumption account and not to the inventory receipt account. Similarly, when a sales order is invoice-updated, no entries are made in either the item consumption account or the issue account. This option is especially helpful for service items, where posting item consumption during sales order invoicing isn’t necessary. By clearing this option, the journal lines for these items do not generate any ledger postings, keeping your financial records clean and focused without unnecessary complexities. Examples of financial postings include invoices, COGS postings, and sales revenue. Financial postings ensure that all inventory transactions are accurately reflected in financial records, enabling proper accounting and compliance with regulatory standards.

Key Differences Between Physical and Financial Posting

Aspect | Physical Posting | Financial Posting
Impact | Tracks inventory movement/status. | Updates financial accounts.
Ledger Update | No impact on the general ledger. | Impacts general ledger accounts.
Use Case | Operational purposes (e.g., stock tracking). | Financial reporting and accounting.
Examples | Product receipts, stock transfers. | Invoices, COGS postings, sales revenue.

Configuring Posting in D365F&O
D365F&O allows businesses to control how physical and financial postings are handled using parameters and setups, starting with the Item Model Group checkboxes described above.

To encapsulate, physical and financial postings in D365F&O are fundamental to achieving a seamless connection between operational processes and financial reporting. They ensure that inventory movements are accurately tracked and that financial records reflect real-time business activities. By configuring these setups correctly, organizations can enhance their decision-making capabilities, reduce errors, and maintain compliance with accounting standards. Moreover, understanding the nuances of these postings allows businesses to streamline operations.
For example, leveraging features like item model groups or automated posting parameters ensures that teams can focus on strategic growth rather than manual corrections. This integration of operational and financial data also supports better collaboration between departments, paving the way for improved efficiency and transparency. Ultimately, D365F&O empowers businesses to not only track their inventory effectively but also align their financial records with operational realities, creating a robust framework for sustainable growth and success. That’s it for this blog. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.