The Hidden Power BI Feature That Will Transform Your Data Automation
Are you tired of manually writing complex DAX queries for your Power Automate flows? What if Power BI has been quietly recording every optimized query for you all along?

The Challenge Every Power BI Developer Faces

For growing businesses, automating data workflows is just as important as the dashboards and reports themselves. As organizations scale, the need to extract Power BI insights programmatically grows, making efficient query extraction essential to operational flow and development productivity. If you want to streamline your Power BI to Power Automate integration, this article walks you through a technique that can save you hours of manual query writing.

What Exactly is Performance Analyzer?

Performance Analyzer is Power BI's built-in diagnostic tool that captures every operation happening behind the scenes when you interact with your reports. Think of it as a detailed activity log that records not just what happened, but exactly how Power BI executed each query. Most developers use it for performance troubleshooting, but here's the secret: it is also your gateway to extracting production-ready DAX queries for automation.

Step 1: Unleashing the Performance Analyzer

The Performance Analyzer isn't hidden in some obscure menu; it sits right in the Power BI Desktop ribbon, waiting to change your workflow. To activate it, open the View ribbon and select Performance Analyzer to open its pane.

Starting Your Query Capture Session

Think of this as putting Power BI under a microscope: every interaction you make will be recorded and analyzed. Click Start recording, interact with the visuals you care about (or use Refresh visuals), and stop the recording once you have captured what you need.

Step 2: Extracting the Golden DAX Queries

When you expand any visual event in the Performance Analyzer pane, you will see the components of that operation, including the DAX query, visual display, and other processing time. Here's where it gets exciting: click "Copy query" next to the DAX query entry.

Real-World Example: Sales Dashboard Automation

Let's say you have a sales dashboard with a card showing total revenue. After recording and expanding the performance data, you can copy the query behind that card. This is pure gold: it is the exact query Power BI uses internally, already optimized and ready for reuse (an illustrative sketch of what such a query can look like is included at the end of this post). The copied DAX queries can then be reused in Power Automate, for example in the "Run a query against a dataset" action, or anywhere else you need to query the model programmatically.

To conclude, I encourage you to take a close look at your current Power BI automation processes. Identify one manual reporting task that you perform weekly, perhaps a sales summary, performance dashboard update, or data quality check. Start with this simple action today: open one of your existing Power BI reports, activate Performance Analyzer, and extract just one DAX query. Then build a basic Power Automate flow using that query. This single step will demonstrate the power of the technique and will likely save you hours in your next automation project.

Need practical guidance on implementing this in your organization? Feel free to connect at transform@cloudfronts.com for specific solutions that can help you develop more effective Power BI automation workflows.
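Example: what a copied card query can look like

The query below is a minimal sketch rather than the output of the report described above: the table name 'Sales' and the measure [Total Revenue] are assumptions for illustration, and the real query Performance Analyzer copies will use your own model's names and may include additional variables and filters.

    // Hypothetical Performance Analyzer output for a "Total Revenue" card
    DEFINE
        VAR __DS0Core =
            SUMMARIZECOLUMNS ( "Total_Revenue", 'Sales'[Total Revenue] )

    EVALUATE
        __DS0Core

You can paste a query like this directly into the Power Automate "Run a query against a dataset" action and use the returned rows in the rest of your flow.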
Copy On-Premises SQL Database to Azure SQL Server Using ADF: A Step-by-Step Guide
Migrating an on-premises SQL database to the cloud can streamline operations and enhance scalability. Azure Data Factory (ADF) is a powerful tool that simplifies this process by enabling seamless data transfer to Azure SQL Server. In this guide, we'll walk through the steps to copy your on-premises SQL database to Azure SQL Server using ADF, ensuring a smooth and efficient migration.

Prerequisites

Before you begin, ensure you have an Azure subscription, an Azure Data Factory instance, connectivity and credentials for your on-premises SQL Server, and permission to create an Azure SQL Database.

Step 1: Create an Azure SQL Server Database

First, set up your target database in Azure.

Step 2: Configure the Azure Firewall

To allow ADF to access your Azure SQL Database, configure the server's firewall settings.

Step 3: Connect Your On-Premises SQL Database to ADF

Next, use ADF Studio to link your on-premises database.

Step 4: Set Up a Linked Service

A linked service is required to connect ADF to your on-premises SQL database (an illustrative linked service definition is included at the end of this post).

Step 5: Install the Integration Runtime for On-Premises Data

Since your data source is on-premises, you need a self-hosted integration runtime.

Step 6: Verify and Test the Connection

Finally, test the linked service connections to confirm everything is set up correctly.

To conclude, migrating your on-premises SQL database to Azure SQL Server using ADF is a straightforward process when broken down into these steps. By setting up the database, configuring the firewall, and establishing the necessary connections, you can ensure a secure and efficient data transfer. With your data now in the cloud, you can leverage Azure's scalability and performance to optimize your workflows. Happy migrating!

Please refer to our City Council case study at https://www.cloudfronts.com/case-studies/city-council/ to know more about how we used Azure Data Factory and other Azure Integration Services to deliver seamless integration. We hope you found this blog post helpful! If you have any questions or want to discuss further, please contact us at transform@cloudfronts.com.
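Example: a sample linked service definition

To make Step 4 more concrete, here is a minimal sketch of what an on-premises SQL Server linked service can look like in ADF's JSON view, routed through a self-hosted integration runtime. It is not from the original post; the names (OnPremSqlServerLinkedService, SelfHostedIR, MyOnPremServer, SourceDb, sqluser) and the password placeholder are stand-ins for your own environment.

    {
      "name": "OnPremSqlServerLinkedService",
      "properties": {
        "type": "SqlServer",
        "connectVia": {
          "referenceName": "SelfHostedIR",
          "type": "IntegrationRuntimeReference"
        },
        "typeProperties": {
          "connectionString": "Server=MyOnPremServer;Database=SourceDb;User ID=sqluser;",
          "password": {
            "type": "SecureString",
            "value": "<your-password>"
          }
        }
      }
    }

In practice it is better to reference the password from Azure Key Vault rather than a SecureString, but the overall shape of the definition stays the same.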
Error Handling in Azure Data Factory (ADF): Part 1
Azure Data Factory (ADF) is a powerful ETL tool, but when it comes to error handling, things can get tricky, especially when you're dealing with parallel executions or want to notify someone on failure. In this two-part blog series, we'll walk through how to build intelligent error handling into your ADF pipelines. This post, Part 1, focuses on the planning phase: understanding ADF's behavior, the common pitfalls, and how to set your pipelines up for reliable error detection and notification. In Part 2, we'll implement everything you've planned using ADF control flows.

Part 1: Planning for Failures

Step 1: Understand ADF Dependency Behavior

In ADF, activities can be connected via dependency conditions such as Success, Failure, Completion, and Skipped. When multiple dependency conditions are attached between the same pair of activities, ADF treats them as an OR condition. However, if an activity depends on several parallel branches, ADF treats those dependencies as an AND condition, meaning the downstream activity runs only when every parallel branch satisfies its dependency condition.

Step 2: Identify the Wrong Approach

Many developers attempt to add a "failure email" activity after each pipeline activity, assuming it will trigger if any activity fails. Because of the AND behavior across parallel branches described above, this doesn't work as expected: a notification activity that depends on several activities at once will not fire the way you intend.

Step 3: Design with a Centralized Failure Handler in Mind

So, what's the right approach? Plan your pipeline in a way that allows you to handle any failure from a centralized point: a dedicated failure handler that failing activities route into (an illustrative sketch of such a handler is included at the end of this post).

Step 4: Plan Your Notification Strategy

Error detection is one half of the equation; the other half is communication. Decide who needs to be notified, through which channel, and with how much detail. Start thinking about Logic Apps, webhooks, or Azure Functions that you can plug in later to send customized notifications. We'll cover the "how" in the next blog, but the "what" needs to be defined now.

To conclude, planning for failure isn't pessimism; it's smart architecture. By understanding ADF's behavior and avoiding common mistakes with parallel executions, you can build pipelines that fail gracefully, alert intelligently, and recover faster. In Part 2, we'll take this plan and show you how to implement it step by step using ADF's built-in tools.

Please refer to our case study at https://www.cloudfronts.com/case-studies/city-council/ to know more about how we used Azure Data Factory and other Azure Integration Services to deliver seamless integration. We hope you found this blog post helpful! If you have any questions or want to discuss further, please contact us at transform@cloudfronts.com.
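Example: sketch of a centralized failure handler

Part 2 covers the implementation in detail, but as a preview, the sketch below shows the general shape of a notification activity that runs only when an upstream activity fails and forwards the error message. It is a minimal, hypothetical example: the activity name 'Copy Source Data' and the webhook URL are placeholders, and a real handler will typically collect errors from several activities.

    {
      "name": "Notify On Failure",
      "type": "WebActivity",
      "dependsOn": [
        {
          "activity": "Copy Source Data",
          "dependencyConditions": [ "Failed" ]
        }
      ],
      "typeProperties": {
        "url": "<your-logic-app-or-webhook-url>",
        "method": "POST",
        "body": {
          "pipeline": "@{pipeline().Pipeline}",
          "runId": "@{pipeline().RunId}",
          "error": "@{activity('Copy Source Data').Error.Message}"
        }
      }
    }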
How to Recover Azure Function App Code
Azure Function Apps are a powerful tool for creating serverless applications, but losing the underlying code can be a stressful experience. Whether due to a missing backup, accidental deletion, or unclear deployment pipelines, the need to recover code becomes critical. Thankfully, even without backups, there are ways to retrieve and reconstruct your Azure Function App code using the right tools and techniques. In this blog, we'll guide you through a step-by-step process to recover your code, explore the use of decompilation tools, and share preventive tips to help you avoid similar challenges in the future.

Step 1: Understand Your Function App Configuration

Review the app's runtime, settings, and deployment history so you know exactly what you are reconstructing.

Step 2: Retrieve the DLL File

To recover your code, you need access to the compiled assembly (DLL). From Kudu (Advanced Tools), navigate to the site/wwwroot/bin directory where the YourFunctionApp.dll file resides and download it.

Step 3: Decompile the DLL File

Once you have the DLL file, use a .NET decompiler to extract the source code by opening the .dll file in the decompiler and exporting the decompiled source. The decompiler I have used here is dotPeek, which is a free .NET decompiler. (A command-line alternative is sketched at the end of this post.)

To conclude, recovering a Function App without backups might seem daunting, but by understanding its configuration, retrieving the compiled DLL, and using decompilation tools, you can successfully reconstruct your code. To prevent such situations in the future, enable source control by integrating your Function App with GitHub or Azure DevOps, or configure backups.

Please refer to our customer success story, Customer Success Story – BUCHI | CloudFronts, to know more about how we used a Function App and other Azure Integration Services to deliver seamless integration. We hope you found this blog post helpful! If you have any questions or want to discuss further, please contact us at transform@cloudfronts.com.
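Example: a scriptable alternative to dotPeek

If you prefer to do the retrieval and decompilation from the command line, the sketch below shows one possible route using the Kudu virtual file system API and the ILSpy command-line tool. Treat it as an illustrative alternative rather than the post's original method: the app name, credentials, DLL name, and output folder are placeholders, and downloading via Kudu requires basic (deployment) credentials to be enabled for the SCM site.

    # Download the compiled assembly from the Kudu virtual file system
    # (replace <app>, <user>, <password>, and the DLL name with your own values)
    curl -u '<user>:<password>' \
      -o YourFunctionApp.dll \
      "https://<app>.scm.azurewebsites.net/api/vfs/site/wwwroot/bin/YourFunctionApp.dll"

    # Install the ILSpy command-line decompiler as a .NET global tool
    dotnet tool install -g ilspycmd

    # Decompile the assembly into a C# project folder
    ilspycmd -p -o ./RecoveredSource YourFunctionApp.dll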
How to Trim and Remove Spaces from Multiple Columns in Power Query
Efficient data cleaning is a crucial step in any data preparation process, and Power Query makes it easy to handle common tasks like trimming and removing unnecessary spaces with functions that you can apply across multiple columns and queries at once. By creating and invoking a function, you can quickly trim and remove spaces from all the columns and tables you need, saving time and effort. In this blog, we'll show you how to use Power Query functions to streamline your data-cleaning process.

The Power Query function we are going to use to trim text in columns is:

    (text as text, optional char_to_trim as text) =>
    let
        char = if char_to_trim = null then " " else char_to_trim,
        split = Text.Split(text, char),
        removeblanks = List.Select(split, each _ <> ""),
        result = Text.Combine(removeblanks, char)
    in
        result

This function takes text as input and removes extra spaces, or occurrences of a specified character, from a text string. It splits the text into parts, filters out empty strings, and recombines the cleaned parts using the specified character. If no character is provided, it defaults to removing spaces.

The Power Query function we are going to use to remove all spaces from text is:

    (InputTxt as text) =>
    let
        Clendata = Text.Combine(List.Select(Text.Split(Text.Trim(InputTxt), " "), each _ <> ""), "")
    in
        Clendata

This function removes all spaces from a given text string. It trims the input, splits it by spaces, filters out blanks, and then combines the parts into a single string. The result is clean, space-free text, ideal for standardized data preparation.

Now that we have our Power Query functions ready, we can use them across multiple columns or datasets. To do so, go to Add Column > Invoke Custom Function > your Power Query function. (A sketch of applying a function to several columns in place is included at the end of this post.)

To conclude, cleaning and transforming data in Power Query becomes much easier and more efficient with the use of custom functions. Whether you need to remove spaces, clean multiple columns, or standardize text, these functions save time and ensure consistency across your dataset. By applying these techniques, you can handle large, messy datasets with ease, making your data ready for analysis or reporting. Start implementing these simple yet powerful methods today to streamline your data preparation process!

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
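Example: transforming several columns in place

If you would rather clean the existing columns instead of adding new ones via Invoke Custom Function, you can call the function from Table.TransformColumns. This is a minimal sketch, assuming the trim function above is saved as a query named fnTrimText and that your previous step is called Source with text columns named "Customer Name" and "City" (all of these names are placeholders for your own):

    // Apply fnTrimText to two text columns in place
    = Table.TransformColumns(
        Source,
        {
            {"Customer Name", each fnTrimText(_), type text},
            {"City", each fnTrimText(_), type text}
        }
    )

Paste this as a new step in the Advanced Editor and adjust the column names to match your own table.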
Add Tooltip for Column Headers in Power BI: A Step-by-Step Guide
Introduction

Tooltips are a powerful feature in Power BI, offering additional context and information for visuals. However, tooltips are not natively supported for column headers, which means users cannot view detailed information about a header directly. Fortunately, there is a workaround for this limitation, which we'll explore in this blog post.

The method we are going to use is action buttons: we will place an action button over each column header, and the button's tooltip will show the header details. Here is a step-by-step guide:

1. First, insert a button. I have used a blank button, since I want the tooltip to appear when the user hovers over a particular column area, but you can use any button you prefer.

2. Now, go to the button's visual settings, turn on Action, set the type to Bookmark with None as the selected bookmark, and specify the tooltip text.

Conclusion

In this blog post, we discussed adding tooltips to column headers to give users more context. We hope you found this blog post helpful! If you have any questions or want to discuss further, please contact us at transform@cloudfronts.com.
How to run an SSRS report on a selected record in the CRM using FetchXml
If you are working with SSRS reports in Dynamics 365 CRM, you may need to run a report for a single selected record. In this blog, I will explain how to run a report on a selected record in CRM. Running a report on a selected record is also referred to as prefiltering, so let's first understand what prefiltering is in an SSRS report.

What is Prefiltering in an SSRS Report?

Prefiltering in an SSRS report is the process of applying filters before data is retrieved from the data source. It limits the dataset to only the relevant records by incorporating conditions directly into the query or stored procedure used to fetch data for the report, so the report shows only the data needed for reporting.

To add prefiltering to an SSRS report, follow these steps:

- First, identify the entity you want to run the report for. If the same report needs to run on multiple entities in CRM, you will need to add the prefiltering condition to all required entities.

- For example, I have used this project entity:

    <entity name="msdyan_project" enableprefiltering="1" prefilterparametername="Parameter1">

  enableprefiltering="1" specifies that the data for 'msdyan_project' will be filtered before the report is shown to the user. (A fuller FetchXML sketch is included at the end of this post.)

- When running the above query, a parameter named "Parameter1" will be created automatically in Visual Studio.

- Now, publish the SSRS report to the desired environment and, in the report's properties, add the project entity to the related record types so the report can be run from a selected record. You are all set.

Conclusion

Running an SSRS report on a single CRM record is an effective way to display relevant data while keeping the report context-sensitive. In this blog, we have provided a step-by-step guide to adding a prefilter to your SSRS report. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
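Example: a fuller FetchXML sketch

For reference, here is a minimal sketch of what the report's FetchXML dataset query can look like with prefiltering enabled. It follows the entity name used above; the attribute names (msdyan_projectid, msdyan_name) are placeholders for your own schema.

    <fetch version="1.0" output-format="xml-platform" mapping="logical">
      <entity name="msdyan_project" enableprefiltering="1" prefilterparametername="Parameter1">
        <attribute name="msdyan_projectid" />
        <attribute name="msdyan_name" />
        <order attribute="msdyan_name" descending="false" />
      </entity>
    </fetch>

When the report is uploaded with a dataset like this, CRM recognizes the prefilter and passes the selected record's context into Parameter1 at run time.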
Performance Optimization Techniques in Power BI
Introduction

Building efficient Power BI reports can be challenging, especially when working with large datasets. One common issue Power BI users encounter is the "stack overflow" error, which can disrupt the report-building process. In this blog, I will share some performance optimization techniques that you can use when building Power BI reports.

When using Power Query or importing data, you might have seen this error:

"Expression.Error: Evaluation resulted in a stack overflow and cannot continue."

This error occurs when a large amount of data is being imported or when there is not enough memory available for Power BI to complete the operation. The issue can be resolved by increasing the memory and CPU cores that Power BI is allowed to use during queries and evaluations.

There are two settings to keep in mind: the maximum number of simultaneous evaluations and the maximum memory used per simultaneous evaluation. By default, the maximum number of simultaneous evaluations is equal to the number of logical CPU cores on the machine, and the maximum memory used per simultaneous evaluation is 432 MB. Personally, I keep these values between the default and the maximum, depending on the requirement and the machine.

Also, here is a link to Microsoft's recommendations for managing Power BI workload and evaluation configurations: https://learn.microsoft.com/en-us/power-bi/create-reports/desktop-evaluation-configuration

Conclusion

Optimizing performance in Power BI is crucial for handling large datasets and preventing issues like the "stack overflow" error. By adjusting the settings for simultaneous evaluations and memory allocation, you can significantly improve report processing and responsiveness. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
JSON to JSON Transformation using Azure Logic Apps and Liquid
Introduction

In this blog post, I'll walk you through the process of transforming JSON to JSON using Azure Logic Apps and the Liquid template language. This step-by-step guide demonstrates how you can use Azure Integration Services to achieve your transformation goals.

What is the Liquid Template Language?

The Liquid template language (commonly referred to as "Liquid") is a flexible, open-source template language developed by Shopify. It is widely used to render dynamic content in platforms such as Shopify themes, Jekyll websites, and web applications. Liquid uses placeholders, loops, and conditional statements to pull dynamic data into a template, making it an effective tool for JSON transformation.

Prerequisites

To complete this tutorial, you'll need an Azure subscription; the remaining resources (an Integration Account and a Logic App) are created in the steps below.

Sample Input JSON

We will use the following sample JSON for this tutorial:

    {
      "FirstName": "Deepak",
      "LastName": "Ch",
      "Add1": "T square, Saki Vihar Road, Andheri East",
      "Add2": "Mumbai",
      "Landmark": "Near Car Showroom",
      "PhoneNo1": 9812727261,
      "PhoneNo2": 2121233322
    }

Desired Output JSON

The client's requirement is to transform the input JSON into the following format:

    {
      "Full Name": "Deepak Ch",
      "Address": "T square, Saki Vihar Road, Andheri East, Mumbai, Near Car Showroom",
      "Phone": "9812727261, 2121233322"
    }

Step-by-Step Guide

Step 1: Create a Free Azure Integration Account

Create an Integration Account in the Azure portal; the Free tier is sufficient for this tutorial.

Step 2: Add the Liquid Template Map

Upload the Liquid template shown below as a Map artifact in your Integration Account.

Step 3: Create a Logic App

Create a Consumption Logic App, associate it with your Integration Account in the workflow settings, and add a "When a HTTP request is received" trigger followed by the Liquid "Transform JSON to JSON" action and a Response action that returns the result.

Step 4: Transform JSON to JSON using Liquid

Here is the Liquid template used for this transformation (an illustrative sketch of the corresponding Logic App action is included at the end of this post):

    {
      "Full Name": "{{content.FirstName}} {{content.LastName}}",
      "Address": "{{content.Add1}}, {{content.Add2}}, {{content.Landmark}}",
      "Phone": "{{content.PhoneNo1}}, {{content.PhoneNo2}}"
    }

Step 5: Test with Postman

Send the sample input JSON as the request body to the Logic App's HTTP trigger URL using Postman and verify the response.

Final Output

The output JSON will be:

    {
      "Full Name": "Deepak Ch",
      "Address": "T square, Saki Vihar Road, Andheri East, Mumbai, Near Car Showroom",
      "Phone": "9812727261, 2121233322"
    }

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
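Example: what the Liquid action can look like in code view

For readers who prefer the Logic App's code view, below is a minimal sketch of the "Transform JSON to JSON" action definition as it typically appears in a Consumption Logic App linked to an Integration Account. The action name and map name are placeholders; the map name must match the Liquid template you uploaded in Step 2.

    "Transform_JSON_To_JSON": {
      "type": "Liquid",
      "kind": "JsonToJson",
      "runAfter": {},
      "inputs": {
        "content": "@triggerBody()",
        "integrationAccount": {
          "map": {
            "name": "JsonToJsonTemplate"
          }
        }
      }
    }

Note that the Liquid template refers to fields as content.FirstName, content.LastName, and so on, because Logic Apps wraps the incoming payload in a content object before handing it to the map.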
Automating Access Token and Refresh Token Generation Using ADF and Azure Key Vault – Part 2
In continuation of Part 1, welcome to Part 2 of the blog on automating access token and refresh token generation using ADF and Azure Key Vault. We completed the necessary setup in Part 1, so if you haven't read it yet, please do so before proceeding with this part.

Assumptions

Before going further, note what this part builds on: the client ID, client secret, and refresh token created in Part 1 are already stored as secrets in Azure Key Vault, and your Data Factory has permission to read and update them.

Now, let's walk through the steps to create a pipeline that refreshes the access token:

- Create a web activity to pull the client ID, client secret, and refresh token you created in Part 1.

- In each activity's settings, the URI is the Secret Identifier of the corresponding Azure Key Vault secret (an illustrative sketch of these web activity settings is included at the end of this post).

- Set up one such web activity for each of the client ID, client secret, and refresh token.

- For the token refresh call, I used the setup below, but you may need to adjust it to your API's requirements.

  Body:
  grant_type=refresh_token&refresh_token=@{activity('Get Refresh Token').output.value}

  Authorization:
  Basic @{base64(concat(activity('Get Client Id').output.value, ':', activity('Get Client Secret').output.value))}

- After this, use another web activity to store the new access token returned by the refresh call back into Azure Key Vault.

  Body:
  { "value": "@{activity('Refresh Access Token').output.access_token}" }

Conclusion

This blog provides a comprehensive guide to automating access token and refresh token generation using Azure Data Factory and Azure Key Vault. By following the steps outlined, you can ensure seamless token management, reduce manual intervention, and maintain secure access to your resources. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
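Example: sketch of the Key Vault web activities

Since the original settings were shown as screenshots, here is a minimal, hypothetical sketch of the two kinds of web activities involved. The vault name and secret names are placeholders, the api-version shown is one current Key Vault REST API version, and the Data Factory's managed identity is assumed to have Get and Set permissions on the vault's secrets.

Reading a secret (for example, the refresh token) from Key Vault:

    {
      "name": "Get Refresh Token",
      "type": "WebActivity",
      "typeProperties": {
        "url": "https://<your-vault>.vault.azure.net/secrets/RefreshToken?api-version=7.3",
        "method": "GET",
        "authentication": { "type": "MSI", "resource": "https://vault.azure.net" }
      }
    }

Writing the new access token back as a new secret version:

    {
      "name": "Save Access Token",
      "type": "WebActivity",
      "typeProperties": {
        "url": "https://<your-vault>.vault.azure.net/secrets/AccessToken?api-version=7.3",
        "method": "PUT",
        "body": { "value": "@{activity('Refresh Access Token').output.access_token}" },
        "authentication": { "type": "MSI", "resource": "https://vault.azure.net" }
      }
    }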