Category Archives: Power Query
How we designed & deployed an Income Pipeline Report for a Texas, U.S.-based Cybersecurity & AI Business Solutions Firm, via MS D365 Project Operations and Power BI.
Summary

- Designed a two-page Power BI Income Pipeline Report for a Texas-based Cybersecurity & AI Business Solutions firm using Microsoft Dynamics 365 Project Operations.
- Unified visibility across Opportunity, Unbilled Income, Billed Income, and Paid Income in a single view.
- Introduced Average Turnaround to forecast realistic cash collection timelines based on actual payment behaviour.
- Integrated Dynamics 365 Project Operations with QuickBooks to connect sales, delivery, invoicing, and cash collection.
- Enabled a 17-week rolling revenue forecast with week-by-week cash visibility.
- Provided dual invoice statuses for contractual vs. realistic payment tracking.

Table of Contents
1. Introduction
2. The Business Problem
3. Report Structure Overview
4. The Income Pipeline
5. Project Revenue Forecast
6. Design Principles
7. Business Impact
8. FAQs
9. Conclusion

1. Introduction

Managing revenue across a professional services firm is rarely straightforward. When your business spans cybersecurity assessments, AI-driven solutions, and long-term managed services engagements, the gap between work being delivered and cash actually landing in the bank can be wide – and costly if left unmonitored. This is precisely the challenge we set out to solve for a U.S.-based Cybersecurity and AI Business Solutions firm running their operations on Microsoft Dynamics 365 Project Operations. The result was a two-page Power BI report – the Income Pipeline Report – that gives leadership a real-time, end-to-end view of every dollar moving through the business: from early-stage opportunity, through unbilled and billed income, all the way to cash collected. This post walks through how the report was built, how each data layer was modelled, and why the design decisions were made the way they were.

2. The Business Problem

The firm needed clarity across four distinct but connected stages of their revenue lifecycle:
- Sales opportunities and pipeline value
- Delivered but unbilled work
- Outstanding invoices and expected payments
- Actual vs. expected payment behaviour

Addressing these stages meant answering the following questions: Where are active sales opportunities sitting, and how much pipeline value do they represent? Which project work has been delivered but not yet invoiced? Which invoices have been raised and sent to clients, and when are they realistically going to be paid? And finally, how does actual payment behaviour compare against what was expected? Each of these questions existed in isolation before. Project managers had partial visibility into their own contracts but needed a comprehensive bird's-eye view across all of them. Finance had QuickBooks data but lacked the context of the delivery pipeline. Leadership had no consolidated view. The Income Pipeline Report brought all of this together in a single, navigable Power BI experience.

3. Report Structure Overview

The report consists of two pages:
- Income Pipeline Report – a high-level pipeline view across four stages: Opportunity, Unbilled Income, Billed Income, and Paid Income, each with summary cards and interactive donut charts.
- Project Revenue Forecast – a time-distributed breakdown of expected cash collection across a rolling 17-week horizon, organised by customer and contract.
4. The Income Pipeline

The Four-Stage Pipeline Banner
Across the top of the report, four chevron-style stage indicators trace the revenue journey: Opportunity → Unbilled Income → Billed Income → Paid Income.
- Each stage includes a summary card showing record count and total value
- Provides immediate visibility into where revenue is sitting
- Highlights potential bottlenecks across the pipeline

Stage 1 – Opportunity
- Data sourced from Dynamics 365 Sales using the Business Process Flow (BPF)
- Uses the active BPF stage (Develop, Propose, Close) instead of static fields, ensuring an accurate reflection of real sales progression
- Estimated revenue pulled directly from opportunity records
- Donut chart shows the distribution across the Develop, Propose, and Close stages

Stage 2 – Unbilled Income
- Represents contracted or delivered work not yet invoiced
- Sourced from project contract lines in Dynamics 365 Project Operations
- Includes Fixed Fee milestones (explicit values) and Time & Material (T&M) estimates based on resource allocations
- T&M calculated as allocated hours × billing rate, clearly marked as estimated until a billing run is executed
- Grouped into payment expectation buckets (30, 60, 90, 120, 180+ days)
- Uses Average Turnaround to forecast realistic payment timing

Stage 3 – Billed Income (Confirmed Invoices)
- Combines Dynamics 365 Project Operations and QuickBooks data
- Tracks invoices that are confirmed and sent to clients
- Introduces Average Turnaround: the average number of days from invoice creation to payment, based on historical payment behaviour (a small Power Query sketch of this calculation appears after the Design Principles section)
- Each invoice carries two statuses: Contractual (due date) and Estimated (based on Average Turnaround), giving realistic vs. contractual payment visibility
- Includes due-date-based categorisation and estimated overdue analysis, preventing misleading insights from strict payment terms alone

Stage 4 – Paid Income
- Tracks fully collected invoices, using QuickBooks for actual payment dates
- Groups payments by time bands (under 30, 60, 90 days, etc.)
- Enables comparison between actual and estimated payment behaviour, continuously improving the accuracy of Average Turnaround

Tooltip Drill-Down
Hovering shows the payment band, record count, and total value; drill-through is available for detailed record-level analysis.

5. Project Revenue Forecast

Overview
- Distributes expected cash collection across a rolling 17-week window
- Shifts the view from pipeline stage to time-based forecasting

Hierarchy and Structure
- Customer → Contract → Revenue Type
- Revenue types include T&M run schedules, Fixed Fee milestones, and confirmed invoices
- Each row shows customer, contract, billing type, Average Turnaround, and value mapped to the expected payment week
- Weeks range from Week 0 to Week 16; the top row aggregates total expected cash per week

Colour Coding
- Amber – unbilled income
- Green – invoice within terms
- Red – overdue (based on estimated payment date)

Drill-Through to Detail
Click any row to view a detailed breakdown, including billed invoices with due and estimated dates, plus unbilled milestones and run schedules – connecting the high-level forecast to transactional detail.

6. Design Principles
- Average Turnaround over payment terms: reflects actual customer behaviour instead of contractual assumptions.
- Dual invoice status: provides both contractual and realistic payment visibility.
- Consistent time buckets: ensures comparability across the Opportunity, Unbilled, Billed, and Paid stages.
- Weekly forecasting instead of monthly: supports short-term cash flow planning aligned with operational rhythm.
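As referenced in Stage 3, here is a minimal Power Query sketch of how an Average Turnaround figure can be derived from paid invoices. The table, column names (Customer, InvoiceDate, PaymentDate) and sample rows are illustrative assumptions rather than the client's actual model; in the real report the input comes from the QuickBooks payment data.

let
    // Hypothetical paid-invoice data; in the real model this would come from QuickBooks.
    PaidInvoices = #table(
        type table [Customer = text, InvoiceDate = date, PaymentDate = date],
        {
            {"Contoso", #date(2024, 1, 5), #date(2024, 2, 20)},
            {"Fabrikam", #date(2024, 1, 10), #date(2024, 1, 28)}
        }
    ),
    // Days between invoice creation and the actual payment.
    WithDays = Table.AddColumn(PaidInvoices, "DaysToPay",
        each Duration.Days([PaymentDate] - [InvoiceDate]), Int64.Type),
    // Average Turnaround per customer, later used to estimate realistic payment dates.
    AverageTurnaround = Table.Group(WithDays, {"Customer"},
        {{"AvgTurnaroundDays", each List.Average([DaysToPay]), type number}})
in
    AverageTurnaround

Per-customer averages produced this way can then be added to each open invoice's creation date to derive the Estimated payment status described in Stage 3.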
7. Business Impact
- Improved cash flow predictability
- Earlier visibility of at-risk invoices
- Unified cross-team visibility
- Improved T&M billing discipline
- Increased accountability

8. FAQs
What is Average Turnaround and why does it …
How a US-Based Food Distributor Used Power BI to Reduce Wastage and Gain Global Supply Chain Visibility
Summary

In global food distribution, timing is everything. A delay in decision-making can lead to stock shortages, food wastage, or supply chain disruptions. Are you struggling to get real-time visibility into your distribution network? With increasing data from warehouses, suppliers, and logistics, making quick and accurate decisions becomes challenging. In this blog, I'll share how we used Microsoft Power BI to accelerate decision-making in a global food distribution scenario.

Why Decision-Making is Critical in Food Distribution
Global food distribution involves:
Without proper insights:
Organizations need a system that provides real-time visibility and actionable insights.

Challenges Faced
In our implementation, we observed:
This made decision-making slow and reactive rather than proactive.

Solution Using Power BI
We built a centralized reporting solution using Power BI that:

Key Features of the Dashboard
- Real-Time Inventory Tracking
- Regional Demand Analysis
- Shipment Monitoring
- Performance Metrics

Step-by-Step Approach
Step 1: Data Integration
Step 2: Data Modeling
Step 3: Dashboard Development
Step 4: Performance Optimization
Step 5: Deployment & Sharing

Real-World Impact
After implementing the solution:
- Decision-making time reduced significantly
- Improved visibility across global operations
- Faster identification of supply-demand gaps
- Reduced food wastage
- Increased operational efficiency
Managers could now make decisions in minutes instead of hours.

Best Practices
- Centralize data sources
- Use interactive dashboards for quick insights
- Focus on business KPIs
- Optimize data models for performance
- Ensure data accuracy and consistency

Conclusion
In global food distribution, speed and accuracy in decision-making are crucial. By leveraging Power BI, organizations can transform scattered data into meaningful insights and act faster. A well-designed dashboard not only improves visibility but also empowers teams to make proactive decisions, reducing risks and improving efficiency.

Call to Action
If your organization is struggling with slow decision-making in supply chain operations, start by building a centralized reporting solution in Power BI. Identify your key metrics, integrate your data, and create dashboards that drive real-time insights. The right data at the right time can make all the difference. Connect with CloudFronts to get started at transform@cloudfronts.com.
Optimizing Power BI Dataset Performance Using Incremental Refresh for Large-Scale Analytics.
Summary

Use Case / Why This Matters

Prerequisites
Before implementing incremental refresh in Microsoft Power BI, ensure the following:

Step-by-Step Implementation

Step 1: Create Parameters (RangeStart & RangeEnd)
This step defines the data boundaries for incremental refresh. These parameters control which data gets refreshed.

Step 2: Apply Filter in Power Query
This step filters the dataset using the parameters. Select your date column and apply the filter DateColumn >= RangeStart and DateColumn < RangeEnd. This ensures only relevant data is processed (a minimal M sketch of this filter appears at the end of this post).

Step 3: Enable Query Folding
This step ensures filtering happens at the data source level. Right-click the last applied step → View Native Query; if the option is available, query folding is enabled. Query folding is critical for performance optimization.

Step 4: Configure Incremental Refresh Policy
This step defines how much data to store and how much to refresh, and it creates partitions in the dataset.

Step 5: Publish to Power BI Service
This step activates incremental refresh in the cloud. After publishing, Power BI automatically manages the partitions.

Business Impact
Following the implementation, organizations achieved the following results:
- Dataset refresh time: 2–3 hours (full refresh) → 30–45 minutes
- Data processing load: entire dataset processed → only recent data processed
- Report performance: slow with large datasets → faster load and interaction
- System resource usage: high → optimized and controlled
Incremental refresh significantly improves scalability and ensures consistent performance for enterprise reporting.

To conclude, incremental refresh in Microsoft Power BI transforms how organizations handle large datasets by reducing refresh times and improving performance. By implementing proper data filtering, query folding, and refresh policies, businesses can scale their analytics without compromising speed. As data volumes continue to grow, adopting incremental refresh is no longer optional – it is essential for efficient and cost-effective reporting.

If your Power BI reports are slowing down due to large datasets, start implementing incremental refresh today. Begin by identifying your date columns, defining parameters, and configuring refresh policies. A small change can lead to massive performance improvements in your reporting environment.

We hope you found this blog useful. If you would like to learn more or discuss similar solutions, feel free to reach out to us at transform@cloudfronts.com.
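As referenced in Step 2, here is a minimal Power Query sketch of the RangeStart/RangeEnd filter. The server, database, table, and DateColumn names are placeholders rather than a real source, and the two parameters must be created as DateTime-type parameters for the incremental refresh policy to recognise them.

let
    // Placeholder source - swap in your own server and database.
    Source = Sql.Database("your-server.database.windows.net", "SalesDb"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // Filter on the date column using the RangeStart/RangeEnd parameters.
    // Using >= on one boundary and < on the other keeps a row from landing in two partitions.
    Filtered = Table.SelectRows(Orders,
        each [DateColumn] >= RangeStart and [DateColumn] < RangeEnd)
in
    Filtered

Because a filter like this typically folds back to a SQL source, the work happens at the database (Step 3), and the incremental refresh policy in Step 4 then partitions the data along these boundaries.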
Understanding VertiPaq Engine Internals for Better Power BI Performance Optimization
Summary

Prerequisites
Before diving into VertiPaq optimization, ensure you have:

Step-by-Step Understanding of VertiPaq Internals

Step 1: Columnar Storage Architecture
VertiPaq stores data in a columnar format instead of rows, enabling faster scanning and better compression. Impact: reduces query execution time significantly.

Step 2: Data Compression Techniques
VertiPaq applies advanced compression techniques. Impact: reduces memory footprint and improves performance.

Step 3: Segmentation and Partitions
VertiPaq divides data into segments for efficient processing. Impact: faster query execution and better scalability.

Step 4: Cardinality Optimization
Cardinality refers to the number of unique values in a column, and reducing it is one of the most effective optimizations. Best practices: a common example – splitting a datetime column into separate date and time parts – is sketched at the end of this post.

Step 5: Relationship and Model Design
Efficient relationships improve VertiPaq performance. Impact: reduces query complexity and improves performance.

Business Impact
Following optimization based on VertiPaq principles, organizations achieved:
- Report load time: 15–20 seconds → 5–8 seconds
- Dataset size: 1.5 GB → 600 MB
- Query performance: slow with complex models → optimized and responsive
- User experience: lagging dashboards → smooth interaction

To conclude, understanding the VertiPaq engine in Microsoft Power BI is key to unlocking high-performance analytics. By optimizing data models with proper structure, compression techniques, and relationships, organizations can achieve faster insights and scalable reporting. As datasets grow in size and complexity, mastering VertiPaq internals becomes essential for every Power BI developer and data professional.

If you want to build high-performance Power BI reports, start by analyzing your data model and optimizing it based on VertiPaq principles. A small improvement in data structure can lead to massive gains in performance.

We hope you found this blog useful. If you would like to learn more or discuss similar solutions, feel free to reach out to us at transform@cloudfronts.com.
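As mentioned in Step 4, splitting a high-cardinality datetime column into separate date and time columns before load is a common cardinality reduction, since each part has far fewer distinct values than the combined timestamp. The query below is a minimal, self-contained Power Query sketch with made-up table and column names.

let
    // Hypothetical fact table with a high-cardinality timestamp column.
    Sales = #table(
        type table [OrderId = Int64.Type, OrderTimestamp = datetime],
        {
            {1, #datetime(2024, 3, 1, 9, 15, 0)},
            {2, #datetime(2024, 3, 1, 17, 42, 0)}
        }
    ),
    // Split the timestamp so VertiPaq can dictionary-encode each part more efficiently.
    WithDate = Table.AddColumn(Sales, "OrderDate", each DateTime.Date([OrderTimestamp]), type date),
    WithTime = Table.AddColumn(WithDate, "OrderTime", each DateTime.Time([OrderTimestamp]), type time),
    // Remove the original high-cardinality column before it reaches the model.
    Result = Table.RemoveColumns(WithTime, {"OrderTimestamp"})
in
    Result

If the report only needs day-level analysis, dropping the time column altogether reduces cardinality even further.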
Overcoming Dataverse Connector Limitations: The Power Automate Approach to Exporting Hidden Tables
Working with the Microsoft Dataverse connector in Power BI is usually straightforward – until you encounter a table that simply refuses to load any rows, even though the data clearly exists in the environment. This happens especially with hidden, virtual, or system-driven tables (e.g. msdyn_businessclosure, msdyn_scheduleboardsetting), which are commonly used in Field Service and Scheduling scenarios. Before jumping to a workaround, it's important to understand why certain Dataverse tables don't load in Power BI, what causes this behavior, and why the standard Dataverse connector may legitimately return zero rows.

Causes
1] The table is a virtual or system table with restricted access: system-managed Dataverse tables like msdyn_businessclosure are not exposed to the Dataverse connector because they support internal scheduling and platform functions.
2] No records exist in the root business unit: data owned by child business units is not visible to Power BI accounts associated with a different BU, resulting in zero rows returned.
3] The table is not included in the standard Dataverse connector: some solution-driven or non-standard tables are omitted from the Dataverse connector's supported list, so Power BI cannot load them.

Solution: Export Dataverse Data Using Power Automate + Excel Sync
Since Power BI can read OneDrive-hosted files, Excel files, and SharePoint-hosted spreadsheets, a suitable workaround is to extract the restricted Dataverse table into Excel using a Power Automate flow – scheduled when the records are few, or Dataverse-triggered when there are many records and you only want a single one, to avoid pagination.

What it can do:
- Power Automate can access system-driven tables.
- Excel files in SharePoint can be refreshed by the Power BI Service.
- We can bypass connector restrictions entirely.
- The method works even if entities have hidden metadata or internal platform logic.

This ensures:
- Consistent refresh cycles
- Full visibility of all table rows
- No dependency on Dataverse connector limitations

Use case
I needed to use the Business Closures table (Dataverse entity: msdyn_businessclosure) for a few calculations and visuals in a Power BI report. However, when I imported it through the Dataverse connector, the table consistently showed zero records, even though the data was clearly present inside Dynamics 365. There are two possible reasons for this:
1] It is a system/platform table: msdyn_businessclosure is a system-managed scheduling table, and system tables are often hidden from external connectors, causing Power BI to return no data.
2] The table is not included in the "standard tables" exposed to Power BI: many internal Field Service and scheduling entities are excluded from the Dataverse connector's metadata, so Power BI cannot retrieve their rows even if they exist.

So here, we fetch the records via a "List rows" action in Power Automate and write them to an Excel file, bypassing the limitations that hide that data without compromising user privileges or security roles; we can also control or filter the rows directly at the source before they reach the Power BI report.

Automation steps
1] Select a suitable trigger to fetch the rows of that entity (recurrence or Dataverse trigger, whichever is suitable).
2] List the rows from the entity (sort/filter/select/expand as necessary).
3] Perform any preparatory logic (e.g. clearing the existing rows) on the Excel file where the data will be written.
4] For each row in the Dataverse entity, select a primary key (e.g.
the GUID), provide the path to the particular Excel file (e.g. SharePoint → Location → Document Library → File Name → Sheet or Table in the Excel file), and assign the dynamic values of each row to the columns in the Excel file.
5] Once this is done, import it into the Power BI report using suitable Power Query logic in the Advanced Editor, as follows.

a) Loading the Excel file from SharePoint using Web.Contents():
Source = Excel.Workbook(Web.Contents("https://<domain>.sharepoint.com/sites/<Location>/Business%20Closures/msdyn_businessclosures.xlsx"), null, true),

What this step does:
- Uses Web.Contents() to access an Excel file stored in SharePoint Online.
- The URL points directly to the Excel file msdyn_businessclosures.xlsx inside the SharePoint site.
- Excel.Workbook() then reads the file and returns a structured object containing all sheets, tables, and named ranges.
Parameters used: the second argument (null) leaves the optional useHeaders setting unspecified – the named table extracted below already carries its own headers – and the third argument (true) sets delayTypes, deferring automatic type detection.

b) Extracting a table named "Table1" from the workbook:
msdyn_businessclosures_Sheet = Source{[Item="Table1", Kind="Table"]}[Data],

This searches inside the Source object (which includes all workbook elements) for an element where Item = "Table1" (the name of the table in the Excel file) and Kind = "Table" (ensuring it selects a table, not a sheet with the same name), then extracts only the Data portion of that element. As a result, we get a Power Query table containing the exact contents of Table1 inside the Excel workbook, to which we can apply further logic – filtering, cleaning, etc. (a consolidated version of the full query is sketched at the end of this post).

To conclude, when Dataverse tables refuse to load through the Power BI Dataverse connector – especially system-driven entities like msdyn_businessclosure – the issue is usually rooted in platform-level restrictions, connector limitations, or hidden metadata. Instead of modifying these constraints, offloading the data through Power Automate → Excel → Power BI provides a controlled, reliable, and connector-independent integration path. By automating the extraction of Dataverse rows into an Excel file stored in SharePoint or OneDrive, you ensure consistent refresh cycles, full visibility of all table rows, and no dependency on Dataverse connector limitations. This method is simple to build, stable to maintain, and flexible enough to adapt to any Dataverse table – whether standard, custom, or system-managed. For scenarios where Power BI needs insights from hidden or restricted Dataverse tables, this approach remains one of the most practical and dependable solutions.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
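As referenced above, here is the full query with both steps combined into a single let expression. The SharePoint URL and the table name Table1 are the same placeholders used in the walkthrough, and the final typing step – with made-up column names StartTime and EndTime – only illustrates where further cleanup would go.

let
    // Read the Excel workbook that the Power Automate flow keeps up to date in SharePoint.
    Source = Excel.Workbook(
        Web.Contents("https://<domain>.sharepoint.com/sites/<Location>/Business%20Closures/msdyn_businessclosures.xlsx"),
        null,
        true
    ),
    // Pick the named table rather than a worksheet, so the headers come through cleanly.
    BusinessClosures = Source{[Item = "Table1", Kind = "Table"]}[Data],
    // Example follow-up step: set the data types you rely on downstream (column names are assumptions).
    Typed = Table.TransformColumnTypes(BusinessClosures,
        {{"StartTime", type datetime}, {"EndTime", type datetime}})
in
    Typed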
How to Build a Scorecard in Power BI
What Is a Scorecard in Power BI?
A Scorecard is a visual performance monitoring tool that allows you to track key metrics (goals) against predefined targets. Power BI's Metrics (formerly Goals) feature helps you:

Why Use Scorecards?
Here's why Scorecards are powerful for any team:
- Goal Alignment: track KPIs aligned to strategic objectives.
- Accountability: assign owners and collaborators for each goal.
- Real-time Tracking: monitor progress with live metrics.
- Visual Reporting: easy-to-read dashboards and history tracking.

Step-by-Step: How to Build a Scorecard in Power BI

Step 1: Navigate to Power BI Service
Go to the Power BI Service and choose the workspace where you want to create your Scorecard (Premium or Pro workspaces only).

Step 2: Create a New Scorecard
You'll now land on a blank Scorecard canvas.

Step 3: Add Metrics to the Scorecard
You can connect each metric to an existing Power BI dataset or manually input values.

Step 4: Link Metrics to Data (Optional but Recommended)
To automate tracking:
This ensures your Scorecard updates automatically with data refreshes.

Step 5: Customize the Scorecard
You can also create hierarchies – group related goals under broader objectives.

Step 6: Share & Collaborate
Once your Scorecard is built:

To conclude, Power BI Scorecards turn your data into action. They help track goals in real time, assign ownership, and keep teams focused on what matters most. Whether you're managing a sales team, a project, or company-wide objectives, Power BI Scorecards are a game-changer for performance tracking.

Want to bring visibility and accountability to your team goals? Head to the Power BI Service and start building your first Scorecard today! Need help connecting metrics to your datasets? Reach out, and we'll guide you step by step. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Bridge Your Database and Dataverse: Complete Integration Guide
Modern applications demand seamless, real-time data access. Microsoft Dataverse – the data backbone of the Power Platform – makes it easier to build and scale low-code apps, but your enterprise data often resides in legacy databases. Connecting a database to Dataverse enables automation, reporting, and app-building capabilities using the Power Platform ecosystem. In this blog, we'll walk through how to connect a traditional SQL database (Azure SQL or on-premises) to Microsoft Dataverse.

What is Dataverse?
Dataverse is Microsoft's cloud-based data platform, designed to securely store and manage the data used by business applications. It is tightly integrated with Power Apps, Power Automate, and Dynamics 365.

Key Features:

Why Connect Your Database to Dataverse?

Step-by-Step Guide: Connecting a Database to Dataverse
Step 1: Open Power Apps and select the proper environment.
Step 2: Open Dataflows in Power Apps and create a new Dataflow.
Step 3: Connect to the database using the SQL Server database connector.
Step 4: Add the required credentials to establish the connection between the database and Dataverse.
Step 5: Map the columns correctly and identify the unique ID (key column) of the table.
Step 6: Set the scheduled refresh and publish the Dataflow.
Step 7: Once the Dataflow is published, the table appears in Power Apps.

To conclude, connecting your database to Dataverse amplifies the power of your data, enabling app development, automation, and reporting within a unified ecosystem. Whether you need real-time access or periodic data sync, Microsoft offers flexible and secure methods to integrate databases with Dataverse. Start exploring virtual tables or dataflows today to bridge the gap between your existing databases and the Power Platform. Want to learn more? Check out our related guides on Dataverse best practices and virtual table optimization. We hope you found this blog useful. If you would like to discuss anything further, please reach out to us at transform@cloudfronts.com.
How to Trim and Remove Spaces from Multiple Columns in Power Query
Efficient data cleaning is a crucial step in any data preparation process, and Power Query makes it easy to handle common tasks like trimming and removing unnecessary spaces with functions that you can apply across multiple columns and queries at once. By creating and invoking a function, you can quickly trim and remove spaces from all the columns and tables you need, saving time and effort. In this blog, we'll show you how to use Power Query functions to streamline your data-cleaning process.

The Power Query function we will use to trim text in columns is:

(text as text, optional char_to_trim as text) =>
let
    char = if char_to_trim = null then " " else char_to_trim,
    split = Text.Split(text, char),
    removeblanks = List.Select(split, each _ <> ""),
    result = Text.Combine(removeblanks, char)
in
    result

This Power Query function takes text as input and removes extra spaces or a specified character from a text string. It splits the text into parts, filters out empty strings, and recombines the cleaned parts using the specified character. If no character is provided, it defaults to removing spaces.

The Power Query function we will use to remove spaces from the text is:

(InputTxt as text) =>
let
    Clendata = Text.Combine(List.Select(Text.Split(Text.Trim(InputTxt), " "), each _ <> ""), "")
in
    Clendata

This function removes all spaces from a given text string. It trims the input, splits it by spaces, filters out blanks, and then combines the parts into a single string. The result is a clean, space-free text, ideal for standardized data preparation.

Now that the Power Query functions are ready, we can use them across multiple columns or datasets. To do so, go to Add Column > Invoke Custom Function > Your Power Query Function (a Table.TransformColumns alternative is sketched at the end of this post).

To conclude, cleaning and transforming data in Power Query becomes much easier and more efficient with the use of custom functions. Whether you need to remove spaces, clean multiple columns, or standardize text, these functions save time and ensure consistency across your dataset. By applying these techniques, you can handle large, messy datasets with ease, making your data ready for analysis or reporting. Start implementing these simple yet powerful methods today to streamline your data preparation process! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
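As referenced above, beyond Add Column > Invoke Custom Function you can also apply the function to several existing columns in place with Table.TransformColumns. The sketch below assumes the first function has been saved as a query named fnTrimText and that the columns to clean are called Customer Name and City – both are placeholders for your own names.

let
    // Hypothetical table with untidy text columns.
    Source = #table(
        type table [#"Customer Name" = text, City = text],
        {
            {"  Contoso   Ltd ", " New   York "},
            {" Fabrikam  ", "  Austin "}
        }
    ),
    // Run the custom trim function over each listed column, replacing the values in place.
    Cleaned = Table.TransformColumns(Source, {
        {"Customer Name", each fnTrimText(_, " "), type text},
        {"City", each fnTrimText(_, " "), type text}
    })
in
    Cleaned

The same pattern scales to any number of columns: add another {"Column", each fnTrimText(_, " "), type text} entry to the list for each column you want to clean.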
How to Apply Row Level Security in Power BI
In today's data-driven world, security is a top priority. As organizations rely on Power BI for analytics and reporting, ensuring that users only see data relevant to their roles is crucial. This is where Row-Level Security (RLS) comes into play. RLS allows you to restrict access to data at the row level based on user roles. In this blog, we'll guide you through the process of implementing RLS in Power BI, ensuring your data is both secure and personalized for every user.

What is Row-Level Security (RLS)?
Row-Level Security is a feature in Power BI that enables you to control access to rows of data based on user roles. By applying RLS, you ensure that users see only the data relevant to their responsibilities, preventing unauthorized access.

Why is RLS Important?

Step 1: Open Power BI, go to the Modeling tab, and click Manage roles.
Step 2: Add a new role, select the appropriate table, and filter the required data. Here the filter is based on region, so the selected user is given access to the East region only.
Step 3: Check the role either in the Power BI Desktop app (using "View as") or after publishing the report to the service.
Step 4: Now remove the "View as" role in the desktop, publish the report to the service, and give users access to the roles as required.

Conclusion:
Row-Level Security is an indispensable tool for ensuring data security and personalization in Power BI. By restricting access to data based on roles, you can enhance user experiences, improve compliance, and safeguard sensitive information. Ready to secure your Power BI reports with Row-Level Security? Start by identifying your data access requirements and defining roles in Power BI Desktop. If you need expert guidance, feel free to reach out at transform@cloudfronts.com or explore more Power BI tips on our blog.
How to Add and Customize Tooltips in Power BI
In Power BI, tooltips are an effective way to provide additional context and detail about your data. With just a hover, users can view insights that deepen their understanding of a visualization without overwhelming the main report page. Whether you're a beginner or an experienced developer, learning how to add and customize tooltips in Power BI can significantly improve your report's interactivity and user experience. This blog will guide you through the process, offering tips to create tooltips that are both informative and visually appealing.

1. What Are Tooltips in Power BI?
Tooltips are pop-up details that appear when users hover over a data point in a visualization. They can display additional information about the data, such as summary statistics, comparisons, or related insights.

2. Why Use Tooltips?

3. Step-by-Step Procedure
Step 1: Open the Power BI report and create a visual.
Step 2: Create a new page in Power BI, then go to Visualizations → Format your report page → Canvas settings and select the Tooltip option.
Step 3: Add the related visual that you want to show as the tooltip to this new page.
Step 4: Click the visual on the main page where the tooltip should appear, turn on the Tooltip option, and select the page where you added the tooltip.
Step 5: Review the final look of the visualization.

Conclusion:
Tooltips are a powerful feature in Power BI that can elevate the interactivity and usability of your reports. By adding custom tooltips, you can provide deeper insights without compromising the clarity of your main visuals. Following these steps and best practices will help you create tooltips that enhance your report's overall impact. Ready to enhance your Power BI reports with custom tooltips? Start by experimenting with a simple tooltip page in your existing report. For more Power BI tips and tricks, explore our other blogs or reach out to us at transform@cloudfronts.com.