Tag Archives: power bi
Building Real-Time Dashboards with Azure Stream Analytics and Power BI
Real-time dashboards are essential for monitoring live data and gaining instant insights into business operations. Azure Stream Analytics and Power BI provide an efficient way to process and visualize streaming data. In this blog, we will walk through the steps to build a real-time dashboard using these tools, with illustrative images to guide you.
Why Real-Time Dashboards Are Needed
In today's fast-paced world, businesses need to make decisions quickly based on live data, and real-time dashboards give organizations that visibility the moment events occur.
Use Cases for Real-Time Dashboards
Real-time dashboards can be applied across various industries.
Prerequisites
Before we begin, ensure you have the required Azure and Power BI resources in place.
Step 1: Set Up Your Data Source
Performance Optimization Techniques in Power BI
Introduction
Building efficient Power BI reports can be challenging, especially when working with large datasets. One common issue Power BI users encounter is the "stack overflow" error, which can disrupt the report-building process. In this blog I will share some performance optimization techniques that you can use when building Power BI reports.
When using Power Query or importing data, you might have seen this error: "Expression.Error: Evaluation resulted in a stack overflow and cannot continue." It occurs when a large amount of data is being imported or there is not enough memory available for Power BI to complete the operation. The issue can be resolved by increasing the memory and CPU cores that Power BI is allowed to use during queries and evaluations.
There are two settings to keep in mind. By default, the maximum number of simultaneous evaluations is equal to the number of logical CPU cores on the machine, and the maximum memory used per simultaneous evaluation is 432 MB. Personally, I keep these values between the default and the maximum, depending on my requirements and the machine. Microsoft's recommendations for managing Power BI workload and evaluation configurations are here: https://learn.microsoft.com/en-us/power-bi/create-reports/desktop-evaluation-configuration
Conclusion
Optimizing performance in Power BI is crucial for handling large datasets and preventing issues like the "stack overflow" error. By adjusting the settings for simultaneous evaluations and memory allocation, you can significantly improve report processing and responsiveness. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Gain Business Insights faster by generating Power BI Reports quickly with just 1 click in Dataverse
Hi All, I'm going to show a useful feature that you can leverage to view and create instant Power BI visuals that are generated automatically based on the current view.
Documentation Link
Just an example of how it looks:
Steps to achieve this:
Step 1: Enable this feature in the Model-Driven App itself: Edit Model-Driven App -> Settings -> Features -> 'Enable Power BI quick report visualization on a table'. Save and publish the settings.
Note: You also need the 'TDS endpoint' enabled in the environment feature settings.
Step 2: Refresh your browser and navigate to any table records view (I took Cases in this example).
Step 3: Click the 'Visualize this view' button on the command bar.
Note: You need to add the necessary columns to the current view if you want those columns to show in the Power BI report.
Step 4: You can now see the report generated automatically within a few minutes. You can save these reports if all the necessary information is displayed here.
Hope this helped you get faster business insights with auto-generated Power BI visuals.
How to create Date table using M query
You might have seen other ways to create a date table in Power BI with M, such as adding two custom columns for a start date and an end date and then using a date-range expression to populate the dates between them. In this blog, however, we will look at an M query that builds the date table from a user-defined start and end date supplied as parameters.
Step 1: Open the Power BI Desktop application and select Transform Data.
Step 2: Click New Source, then select Blank Query from the dropdown.
Step 3: Click Advanced Editor and paste the query (a sample is sketched at the end of this post).
Step 4: After clicking OK, you will see input fields to enter the start and end date.
Step 5: Enter the dates required and click Invoke. You can see that a new table is created for the given date range.
Hope this helps you!! Thank You
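The exact query from the original post is not reproduced above, so here is a minimal sketch of a parameterized date-table function in M; the parameter names StartDate and EndDate and the extra Year/Month columns are illustrative assumptions, not the author's original query.

(StartDate as date, EndDate as date) as table =>
let
    // Number of days in the range, inclusive of both ends
    DayCount = Duration.Days(EndDate - StartDate) + 1,
    // One date per day, starting at StartDate
    DateList = List.Dates(StartDate, DayCount, #duration(1, 0, 0, 0)),
    // Turn the list into a single-column table
    AsTable = Table.FromList(DateList, Splitter.SplitByNothing(), {"Date"}),
    TypedTable = Table.TransformColumnTypes(AsTable, {{"Date", type date}}),
    // A couple of common attribute columns
    AddYear = Table.AddColumn(TypedTable, "Year", each Date.Year([Date]), Int64.Type),
    AddMonth = Table.AddColumn(AddYear, "Month", each Date.Month([Date]), Int64.Type)
in
    AddMonth

Pasting a function like this into a blank query is what produces the start/end date input fields and the Invoke button described in Steps 4 and 5.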
Full Outer join Using DAX in Power BI
Thinking of a full outer join in Power BI, what comes to mind first? How can we achieve a full outer join in Power BI? The common answer is the Merge Queries option in the Power Query window. However, we can also use DAX to achieve a full outer join:
Full Outer Join = Left Outer Join + Right Anti Join
Customer Table:
Order Table:
Click on New Table and write the DAX for the full outer join of the Customer and Order tables (a sketch is given at the end of this post).
Result:
Hope this helps!! Thank You!!
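The DAX from the original post is not preserved above. As an illustration only, here is a minimal sketch of the "left outer join + right anti join" pattern as a calculated table, assuming a Customer table with exactly the columns CustomerID and Customer Name, and an Order table with columns CustomerID, OrderID and Amount (these table and column names are assumptions):

Full Outer Join =
VAR LeftOuter =
    // Left outer join: every customer, paired with its matching order columns (blank when no order exists)
    GENERATEALL (
        Customer,
        VAR CurrentCustomer = Customer[CustomerID]
        RETURN
            SELECTCOLUMNS (
                FILTER ( 'Order', 'Order'[CustomerID] = CurrentCustomer ),
                "OrderID", 'Order'[OrderID],
                "Amount", 'Order'[Amount]
            )
    )
VAR RightAnti =
    // Right anti join: orders whose CustomerID has no match in Customer
    SELECTCOLUMNS (
        FILTER ( 'Order', NOT ( 'Order'[CustomerID] IN VALUES ( Customer[CustomerID] ) ) ),
        "CustomerID", 'Order'[CustomerID],
        "Customer Name", BLANK (),
        "OrderID", 'Order'[OrderID],
        "Amount", 'Order'[Amount]
    )
RETURN
    // UNION matches columns by position, so both variables expose the same four columns in the same order
    UNION ( LeftOuter, RightAnti )

GENERATEALL keeps customers without orders (the left outer part), and the filtered Order rows supply the right anti part, so together they behave like a full outer join.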
Display Horizontal Page tab in Power BI web
In the Power BI Desktop app, and when we edit a report in a Power BI workspace, the page tabs are visible at the bottom, but when we publish the report to the Power BI service they appear on the left side. We can change the tab position in Power BI on the web as well; to do that, follow the steps below.
1. Go to the report settings.
2. Enable the Pages pane option and save the changes.
When we open the report again, the page tabs will be visible at the bottom.
Hope this helps!
Add rows to Power BI dataset for date range
Hi, in this blog we will see how to add rows to a Power BI dataset for a given date range. For example, if you have a dataset with a start and end date and you want a row for each date in that range, this blog will help you.
Step 1: Open Power BI, load your dataset, and go to Transform Data. Right-click your From date column and change its type to Date.
Step 2: On the Add Column tab, click Custom Column.
Step 3: In the custom column formula, add the following code: { Number.From([From])..Number.From([To]) }
Step 4: Expand this column to new rows to get your result.
Step 5: Change the data type of this column to Date.
In this way you can prepare your dataset with a row for every date between your From and To columns (a fuller sketch of these steps is shown below). Hope this helps.
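For reference, here is a minimal end-to-end sketch of these steps as a single M query; the sample Source table and its Project column are made up purely for illustration, and you would point the query at your own dataset instead:

let
    // Sample input standing in for your own dataset, with a From and a To date per row
    Source = #table(
        type table [Project = text, From = date, To = date],
        { { "A", #date(2024, 1, 1), #date(2024, 1, 3) },
          { "B", #date(2024, 2, 10), #date(2024, 2, 11) } }
    ),
    // Step 3: build a list of day numbers from From to To (dates are stored as whole-number serials)
    AddDateList = Table.AddColumn(Source, "Date", each { Number.From([From]) .. Number.From([To]) }),
    // Step 4: expand the list so every date gets its own row
    ExpandDates = Table.ExpandListColumn(AddDateList, "Date"),
    // Step 5: convert the serial numbers back to the date type
    TypedDates = Table.TransformColumnTypes(ExpandDates, {{ "Date", type date }})
in
    TypedDates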
Connect Azure Databricks to Power BI
Open Power BI, click Get Data, search for Azure Databricks, and click Connect. It will ask for the following details:
Server Hostname
HTTP Path
Now let's see how to get these details. Go to Azure Databricks and click on Clusters. Once the cluster is open, go to Advanced Options > JDBC/ODBC. There you can find the Server Hostname and HTTP Path to use in the step above.
Fill in the details and click OK. It will ask for user credentials, and then a pop-up opens asking you to select from the list of tables. Select the tables you need and click Load.
In this way we can create a Power BI report based on the current data received from Azure Databricks and build visuals on top of it.
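For completeness, the connection that the Get Data flow builds can also be expressed in M. The snippet below is an assumption based on the Azure Databricks connector (the Databricks.Catalogs function and its options record may differ by connector version), and the hostname and HTTP path are placeholders you would replace with the values from the JDBC/ODBC settings:

let
    // Server Hostname and HTTP Path copied from the cluster's JDBC/ODBC page (placeholder values here)
    Source = Databricks.Catalogs(
        "adb-1234567890123456.7.azuredatabricks.net",
        "sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh",
        [Catalog = null, Database = null]
    )
in
    Source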
DAX For Relationships in Power BI
Hi everyone, in this blog we will look at the DAX functions used to define or make use of the relationships between two tables. In Power BI there are two types of relationships:
1. One to One (1:1)
2. One to Many (1:*)
Now let's look at the DAX functions we can use with these relationships.
USERELATIONSHIP - Specifies the relationship to be used in a specific calculation as the one that exists between columnName1 and columnName2.
Syntax: USERELATIONSHIP(<columnName1>, <columnName2>)
Where:
columnName1 - The name of an existing column, using standard DAX syntax and fully qualified, that usually represents the many side of the relationship to be used; if the arguments are given in reverse order the function will swap them before using them. This argument cannot be an expression.
columnName2 - The name of an existing column, using standard DAX syntax and fully qualified, that usually represents the one side or lookup side of the relationship to be used; if the arguments are given in reverse order the function will swap them before using them. This argument cannot be an expression.
Key point: The function returns no value; it only enables the indicated relationship for the duration of the calculation.
Example: = CALCULATE(SUM(Sales[SalesAmount]), USERELATIONSHIP(Sales[ShippingDate], DateTime[Date]))
Limitations: USERELATIONSHIP can only be used in functions that take a filter as an argument, and it cannot be used when row-level security is defined for the table in which the measure is included.
RELATED - Returns a related value from another table.
Syntax: RELATED(<column>)
Where:
column - The column that contains the values you want to retrieve.
Key point: Returns a single value that is related to the current row.
Example: FILTER('Sales_USD', RELATED('Territory'[TerritoryCountry]) <> "United States")
RELATEDTABLE - Evaluates a table expression in a context modified by the given filters.
Syntax: RELATEDTABLE(<tableName>)
Where:
tableName - The name of an existing table using standard DAX syntax. It cannot be an expression.
Key point: Returns a table of values.
Example: = SUMX(RELATEDTABLE('Sales_USD'), [Amount_USD])
Limitations: The RELATEDTABLE function changes the context in which the data is filtered and evaluates the expression in the new context that you specify. It is a shortcut for the CALCULATETABLE function with no logical expression, and it is not supported in DirectQuery mode when used in calculated columns or row-level security (RLS) rules.
A small example that uses all three functions together is sketched at the end of this post. Hope this helps.
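To tie the three functions together, here is a minimal sketch assuming a hypothetical model with a Customer table related one-to-many to a Sales table and an inactive relationship between Sales[ShippingDate] and a 'Date' table; all table and column names are assumptions for illustration.

// Calculated column on Sales: fetch a value from the one side of the relationship
Customer Country = RELATED ( Customer[Country] )

// Calculated column on Customer: aggregate the related rows from the many side
Total Customer Sales = SUMX ( RELATEDTABLE ( Sales ), Sales[SalesAmount] )

// Measure: activate the inactive ShippingDate relationship just for this calculation
Sales by Ship Date =
    CALCULATE ( SUM ( Sales[SalesAmount] ), USERELATIONSHIP ( Sales[ShippingDate], 'Date'[Date] ) )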
Create a new aggregate measurement / Entity store in D365 Finance and Operations to be consumed in a Power BI report
In D365 there are several options to export/import data, such as Data Entities, BYOD, and Aggregate Measurements/Entity Store. In this blog we will learn how to create an entity store in D365 Finance and Operations.
Aggregate Measurements/Entity Stores are used to create Power BI reports with nearly live data. The user can set a recurrence for how often the entity store should be refreshed, so there is no need to refresh the data manually; the scheduled batch job will run for the respective entity store. You can also force a refresh by pressing the Refresh button on the entity store page.
So let's start with the development of aggregate measurements and aggregate dimensions. The steps are:
Add a new aggregate measurement object to the project
Add the required attributes
Add the required measures
Add the required dimensions
Add dimensions where the view of the dimension and the aggregate measurement are different
Build the model
Refresh the entity store from the D365 Finance and Operations environment
Verify that the respective view is created for the aggregate measurement
Add a new aggregate measurement object to the project: Right-click the project and choose to add a new item, select Aggregate Measurement, and name it as per your requirement; in our case it is "CFSAggregateMeasrure". Then assign the required view in the Table property of the aggregate measurement; in my case I have selected the "InventOnHandByWarehouse" view.
Add the required attributes: Add the required attributes by right-clicking Attributes and assigning the required field to each attribute.
Add the required measures: After adding the attributes, add the measures in the same way we added the other attributes. After adding a measure, assign the required field to it and the operation you want to perform on that field (for example: count, average, etc.).
Add the required dimensions: By default some dimensions, such as Company and Date, are provided (as shown in the screenshot). Then assign the required fields in the relations of the dimensions.
Add dimensions where the view of the dimension and the aggregate measurement are different: If a dimension needs a different view, we need to create a new aggregate dimension. In my case the name of the aggregate dimension is "CFSAggregateDimension". Assign the required view to the dimension, then create new attributes and assign fields to those attributes. After adding a new dimension attribute, you can assign more than one field reference if required, and then assign fields to the respective field references. Now select the respective attribute, open its Usage property, and change it to Key, which makes it the dimensional key and helps when making relations. There are three options under the Usage property:
Key - If you specify the Usage property as "Key", the system will define the key of the dimension using this attribute.
Parent - If you specify the Usage property as "Parent", the system will build a parent-child hierarchy with this field at the parent level.
Regular - If you specify the Usage property as "Regular", this is an attribute without any special behavior; it is the default value.
After setting the Usage property, attach this dimension to our aggregate measurement by dropping the aggregate dimension onto the Dimensions section of the measurement. Now define its dimension attribute properties, and after this make the relationships among the views.
Fact Dimension: If you want to make a desired aggregate dimension a fact dimension, go to that dimension in the aggregate measurement and set the "Is fact dimension" property to Yes/No.
Build the model: After this, build the model that is used for this development project.
Refresh the entity store from the D365 Finance and Operations environment: After a successful build, go to the environment's Entity store page using the navigation System administration >> Setup >> Entity store and refresh the desired entity by pressing the Refresh button on it. You can also schedule the refresh by selecting the Edit button, enabling the Automatic refresh toggle, and setting its recurrence.
Verify that the respective view is created for the aggregate measurement: In the final step, go to your VM's SSMS and look in the AxDW database; in the Views section, look for views named after your aggregate measurement and dimension. After running a select query you can see the data of that view.
Now your entity store is ready to be consumed by the Power BI reporting service. Thank You!