How to create Date tables in Power BI, and why they are important in data models
In this blog I am going to explain how to create a date table in Power BI Desktop.

A date table (also called a calendar table) is a table that contains one column of DateTime type with unique values. A data model typically contains a date table, which is designed to slice dates. A well-formed date table should:

1. Contain all dates included in the period to analyse
2. Contain one column of DateTime type
3. Represent each day by a single row, with no gaps in the data
4. Be marked as a date table
5. Cover only the relevant years (best practice)
6. Include a rich set of additional columns

A date table gives the data model the flexibility to:

1. Perform time intelligence calculations by using specific DAX functions
2. Slice data by using one of its columns
3. Create relationships between the date table and other tables in the data model
4. Filter and group data across multiple tables

Types of Date tables

There are two ways to create a date table:

Automatic date tables: Power BI automatically creates a date table for each Date or DateTime column in the model. This makes it possible to slice each date by year, quarter, month and day.

Calculated date tables: These can be created by using two DAX functions: CALENDAR and CALENDARAUTO.

Syntax:

Dates = CALENDAR(<StartDate>, <EndDate>)
Dates = CALENDAR(FIRSTDATE(Table[Column]), LASTDATE(Table[Column]))

Both functions return a table with a single column named "Date" containing the list of date values. CALENDAR requires the boundaries of the set of dates, while CALENDARAUTO searches among all the dates in the data model and automatically finds the first and last year referenced within the model.
Dates = CALENDARAUTO()

The final formula for the calculated table should look something like this:

Date =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2018, 1, 1 ), DATE ( 2025, 12, 31 ) ),
    "DateAsInteger", FORMAT ( [Date], "YYYYMMDD" ),
    "Year", YEAR ( [Date] ),
    "MonthNumber", FORMAT ( [Date], "MM" ),
    "YearMonthNumber", FORMAT ( [Date], "YYYY/MM" ),
    "YearMonthShort", FORMAT ( [Date], "YYYY/mmm" ),
    "MonthNameShort", FORMAT ( [Date], "mmm" ),
    "MonthNameLong", FORMAT ( [Date], "mmmm" ),
    "DayOfWeekNumber", WEEKDAY ( [Date] ),
    "DayOfWeek", FORMAT ( [Date], "dddd" ),
    "DayOfWeekShort", FORMAT ( [Date], "ddd" ),
    "Quarter", "Q" & FORMAT ( [Date], "Q" ),
    "YearQuarter", FORMAT ( [Date], "YYYY" ) & "/Q" & FORMAT ( [Date], "Q" )
)

Once the date table is in place, it is a good idea to mark it as a date table. This both simplifies the code needed to author time intelligence calculations and automatically disables the auto date/time feature. I hope this helps you.
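For readers who want to see the same date-dimension shape outside DAX, the table produced by the formula above can be sketched in plain Python. This is only an illustration of the structure (one row per day, no gaps, derived columns); the function name and the subset of columns are my own choices, not part of the Power BI solution:

```python
from datetime import date, timedelta

def build_date_table(start: date, end: date) -> list:
    """Build rows equivalent to the DAX ADDCOLUMNS/CALENDAR table above."""
    rows = []
    d = start
    while d <= end:  # one row per day, no gaps
        quarter = (d.month - 1) // 3 + 1
        rows.append({
            "Date": d,
            "DateAsInteger": int(d.strftime("%Y%m%d")),
            "Year": d.year,
            "MonthNumber": d.month,
            "YearMonthNumber": d.strftime("%Y/%m"),
            "MonthNameShort": d.strftime("%b"),
            "MonthNameLong": d.strftime("%B"),
            "DayOfWeek": d.strftime("%A"),
            "Quarter": f"Q{quarter}",
            "YearQuarter": f"{d.year}/Q{quarter}",
        })
        d += timedelta(days=1)
    return rows

table = build_date_table(date(2018, 1, 1), date(2018, 1, 3))
print(table[0]["DateAsInteger"])  # 20180101
print(table[0]["Quarter"])        # Q1
```

The key property mirrored here is that every day in the analysed period is represented exactly once, which is what makes relationships and time intelligence reliable.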
Develop a Custom Workflow: Counting Journals workflow
In this blog, you will learn how to develop a custom workflow that is not present out of the box in D365 Finance. For this blog I am creating a workflow for the Counting journal (Inventory management >> Journal entries >> Item counting >> Counting), because the inventory management module has no such workflow. The following steps are to be performed.

Steps:
1. Create a base enum for the document status
2. Create a table extension and add the enum to the table
3. Add the document status field to the form extension
4. Create a query
5. Create a workflow category
6. Create a workflow type
7. Add canSubmitToWorkflow and updateWorkflowStatus methods to the table's class extension
8. Update the submit manager class
9. Update the event handler class
10. Create the workflow approval
11. Add an element to the workflow type
12. Add the workflow setup form to the inventory management module

Now we perform the steps in detail.

1. Create a base enum for the document status
Make sure you have created a solution and a project, and assigned the model to the project. Now add a new item, select Base Enum, and name it CFS_InventoryCountingWorkflow.

2. Create a table extension and add the enum to the table
We are creating a workflow for inventory management, so we need to create an extension of InventJournalTable and drag the base enum created above onto the table, which creates a new field of enum type.

3. Add the document status field to the form extension
Now we need to create an extension of the form InventJournalCount and add the newly created table field to the form's grid by dragging it from the data source to the grid control, and assign the label "Approval status".

4. Create a query
Add a new query to the project, name it CFS_InventoryCountingWorkflow, add the InventJournalTable, and set its properties as follows.

5. Create a workflow category
Add a workflow category to the project, name it CFS_InventJournalCounting, and set its properties as follows.

6. Create a workflow type
After adding the workflow category, it is time to add a workflow type. Name it CFS_InventoryJournalCounting and set its properties as follows. This creates new elements such as classes and action menu items for submit and all other actions.

7. Add canSubmitToWorkflow and updateWorkflowStatus methods to the table
To add a method to the table we need to create a table class extension. Add a new class, name it InventJournalTable_CFSExtension, and add the updateWorkflowStatus and canSubmitToWorkflow methods. You can use the following code:

[ExtensionOf(tableStr(InventJournalTable))]
final class InventJournalTable_CFSExtension
{
    public boolean canSubmitToWorkflow(str _workflowType)
    {
        boolean ret = next canSubmitToWorkflow(_workflowType);
        ret = this.CFS_InventoryCountingWorkflow == CFS_InventoryCountingWorkflow::Draft;
        return ret;
    }

    public static void updateWorkflowStatus(RecId _documentRecId, CFS_InventoryCountingWorkflow _status)
    {
        ttsbegin;
        InventJournalTable document;
        update_recordset document
            setting CFS_InventoryCountingWorkflow = _status
            where document.RecId == _documentRecId;
        ttscommit;
    }
}

8. Update the submit manager class
Make sure the submit manager class has the same name, or rename it as follows, and add the following code to it:

public class CFS_InventoryJournalCountingSubmitManager
{
    private InventJournalTable document;
    private WorkflowVersionTable versionTable;
    private WorkflowComment comment;
    private WorkflowWorkItemTable workItem;
    private SysUserId userId;
    private boolean isSubmission;
    private WorkflowTypeName workflowType;

    public static void main(Args args)
    {
        // TODO: Write code to execute once a work item is submitted.
        if (args.record().TableId != tableNum(InventJournalTable))
        {
            throw error('Error attempting to submit document');
        }

        InventJournalTable document = args.record();
        FormRun caller = args.caller() as FormRun;
        boolean isSubmission = args.parmEnum();
        MenuItemName menuItem = args.menuItemName();

        CFS_InventoryJournalCountingSubmitManager manager = CFS_InventoryJournalCountingSubmitManager::construct();
        manager.init(document, isSubmission, caller.getActiveWorkflowConfiguration(), caller.getActiveWorkflowWorkItem());

        if (manager.openSubmitDialog(menuItem))
        {
            manager.performSubmit(menuItem);
        }

        caller.updateWorkflowControls();
    }

    /// <summary>
    /// Construct method
    /// </summary>
    /// <returns>new instance of submission manager</returns>
    public static CFS_InventoryJournalCountingSubmitManager construct()
    {
        return new CFS_InventoryJournalCountingSubmitManager();
    }

    /// <summary>
    /// parameter method for document
    /// </summary>
    /// <param name = "_document">new document value</param>
    /// <returns>current document</returns>
    public InventJournalTable parmDocument(InventJournalTable _document = document)
    {
        document = _document;
        return document;
    }

    /// <summary>
    /// parameter method for version
    /// </summary>
    /// <param name = "_versionTable">new version table value</param>
    /// <returns>current version table</returns>
    public WorkflowVersionTable parmVersionTable(WorkflowVersionTable _versionTable = versionTable)
    {
        versionTable = _versionTable;
        return versionTable;
    }

    /// <summary>
    /// parameter method for comment
    /// </summary>
    /// <param name = "_comment">new comment value</param>
    /// <returns>current comment value</returns>
    public WorkflowComment parmComment(WorkflowComment _comment = comment)
    {
        comment = _comment;
        return comment;
    }

    /// <summary>
    /// parameter method for work item
    /// </summary>
    /// <param name = "_workItem">new work item value</param>
    /// <returns>current work item value</returns>
    public WorkflowWorkItemTable parmWorkItem(WorkflowWorkItemTable _workItem = workItem)
    {
        workItem = _workItem;
        return workItem;
    }

    /// <summary>
    /// parameter method for user
    /// </summary>
    /// <param name = "_userId">new user value</param>
    /// <returns>current user value</returns>
    public SysUserId parmUserId(SysUserId _userId = userId)
    {
        userId = _userId;
        return userId;
    }

    /// <summary>
    /// parameter method for isSubmission flag
    /// </summary>
    /// <param name = "_isSubmission">flag value</param>
    /// <returns>current flag value</returns>
    public boolean parmIsSubmission(boolean _isSubmission = isSubmission)
    {
        isSubmission = _isSubmission;
        return isSubmission;
    }

    /// <summary>
    /// parameter method for workflow type
    /// </summary>
    /// <param name = "_workflowType">new workflow type value</param>
    /// <returns>current workflow type</returns>
    public WorkflowTypeName parmWorkflowType(WorkflowTypeName _workflowType = workflowType)
    {
        workflowType = _workflowType;
        return workflowType;
    }

    /// <summary>
    /// Opens the submit dialog and returns result
    /// </summary>
    /// <returns>true if dialog closed okay</returns>
    protected boolean openSubmitDialog(MenuItemName _menuItemName)
    {
        if (isSubmission)
        {
            return …
How to Add a Workflow Setup Form for D365 Finance and Operations Modules
In D365 Finance and Operations, some modules have no workflow setup form, e.g. the Inventory management module. In such a case, we need to perform the following steps, after which the workflow setup form, which includes all the workflows for that specific module, becomes available.

First, we need to add a Display menu item and set its properties as follows:
- Enum Type Parameter: ModuleAxapta
- Enum Parameter: the module (in our case, Inventory)
- Object Type: Form
- Object: WorkflowTableListPageRnr

After this, create a menu extension of the module where we need to place the form, and drag the display menu item onto the menu extension. After a successful build, we can enlist new workflows for the inventory module using the workflow setup form.

In the same way, you can attach a workflow form to a new module as follows:
1. Create an extension of the ModuleAxapta base enum.
2. Add a new element to the base enum, name it after your custom module, and label it.
3. Create a new display menu item and set its properties as above.
4. Create a new menu and set its label and name as per your requirement.
5. Insert your display menu item into your custom menu/module.
6. Create an extension of the MainMenu menu, add a new menu item reference, and set its properties accordingly.
7. Build the solution; your workflow setup form will be visible for that module.
How to fix "Client found response content type of '', but expected 'text/xml'"
While running an integration map from Salesforce to AX, we might encounter this error: a blank response is received where 'text/xml' was expected. In our use case, an existing user had been deactivated and a new user had been added in Salesforce, so the outbound message still referenced the deactivated user. To fix this kind of issue, follow these steps:

1. Log in to the Salesforce environment.
2. Go to Setup.
3. In Quick Find, type "Outbound Message".
4. On the screen that opens, click Continue; you will then see the outbound message list.
5. Click Edit and change the user name.

In this way, we can fix the blank response issue in Salesforce.
How to Capture Error Description and Error Log Date in Script Component in SSIS
Our scenario: To resolve the errors that occur due to ComponentMetaData130 and to capture the error log date and error description while doing custom error handling in SSIS, use the following steps.

Solution: Error handling at the OLE DB Source block.

Step 1: Add a Script Component to catch the error description, error log date and entity. Select Transformation and click OK.

Step 2: Connect the error output (red arrow) from the OLE DB Source to the Script Component. Select "Redirect row" for all columns under Error and Truncation to redirect the error output to the Script Component.

Step 3: Configure the Script Component as follows:
- In the Input Columns section, select ErrorCode and ErrorColumn.
- In the Inputs and Outputs section, add the following columns:

  Column Name      | Data Type                           | Length
  ErrorDescription | Unicode string [DT_WSTR]            | 500
  ErrorLogDate     | database timestamp [DT_DBTIMESTAMP] | -
  Entity           | string [DT_STR]                     | 50

- In the Connection Managers section, add a new connection and select your SQL connection.
- In the Script section, click Edit Script. After a moment a new editor window will open. Copy and paste the following script inside the public override void Input0_ProcessInputRow(Input0Buffer Row) method.

Code snippet:

try
{
    Row.Entity = "Unit";
    Row.ErrorDescription = this.ComponentMetaData.GetErrorDescription(Row.ErrorCode);
    Row.ErrorLogDate = DateTime.Now;
}
catch (Exception ex)
{
    Row.ErrorDescription = "Unable to get Error Column Name";
}

Click Save and then close the window.

Step 4: Add a Data Conversion block to avoid any truncation errors caused by data type conversion between NVarchar and Varchar for the ErrorDescription column. Select the ErrorDescription column and set the data type to string. Click OK.

Step 5: Connect the output columns of the Script Component to the required destination columns.

Conclusion: This is the simplest way to record the error description and error log date during custom error handling in SSIS.
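The pattern in the script component above, enriching each failed row with a description and a timestamp before it reaches the error destination, can be sketched outside SSIS as well. The following Python illustration is hedged: the error-code table is hypothetical (SSIS resolves codes via ComponentMetaData.GetErrorDescription), and the function name is my own:

```python
from datetime import datetime

# Hypothetical lookup standing in for ComponentMetaData.GetErrorDescription
ERROR_DESCRIPTIONS = {
    101: "The data value violates integrity constraints.",
    102: "The value could not be converted because of a potential loss of data.",
}

def enrich_error_row(row: dict, entity: str = "Unit") -> dict:
    """Mimic the script component: add Entity, ErrorDescription, ErrorLogDate."""
    try:
        row["Entity"] = entity
        row["ErrorDescription"] = ERROR_DESCRIPTIONS[row["ErrorCode"]]
        row["ErrorLogDate"] = datetime.now()
    except KeyError:
        # Same fallback message as the C# catch block above
        row["ErrorDescription"] = "Unable to get Error Column Name"
    return row

print(enrich_error_row({"ErrorCode": 101})["ErrorDescription"])
```

The design point carried over from the SSIS snippet is the try/catch: a failure to resolve the error code must not itself fail the error-handling path, so a fallback description is written instead.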
Create new Sandbox and copy Production over to it in PowerPlatform Admin Center
This is one of the standard approaches if you want to copy a Production instance over to a Sandbox. The most common scenario is when you've onboarded a new client to Dynamics 365 / Power Platform and you're nearing Go Live for the first time: you need a Sandbox so that new customization happens there and nothing is done directly on Production. Here's how you copy from Production to Sandbox.

Creating a Sandbox

You can directly create a new Sandbox environment in the Power Platform Admin Center (https://admin.powerplatform.microsoft.com/environments). Create a new environment by clicking on the +New button.

Remember: both the source and target environments need to be in the same region to copy. Additionally, I've selected the Create a Database option too, since I want to speed up the process. Note that you can choose which apps you want to deploy if you choose Enable Dynamics 365 apps.

And that's it; you'll need to wait for some time until this is processed. If you're also looking to create a database explicitly for an existing environment using the previous method, you can check this post: Create new CDS Environment and Database quickly from PowerApps Admin Center. While this happens in the background, you can see the message in the Environments section of the Power Platform Admin Center, and upon completion you'll see a confirmation message.

Copying from Production to Sandbox

The next step is to copy the Production environment over to the newly created Sandbox environment. Select the Production environment you want to copy and look at the ribbon; you'll see a Copy button. Once you click Copy, you'll be asked for the target environment; choose the created Sandbox. Once you confirm, the copying process will start, and you'll be taken to a page that tells you what stage the process is in. Once all the copying is completed, a notification will appear.

The next step is to turn off Administration Mode. Click on the Settings link. Once you flick the switch and save, Administration Mode will be disabled, letting others log in as well. Additionally, we now have the option to keep Background Operations on while keeping Administration Mode on. Now you are all set. Hope this helps!
How to solve “The Locale ID 9 is not installed on this system” error in SSIS
Scenario: We are building an SSIS package but receiving the error "The LocaleID 9 is not installed on this system". We are using SQL Server 2017 Enterprise Edition and Visual Studio 2015 Community Edition with SSDT 2015.

Solution:
Step 1: Go to the project properties.
Step 2: In Configuration Properties, check that the target SQL Server version is set to the correct version.
Step 3: If it is still not working, set the SQL Server version to a lower version than your current one.
Auto scale the Power BI Embedded capacity using Job Scheduler in Azure
Power BI Embedded is a Microsoft Azure service that lets ISVs and developers embed visuals, reports, and even dashboards into their applications. Power BI Embedded is a PaaS analytics solution that provides Azure-based capacity; it charges customers on an hourly basis, and there is no annual commitment for the service. Because Power BI Embedded bills hourly and Azure offers no direct auto-scaling feature for it, we can use the API provided to scale the capacity ourselves. In this blog we are going to see how to scale Power BI Embedded capacity using a PowerShell script.

Before starting, let's quickly list the prerequisites:
- An Azure account. If you are implementing the PowerShell script for your organisation, you must have the co-administrator role assigned; keep in mind that with only the contributor role you will not be able to create an Automation account (more on Automation accounts later in this blog).
- A Power BI Embedded subscription.
- An Automation account.

I'm assuming you already have an Azure account along with a subscription for Power BI Embedded.

Steps:

1. Create an Automation account. An Automation account is used to manage Azure resources across all subscriptions for a given tenant. To create one, click on Create a resource in your Azure portal and search for "Automation account", or type "Automation Account" in the search box.

2. Click on Create Automation Account and make sure to fill in the details. If you have multiple subscriptions, make sure to select the proper subscription from the drop-down. Make sure "Create Azure Run As account" is set to Yes (if you are a co-administrator or administrator, it is selected by default). Once created, the account will show under Automation accounts.

3.
Open the Automation account, go to Connections, and add the connections and types shown below (click "Add a connection" and enter the name and type).

4. For the AzureClassicRunAsConnection, set the CertificateAssetName to AzureRunAsCertificate.

5. Add the Power BI Embedded subscription to your resource group.

6. Once the Automation account is ready, go to Runbooks under Process Automation in the Automation account. A runbook is useful for routine procedures and operations. We could also use an Azure Function app instead of a runbook.

7. Click on "Create a runbook" and fill in the details.

8. Once the runbook is open, make sure to import the module AzureRM.PowerBIEmbedded, which can be installed by going to Modules under Shared Resources, clicking "Browse gallery", and searching for the AzureRM.PowerBIEmbedded module.

9. Use the PowerShell script below, which can also be found on the Power BI discussion site. (Note: the original script passed $matchingObject.Sku when scaling down to the default tier; that has been corrected to "A2" here.)

$resourceGroupName = "<your resource group>"
$instanceName = "<Power BI Embedded instance name>"
$azureProfilePath = ""
$azureRunAsConnectionName = "AzureRunAsConnection" # "PowerBIAutoscale"

$configStr = "
[
    { Name: ""Weekday Heavy Load Hours"", WeekDays:[1,2,3,4,5], StartTime: ""06:45:00"", StopTime: ""23:45:00"", Sku: ""A4"" },
    { Name: ""Early AM Hours"", WeekDays:[0,1,2,3,4,5,6], StartTime: ""00:00:00"", StopTime: ""04:44:00"", Sku: ""A1"" },
    { Name: ""Model Refresh"", WeekDays:[0,1,2,3,4,5,6], StartTime: ""04:45:00"", StopTime: ""06:45:00"", Sku: ""A3"" },
    { Name: ""Weekend Operational Hours"", WeekDays:[6,0], StartTime: ""06:45:00"", StopTime: ""18:00:00"", Sku: ""A3"" }
]
"

$VerbosePreference = "Continue"
$ErrorActionPreference = "Stop"

Import-Module "AzureRM.PowerBIEmbedded"

Write-Verbose "Logging in to Azure..."

# Load the profile from a local file
if (-not [string]::IsNullOrEmpty($azureProfilePath))
{
    Import-AzureRmContext -Path $azureProfilePath | Out-Null
}
# Load the profile from the Azure Automation Run As connection
elseif (-not [string]::IsNullOrEmpty($azureRunAsConnectionName))
{
    $runAsConnectionProfile = Get-AutomationConnection -Name $azureRunAsConnectionName
    Add-AzureRmAccount -ServicePrincipal -TenantId $runAsConnectionProfile.TenantId `
        -ApplicationId $runAsConnectionProfile.ApplicationId `
        -CertificateThumbprint $runAsConnectionProfile.CertificateThumbprint | Out-Null
}
# Interactive login
else
{
    Add-AzureRmAccount | Out-Null
}

$fmt = "MM/dd/yyyy HH:mm:ss"   # format string
$culture = [Globalization.CultureInfo]::InvariantCulture

$startTime = Get-Date
Write-Verbose "Current Local Time: $($startTime)"
$startTime = [System.TimeZoneInfo]::ConvertTimeBySystemTimeZoneId($startTime, [System.TimeZoneInfo]::Local.Id, 'Eastern Standard Time')
Write-Verbose "Current Time EST: $($startTime)"
$scheduleTimeMidnight = ($startTime).Date
Write-Verbose "Schedule Time Base (Midnight): $($scheduleTimeMidnight)"
$currentDayOfWeek = [Int]($scheduleTimeMidnight).DayOfWeek
Write-Verbose "DOW: $($currentDayOfWeek)"

$stateConfig = $configStr | ConvertFrom-Json
# | Select-Object Sku, WeekDays, Name, StartTime, EndTime
# , @{Name="StartTime"; Expression={[DateTime]::ParseExact($_.StartTime, $fmt, $culture)}}, @{Name="StopTime"; Expression={[DateTime]::ParseExact($_.StopTime, $fmt, $culture)}}

Write-Verbose "Writing Config Objects..."
foreach ($x in $stateConfig)
{
    Write-Verbose "Name: $($x.Name)"
    Write-Verbose "Weekdays: $($x.WeekDays -join ',')"
    $x.StartTime = ($scheduleTimeMidnight).AddHours([int]$x.StartTime.Split("{:}")[0]).AddMinutes([int]$x.StartTime.Split("{:}")[1]).AddSeconds([int]$x.StartTime.Split("{:}")[2])
    Write-Verbose "Start Time: $($x.StartTime)"
    $x.StopTime = ($scheduleTimeMidnight).AddHours([int]$x.StopTime.Split("{:}")[0]).AddMinutes([int]$x.StopTime.Split("{:}")[1]).AddSeconds([int]$x.StopTime.Split("{:}")[2])
    Write-Verbose "End Time: $($x.StopTime)"
}

Write-Verbose "Getting current status..."
# Get the capacity status
$pbiService = Get-AzureRmPowerBIEmbeddedCapacity -ResourceGroupName $resourceGroupName

switch ($pbiService.State)
{
    "Scaling"   { Write-Verbose "Service scaling operation in progress... Aborting."; end }
    "Succeeded" { Write-Verbose "Current Status: Running" }
    Default     { Write-Verbose "Current Status: $($pbiService.State)" }
}

Write-Verbose "Current Capacity: $($pbiService.Sku)"

# Find a match in the config
$dayObjects = $stateConfig | Where-Object { $_.WeekDays -contains $currentDayOfWeek }

# If there is no matching day then exit
if ($dayObjects -ne $null)
{
    # Can't treat several objects for the same time-frame; if there's more than one, pick the first
    $matchingObject = $dayObjects | Where-Object { ($startTime -ge $_.StartTime) -and ($startTime -lt $_.StopTime) } | Select-Object -First 1
    if ($matchingObject -ne $null)
    {
        Write-Verbose "Current Config Object"
        Write-Verbose $matchingObject.Name
        Write-Verbose "Weekdays: $($matchingObject.WeekDays -join ',')"
        Write-Verbose "SKU: $($matchingObject.Sku)"
        Write-Verbose "Start Time: $($matchingObject.StartTime)"
        Write-Verbose "End Time: $($matchingObject.StopTime)"

        # If paused, resume
        if ($pbiService.State -eq "Paused")
        {
            Write-Verbose "The service is Paused. Resuming the instance"
            $pbiService = Resume-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -ResourceGroupName $resourceGroupName -PassThru -Verbose
        }

        # Change the SKU if needed
        if ($pbiService.Sku -ne $matchingObject.Sku)
        {
            Write-Verbose "Updating Capacity Tier from $($pbiService.Sku) to $($matchingObject.Sku)"
            Update-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -sku $matchingObject.Sku
        }
    }
    else
    {
        Write-Verbose "No Interval Found. Checking current capacity tier."
        if ($pbiService.Sku -ne "A2")
        {
            Write-Verbose "No Interval Found. Scaling to A2"
            Write-Verbose "Updating Capacity Tier from $($pbiService.Sku) to A2"
            Update-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -sku "A2"
        }
    }
}
else
{
    Write-Verbose "No Interval Found. Checking current capacity tier."
    if ($pbiService.Sku -ne "A2")
    {
        Write-Verbose "No Interval Found. Scaling to A2"
        Write-Verbose "Updating Capacity Tier from $($pbiService.Sku) to A2"
        Update-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -sku "A2"
    }
}

Write-Verbose "Done!"

10. The script above does not include pausing the capacity; we can add that to the script.

11. Once we are done with the script, click Save and then publish it.

12. Create the schedule under Shared Resources …
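The core of the runbook is the interval-matching logic: for the current weekday, find the first config entry whose [StartTime, StopTime) window contains the current time, and fall back to a default SKU when no window matches. As a language-neutral illustration, here is a rough Python sketch of that logic (the config values mirror the script above; the function name and the weekday-numbering helper are my own):

```python
from datetime import datetime, time

# Schedule entries mirroring $configStr in the runbook
CONFIG = [
    {"name": "Weekday Heavy Load Hours", "weekdays": [1, 2, 3, 4, 5],
     "start": time(6, 45), "stop": time(23, 45), "sku": "A4"},
    {"name": "Early AM Hours", "weekdays": [0, 1, 2, 3, 4, 5, 6],
     "start": time(0, 0), "stop": time(4, 44), "sku": "A1"},
    {"name": "Model Refresh", "weekdays": [0, 1, 2, 3, 4, 5, 6],
     "start": time(4, 45), "stop": time(6, 45), "sku": "A3"},
    {"name": "Weekend Operational Hours", "weekdays": [6, 0],
     "start": time(6, 45), "stop": time(18, 0), "sku": "A3"},
]

def target_sku(now: datetime, config=CONFIG, default="A2") -> str:
    """Pick the SKU whose window contains `now`, else the default tier."""
    dow = (now.weekday() + 1) % 7  # 0 = Sunday, matching .NET's DayOfWeek
    for entry in config:
        if dow in entry["weekdays"] and entry["start"] <= now.time() < entry["stop"]:
            return entry["sku"]  # first match wins, as in the script
    return default

print(target_sku(datetime(2020, 1, 6, 9, 0)))  # Monday 09:00 -> A4
```

Note the gaps in the schedule (e.g. 23:45 to midnight) fall through to the default tier, which is exactly what the script's "No Interval Found. Scaling to A2" branches handle.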
Multiple Ways to Share a Canvas App in PowerApps
Introduction:
This blog explains multiple options for sharing a Canvas app in Power Apps.

Steps:
Select the app and click on the Share option.

Options for sharing a published app:

1. Specify each user by name. Enter the username in the sharing panel and click the Share button. The app can be shared with other users as co-owners by checking the Co-owner checkbox.

2. Share with your entire organization. Type "Everyone" in the sharing panel and select the "Everyone in your organization" option.

3. Specify a security group in Azure Active Directory. Select the security group name in the sharing panel.

Note:
- You can't share an app with a distribution group in your organization or with a group outside your organization.
- You can't grant co-owner permission to a security group if you created the app from within a solution.

Tips:
- Regardless of permissions, no two people can edit an app at the same time. If one person opens the app for editing, other people can run it but not edit it.
- To notify users that the app has been shared, select the "Send an email invitation to new users" checkbox.

Conclusion:
Hope the above helps you share your Canvas app in Power Apps with users as per your business requirements.
Embedding Power BI reports into Webpages- Part 1
In this blog we are going to embed a Power BI report into a webpage. The steps are as follows:

1. Go to https://app.powerbi.com/ and create a new report, or publish an already created report. In this case we have one sample report, which contains three pages:
   - Page 1: Team Scorecard
   - Page 2: Industry Margin Analysis
   - Page 3: Executive Scorecard

2. Open the report to be embedded and go to File -> Embed. Upon clicking Embed, we get a popup containing a link that can be used for embedding, along with the HTML code.

3. Paste the HTML code into an HTML page; we can edit the iframe height and width.

4. Open the HTML page in the web browser and we can see the Power BI report, in which we can apply filters. When we load the HTML page, we need to sign in.

In this way, we can embed Power BI reports into webpages.