Category Archives: Others
Convert your FAQs Page to a Bot using Power Virtual Agents
Introduction: FAQs are meant to provide quick answers to user questions, but users will not always want to read an entire FAQ page. In such situations we can give users a bot that answers their queries, built with Power Virtual Agents. Follow these simple steps to create your bot:
1) Log in to Power Virtual Agents and go to Topics.
2) Go to Suggest topics.
3) Paste the web link of the FAQ page, click Add, and then Start.
4) Once all the suggested topics appear, click Add to existing topics.
5) Once added, turn the topic on and test your bot.
Conclusion: You can further embed the bot in your website, and users will find it easier to ask the bot than to read through the FAQ page.
How to Trigger a Pipeline in ADF?
Introduction: This blog will guide you through scheduling your pipeline in Azure Data Factory (ADF) with a scheduled trigger. The time you enter is crucial, so go through all the steps to avoid a common mistake.
Step 1: Click on Trigger and select 'New/Edit'.
Step 2: Click on 'New'.
Step 3: Select Type = 'Scheduled'. Set the Start Date (UTC), set Recurrence to 1 Week(s), and select the required day(s).
Step 4: Click OK and publish the changes.
Step 5: The time you enter here is in UTC, so convert the local time at which you want the pipeline to run to UTC and set it accordingly. You can use the following converter: https://www.prokerala.com/travel/timezones/time-converter.php
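Instead of an online converter, the local-to-UTC conversion can also be scripted. A minimal sketch in Python (the 09:00 trigger time and the Asia/Kolkata time zone below are hypothetical examples, not values from the post):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Hypothetical example: the pipeline should run at 09:00 local time in India
local_time = datetime(2024, 5, 6, 9, 0, tzinfo=ZoneInfo("Asia/Kolkata"))

# Convert to UTC -- this is the time to enter in the ADF scheduled trigger
utc_time = local_time.astimezone(ZoneInfo("UTC"))
print(utc_time.strftime("%H:%M"))  # 03:30
```

Because India is UTC+05:30, a 09:00 IST schedule must be entered as 03:30 UTC in the trigger.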
SQL Server: Delete Duplicate Records from a Table
In this blog we will learn how to delete duplicate records from a table in SQL Server. Let's create a sample table for the demo.

Step 1: Declare a table variable called tmpTable as follows:

DECLARE @tmpTable TABLE(EmpID INT IDENTITY(1, 1), EName VARCHAR(50))

Step 2: Insert some records into the tmpTable table.

INSERT INTO @tmpTable(EName) VALUES ('Ricky')
INSERT INTO @tmpTable(EName) VALUES ('Ricky')
INSERT INTO @tmpTable(EName) VALUES ('Ricky')
INSERT INTO @tmpTable(EName) VALUES ('Lloyd')
INSERT INTO @tmpTable(EName) VALUES ('Lloyd')
INSERT INTO @tmpTable(EName) VALUES ('Lloyd')
INSERT INTO @tmpTable(EName) VALUES ('Jack')

Step 3: Query the tmpTable table.

SELECT * FROM @tmpTable

The output contains duplicate rows (1, 2, 3) and (4, 5, 6) for employees that have the same employee name. To delete the duplicate rows, follow these steps: find the duplicate rows using a GROUP BY clause or the ROW_NUMBER() or RANK() function, then use a DELETE statement to remove them.

Method 1: Find the maximum employee ID per name using GROUP BY and delete the records that are not in that set.

DELETE FROM @tmpTable
WHERE EmpID NOT IN
(
    SELECT MAX(tmp.EmpID) AS [MaxEID]
    FROM @tmpTable AS tmp
    GROUP BY tmp.EName
)

Method 2: Use a common table expression (CTE) to delete the duplicate rows:

;WITH tblTest AS
(
    SELECT EmpID,
           ROW_NUMBER() OVER (PARTITION BY EName ORDER BY EmpID) AS [RowNumber]
    FROM @tmpTable
)
DELETE FROM tblTest WHERE RowNumber > 1

In this statement, the CTE first uses the ROW_NUMBER() function to number the duplicate rows identified by values in the EName column; the DELETE statement then removes all the duplicate rows but keeps one occurrence of each duplicate group.

Method 3: We can also delete duplicate rows using RANK(), as shown below:

DELETE FROM @tmpTable
WHERE EmpID IN
(
    SELECT a.EmpID
    FROM
    (
        SELECT tmp.*,
               RANK() OVER (PARTITION BY EName ORDER BY EmpID) AS [Rank]
        FROM @tmpTable AS tmp
    ) AS a
    WHERE a.[Rank] > 1
)

I hope this helps you.
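The T-SQL above can be sanity-checked outside SQL Server as well. A minimal sketch of Method 1 using Python's built-in sqlite3 module (SQLite here only stands in for SQL Server; the GROUP BY / MAX logic is the same):

```python
import sqlite3

# Build the same sample table as in the demo (SQLite stands in for SQL Server)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE tmpTable (EmpID INTEGER PRIMARY KEY AUTOINCREMENT, EName TEXT)")
cur.executemany("INSERT INTO tmpTable (EName) VALUES (?)",
                [("Ricky",), ("Ricky",), ("Ricky",),
                 ("Lloyd",), ("Lloyd",), ("Lloyd",), ("Jack",)])

# Method 1: keep only the highest EmpID per name, delete the rest
cur.execute("""
    DELETE FROM tmpTable
    WHERE EmpID NOT IN (SELECT MAX(EmpID) FROM tmpTable GROUP BY EName)
""")
conn.commit()

remaining = cur.execute("SELECT EmpID, EName FROM tmpTable ORDER BY EmpID").fetchall()
print(remaining)  # [(3, 'Ricky'), (6, 'Lloyd'), (7, 'Jack')]
```

As expected, only one row per employee name survives: the one with the highest EmpID.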
How to fix "Client found response content type of '', but expected 'text/xml'"
While running an integration map from Salesforce to AX, we might encounter this error: a blank response is received where text/xml was expected. In our case, an existing user had been deactivated and a new user added in Salesforce, so the outbound message was still configured with the old user. To fix this kind of issue, follow these steps:
1. Log in to the Salesforce environment.
2. Go to Setup.
3. In the Quick Find box, type 'Outbound Message' and open the outbound messages page.
4. Click Continue to see the outbound message list.
5. Click Edit on the relevant outbound message and change the user name to an active user.
In this way we can fix the blank response issue in Salesforce.
How to Capture Error Description and Error Log Date in Script Component in SSIS
Our Scenario: To resolve errors that occur due to componentMetadata130 and to capture the Error Log Date and Error Description while doing custom error handling in SSIS, use the following steps.

Solution: Error handling at the OLEDB Source block.

Step 1: Add a Script Component to catch the Error Description, Error Log Date, and Entity. Select Transformation and click OK.

Step 2: Connect the Error Output (red arrow) from the OLEDB Source to the Script Component. Select 'Redirect row' for all columns under the Error and Truncation columns to redirect the error output to the Script Component.

Step 3: Configure the Script Component as follows:
- In the Input Columns section, select ErrorCode and ErrorColumn.
- In the Inputs and Outputs section, add the following columns:
  Column Name      | Data Type                           | Length
  ErrorDescription | Unicode string [DT_WSTR]            | 500
  ErrorLogDate     | database timestamp [DT_DBTIMESTAMP] | -
  Entity           | string [DT_STR]                     | 50
- In the Connection Managers section, add a new connection and select your SQL connection.
- In the Script section, click Edit Script. After a moment a new editor window will open. Copy and paste the following script inside the 'public override void Input0_ProcessInputRow(Input0Buffer Row)' method.

Code Snippet:

try
{
    Row.Entity = "Unit";
    Row.ErrorDescription = this.ComponentMetaData.GetErrorDescription(Row.ErrorCode);
    Row.ErrorLogDate = DateTime.Now;
}
catch (Exception ex)
{
    Row.ErrorDescription = "Unable to get Error Column Name";
}

Click Save and then close the window.

Step 4: Add a Data Conversion block to avoid truncation errors caused by the data type conversion between NVarchar and Varchar for the Error Description column. Select the ErrorDescription column, set its Data Type to string, and click OK.

Step 5: Connect the output columns of the Script Component to the required destination columns.

Conclusion: This is the simplest way to record the Error Description and Error Log Date while doing custom error handling in SSIS.
Auto scale the Power BI Embedded capacity using Job Scheduler in Azure
Power BI Embedded is a Microsoft Azure service that lets ISVs and developers embed visuals, reports, and even dashboards into their applications. Power BI Embedded is a PaaS analytics solution that provides Azure-based capacity and charges customers on an hourly basis; there is no annual commitment for the service. Because it is billed hourly and Azure offers no direct auto-scaling feature for it, we can use the provided API to scale the capacity ourselves. In this blog we are going to see how to scale the Power BI Embedded capacity using a PowerShell script.

Before we start, let's quickly list the prerequisites:
- An Azure account. If you are implementing the PowerShell script for your organisation, you must have the co-administrator role assigned; keep in mind that with only the contributor role you will not be able to create an Automation account (more on Automation accounts later in this blog).
- A Power BI Embedded subscription.
- An Automation account.

I'm assuming you already have an Azure account along with a subscription for Power BI Embedded.

Steps:
1. Create an Automation account. An Automation account is used to manage Azure resources across all subscriptions for a given tenant. To create one, click on Create a resource in the Azure portal and search for 'Automation account', or type 'Automation Account' directly in the search box.
2. Click on Create Automation Account and fill in the required details. If you have multiple subscriptions, select the proper one from the drop-down, and make sure 'Create Azure Run As account' is set to Yes (if you are a co-administrator or administrator it is selected by default). Once created, the account will show under Automation accounts.
3. Open the Automation account, go to Connections, and add the required connections and types (click Add a connection and enter the name and type).
4. For the AzureClassicRunAsConnection, set the CertificateAssetName to AzureRunAsCertificate.
5. Add the Power BI Embedded subscription to your resource group.
6. Once the Automation account is ready, go to Runbooks under Process Automation. A runbook is useful for routine procedures and operations; we could also use an Azure Function app instead of a runbook.
7. Click Create a runbook and fill in the details.
8. Once the runbook is open, import the AzureRM.PowerBIEmbedded module: go to Modules under Shared Resources, click Browse gallery, and search for AzureRM.PowerBIEmbedded.
9. Use the PowerShell script below, which can also be found on the Power BI discussion site.

$resourceGroupName = "<your resource group>"
$instanceName = "<Power BI Embedded instance name>"
$azureProfilePath = ""
$azureRunAsConnectionName = "AzureRunAsConnection"

$configStr = "
[
  { Name: ""Weekday Heavy Load Hours"", WeekDays:[1,2,3,4,5], StartTime: ""06:45:00"", StopTime: ""23:45:00"", Sku: ""A4"" },
  { Name: ""Early AM Hours"", WeekDays:[0,1,2,3,4,5,6], StartTime: ""00:00:00"", StopTime: ""04:44:00"", Sku: ""A1"" },
  { Name: ""Model Refresh"", WeekDays:[0,1,2,3,4,5,6], StartTime: ""04:45:00"", StopTime: ""06:45:00"", Sku: ""A3"" },
  { Name: ""Weekend Operational Hours"", WeekDays:[6,0], StartTime: ""06:45:00"", StopTime: ""18:00:00"", Sku: ""A3"" }
]
"

$VerbosePreference = "Continue"
$ErrorActionPreference = "Stop"

Import-Module "AzureRM.PowerBIEmbedded"

Write-Verbose "Logging in to Azure..."

# Load the profile from a local file
if (-not [string]::IsNullOrEmpty($azureProfilePath)) {
    Import-AzureRmContext -Path $azureProfilePath | Out-Null
}
# Load the profile from the Azure Automation Run As connection
elseif (-not [string]::IsNullOrEmpty($azureRunAsConnectionName)) {
    $runAsConnectionProfile = Get-AutomationConnection -Name $azureRunAsConnectionName
    Add-AzureRmAccount -ServicePrincipal `
        -TenantId $runAsConnectionProfile.TenantId `
        -ApplicationId $runAsConnectionProfile.ApplicationId `
        -CertificateThumbprint $runAsConnectionProfile.CertificateThumbprint | Out-Null
}
# Interactive login
else {
    Add-AzureRmAccount | Out-Null
}

$startTime = Get-Date
Write-Verbose "Current Local Time: $($startTime)"
$startTime = [System.TimeZoneInfo]::ConvertTimeBySystemTimeZoneId($startTime, [System.TimeZoneInfo]::Local.Id, "Eastern Standard Time")
Write-Verbose "Current Time EST: $($startTime)"
$scheduleTimeMidnight = ($startTime).Date
Write-Verbose "Schedule Time Base (Midnight): $($scheduleTimeMidnight)"
$currentDayOfWeek = [Int]($scheduleTimeMidnight).DayOfWeek
Write-Verbose "DOW: $($currentDayOfWeek)"

$stateConfig = $configStr | ConvertFrom-Json

Write-Verbose "Writing Config Objects..."
foreach ($x in $stateConfig) {
    Write-Verbose "Name: $($x.Name)"
    Write-Verbose "Weekdays: $($x.WeekDays -join ',')"
    # Turn the "HH:mm:ss" strings into DateTime values relative to midnight
    $x.StartTime = ($scheduleTimeMidnight).AddHours([int]$x.StartTime.Split("{:}")[0]).AddMinutes([int]$x.StartTime.Split("{:}")[1]).AddSeconds([int]$x.StartTime.Split("{:}")[2])
    Write-Verbose "Start Time: $($x.StartTime)"
    $x.StopTime = ($scheduleTimeMidnight).AddHours([int]$x.StopTime.Split("{:}")[0]).AddMinutes([int]$x.StopTime.Split("{:}")[1]).AddSeconds([int]$x.StopTime.Split("{:}")[2])
    Write-Verbose "End Time: $($x.StopTime)"
}

Write-Verbose "Getting current status..."
# Get the capacity status
$pbiService = Get-AzureRmPowerBIEmbeddedCapacity -ResourceGroupName $resourceGroupName

switch ($pbiService.State) {
    "Scaling" { Write-Verbose "Service scaling operation in progress... Aborting."; exit }
    "Succeeded" { Write-Verbose "Current Status: Running" }
    Default { Write-Verbose "Current Status: $($pbiService.State)" }
}
Write-Verbose "Current Capacity: $($pbiService.Sku)"

# Find a match in the config
$dayObjects = $stateConfig | Where-Object { $_.WeekDays -contains $currentDayOfWeek }

# If no matching day then fall through to the default tier
if ($dayObjects -ne $null) {
    # Can't treat several objects for the same time frame; if there's more than one, pick the first
    $matchingObject = $dayObjects | Where-Object { ($startTime -ge $_.StartTime) -and ($startTime -lt $_.StopTime) } | Select-Object -First 1
    if ($matchingObject -ne $null) {
        Write-Verbose "Current Config Object"
        Write-Verbose $matchingObject.Name
        Write-Verbose "Weekdays: $($matchingObject.WeekDays -join ',')"
        Write-Verbose "SKU: $($matchingObject.Sku)"
        Write-Verbose "Start Time: $($matchingObject.StartTime)"
        Write-Verbose "End Time: $($matchingObject.StopTime)"
        # If paused, resume
        if ($pbiService.State -eq "Paused") {
            Write-Verbose "The service is Paused. Resuming the Instance"
            $pbiService = Resume-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -ResourceGroupName $resourceGroupName -PassThru -Verbose
        }
        # Change the SKU if needed
        if ($pbiService.Sku -ne $matchingObject.Sku) {
            Write-Verbose "Updating Capacity Tier from $($pbiService.Sku) to $($matchingObject.Sku)"
            Update-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -sku $matchingObject.Sku
        }
    }
    else {
        Write-Verbose "No Interval Found. Checking current capacity tier."
        if ($pbiService.Sku -ne "A2") {
            Write-Verbose "No Interval Found. Scaling to A2"
            Write-Verbose "Updating Capacity Tier from $($pbiService.Sku) to A2"
            Update-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -sku "A2"
        }
    }
}
else {
    Write-Verbose "No Interval Found. Checking current capacity tier."
    if ($pbiService.Sku -ne "A2") {
        Write-Verbose "No Interval Found. Scaling to A2"
        Write-Verbose "Updating Capacity Tier from $($pbiService.Sku) to A2"
        Update-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -sku "A2"
    }
}
Write-Verbose "Done!"

10. The script above does not include pausing the capacity; we can add that to the script.
11. Once we are done with the script, click Save and then Publish.
12. Create the Schedule under the Shared Resources …
Multiple Ways to Share a Canvas App in PowerApps
Introduction :- This blog explains multiple options for sharing a Canvas app in Power Apps.

Steps :- Select the app and click the Share option.

Options for sharing a published app :-
1. Specify each user by name. Enter the username in the sharing panel and click the Share button. The app can be shared with other users as co-owners by checking the Co-owner checkbox.
2. Share with your entire organization. Type 'Everyone' in the sharing panel and select the 'Everyone in your organization' option.
3. Specify a security group in Azure Active Directory. Select the security group name in the sharing panel.

Note :- You can't share an app with a distribution group in your organization or with a group outside your organization. You can't grant Co-owner permission to a security group if you created the app from within a solution.

Tips :- Regardless of permissions, no two people can edit an app at the same time; if one person opens the app for editing, other people can run it but not edit it. To notify users that the app has been shared, select the 'Send an email invitation to new users' check box.

Conclusion :- Hope the above blog helps you share your Canvas app in Power Apps with users as per your business requirement.
How to create Dynamics 365 Finance & Operations (formerly Dynamics AX) connection in SSIS
Introduction: Creating a connection is one of the first steps during data integration. While integrating with Dynamics AX, or more specifically Dynamics 365 for Finance and Operations, create a connection using the following steps.

Step 1: Right-click on Connection Managers and click New Connection Manager.

Step 2: Configure your AX application to get the required details:
Step 2.1: Go to www.portal.azure.com, log in with your credentials, and go to 'App registrations'.
Step 2.2: Enter a name for the application, choose the appropriate supported account type, and set a Redirect URI with an address similar to the Service URL you will enter; finally, click Register.
Step 2.3: Note down the details shown on the Overview page (such as the Application (client) ID and Directory (tenant) ID) for creating the connection.
Step 2.4: Click New client secret.
Step 2.5: Add a description, select Never in the Expires section, and click Add.
Step 2.6: The client secret is shown only once, so copy and store it in a file for future reference.
Step 2.7: In the Data Scope section, select the required value as per your AX deployment, click Register, and click Open in Browser. You will be redirected to your Finance and Operations page. Once authorized, you can test the connection by clicking the Test Connection button.

Conclusion: Now you can move on with developing your control and data flows for integration with Finance and Operations as a source or destination.
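The client ID, tenant ID, and client secret noted above are what any client uses to obtain an Azure AD token for Finance and Operations via the client-credentials flow. A rough sketch of the request shape in Python (all IDs and URLs below are placeholders, not real values; the actual HTTP call is left commented out):

```python
# Placeholder values noted down from the app registration (Steps 2.3 and 2.6)
tenant_id = "<directory-tenant-id>"
client_id = "<application-client-id>"
client_secret = "<client-secret>"
fo_url = "https://yourorg.operations.dynamics.com"  # hypothetical F&O environment URL

# Azure AD v1 token endpoint for the tenant
token_endpoint = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"

# Client-credentials grant: the F&O environment URL is passed as the resource
payload = {
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,
    "resource": fo_url,
}

# A real call would look like:
#   token = requests.post(token_endpoint, data=payload).json()["access_token"]
print(token_endpoint)
```

This is only a sketch of why those three values matter; the SSIS connection manager performs the equivalent exchange internally.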
How to run your SSRS Report for the Past 'N' Days from the Report Run Date
Introduction: If we have a parameter on our report to see the past 'N' days of data, we can perform the following steps.

Step 1: Add a new parameter, say 'Days'.

Step 2: Under the Available Values section, add a new value for the parameter with the value 7 and an expression with the following formula:

=DateAdd("d",-7,Globals!ExecutionTime)

Step 3: Under the Default Values section, add a new value for the parameter with an expression using the same formula:

=DateAdd("d",-7,Globals!ExecutionTime)

Step 4: In your dataset, add the following filter condition in the XML:

<filter type="and">
  <condition attribute="modifiedon" operator="on-or-after" value="@Days" />
</filter>

Conclusion: Thus, you can now view your report with records modified in the past 'N' days.
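The DateAdd expression above simply subtracts N days from the report's execution time. The same arithmetic, sketched in Python for clarity (the sample timestamp is arbitrary):

```python
from datetime import datetime, timedelta

# Arbitrary stand-in for Globals!ExecutionTime
execution_time = datetime(2024, 5, 10, 9, 30)

days = 7  # the 'Days' parameter
cutoff = execution_time - timedelta(days=days)  # equivalent of DateAdd("d", -7, ExecutionTime)
print(cutoff)  # 2024-05-03 09:30:00
```

Records whose modifiedon value falls on or after this cutoff pass the filter.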
Suggest Vendor Payments feature in Payment Journal – NAV/ Business Central
Introduction: In this blog, I will highlight the Suggest Vendor Payments functionality in Payment Journals.

Pre-requisites: Microsoft Dynamics 365 Business Central or Microsoft Dynamics NAV.

Demonstration:
Last Payment Date: The latest date that can appear on Vendor Ledger Entries for making the payment. Entries with a Due Date or Discount Date before the Last Payment Date will be included by the Suggest functionality.
Find Payment Discounts: Finds all the payments which have discounts. If this field is TRUE, the Suggest Vendor Payments functionality will include payments with discounts as well as payments without discounts.
Use Vendor Priority: While making payments, if you want to prepare the entries priority-wise, set the priority of each vendor under the Payments section of the Vendor Card. By default, the priority is 0; if no vendor priority is set, the vendor payments are processed in alphabetical, first-come-first-served order.
Available Amount: If the Available Amount is filled in, it is treated as the maximum amount for the suggested payments.
Skip Exported Payments: This boolean is used to skip payments for which a payment EFT file has already been exported. It is generally used where the existing payments are not yet to be posted but new payments need to be included in the list.
Check Other Journal Batches: This boolean skips all payments that have already been included in other General Journal Batches, which avoids suggesting duplicate payments.
Summarize Per Vendor: Makes a single payment per vendor, applied to multiple documents. This option still creates separate payment entries for payments in different currencies. It cannot be used in conjunction with 'Calculate Posting Date from Applied Entries'.
By Dimension: Creates the entries so that the applied document has the correct dimension.
Posting Date: Sets the posting date for the General Journal Lines. It is generally set to today's date or a date after the Last Payment Date.
Calculate Posting Date from Applied Entries: Sets the posting date based upon the Due Date of the applied document.
Starting Document No.: The starting number series for the Payment Journal. This field is generally auto-filled and is set to the next number after the current Payment Journal Lines have been created.
New Doc. No. per Line: Creates a separate Document No. for each entry line. When set to false, a single Document No. is automatically used for multiple entries of a single vendor.
Bal. Account Type: The balance account type can be a G/L Account or a Bank Account.
Bal. Account No.: Used to generate payments only from a specific bank account.
Vendor Filters: Used to suggest only those payments related to particular vendors.

Conclusion: Thus, Suggest Vendor Payments can be a pretty good tool for dealing with payments without having to manually reconcile Vendor Purchase Invoices, and it offers many variants and filters to get the pending payments.