
Category Archives: Azure

How to Capture Exception Logs When an Azure Data Factory Pipeline Fails

Posted On February 20, 2020 by Sandip Patel

In this blog I am going to explain how to store error logs using a Stored Procedure activity when an Azure Data Factory pipeline fails.

First, create an ExceptionLogs table using the following SQL script:

    CREATE TABLE [dbo].[ExceptionLogs](
        [DataFactoryName] [varchar](100) NULL,
        [PipelineName] [varchar](100) NULL,
        [RunId] [varchar](100) NULL,
        [ErrorMessage] [varchar](1000) NULL,
        [CreatedOn] [datetime] NULL
    ) ON [PRIMARY]
    GO

    ALTER TABLE [dbo].[ExceptionLogs] ADD CONSTRAINT [DF_CreatedOn] DEFAULT (getdate()) FOR [CreatedOn]
    GO

Next, create the stored procedure that you are going to call from the ADF pipeline:

    CREATE PROCEDURE [dbo].[Usp_ExceptionLog]
    (
        @DataFactoryName varchar(100),
        @PipelineName varchar(100),
        @RunId varchar(100),
        @ErrorMessage varchar(1000)
    )
    AS
    BEGIN
        INSERT INTO ExceptionLogs ([DataFactoryName], [PipelineName], [RunId], [ErrorMessage])
        VALUES (@DataFactoryName, @PipelineName, @RunId, @ErrorMessage)
    END
    GO

Here we pass parameters such as the ADF name, pipeline name, run ID, and error message, which will be mapped in the ADF pipeline.

Suppose you have already created a pipeline with some activities in ADF. Add a Stored Procedure activity to the canvas and connect another activity to this new activity using the arrow. When the connection has been made, right-click on the connection to change it to a Failure precedence constraint; this will change the color of the connector to red. Now click on the Stored Procedure activity and set its details, such as the stored procedure name, and set the stored procedure parameters as seen in the screenshot below.

Now we can run the pipeline and verify that, on failure, the exception details are stored in the table. I hope this helps.
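For reference, the JSON behind a Stored Procedure activity wired up this way looks roughly like the sketch below. This is an illustration, not the exact definition from the post: the failed activity name "Copy data" and the linked service name "AzureSqlDatabase" are placeholders for your own, and the error message is pulled from the failed activity's output.

    {
        "name": "Log pipeline error",
        "type": "SqlServerStoredProcedure",
        "dependsOn": [
            {
                "activity": "Copy data",
                "dependencyConditions": [ "Failed" ]
            }
        ],
        "linkedServiceName": {
            "referenceName": "AzureSqlDatabase",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "storedProcedureName": "[dbo].[Usp_ExceptionLog]",
            "storedProcedureParameters": {
                "DataFactoryName": { "value": "@{pipeline().DataFactory}", "type": "String" },
                "PipelineName": { "value": "@{pipeline().Pipeline}", "type": "String" },
                "RunId": { "value": "@{pipeline().RunId}", "type": "String" },
                "ErrorMessage": { "value": "@{activity('Copy data').error.message}", "type": "String" }
            }
        }
    }

The @{pipeline().…} system variables and the @{activity('…').error.message} expression correspond to the parameter mappings configured in the activity's settings pane.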


How to Trigger a Pipeline in ADF?

Introduction: This blog will guide you through scheduling your pipeline in ADF with the help of a scheduled trigger. Time is crucial when you schedule your pipeline, so go through all the steps to avoid a common mistake.

Step 1: Click on Trigger and select "New/Edit".
Step 2: Click on "New".
Step 3: Select Type = "Scheduled". Set the Start Date (UTC), set the recurrence to every 1 week(s), and select the required day(s).
Step 4: Click on OK and publish the changes.
Step 5: The time you enter here is in UTC, so convert the local time at which you want the pipeline to run to UTC and set it accordingly. You can use the following link to convert it: https://www.prokerala.com/travel/timezones/time-converter.php
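Behind the designer, the trigger is saved as JSON. A minimal sketch of the weekly schedule trigger described above might look like this (the trigger name, pipeline name, weekday, and start time are placeholders):

    {
        "name": "WeeklyTrigger",
        "properties": {
            "type": "ScheduleTrigger",
            "typeProperties": {
                "recurrence": {
                    "frequency": "Week",
                    "interval": 1,
                    "startTime": "2020-03-02T03:30:00Z",
                    "timeZone": "UTC",
                    "schedule": {
                        "weekDays": [ "Monday" ],
                        "hours": [ 3 ],
                        "minutes": [ 30 ]
                    }
                }
            },
            "pipelines": [
                {
                    "pipelineReference": {
                        "referenceName": "MyPipeline",
                        "type": "PipelineReference"
                    }
                }
            ]
        }
    }

Note that both startTime and the schedule are expressed in UTC, which is why the conversion in Step 5 matters.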


How to Send Email Notifications for Failed Pipeline Runs : Part 1

Introduction: ADF has a feature to monitor and audit ADF activity. These alerts can be fired on both success and failure of a pipeline, depending on how we configure them.

Step 1: Go to the ADF Monitor and click "New Alert Rule" to create a new alert.
Step 2: Set the alert rule name and its severity:
Sev 0 = Critical
Sev 1 = Error
Sev 2 = Warning
Sev 3 = Informational
Sev 4 = Verbose
Here we will select Sev 1.
Step 3: Set the alert criteria to the Failed pipeline runs metric, which will trigger only when a pipeline activity fails.
Step 4: Select the name of the pipeline for which you want to send the alerts.
Step 5: Select all the failure types.
Step 6: Set the alert logic to compare the metric value against a threshold calculated based on time aggregation, and set the period and frequency on which that time aggregation works. For now, keep these options at their defaults and click on Add Criteria. Note: only two criteria can be added.

Part 2 of this blog covers configuring the email notification for these alerts.
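If you prefer to script the rule rather than use the portal, the same alert can be declared as an Azure Monitor metric alert in an ARM template. The sketch below is an illustration under assumptions: the subscription, resource group, factory, and pipeline names are placeholders, and the FailureType values shown are examples of what "select all failure types" might map to in your factory.

    {
        "type": "Microsoft.Insights/metricAlerts",
        "apiVersion": "2018-03-01",
        "name": "ADF failed pipeline runs",
        "location": "global",
        "properties": {
            "severity": 1,
            "enabled": true,
            "scopes": [
                "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.DataFactory/factories/<factory-name>"
            ],
            "evaluationFrequency": "PT1M",
            "windowSize": "PT5M",
            "criteria": {
                "odata.type": "Microsoft.Azure.Monitor.SingleResourceMultipleMetricCriteria",
                "allOf": [
                    {
                        "name": "FailedRuns",
                        "metricName": "PipelineFailedRuns",
                        "timeAggregation": "Total",
                        "operator": "GreaterThanOrEqual",
                        "threshold": 1,
                        "dimensions": [
                            { "name": "Name", "operator": "Include", "values": [ "MyPipeline" ] },
                            { "name": "FailureType", "operator": "Include", "values": [ "UserError", "SystemError" ] }
                        ]
                    }
                ]
            },
            "actions": []
        }
    }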


How to Send Email Notifications for Failed Pipeline Runs : Part 2

Introduction: ADF has a feature to monitor and audit ADF activity. These alerts can be fired on both success and failure of a pipeline, depending on how we configure them. We already created the target criteria in Part 1 of this blog; in this part we will configure an email notification for failed pipeline runs.

Step 1: Under "Configure Email/SMS/Push/Voice notification", click on Configure Notification to set an action group. An action group defines a set of notification preferences and actions used by Azure alerts.
Step 2: Select Create new, give an action group name and short name, and click on Add notification.
Step 3: Give the action a name and check the Email option.
Step 4: Add the email address and click on Add notification.
Step 5: You can see that your notification is now added; click on Add action group.
Step 6: Once your target criteria and notifications are added and "Enable rule upon creation" is enabled, click on Create Rule.
Step 7: A new email alert has now been created for pipeline failures.
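For completeness, the action group created above can also be expressed as an ARM resource. This is a minimal sketch, assuming placeholder values for the group name, short name, and e-mail address:

    {
        "type": "Microsoft.Insights/actionGroups",
        "apiVersion": "2019-06-01",
        "name": "adf-failure-alerts",
        "location": "Global",
        "properties": {
            "groupShortName": "adf-fail",
            "enabled": true,
            "emailReceivers": [
                {
                    "name": "Notify admin",
                    "emailAddress": "myemail@outlook.com",
                    "useCommonAlertSchema": true
                }
            ]
        }
    }

The metric alert rule from Part 1 then references this action group by its resource ID in its actions list.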


How to Get an Email Notification When an Azure Data Factory Pipeline Fails

There is no "e-mail activity" in Azure Data Factory, yet you may want to send an e-mail notification when one of the activities fails. In this blog I am going to explain how to send an e-mail notification using an ADF Web activity and an Azure Logic App.

Sending an Email with Logic Apps

Logic Apps allows you to easily create a workflow in the cloud without having to write much code. First, log in to portal.azure.com, choose to create a new resource, and search for Logic App. Click on the Create button and you will be asked to specify some details for the new Logic App. Click on Review + create to finalize the creation of your new Logic App. After the app is deployed, you can find it in the resource menu; click on the app to go to the app itself, then click on the Logic app designer link.

In this tip, we use the HTTP request trigger ("When a HTTP request is received"), since we are going to use the Web activity in ADF to start the Logic App. From ADF, we pass along some parameters in the HTTP request, which we will use in the e-mail later on. This is done by sending JSON in the body of the request, using the following JSON schema:

    {
        "properties": {
            "DataFactoryName": { "type": "string" },
            "EmailTo": { "type": "string" },
            "ErrorMessage": { "type": "string" },
            "PipelineName": { "type": "string" },
            "Subject": { "type": "string" }
        },
        "type": "object"
    }

We are sending the following information:

The name of the data factory. Suppose we have a large environment with multiple instances of ADF; we would like to know which ADF has a pipeline with an error.
The e-mail address of the receiver.
An error message.
The name of the pipeline where there was an issue.
The subject of the e-mail.

In the editor, click on New step to add the action that will send the e-mail. When you search for "mail" you will see there are many different actions; click on Office 365 Outlook and select "Send an email (V2)". Once you are logged in, you can configure the action, using dynamic content to populate some of the fields. Once you click on Save, an HTTP POST URL is generated for the trigger; copy this URL for use in ADF.

Triggering the Logic App from ADF

Suppose you have already created a pipeline with some activities in ADF. Add a Web activity to the canvas and connect another activity to this new activity using the arrow. When the connection has been made, right-click on the connection to change it to a Failure precedence constraint; this will change the color of the connector to red. In the Web activity settings, paste the HTTP POST URL copied from the Azure Logic App. We also need to add a header, where we set the Content-Type to application/json. In the body, we enter the following JSON (following the schema mentioned before):

    {
        "DataFactoryName": "@{pipeline().DataFactory}",
        "PipelineName": "@{pipeline().Pipeline}",
        "Subject": "An error has occurred!",
        "ErrorMessage": "The ADF pipeline has crashed! Please check the logs.",
        "EmailTo": "myemail@outlook.com"
    }

We are using system parameters to retrieve the name of the data factory and the name of the pipeline. All the other fields in the settings pane can be left as-is. Now we can run the pipeline and wait to see if any emails come in. I hope this helps.
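Putting the pieces together, the JSON behind such a Web activity looks roughly like the sketch below. The failed activity name "Copy data" is a placeholder, and the url value stands in for the HTTP POST URL copied from the Logic App:

    {
        "name": "Notify on failure",
        "type": "WebActivity",
        "dependsOn": [
            {
                "activity": "Copy data",
                "dependencyConditions": [ "Failed" ]
            }
        ],
        "typeProperties": {
            "url": "<HTTP POST URL copied from the Logic App>",
            "method": "POST",
            "headers": {
                "Content-Type": "application/json"
            },
            "body": {
                "DataFactoryName": "@{pipeline().DataFactory}",
                "PipelineName": "@{pipeline().Pipeline}",
                "Subject": "An error has occurred!",
                "ErrorMessage": "The ADF pipeline has crashed! Please check the logs.",
                "EmailTo": "myemail@outlook.com"
            }
        }
    }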


Power BI report using BYOD (Bring Your Own Database)

Posted On December 19, 2019 by Admin

In this blog, we will learn how to create an Azure SQL database in the Azure portal and export Finance and Operations data into it.

Steps:

Go to portal.azure.com and create a new SQL Database resource. Create a new database, click on a new server, and fill in the necessary details; the login ID and password will be the same ones you use to authenticate against the database. Now the database is ready. You can click on the Basic pricing tier to change the pricing of the database.

Now copy the connection string by clicking on "Show database connection strings", then edit the string and add your login ID and password to it.

Now go to Finance and Operations, open the Data management workspace, and click on Configure entity export to database. Click on New and fill in the required details; the name and description can be anything of your choice. Paste the connection string and click on Validate. Once validated, turn the "Enable triggers in target database" toggle to ON.

Now click on Publish and select the entity that you want to export to your database. Click on Change tracking and select "Enable entire entity"; this will detect newly added records in the table and export them to the database. Now click on Publish to publish the entity. On publishing, only the schema is exported to the database, with the entity name as the table name.

Now go back to the Data management workspace and click on the Export tab. Fill in the name and description fields as you wish and click on Add entity. In the Add entity section, enter the name of the entity that you published earlier and select the target entity; in our case it is DemoBYOD. The refresh type should be "Incremental push only".

Now that the entity is added, let's export it. Click on "Export in batch" and set the recurrence according to your requirement; set the end date to "No end date" for the job to run indefinitely.

The database is ready, so let's connect it to Power BI. Open Power BI Desktop, click on Get Data, and select the SQL Server database option. You already have the server name and database name with you. Select the required tables and load the data. All set, you can start building your report! Hope this helps!
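For illustration, the ADO.NET connection string shown by the Azure portal looks like the following once the credentials have been filled in; the server, database, login, and password here are all placeholders:

    Server=tcp:myserver.database.windows.net,1433;Initial Catalog=DemoBYOD;Persist Security Info=False;User ID=mylogin;Password=mypassword;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;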
