
Tag Archives: Azure

Create new records in CRM from JSON data in Blob Storage

If you want to create new records in CRM from JSON data, you can do it using a Logic App. First, make sure the JSON file is uploaded to a blob container.

Steps for the Logic App:

First, select an HTTP trigger block in a new Logic App designer.

Next, select Get blob content (V2) and choose the storage account name and the blob file where you uploaded the JSON file. Note: If you don't already have a connection to the Blob Storage account, you need to create one by clicking on Change connection and then Add new. You need to fill in the details properly to create a correct connection. The access key of the storage account can be found in the storage account under the Access keys section.

Now select the Parse JSON block to extract all values. Click on Use sample payload and paste your payload there to generate a schema. Note: If you put the File content output directly into Content, you will get an error when you run the logic app, because Get blob content returns the file as an octet-stream. Convert it to JSON with the following expression: json(body('Get_blob_content_(V2)'))

Once your JSON is parsed, create the records in CRM from this data by selecting the Create a new record block for CRM. You need to first sign in with your CRM account, then choose the organization and entity, and finally map the fields. Note: If there are multiple records in the JSON, the Logic App will automatically wrap the Create a new record block in a For each block. The records will now be created in CRM. A sample payload is shown below.
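For reference, a minimal sketch of what the JSON in the blob might look like (the field names name and telephone1 are hypothetical; use whatever attributes your entity mapping actually needs):

[
  { "name": "Contoso Ltd", "telephone1": "555-0100" },
  { "name": "Fabrikam Inc", "telephone1": "555-0199" }
]

Because this payload is an array of two records, the Create a new record block would be wrapped in a For each block, and the Content field of Parse JSON would be set to json(body('Get_blob_content_(V2)')).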

How to create a table using Azure Logic Apps with proper formatting

In this blog, we will see how we can create an HTML table using an Azure Logic App. There is an action block called Create HTML table, but it does not give any formatting flexibility, so in this blog I will explain how we can use a Compose block to build a table with HTML syntax.

Step 1: To start the Logic App, I used the Recurrence trigger, set to run once a day. You can use any trigger as per your requirement.

Step 2: The Compose block is the important one, as this is where we write the HTML syntax to format our table. For the demo I used sample data, but you can insert dynamic fields as well. A sketch of the HTML is shown at the end of this post.

Step 3: The output of the Compose block is sent as the body of the email.

Hope this blog helps you. Thank you!!
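For reference, a minimal sketch of the HTML that could go into the Compose block in Step 2 (the rows here are sample data; inline styles are one way to get the formatting that Create HTML table does not offer):

<table style="border-collapse: collapse; width: 100%;">
  <tr style="background-color: #4472C4; color: #ffffff;">
    <th style="border: 1px solid #000; padding: 5px;">Name</th>
    <th style="border: 1px solid #000; padding: 5px;">City</th>
  </tr>
  <tr>
    <td style="border: 1px solid #000; padding: 5px;">Contoso Ltd</td>
    <td style="border: 1px solid #000; padding: 5px;">Mumbai</td>
  </tr>
</table>

When the Outputs of the Compose block are passed as the email body, the email client renders this markup as a formatted table.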

Update records in Dynamics CRM using Azure Logic Apps

In this blog, we will see how we can update records in CRM with the help of a Logic App workflow.

Step 1: Add the Recurrence trigger in the Logic App and set it to run at a one-day interval (you can set any interval). Without a trigger, you cannot create a Logic App.

Step 2: Add a new step after the Recurrence trigger.

Step 3: Add the List records action from Dynamics 365, connect to CRM with your credentials, and select the Account entity.

Step 4: For testing purposes, I have created a test account (account number = 1001) in the UAT environment.

Step 5: Initialize a variable with the account number 1001, i.e. the account whose data you want to change/update.

Step 6: Filter the list of accounts down to those where the account number is equal to 1001, as set up in the above step; a sample filter is shown at the end of this post.

Step 7: After finding the record in the account list, update it. Here I updated the account name and city. (Note: the account number should be unique.)

Hope this blog helps you. Thank you!
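For reference, one way to do the filtering in Step 6 is through the Filter Query field of the List records action, which accepts an OData expression. Assuming the standard accountnumber attribute on the Account entity, a minimal sketch would be:

accountnumber eq '1001'

Alternatively, you can loop over the returned records with a Condition block that compares each account number against the variable initialized in Step 5.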

Trigger Azure Pipeline with logic app

Hello friends, in this blog we will see how to trigger an Azure Data Factory pipeline using a Logic App.

Step 1: Create an Azure Data Factory pipeline for your integration.

Step 2: Create a Logic App of your preference; for this blog, I am creating an HTTP trigger Logic App.

Step 3: Now click on Add step and search for Azure Data Factory.

Step 4: Select Create a pipeline run and fill in the required information.

Step 5: Trigger your Logic App and let it finish the run. Once that is done, go to the Monitor section of your Data Factory and check whether the integration pipeline was triggered. Since we are triggering the pipeline from the Logic App, the trigger will show up as a manual trigger instead of your ADF trigger name. Hope this helps.
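For reference, the Create a pipeline run action wraps the Data Factory createRun REST API; a minimal sketch of the underlying call (myRG, myFactory, and myPipeline are placeholders) is:

POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/myRG/providers/Microsoft.DataFactory/factories/myFactory/pipelines/myPipeline/createRun?api-version=2018-06-01

Calling this endpoint directly with a bearer token is equivalent to what the connector does for you, which is also why the run shows up as a manual trigger in the Monitor section.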

Connect Azure Databricks to Power BI

Open Power BI and click on Get Data, search for Azure Databricks, and click on Connect. It will ask for the below details:

Server Hostname
HTTP Path

Now we will see how to get the above details. Go to Azure Databricks and click on Clusters. Once the cluster is open, go to Advanced Options > JDBC/ODBC. Under this tab we can get the Server Hostname and HTTP Path, which can be used in the step above.

Fill in the details and click on OK. It will ask for user credentials, and after that it will open a pop-up asking you to select from the list of tables. Select the tables and click on Load.

In this way we can create a Power BI report based on the current data received from Azure Databricks, and build fields on top of it.
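For reference, the two values copied from the JDBC/ODBC tab typically look something like the following (the IDs here are hypothetical; always copy the exact strings from your own cluster rather than constructing them by hand):

Server Hostname: adb-1234567890123456.7.azuredatabricks.net
HTTP Path: sql/protocolv1/o/1234567890123456/0123-456789-abcde123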

Create Linked Service with Salesforce in Azure Data Factory

Pre-requisites: a Salesforce account (in this case we have a Developer trial license) and an Azure Data Factory in Azure.

Steps: To create a linked service for Salesforce in Azure Data Factory, we need a username, a password, and a security token. The username and password are the same credentials we use to log in to Salesforce, and we can create the security token by following these steps: go to Setup, then Personal Information > Reset My Security Token, and click on Reset Security Token. Once we click on Reset Security Token, we will get an email containing the security token.

Now go to Linked services, create a new connection, and search for the Salesforce connector. When we select the Salesforce connector, it will ask us to enter the credentials; a sketch of the resulting definition is shown below. Once the connection is created, create the datasets and then the pipeline. Review the mapping and run the pipeline.
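For reference, a minimal sketch of the JSON behind such a linked service (all values are placeholders; for real use, consider storing the password and security token in Azure Key Vault):

{
  "name": "SalesforceLinkedService",
  "properties": {
    "type": "Salesforce",
    "typeProperties": {
      "environmentUrl": "https://login.salesforce.com",
      "username": "user@contoso.com",
      "password": { "type": "SecureString", "value": "<password>" },
      "securityToken": { "type": "SecureString", "value": "<security token>" }
    }
  }
}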

Triggering Azure Pipeline from on premise SQL Server

In this blog, we are going to trigger an ADF pipeline whenever an insert or update operation is performed on an on-premise SQL Server.

Steps:

Create the ADF pipeline. In this case we have already created the pipeline and its data flow.

Create a PowerShell script for authentication and for triggering the pipeline. To do so we need the following details: tenant ID, application ID, client secret, subscription ID, resource group name, API version, and pipeline name. A sketch of such a script is shown below.

Create a job in SQL Server and a trigger on the table for insert, update, or delete. Make sure the SQL Server Agent is running. Create a SQL job with the following job step: enter the step name, select the type as PowerShell, select the account that will be used to run the PowerShell script, and enter the code in the command section.

Next we create triggers on the tables on which insert, update, and delete operations will be performed.

Now, whenever we insert, update, or delete a row in the table, the pipeline will automatically get executed. We have the following data in the Accounts table; the moment we updated the currency of Account ID 1, the pipeline got triggered automatically. After some time, check the status. Hope the above helps!
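A minimal sketch of such an authentication-and-trigger script (all IDs and names are placeholders; it acquires a service principal token and then calls the Data Factory createRun REST API):

# Placeholders: replace with your own values
$tenantId   = "<tenant-id>"
$appId      = "<application-id>"
$secret     = "<client-secret>"
$subId      = "<subscription-id>"
$rg         = "<resource-group>"
$factory    = "<data-factory-name>"
$pipeline   = "<pipeline-name>"
$apiVersion = "2018-06-01"

# Get an access token for Azure Resource Manager using the service principal
$tokenBody = @{
    grant_type    = "client_credentials"
    client_id     = $appId
    client_secret = $secret
    resource      = "https://management.azure.com/"
}
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" `
    -Body $tokenBody).access_token

# Trigger the pipeline run
$runUri = "https://management.azure.com/subscriptions/$subId/resourceGroups/$rg" +
          "/providers/Microsoft.DataFactory/factories/$factory" +
          "/pipelines/$pipeline/createRun?api-version=$apiVersion"
Invoke-RestMethod -Method Post -Uri $runUri `
    -Headers @{ Authorization = "Bearer $token" } `
    -Body "{}" -ContentType "application/json"

The table trigger that kicks off the SQL job could then be as simple as the following (assuming a job named ADF Trigger Job):

CREATE TRIGGER trg_Accounts_RunPipeline
ON dbo.Accounts
AFTER INSERT, UPDATE, DELETE
AS
    EXEC msdb.dbo.sp_start_job N'ADF Trigger Job';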

Creating Azure Data Factory

Log in to the Azure portal and click on Create a resource > Analytics > Data Factory. Enter the required details and the data factory gets created; the Enable Git option will prompt you to enter the Git repository details, which can later be used for CI/CD. In this way we can create an Azure Data Factory.
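The same can also be scripted; a minimal sketch with the Az PowerShell module (the resource group and factory names are placeholders) might be:

# Requires the Az.DataFactory module and an authenticated session (Connect-AzAccount)
Set-AzDataFactoryV2 -ResourceGroupName "myRG" -Name "myDataFactory" -Location "East US"

Set-AzDataFactoryV2 creates the factory if it does not already exist.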

Creating Azure blob container

An Azure blob container is a service for storing large amounts of unstructured object data, such as text or binary files. We can expose blob storage publicly or use it privately. Below are the steps to create an Azure blob container:

Go to Storage accounts, select the storage account (in this case cloudstorage123) > Overview, and click on Containers. Inside, click on '+ Container' to create a new container; it will prompt you to enter the container name and public access level. The public access levels are:

Private: the default; the container is accessible only to the account owner.
Blob: allows public read access to blobs.
Container: allows public read and list access to the entire container.

After entering the above details, the container gets created. Now we will upload a blob; to do so, click on Upload. We can then see the CSV file uploaded under the Input folder. The upload options are as follows:

Select the file from the local system. In the Advanced section, select the authentication type: either the account key or an Azure AD user account.

Select the blob type; it can be block blob, page blob, or append blob.
Block blob: stores text and binary data, up to 4.7 TB. Block blobs are made up of blocks of data that are managed individually.
Append blob: like a block blob, but optimized for append operations, e.g. logging.
Page blob: stores random-access files up to 8 TB in size; page blobs store virtual hard drives and serve as disks for VMs.

Block size: select as per your requirement; it can be 64 KB, 128 KB, 256 KB, 512 KB, 2 MB, 4 MB, or 100 MB.

Enter the folder name into which you want to upload the blob; if you enter a folder name that does not exist, a new folder will be created. The same steps can also be scripted, as sketched below. Hope the above helps!
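A minimal sketch of the equivalent steps with the Az PowerShell module (the resource group, account, container, and file names are placeholders):

# Requires Az.Storage and an authenticated session (Connect-AzAccount)
$ctx = (Get-AzStorageAccount -ResourceGroupName "myRG" -Name "cloudstorage123").Context

# Create a private container (use -Permission Blob or Container for public read access)
New-AzStorageContainer -Name "mycontainer" -Context $ctx -Permission Off

# Upload a CSV as a block blob into the Input folder
Set-AzStorageBlobContent -File "C:\data\sample.csv" -Container "mycontainer" `
    -Blob "Input/sample.csv" -BlobType Block -Context $ctx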
