
Tag Archives: Azure

How to import an existing Logic App template in Azure Logic Apps

If you previously exported a Logic App template for later use and have since deleted that Logic App, follow the steps below to import the template as a new Logic App. Go to https://portal.azure.com and open any Logic App; for example, I took the Logic App below. Go to Export Template and click on Deploy. Click on Edit Template, then click on Load File and select the template you want to import. Change the name to whatever you want; here I will keep it as it is, and copy the name for further use. Click on Save, then check that the name and resource group shown on this page are as desired. Click on Review + Create, and then click on Create to create the Logic App. The Logic App template is now imported and ready to use (a sketch of what an exported template looks like follows this post). I hope this was helpful.
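For reference, an exported Logic App template is a standard ARM deployment template; below is a minimal sketch of its shape, with an illustrative name and an empty workflow definition rather than anything from the original post:

    {
      "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
      "contentVersion": "1.0.0.0",
      "parameters": {
        "workflows_mylogicapp_name": {
          "defaultValue": "mylogicapp",
          "type": "String"
        }
      },
      "resources": [
        {
          "type": "Microsoft.Logic/workflows",
          "apiVersion": "2017-07-01",
          "name": "[parameters('workflows_mylogicapp_name')]",
          "location": "eastus",
          "properties": {
            "state": "Enabled",
            "definition": {
              "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
              "contentVersion": "1.0.0.0",
              "triggers": {},
              "actions": {},
              "outputs": {}
            }
          }
        }
      ]
    }

Changing the parameter's default value is what renames the imported Logic App in the steps above.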

Secure your Environment Variables inside an Azure Function using Key Vault

In the previous blog, we learned how to use environment variables in an Azure Function. Environment variables are important and confidential, as they may contain system credentials or configuration that you don't want anyone to access directly. Storing them directly in the Azure Function's Configuration is the least secure option, so in this blog we will see how to store environment variables inside Key Vault and reference them from the Configuration. To create an Azure Function that uses environment variables, refer to my previous blog.

Step 1: Create a Key Vault. Log in to the Azure Portal, click on + Create a resource, and search for "Key Vault". Click on Create. You can create a new resource group or select an existing one based on your preference. (Keeping all resources for a single project in one resource group makes them easier to manage project-wise.) You need to set up an access policy either during creation or after creating the Key Vault; we will set up the access policy for the Azure Function later in this blog.

Step 2: Set an access policy for the Azure Function. Open the Azure Function in which you want to use the Key Vault reference. Navigate to the Identity tab and toggle the System assigned status to "On". Copy the Object (Principal) ID, as we will use it when adding the access policy. Open the Key Vault and navigate to the Access Policies section. Click on + Add Access Policy. We will add only the Get permission in the Secret permissions section; grant only the permissions you want to allow your Azure Function to perform. Select the principal, pasting the copied Object ID into the search box to find it more easily, and click on Add. Once you add the resource to the access policy, your Azure Function will be allowed to perform Get operations on all the secrets in the Key Vault. Do not forget to Save after you add the policy.

Step 3: Add a secret and configure it inside the Azure Function. Navigate to the Secrets section and click on + Generate/Import. Enter the name and value (credentials/configuration). Open the current version and copy the secret URL, as it will be required while configuring the reference. Navigate to the Configuration section of the Azure Function and change the setting to a Key Vault reference as below:

@Microsoft.KeyVault(SecretUri=<Copied SecretURL>)

Do not forget to Save after changing the configuration.

Step 4: Testing using Postman. We will need an API testing tool; here I am using Postman, and the following is the link to download it: https://www.postman.com/downloads/. Copy the Function URL and send a POST request. In the result, you will notice the username now comes from Key Vault, since we changed the configuration.

Conclusion: This is how you can secure your environment variables using Azure Key Vault. You can set up multiple Key Vaults for different deployments, with access policies depending on the requirement. When you store credentials and configuration in Key Vault, you only need to set up the access policy. All configuration is managed by the Azure admin; if the organization has a policy of not sharing credentials, it can follow this process and share only the Key Vault URL with developers, who then only need to configure the reference.
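Because the platform resolves the Key Vault reference before your code runs, the function reads the setting exactly as it would a plain app setting. A minimal C# sketch, assuming a hypothetical app setting named "Username" whose portal value is the Key Vault reference shown above:

    using System;

    public static class SecretDemo
    {
        public static string GetUsername()
        {
            // At runtime the platform has already substituted the resolved
            // secret for the @Microsoft.KeyVault(...) reference, so a plain
            // environment lookup returns the secret value.
            return Environment.GetEnvironmentVariable("Username");
        }
    }

No code change is needed when you move a setting into Key Vault; only the configuration value changes.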

Using Environment Variables inside an Azure Function in C#

In this blog, we will learn how to configure and use environment variables in an Azure Function. Environment variables are useful when a development needs to be deployed to multiple servers and connect to multiple systems: storing global constants or system credentials in environment variables saves the time otherwise spent modifying the code base for each deployment. Let's get started with creating the Azure Function and using an environment variable.

Step 1: Create an Azure Function inside Visual Studio. I have created an HTTP trigger Azure Function in Visual Studio with the authorization level set to Anonymous. Once you create an Azure Function project, you will notice many files created with the project. The "local.settings.json" file is the one we are going to use; it holds the environment variables during the development and testing phase of your Azure Function.

Step 2: Declare credentials/global variables inside "local.settings.json" (a sketch of this file appears at the end of this post). This file represents the local application settings for your Azure Function and stays with your local build only. When you publish your Azure Function to the Azure Portal or commit your code to your repo, this file is never pushed to the server, as it is excluded from commits and publishing.

Step 3: Access the variables inside the code. You can access environment variables using the Environment class (from the System namespace). Based on your preference, you can either access an environment variable directly inside the code or create a constants class, as in the screenshot below (see also the C# sketch at the end of this post). Here, I am passing the environment variable into the response so that we can test it.

Step 4: Testing using Postman. We will need an API testing tool; here I am using Postman, and the following is the link to download it: https://www.postman.com/downloads/. To test the application, click on the Start button at the top of the navbar, as shown in the screenshot below (the button will show the project name). It will take a few minutes to load the Azure emulator. You will then see a screen with the function URL; copy the URL highlighted in red and paste it into Postman.

Result: Next, you need to configure the environment variables when deploying the Azure Function to the Azure Portal. Below are the steps:

Step 1: Deploy the Azure Function to the Azure Portal. You can create it directly in the Portal or from Visual Studio. Here I am going to create and deploy it from Visual Studio: right-click on the project and select Publish. Select Azure and click on Next. Click on the + icon to create a new Function App; you can use an existing resource group or create a new one as per your preference. After deployment completes, you can see the deployed Azure Function in the Azure Portal. If you now call the Azure Function URL with a POST request, you will notice the environment-variable part of the response is missing, because we have not yet configured the environment variables in the Function App on the Azure Portal.

Step 2: Configure the app environment variables on the Azure Portal for the online deployment. Navigate to the Configuration section of your deployed Azure Function and add application settings with the same names as those declared in your local.settings.json file. Make sure you click on Save to update the configuration on the Portal.

After the configuration, if you test your deployed Azure Function using Postman, you will get the expected result. This is how you can configure and use environment variables inside your Azure Function. It will make your life easier when deploying to Development, UAT, or Production environments. You can store source and destination system credentials, SQL access credentials, or custom global variables based on the business requirements of your integration.
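For reference, here is a minimal sketch of the two pieces described above; the setting names "Username" and "Password" are illustrative assumptions, not taken from the original screenshots. First, the local.settings.json file:

    {
      "IsEncrypted": false,
      "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet",
        "Username": "dev-user",
        "Password": "dev-password"
      }
    }

And a minimal HTTP-triggered function (in-process model) that reads a setting and echoes it in the response:

    using System;
    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;

    public static class EnvDemoFunction
    {
        [FunctionName("EnvDemoFunction")]
        public static IActionResult Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "post")] HttpRequest req)
        {
            // Locally this reads from local.settings.json; on Azure it reads
            // the Function App's Application settings.
            string username = Environment.GetEnvironmentVariable("Username");
            return new OkObjectResult($"Username: {username}");
        }
    }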

Create new records in CRM from JSON data in Blob Storage

If you want to create new records from JSON data, you can do it using a Logic App. First, make sure the JSON file is uploaded to a blob container.

Steps for the Logic App:

First, select an HTTP trigger block in a new Logic App designer. Next, select Get blob content (V2) and choose the storage account name and the blob file where you uploaded the JSON file.

Note: If you don't already have a connection to the Blob Storage account, you need to create one by clicking on Change connection and then Add new. You need to fill in the details properly to create a correct connection. The access key of the storage account can be found in the storage account's Access keys section.

Now select the Parse JSON block to extract all the values. Click on Use sample payload and paste your payload there to generate a schema (a sample payload of this kind follows this post).

Note: If you put File content directly into Content, you will get an error after you run the logic, because the blob content is returned as an octet-stream. To convert it to JSON, use the following expression:

json(body('Get_blob_content_(V2)'))

Once your JSON is parsed, create the records in CRM from this data by selecting the Create a new record block for CRM. You need to first sign in with your CRM account, then choose the organization and entity, and finally map the fields.

Note: If there are multiple records in the JSON, the Logic App will automatically wrap the Create a new record block in a For each block as a step. The records will now be created in CRM.
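For reference, a sample of the kind of JSON payload the Parse JSON block expects; the field names and values here are illustrative assumptions, not taken from the original file:

    [
      {
        "name": "Contoso Ltd",
        "accountnumber": "1001",
        "city": "Mumbai"
      },
      {
        "name": "Fabrikam Inc",
        "accountnumber": "1002",
        "city": "Pune"
      }
    ]

Because the payload is an array, the designer wraps the Create a new record action in the For each loop mentioned in the note above.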

How to create a table using Azure Logic Apps with proper formatting

In this blog, we will see how to create a table in HTML using an Azure Logic App. If you look carefully, there is an action block called "Create HTML table", but it does not give you any formatting flexibility. So in this blog, I will explain how to use a Compose block to create a table with HTML syntax.

Step 1: To start the Logic App, I used the Recurrence trigger, set to run once a day; you can use any trigger as per your requirement.

Step 2: The Compose block is the important one, as this is where we write the HTML syntax to format our table (see the sketch after this post). For the demo I used sample data in the highlighted section, but you can enter dynamic fields as well.

Step 3: The output of the Compose block is sent as the body of the email, as shown in the screenshot below.

Output: Email

Hope this blog helps you. Thank you!
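A minimal sketch of the kind of HTML you might write in the Compose block; the styling, column names, and sample row are illustrative assumptions. Inline styles are used because many email clients ignore style blocks:

    <table style="border-collapse: collapse; width: 100%;">
      <tr style="background-color: #4472c4; color: #ffffff;">
        <th style="border: 1px solid #999999; padding: 6px;">Account</th>
        <th style="border: 1px solid #999999; padding: 6px;">City</th>
      </tr>
      <tr>
        <td style="border: 1px solid #999999; padding: 6px;">Contoso Ltd</td>
        <td style="border: 1px solid #999999; padding: 6px;">Mumbai</td>
      </tr>
    </table>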

Update records in Dynamics CRM using Azure Logic Apps

In this blog, we will see how to update records in CRM with the help of a Logic App workflow.

Step 1: Add the Recurrence trigger to the Logic App and set it to run at a one-day interval; you can set any interval. Without a trigger, you cannot create a Logic App.

Step 2: Add a new step after the Recurrence trigger.

Step 3: Add the List records action from Dynamics 365, connect to CRM with your credentials, and select the Account entity.

Step 4: For testing purposes, I have created a test account (account number = 1001) in the UAT environment, as shown below.

Step 5: Initialize a variable with the account number 1001, i.e. the account where you want to change/update the data.

Step 6: Filter the list of accounts to those whose account number equals 1001, as mentioned in the step above (see the filter sketch after this post).

Step 7: After finding the record in the account list, update it; here I updated the account name and city. (Note: the account number should be unique.)

Result: Hope this blog helps you. Thank you!
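The List records action accepts an OData expression in its Filter Query field; a minimal sketch of the filter used in Step 6, assuming the standard accountnumber attribute on the Account entity:

    accountnumber eq '1001'

Filtering on the server side like this avoids listing every account and looping over the results in the Logic App.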

Trigger an Azure Data Factory pipeline with a Logic App

Hello friends, in this blog we will see how to trigger an Azure Data Factory pipeline using a Logic App.

Step 1: Create an Azure Data Factory pipeline for your integration.

Step 2: Create a Logic App of your preference; for this blog, I am creating an HTTP trigger Logic App.

Step 3: Click on Add step and search for Azure Data Factory.

Step 4: Select Create a pipeline run and fill in the required information (see the sketch after this post for what the connector calls under the hood).

Step 5: Trigger your Logic App and let it finish the run. Once that is done, go to the Monitor section of your Data Factory and check whether the integration pipeline was triggered. Since we are triggering the pipeline from the Logic App, the trigger will show as Manual instead of your ADF trigger name.

Hope this helps.
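For context, the Create a pipeline run action corresponds to the Data Factory REST API's Create Run operation; a sketch of the underlying call, with the values in braces as placeholders:

    POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.DataFactory/factories/{factoryName}/pipelines/{pipelineName}/createRun?api-version=2018-06-01

The connector fills these in from the subscription, factory, and pipeline you select in Step 4.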

Connect Azure Databricks to Power BI

Open Power BI, click on Get Data, search for Azure Databricks, and click on Connect. It will ask for the following details:

Server Hostname
HTTP Path

Now we will see how to get the above details. Go to Azure Databricks and click on Clusters. Once the cluster is opened, go to Advanced Options > JDBC/ODBC. Here you can find the Server Hostname and HTTP Path, which can be used in the steps above (see the sketch after this post for the typical format).

Fill in the details and click on OK. It will ask for user credentials, after which a pop-up will open asking you to select from a list of tables. Select the tables and click on Load.

In this way we can create a Power BI report based on the current data received from Azure Databricks and build fields on top of it.
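For reference, the two values typically look like the following; the workspace and cluster IDs below are illustrative placeholders, not real connection details:

    Server Hostname: adb-1234567890123456.7.azuredatabricks.net
    HTTP Path: sql/protocolv1/o/1234567890123456/0123-456789-abcde123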

Create Linked Service with Salesforce in Azure Data Factory

Pre-requisites: a Salesforce account (in this case we have a Developer trial license) and an Azure Data Factory in Azure.

Steps: To create a linked service for Salesforce in Azure Data Factory, we need a username, password, and security token. The username and password are the same credentials we use to log in to Salesforce, and we can generate the security token by following these steps: go to Setup, then Personal Information > Reset My Security Token, and click on Reset Security Token. Once we click on Reset Security Token, we will receive an email containing the security token; a sample email is shown below.

Now go to Linked services, create a new connection, and search for the Salesforce connector. When we select the Salesforce connector, it will ask us to enter the credentials (a sketch of the resulting linked service definition follows this post). Once the connection is created, create the datasets and then the pipeline. Review the mapping and run the pipeline.
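For reference, the JSON definition behind such a linked service looks roughly like this; the name, username, and environment URL are illustrative assumptions, and in practice the password and security token would typically be referenced from Azure Key Vault rather than stored inline:

    {
        "name": "SalesforceLinkedService",
        "properties": {
            "type": "Salesforce",
            "typeProperties": {
                "environmentUrl": "https://login.salesforce.com",
                "username": "user@example.com",
                "password": { "type": "SecureString", "value": "<password>" },
                "securityToken": { "type": "SecureString", "value": "<security token>" }
            }
        }
    }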
