
Category Archives: Azure

Using Environment Variables inside an Azure Function in C#

In this blog, we will learn how to configure and use environment variables in an Azure Function. Environment variables are useful when the same code base has to be deployed to multiple servers and connect to multiple systems: storing global constants or system credentials in environment variables reduces the time otherwise spent modifying the code base for every deployment. Let's get started by creating the Azure Function and using an environment variable.

Step 1: Create an Azure Function inside Visual Studio. I have created an HTTP Trigger Azure Function in Visual Studio with the authorization level set to Anonymous. Once you create the Azure Function project you will notice several files created with it. The one we are going to use is "local.settings.json", which holds the environment variables during the development and testing phase of your Azure Function.

Step 2: Declare credentials/global variables inside "local.settings.json". A note on local.settings.json: this file represents the local application settings for your Azure Function and stays with your local build only. When you publish the function to the Azure Portal or commit your code to your repo, this file is never pushed to the server, as it is excluded from commits and publishing.

Step 3: Access the variables inside the code. You can access an environment variable using the Environment class (from the System namespace). Based on your preference you can either read the environment variable directly inside the code or wrap the setting names in a CONSTANT class. Here, I am passing the environment variable into the response so that we can test it; a minimal C# sketch is shown after these steps.

Step 4: Test using Postman. We need an API testing tool; here I am using Postman, which can be downloaded from https://www.postman.com/downloads/. To test the application, click the Start button at the top of the navbar (the button carries the project name). It will take a few moments to load the local Azure Functions emulator. Copy the URL it displays and paste that URL into Postman.

Result: the environment variable value is returned in the response.

Next, you need to configure the environment variables when deploying the Azure Function to the Azure Portal. The steps are as follows:

Step 1: Deploy the Azure Function to the Azure Portal. You can create it directly on the Portal or publish it from Visual Studio. Here I am publishing it from Visual Studio: right-click the project and select Publish, select Azure and click Next, then click the + icon to create a new Function App. You can use an existing resource group or create a new one as per your preference. After deployment completes you can see the deployed Azure Function on the Azure Portal. If you now post a request to the Azure Function URL, you will notice that the credential values are missing, because the environment variables have not yet been configured in the Function App on the Azure Portal.

Step 2: Configure the application settings on the Azure Portal for the online deployment. Navigate to the Configuration section of the deployed Azure Function and add an Application setting for each variable; the name must be the same as the one declared in your local.settings.json file. Make sure you click Save to update the configuration on the Portal.
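Here is a minimal sketch of what Step 3 can look like, assuming the in-process C# HTTP trigger template that Visual Studio scaffolds; the function name and the setting name SourceSystemPassword are placeholders for whatever keys you declare in local.settings.json and in the portal's Application settings:

using System;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class EnvironmentVariableDemo
{
    [FunctionName("EnvironmentVariableDemo")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req)
    {
        // Locally this reads from the Values section of local.settings.json;
        // once deployed, it reads from the Function App's Application settings.
        // "SourceSystemPassword" is a placeholder key, not the blog's actual setting name.
        string sourcePassword = Environment.GetEnvironmentVariable("SourceSystemPassword");

        // Returned in the response only so the value can be verified from Postman.
        return new OkObjectResult($"SourceSystemPassword: {sourcePassword}");
    }
}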
After configuring, if you test the deployed Azure Function using Postman you will get the expected result. This is how you can configure and use environment variables inside your Azure Function. It makes life easier when deploying the function to Development, UAT, or Production environments: you can store source and destination system credentials, SQL access credentials, or custom global variables, depending on the requirements of your integration.


Create new records in CRM from JSON data in Blob Storage

If you want to create new records in CRM from JSON data, you can do it using a Logic App. First, make sure the JSON file is uploaded to a blob container.

Steps for the Logic App:

First, select an HTTP trigger block in a new Logic App designer. Next, select the Get blob content (V2) action and choose the storage account name and the blob file where you uploaded the JSON file. Note: if you don't already have a connection to the Blob Storage account, create one by clicking Change connection and then Add new; fill in the details correctly to create a working connection. The access key of the storage account can be found in the Access keys section of the storage account.

Now add a Parse JSON block to extract all the values. Click Use sample payload and paste your payload there to generate a schema. Note: if you put File content directly into the Content field, the run will fail with an error, because Get blob content (V2) returns the file as an octet-stream. Convert it to JSON by using the following expression instead: json(body('Get_blob_content_(V2)'))

Once your JSON is parsed, create the records in CRM using this data by selecting the Create a new record block for CRM. Sign in with your CRM account, choose the organization and entity, and then map the fields. Note: if there are multiple records in the JSON, the Logic App will automatically wrap the Create a new record block in a For each block. The records will now be created in CRM.
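For illustration only, the JSON file uploaded to the blob container might look like the following; the fields here (firstname, lastname, emailaddress1 for a Contact) are just a hypothetical example, so use the attributes of whichever entity you are creating:

[
  {
    "firstname": "John",
    "lastname": "Doe",
    "emailaddress1": "john.doe@example.com"
  },
  {
    "firstname": "Jane",
    "lastname": "Smith",
    "emailaddress1": "jane.smith@example.com"
  }
]

Each object in this array becomes one Create a new record call, and the Parse JSON schema generated from this payload exposes firstname, lastname, and emailaddress1 as dynamic content for the field mapping.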


How to create a table using Azure Logic Apps with proper formatting

In this blog, we will see how to create a table with HTML formatting using an Azure Logic App. There is an action block called Create HTML table, but it does not give any formatting flexibility, so in this blog I will explain how to use a Compose block to build the table with HTML syntax.

Step 1: To start the Logic App I used the Recurrence trigger, set to run once a day; you can use any trigger as per your requirement.

Step 2: The Compose block is the important one, as this is where we write the HTML syntax that formats the table. For the demo I used sample data in the highlighted section; you can insert dynamic fields as well. A sample of such HTML is shown at the end of this post.

Step 3: The output of the Compose block is then sent as the body of the email, as shown in the screenshot below.

Output: the formatted table arrives in the email. Hope this blog helps you. Thank you!!
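For reference, the HTML placed in the Compose block could look roughly like the following; this is only a hypothetical sample, so adjust the columns, styles, and cell values (or swap in dynamic content) to suit your data:

<table style="border-collapse: collapse;">
  <tr>
    <th style="border: 1px solid black; padding: 5px; background-color: #d6eaf8;">Name</th>
    <th style="border: 1px solid black; padding: 5px; background-color: #d6eaf8;">Status</th>
  </tr>
  <tr>
    <td style="border: 1px solid black; padding: 5px;">Sample Record 1</td>
    <td style="border: 1px solid black; padding: 5px;">Active</td>
  </tr>
</table>

Because the markup is written by hand here, you get full control over borders, padding, and colours, which the built-in Create HTML table action does not offer.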


Update records in Dynamics CRM using Azure Logic Apps

In this blog, we will see how we can update records in CRM with the help of a Logic App workflow.

Step 1: Add the Recurrence trigger to the Logic App and set it to run at a one-day interval (you can set any interval). Without a trigger, you cannot create a Logic App.

Step 2: Add a new step after the Recurrence trigger.

Step 3: Add the List records action from Dynamics 365, connect to CRM with your credentials, and select the Account entity.

Step 4: For testing purposes, I have created a test account (account number = 1001) in the UAT environment as shown below.

Step 5: Initialize a variable with the account number 1001, i.e. the account whose data you want to change/update.

Step 6: Filter the list of accounts where the account number is equal to 1001, as set up in the step above; a rough sketch of the filter is shown at the end of this post.

Step 7: After finding the record in the account list, update it; here I updated the account name and city. (Note: the account number should be unique.)

Result: the account is updated. Hope this blog helps you. Thank you!
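As a small aside, the same filtering can usually be pushed into the List records action itself via its Filter Query (OData) field, which avoids pulling back every account. Assuming the standard accountnumber attribute on the Account entity, the query would look roughly like this:

accountnumber eq '1001'

Alternatively, a Filter array action over the List records output, comparing the expression item()?['accountnumber'] to the initialized variable, achieves the same result.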


Load JSON data from Azure Blob Storage to Microsoft Finance and Operations

In this blog we will see how we can integrate data from Azure Blob Storage into Microsoft Finance and Operations. In this use case we are updating data in the Finance and Operations destination.

Prerequisites: Azure Blob Storage and Finance and Operations.

Step 1: Create an HTTP trigger workflow, or select any other trigger based on your requirement.

Step 2: The Azure Logic App will read the data stored in Azure Blob Storage in JSON format. Below is a sample of the JSON format:

[
  {
    "MeterId": "A001",
    "MeterRead": "100"
  },
  {
    "MeterId": "A003",
    "MeterRead": "300"
  }
]

Step 3: Workflow logic. The workflow reads the JSON data, which contains the meter ID and meter reading. Based on the meter ID it fetches the record ID, and using that record ID the meter reading is updated in F&O; a rough sketch of the expressions involved is shown at the end of this post.

The result can then be verified on the destination Finance and Operations records. Hope this helps!
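As a rough sketch of the expressions Step 3 relies on, inside a For each loop over the Parse JSON output the two fields can be read like this (the action names Parse_JSON and For_each are placeholders and must match the names of the blocks in your own workflow):

items('For_each')?['MeterId']
items('For_each')?['MeterRead']

The MeterId value is then used to look up the matching record and obtain its record ID, and MeterRead is passed to the update action for that record.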


Send a message/notification on Microsoft Teams as soon as an Opportunity is created in Dynamics 365 via Azure Logic Apps.

In this blog we will see the steps to send an automated message via Teams as soon as an Opportunity is created in Microsoft Dynamics 365.

Step 1: Go to portal.azure.com and select the Azure Logic App resource.

Step 2: Enter all the details required while creating a Logic App, such as the name, resource group, subscription, region, etc.

Step 3: Select the Dynamics 365 trigger When a record is updated.

Step 4: Select the Opportunities entity after setting up the Dynamics 365 CRM connection.

Step 5: Set the data refresh time as required.

Step 6: Add the Condition (IF) action in the next step; the condition for the True branch is status_label = won.

Step 7: Inside the True block select the Post a message in a chat or channel option. You can also handle the False block, but in this case we can leave it blank.

Step 8: You can post this message in a group or a channel, or send it as a personal chat.

Step 10: Wait for the trigger-successful notification.

Step 11: Go to Dynamics 365 CRM and navigate to the Opportunities entity.

Step 12: Open a test opportunity, or create one if it doesn't exist, and close the opportunity as Won.

Step 13: As soon as you close this opportunity you should receive the message in Teams.

Hence, in this blog we saw how we can send messages on MS Teams using Azure Logic Apps when certain conditions are met. Hope this helped!


How to create a Xero Data Source in Azure Data Factory

Posted on December 9, 2021 by Jaison Menezes

Hello! In this blog we will go through the steps required to create Xero as a data source in Azure Data Factory, which can then be used to copy data from Xero into various target systems such as Azure SQL.

Create a new data source in Azure Data Factory and search for Xero. Add the desired name and description for the data source. Select OAuth 2.0 as the authentication type and set the host to api.xero.com.

Go to App management | Xero (https://developer.xero.com/app/manage) and create a new Web app. Enter developer.xero.com as the company/application URL. Go to the OAuth 2.0 configuration and copy the client ID and client secret after generating them.

Download and install the Postman Windows app and import this collection into Postman: Collection Web View | Postman (getpostman.com). Create a new environment and initialize the following variables. Client ID and Client Secret are the same values obtained from Xero. For the "scopes" variable, add offline_access accounting.transactions as the initial value and openid profile email accounting.contacts accounting.settings as the current value. Save the environment.

Now go to Collections and select 'Get Started'. Under the Auth tab select OAuth 2.0 as the type and make sure the environment we just created is selected. Reference the variables set in the environment by typing "{{" in their respective fields. Add https://login.xero.com/identity/connect/authorize to the Auth URL field and https://identity.xero.com/connect/token to the Access Token URL field, then save the collection.

After saving, click Get New Access Token. Sign in to Xero and grant access to the organization from the pop-up window that appears. Select the access token shown on screen, right-click, choose Set: <your environment name>, and assign its value to the access token variable we defined earlier. Set the value of the refresh token variable using the same method, and paste this refresh token into the corresponding Azure Data Factory field as well.

After setting the access token and refresh token, scroll to the top and click the Use This Token button, then send the GET request to the Xero API. After a successful API call you will also get the tenant ID; paste this tenant ID into Azure Data Factory. Make sure encrypted endpoints are enabled. Test the connection, and after successful testing click Apply. The Xero connector is now ready for use. Hope this blog helped!!
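For context, what the Postman collection does behind the scenes is roughly the standard Xero OAuth 2.0 exchange: trading the refresh token for a new access token at the token endpoint, then calling the connections endpoint to list the tenant IDs. The exact requests depend on the imported collection, so treat the following as an approximate sketch rather than the collection's exact contents:

POST https://identity.xero.com/connect/token
Authorization: Basic base64(client_id:client_secret)
Content-Type: application/x-www-form-urlencoded

grant_type=refresh_token&refresh_token=<your refresh token>

GET https://api.xero.com/connections
Authorization: Bearer <access token from the previous call>

The tenantId returned by the connections call is the value pasted into the Azure Data Factory linked service.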


Azure Databricks - How to read a CSV file from Blob Storage and push the data into a Synapse SQL pool table

In this blog, we will learn how to read a CSV file from Blob Storage and push the data into a Synapse SQL pool table using an Azure Databricks Python script. In part 1 we created an Azure Synapse Analytics workspace and a dedicated SQL pool, and saw how to create the dedicated SQL pool. In this blog, we will use a JDBC connection string to connect to that SQL pool.

Step 1: Sign in to the Azure portal, open Azure Databricks, and click Launch Workspace.

Step 2: Once the Azure Databricks workspace opens, click New Notebook and select your language; here I have selected Python.

Step 3: Add the following code to connect to your dedicated SQL pool using the JDBC connection string and push the data into a table.

Python script:

from azure.storage.blob import BlobServiceClient
import pandas as pd
import io
import pyspark.sql

storage_account_name = 'Your Storage account name'
storage_account_access_key = 'Your Storage account access key'
spark.conf.set('fs.azure.account.key.' + storage_account_name + '.blob.core.windows.net', storage_account_access_key)

blob_container = 'Your container name'
filePath = "wasbs://" + blob_container + "@" + storage_account_name + ".blob.core.windows.net/Your CSV file name"
empDf = spark.read.format("csv").load(filePath, inferSchema=True, header=True)

connectionString = "Your JDBC connection string;encrypt=true;trustServerCertificate=false;rewriteBatchedStatements=true;loginTimeout=30;"
empDf.write.jdbc(connectionString, "[dbo].[Employee]", mode="append")

Step 4: To get the JDBC connection string, open the Synapse workspace, and in the left pane under Analytics pools open SQL pools. Select your SQL pool; in the Overview you will find the link "Show database connection strings". Click the JDBC tab and copy the connection string.

Step 7: Now click the Run All button to execute the script. The script will execute successfully; you can also verify the result in the SQL table. Hope this will help.


How to use Create HTML Table block in Azure Logic Apps to format JSON data

Sometimes, after extracting data from certain data sources in JSON format, we have to format it so it is easily readable before sending it via Microsoft Teams or email. In this blog I will format a sample JSON array into an HTML table.

Since I am not using a data source, I initialize a variable with the data type Array (the Create HTML table block supports array variables) and put a sample JSON value in the Value section; a hypothetical sample is shown at the end of this post.

Now we convert this JSON into HTML using the Create HTML table block: select the array variable we initialized earlier and set the column type to Custom. Enter the header details (this can be any string value). For the Value field, click on it, go to Expression, and type the following expression: item()?['Product_ID']. You can replace Product_ID with the name of the attribute in your JSON.

After this we send the data via email and run the trigger. As you can see, the JSON is converted into a readable email via HTML. Hope this blog helped.
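For instance, the array variable might be initialized with a value like the one below (a purely hypothetical sample; the attribute names must match what you reference in the column expressions):

[
  { "Product_ID": "P-1001", "Product_Name": "Keyboard" },
  { "Product_ID": "P-1002", "Product_Name": "Mouse" }
]

With custom columns, each header (for example "Product ID") is paired with an expression such as item()?['Product_ID'] or item()?['Product_Name'], and the Create HTML table block emits one row per array element.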


Send Records from Microsoft Dynamics 365 through email using Azure Logic Apps.

In this blog we will copy the list of account names that exist in our Microsoft Dynamics 365 system and send all these names via email using Azure Logic Apps.

To start, select an HTTP request trigger, which runs on demand at the click of the Run Trigger button. After defining the trigger, add an action to list rows from Dynamics 365 and select the entity you need from the drop-down; in this case I have selected Accounts. You can also add filters using parameters to limit the data extracted.

Initialize a variable to store the data; since there can be more than one record, the data type of the variable should be Array. Now, for each record (value) found in Dynamics 365, we have to add it to the array, so we use a For each loop and append the new data to the array inside the loop (a small example expression is shown at the end of this post).

Next we use a Send an email block and add the variable in which we stored the account names. After this we can run the flow. On running the Logic App, an email containing all the account names in our Dynamics 365 system is received. Hope this blog helped!!
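For example, the Append to array variable action inside the loop could use an expression along these lines (assuming the loop keeps the default name For_each and that the account's name attribute is what you want to collect):

items('For_each')?['name']

Each iteration then appends one account name, and the Send an email block simply renders the finished array.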

