
Category Archives: Azure

Azure Databricks – Part 2 – How to read Amazon DynamoDB table data using Notebooks

In this blog, we will learn how to connect to AWS DynamoDB and read table data using a Python script, step by step.

Step 1: In the left pane, select Azure Databricks. From the Common Tasks, select New Notebook.

Step 2: In the Create Notebook dialog box, enter a name, select Python as the language, and select the Spark cluster that you created earlier.

Step 3: Once the notebook is created, you can write a Python script that connects to AWS DynamoDB using the boto3 client library. To connect to AWS DynamoDB you must have an AWS access key ID and an AWS secret access key.

Python script:

```python
# Databricks notebook source
import boto3
import pandas as pd

session = boto3.session.Session(
    aws_access_key_id="your AWS access key ID",
    aws_secret_access_key="your AWS secret access key",
    region_name="your region",
)
dynamodb = session.resource("dynamodb")
table = dynamodb.Table("Table Name")
response = table.scan()
items = response["Items"]
data = pd.DataFrame(items)
output = data.to_csv(index_label="idx", encoding="utf-8")
print(output)
```

Step 4: Now you can check the output by pressing Shift + Enter or clicking Run cell.

Hope this will help.
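One note on the script above: table.scan() returns at most 1 MB of data per call, so larger tables come back in pages. A minimal sketch of a paginated scan, using the same placeholder credentials and table name as above:

```python
import boto3
import pandas as pd

session = boto3.session.Session(
    aws_access_key_id="your AWS access key ID",
    aws_secret_access_key="your AWS secret access key",
    region_name="your region",
)
table = session.resource("dynamodb").Table("Table Name")

# Follow LastEvaluatedKey until every page of the table has been read.
items = []
response = table.scan()
items.extend(response["Items"])
while "LastEvaluatedKey" in response:
    response = table.scan(ExclusiveStartKey=response["LastEvaluatedKey"])
    items.extend(response["Items"])

print(pd.DataFrame(items).to_csv(index_label="idx", encoding="utf-8"))
```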


SQL Trigger not populating with Table in Logic App

Wondering how to solve the issue in SQL-triggered Azure Logic Apps where you cannot select your table in the dropdown? This blog will help you fix it.


ADF’s Wrangling Data Flow (Power Query) – How do you get matched rows from the two data sources using Inner Joins?

Posted On April 25, 2021 by Sandip Patel

In this blog, we will learn how to get matched rows from two data sources using an inner join in ADF’s Wrangling Data Flow, step by step.

Step 1: Add a Power Query flow as per the below screenshot.

Step 2: In the new Power Query, give it a proper name and add the data sources that you want to merge. Here I am adding two datasets named “DS_EMP1” and “DS_EMP2”; both data sources contain employee information.

Step 3: By default, the UserQuery will point to the first dataset query. All the transformations should be done on the UserQuery.

Step 4: Now click on Merge queries to merge your datasets.

Step 5: Select a table and the matching columns to create a merged table. Here I have selected EmpID as the common key to merge the data, and the join kind will be “Inner”.

Step 6: Once you click the OK button, you get a warning: “Nested join must be expanded”.

Step 7: Click on the expand dataset button to expand your result and select whichever columns you want from the other data source. In my case both datasets have the same column names, so I deselect all the columns from the result dataset.

Step 8: Now the UserQuery shows the matched rows. That’s all you need to do to get the matched rows from two data sources. Hope this will help.
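For readers who find the semantics easier to follow as code, here is a hedged pandas sketch of the same inner merge; the data frames and their values are hypothetical stand-ins for DS_EMP1 and DS_EMP2, joined on the EmpID key used above:

```python
import pandas as pd

# Hypothetical stand-ins for the DS_EMP1 and DS_EMP2 datasets.
ds_emp1 = pd.DataFrame({"EmpID": [1, 2, 3], "Name": ["Asha", "Ben", "Chen"]})
ds_emp2 = pd.DataFrame({"EmpID": [2, 3, 4], "Name": ["Ben", "Chen", "Dia"]})

# An inner merge keeps only the rows whose EmpID appears in both datasets,
# mirroring the "Inner" join kind selected in the Power Query merge.
matched = ds_emp1.merge(ds_emp2[["EmpID"]], on="EmpID", how="inner")
print(matched)  # rows with EmpID 2 and 3
```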


ADF’s Mapping Data flows – How do you get matched rows from the two data sources using Inner Joins?

Posted On March 23, 2021 by Sandip Patel

In this blog, we will learn how to get matched rows from two data sources using an inner join in ADF’s Mapping Data Flows, step by step.

Step 1: Add a data flow activity named “InnerJoin_Test”; in the Settings tab, add a new data flow. Select the Source Settings tab, add a source transformation, and connect it to one of your datasets.

Step 2: In the Data preview tab you can see your data.

Step 3: Add another source named “Employee2”; in the Source Settings tab, connect it to your other dataset.

Step 4: In the Data preview tab you can see your data.

Step 5: Add a Join transformation named “InnerJoin”. The Join transformation allows you to join two data streams. In the Join Settings tab, set the left stream and the right stream, and select the join type as Inner. Apply the join condition on a unique field; in this demo I pick “Emp Id” as the join condition.

Step 6: In the Data preview tab you can see the matched rows. That’s all you need to do to get the matched rows from two data sources. Hope this will help.
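Mapping Data Flows execute on Spark clusters, so the join above can be sketched in PySpark as an illustration; the data frames, their values, and the EmpId column are hypothetical stand-ins for the two employee sources:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("InnerJoin_Test").getOrCreate()

# Hypothetical stand-ins for the two employee sources.
employee1 = spark.createDataFrame([(1, "Asha"), (2, "Ben")], ["EmpId", "Name"])
employee2 = spark.createDataFrame([(2, "Sales"), (3, "HR")], ["EmpId", "Dept"])

# An inner join keeps only the rows whose EmpId appears in both streams,
# matching the Inner join type chosen in the Join transformation.
matched = employee1.join(employee2, on="EmpId", how="inner")
matched.show()  # one row: EmpId 2
```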


ADF’s Mapping Data flows – How do you get distinct rows and rows count from the data source?

Posted On March 23, 2021 by Sandip Patel

In this blog, we will learn how to get distinct rows and a row count from the data source via ADF’s Mapping Data Flows, step by step.

Step 1: Create an Azure Data Factory pipeline.

Step 2: Add a data flow activity named “DistinctRows”.

Step 3: Go to Settings and add a new data flow. Select the Source Settings tab, add a source transformation, and connect it to one of your datasets.

Step 4: The Projection tab allows you to change the column data types. Here I have changed my Emp ID column to Integer.

Step 5: In the Data preview tab you can see your data.

Step 6: Add an Aggregate transformation named “DistinctRows”. In the Group by settings, you need to choose which column or combination of columns will make up the key(s) for ADF to determine distinct rows; in this demo I pick “Emp ID” as my key column.

Step 7: The inherent nature of the Aggregate transformation is to drop all columns not used in the aggregate. But here we are using the aggregate to filter out non-distinct rows, so we need every column from the original dataset. To do this, go to the aggregate settings and choose the column pattern. Here you will need to choose between keeping the first set of values from the duplicate rows or the last; essentially, choose which row you want to be the source of truth.

Step 8: That’s all you need to do to find the distinct rows in your data. Click on the Data preview tab to see the result; you can see the duplicate data has been removed.

Step 9: The row count is just another aggregate transformation. To create a row count, go to the Aggregate settings and use the function count(1). This creates a running count of every row. Hope this will help.
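As an illustration of what the Aggregate transformation is doing under the hood, here is a hedged PySpark sketch that keeps the first row per Emp ID and computes a row count; the data frame and its values are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("DistinctRows").getOrCreate()

# Hypothetical source containing a duplicate Emp ID.
emp = spark.createDataFrame(
    [(1, "Asha"), (1, "Asha"), (2, "Ben")], ["EmpID", "Name"]
)

# Group by the key and keep the first value of every other column,
# mirroring the "first" column pattern in the Aggregate transformation.
distinct_rows = emp.groupBy("EmpID").agg(
    *[F.first(c).alias(c) for c in emp.columns if c != "EmpID"]
)
distinct_rows.show()

# Row count, equivalent to the count(1) aggregate in the data flow.
emp.agg(F.count(F.lit(1)).alias("rowcount")).show()
```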


Changing the Process of a Project already created in Azure DevOps

While setting up and working on a project in Azure DevOps, we sometimes realise that the process we selected is not what we need for the current project, and we want to change the process in Azure DevOps without losing any tasks. Before making any changes, keep one check in mind: the inherited process you are trying to move the project to should contain at least one of the expected work items.

Step 1: Open the project and click on the project settings as shown in the screenshot below.

Step 2: In the project settings you can see the project process as “Agile”; click on it.

Step 3: You will be redirected to a page where you have to select the project and click on the three vertical dots to change the process.

Step 4: As you click on change process, a panel opens from the left-hand side, from which you need to select the new process.

Step 5: Select the process and click Save; the project process will be changed.

Step 6: Go back to the project and see that all the tasks are visible, which you can move as per the new process.


Let’s get started with Azure Function for Dynamics 365 CRM: Part 2 [Cloud Deployment]

In the previous blog, we learned how to create an Azure Function that connects to Dynamics 365 CRM and creates an account record whenever the function is triggered by an HTTP request. [Link]. In this blog, we will learn how to deploy the Azure Function App to the Azure cloud so that we can trigger that function from anywhere.

Prerequisites:
1. Microsoft Azure account
2. Active subscription [Free Trial or Pay-as-you-Go]. If you are creating an Azure Function for learning purposes, go with the Free Trial; if you are working on development for your organization or a client, go for a Pay-as-you-Go subscription.

Step 1: Create a resource on the Azure Portal for the Azure Function deployment. Log in to portal.azure.com with your account and click on Create a resource. Create a resource for a Function App as shown in the screenshot below. Here we will create a new resource group if you don’t have an existing one. The following are the configuration details you need to fill in while creating the resource:
Function Name: This forms a global URL used to access the Azure Function App, so it must be unique.
Publish: You can publish your code directly or use a Docker container.
Runtime stack: Here we are building a .NET application, so we will choose .NET as our runtime stack. There are multiple options; you can create an Azure Function for Node.js, Python, Java, or PowerShell Core, or use a custom handler.
After configuration, you are ready to click “Review + Create”. It will take a few minutes to create and deploy the Azure Function App in the cloud.

Step 2: Publish the Azure Function from Visual Studio. Open the Azure Function project in Visual Studio, right-click on the project, and click Publish. On the screen that appears, click Start. Now choose Azure, as we are deploying the Azure Function to the Azure cloud, and click Next. Once you select Azure, the configuration screen opens; log in with an Azure account that has an active subscription, then select the resource group and the function app as shown in the screenshot below. After the configuration is finished, click Publish. It will take a few minutes to deploy the application to the cloud.

Step 3: Get the Azure Function URL from the Azure Portal. Open the newly created Function App in the Azure portal and click on Functions; you will find that Function1 has been deployed. Click on that function to open it. To get the function URL, click on Get Function URL and copy the URL.

Testing: We will require an API testing tool; here I am using Postman, and the following is the link to download it: https://www.postman.com/downloads/. Open Postman and create a new tab. Select POST as the request type and paste the URL. After pasting the URL, click Send. Now we will take a look at the Dynamics 365 CRM environment and check whether the account was created or not (see the Before and After screenshots).

Stay tuned for the next blog, in which we will create a Contact Us form in HTML and post its data through the Azure Function to Dynamics 365 CRM, storing the responses from your website.

External links: Azure Function pricing: https://azure.microsoft.com/en-in/pricing/details/functions/
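If you prefer to test without Postman, the same request can be sent from a short Python script; the URL below is a placeholder for the function URL you copied from the portal, including its access key if the function requires one:

```python
import requests

# Placeholder: paste the function URL copied from Get Function URL,
# including the ?code=... access key when the function is key-protected.
function_url = "https://<your-function-app>.azurewebsites.net/api/Function1?code=<key>"

# Trigger the function with an HTTP POST, as done in Postman above.
response = requests.post(function_url)
print(response.status_code)
print(response.text)
```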


How to GET records from Salesforce using Logic App

Learn how to fetch Salesforce records in an Azure Logic App in two different ways.


How to GET records from Salesforce using Logic App

Learn how you can easily use a SOQL query to fetch Salesforce records in a Logic App.


Integrated Security Used in Azure Function

Azure Functions is a serverless compute service that enables you to run code on demand without having to explicitly provision or manage infrastructure. One of its features is integrated security: HTTP-triggered functions can be protected with OAuth providers such as Azure Active Directory, Facebook, Google, Twitter, and Microsoft Account. Azure Functions also has a function access key mechanism that secures access to the endpoint, with different authorization levels: Function, Anonymous, and Admin.

General security flow of the Azure Function
In this flow, Azure AD, managed identities, Key Vault, VNET, and firewall rules are used. To create dedicated and isolated Azure Functions, you can also decide to create a separate App Service Environment (ASE). However, an ASE can be difficult to manage and involves more costs. Typical concerns include how to secure the storage account of the Azure Function, how to secure the request and response, and how to secure the database.

While keys provide a default security mechanism, you may want to consider additional options to secure an HTTP endpoint in production. For example, it is generally not a good practice to distribute a shared secret in public apps. If your function is being called from a public client, you may want to consider implementing another security mechanism.

Authorization scopes (function-level)
There are two access scopes for function-level keys:
Function: These keys apply only to the specific functions under which they are defined. When used as an API key, they only allow access to that function.
Host: Keys with a host scope can be used to access all functions within the function app. When used as an API key, they allow access to any function within the function app.
Each key is named for reference, and there is a default key (named “default”) at both the function and host level. Function keys take precedence over host keys: when two keys are defined with the same name, the function key is always used.

Master key (admin-level)
Each function app also has an admin-level host key named _master. In addition to providing host-level access to all functions in the app, the master key provides administrative access to the runtime REST APIs. This key cannot be revoked. When you set an access level of admin, requests must use the master key; any other key results in an access failure.

System key
Specific extensions may require a system-managed key to access webhook endpoints. System keys are designed for extension-specific function endpoints that are called by internal components. For example, the Event Grid trigger requires that the subscription use a system key when calling the trigger endpoint. Durable Functions also uses system keys to call Durable Task extension APIs.

Authentication/authorization
While function keys can provide some mitigation against unwanted access, the only way to truly secure your function endpoints is by implementing positive authentication of the clients accessing your functions. You can then make authorization decisions based on identity. To enable App Service authentication/authorization, note that the App Service platform lets you use Azure Active Directory (AAD) and several third-party identity providers to authenticate clients.

Security in App Service
App Service features can be used to secure the Function App. When you create an app in the default Azure domain, like <app_name>.azurewebsites.net, it is secured with HTTPS by default; in addition, if you secure your app with a custom domain, you can also provide SSL/TLS security.
The following certificates are supported by App Service:
- Free App Service managed certificate
- App Service certificate
- Third-party certificate
- Certificate imported from Azure Key Vault
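Returning to the key mechanism described above, a client can pass a function key either as the code query-string parameter or in the x-functions-key request header. A minimal sketch, where the URL and key are placeholders:

```python
import requests

# Placeholders: your function endpoint and a function- or host-scoped key.
url = "https://<your-function-app>.azurewebsites.net/api/MyFunction"
key = "<function-key>"

# Option 1: pass the key as the "code" query-string parameter.
r1 = requests.get(url, params={"code": key})

# Option 2: pass the key in the x-functions-key request header.
r2 = requests.get(url, headers={"x-functions-key": key})

print(r1.status_code, r2.status_code)
```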

