
Tag Archives: Azure

Connect Azure Databricks to Power BI

Open Power BI, click on Get Data, search for Azure Databricks and click Connect. It will ask for the following details: Server Hostname and HTTP Path. To find these details, go to Azure Databricks and click on Clusters. Once the cluster is open, go to Advanced Options > JDBC/ODBC; here we can find the Server Hostname and HTTP Path needed in the step above. Fill in the details and click OK. Power BI will then ask for user credentials, after which a pop-up opens asking you to select from the list of tables. Select the tables and click Load. In this way we can create a Power BI report based on the current data received from Azure Databricks and build fields on top of it.

Create Linked Service with Salesforce in Azure Data Factory

Pre-requisites: A Salesforce account (in this case we have a Developer trial license) and an Azure Data Factory instance in Azure. Steps: To create a linked service for Salesforce in Azure Data Factory we need a username, a password and a security token. The username and password are the same credentials we use to log in to Salesforce, and we can generate the security token by following these steps: go to Setup, then Personal Information > Reset My Security Token, and click Reset Security Token. Once we click Reset Security Token we receive an email containing the security token. Now go to Linked Services, create a new connection, and search for the Salesforce connector. When we select the Salesforce connector it will ask us to enter these credentials. Once the connection is created, create the datasets and then the pipeline. Review the mapping and run the pipeline.
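If you prefer to script the connection instead of using the portal UI, the same linked service can also be registered with Azure PowerShell. The sketch below is illustrative only: the JSON follows the documented Salesforce connector format, and the resource group, factory name, credentials and file path are placeholders, not values from this post.

# Minimal sketch, assuming the Az.DataFactory module is installed and you are signed in (Connect-AzAccount).
# All names and credential values below are placeholders.
$definition = @"
{
    "name": "SalesforceLinkedService",
    "properties": {
        "type": "Salesforce",
        "typeProperties": {
            "environmentUrl": "https://login.salesforce.com",
            "username": "user@contoso.com",
            "password": { "type": "SecureString", "value": "<password>" },
            "securityToken": { "type": "SecureString", "value": "<security token>" }
        }
    }
}
"@
Set-Content -Path .\SalesforceLinkedService.json -Value $definition

# Register the linked service in the data factory.
Set-AzDataFactoryV2LinkedService -ResourceGroupName "my-rg" `
    -DataFactoryName "my-adf" `
    -Name "SalesforceLinkedService" `
    -DefinitionFile ".\SalesforceLinkedService.json"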

Triggering Azure Pipeline from on premise SQL Server

In this blog, we are going to trigger an ADF pipeline whenever an insert or update operation is performed on an on-premise SQL Server. Steps: Create the ADF pipeline. In this case we have already created the pipeline and its dataflow. Create a PowerShell script for authentication and for triggering the pipeline; to do so we need the following details: Tenant ID, Application ID, Client Secret, Subscription ID, Resource Group name, API version and pipeline name (a sketch of this script is shown at the end of this section). Create a job in SQL Server and a trigger on the table for insert, update or delete, and make sure that the SQL Server Agent is running. Create a SQL job with a single job step: enter the step name, select the type as PowerShell, select the account that will be used to run the PowerShell script and enter the code in the command section. Next we create triggers on the tables on which insert, update and delete operations will be performed, so that each trigger starts the SQL job. Now, whenever we insert, update or delete a row in the table, the pipeline gets executed automatically. We have some data in the Accounts table; the moment we update the currency of Account ID 1, the pipeline is triggered automatically, and after some time we can check its status. Hope the above helps!
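The original post shows the authentication and triggering script as a screenshot; below is a minimal sketch of the same idea using the Azure AD client-credentials flow and the Data Factory createRun REST call. All IDs, names and the secret are placeholders, not values from the post.

# Placeholders - replace with your own values.
$tenantId     = "<tenant id>"
$appId        = "<application (client) id>"
$clientSecret = "<client secret>"
$subId        = "<subscription id>"
$rgName       = "<resource group name>"
$factoryName  = "<data factory name>"
$pipelineName = "<pipeline name>"
$apiVersion   = "2018-06-01"

# 1. Authenticate against Azure AD using the client-credentials flow.
$tokenBody = @{
    grant_type    = "client_credentials"
    client_id     = $appId
    client_secret = $clientSecret
    resource      = "https://management.azure.com/"
}
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" `
    -Body $tokenBody).access_token

# 2. Trigger the pipeline via the Data Factory REST API.
$runUri = "https://management.azure.com/subscriptions/$subId/resourceGroups/$rgName" +
          "/providers/Microsoft.DataFactory/factories/$factoryName" +
          "/pipelines/$pipelineName/createRun?api-version=$apiVersion"
$run = Invoke-RestMethod -Method Post -Uri $runUri `
    -Headers @{ Authorization = "Bearer $token" } `
    -Body "{}" -ContentType "application/json"
Write-Output "Pipeline run id: $($run.runId)"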

Creating Azure Data Factory

Log in to the Azure portal and click on Create a resource > Analytics > Data Factory. Enter the required details and the data factory gets created. The Enable Git option will prompt you to enter the Git repository details, which can later be used for CI/CD. In this way we can create an Azure Data Factory.
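The same data factory can also be created from Azure PowerShell instead of the portal. A minimal sketch, assuming the Az.DataFactory module is installed and the resource group already exists (the names and region are example values, not from this post):

# Requires an authenticated session (Connect-AzAccount) and the Az.DataFactory module.
New-AzDataFactoryV2 -ResourceGroupName "my-rg" `
    -Name "my-adf" `
    -Location "East US"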

Creating Azure blob container

Azure Blob storage is a service for storing large amounts of unstructured data such as text or binary files, and blobs can be exposed publicly or kept private. Below are the steps to create an Azure blob container: Go to Storage accounts, select the storage account (in this case cloudstorage123) > Overview and click on Containers. Inside the Containers blade, click on '+ Container' to create a new container; it will prompt you to enter the container name and the public access level. The public access levels are: Private (the default, accessible only to the account owner), Blob (allows public read access to blobs) and Container (allows public read and list access to the entire container). After entering these details the container gets created. Now we will upload a blob; to do so, click on Upload. Afterwards we can see the CSV file uploaded under the Input folder. The upload steps are as follows: Select the file from the local system. In the Advanced section, select the authentication type, either the account key or an Azure AD user account. Select the blob type, which can be block blob, page blob or append blob. A block blob stores text and binary data up to 4.7 TB; block blobs are made up of blocks of data that are managed individually. An append blob is similar to a block blob but is optimized for append operations, e.g. logging. A page blob stores random-access files up to 8 TB in size; page blobs store virtual hard drives and serve as disks for VMs. Select the block size as per requirement; it can be 64 KB, 128 KB, 256 KB, 512 KB, 2 MB, 4 MB or 100 MB. Enter the folder name into which you want to upload the blob; if you enter a folder name that does not exist, a new folder is created. Hope the above helps!
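For reference, the same container creation and upload can be scripted with Azure PowerShell. This is a sketch only, assuming the Az.Storage module is installed; the account key, container name and file path are placeholders.

# Build a storage context from the account name and key (key is a placeholder).
$ctx = New-AzStorageContext -StorageAccountName "cloudstorage123" -StorageAccountKey "<account key>"

# Create the container with private access (equivalent to the 'Private' access level).
New-AzStorageContainer -Name "input" -Permission Off -Context $ctx

# Upload a local CSV as a block blob into an 'Input' folder (a virtual directory in the blob name).
Set-AzStorageBlobContent -File "C:\data\sample.csv" `
    -Container "input" `
    -Blob "Input/sample.csv" `
    -BlobType Block `
    -Context $ctx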

Incremental refresh in Azure Data Factory

In this article we are going to do an incremental refresh for the Account entity from Dynamics 365 CRM to Azure SQL. Steps: Create linked services for Azure SQL and Dynamics 365 CRM and create a table in the Azure SQL database. Now we will create the pipeline. The pipeline has two blocks: one gets the last modified date and the other copies data from D365 CRM to Azure SQL. recordmark block: in the query section we retrieve the latest modified date; if there is no data in the Azure SQL database for the modifiedon column, it takes '01-01-1999' and starts integrating data after this date. In the preview section we can see the last modified date. Copy data from D365 CRM to ASQL block: Source: this block takes the last modified date from the recordmark block and fetches the account data based on it; in the query section we retrieve the account data from D365 CRM. Sink: now we create a UDT (user-defined table type) in Azure SQL; together with a user-defined stored procedure it takes care of updating existing records and inserting new records based on the "code" column. Mapping: review the mapping details; once the above steps are complete we are ready to run the pipeline. First we check the last modified date in the destination, then we create an Account in CRM and run the pipeline. Checking the destination, we can see the modified date/time is "2020-04-13 07:08:35.000". Now let us modify this record (change the phone number to "987654321"), run the pipeline again and check the data in the destination (Azure SQL database). In this way we can perform an incremental refresh in Azure Data Factory.
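In this post the pipeline is run from the portal; as a side note, it can also be triggered and its status checked from Azure PowerShell. A minimal sketch (the resource group, factory and pipeline names are placeholders, not from this post):

# Trigger the incremental-refresh pipeline and capture the run id.
$runId = Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "my-rg" `
    -DataFactoryName "my-adf" `
    -PipelineName "IncrementalRefreshAccounts"

# Check the run status after some time.
Get-AzDataFactoryV2PipelineRun -ResourceGroupName "my-rg" `
    -DataFactoryName "my-adf" `
    -PipelineRunId $runId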

Connect your Azure Machine Learning Predictive Solution to Power BI

Introduction: Azure Machine Learning Studio is an amazing tool that lets us create efficient ML experiments with simple drag-and-drop features. We can predict anything from flight delays to churn. But what if we want to represent this predicted data in a more visually appealing format? It is possible to do this by presenting your predictions in Power BI! Pre-requisites: A basic understanding of Azure Machine Learning Studio, a basic understanding of Power BI, and a blob container created on Azure Storage. Steps: Create your experiment in Azure Machine Learning Studio. Convert the training experiment to a predictive experiment and deploy it as a web service. We will create a console application in Visual Studio and paste in the Batch Execution sample code. For automation we could create automated data pipelines, but for now we will just use a simple console application. Remove the existing code from the console application and paste in the Batch Execution code, install the necessary NuGet packages and update the following parameters: the BaseURL stays the same; the storage account name, storage account key and storage container name can be found in the Azure blob storage account that was created; the API key can be found on the web service page in Azure Machine Learning Studio; the input path is the path where you saved your input CSV file for batch execution. Your input CSV file should have all the features which you used to train your experiment. After you run the console application, a new output1results.csv file should be generated in your blob container. The output should include the labels which your experiment generates, including the Scored Labels and Scored Probabilities columns. Now you can get your data using Azure Blob Storage as the source in Power BI and use the columns in the output1results.csv file to generate your ML-predicted reports. I hope this blog helps you to combine Azure Machine Learning Studio and Power BI to create a powerful predictive solution.

Commands to Create .bacpac file from Azure SQL to SQL Server

Introduction: This topic explains how to export a Microsoft Dynamics 365 for Finance and Operations, Enterprise edition database from an environment that is based on Microsoft Azure.

Steps:

1. Create a duplicate of the source database.

CREATE DATABASE MyNewCopy AS COPY OF axdb_mySourceDatabaseToCopy

To monitor the progress of the copy operation, run the following query against the MASTER database in the same instance.

2. Run the SQL Server script to prepare the database.

-- Prepare a database in SQL Azure for export to SQL Server.
-- Disable change tracking on tables where it is enabled.
declare @SQL varchar(1000)
set quoted_identifier off
declare changeTrackingCursor CURSOR for
    select 'ALTER TABLE ' + t.name + ' DISABLE CHANGE_TRACKING'
    from sys.change_tracking_tables c, sys.tables t
    where t.object_id = c.object_id
OPEN changeTrackingCursor
FETCH changeTrackingCursor into @SQL
WHILE @@Fetch_Status = 0
BEGIN
    exec(@SQL)
    FETCH changeTrackingCursor into @SQL
END
CLOSE changeTrackingCursor
DEALLOCATE changeTrackingCursor

-- Disable change tracking on the database itself.
ALTER DATABASE
-- SET THE NAME OF YOUR DATABASE BELOW
MyNewCopy set CHANGE_TRACKING = OFF

-- Remove the database level users from the database;
-- these will be recreated after importing in SQL Server.
declare @userSQL varchar(1000)
set quoted_identifier off
declare userCursor CURSOR for
    select 'DROP USER ' + name
    from sys.sysusers
    where issqlrole = 0 and hasdbaccess = 1 and name <> 'dbo'
OPEN userCursor
FETCH userCursor into @userSQL
WHILE @@Fetch_Status = 0
BEGIN
    exec(@userSQL)
    FETCH userCursor into @userSQL
END
CLOSE userCursor
DEALLOCATE userCursor

-- Delete the SYSSQLRESOURCESTATSVIEW view as it has an Azure-specific definition in it.
-- We will run db synch later to recreate the correct view for SQL Server.
if(1=(select 1 from sys.views where name = 'SYSSQLRESOURCESTATSVIEW'))
    DROP VIEW SYSSQLRESOURCESTATSVIEW

-- Next, set system parameters ready for being a SQL Server database.
update sysglobalconfiguration set value = 'SQLSERVER' where name = 'BACKENDDB'
update sysglobalconfiguration set value = 0 where name = 'TEMPTABLEINAXDB'

-- Clean up the batch server configuration, server sessions, and printers from the previous environment.
TRUNCATE TABLE SYSSERVERCONFIG
TRUNCATE TABLE SYSSERVERSESSIONS
TRUNCATE TABLE SYSCORPNETPRINTERS

-- Remove records which could lead to accidentally sending an email externally.
UPDATE SysEmailParameters SET SMTPRELAYSERVERNAME = ''
GO
UPDATE LogisticsElectronicAddress SET LOCATOR = '' WHERE Locator LIKE '%@%'
GO
TRUNCATE TABLE PrintMgmtSettings
TRUNCATE TABLE PrintMgmtDocInstance

-- Set any waiting, executing, ready, or canceling batches to withhold.
UPDATE BatchJob SET STATUS = 0 WHERE STATUS IN (1,2,5,7)
GO

3. Export the database from Azure SQL.

cd C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin
SqlPackage.exe /a:export /ssn:<server name>.database.windows.net /sdn:MyNewCopy /tf:D:\Exportedbacpac\my.bacpac /p:CommandTimeout=1200 /p:VerifyFullTextDocumentTypesSupported=false /sp:EG=ajgU8!Fx=gY /su:sqladmin

Where:
ssn (source server name) - the name of the Azure SQL Database server to export from.
sdn (source database name) - the name of the database to export.
tf (target file) - the path and name of the file to export to.
sp (source password) - the SQL password for the source SQL Server.
su (source user) - the SQL user name for the source SQL Server. We recommend that you use the sqladmin user.
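The archived post does not show the monitoring query referenced in step 1. One common way to check the progress of an Azure SQL database copy (an assumption, not necessarily the exact query the author used) is to query sys.dm_database_copies in the master database, for example from PowerShell with the SqlServer module:

# Assumed monitoring query; server name, user and password are placeholders.
Invoke-Sqlcmd -ServerInstance "<server name>.database.windows.net" `
    -Database "master" `
    -Username "sqladmin" -Password "<password>" `
    -Query "SELECT * FROM sys.dm_database_copies"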

Setup custom domain and sub-domain name in Azure/GoDaddy Part -1

In this blog we will walk through how to add a custom domain/sub-domain to an Azure web app with the help of GoDaddy. Problem statement: We have a domain, cloudfronts.in, purchased from GoDaddy, and we want it to point to a website hosted on Azure. We have divided this blog into two parts. Part 1: Add a new custom domain. Part 2: Add a new custom sub-domain. Pre-requisites: An Azure subscription with Web Apps (service administrator/co-administrator) and a GoDaddy account with a new domain (admin). Next we need to follow the steps given below. Step 1: Log in to your GoDaddy account with admin credentials and click on Manage Domains. Step 2: Click on the domain name that you want to configure. In our case we want to configure cloudfronts.in. Double-click on the domain name and wait for the new window. Step 3: Click on DNS Zone File and then the edit option. Step 4: We need some additional information from Azure Web Apps that we will add to the DNS configuration. Log in to your Azure account and navigate to the website that you want to configure. The website must be in Shared mode or Standard mode to configure a domain name; change the App Service plan pricing tier accordingly. Please refer to the following URL for more information: http://azure.microsoft.com/en-in/pricing/details/app-service/ Navigate to the Dashboard and click on Manage Domains to add a new custom domain name for your website. Step 5: Copy the IP address of the Azure website. We will need this information when we add the DNS record in GoDaddy. Step 6: Add a new A (Address) record that points to the IP address with the following configuration. Record Type: A (Host); Host: @; Points to: 192.198.15.14 (Azure website IP address). If an @ record is already present, edit that record. Then add a new CNAME record with the following configuration. Record Type: CNAME (Alias); Host: www; Points to: cloudfrontsdev.azurewebsites.net (domain name of the Azure website). Kindly edit this record if it is already present. Step 7: Add the CNAME (Alias) records that point to the Azure website. This information is required by Azure for verification purposes. Add the two CNAME (Alias) records given in the image. Record Type: CNAME (Alias); Alias Name: awverify.www; Points to Host Name: awverify.cloudfrontsdev.azurewebsites.net. Record Type: CNAME (Alias); Alias Name: awverify; Points to Host Name: awverify.cloudfrontsdev.azurewebsites.net. Step 8: Save all changes by clicking on Save Zone File. It might take 10-15 minutes for the DNS entries to update. Step 9: Add the domain name in the Manage Domains section (Azure) and click OK. Reference: https://azure.microsoft.com/en-in/documentation/articles/web-sites-custom-domain-name/
