Latest Microsoft Dynamics 365 Blogs | CloudFronts

Develop D365 Finance, SCM and Retail Reports using Azure Data Lake Gen2

The BYOD (bring your own database) feature for Dynamics 365 for Finance and Operations was released in 2016. It lets customers bring their own Azure SQL database, but the drawback is that the Entity Store is accessible only to the built-in data warehouse, which means it cannot be reached from outside D365. The new Data Lake Gen2 feature keeps the Entity Store in sync with the Data Lake. The following are the steps to set up the environment.

First, create the storage account for Data Lake Gen2. Click Create a resource, search for Storage accounts, and click Add. Choose the subscription and resource group (the resource group is the container that holds the resource); here we are creating a new one for our Data Lake. Make sure you select the same location as your Power BI dataset environment and set the other options as shown below. Once the storage account is created, open it from the resource group and take a copy of the connection string, since it will be required later; it can be accessed under Settings in the storage account.

Next, create a Key Vault resource to store the secret for the connection string. The secret can be created by opening the Key Vault, going to Secrets under the Settings tab, and clicking Generate/Import. Set the value of the secret to the connection string copied earlier and click Create.

Once the secret is created, the next step is to authorize the user and resource. Here we register an app for authorization of the D365FO environment. To register the app, go to Azure Active Directory, select App registrations, and click New registration. Enter an application name of your choice, set the redirect URI type to Web, and enter the D365 environment URI as shown below. After registering the app, grant it the API permissions it requires as part of the consent process: from Azure Key Vault, select the user_impersonation permission, which provides full access to the Azure Key Vault service. Then create a client secret, and make sure to note down the generated value, since we are going to use it in the D365FO environment. Next, add D365FO to the Key Vault access policy list and set the Key and Secret permissions to Get and List from the drop-down.

Once D365FO is in the access policy list, the next step is to add the Application ID and application secret of the Azure Key Vault app to the Data connections settings in the D365FO environment, which can be accessed via Modules > System administration > Setup > System parameters > Data connections tab.

Note: If the Data connections tab is not visible, you are missing a configuration setting from environment setup. To make the tab available, check the value of CDSAMaster in SQL Server Management Studio: open SSMS, go to the AxDW database, and check SYSFLIGHTING for CDSAMaster; if it is not present, insert the value into the table as shown below.
SELECT * FROM SYSFLIGHTING /* check the existing flight name values */

INSERT INTO SYSFLIGHTING VALUES ('CDSAMaster', 1, 12719367, 5637144576, 5637144589, 1)
/* 'CDSAMaster' = FlightName, 1 = Enabled, 12719367 = FlightServiceID,
   5637144576 = Partition, 5637144589 = RecID, 1 = RecVersion */

(Screenshots show the table before and after the update.)

Once the Data connections tab is available, fill in the Application ID, application secret, DNS name, and connection secret name as shown below, and make sure to enable Data Lake integration. Then test the Azure Key Vault and Azure Storage connections.

Note: If you get a 401 error, the Azure user under which the data lake is hosted has no access to the D365FO environment; in that case, import the user into the environment and assign the Administrator role.

Create Reports using Azure Data Lake Gen2

Once the above steps are done, the next step is to configure the storage account for Power BI. The requirements are:
- The storage account must be created in the same AAD tenant as your Power BI tenant.
- The storage account must be created in the same region as your Power BI tenant.
- The storage account must have the hierarchical namespace feature enabled (make sure to enable this at storage account creation time).
- The Power BI service must be granted a Reader role on the storage account.
- You must have a Global Administrator account; it is required to connect and configure Power BI to store the dataflow definition and data in your Azure Data Lake Storage Gen2 account.

Since we created the storage account earlier, grant it the Reader role: in the Azure portal, go to the storage account > Access control > Role assignments, then click Add role assignment. Once the role is assigned, the next step is to assign directory-level permissions on the storage we created; here we grant Power BI permission to the file system. We need the object IDs of the Power BI services, which can be found by navigating to AAD > Enterprise applications > All applications; copy the object IDs of Power BI Premium, Power BI Service, and Power Query Online. For each Power BI object collected in the previous step, grant the access shown below. Once access is granted, the next step is to connect your Azure Data Lake Storage Gen2 to Power BI: go to your Power BI service, click Admin portal, navigate to Dataflow settings, and select the Connect your Azure Data Lake Storage Gen2 button. The following … Continue reading Develop D365 Finance, SCM and Retail Reports using Azure Data Lake Gen2
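As a quick sanity check on the setup above, the connection string stored in Key Vault can be verified from code. Below is a minimal sketch using the Azure.Storage.Files.DataLake SDK; the environment variable name is a stand-in for however you supply the connection string, and the exact container name created by the D365FO sync may vary by environment.

using System;
using Azure.Storage.Files.DataLake;

class LakeCheck
{
    static void Main()
    {
        // Assumption: the connection string copied from the storage account
        // is supplied via an environment variable for this check.
        var connectionString = Environment.GetEnvironmentVariable("LAKE_CONNECTION_STRING");
        var service = new DataLakeServiceClient(connectionString);

        // List the file systems (containers); after a successful D365FO sync,
        // a container holding the Entity Store data should appear here.
        foreach (var fs in service.GetFileSystems())
            Console.WriteLine(fs.Name);
    }
}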


Schedule Serverless CRON Job to Pull data from REST APIs (Part – 1)

A REST API is an application program interface that uses HTTP requests to GET, PUT, POST, and DELETE data; it is an architectural-style approach to communicating with third-party applications. REST APIs are useful when we need to integrate our module with third-party applications, or pull data from a third-party application into our database for analysis. For analysis purposes we can consume a REST API from SSIS using a third-party connector, but the problem with this approach is that it requires an on-premises server for package deployment and job scheduling. The alternative approach is to use a serverless CRON schedule; since mid-2018, the serverless development methodology has been steadily displacing this kind of traditional deployment. In this blog we are going to see how to consume a REST API with a serverless CRON-triggered function. Here we use the Zoho People API to integrate HR module data into an Azure DB. With the Zoho People API, you can extract employee data and form data in XML or JSON format to develop new applications or integrate with your existing business applications. The Zoho People API is independent of programming language, which lets you develop applications in any language (reference).

Implementation

1. Authentication token generation. To access the Zoho People API, a Zoho People authentication token is required. The token can be generated in browser mode or API mode. In API mode, each request must include a username or email and password in the POST body. Another approach is to register your app with Zoho at zoho.com/developerconsole and add a Client ID; once added, it can be used to generate a new access_token.

2. We can check the response by issuing a POST request: open any REST client and send a request using any REST API method. Here we send a request to get leave data, using the VS Code REST Client. If the request is valid, we get a response in the expected format containing the requested data.

3. As we are getting a proper response, the next step is to create the CRON schedule (the serverless approach) to pull data from the REST API; here we use Azure App Service to create a timer-trigger function app. A CRON expression is a time-based job schedule definition with six fields:

{second} {minute} {hour} {day} {month} {day-of-week}

Each field can take one of the following types of values:
- A specific value: "0 5 * * * *" triggers at hh:05:00, where hh is every hour (once an hour).
- All values (*): "0 * 5 * * *" triggers at 5:mm:00 every day, where mm is every minute of the hour (60 times a day).
- A range (- operator): "5-7 * * * * *" triggers at hh:mm:05, hh:mm:06, and hh:mm:07, where hh:mm is every minute of every hour (3 times a minute).
- A set of values (, operator): "5,8,10 * * * * *" triggers at hh:mm:05, hh:mm:08, and hh:mm:10, where hh:mm is every minute of every hour (3 times a minute).
- An interval value (/ operator): "0 */5 * * * *" triggers at hh:05:00, hh:10:00, hh:15:00, and so on through hh:55:00, where hh is every hour (12 times an hour).

4. The next step is to create the function app. For development we use Visual Studio Community 2019/2017. In Visual Studio, create a project via File > New > Project and select Visual C# (we use C# here for development; you can choose PHP, Python, or F# as you prefer).
Under Visual C#, select Azure Functions and click Next. The next step is to configure the new Azure Functions application: select the Timer trigger function, choose your Azure Functions version (v1, v2, or v3), set the authorization level to Anonymous (we don't want to require an API key for the function), and keep the other settings as they are. After clicking the Create button, Visual Studio generates the Azure Function App solution with the directory structure shown below; a minimal sketch of the generated timer function follows below as well. The next step is to publish the function to Azure, which deploys it to IIS and Azure; that we will see in the next part of this blog.
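For reference, this is roughly what the generated timer-trigger function looks like; the CRON expression in the TimerTrigger attribute is where the six-field schedule from step 3 goes. A minimal sketch based on the standard Visual Studio template (the every-5-minutes schedule shown is the template default, not a value from this post):

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

public static class Function1
{
    // Fires at second 0 of every 5th minute:
    // {second} {minute} {hour} {day} {month} {day-of-week}
    [FunctionName("Function1")]
    public static void Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer, ILogger log)
    {
        // The REST API pull logic goes here (see the later parts of this blog).
        log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
    }
}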


Schedule Serverless CRON Job to Pull data from REST APIs (Part – 2)

In this blog we'll see how to publish and deploy the Azure Function to App Service. Continuing from the previous part of this blog, right-click the solution and click Publish; it will ask you to select the appropriate subscription and resource group, so select the options relevant to you. The following publishing targets are available:
- Azure App Service
- Azure Virtual Machines
- File system
- Local web server (IIS)

Note: Make sure you have signed in with the proper credentials and that Cloud Explorer is connected. Once we publish the function, we can check it by going to https://portal.azure.com/ and searching for the Function App. Our function app appears in read-only mode, since we deployed it from the Visual Studio environment.

The next step is to develop the logic that pulls data from Zoho People. Make sure to add all reference libraries required for development; NuGet packages can be managed by right-clicking the solution and selecting Manage NuGet Packages. Add all the required references to your script file:

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using System.Data.SqlClient;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Collections.Generic;
using System.Linq;

As our destination is an Azure DB, we need to create an environment variable to store the ADO.NET connection string, either in the local.settings.json file or by clicking Azure App Service Settings. Click Add Setting and enter a setting name (in the following diagram we create a setting for the Azure DB connection), then click OK and provide the value for the setting. The Azure connection string can be obtained from the Azure portal: search for the database name, click Connection strings, copy the ADO.NET connection string, and replace {your_password} with your server password. After adding the environment variable, we can see it in the local.settings.json file; a short sketch of reading it from code follows below. The next step is to implement the code logic and debug it locally, which we will see in the final part of this blog.
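Inside the function, the setting added above surfaces as an environment variable at runtime. A minimal sketch; the setting name AzureDBConnString is a hypothetical stand-in for whatever name you chose in the previous step:

using System;
using System.Data.SqlClient;

class ConnectionDemo
{
    static void Main()
    {
        // Function App settings are exposed as environment variables;
        // "AzureDBConnString" is a hypothetical name, use the one you created.
        var connStr = Environment.GetEnvironmentVariable("AzureDBConnString");

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();  // throws if the copied connection string or password is wrong
            Console.WriteLine($"Connected to: {conn.Database}");
        }
    }
}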


Schedule Serverless CRON Job to Pull data from REST APIs (Part – 3)

In this part we'll see how to debug the code locally. Once we are done with the configuration settings, the next step is to call the REST API in our code; here we use C# for development (a sketch follows below). Once the code logic is done, we can run it locally by adding a breakpoint. The Azure Function App launches in a local Functions host, and the trigger becomes available locally at http://localhost:7071/api/Function1; the default listening port is 7071. We can change the port number by going to the project settings and setting the application arguments, for example:

host start --pause-on-error --port 7079

We can also launch multiple function apps at the same time: add breakpoints as needed, start one function without debugging, start another through the debugger, and attach the debugger to the running function app via Debug > Attach to Process, searching for func. The greyed-out function is the one already running; the other is running through the debugger. Select the other one, click Attach, and then we can open both function apps in the browser, with a different port assigned to each. Finally, publish the app to Azure, and make sure to add all the parameters under Configuration on the Function App.

Note: For every change to the app, we need to rebuild and republish it.
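The post does not include the final code, so here is a minimal sketch of what the pull logic could look like inside the timer function: call the Zoho People leave API with the access token, parse the JSON with Newtonsoft.Json, and insert rows into the Azure DB. The endpoint URL, setting names, JSON shape, and staging table are all assumptions for illustration; check Zoho's API documentation for the actual request and response formats.

using System;
using System.Data.SqlClient;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json.Linq;

public static class ZohoLeavePull
{
    private static readonly HttpClient Http = new HttpClient();

    [FunctionName("ZohoLeavePull")]
    public static async Task Run(
        [TimerTrigger("0 0 1 * * *")] TimerInfo timer, ILogger log) // daily at 01:00, an example schedule
    {
        // Hypothetical setting names; create them as shown in part 2 of this blog.
        var token = Environment.GetEnvironmentVariable("ZohoAccessToken");
        var connStr = Environment.GetEnvironmentVariable("AzureDBConnString");

        // Hypothetical endpoint; confirm the real form/view name in the Zoho People docs.
        var url = "https://people.zoho.com/people/api/forms/leave/getRecords";
        var request = new HttpRequestMessage(HttpMethod.Get, url);
        request.Headers.Authorization = new AuthenticationHeaderValue("Zoho-oauthtoken", token);

        var response = await Http.SendAsync(request);
        response.EnsureSuccessStatusCode();
        var payload = JObject.Parse(await response.Content.ReadAsStringAsync());

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            // Assumed JSON shape and staging table; adjust both to the actual response.
            foreach (var record in payload.SelectTokens("response.result[*]"))
            {
                using (var cmd = new SqlCommand(
                    "INSERT INTO LeaveStaging (RawJson, LoadedOn) VALUES (@j, GETUTCDATE())", conn))
                {
                    cmd.Parameters.AddWithValue("@j", record.ToString());
                    cmd.ExecuteNonQuery();
                }
            }
        }
        log.LogInformation("Zoho leave pull completed.");
    }
}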


Auto scale the Power BI Embedded capacity using Job Scheduler in Azure

Power BI Embedded is a Microsoft Azure service that lets ISVs and developers embed visuals, reports, and even dashboards into their applications. Power BI Embedded is a PaaS analytics solution that provides Azure-based capacity and charges customers on an hourly basis; there is no annual commitment for the service. Since it charges hourly and there is no direct auto-scaling feature available on Azure, we can use the provided API to scale the capacity ourselves. In this blog we are going to see how to scale the Power BI Embedded capacity using a PowerShell script. Before we start, let's quickly list the prerequisites:
- An Azure account. If you are implementing the PowerShell script for your organisation, you must have the co-administrator role assigned; keep in mind that with only the contributor role assigned you will not be able to create an Automation account (more about the Automation account later in this blog).
- A Power BI Embedded subscription.
- An Automation account.

I'm assuming you already have an Azure account along with a subscription for Power BI Embedded.

Steps:
1. Create an Automation account. An Automation account is used to manage Azure resources across all the subscriptions of a given tenant. To create one, click Create a resource in your Azure portal as shown below and search for Automation account, or type Automation account in the search box.
2. Click Create Automation Account and make sure to fill in the details. If you have multiple subscriptions, select the proper one from the drop-down, and make sure Create Azure Run As account is set to Yes (if you are a co-administrator or administrator, it is selected by default). Once created, the account appears under Automation accounts.
3. Open the Automation account, go to Connections, and add the connections and types shown below (click Add a connection and enter the name and type as shown).
4. For the AzureClassicRunAsConnection, set CertificateAssetName to AzureRunAsCertificate.
5. Add the Power BI Embedded subscription to your resource group.
6. Once the Automation account is ready, go to Runbooks under Process Automation in the Automation account. A runbook is useful for routine procedures and operations; an Azure Function App could be used instead of a runbook.
7. Click Create a runbook and fill in the details shown below.
8. Once the runbook is open, make sure to import the AzureRM.PowerBIEmbedded module, which can be installed under Modules in Shared Resources: click Browse gallery and search for the AzureRM.PowerBIEmbedded module.
9. Use the PowerShell script below, which can also be found on the Power BI discussion site.
$resourceGroupName = "<your resource group>"
$instanceName = "<Power BI Embedded instance name>"
$azureProfilePath = ""
$azureRunAsConnectionName = "AzureRunAsConnection" # "PowerBIAutoscale"
$configStr = "
[
  { Name: ""Weekday Heavy Load Hours"", WeekDays:[1,2,3,4,5], StartTime: ""06:45:00"", StopTime: ""23:45:00"", Sku: ""A4"" },
  { Name: ""Early AM Hours"", WeekDays:[0,1,2,3,4,5,6], StartTime: ""00:00:00"", StopTime: ""04:44:00"", Sku: ""A1"" },
  { Name: ""Model Refresh"", WeekDays:[0,1,2,3,4,5,6], StartTime: ""04:45:00"", StopTime: ""06:45:00"", Sku: ""A3"" },
  { Name: ""Weekend Operational Hours"", WeekDays:[6,0], StartTime: ""06:45:00"", StopTime: ""18:00:00"", Sku: ""A3"" }
]
"

$VerbosePreference = "Continue"
$ErrorActionPreference = "Stop"

Import-Module "AzureRM.PowerBIEmbedded"
Write-Verbose "Logging in to Azure..."

# Load the profile from a local file
if (-not [string]::IsNullOrEmpty($azureProfilePath)) {
    Import-AzureRmContext -Path $azureProfilePath | Out-Null
}
# Load the profile from the Azure Automation Run As connection
elseif (-not [string]::IsNullOrEmpty($azureRunAsConnectionName)) {
    $runAsConnectionProfile = Get-AutomationConnection -Name $azureRunAsConnectionName
    Add-AzureRmAccount -ServicePrincipal -TenantId $runAsConnectionProfile.TenantId `
        -ApplicationId $runAsConnectionProfile.ApplicationId `
        -CertificateThumbprint $runAsConnectionProfile.CertificateThumbprint | Out-Null
}
# Interactive login
else {
    Add-AzureRmAccount | Out-Null
}

$fmt = "MM/dd/yyyy HH:mm:ss" # format string
$culture = [Globalization.CultureInfo]::InvariantCulture

$startTime = Get-Date
Write-Verbose "Current Local Time: $($startTime)"
$startTime = [System.TimeZoneInfo]::ConvertTimeBySystemTimeZoneId($startTime, [System.TimeZoneInfo]::Local.Id, 'Eastern Standard Time')
Write-Verbose "Current Time EST: $($startTime)"
$scheduleTimeMidnight = ($startTime).Date
Write-Verbose "Schedule Time Base (Midnight): $($scheduleTimeMidnight)"
$currentDayOfWeek = [Int]($scheduleTimeMidnight).DayOfWeek
Write-Verbose "DOW: $($currentDayOfWeek)"

$stateConfig = $configStr | ConvertFrom-Json
# Alternatively: Select-Object Sku, WeekDays, Name, plus computed properties such as
# @{Name="StartTime"; Expression={[DateTime]::ParseExact($_.StartTime, $fmt, $culture)}}

Write-Verbose "Writing Config Objects..."
foreach ($x in $stateConfig) {
    Write-Verbose "Name: $($x.Name)"
    Write-Verbose "Weekdays: $($x.WeekDays -join ',')"
    $x.StartTime = ($scheduleTimeMidnight).AddHours([int]$x.StartTime.Split("{:}")[0]).AddMinutes([int]$x.StartTime.Split("{:}")[1]).AddSeconds([int]$x.StartTime.Split("{:}")[2])
    Write-Verbose "Start Time: $($x.StartTime)"
    $x.StopTime = ($scheduleTimeMidnight).AddHours([int]$x.StopTime.Split("{:}")[0]).AddMinutes([int]$x.StopTime.Split("{:}")[1]).AddSeconds([int]$x.StopTime.Split("{:}")[2])
    Write-Verbose "End Time: $($x.StopTime)"
}

Write-Verbose "Getting current status..."
# Get the capacity status
$pbiService = Get-AzureRmPowerBIEmbeddedCapacity -ResourceGroupName $resourceGroupName
switch ($pbiService.State) {
    "Scaling" { Write-Verbose "Service scaling operation in progress... Aborting."; exit }
    "Succeeded" { Write-Verbose "Current Status: Running" }
    Default { Write-Verbose "Current Status: $($pbiService.State)" }
}
Write-Verbose "Current Capacity: $($pbiService.Sku)"

# Find a match in the config
$dayObjects = $stateConfig | Where-Object { $_.WeekDays -contains $currentDayOfWeek }

# If no matching day then fall through to the default tier below
if ($dayObjects -ne $null) {
    # Several objects for the same time frame can't be handled; if there is more than one, pick the first
    $matchingObject = $dayObjects | Where-Object { ($startTime -ge $_.StartTime) -and ($startTime -lt $_.StopTime) } | Select-Object -First 1
    if ($matchingObject -ne $null) {
        Write-Verbose "Current Config Object"
        Write-Verbose $matchingObject.Name
        Write-Verbose "Weekdays: $($matchingObject.WeekDays -join ',')"
        Write-Verbose "SKU: $($matchingObject.Sku)"
        Write-Verbose "Start Time: $($matchingObject.StartTime)"
        Write-Verbose "End Time: $($matchingObject.StopTime)"
        # If paused, resume
        if ($pbiService.State -eq "Paused") {
            Write-Verbose "The service is Paused. Resuming the Instance"
            $pbiService = Resume-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -ResourceGroupName $resourceGroupName -PassThru -Verbose
        }
        # Change the SKU if needed
        if ($pbiService.Sku -ne $matchingObject.Sku) {
            Write-Verbose "Updating Capacity Tier from $($pbiService.Sku) to $($matchingObject.Sku)"
            Update-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -sku $matchingObject.Sku
        }
    }
    else {
        Write-Verbose "No Interval Found. Checking current capacity tier."
        if ($pbiService.Sku -ne "A2") {
            Write-Verbose "No Interval Found. Scaling to A2"
            Write-Verbose "Updating Capacity Tier from $($pbiService.Sku) to A2"
            Update-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -sku "A2"
        }
    }
}
else {
    Write-Verbose "No Interval Found. Checking current capacity tier."
    if ($pbiService.Sku -ne "A2") {
        Write-Verbose "No Interval Found. Scaling to A2"
        Write-Verbose "Updating Capacity Tier from $($pbiService.Sku) to A2"
        Update-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -sku "A2"
    }
}
Write-Verbose "Done!"

10. The script above does not include pausing the capacity; that can be added to the script.
11. Once done with the script, click Save and then publish the script.
12. Create the schedule under Shared Resources … Continue reading Auto scale the Power BI Embedded capacity using Job Scheduler in Azure


How to set the interaction between visuals in Power BI

Posted On October 10, 2019 by Yogesh Gore

Power BI provides interactive features that allow easy navigation and filtering of visuals on click, but we can also configure how visuals interact with one another: some visuals can default to filter behaviour while others use the highlight method.

Types of interaction:
- No-way: no interaction between visuals.
- One-way: interaction occurs from one visual to another, but not in the reverse direction.
- Two-way: interaction occurs between one visual and another in both directions.

How to set the interaction? Go to the Edit interactions button on the Format tab of the ribbon in Power BI Desktop. Once you click the Edit interactions button, you can edit the interactions between the different visuals. As shown, for the Day-wise Revenue card we disabled the interaction from the Month range filter, and for Total Month Revenue we disabled the interaction from the Day range filter; now if we select a day from the drop-down filter, we can see that there is no change in Total Month Revenue. Hope this helps!


How to Sync Slicer in Power BI?

Posted On October 10, 2019 by Yogesh Gore

This blog will explain how to sync slicers in Power BI Desktop. With slicer syncing, all the pages where the slicer has been applied get synchronized.

How to apply slicer syncing?
Step 1: Select the "Date Selection" slicer > open the View menu > click "Sync slicers".
Step 2: Clicking Sync slicers opens the settings for the slicer. Based on our requirement, we can select different combinations of pages to apply the slicer to, and it will be synced across all of those tabs. Hope this helps!


Remove Duplicates in Power Query

Posted On October 10, 2019 by Yogesh Gore

One can make better and faster-informed decisions using Microsoft Power BI. The business data on your systems holds a lot of information that can help you scale up your business, and you can unlock its true potential with business intelligence software. Microsoft is a leading company when it comes to creating products that help a business expand, and Power BI is easy to learn and use. Still, many people struggle with the software once they install it because they do not know how it behaves; for example, removing duplicates in Power Query can look like a real struggle if you do not know how to do it. In this blog, we are going to see how to remove duplicates from a column using Power Query. Normally we can apply a transformation directly on the column and use Remove Duplicates, but there are situations where Remove Duplicates does not appear to work.

Situation 1: Consider a scenario where we have the same GUID but with different letter casing; in that case the duplicate will not get removed.
Solution: Power Query is a case-sensitive language, so abcdefg123 and Abcdefg123 are considered different values. If you want to remove duplicates regardless of letter case, you first have to apply a transformation to change them all to one case, either UPPERCASE or lowercase.

Situation 2: In other scenarios the data has leading or trailing spaces, so even after changing the case you would still have the extra space, which makes the two texts differ. To deal with that, apply the Trim transformation to remove the extra spaces before removing duplicates. The same behaviour is easy to reproduce outside Power Query, as the sketch below shows.
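To make the behaviour concrete, here is the same idea expressed in C# (an illustration of the comparison semantics, not Power Query code): an ordinal, case-sensitive comparison keeps all three values, while trimming and comparing case-insensitively collapses them to one.

using System;
using System.Linq;

class DedupDemo
{
    static void Main()
    {
        // Same GUID with different casing, plus one with a stray trailing space.
        var ids = new[] { "abcdefg123", "Abcdefg123", "abcdefg123 " };

        // Case-sensitive, untrimmed: all three survive,
        // just like Power Query's default Remove Duplicates.
        Console.WriteLine(ids.Distinct(StringComparer.Ordinal).Count());            // 3

        // Trim first, then compare case-insensitively: the duplicates collapse.
        Console.WriteLine(ids.Select(s => s.Trim())
                             .Distinct(StringComparer.OrdinalIgnoreCase).Count());  // 1
    }
}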


How to set the default value in the report filter pane?

Posted On October 10, 2019 by Yogesh Gore

Microsoft Power BI, in a nutshell, is business intelligence software that provides a business analytics solution. It helps companies visualize their data and share observations and insights across all departments in the organization, so management teams can make crucial decisions based on the information in these reports. Decision-making becomes a lot better and faster with this software; every employee can see where they are doing well and where exactly they need to improve, and companies can scale up using the data.

In this blog, we are going to see how to set a default value in the report filter pane in Power BI. Sometimes we come across a requirement to default a day filter to the current month. We cannot use DAX in a report-level filter directly, but we can work around this by creating the following DAX (a calculated column on the Calendar table):

Default Filter = IF(MONTH(Calendar[Date]) = MONTH(NOW()) && YEAR(Calendar[Date]) = YEAR(NOW()), "This Month", "")

You can then drag this field into the report-level filter and select the value "This Month".


How to set up POS Hybrid App on Android

Microsoft Dynamics 365 for Finance and Operations is an ERP solution created by Microsoft; ERP is an acronym for Enterprise Resource Planning, and the product is useful for both medium and large enterprises that want to excel and grow. There are many ERP solutions on the market, but Microsoft Dynamics is one of the best you can find, with numerous unique features, and using it can make your staff considerably more effective and efficient. Learning this software is not a difficult task; there are many resources available online that you can check if you want to excel in it or if you have any doubts.

In this blog, we'll see how to set up the POS Hybrid App on Android using Xamarin. A hybrid app combines elements of both native and web apps: a native app is built for a specific platform, while web applications are generalized for multiple platforms and are not installed locally but made available over the Internet through a browser. Xamarin is a cross-platform implementation of the Common Language Infrastructure (CLI) and the Common Language Specification. The elements used to develop an application for Android are:
- the C# language
- the Mono .NET framework
- a compiler

The following are the steps to configure and set up the POS Hybrid App.

1. Install Xamarin. Xamarin provides a common development experience for creating cross-platform mobile applications. It facilitates the development of Android and iOS applications through the Xamarin.iOS and Mono.Android libraries, which are built on top of the Mono .NET framework and bridge the gap between the application and the platform-specific APIs. Open the Visual Studio installer; you will see the screen below. Make sure the components shown below are selected.

2. Xamarin configuration. Xamarin.Android uses the Java Development Kit (JDK) and the Android SDK to build apps. During installation, the Visual Studio installer places these tools in their default locations and configures the development environment with the appropriate path configuration. You can view and change these locations by clicking Tools > Options > Xamarin > Android Settings.

3. Android SDK Manager. Visual Studio includes an SDK manager that lets you download Android SDK components. Uncheck everything, then check only the following:
- Android SDK Tools
- Android SDK Platform-tools
- Android SDK Build-tools (I don't think this is necessary)

4. Select the Android SDK Manager.

5. You can always debug using an actual Android device instead of an emulator. If you need to keep Hyper-V turned on on your machine, you will need to go with the Visual Studio Android Emulator: to open or configure it, go to Tools > Visual Studio Emulator for Android, and you can launch or configure the emulator in the dialog that pops up. To make the Android SDK Emulator run without lagging, install Intel HAXM; however, Intel HAXM cannot work together with Hyper-V, so we need to turn Hyper-V off. Go to Control Panel > Turn Windows features on or off and uncheck Hyper-V, and make sure the virtualization technology settings are enabled in the BIOS. Once HAXM is installed, we can proceed to the Android emulator, which opens the Android Virtual Device Manager dialog. The Android Emulator can be run in a variety of configurations to simulate different devices; each configuration is called a virtual device.
When you deploy and test your app on the emulator, you select a pre-configured or custom virtual device that simulates a physical Android device.

6. SDK configuration setup. In the Retail SDK folder, open SampleExtensions\HybridApp\Android\solution. Build and deploy using the emulator and verify that everything appears as it should.

7. Export the Android app. Specify the application icon and version the application: Version is an integer value (used internally by Android and the application) that represents the version of the application; most applications start out with this value set to 1, and it is incremented with each build. Version Name is a string used only for communicating version information to the user. These values can be set in the Android Manifest section of the project properties, as shown in the following screenshot.

8. Shrink the APK. Xamarin.Android APKs can be made smaller through a combination of the Xamarin.Android linker, which removes unnecessary managed code, and the ProGuard tool from the Android SDK, which removes unused Java bytecode.

9. Archive for publishing. To begin the publishing process, right-click the project in Solution Explorer and select Archive. When we select Ad hoc, Visual Studio opens the signing dialog; the app must be signed with a signing key, and if we have already signed before, we can use the import key option. Clicking New opens the Create keystore dialog; fill in the required information and it will create a keystore, which is then shown under the signing section. As we are creating an ad hoc publish, click Save; it will create the archive file and ask for a password if one was not set earlier.

10. The final step is to distribute the app. There are different ways to publish it: Google Play, the web, or independent distribution.

11. Sign in with the credentials; the device gets registered, and we can see the POS Retail UI. A minimal sketch of the hybrid pattern itself follows below.
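For readers new to the hybrid pattern, the essence is a thin native shell hosting web content. This is not the Retail SDK's HybridApp code, just a minimal Xamarin.Android sketch of the idea, with a placeholder URL standing in for the POS web address:

using Android.App;
using Android.OS;
using Android.Webkit;

// A bare-bones hybrid shell: a native Android activity whose entire UI is a WebView.
[Activity(Label = "PosHybridDemo", MainLauncher = true)]
public class MainActivity : Activity
{
    protected override void OnCreate(Bundle savedInstanceState)
    {
        base.OnCreate(savedInstanceState);

        var webView = new WebView(this);
        webView.Settings.JavaScriptEnabled = true;   // the POS web client needs JavaScript

        // Placeholder URL; the real app points at the Cloud POS / retail server address.
        webView.LoadUrl("https://example.com/pos");

        SetContentView(webView);
    }
}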

