Blog Archives - Page 125 of 171

Category Archives: Blog

Resolving the “Form.RunModal is not allowed in write transaction” Error in Microsoft Dynamics NAV

Introduction:

Scenario: I had created a Print action button to run a report using REPORT.RUNMODAL. While running the report from the Windows Client, the following error is thrown: “Form.RunModal is not allowed in write transaction.”

Pre-requisites: Microsoft Dynamics NAV 2017

Cause of this error: RUNMODAL stops the current transaction and waits for user interaction, so all users who need the table are blocked. During a write transaction, we are not allowed to open an object with RUNMODAL.

Resolution to the error: Use a COMMIT statement before you call REPORT.RUNMODAL.

What does the COMMIT statement do? When the system enters a C/AL codeunit, it automatically begins a write transaction. When the system exits the C/AL code module, it automatically ends the write transaction by committing the updates made by the C/AL code. This means that if you want the C/AL codeunit to perform a single write transaction, the system handles it for you automatically. However, if you want the codeunit to perform multiple write transactions, you must use the COMMIT function to end one write transaction before you can start the next. The COMMIT function separates write transactions in a C/AL code module.

Example: The metasyntax below contains two write transactions. As execution begins, a write transaction is started automatically. With the COMMIT function, you tell the system that the first write transaction has ended and prepare it for the second. Once execution completes, the system automatically ends the second write transaction.

```
BeginWriteTransactions
  (C/AL Statements)  // Transaction 1
  COMMIT
  (C/AL Statements)  // Transaction 2
EndWriteTransactions
```
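As a concrete illustration of the resolution, here is a minimal C/AL sketch, assuming a hypothetical Print action whose source record drives the report; the report ID and names are illustrative, not taken from the original post:

```
// OnAction trigger of the hypothetical Print action
COMMIT;  // end the open write transaction so RUNMODAL is allowed
REPORT.RUNMODAL(REPORT::"Sales - Invoice", TRUE, FALSE, Rec);
```

Bear in mind that COMMIT makes all changes up to that point permanent, so place it only where a partially committed transaction is acceptable.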


Azure Machine Learning Cheat Sheet

Introduction: Microsoft released a PDF cheat sheet of the machine learning algorithms that can be used in Azure Machine Learning Studio. This Microsoft Azure Machine Learning Algorithm Cheat Sheet helps you choose the right machine learning algorithm for your predictive analytics solution from the Microsoft Azure Machine Learning library of algorithms. The algorithms are grouped into five categories:

Regression: For predicting values. For example, predicting a stock’s price.

Anomaly detection: For finding unusual data points. For example, highly unusual credit card spending patterns that deviate from normal ones.

Clustering: The data points have no labels associated with them. Instead, the goal of an unsupervised learning algorithm is to organize the data in some way or to describe its structure. For example, discovering companies with similar marketing strategies.

Two-class classification: When there are only two choices, it’s called two-class or binomial classification. For example, distinguishing between a cat and a dog.

Multi-class classification: For predicting three or more categories. For example, predicting the winner of a race.

To read the cheat sheet, read the path and algorithm labels on the chart as “For <path label>, use <algorithm>.” For example, “For speed, use two-class logistic regression.” Sometimes more than one branch applies. In that case, it is better to build scored models with each candidate algorithm and compare their accuracy to decide which one is the better fit. Even a beginner can use the cheat sheet to select an apt algorithm for their predictive solution. There are some generalizations and oversimplifications, but it points you in a safe direction. Many algorithms are not listed, but the ones that are give you a good head start in the ML world.


Blanket Sales Order in Dynamics NAV

Introduction: A blanket sales order represents a sales agreement between the company and a customer. It typically involves one item with multiple shipments at predetermined quantities, prices, and delivery dates.

Scenario: A customer orders 500 units of an item, to be delivered at 100 units per week.

Steps:
1) In the Search box, enter “Blanket Sales Orders” and select the related link.
2) Click New to create a new blanket sales order.
3) On the General FastTab, in the Sell-to Customer No. field, select the customer.
4) Keep the Order Date field blank. When the separate sales orders are created from the blanket order, the program sets the order date of each sales order to the current date.
5) On the Lines FastTab, in the Type field, select Item.
6) In the No. field, select the item.
7) In the Quantity field, specify 100.
8) Specify a date in the Shipment Date field.
9) Create four more lines, specifying a quantity of 100 and a shipment date on each line.
10) In the Qty. to Ship field, keep the quantity of 100 on the first line and delete the quantity to ship on the other four lines.
11) On the Home tab, click Make Order.
12) Click Yes to create an order.
13) You will get a message that a sales order has been created from the blanket order.
14) To open the sales order, select the first line on the blanket order.
15) On the Lines FastTab, point to Line, then to Unposted Lines, and then click Orders.
16) On the Home tab of the Sales Lines page, click Show Document. The sales order will appear.

Conclusion: By using a blanket sales order, an organization can sell a specified quantity or amount through multiple sales orders over time.


Data Loss Prevention in Office 365

Introduction: Data loss prevention (DLP) is a strategy for making sure that end users do not send sensitive information outside the corporate network. You can set up policies to help make sure information in email and documents isn’t shared with the wrong people. With a DLP policy, you can identify, track, and protect sensitive information across Office 365.

Create a DLP policy in the Office 365 Security & Compliance Center: Go to Office 365 Admin Center > Security & Compliance > Data Loss Prevention. You can choose to create a policy from a template or create a custom policy. In the next step, name your policy. Next, choose the location: all locations or specific ones. If you select “Let me choose specific locations”, you can pick the individual locations to protect.

Under policy settings, you can choose the base setting (Find content that contains) or use advanced settings. If you choose advanced settings, you can customize a new rule. By clicking New Rule, you get options to define the rule: provide the conditions and actions. In the conditions, you can add any of the available sensitive information types, or you can select a label that has been applied to documents for data classification. Labels need to be created and published first in order to use them in a DLP policy. You can create labels from Office 365 Security & Compliance, and labels can be applied to documents in OneDrive and SharePoint Online. You can also configure other settings such as user notifications, user overrides, and incident reports. After creating a rule, save the changes.

In the Conditions option, you can see the label applied to the DLP rule “Cloud Sensitive Information”, which was published first and then applied to the document. (The original post includes screenshots of the label applied to the Cloud DLP policy.) After creating the policy, it may take up to 24 hours for the changes to take effect.

Testing the DLP policy: After creating the policy, if a user tries to share a document with external users, they will see a policy tip. Likewise, if you try to send your organization’s sensitive information in an email outside the organization, a policy tip is shown. If the user overrides the policy tip, they have to enter a business justification or report it as a false positive.

Conclusion: This is how you can create DLP policies and prevent your organization’s classified data from leaking.


How to Issue and Redeem a Gift Card in Dynamics 365 for Finance and Operations

Introduction: In this blog we shall see how to issue and redeem a gift card in Dynamics 365 for Finance and Operations.

Issue a gift card:
1) On the POS, go to Sales.
2) Go to Actions and select Gift Card.
3) Click Issue Gift Card.
4) Enter the gift card number.
5) Enter the amount for the gift card.
6) Select any tender to pay.
You can check the details of the gift card under Gift Card in Dynamics 365 FOE.

Redeem a gift card:
1) On the transaction screen, select the products.
2) To redeem the gift card, select Pay Gift Card.
3) Enter the gift card number. You can check the balance by clicking Check Balance.
4) Proceed to pay. You can manually enter an amount, and the remaining due can be paid by another mode of payment.
You can check the details of the gift card under Gift Card in Dynamics 365 FOE.


Connect your Azure Machine Learning Predictive Solution to Power BI

Introduction: Azure Machine Learning Studio is an amazing tool that lets us create efficient ML experiments with simple drag-and-drop features. We can predict anything from flight delays to churn. But what if we want to present this predicted data in a more visually appealing format? It is possible, by representing your predictions in Power BI!

Pre-requisites:
- Basic understanding of Azure Machine Learning Studio.
- Basic understanding of Power BI.
- A blob container created on Azure Storage.

Steps:
1) Create your experiment in Azure Machine Learning Studio.
2) Convert your training experiment to a predictive experiment and deploy it as a web service.
3) Create a console application in Visual Studio, remove its existing code, and copy-paste the Batch Execution sample code. For full automation we could create automated data pipelines, but for now a simple console application will do.
4) Install the necessary NuGet packages and update the following parameters (a sketch of this configuration block follows this post):
- BaseUrl stays the same.
- The storage account name, storage account key, and storage container name are parameters found in the Azure blob storage account you created.
- The API key can be found on the web service page in Azure Machine Learning Studio.
- The input path is the path where you saved the input CSV file for batch execution. Your input CSV file should contain all the features you used to train your experiment.
5) After you run the console application, a new output1results.csv file should be generated in your blob container. The output should include the labels your experiment generates, including the Scored Labels and Scored Probabilities columns.
6) Now you can get your data in Power BI using Azure Blob Storage as the source, and use the columns in the output1results.csv file to build your ML prediction reports. (The original post includes a sample report screenshot.)

I hope this blog helps you combine Azure Machine Learning Studio and Power BI to create a powerful predictive solution.
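The Batch Execution sample that ML Studio generates is much longer; the part you actually edit is its configuration constants. Below is a minimal C# sketch of that block with placeholder values only; none of these are real endpoints, paths, or keys:

```csharp
// Configuration section of the generated Batch Execution console app (placeholder values)
static class BatchConfig
{
    // Copied from the web service page in Azure ML Studio
    public const string BaseUrl =
        "https://ussouthcentral.services.azureml.net/workspaces/<workspace>/services/<service>/jobs";
    public const string ApiKey = "<api key>";

    // From the Azure Storage account that holds the blob container
    public const string StorageAccountName = "<storage account name>";
    public const string StorageAccountKey = "<storage account key>";
    public const string StorageContainerName = "<container name>";

    // Local CSV with the same feature columns the experiment was trained on
    public const string InputFileLocation = @"C:\data\input.csv"; // hypothetical path
}
```

After a successful run, the results blob (for example output1results.csv) appears in the same container, and Power BI’s Azure Blob Storage connector can read it directly.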


Connecting to On-Prem SQL from Azure Web App

Background: When an enterprise transitions to the cloud, it may still need to leave some assets on-premises for technical or security reasons. Typically, SQL databases remain on-premises for most enterprises. But this should not stop the enterprise from having its web apps, APIs, services, and mobile apps in the cloud. The major hurdle in this scenario is connecting the cloud-based services to on-prem SQL for a seamless transition. Azure lets you build a layer on top of these on-prem assets while safely connecting back to your premises using Hybrid Connections. Supported assets include MS SQL Server, MySQL, or any resource that runs on a static TCP port.

Prerequisites:
- Visual Studio 2013 or later
- SQL Server 2008/2012 with SQL Server authentication
- Azure SDK
- Microsoft Azure subscription

Steps:
1) Create the SQL Server database and table. Create a SQL user, which will be used by the .NET application to connect. Also create some sample data in the table.
2) Create a .NET web application that reads data from the table created in Step 1. The connection string will look something like the sketch after this post. Host the application on local IIS and ensure it works and can connect to SQL.
3) Now host the application on Azure as a web app. You can refer to the link below for steps to create an Azure web app: https://github.com/Microsoft/HealthClinic.biz/wiki/Create-and-deploy-an-ASP.NET-web-app-in-Azure-App-Service You will notice that the application throws an error, because it cannot reach the on-prem SQL Server.
4) We will now create a Hybrid Connection to the SQL database. Navigate to the App Service we created earlier, then to Networking. Click Hybrid Connections > Configure your Hybrid Connection endpoints, and create a new Hybrid Connection, entering its details. Note: the TCP port for SQL Server is usually 1433; please check the port of the SQL instance you are configuring.
5) Download the Hybrid Connection Manager and install it on the SQL Server machine or any server on the same network.
6) Open the installed Hybrid Connection Manager UI and enter the connection string of the Hybrid Connection we created in Azure. You can get this connection string by clicking the Hybrid Connection in the portal.
7) If everything is configured properly, you should see the status as Connected, both in the tool and in Azure.

Other notes: If you are facing issues with the connection, you can restart the Hybrid Connection service from local services. Please comment below in case of queries.
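The original post showed the connection string in a screenshot that is not included here. As a minimal sketch, assuming placeholder server, database, and credentials, a SQL-authentication connection string and a quick connectivity test might look like this:

```csharp
using System;
using System.Data.SqlClient;

class ConnectivityTest
{
    static void Main()
    {
        // SQL Server authentication; host, database, and credentials are placeholders.
        // With a Hybrid Connection, keep the on-prem hostname:port you registered as the
        // endpoint (e.g. MYSQLHOST:1433); the relay resolves the server by that name.
        var connectionString =
            "Server=MYSQLHOST,1433;Database=SampleDb;User ID=webAppUser;Password=<password>;";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open(); // throws if the app cannot reach the on-prem SQL Server
            Console.WriteLine("Connected: " + connection.ServerVersion);
        }
    }
}
```

The key design point is that the hostname in the connection string must match the endpoint configured in the Hybrid Connection, since that is how the relay maps traffic back to the on-prem server.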


Multi-select Data Elements on Visuals in Power BI Desktop

Posted On February 27, 2018 by Admin

Introduction: In the latest Power BI Desktop update, Microsoft introduced a new feature: multi-select for data elements on visuals. In Power BI Desktop, we can highlight a data point on visuals simply by clicking the data point in the visual. This means that if we have an important bar chart and we want other visuals on the report page to highlight data based on our selection, we can click the data element in one visual and see the results reflected in the other visuals on the page. This is the basic behaviour: until now, we could only highlight data with single-select.

New feature: With the new multi-select feature, we can now select more than one data point on a Power BI Desktop report page and highlight the results across the visuals on the page. To multi-select data points in visuals, simply use Ctrl+Click to select multiple data points. (The original post includes screen captures of single-select and multi-select highlighting.)


Using Integrated GP Customers to Create Sales Orders through TIBCO Cloud Integration

Posted On February 27, 2018 by Admin

Introduction: Recently, we came across an issue while integrating Customers and Sales Orders between Dynamics 365 CRM and GP. Customers and Sales Orders were integrated successfully, except that when a Sales Order was created using an integrated Customer, an error regarding the credit limit would occur. The TIBCO Cloud Integration GP connector does not support setting a customer credit limit, which in turn prevents you from using that Customer to create a Sales Order.

Resolution:
1) In the GP Web Service Security Console, under the Create Customer policy, change the behaviour of Customer Class Defaulting from “Do Not Use Customer Class” to “Use Customer Class”.
2) In the Scribe mapping, pass a ClassKey_Id.
3) For the ClassKey_Id, set up a Customer Class with no Shipping Method or Tax Schedule ID and specify the credit limit properties.


Fixing the error ‘The request for path /NAV2018/dev/apps failed with code 422.’

Introduction: This is a workaround for the error ‘The request for path /NAV2018/dev/apps failed with code 422. Reason: The value “1473493” can’t be evaluated into type Date’. This is a bug accepted by Microsoft, so this blog demonstrates a workaround until it gets fixed.

Pre-requisite: NAV 2018

Demonstration: When converting C/AL text files to AL files using the TXT2AL utility, the InitValue property of a Date field is carried over from the C/AL object, but publishing the converted app then throws the error above. The workaround is to comment out the generated InitValue property and instead validate the field and add the data in code (a sketch follows below).

Conclusion: This is a bug, and the method above is only a quick fix until a new version is released.
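A minimal sketch of the workaround in the generated AL file, assuming a hypothetical table and Date field; the object number, names, and value are illustrative, not taken from the original post:

```al
table 50100 "Sample Setup"
{
    fields
    {
        field(1; "Start Date"; Date)
        {
            DataClassification = ToBeClassified;
            // Txt2Al carried the C/AL InitValue over as a raw number, which AL rejects:
            //   "The value '1473493' can't be evaluated into type Date"
            // InitValue = 1473493; // commented out until the converter bug is fixed
        }
    }
}
```

The default value can then be assigned in code instead, for example in the table’s OnInsert trigger, until Microsoft ships a fix.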

