Expense management in NAV 2018
Introduction: In previous versions of NAV, employees had to be set up as vendors in order to post expenses. NAV 2018 now allows expenses to be posted directly against employee cards, with a range of tools to simplify the process.

Features: In NAV 2018, a new value, Employee, is added to the Account Type field. With it you can:
- Create journal entries directly for employees (only local currency (LCY) is supported).
- Use an Employee Posting Group, definable on the employee card.
- Make payments to the employee in the Payment Journal.
- Get payment suggestions with a full list of outstanding employee payments.
- Apply payments to open employee entries in one go, linking the payment to the employee journal entry and closing both.
- Correct mistakes by unapplying payments.
Posting a general journal line with Employee as the account type or balancing account type generates an employee ledger entry. When posting a general journal line for an employee, the Document Type field must be either blank or set to Payment (a minimal code sketch of this is shown below).

Conclusion: This feature is very useful for reporting purposes as well as for maintaining employee-wise details in the system.
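To illustrate the posting rules above in code, here is a minimal AL sketch (not from the original post) that creates and posts a general journal line with Employee as the account type; the journal template, batch, employee number, document number and balancing G/L account are hypothetical placeholders.

codeunit 50100 "Post Employee Expense Sketch"
{
    procedure PostEmployeeExpense()
    var
        GenJnlLine: Record "Gen. Journal Line";
        GenJnlPostLine: Codeunit "Gen. Jnl.-Post Line";
    begin
        GenJnlLine.Init();
        GenJnlLine."Journal Template Name" := 'GENERAL';   // hypothetical template
        GenJnlLine."Journal Batch Name" := 'DEFAULT';      // hypothetical batch
        GenJnlLine."Line No." := 10000;
        GenJnlLine."Posting Date" := WorkDate();
        // Document Type must be blank or Payment for employee lines
        GenJnlLine."Document Type" := GenJnlLine."Document Type"::" ";
        GenJnlLine."Document No." := 'EXP-001';
        // New in NAV 2018: Employee is available as an account type
        GenJnlLine."Account Type" := GenJnlLine."Account Type"::Employee;
        GenJnlLine.Validate("Account No.", 'EMP0001');
        GenJnlLine.Validate(Amount, 150);                  // LCY only
        GenJnlLine."Bal. Account Type" := GenJnlLine."Bal. Account Type"::"G/L Account";
        GenJnlLine.Validate("Bal. Account No.", '8640');   // hypothetical expense account
        GenJnlLine.Insert(true);

        // Posting the line generates an employee ledger entry
        GenJnlPostLine.RunWithCheck(GenJnlLine);
    end;
}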
Process to Provision Target Tool in Dynamics 365 Finance and Operations
Introduction: When copying a database between environments, you need to run the Environment Re-provisioning tool before the copied database is fully functional, to ensure that all the Retail components are up to date.

Steps to run the Re-provisioning tool:
1. Log in to your LCS project.
2. Go to the Asset Library.
3. Select Software deployable package.
4. Click Import.
5. Select the Environment Reprovisioning Tool package.
6. Click Pick.
7. Once this package is available in your Asset Library, upload it to your environment and schedule it to run just as you would any other deployable package.
8. Go to Maintain and apply the update.
How to configure Visual Studio Code (VS Code) for Extensions in Microsoft Dynamics NAV 2018
Introduction: Development in Microsoft Dynamics NAV 2018 through extensions is done using Visual Studio Code. This article gives the steps to configure Visual Studio Code for extension development against Microsoft Dynamics NAV 2018.

Pre-requisites:
- Microsoft Dynamics NAV 2018 CU 1
- Visual Studio Code

Steps:
1. Run the Microsoft Dynamics NAV 2018 setup, click Add or remove components, then select the Modern Development Environment and choose Run from My Computer.
2. Open the Dynamics NAV Administration Console, select the server instance and click Edit.
3. Under the Development tab at the bottom, enable the Developer Service Endpoint and Enable loading application symbol references at server startup.
4. Download Visual Studio Code and install it.

Method 1:
1. Open VS Code and click the Extensions button in the Activity Bar on the left.
2. Install the AL Formatter and AL Language extensions.
3. Click File > Open Folder and select the folder containing your objects in .al format.
4. In the launch.json file, enter the connection details for your server instance (a sample launch.json is sketched after these steps).
5. Press Ctrl+Shift+P and run AL: Download Symbols. The symbols download successfully and VS Code is configured for extensions in NAV 2018.

Method 2:
1. Uninstall the AL Formatter and AL Language extensions (if they were installed from the Marketplace).
2. Open VS Code, go to Extensions under the View menu and choose to install from a VSIX file.
3. You will be prompted to select the .vsix file.
4. Navigate to C:\Users\<user name>\Downloads\CU 01 NAV 2018 NA\NAV.11.0.19846.NA.DVD\ModernDev\program files\Microsoft Dynamics NAV\110\Modern Development Environment, select the .vsix file and click OK.
5. In the launch.json file, enter the connection details for your server instance (see the sample below).
6. Press Ctrl+Shift+P and run AL: Download Symbols. The symbols download successfully and VS Code is configured for extensions in NAV 2018.
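The launch.json values in the original post are shown as a screenshot. As a rough sketch only, assuming a default on-premises NAV 2018 instance, the file typically looks something like this; the server URL, instance name, port, authentication method and startup object are placeholders for your own environment.

{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "al",
            "request": "launch",
            "name": "NAV 2018 local",
            "server": "http://localhost",
            "serverInstance": "DynamicsNAV110",
            "port": 7049,
            "authentication": "Windows",
            "startupObjectId": 22
        }
    ]
}

After AL: Download Symbols completes, the downloaded symbol packages appear in the .alpackages folder of the workspace.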
Converting NAV C/AL Objects into AL using TXT2AL converter tool
Introduction: When migrating from NAV 2017 to NAV 2018, i.e., moving from C/AL to AL, the existing objects do not need to be created again from scratch in AL. This blog demonstrates how we can create the AL objects from C/AL using Txt2Al from the command prompt.

Pre-requisites:
- NAV 2018.
- Backup of the objects in TEXT or FOB format.

Steps:
1. Import the objects into NAV 2018 from the .TXT or .FOB files. In the NAV Development Environment go to File > Import, choose the .FOB or .TXT file and replace all the objects.
2. Export the objects in the new syntax to .TXT files.
- Create a directory to store the objects that will be exported in the new syntax; here 'source' is the name of the directory used ('mkdir source').
- Create a directory to store the .AL files that will be created by the TXT2AL converter utility; here 'target' is the name of the directory used ('mkdir target').
- Export the objects in .TXT format in the new syntax (a sample command is sketched at the end of this post), where: Source Directory = 'source', Filename = 'CU0-5.txt', Database Name = 'Demo Database NAV (11-0)', SQL Server Instance = '.\NAVDEMO', Type of Object = 'codeunit', ID = '50000..50005' (the ID range).
- Similarly, for Tables: Type of Object = 'table'; ID = '50000..50005'; filename = 'TAB0-5.txt'. Pages: Type of Object = 'page'; ID = '50000..50005'; filename = 'PAG0-5.txt'. Queries: Type of Object = 'query'; ID = '50000..50005'; filename = 'Query0-5.txt'. Reports: Type of Object = 'report'; ID = '50000..50005'; filename = 'REP0-5.txt'. Menusuite: Type of Object = 'menusuite'; ID = '50000..50005'; filename = 'MS.txt'. XMLport: Type of Object = 'xmlport'; ID = '50000..50008'; filename = 'XMLPort.txt'.
NOTE: TXT2AL requires objects exported in the new syntax.
3. Generate the .AL object files using the TXT2AL converter. To generate the .AL files from the .TXT files in the source folder, run the converter (sketched at the end of this post), where: Source Directory = 'source', Target Directory = 'target', and --rename is used to create a new .AL file per object and name it after the object's ID and name.

Conclusion: Thus, in this way, we can convert objects from C/AL to the AL format. Although the conversion is not 100% perfect, most of the redundant work can be avoided.
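The actual commands appear as screenshots in the original post. Under the parameter values listed above they would look roughly as follows; the C:\source and C:\target paths and the folder you run finsql.exe from are assumptions based on a default NAV 2018 installation.

REM Export codeunits 50000..50005 in the new syntax (run from the NAV 2018 client folder containing finsql.exe)
finsql.exe Command=ExportToNewSyntax, File=C:\source\CU0-5.txt, Database="Demo Database NAV (11-0)", ServerName=.\NAVDEMO, Filter=Type=Codeunit;ID=50000..50005

REM Convert the exported .txt files into .al files, one file per object
txt2al --source=C:\source --target=C:\target --rename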
Sharing Power BI Reports with Teams and Partners
Introduction: With the latest release of Power BI, a new feature has been added to share Power BI reports as well. Yes, you heard it right: we can now share Power BI reports, and not just dashboards, with teams and partners.

How it works:
1. Go to your workspace and open the report that needs to be shared.
2. Click the Share option.
3. Enter the email address and click Share. If the Send email notification option is checked, a notification is sent to the users.

Conclusion: This is a superb feature added to Power BI, allowing organizations to share reports as well as dashboards for detailed information.
SSIS Package Deployment Using the File System
Introduction: In this blog we will see how to deploy an SSIS package individually using file system deployment.

Steps:
1. Convert the current project to the package deployment model. After clicking Convert to Package Deployment Model, a compatibility check runs for all the packages in the project; click OK if everything passes. The icon and name of the solution change once you click OK on that screen.
2. Create your package, test it in Visual Studio, and save it.
3. Capture the path of the package, for example: "D:\PackageDeploy\Package.dtsx".

Deploying the package and configuring it in a SQL Agent job:
1. Create a new job.
2. Create a step in the job.
3. Set the step type to SQL Server Integration Services Package.
4. Set the package source to File system, since this is a package deployment model.
5. Provide the package path.
Note: While providing the package path, make sure SQL Server can reach the path, because the step runs inside the SQL Server service.
6. Execute the package and check the job status (a quick command-line check is sketched below).

Conclusion: With the above steps you should be able to deploy an SSIS package using the package deployment model. There are many more ways to deploy packages, such as using a manifest file or the project deployment model; stay tuned for upcoming blogs.
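Before scheduling the job, you can sanity-check that the package runs from the file system with dtexec; the path below is just the example path used above.

dtexec /File "D:\PackageDeploy\Package.dtsx"

If dtexec reports that the package executed successfully (exit code 0), the same package path should work from the SQL Agent step.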
Hide a Report Page in a Power BI Report
With the latest Power BI release, you can now hide pages in Power BI reports. A hidden page remains available while you are developing the report, but once you right-click the page tab and select Hide Page, that page is no longer available in Reading view. When you select the Hide Page option, the option shows as checked and the page tab is greyed out to indicate that it is hidden. When you publish the report to your workspace, the hidden page is not displayed in Reading view. So you can effectively hide pages in your Power BI report if you don't want users to see them; however, if a user has rights to edit the report, they can change this setting.
Xrm.Panel in D365
Overview: Xrm.Panel is a new addition to client-side scripting in Dynamics 365. This feature is still in preview in the D365 December 2016 update. The panel is a simple static area in the D365 web client that loads a web page inside it; perhaps the best use of it would be a web chat implementation.

Implementation: Simple – I've made a JS web resource with a function that calls the Xrm.Panel.loadPanel(url, title); method, and I've invoked it on change of a phone number. Here's the simple JS snippet I've written:

oAccountCustomization = {
    loadPanel: function () {
        "use strict";
        Xrm.Panel.loadPanel("https://cft89.crm.dynamics.com//WebResources/new_SamplePage", "Hello");
    }
};

It points to another HTML page hosted as a web resource in my own D365 instance. I then added this function to the OnChange event of the Phone Number field on the Account form, and saved and published the customization.

Seeing it work: Now, when I change the Phone Number field on the Account, a slider appears on the right-hand side of the page. You can click the arrow to bring it into focus. Here we go! The panel is now visible. Remember, this works only on the web client.
Multi-Factor Authentication for external users – SharePoint Online
Introduction: Many organizations use SharePoint Online in Office 365 as their content management system, and it is essential to protect data so that sensitive content does not slip into the wrong hands. This is where Multi-Factor Authentication comes in: we can enforce it through Azure AD for the tenant by creating a dynamic group for external users and then creating a conditional access policy that applies to SharePoint Online.

Creating a group for external users:
1. Log in to the Azure AD portal, go to Azure AD > Users and groups > All groups, and click New group.
2. Provide a name and description for the group and select the membership type (Dynamic User).
3. Click Add query, define the membership rule (a sample rule is sketched at the end of this post), and then click Create to make the group dynamic. It will take some time for the group to populate.
4. After the group is created, you need to apply conditional access to it.

Create a conditional access policy for SharePoint Online:
1. Log in to the Azure AD portal, go to Enterprise applications > Conditional Access, and click New policy.
2. Provide a name for the policy.
3. Under Assignments > Users and groups, select Include > Select users and groups > Select, and then choose the group the policy should apply to (the external users group).
4. Under Assignments, go to Cloud apps > Include > Select, and then choose the application (Office 365 SharePoint Online).
5. Under Conditions, set any conditions you want.
6. Under Access controls, go to Grant, select Grant access, and then choose Require multi-factor authentication.
7. Finally, toggle the Enable policy switch to On and click Create.

To verify that the policy is created, navigate to Conditional Access and check the policy name and whether it is enabled. Wait a few minutes for the policy to take effect; after that you can test it by sharing a document from SharePoint with an external user. It will ask for additional authentication.

Conclusion: In this way, you can create a conditional access policy and protect the sensitive data in your SharePoint Online. Hope this will be useful.
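The membership query itself is not shown in the original post. Assuming the goal is to target all guest (external) users, a minimal dynamic membership rule would be along these lines; in the simple rule builder this corresponds to property userType, operator Equals, value Guest.

(user.userType -eq "Guest")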
SQL Server 2017 | New Function: TRIM
The absence of a TRIM function has haunted SQL Server DBAs for ages; if you have been using SQL Server for a while, you will completely agree with me here. In this blog post we will see, in a few simple words, how the new TRIM function in SQL Server 2017 works. SQL Server DBAs and developers have always dealt with strings, and leading and trailing spaces often drive them crazy: queries may not perform at their best when strings carry extra spaces, which add nothing from a data comparison or storage point of view. In previous versions, developers had to wrap the string in two different functions, LTRIM and RTRIM, to get the necessary result. SQL Server 2017 introduces a new function, TRIM(), which works just like LTRIM and RTRIM together. When you run a script like the one sketched below, you can see that wrapping the string in TRIM removes both the leading and trailing spaces. TRIM is a combination of LTRIM and RTRIM and is only available from SQL Server 2017 onward; for earlier versions, LTRIM and RTRIM are available.
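The script in the original post is shown as an image; a minimal reconstruction of the same idea is:

-- SQL Server 2017: TRIM removes both leading and trailing spaces
SELECT TRIM('   Hello World   ') AS TrimmedValue;

-- Earlier versions: combine LTRIM and RTRIM for the same result
SELECT LTRIM(RTRIM('   Hello World   ')) AS TrimmedValue;

Both statements return 'Hello World'.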