
Category Archives: Blog

Create An Editable Grid View In PowerApps

Introduction: In this blog, we will learn how to create an editable grid view in PowerApps.

Steps:
1. Set up a gallery in your PowerApps app: Insert > Gallery > Vertical.
2. Add a data source to the gallery you added: go to Properties and select the data source you want.
3. Delete the labels from the gallery.
4. Add Text input controls to the PowerApps grid. I have added three Text input controls inside the grid.
5. For each text input box, set TextInput.Default = ThisItem.<fieldName>. For example: TextInput1.Default = ThisItem.Description
6. The output screen now shows the current values after adding the Default property.
7. You can change the field values here.
8. To save the changed value into the data source, set TextInput.OnChange = Patch(Products, ThisItem, { <fieldName>: TextInput.Text }). This changes and saves the value in CRM.
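Putting the two properties together, here is a minimal Power Fx sketch, assuming a Products data source with a Description column (illustrative names taken from the example above):

    // Default: show the current value from the record bound to this gallery row
    TextInput1.Default = ThisItem.Description

    // OnChange: write the edited value back to the record; Patch updates only the listed column
    TextInput1.OnChange = Patch(Products, ThisItem, { Description: TextInput1.Text })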


“What If” Parameter in Power BI

This blog explains how to use the "What If" parameter in Power BI Desktop. The What If parameter gives you the ability to transform your data dynamically, so you can demonstrate how your data changes under various scenarios. For example, how much revenue would you have if your products were priced at 5%, 10%, or 15% of the retail price? Another scenario would be a marketing mix that shows how profit would change with different investments in each channel, and how revenue would change if the company increased or decreased its budget.

How to use What If parameters in Power BI:
Step 1: Click the Modeling tab in the top ribbon.
Step 2: Click New Parameter (What If) in the ribbon.
Step 3: In the What If parameter window, provide details such as Name, Data Type, Minimum, Maximum, and Default number.
Step 4: Optionally, add a slicer.
Step 5: A table with a calculated measure is created, containing a generated series that spans the range of your parameter and a selected-value measure that changes as the parameter changes.

Let's do it practically. Scenario: you have a list of Azure usage details such as server name, VM name, and cost. The company would like a parameter that applies a usage percentage, so it can see the overall cost for each year under different usage values. The parameter should span from 0% to 50% in 5% increments. You can see how this parameter is created by viewing the new table. Once the What If parameter is created, the generated series looks like:

    Usages Percentage = GENERATESERIES(0, 0.5, 0.05)

And the selected-value measure looks like this:

    Usages Percentage Value = SELECTEDVALUE('Usages Percentage'[Usages Percentage], 0)

Both of these are created for you automatically.

Apply the parameter to your data: in this case, the company wants to apply the usage percentage to total cost, which can easily be done with a calculated measure:

    Usages = SUM(AzureUsages[TotalCost]) * 'Usages Percentage'[Usages Percentage Value]

The final result can be pulled into a clustered column chart or table, so the company can see how the cost is affected by the usage parameter as you slide it to different usage values. I hope this is helpful. Check out my other blog here: https://www.cloudfronts.com/embed-secure-power-bi-report-using-python-web-application-with-flask-in-visual-studio-2015/
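As a small extension (not part of the original post), a DAX measure that reports the total cost including the usage uplift could look like this, reusing the same table and measure names as above:

    -- Total cost scaled up by the selected usage percentage
    Total Cost with Usage =
        SUM(AzureUsages[TotalCost]) * (1 + 'Usages Percentage'[Usages Percentage Value])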


Top 20 Best Practices of Power BI

In today's business intelligence world, Power BI has become a favorite among many. In this blog, I am going to cover Power BI best practices that will assist you while developing Power BI reports.

Company logo: Use the company logo in the report background.

Data timestamp: Show the end user when the data was last refreshed.

Less use of scrollable pages: End users find reports with scrollable pages difficult to use. We do not recommend making a page scrollable unless necessary; instead of making the page scrollable, use bookmarks and the Selection pane.

Use basic report filters: Power BI provides visual-level, page-level, and report-level filters. Use them wisely as per your requirements.

Pull data from views, not tables: Importing data from tables in a SQL Server, MySQL, or Oracle database creates strong dependencies between the physical data model and the reporting engine. Because table structures change over time, it's best to pull relational data from views.

Filter before import: If you're importing data into Power BI instead of using a live connection, limit the amount of processing that happens inside the tool. Power BI has a limit on the amount of data that can be imported, so any step that avoids reaching that limit is a plus.

Narrow tables are faster than short, wide tables: If performance slows down as you add data, it is mostly due to wide tables. Power BI reacts much faster with narrow, long tables than with short, wide ones.

Remove unused fields: As you add more and more data, you will notice the .pbix file size increasing. One of the best and quickest ways to reduce the .pbix file is to remove any unused fields. How: click Edit Queries, select the table you want to remove fields from, and click Choose Columns.

Label all of your steps: As you modify the imported data, Power BI creates a history that allows you to seamlessly go back and remove any changes that might break the dataset. Labelling each of these steps also lets you easily remember what each one does.

Limit the visuals in dashboards and reports: Placing many visuals in a single report slows it down. Limit each report page to a maximum of eight widget visuals and one grid, keep each page to no more than 30 points (cards: 1, gauges: 2, charts: 3, maps: 3, grids: 5), and keep tiles to no more than 10 per dashboard.

Remove unnecessary interactions between visuals: By default, all visuals on a report page can interact with one another. Controlling and modifying this interactivity is key to optimal performance; by disabling unwanted interactions you reduce the number of queries fired at the backend and improve report performance.

Enable row-level security (RLS): With RLS, Power BI imports only the data the user is authorized to view, restricting user access to certain rows in a database depending on the characteristics of the user executing the query. For substantial performance gains, combine Power BI roles with roles in the backend, and test all roles before rolling out to production (see the sketch after this list).

Use Microsoft AppSource certified custom visuals: Power BI certified custom visuals are verified by Microsoft to have robust, well-performing code. These AppSource visuals have passed rigorous quality testing and are the only custom visuals that can be viewed in Export to PowerPoint and in email subscriptions.

Avoid hierarchical slicers: We recommend not using hierarchical slicers in reports. For better performance, use multiple filters for the hierarchy instead.

Categorize the data for Power BI reports: Provide data categorization for your Power BI reports (HBI, MBI, LBI). Power BI data classification raises user awareness of the security level required and helps you understand how reports should be shared inside and outside the organization. HBI (High Business Impact) data requires users to get a policy exception to share the data externally; MBI (Medium Business Impact) and LBI (Low Business Impact) data do not require any exceptions.

Use the on-premises data gateway: It is advisable, and one of the best practices, to use the on-premises data gateway (Enterprise Gateway) instead of the Personal Gateway, which takes data and imports it into Power BI. The Enterprise Gateway is more efficient when you work with large databases, as it imports nothing.

Use separate Power BI gateways for DirectQuery and Scheduled Refresh: Using the same gateway for scheduled data refresh and live connections slows down live-connection performance while a scheduled refresh is active. Create separate gateways for live connections and scheduled refresh to avoid such issues.

Test each custom visual on a report to ensure fast report load times: The Power BI team does not thoroughly test custom visuals that are not certified, so custom visuals might perform poorly when handling large datasets or complex aggregations. What should you do when the chosen visual …
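To make the RLS recommendation concrete, here is a minimal sketch of a DAX role filter, assuming a hypothetical Sales table with a SalesRepEmail column; the filter is defined under Modeling > Manage roles in Power BI Desktop:

    -- Row filter on the Sales table: each user sees only their own rows
    [SalesRepEmail] = USERPRINCIPALNAME()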


Secure Input/Output in Power Automate Run History

Isn't it just too easy to see from a Flow's (Power Automate's) run history what data was passed on? A simple switch in the Power Automate designer will secure this.

Default behavior: By default, if you have access to the Flow, you can simply go in and see the inputs.

Secure Input / Output: In the Flow designer, you can select a step, go to Settings, and turn on Secure Inputs and/or Secure Outputs, depending on what you want. This feature is still in preview as of the day of writing this post. Once it is active, it is denoted by a lock symbol on the step you enabled it on. Now, when you try to look at the data in the run history, the information is hidden.

Note: Please note that this applies only to run-history records created after the setting was turned on. Previous records will continue to show the data.

Hope this helps!!
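For reference, this designer switch maps to the secureData runtime configuration on the action in the underlying Logic Apps style workflow definition. A minimal sketch of what the setting looks like in the flow definition JSON (the action name is illustrative, and the shape is an assumption based on the documented Logic Apps setting):

    "Compose": {
      "type": "Compose",
      "inputs": "@triggerBody()",
      "runtimeConfiguration": {
        "secureData": {
          "properties": [ "inputs", "outputs" ]
        }
      }
    }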


Find hidden entities in CRM using the Metadata Document Generator from XrmToolBox

Problem Statement: I had a requirement where I needed to check the fields on the Post entity; however, in CRM customizations I could not find the Post entity, even though it was visible in Advanced Find.

Solution: To view all the data related to a hidden entity, you can use the Metadata Document Generator in XrmToolBox:
1. Connect to your environment in XrmToolBox and search for Metadata Document Generator.
2. In Metadata Document Generator, click Retrieve Entities and Languages.
3. Select the entity you want to generate metadata for, enter the file path of the Excel document where the data should be stored, and click Generate Document.
4. Open the document; you will be able to see all the data related to the entity.

Conclusion: The XrmToolBox Metadata Document Generator is helpful when an entity cannot be viewed in CRM customizations.
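As an alternative to the XrmToolBox UI, the same metadata can be retrieved programmatically with the Dynamics 365 SDK. This is a minimal sketch, not from the original post: it assumes an already-connected IOrganizationService and uses the standard RetrieveEntityRequest message.

    using System;
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Messages;
    using Microsoft.Xrm.Sdk.Metadata;

    // Sketch: list the attributes of the (hidden) Post entity
    static void ListPostAttributes(IOrganizationService service)
    {
        var request = new RetrieveEntityRequest
        {
            LogicalName = "post",                     // logical name of the Post entity
            EntityFilters = EntityFilters.Attributes, // retrieve attribute metadata only
            RetrieveAsIfPublished = true
        };

        var response = (RetrieveEntityResponse)service.Execute(request);
        foreach (AttributeMetadata attribute in response.EntityMetadata.Attributes)
        {
            Console.WriteLine($"{attribute.LogicalName} : {attribute.AttributeType}");
        }
    }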


RSAT (Regression Suite Automation Tool) implementation and configuration for Finance and Operations

Purpose
The Regression suite automation tool (RSAT) significantly reduces the time and cost of user acceptance testing. This tool enables functional power users to record business tasks using the Finance and Operations Task recorder and convert these recordings into a suite of automated tests, without the need to write source code. Test libraries are stored and distributed in Lifecycle Services (LCS) using Business Process Modeler (BPM) libraries. These libraries are also fully integrated with Azure DevOps Services for test execution, reporting, and investigation. Test parameters are decoupled from test steps and stored in Microsoft Excel files.

Prerequisites
1. A Dynamics 365 for Finance and Operations test environment (demo or tier-2 UAT environment).
2. Microsoft Excel.
3. Azure DevOps: you will need an Azure DevOps Test Manager or Test Plans license. For example, if you have a Visual Studio Enterprise subscription, you already have a license to Test Plans. Pricing: https://azure.microsoft.com/en-us/pricing/details/devops/azure-devops-services/ For a demo environment, you don't need to buy any license.
4. Authentication certificate: to enable secure authentication, RSAT requires a certificate to be installed on the RSAT client computer. The RSAT settings dialog box allows you to automatically create and install the authentication certificate.

Installation
Download Regression Suite Automation Tool.msi to your machine. RSAT requires Selenium and web browser driver libraries; RSAT will prompt you if needed libraries are missing and will automatically install them for you.

Configuration for RSAT
1. Open the RSAT application and select the Settings button in the upper right to configure RSAT. The next steps will help you find the required field inputs.
2. Go to the LCS project settings for your project, then go to Visual Studio Team Services. Here you need to mention the Azure DevOps project in the Azure DevOps site URL field. To do that, go to https://www.visualstudio.com, open Azure DevOps, and create a new organization if there is not an existing one. Then create a new project.
3. Set up a personal access token by clicking Account info > Security. Once you create the token, save it, as you will not be able to access it again when you want to use it.
4. Go back to the main page and create a new test plan. Name it RSAT-TT (or use any name). Right-click RSAT-TT and create a new suite; you can name it 'Demo'. The Azure DevOps setup is now done.
5. In the Azure DevOps site URL field, mention the organization name that you set up in Azure DevOps, and in the Personal access token field paste the token that you saved earlier. Click Continue to select the project, then continue and save.
6. Now deploy to the environment: open the Regression Suite Automation Tool, go to Settings, copy the Azure DevOps URL value from LCS, and use the security token you copied as the access token. Click Test connection so that the Project name and Test plan fields populate.
7. Now run the VM. You will find the Hostname and SOAP Hostname by going to IIS and right-clicking AOSService > Edit Bindings. Copy both values and paste them into the Hostname and SOAP Hostname fields. The admin username should be the username you use to log in to your environment.
8. To generate a thumbprint, click New, save the certificate at any location, and then copy the generated certificate to the VM. Open the copied certificate and install it on the local machine to both the Personal and Trusted Root Certification Authorities stores.
9. Open the wif.config file in Notepad (run as administrator) from the given location on the VM. In the file, check whether an authority with CN name=127.0.0.1 exists. If not, copy the authority block, paste it below the same authority block, and modify those lines as follows:

    <authority name="CN=127.0.0.1">
      <keys>
        <add thumbprint="F46D2F16C0FA0EEB5FD414AEC43962AF939BD89A"/>
      </keys>
      <validIssuers>
        <add name="127.0.0.1"/>
      </validIssuers>
    </authority>

(Note: add the thumbprint of the installed certificate to the wif.config file as shown.)
10. Final steps: copy the thumbprint from RSAT settings (generated when you clicked New) and paste it into the wif.config file on your VM. Then mention the company name and working directory, set the default browser to Internet Explorer, save, and click OK.
11. Next, go to LCS, open Business Process Modeler, and create a new library. Name it RSAT, go to Edit, and rename the process as required; you may add a child node to it by clicking Add process.
12. Now go to Finance and Operations and open the Task recorder. Create a recording by clicking Create recording, perform the operation, and then click Stop. Name it as per your need, then choose Save to Lifecycle Services (or Save to this PC) and click OK.
13. Go back to LCS, open the project library, click the Requirements tab, and check that it is syncing. Then sync test cases and run VSTS sync.
14. Next, go to Azure DevOps > Test cases, click Add existing, then click Run query and Add test case.
15. Now go back to the Regression Suite Automation Tool, load the test, and download the test cases. Select the test, click New, and generate the test execution parameter files. Click the Edit option to edit the parameter values in Excel (the steps differ slightly between older and newer versions). Edit the metadata for the test in the Excel file, then save and close.
16. Run the test. After this step, the test session is driven by Selenium, and the browser performs the steps of the test case. After the test completes successfully, click Upload (note the result as Passed).


How to Enable/Disable Maintenance Mode Using a SQL Query in Dynamics 365 for Finance & Operations

In Finance and Operations, to make any changes in the License configuration form you need to enable maintenance mode, and after making the desired changes you need to disable this mode. You can enable or disable maintenance mode using a SQL query as well as from the command prompt. In this blog, we perform this operation using a SQL query.

The following are the steps to enable/disable maintenance mode:
1. Open SSMS (Microsoft SQL Server Management Studio) on your server.
2. Click New Query and run the following query to enable maintenance mode:

    UPDATE dbo.SQLSYSTEMVARIABLES SET VALUE = 1 WHERE PARM = 'CONFIGURATIONMODE'

3. You can verify the status using the following command:

    SELECT * FROM [AxDB].[dbo].[SQLSYSTEMVARIABLES]

4. After this, restart IIS; in some cases, you need to restart the server. Then perform your changes in the License configuration form.
5. After the desired changes are made, you can disable maintenance mode using the following command:

    UPDATE dbo.SQLSYSTEMVARIABLES SET VALUE = 0 WHERE PARM = 'CONFIGURATIONMODE'

6. You can again verify the status using the same query in step 3, and again repeat step 4. I hope this blog will help you.


Attach a custom generic event to lookup fields (one or multiple similar fields) in D365 Portals


Sometimes we get requirements with multiple lookup fields on the same entity form on D365 Portals, and we may have to perform some operations on the click of the search button on these lookup fields. It is not possible to achieve this without writing JavaScript or jQuery code; meanwhile, we also have to make sure that the click event is generic (a single event working for all similar lookup fields). This blog will guide you through attaching a generic click event to all the similar lookup fields using jQuery. Below is sample code for the same.

    $(document).ready(function () {
        // Attach a click handler to the lookup search button of every field
        // that carries the custom "genericContact" CSS class
        $(".genericContact").parent().find("button[title='Launch lookup modal']").each(function () {
            $(this).click(function () {
                // your code here
            });
        });
    });

In the above code, the selector genericContact is a custom CSS class added to all the similar lookup fields on the entity form. To see how to add a CSS class to any attribute, click here.
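As a hypothetical example of what the handler body might do, the variant below tries to identify which lookup triggered the event. The DOM traversal (nearest table cell, backing hidden input) is an assumption about the portal markup, so adjust it to your page:

    $(document).ready(function () {
        $(".genericContact").parent().find("button[title='Launch lookup modal']").each(function () {
            $(this).click(function () {
                // Assumption: the lookup's backing input sits in the same table cell as the button
                var attributeId = $(this).closest("td").find("input").first().attr("id");
                console.log("Lookup search clicked for: " + attributeId);
            });
        });
    });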


Supply Chain Management (SCM) with Microsoft Dynamics 365 Business Central

With Supply Chain Management based on Dynamics 365 Business Central (formerly Dynamics NAV), mapping processes along the entire supply chain is highly efficient. Microsoft Dynamics 365 Business Central ensures smooth processes in purchasing, sales, manufacturing, logistics, and warehousing. You can use Supply Chain Management (SCM) to adjust and control workflows customized to specific requirements. Supply Chain Management in Dynamics 365 Business Central can be configured flexibly and helps you integrate all business partners in the best possible way. For example, procurement, processing, and delivery times can be reduced and optimized.

SCM functions in Dynamics 365 Business Central:

Optimize inventory levels: Use built-in intelligence to predict when and what to replenish. Purchase only what you need with dynamically updated inventory levels.

Avoid lost sales and reduce shortages: Maintain the right amount of inventory by automatically calculating stock levels, lead times, and reorder points. Suggest substitutes when requested items are out of stock.

Maximize profitability: Get recommendations on when to pay vendors to use vendor discounts or avoid overdue penalties. Prevent unnecessary or fraudulent purchases through approval workflows.

Sales order management: Manage sales orders, blanket sales orders, and sales order processes.

Purchase order management: Manage purchases, blanket orders, and purchase order processes.

Warehouse management (basic and advanced): Warehouse functionality in Business Central can be implemented at different complexity levels, depending on a company's processes and order volume. The main difference is that activities are performed order by order in basic warehousing, whereas they are consolidated for multiple orders in advanced warehousing.

Item transfers: Track inventory as it's moved from one location to another and account for the value of inventory in transit at various locations.

Locations: Manage inventory in multiple locations that may represent a production plant, distribution centre, warehouse, showroom, retail outlet, or service car.

Assembly management: To support companies that supply products to their customers by combining components in simple processes, without the need for manufacturing functionality, Business Central includes features to assemble items that integrate with existing features such as sales, planning, reservations, and warehousing.


Dynamics 365 Business Central for the service sector

Maximize the efficiency of your customer service. Gain a complete overview of the tasks and workloads in your service department to efficiently allocate resources and accelerate responses to requests. With an increasing number of different business contacts, it is necessary to handle them professionally in order to build extensive customer trust through constructive communication and reliable processing of service cases.

Service Management functions in D365 Business Central:

Manage forecasting to fulfilment: Use sales forecasts and expected stock-outs to automatically generate production plans and create purchase orders.

Run your warehousing efficiently: Get a holistic view of inventory for efficient order fulfilment. Track every item transaction and movement by setting up bins based on warehouse layout and storage unit dimensions.

Reach optimal output levels: Calculate and optimize manufacturing capacity and resources to improve production schedules and meet customer demands.

Service orders: Register your after-sales issues, including service requests, services due, service orders, and repair requests.

Service item management: Record and keep track of all your service items, including contract information, component management, BOM reference, and warranty information.

Service contract management: Record details on service levels, response times, and discount levels, as well as the service history of each contract, including used service items, parts, and labour hours.

Planning: Assign personnel to work orders and log details such as work order handling and work order status.

Dispatching: Manage service personnel and field technician information, and filter according to availability, skills, and stock items.

Service price management: Set up, maintain, and monitor your service prices.

