Reconfigure Voice of the Customer
Introduction: If the Voice of the Customer solution is accidentally deleted or misconfigured, it might not be able to receive survey responses. You can repair or restore the Voice of the Customer configuration. The steps for restoring the configuration in the web client and in the Voice of the Customer app are listed below.

Web client:
1. Sign in to Dynamics 365.
2. Go to Settings > Voice of the Customer Configurations.
3. Select Configure from the toolbar at the top of the screen.

Voice of the Customer app:
1. Open the Voice of the Customer app.
2. Go to Settings > Configurations.
3. Select Configure from the toolbar at the top of the screen.

Hope you find this helpful!
How to send Email Notification to Users using Workflow in Dynamics NAV
Introduction: In an approval workflow, we often need to notify the approver by email that an approval request has been sent, and the sender also needs to be notified whether the approval request was approved or cancelled. To do this, we need to set up email notifications.

Pre-requisites:
Microsoft Dynamics NAV 2016
Office 365 account

Steps:
1. Set up SMTP Mail Setup
Navigate to SMTP Mail Setup from the Search bar. Click Apply Office 365 Server Settings. Enter the User ID and Password (this is the sender's email ID, usually the company email).

2. Set up the Notification Template and Notification Setup
Navigate to Notification Template and set the Notification Method to E-mail for the type Approval. Navigate to Notification Setup and, for the Notification Type Approval, set the schedule to Instantly. It can also be scheduled Daily, Weekly, or Monthly.

3. Set up emails in the Approval User Setup
In the Approval User Setup, enter the email ID of each user in the E-mail field.

4. Enable the Mail Notify job queue
Navigate to the Job Queue and enable the Notify job queue entry. The Mail Notify job queue is a standard job queue. When a workflow with the response "Send approval request for the record and create a notification" is triggered, a Notification Entry is created in the Notification Entries table. This job fetches records from Notification Entries and sends an email to the recipient. The sent email can be viewed on the Sent Notification Entries page.
Note: Delete any pre-existing Notification Entries for users with no email ID, or entries created before the emails were set up.

5. Sending an email to the approver from the workflow
I have enabled the standard Purchase Quote approval workflow. Open a Purchase Quote and click Send Approval Request. The status of the Purchase Quote changes to Pending Approval, and a notification email is sent to the approver.
Reading more than 10K records in D3FOE OData API
Introduction: Dynamics 365 Finance and Operations has a limitation of 10K records that can be fetched at a time through a data entity. While reading records from D3FOE for CRUD operations through the OData API, you will face this issue if you want to process more than 10K records at a time. You can work around it by using the query parameters '$skip' and '$top'.

Use the syntax below to query records while reading from D3FOE:

DataServiceQuery<[EntityName]> EntityObject = context.[EntityName].AddQueryOption("$skip", 10000).AddQueryOption("$top", 10000);

where:
EntityName = name of the data entity you want to read records from
$skip = skips the specified number of records from the fetched records
$top = selects the specified number of records after skipping

Example: If there are 30K records in CustCustomers, the line below will skip 10K records and then pick the next 10K records, so the records selected will be 10001 to 20000.

DataServiceQuery<CustCustomers> EntityObject = context.CustCustomers.AddQueryOption("$skip", 10000).AddQueryOption("$top", 10000);

You can also get the total count of records and then run a loop until the maximum count is reached, incrementing $skip by 10K on each iteration while keeping $top at 10K. Using this approach you can read any number of records; see the sketch below.
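Below is a minimal sketch of that paging loop in C#, following the AddQueryOption pattern shown above. The generated context type name (Resources), the CustCustomers entity set property, and the wrapping class and method are illustrative assumptions; adjust them to your own generated OData client.

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.OData.Client;

// Sketch: page through all CustCustomers records 10K at a time.
// 'Resources' is assumed to be the generated OData service context,
// already authenticated against the D3FOE endpoint.
public static class CustomerReader
{
    public static List<CustCustomers> GetAllCustomers(Resources context)
    {
        const int pageSize = 10000;
        var allRecords = new List<CustCustomers>();
        int skip = 0;

        while (true)
        {
            // Fetch one page of records using $skip and $top, as in the example above.
            List<CustCustomers> page = context.CustCustomers
                .AddQueryOption("$skip", skip)
                .AddQueryOption("$top", pageSize)
                .ToList();

            allRecords.AddRange(page);

            // The last page returns fewer records than the page size; stop there.
            if (page.Count < pageSize)
                break;

            skip += pageSize;
        }

        return allRecords;
    }
}
```

The loop simply keeps advancing $skip by the page size until a page comes back smaller than 10K, which avoids having to query the total record count up front.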
Paging in D365 Customer Engagement v9.0
Introduction: The Xrm.WebApi.retrieveMultipleRecords method is used to retrieve a collection of records in Dynamics 365 Customer Engagement. In this blog we will demonstrate how to use paging to fetch more than 5,000 records. When we fetch records through code in CRM we only get the first 5,000 records; in some cases more than 5,000 records need to be fetched, and we can achieve this using paging. For demonstration purposes we will fetch 3 records per page so that we can see how the paging functionality works in D365 v9.0.

Implementation:

Step 1: The syntax is as shown below:

Xrm.WebApi.retrieveMultipleRecords(entityLogicalName, options, maxPageSize).then(successCallback, errorCallback);

In the options parameter we specify the query; in our example we will fetch all the accounts in the system. In the maxPageSize parameter we specify the number of records to be returned per page; by default the value is 5000. In this example we set maxPageSize to 3, which returns 3 records per page. Because the total number of records being fetched is more than 3, the nextLink attribute is returned with the link to fetch the next set of records. The value of the nextLink attribute is already encoded. Before we pass the link to fetch the next set of records, we have to make sure to set only the query portion in the options parameter. We also store all the values returned in a separate variable so that they can be used later.

Step 2: The code is shown below. The allaccounts variable will hold all the accounts fetched at the end, as we keep concatenating the received results.

Code:

var query = "?$select=name";
var allaccounts = [];
var scripting = {
    retrieveMultipleContacts() {
        var url = Xrm.Page.context.getClientUrl() + "/api/data/v9.0/accounts";
        Xrm.WebApi.retrieveMultipleRecords("account", query, 3).then(
            function success(result) {
                // Append this page of records to the running list.
                allaccounts = allaccounts.concat(result.entities);
                if (result.nextLink != undefined) {
                    console.log("Next page link: " + result.nextLink);
                    // Keep only the query portion of the nextLink for the next call.
                    query = result.nextLink.split(url)[1];
                    scripting.retrieveMultipleContacts();
                }
            },
            function (error) {
                console.log(error.message);
            }
        );
    }
};

Step 3: To test this, we can trigger the code on the change of a form field, and while debugging we can check the values returned and stored in the allaccounts variable.

Conclusion: The new D365 v9.0 Xrm.WebApi.retrieveMultipleRecords method simplifies the whole process of fetching records using paging.
Set up Dynamics 365 connection in Microsoft Social Engagement
Introduction: This blog explains how to set up a Dynamics 365 connection in Microsoft Social Engagement.

Steps to be followed:
1. Go to Settings.
2. Under the Connections tab, go to Microsoft Dynamics 365.
3. Click on + to create a connection and then click Accept.
4. Select the connection type.
5. Click CHECK INSTANCES to load the instances, or enter the URL of the instance in the Dynamics 365 instance URL box.
6. Finally, give your connection a name. If you want to make this your default connection, turn on the Set as default option.
7. Save.

Note: You can only connect to instances that are in the same Office 365 tenant.
Voice of the Customer failed to install
Introduction: Many people face issues installing the Voice of the Customer solution on a v9 environment and try repeatedly to install it without success. Below is the error that people usually get.

How to install: Installing Voice of the Customer is an easy process; if you are facing an error, you might be doing something wrong. The steps to install the Voice of the Customer solution are listed below. The Voice of the Customer for Dynamics 365 solution can be installed from the Dynamics 365 Administration Center and is compatible with Dynamics 365 version 8.2 and later.

1. Sign in to your Dynamics 365 instance as an administrator.
2. Go to the Dynamics 365 Administration Center, and then select the Applications tab.
3. Select the application row titled Voice of the Customer, and then select Manage.
4. From the Dynamics 365 Instance drop-down list, select the instance where you want to install the solution.
5. Accept the license terms.
6. Select Install.

By following the above steps you will see the solution get configured properly.

Note: You must be a tenant administrator to install the solution. The list of instances only displays organizations on Dynamics 365 version 8.2 or later.
Scribe Insight AX as a Web Service Find Block issue
Introduction: If we need to look up a value from AX, we do it using a Find block in Scribe Insight, e.g. BasicHttpBinding_ItemService_find. Every Find block has two components:
Query Criteria – used for specifying the table name, field name, and field value for finding the record.
Return Value – used to fetch the needed value back.
If either component is missing, you cannot look up the required value in AX. Consider the following scenario, where we have a SalesOrderService Find block with only the Query Criteria component and no Return Value component. The steps to resolve the issue are listed below.

Steps:
1. Go to your AX Web Service connection and click Edit.
2. Click Change Connection.
3. Proceed by clicking OK, then click on your connection and click Edit.
4. Select the Configuration tab at the top.
5. Select the Find block under Method that is not showing the required Return Value; here that is the BasicHttpBinding_SalesOrderService_find method.
6. Make sure that the values of QueryCriteria_CriteriaElement and ReturnValue_SalesTable are both 1.
Note: If you want to look up with additional parameters, increase the QueryCriteria value.
7. Validate your Web Service connection and restart your DTS.
8. Your issue should be resolved and the Return Values should be visible.

Conclusion: You should now be able to look up and find a value from any of the Return Values in the Sales Order.
Using Variable Connector In TIBCO Cloud Integration
Introduction: The Variable Connector, created as part of the Scribe Labs initiative, adds a much-needed feature to TIBCO Cloud Integration: the ability to store and retrieve variables in a Scribe map. Keep in mind, however, that these variables cannot be shared between maps or solutions.

Steps for installation:
1. To begin using the connector, install it from the Scribe Marketplace. Go to the Marketplace.
2. Search for 'Variables'.
3. Select Scribe Labs – Variables and click Install.
4. The connector will be installed for your organisation in a few minutes.

Steps to create a Variable connection:
1. From the 'More' dropdown menu, click on 'Connections'.
2. Click the plus sign (+) on the right of the page to add a new connection.
3. Select your connector type, enter the name of the variable connection, and select the agent.
4. Click OK.

Steps to use in a map:
1. Add the variable connection to your map.
2. To store a value in a variable, select the Upsert block.
3. In the General tab, select the data type of the variable you want to store from the Entity dropdown menu.
4. In the Fields tab, enter the name of the variable in the 'name' field and the data you want to store in the variable in the 'val' field. Click OK.
5. To retrieve a variable, use the Lookup block from the variable connection.
6. Select the data type of the variable in the Entity tab.
7. In the Lookup Criteria tab, look up the name of the variable you had set.
8. Select the 'val' field in the Field List tab. Click OK.
9. You can now use the data stored in the 'val' field of the variable in your map.

Conclusion: I hope this helps you understand the usage of the Variable Connector in TIBCO Cloud Integration. This feature is very useful when you need the functionality of a variable while using TIBCO Cloud Integration.
Generating Azure Blob file SAS key using Azure SDK
With a demand for a concrete strategy to determine how applications, workloads, and data remain available during downtime, organizations need a disaster recovery and business continuity strategy, which is provided by Microsoft Azure. Azure BCDR covers people, communication, and transportation, and includes physical facilities and information technology. Even the smallest outage can prove to be a major setback for your business, and with a business continuity plan and a disaster recovery system, all your major information technology systems can be protected without the expense of a secondary infrastructure. With a BCDR strategy in place, the organization's workloads and applications are kept up and running in the event of an outage. It is an effective, cloud-based data recovery solution that is simple to implement and cost effective.

Why do we need a SAS key for a blob file?
Azure Blob is massively scalable object storage for unstructured data. Blob Storage can handle all your unstructured data, scaling up or down as your needs change. You no longer have to manage it; you only pay for what you use, and you save money compared to on-premises storage options. A shared access signature (SAS) provides a way to grant limited access to objects in your storage account to other clients without exposing your account key. A SAS gives you granular control over the type of access you grant to clients who hold the SAS, including:
The interval over which the SAS is valid, including the start time and the expiry time.
The permissions granted by the SAS. For example, a SAS for a blob might grant read and write permissions to that blob, but not delete permissions.
An optional IP address or range of IP addresses from which Azure Storage will accept the SAS. For example, you might specify a range of IP addresses belonging to your organization.
The protocol over which Azure Storage will accept the SAS. You can use this optional parameter to restrict access to clients using HTTPS.

Using the SDK to generate a SAS key from C# code:
You need to include the Azure SDK DLLs. You can get them from the NuGet package manager as well; search NuGet for:
WindowsAzure.ConfigurationManager
WindowsAzure.Storage

The first step is to connect to Blob storage using a connection string. You can add the blob connection string to App.config and use it in code to create the connection. Your connection string should follow the standard storage account connection string format (a sample is included as a comment in the sketch at the end of this post). The code below creates the connection:

//Parse the connection string and return a reference to the storage account.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
//Create the blob client object.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

Once the connection is created, we can retrieve the blob container which contains the blob. We can then use the container reference, the blob file name, and the connection to create the SAS key (a sketch of this step is shown at the end of this post). I have uploaded the entire code to my GitHub as well for reference.

Why we can't use the Azure Storage Explorer tool to do this:
Azure Storage Explorer also has the option to connect to your blob and generate a SAS key with a clean user interface. But for the last 2-3 months I have been facing issues generating the key with Azure Storage Explorer, which is why I implemented this code to remove the dependency on the tool.
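Here is a minimal sketch of the SAS generation step using the classic WindowsAzure.Storage SDK referenced above. The container name, blob name, and the one-hour read-only policy are illustrative assumptions; adjust them to your own storage account and access requirements.

```csharp
using System;
using Microsoft.Azure; // CloudConfigurationManager (WindowsAzure.ConfigurationManager package)
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class BlobSasExample
{
    static void Main()
    {
        // App.config is assumed to contain a setting like:
        // <add key="StorageConnectionString"
        //      value="DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>" />
        CloudStorageAccount storageAccount =
            CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

        // Get a reference to the container and the blob (names are placeholders).
        CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
        CloudBlockBlob blob = container.GetBlockBlobReference("myfile.txt");

        // Define the SAS constraints: read-only access, valid for one hour.
        var sasPolicy = new SharedAccessBlobPolicy
        {
            SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5), // allow for clock skew
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1),
            Permissions = SharedAccessBlobPermissions.Read
        };

        // Generate the SAS token and append it to the blob URI to get a shareable link.
        string sasToken = blob.GetSharedAccessSignature(sasPolicy);
        string blobSasUri = blob.Uri + sasToken;

        Console.WriteLine(blobSasUri);
    }
}
```

The resulting URI can be handed to a client, which can then read the blob directly over HTTPS until the expiry time, without ever seeing the account key.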
D365 PSA: Impact of Project Closure on Bookings
Introduction: Let's look at a high level at the impact Project Closure has on the Bookings on the Schedule Board. While working in D365 PSA, I needed to mark a Project as Completed even though I had some bookings in the future. This led me to notice its effect on the Bookings (Bookable Resource Booking records) I had created, which also span a little into the future.

Project Booking for a Resource: Consider the scenario below. Brian is booked on the Archer Pens Project as follows:
1. 3 and 5 hours (8 hours) on 9th July 2018
2. 10 hours on 10th July 2018
3. 10 hours on 11th July 2018
4. 5 hours on 12th July 2018
(Screenshots of the Schedule Board show the bookings for 9th, 10th, 11th, and 12th July 2018.)

Now, suppose I mark the Project as completed today, i.e. 10th July 2018, while the End Date is tomorrow, i.e. 11th July 2018. Let's see what happens. I marked the Project as Completed (moved past the Complete stage in the Business Process Flow).

Deletion of Bookings: Once I mark the Project as Completed, the Bookings for today, 10th July, and all future Bookings are deleted. Only historic Bookings before today remain in the system. If I check back:
10th July has no Bookings.
11th July has no Bookings.
12th July no longer has any Bookings either.
What remains is only the Booking for 9th July, because it falls before the date on which I closed the Project.

Note: I also checked whether today's Bookings that end before the time I actually closed the Project are treated differently. They are not: for a Booking ending at 10 am, closing the Project later at 12 pm still deletes that day's Booking. So close Projects carefully if you feel you need to close them in advance, just in case.
