Resolve Dependencies Between Multiple Solutions in D365 Customer Engagement / CRM Using Solution Components Mover
You might be wondering why we would need to move components from one solution to another in D365 Customer Engagement. Consider a scenario: you and your team are working on D365 CRM customization and have created two solutions, 'Solution A' and 'Solution B'. When you try to move 'Solution A' to the Production instance, the import fails because some of its missing components are present in 'Solution B'. So you decide to move 'Solution B' first, but that import also fails, because some of its missing components are present in 'Solution A'. In other words, 'Solution A' and 'Solution B' depend on each other, and you cannot move either of them to the Production or target environment.

There are two solutions to this problem:
1. Add the missing components to one solution and move that solution to Production.
2. Merge the dependent solutions into one solution using Solution Components Mover.

The first option is time-consuming and effort-intensive: developers need to track down all the missing components and add them manually. Using Solution Components Mover, you can merge solutions in 10 to 15 minutes, just by selecting the components from the source solutions and the target solution to which they should be moved. So, let us see how to do it.

Prerequisites: XrmToolBox. You can download XrmToolBox from https://www.xrmtoolbox.com/

Steps to follow:
1. Open XrmToolBox and connect to your D365 CRM environment.
2. Search for the plugin "Solution Components Mover".
Image: Search Solution Components Mover in XrmToolBox
3. Once the plugin has loaded, click on Load Solutions; it will load all the solutions present in the environment.
Image: Click on Load Solution
After the solutions are loaded, you can see I have two solutions in my environment, "Solution A" and "Solution B", which have dependent components, and one "Target Solution" into which I am going to copy the components, so that "Target Solution" becomes the master solution.
Solution A
Image: Solution A has the Account entity and its subcomponents
Solution B
Image: Solution B has the Case entity and its subcomponents
Target Solution
Image: Target Solution doesn't have any entity or component.
4. Move the solution components by selecting the source solutions and the target solution.
5. Click on Copy Components; a popup will open where you can select the component types to move to the target solution.
6. Click on OK, and the components from both solutions will be moved to the Target Solution. You can see in the following screenshot that the Target Solution now has components from both "Solution A" and "Solution B".

You can see how this XrmToolBox plugin helps reduce the time and effort required to move components from solution to solution manually, one by one.
How to Reset an Environment in D365 CRM
Introduction: In this blog, we will see how to reset a D365 CRM environment. The client's requirement is only the Sales app, but what if we installed all the apps while provisioning the environment?

Solution: We can reset the environment. In this scenario, our client requires only the Sales app, and we have installed all the apps.

Step 1: Go to portal.office.com -> Admin center -> Dynamics 365. We can see that our environment is provisioned as Production, and there is no option to reset it.
Step 2: Now go to https://admin.powerplatform.microsoft.com/ -> Environments -> select the environment -> Edit, change the type to Sandbox, and save. Once we save, we can see the Reset option.
Step 3: Click on RESET.
Step 4: When we click on Reset, a form opens with options to redeploy apps. Select the apps you want to redeploy; in our scenario the requirement is the Sales app only, so we selected only the Sales app. Click on Reset and confirm.

Result: Hope this helps you keep only those apps that you need!
How to connect Power BI to MySQL
In this blog we'll see how to connect Power BI to MySQL. Before connecting, make sure you have the MySQL connector installed on your machine. Follow these steps to connect to MySQL:
1. Open Power BI Desktop and select Get Data > Database > MySQL database.
2. For example, if you have a local MySQL server running, connect to it by entering "localhost" or the name of the local machine as the server, enter the name of the database, and press OK.
3. If you get the error "DataSource.Error: Object reference not set to an instance of an object.", go to File > Options and settings > Data source settings, edit your data source, and set the credentials to database credentials, not Windows credentials.
4. You can get data from MySQL views, or use the "SQL statement (optional)" field and enter a SQL query (a hypothetical example appears at the end of this post).
In the next blog we'll see how to connect Power BI to MySQL when MySQL is running in the cloud.
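For reference, here is the kind of query you might enter in the "SQL statement (optional)" box; the table and column names below are made-up examples, not anything from your database:

-- Hypothetical table and columns; replace with your own schema
SELECT customer_id, customer_name, city
FROM customers
WHERE city = 'Mumbai';

Pushing a query like this down to MySQL means Power BI imports only the rows and columns you actually need.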
Steps to Configure Environments through Life Cycle Services (LCS)
Configuration of an environment through LCS. After purchasing the licence, log in to LCS with the admin account. You will see a link to complete the environment setup. Before configuring the environment, a few prerequisites need to be completed:
1. Declare the project milestones. Click on Setup milestones, enter the end date for each milestone, and save.
2. Set up VSTS (Azure DevOps). Before this, we need to follow these steps: log in to Azure DevOps, create a project, and create a personal access token (save this token). Then click on "Setup Visual Studio Team Services":
a. Enter the site: enter the Azure DevOps URL, which is of the form https://organizationname.visualstudio.com/, and click Continue. Enter the personal access token generated above in Azure DevOps.
b. Select the project: select the project from the list and click Continue.
c. Review and save.
3. Complete project configuration and project onboarding. Click on "Complete project configuration" (this is a one-time setup), then click on "Project onboarding". Work through all 12 points by clicking Next, then finish on the onboarding review page.
Finally, click the Configure button for the environment, enter the name of the environment, and select the region. You will then see the environment status in the Queued state; after 7-8 hours you can log in to your environment.
Making SQL Server Accessible Over the Internet
We can make SQL Server accessible over the internet with the following steps:
1. The system where SQL Server is installed should have a static IP.
2. Open SSMS, right-click on the server > Properties, and enable SQL Server and Windows Authentication mode.
3. Go to Server > Security > Logins and configure a password for the users who will access SQL Server remotely. Here we are setting up a password for the user 'sa'.
4. Go to the server properties and check "Allow remote connections to this server".
5. Go to SQL Server Configuration Manager > SQL Server Network Configuration > Protocols for SQL and make sure TCP/IP is enabled.
6. Click on TCP/IP and enter the port number in the IPAll section.
7. In the firewall settings, create an inbound rule for the port on which SQL Server will listen; in this case we are selecting port 4729.
8. Next, enable port forwarding on the router (it redirects requests received on the public IP and port to another IP and port combination). As the mapped IP we can use the static IP of the system (e.g. 192.168.1.30).
9. Now, from a system outside the current network, connect to SQL Server by entering the server details and credentials, e.g. server name 113.143.120.100, 4729\SQL (a scripted connectivity test is sketched at the end of this post).
Hope the above helps!
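To verify the setup from a machine outside the network without installing SSMS, here is a minimal connectivity-test sketch in Python using pyodbc. It assumes the "ODBC Driver 17 for SQL Server" is installed on the client; the IP, port, and 'sa' login reuse the example values from the steps above, and the password is a placeholder:

import pyodbc

# Public IP and forwarded port from the steps above (hypothetical values).
# When a port is specified, the instance name (\SQL) is not required.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=113.143.120.100,4729;"
    "DATABASE=master;"
    "UID=sa;"
    "PWD=<your-sa-password>;",
    timeout=5,  # fail fast if the port is unreachable
)
row = conn.cursor().execute("SELECT @@VERSION").fetchone()
print(row[0])  # prints the SQL Server version if the connection works
conn.close()

If this prints the version string, the firewall rule and port forwarding are working.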
How to Create a Storage Account in the Azure Portal
Hi, in this blog we will see how to create a storage account in the Azure portal.
1. On the Azure portal menu, select All services.
2. In the list of resources, type "Storage Accounts". As you begin typing, the list filters based on your input. Select Storage Accounts.
3. On the Storage Accounts window that appears, choose Add.
4. Select the subscription in which to create the storage account.
5. Under the Resource group field, select Create new and enter a name for your new resource group, as shown in the following image.
6. Select Review + Create to review your storage account settings and create the account.
If you prefer to script this step instead of clicking through the portal, see the sketch at the end of this post. Hope this helps.
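Here is a minimal sketch of the same step using the Azure SDK for Python (the azure-identity and azure-mgmt-storage packages); the subscription ID, resource group, account name, and region below are hypothetical placeholders:

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

# Hypothetical subscription ID; find yours under Subscriptions in the portal.
client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Create the storage account; account names must be globally unique
# (3-24 lowercase letters and digits). The resource group must already exist.
poller = client.storage_accounts.begin_create(
    "my-resource-group",  # hypothetical resource group
    "mystorageacct123",   # hypothetical storage account name
    {
        "location": "eastus",
        "sku": {"name": "Standard_LRS"},
        "kind": "StorageV2",
    },
)
print(poller.result().provisioning_state)  # "Succeeded" once provisioning is done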
Conditional Formatting by Row in a Matrix
Introduction: This blog will show you how to color individual rows of a matrix differently based on conditions, including the row headers (not just alternating rows).
Our scenario: I want to apply colors to the rows of the following buckets:
Current - no color
1-30 Days Past Due - yellow
31-60 Days Past Due - orange
61-90 Days Past Due - red
91 or More Days Past Due - red
Step 1: Create a new calculated column in your data source that assigns a numeric value to each header type you would like to highlight. We created such a calculated column (a hypothetical DAX sketch appears at the end of this post).
Step 2: Select the matrix you want to format, go to the conditional formatting section of the Format tab, and turn the Background color option On.
Step 3: Conditional formatting is applied per field in the Values section of the matrix, so we will apply it to the "No." field first. Select Format by "Rules". In "Based on field" select "Sum of Color Column" and in Summarization select "Sum". In the Rules section, add the rule as shown in the screenshot.
Step 4: Apply the other rules for the different colors in the same way.
Step 5: The colors have now been applied to the different buckets according to our rule for the "No." column.
Step 6: Repeat the same steps for the other fields in the drop-down under Conditional formatting, one by one.
Step 7: We have now successfully colored the different rows of the matrix based on our condition.
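The calculated column from Step 1 appears as a screenshot in this post. As a rough sketch of what it might look like in DAX, assuming the bucket labels live in a column named Bucket on a table named Aging (both names hypothetical):

Color Column =
SWITCH (
    Aging[Bucket],
    "1-30 Days Past Due", 1,
    "31-60 Days Past Due", 2,
    "61-90 Days Past Due", 3,
    "91 or More Days Past Due", 4,
    0  // "Current" and anything unmatched gets no color
)

The rules in Step 3 then map these numbers to background colors, for example 1 to yellow, 2 to orange, and 3 and 4 to red.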
Trigger Power Automate on Condition
Introduction: This blog explains how to trigger Power Automate based on a required condition. Suppose you have a requirement where your flow should trigger only when some condition holds; you can check the condition on the trigger itself rather than adding a new action to check it.
Use cases:
1. Trigger the flow when a Lead is created and Lead Source Type is "Trade Show".
2. Trigger the flow when a Lead is created and Flag is "Yes".
3. Trigger the flow when a Lead is created, Lead Source Type is "Trade Show" AND Flag is "Yes".
4. Trigger the flow when a Lead is created, Lead Source Type is "Trade Show" OR Flag is "Yes".
Steps to follow:
Use case 1: Trigger the flow when a Lead is created and Lead Source Type is "Trade Show".
1. Click on the ellipsis (…) on the trigger, then click on Settings.
2. Go to Trigger Conditions and click on "+Add".
3. Add the condition:
@equals(triggerBody()?['leadsourcecode'],7)
NOTE: "Lead Source" is an option set field; use the numeric value of the option you want to check.
4. After adding the condition, click on Done and test your flow.
Use case 2: Trigger the flow when a Lead is created and Flag is "Yes". Repeat steps 1 to 3 with:
@equals(triggerBody()?['cf_flag'],true)
NOTE: Flag is a "Two Options" field; for two-option fields use true or false in the condition.
Use case 3: Trigger the flow when a Lead is created, Lead Source Type is "Trade Show" AND Flag is "Yes". Repeat steps 1 to 3 with the AND condition:
@and(equals(triggerBody()?['cf_flag'],true), equals(triggerBody()?['leadsourcecode'],7))
Use case 4: Trigger the flow when a Lead is created, Lead Source Type is "Trade Show" OR Flag is "Yes". Repeat steps 1 to 3 with the OR condition:
@or(equals(triggerBody()?['cf_flag'],true), equals(triggerBody()?['leadsourcecode'],7))
Accessing SharePoint Recycle Bin using Microsoft Flows
Introduction: This blog provides the steps to follow in order to access the SharePoint recycle bin in Microsoft Flow.
Steps:
1) In Actions, select "Send an HTTP request to SharePoint".
2) Enter the details below:
Site Address: select your site address from the list.
Method: GET
Uri: _api/web/recycleBin?$filter=DeletedDate eq datetime'Time Deleted' and DeletedByName eq 'Deleted by' and Title eq 'Filename with extension'
Headers: Accept: application/json
Please note: the parameters marked in yellow are taken from the trigger "when a file is deleted in SharePoint" (a filled-in example Uri appears at the end of this post).
3) Select the action "Parse JSON".
4) In Parse JSON, the Content is the body of the HTTP response. Enter the schema below:
{
  "type": "object",
  "properties": {
    "odata.metadata": { "type": "string" },
    "value": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "odata.type": { "type": "string" },
          "odata.id": { "type": "string" },
          "odata.editLink": { "type": "string" },
          "AuthorEmail": { "type": "string" },
          "AuthorName": { "type": "string" },
          "DeletedByEmail": { "type": "string" },
          "DeletedByName": { "type": "string" },
          "DeletedDate": { "type": "string" },
          "DeletedDateLocalFormatted": { "type": "string" },
          "DirName": { "type": "string" },
          "DirNamePath": { "type": "object", "properties": { "DecodedUrl": { "type": "string" } } },
          "Id": { "type": "string" },
          "ItemState": { "type": "integer" },
          "ItemType": { "type": "integer" },
          "LeafName": { "type": "string" },
          "LeafNamePath": { "type": "object", "properties": { "DecodedUrl": { "type": "string" } } },
          "Size": { "type": "string" },
          "Title": { "type": "string" }
        },
        "required": [ "odata.type", "odata.id", "odata.editLink", "AuthorEmail", "AuthorName", "DeletedByEmail", "DeletedByName", "DeletedDate", "DeletedDateLocalFormatted", "DirName", "DirNamePath", "Id", "ItemState", "ItemType", "LeafName", "LeafNamePath", "Size", "Title" ]
      }
    }
  }
}
Sample Output:
Conclusion: When we need the original location of a file after it has been deleted from SharePoint, we can access the SharePoint recycle bin and obtain the original path.
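For reference, a filled-in Uri might look like the following; the timestamp, user name, and file name are hypothetical placeholders, normally supplied by the trigger's dynamic content:

_api/web/recycleBin?$filter=DeletedDate eq datetime'2020-06-15T09:30:00Z' and DeletedByName eq 'Jane Doe' and Title eq 'Quarterly Report.xlsx'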
