Category Archives: Power BI
Email Migration from D365 CRM v8.2 to D365 CRM v9 using TIBCO Cloud Integration: Attachments & Status Update
Introduction: In this blog, I will outline how to migrate Email Attachments and update the status of an email. In my previous blogs, I have shown how to migrate the body of an Email and its Activity Parties from one CRM to another using Scribe.

Email Attachments: Below is the map used to migrate Email Attachments. As you can see, it is fairly straightforward, barring a few things to keep in mind while mapping:

1) Email Attachments are stored in the 'activitymimeattachment' entity.
2) I did not map the 'attachmentid' field, as it produced an error, and there is rarely a reason to preserve the GUID of an attachment. Leaving 'attachmentid' unmapped creates a new GUID for each migrated attachment.
3) Most of the data regarding the Attachments was migrated by the first map, which migrates the 'Email' activity.
4) That is why, in this map, we only migrate the 'subject', 'filename' and 'body' fields, along with 'objectid' and 'objecttypecode'.
5) 'objecttypecode' identifies the entity the attachment belongs to, while 'objectid' holds that record's GUID. (A minimal SDK sketch of this create call follows at the end of this post.)

Once you run the map successfully, you will see the attachments displayed in the email. This includes image attachments as well.

Target:

Email Status Update: As with most Activity entities, we migrate emails with an 'Open' status. This is done to ensure the record does not become read-only, which would prevent us from migrating the corresponding Activity Parties and Attachments and could lead to inconsistent data between Source and Target. Once the Activity Parties and Attachments have been migrated to the record, we can update the Status of the Email to match the Source environment. This is a basic but fundamental step to ensure data consistency.

Sample State Code & Status Code Values:

In this map, all we have to map are the 'Status Code' and 'State Code' as they are in the Source environment. This will update the status of the email. In the screenshot below, you can see that the Status has been updated to 'Sent'.

Conclusion: This completes the process of creating TIBCO Cloud Integration maps for Email Migration from one CRM to another. I hope this and my two preceding blogs provide a sufficient outline of the Email Migration process.
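For reference, here is a minimal C# sketch of the equivalent create call using the Dynamics 365 SDK. This illustrates the field mapping described above rather than the Scribe map itself; the hypothetical GetTargetService() helper and all values are placeholders standing in for your Source data and Target connection.

```csharp
using System;
using Microsoft.Xrm.Sdk;

// Placeholders: these values come from the Source system in a real migration.
Guid emailId = Guid.NewGuid();                        // GUID of the already-migrated email
string base64Content = Convert.ToBase64String(new byte[] { 0x25, 0x50, 0x44, 0x46 }); // file bytes from Source
IOrganizationService service = GetTargetService();    // hypothetical helper returning a Target connection

// Build the attachment; 'attachmentid' is deliberately left unmapped,
// so the Target system generates a new GUID for it.
var attachment = new Entity("activitymimeattachment");
attachment["subject"]  = "Quarterly report";          // hypothetical source value
attachment["filename"] = "report.pdf";                // hypothetical source value
attachment["body"]     = base64Content;               // base64-encoded file content
attachment["objectid"] = new EntityReference("email", emailId); // record the attachment belongs to...
attachment["objecttypecode"] = "email";               // ...and the entity type of that record
attachment["mimetype"] = "application/pdf";

Guid newAttachmentId = service.Create(attachment);
Console.WriteLine("Created attachment " + newAttachmentId);
```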
Power BI Service Live Connection
Introduction: In this blog, you will see how you can use an existing dataset in the Power BI service to create a report in Power BI Desktop.

Service Live Connection: You can establish a connection to a shared dataset in the Power BI service and create many different reports from the same dataset. This means you can create your perfect data model in Power BI Desktop, publish it to the Power BI service, and then you and others can create multiple different reports (in separate .pbix files) from that same, common data model. This feature is called the Power BI service live connection.

To create a shared dataset, create a dataset and report and publish them to a workspace that is common to all. Select the shared workspace where the report needs to be deployed. The report will start to publish to the workspace, and you will get the confirmation below when it is successfully published.

Establish a Power BI service live connection to the published dataset: If you're not signed in to Power BI, you'll be prompted to do so. Once logged in, you're presented with a window that shows which workspaces you're a member of, and you can select the workspace containing the dataset to which you want to establish a Power BI service live connection. Click Load, and the dataset will be loaded so you can create reports and publish them.

Below are some known limitations as well:

1) Read-only members of a workspace cannot connect to datasets from Power BI Desktop.
2) Only users who are part of the same Power BI service workspace can connect to a published dataset using the Power BI service live connection. Users can (and often do) belong to more than one workspace.

Try it out, and post your questions below if there is anything.
Azure Machine Learning Cheat Sheet
Introduction: Microsoft released a PDF cheat sheet of the machine learning algorithms that can be used in Azure Machine Learning Studio. This Microsoft Azure Machine Learning Algorithm Cheat Sheet helps you choose the right machine learning algorithm for your predictive analytics solution from the Microsoft Azure Machine Learning library of algorithms. The algorithms are grouped into five groups:

Regression: For predicting values. For example, predicting a stock's price.

Anomaly detection: For finding unusual data points. For example, highly unusual credit card spending patterns that deviate from the normal ones.

Clustering: The data points have no labels associated with them. Instead, the goal of an unsupervised learning algorithm is to organize the data in some way or to describe its structure. For example, discovering companies with similar marketing strategies.

Two-class classification: When there are only two choices, it's called two-class or binomial classification. For example, distinguishing between a cat and a dog.

Multi-class classification: For predicting three or more categories. For example, predicting the winner of a race.

To read the cheat sheet, read the path and algorithm labels on the chart as "For <path label>, use <algorithm>." For example, "For speed, use two-class logistic regression." Sometimes more than one branch applies. In that case, it is better to create scored models with both algorithms and compare their accuracy to decide which is the better fit. Even a beginner can use the cheat sheet to select an algorithm apt for their predictive solution. There are some generalizations and oversimplifications, but it points you in a safe direction. It also means there are many algorithms not listed here, but the ones included are more than enough to give you a good head start in the ML world.
Connect your Azure Machine Learning Predictive Solution to Power BI
Introduction: Azure Machine Learning Studio is an amazing tool that lets us create efficient ML experiments with simple drag-and-drop features. We can predict anything from flight delays to churn. But what if we want to represent this predicted data in a more visually appealing format? It is possible to do this by representing your predictions in Power BI!

Pre-Requisites: Basic understanding of Azure Machine Learning Studio. Basic understanding of Power BI. A Blob Container created on Azure Storage.

Steps:

1) Create your experiment in Azure Machine Learning Studio.
2) Convert your Training Experiment to a Predictive Experiment and deploy it as a Web Service.
3) Create a Console application in Visual Studio and copy the code from the Batch Execution page into it. For automation we could create automated data pipelines, but for now we will just use a simple Console application.
4) Remove the existing code from the Console application and paste in the Batch Execution code. Install the necessary NuGet packages and update the following parameters (a trimmed sketch of this configuration follows at the end of this post): BaseURL will stay the same. Storage Account Name, Storage Account Key and Storage Container Name can be found in the Azure Blob Storage account you created. The API Key can be found on the Web Service page in Azure Machine Learning Studio. The input path is the path where you saved the input csv file for Batch Execution; your input csv file should have all the features you used to train your experiment.
5) After you run your Console application, a new output1results.csv file should be generated in your Blob Container. The output should include the labels your experiment generates, including the Scored Labels and Scored Probabilities columns.
6) Now you can get your data using Azure Blob Storage as your source in Power BI and use the columns in the output1results.csv file to build your ML prediction reports.

The report can look something like this.

I hope this blog helps you combine Azure Machine Learning Studio and Power BI to create a powerful predictive solution.
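As a reference, here is a trimmed C# sketch of the configuration section of the generated Batch Execution sample, showing where the parameters above slot in. All the values and the URL format are placeholders; copy the real code from your own Web Service's Batch Execution page, since the generated client classes and the job-submission logic are omitted here.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;

class BatchScoringConfig
{
    // All of these come from your own Azure resources; the values below are placeholders.
    const string BaseUrl = "https://<region>.services.azureml.net/workspaces/<workspace-id>/services/<service-id>/jobs";
    const string ApiKey = "<api-key from the Web Service page>";
    const string StorageAccountName = "<storage account name>";
    const string StorageAccountKey = "<storage account key>";
    const string StorageContainerName = "<container name>";
    const string InputFilePath = @"C:\data\input.csv"; // local csv with all the training features

    static void Main()
    {
        // The generated sample builds the blob connection string the same way.
        string connectionString = string.Format(
            "DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}",
            StorageAccountName, StorageAccountKey);

        using (var client = new HttpClient())
        {
            // The Batch Execution Service authenticates with the API key as a Bearer token.
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", ApiKey);

            // ...upload InputFilePath to the blob container, submit the job to BaseUrl,
            // poll for completion and download output1results.csv (see the generated sample).
            Console.WriteLine("Configured batch scoring against " + BaseUrl);
        }
    }
}
```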
Multi-select data elements using Power BI Desktop on Visuals
Introduction: In the latest Power BI update, Microsoft introduced a new feature: multi-select of data elements on visuals. In Power BI Desktop, we can highlight a data point on a visual by simply clicking on it. In other words, if we have an important bar chart and we want the other visuals on the report page to highlight data based on our selection, we can click the data element in one visual and see the results reflected in the other visuals on the page. This is basic. Until now, we were only able to filter the data with single-select highlighting. See the screen capture below:

New feature: Using this new Power BI feature, we can multi-select data elements on visuals: we can now select more than one data point on a Power BI Desktop report page and highlight the results across the visuals on the page. To multi-select data points in visuals, simply use CTRL+Click to select multiple data points. See the screen capture below for multiple data points selected on a visual (multi-select):
Using Integrated GP Customers for creating Sales Order through TIBCO Cloud Integration
Introduction: Recently, we came across an issue while integrating Customers and Sales Orders between Dynamics 365 CRM and GP. Customers and Sales Orders were integrated successfully, except that when a Sales Order was created using the integrated Customers, an error regarding the Credit Limit would occur. The TIBCO Cloud Integration GP connector does not support setting a customer credit limit, which in turn will not allow you to use that Customer to create a Sales Order.

Resolution:

1) In the GP Web Service Security Console, under the Create Customer Policy, change the Behaviour of Customer Class Defaulting from "Do Not Use Customer Class" to "Use Customer Class".
2) In the Scribe mapping, pass a ClassKey_Id.
3) For the ClassKey_Id, set up a Customer Class with no Shipping Method or Tax Schedule ID and specify the Credit Limit properties.
Email Migration from D365 CRM v8.2 to D365 CRM v9 using TIBCO Cloud Integration: Activity Parties
Introduction: In this blog, I will detail how to migrate the Activity Parties of Emails from one CRM to another. In my previous blog, I outlined the first step of the Email migration process, which is migrating the body of the email. Migrating the corresponding Activity Parties of an Email is the second step of this process, as the Email body now exists in the Target CRM.

What are Activity Parties? Other than the Body, an Email Activity consists of:

Sender: The person(s) sending the email.
Recipient: The person(s) receiving the email.
CC & BCC: The person(s) copied on the email.
Owner: The person who owns the email.
Regarding: This generally links to an entity in CRM to which the email pertains, for example a Case or a Project.

'Sender', 'Recipient', 'CC', 'BCC', 'Owner' and 'Regarding' are each stored in CRM as a separate Activity Party of that email, with a 'Participation Type' code (field name: 'participationtypemask') that establishes which field that specific party belongs to, i.e. 1 = Sender, 2 = To Recipient and so on (as shown below).

Generally, the person(s) in an Activity Party are either System Users or Contacts. This is specified in the field 'partyobjecttypecode', as shown above. Keeping this in mind, one can look up these entities to obtain the corresponding GUIDs in the Target system and map them as the 'partyid'. After the Activity Party is created, its owner should be updated to match the owner in the Source environment. The 'Owner' Activity Party is automatically created by CRM as the same User as the Owner of the Email (configured when you migrate the Email Body in Step 1). The 'Regarding' Activity Party links to a Case/Project rather than a 'person'; however, the same logic applies, i.e. map the required GUID and its type. (A minimal SDK sketch of these party mappings follows at the end of this post.)

Migrating Activity Parties is not that complicated once understood. Unfortunately, not much about it is easily available online. I hope this blog demystified a few concepts about the Activity Parties of an Email and how they can be migrated from one CRM to another. My next blog will detail how to migrate Email Attachments and update the status of an Email.
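To illustrate the mapping, here is a minimal C# sketch using the Dynamics 365 SDK. It shows how each party record carries a 'partyid' pointing at a Contact or System User in the Target system; the hypothetical GetTargetService() helper and all GUIDs are placeholders, and a Scribe map expresses the same thing declaratively rather than in code.

```csharp
using System;
using Microsoft.Xrm.Sdk;

// Placeholders: in a real migration these GUIDs come from lookups against the Target system.
Guid migratedEmailId    = Guid.NewGuid();
Guid senderUserId       = Guid.NewGuid();
Guid recipientContactId = Guid.NewGuid();
Guid caseId             = Guid.NewGuid();
IOrganizationService service = GetTargetService(); // hypothetical helper returning a Target connection

// Each party is an 'activityparty' whose partyid points at the person in the Target system.
var sender = new Entity("activityparty");
sender["partyid"] = new EntityReference("systemuser", senderUserId);        // looked-up Target GUID

var recipient = new Entity("activityparty");
recipient["partyid"] = new EntityReference("contact", recipientContactId); // looked-up Target GUID

// Assigning parties to the 'from'/'to' lists of the migrated email is what sets
// participationtypemask (1 = Sender, 2 = To Recipient) behind the scenes.
var email = new Entity("email", migratedEmailId);
email["from"] = new EntityCollection(new Entity[] { sender });
email["to"]   = new EntityCollection(new Entity[] { recipient });
email["regardingobjectid"] = new EntityReference("incident", caseId); // 'Regarding' a Case

service.Update(email);
```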
Email Migration from D365 CRM v8.2 to D365 CRM v9 using TIBCO Cloud Integration: Email Body
Introduction: Data migration can be a little challenging, especially when it comes to Emails. In this blog, I will outline the steps that need to be followed to successfully migrate Emails, as well as important things to keep in mind during the process.

Steps: There are four main steps to successfully migrate an Email from Source to Target:

1) Send the body of the Email.
2) Send all the related Activity Parties.
3) Send the details of the related Email Attachment(s).
4) Update the Status of the Email.

In this blog, we will deal with the first step, i.e. creating the map in TIBCO Cloud Integration to send the Body of an Email. Migrating the body of the Email is straightforward compared to the next step, but there are a few aspects to keep in mind (a minimal SDK sketch follows at the end of this post):

1) Send the email with an 'Open' status so that Activity Parties and Attachments can be migrated in the following steps. Not sending the email as 'Open' could lead to Activity Parties and Attachments not being migrated to the corresponding email.
2) When an email is migrated, its owner will be the User configured in the CRM Connection in Scribe. To maintain the same owner as in the Source, update the email with the correct owner after it is created. In the screenshot below, I am using a Lookup Table in Scribe to map the User GUID of the Target system.
3) If you want the GUID of the email to remain the same in Source and Target, do not forget to map the 'activityid' of the Email entity.

Conclusion: I hope this blog provided some insight into the migration process for Email Activities. In the next blog, 'Email Migration from CRM v8.2 to CRM v9 using TIBCO Cloud Integration: Activity Parties', I will talk about migrating Activity Parties, which can be the most challenging part of Email Migration.
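For readers more comfortable with code, here is a minimal C# sketch of the same three points using the Dynamics 365 SDK. The hypothetical GetTargetService() helper and the GUID values are placeholders; the Scribe map performs the equivalent operations declaratively.

```csharp
using System;
using Microsoft.Xrm.Sdk;

// Placeholders: these would come from the Source record and a Target user lookup.
Guid sourceActivityId = Guid.NewGuid();
Guid targetOwnerId    = Guid.NewGuid();
IOrganizationService service = GetTargetService(); // hypothetical helper returning a Target connection

// Point 3: setting Id to the Source 'activityid' keeps the GUID identical in both systems.
var email = new Entity("email") { Id = sourceActivityId };
email["subject"]     = "Subject copied from Source";  // hypothetical source value
email["description"] = "<p>Email body HTML</p>";      // hypothetical source value

// Point 1: emails are created with an 'Open' state by default,
// so Activity Parties and Attachments can still be added afterwards.
service.Create(email);

// Point 2: the record is owned by the integration user at this point;
// re-assign it to the owner mapped from the Source environment.
var ownerUpdate = new Entity("email", sourceActivityId);
ownerUpdate["ownerid"] = new EntityReference("systemuser", targetOwnerId);
service.Update(ownerUpdate);
```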
Rolling up multiple rows into a single row and column in SQL Server
Problem: We need a way to roll up multiple rows into one row and one column. We can roll up multiple rows into one row using PIVOT, but here we need all of the data concatenated into a single column in a single row.

Solution: To achieve this, we will use the FOR XML PATH clause and the STUFF function.

STUFF() Function: The STUFF function is used to insert a string into another string. Basically, it deletes a number of characters from a source string and inserts another string at the specified position.

Syntax: STUFF(Expression, Start, Length, Replacement_expression)

Here, Expression is an expression of the character data to be modified. Start is an integer specifying the position in Expression at which deletion begins and Replacement_expression is inserted, and Length is an integer specifying the number of characters to delete. Replacement_expression is the character expression to be inserted at the start position.

Example: The original post showed the sample table data, the query and its output as screenshots; a reconstructed example follows below.
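Since the original query and output were shown as screenshots, here is a self-contained T-SQL sketch of the same technique against a hypothetical table; the table and column names are assumptions, not the original post's data.

```sql
-- Hypothetical sample data: one row per student per subject.
CREATE TABLE #Students (StudentName VARCHAR(50), Subject VARCHAR(50));
INSERT INTO #Students VALUES
    ('Alice', 'Maths'), ('Alice', 'Physics'),
    ('Bob',   'Maths'), ('Bob',   'History'), ('Bob', 'Art');

-- FOR XML PATH('') concatenates the subjects into one string per student,
-- and STUFF(..., 1, 2, '') deletes the leading ', ' separator.
SELECT s.StudentName,
       STUFF((SELECT ', ' + s2.Subject
              FROM #Students s2
              WHERE s2.StudentName = s.StudentName
              FOR XML PATH('')), 1, 2, '') AS Subjects
FROM #Students s
GROUP BY s.StudentName;

DROP TABLE #Students;

-- Expected output:
-- StudentName  Subjects
-- Alice        Maths, Physics
-- Bob          Maths, History, Art
```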
Connecting Dynamics 365 Content Packs in Power BI
Introduction: In this blog, you will see how you can use a content pack for D365 in Power BI. Content packs are easy for your team to find: they are all in AppSource. Because they're part of Power BI, they leverage all the features of Power BI, including interactive data exploration, new visuals, Q&A, integration with other data sources, data refresh, and more.

Steps:

1) Log in to your Power BI account and click on Get Data.
2) Click on Services.
3) Select the proper content pack from AppSource.
4) Click on Sales Analytics for Dynamics 365, then click "Get it now".
5) It will ask for the CRM account and the Fiscal Year End Month Number; fill in the details and click Next. By default, it will use OAuth with the account that is currently logged in.
6) After a successful login, the dashboard will start populating with data. Importing the data will take some time, after which you will have the ready-made content pack dashboards and reports.

You are all set; try the other content packs as well. Currently there are 21 content packs available for the different flavours of D365. Try them; they are superb.