Power BI Archives - Page 17 of 21

Category Archives: Power BI

Step-by-Step Instructions for Upgrading the Scribe Insight Version

Posted On March 29, 2017 by Admin

In this blog article, I explain how to upgrade Scribe Insight from version 7.8.0 to the latest version, 7.9.2.

Step 1: Backup

Export the existing Scribe Console: open the Scribe Console, right-click, choose Import/Export, then Export a package. Select the package, click Next, give a specific name and path for the backup files, click Next, then Finish. This creates a .spkz file in the specified path.

Take a backup of the Collaboration folder: make a copy of the Collaboration folder, e.g. C:\Users\Public\Documents\Scribe.

Take a backup of the Scribe serial number: open Scribe Workbench, go to Help > About Scribe Insight, and copy the serial number.

Step 2: Unregister the existing Scribe version

Open Scribe Workbench, go to Help > Unregister this Computer, and click Yes. After unregistering successfully, a confirmation pop-up window appears.

Step 3: Stop the Scribe services

From Task Manager, end the Scribe.UpdateService.exe process (if it exists).

Step 4: Download the current version of Scribe Insight

Site: https://openmind.scribesoft.com/html/insight_download

Step 5: Download Microsoft .NET Framework 4.5.2

Site: https://www.microsoft.com/en-us/download/details.aspx?id=42643
Note: Select the Scribe Update Notifier when asked during the installation process, then restart the computer/remote server.

Step 6: Run the application

Click Run, then Next, specify the path for unzipping, click Unzip, then Close.

Step 7: Install the latest version of Scribe Insight

Run the setup file: open the ScribeInsight792_x64 folder (stored in the path specified during unzip) and run Setup.exe.

Uninstall the existing adapters (e.g. CRM, AX) from Control Panel > Uninstall a program; the setup will also uninstall adapters automatically.

Install the adapters: check the Insight Adapters box and click Start Installation, then click Install in the window that appears. After the Visual Studio components install, a prompt appears; restart the computer.

Open Setup.exe again, click Next, accept the terms, then Next > Install > Next, accept the license agreements, then Install > Finish. When the setup wizard pops up, cancel the username window, select the adapter you want to install, then click Next and Finish.

Start the Scribe services: go to Start > Run > services.msc, select the Scribe services one by one, click Log On, choose This Account, click Browse, browse for the account name, enter the password, and click Apply. Check dependencies and make sure all dependent services are started before you start the Scribe services. Do this for all services whose names start with 'Scribe'.

Step 8: Register Scribe Insight

Select Register Online and enter the serial number you backed up earlier. A details window and the product update status appear. Click Next; when the Registration Successful window appears, click Finish.

Step 9: Verify the installed Scribe Insight version

Open Scribe Workbench and go to Help > About Scribe Insight.
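As a small illustration of the backup in Step 1, the sketch below copies the Collaboration folder to a timestamped backup location before the upgrade. The source path is the default one mentioned above; the backup target folder is a hypothetical example. This is only a minimal sketch, not part of the official upgrade procedure.

```python
import shutil
from datetime import datetime
from pathlib import Path

# Default Collaboration folder mentioned in Step 1; adjust for your environment.
SOURCE = Path(r"C:\Users\Public\Documents\Scribe")
# Hypothetical backup target folder.
BACKUP_ROOT = Path(r"D:\ScribeBackups")

def backup_collaboration_folder() -> Path:
    """Copy the Scribe Collaboration folder to a timestamped backup directory."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    target = BACKUP_ROOT / f"Scribe_Collaboration_{stamp}"
    shutil.copytree(SOURCE, target)
    return target

if __name__ == "__main__":
    print(f"Backed up to: {backup_collaboration_folder()}")
```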


Integrating Sales Order line items of Dynamics 365 Financials using TIBCO Cloud Integration

Posted On March 27, 2017 by Admin

In an era where consumers are more aware, better informed, and expect a little more from every service they opt for, you must stay on your toes if you want to satisfy them at every point. With Microsoft Dynamics 365 for Sales, sales force automation lets you catch the nerve of what your consumers demand and expect. This not only helps you engage well with your customers and satisfy their business needs, but also helps you attract new customers by satisfying the current ones. The accelerated performance and boosted productivity will be witnessed over time, enabling your customers to build a solid business foundation of their own.

Introduction: In Dynamics 365 Financials, sales order line items cannot be integrated in a single step: the create operation takes only the item number, and all the remaining fields are treated as an update.

Prerequisites:
Dynamics 365 Financials.
A TIBCO Cloud Integration subscription.

Purpose of the setup: Our goal is to integrate sales order line items from Microsoft Dynamics CRM Online to D365 Financials through TIBCO Cloud Integration.

Steps:
1. Configure a connection for your Microsoft Dynamics CRM Online instance by providing the required information.
2. Configure a connection for your D365 Financials instance by providing the required information.
3. Create a new Solution and specify the solution name and the agent. Note: here we are using a Cloud Agent.
4. Create an integration map for integrating the sales order header and its line items from CRM Online to Dynamics 365 Financials.
   Sales order header: use a Create block and create the sales order header by mapping the required fields.
   Sales order lines: use a For Each Child block to loop over the sales order lines. Use a Create block to create each sales order line by mapping the primary fields Document_No, Document_Type, and No. Use an Update Merge block to map the remaining sales order line fields, such as quantity and Unit_price, along with ETag and ETagConcurrencyBehavior (see the sketch after this post).

Note (as specified by Scribe Software): The OData protocol uses entity tags (ETags) to determine how to update or delete data. The Dynamics NAV/Financials connector exposes these ETags as fields:
ETag - available as a source and target field; contains the ETag value provided by the Dynamics NAV/Financials OData service.
ETagConcurrencyBehavior - available as a target field; determines how data conflicts are handled. Possible values for this field are: Always Overwrite (default), Overwrite If Match, Overwrite If Not Match.
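To make the ETag note above more concrete, here is a minimal sketch of how optimistic concurrency with ETags typically works against an OData service such as the Dynamics NAV/Financials API: the record is read together with its ETag, and the ETag is sent back in the If-Match header on update. The URL, entity set, field name, and credentials are placeholders, and this illustrates general OData behaviour rather than the internals of the TIBCO connector blocks.

```python
import requests

# Placeholder service root and entity set; replace with your own OData endpoint.
BASE_URL = "https://example.com/ODataV4/Company('CRONUS')/SalesOrderLines"
AUTH = ("user", "password")  # placeholder credentials

def update_sales_line(line_key: str, new_quantity: int) -> None:
    """Read a sales order line, then PATCH it using its ETag for concurrency control."""
    record_url = f"{BASE_URL}({line_key})"

    # Read the record; an OData v4 service returns the current ETag in @odata.etag.
    record = requests.get(record_url, auth=AUTH).json()
    etag = record["@odata.etag"]

    # Send the ETag back in If-Match; the update only succeeds if the record
    # has not changed since it was read (roughly "Overwrite If Match").
    response = requests.patch(
        record_url,
        json={"Quantity": new_quantity},  # illustrative field name
        headers={"If-Match": etag, "Content-Type": "application/json"},
        auth=AUTH,
    )
    response.raise_for_status()
```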


Integration between a single CRM source and multiple CRM targets

Posted On March 9, 2017

Use case: Let's say there are multiple service providers (SPs) on the CRM platform who send data to a single repository target, which is also a CRM system. Based on certain validations and business logic, the data sent from an SP can be accepted or rejected, so the repository CRM system needs to communicate the status of the submission back to the service provider. We are using Scribe Insight as the middleware, or integration tool. So how does Scribe determine which service provider's CRM a message sent from the repository CRM is intended for? This is done using the Scribe Publisher.

What is the Scribe Publisher? The Scribe Publisher is a component in the Scribe Console that produces XML messages based on real-time triggering events. For example, if an account is created or updated in CRM, it triggers the publisher to generate an XML message. This XML message is forwarded to the Scribe In queue and then processed by an integration process.

Integration approach: To integrate a single source CRM with multiple target CRM instances, the data needs to be pushed to the correct target CRM instance based on the value of a data field in the source CRM.

1. Create a CRM publisher with the same entities published multiple times. Each copy is processed based on the source field value and outputs the message to the Scribe In queue with a different message label.
2. Create multiple DTS files, each pointing to a different target CRM where the data is intended to be pushed.
3. Create an integration process (IP) for each DTS; each IP processes messages with a specific message label.

In the screen capture above, the publisher generates an XML message whenever a new account is created or an existing account is updated. Based on the value of the field that holds the target service provider (in this example, the "description" field in the repository CRM holds the target CRM the account belongs to), the publisher publishes the message with a different message label (FromSP1_account.xml, FromSP2_account.xml, and so on). These messages arrive in the Scribe In queue, and integration processes can then be created to consume them based on their message labels. The above integration was tested on Scribe Insight version 7.9.2.
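As a rough illustration of the routing idea described above (not Scribe's own implementation), the sketch below shows how a message can be assigned a different label based on a field value, so that downstream processes pick up only the labels they are responsible for. The field value and label names follow the example in this post but are otherwise hypothetical.

```python
# Minimal sketch of label-based routing, assuming the "description" field holds
# the target service provider, as in the example above. Names are hypothetical.

# Map of source field value -> message label consumed by a specific integration process.
LABEL_BY_PROVIDER = {
    "SP1": "FromSP1_account",
    "SP2": "FromSP2_account",
}

def route_account_message(account: dict) -> str:
    """Return the message label for an account based on its target provider field."""
    provider = account.get("description")
    try:
        return LABEL_BY_PROVIDER[provider]
    except KeyError:
        raise ValueError(f"No target CRM configured for provider {provider!r}")

# Example: this message would be picked up by the integration process
# that consumes the FromSP2_account label.
print(route_account_message({"name": "Contoso", "description": "SP2"}))
```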


Power BI new updates: New Quick Calcs, Word wrap on matrix row headers and X-axis & Y-axis font size control

Posted On February 27, 2017 by Admin

In this blog article, I will explain the new Power BI updates: the new Quick Calcs, word wrap on matrix row headers, and X-axis and Y-axis font size control.

New Quick Calcs: percent of row total and percent of column total. This Power BI update introduces two new Quick Calcs:
Percent of row total
Percent of column total
In the field pane, right-click the value field and choose Quick Calc; then, under Show value as, you will see Percent of column total and Percent of row total. If you convert a matrix that uses one of these new Quick Calcs into a different visual type, the calculation converts to a percent of grand total. Example: in the figure below, the Cost column shows the cost per subcategory. After applying the Percent of column total Quick Calc, the matrix looks as follows.

Word wrap on matrix row headers: In an earlier update, word wrap was added for table headers only, but now it is available for matrix row headers as well. You will find the Word wrap toggle under the Formatting pane > Row headers card > Word wrap. Once you turn on word wrap, the row headers wrap to fill the space they have.

X-axis and Y-axis font size control: Earlier, we were not able to control the font size of the X-axis and Y-axis, but now we can. You will find the Text size slider under the X-axis and Y-axis cards in the Formatting pane. Below is the result after changing the font size of the X-axis and Y-axis.
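For readers who want to see the arithmetic behind the two new Quick Calcs, the small sketch below computes percent of column total for a toy cost-by-subcategory table (percent of row total works the same way along the other axis). The numbers are made up; Power BI performs this calculation for you from the Quick Calc menu.

```python
import pandas as pd

# Toy cost-by-subcategory matrix; values are made up for illustration.
costs = pd.DataFrame(
    {"2016": [100.0, 300.0, 600.0], "2017": [150.0, 250.0, 600.0]},
    index=["Accessories", "Bikes", "Clothing"],
)

# Percent of column total: each value divided by the sum of its column.
pct_of_column_total = costs.div(costs.sum(axis=0), axis=1) * 100

# Percent of row total: each value divided by the sum of its row.
pct_of_row_total = costs.div(costs.sum(axis=1), axis=0) * 100

print(pct_of_column_total.round(1))
print(pct_of_row_total.round(1))
```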


Power BI Embedded inside of SharePoint Online

Posted On February 27, 2017

In this blog article, I will explain the new web part that enables SharePoint authors to embed Power BI reports directly in SharePoint Online pages, with no code required. Below are the steps:
1. Publish your Power BI report to your Power BI account.
2. Get the URL of the report.
3. Add the Power BI (preview) web part to your SharePoint Online page.
4. Paste the URL of the report when prompted.
5. To finish, save and publish your page.

As shown in the screenshot below, I have one report published in my Power BI account, so at the report level we need to select the "Embed in SharePoint Online" option in the File menu. A pop-up window appears with the URL, which we need to copy for the SharePoint Online site. Now log in to your company's SharePoint Online account and create a new page, either inside an existing page or separately. As shown in the screenshot below, click the "+" sign and select the Power BI option; a new report window then appears. When you click the Add report button, a pane appears on the left where you need to paste that Power BI URL into the first box, and it will automatically detect all the reports related to that Power BI account. The Power BI report then appears inside the SharePoint Online page, as shown below. You can also share the same report with other colleagues in your organization.


Configuring a Text File as a Source in TIBCO Cloud Integration

Posted On February 23, 2017 by Admin

In this article, we are going to see how to configure a CSV file as a source in TIBCO Cloud Integration.

Prerequisites:
A TIBCO Cloud Integration subscription.
A CSV file (here, Journal.csv).

Steps:
1. Provide access to your text files. The FolderAccess.txt file controls access to the files that can be opened by TIBCO Cloud Integration for processing. Open FolderAccess.txt in Notepad; here it is located at C:\Program Files (x86)\Scribe Software\TIBCO Cloud Integration Agent\Connectors\Scribe.Connector.Text. Add a line to FolderAccess.txt for each directory path that contains text files.
2. Open the ODBC Data Source Administrator. On the System DSN tab, add a Microsoft Access Text Driver data source. Enter the data source name and description, and select the directory where your CSV file is stored. Define the format of the .csv file so that it is saved as a schema.ini file in that directory.
3. Log in to TIBCO Cloud Integration at https://app.scribesoft.com/.
4. Create a new connection, selecting Text as the connector type for the source, and fill in the required details. On the Location tab, paste the path where your .csv file is stored. On the Entities tab, give the entity a name and fill in the details. On the Fields tab, click the settings icon and choose Refresh Schema; all the fields in the CSV file will appear. The data type of all fields is Text by default; you can change the data type of any field.
5. Test the connection and press OK to save it.
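As an optional sanity check outside TIBCO Cloud Integration, the sketch below reads the same CSV through the Microsoft Access Text Driver via ODBC, the driver the System DSN above is built on. It assumes pyodbc is installed and that the driver name matches the one registered on your machine (this varies with the Office/ACE version installed); the folder path is a placeholder, and the file name is the example from this post.

```python
import pyodbc

# Folder containing Journal.csv (the same folder listed in FolderAccess.txt);
# adjust for your environment.
FOLDER = r"C:\Data\TextSource"

# Driver name as registered on the machine; check the ODBC Data Source
# Administrator if yours differs.
CONN_STR = (
    r"Driver={Microsoft Access Text Driver (*.txt, *.csv)};"
    rf"DBQ={FOLDER};"
    r"Extensions=asc,csv,tab,txt;"
)

conn = pyodbc.connect(CONN_STR, autocommit=True)
cursor = conn.cursor()
# With the text driver, the file name acts as the table name.
cursor.execute("SELECT * FROM [Journal.csv]")
for row in cursor.fetchmany(5):
    print(row)
conn.close()
```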


ODBC Error in Scribe Insight

Posted On February 16, 2017

Error in Scribe Insight: "Conversion failed when converting date and/or time from character string" when connected through the SQL adapter.

When do we get this error? When we create a SQL connection using the ODBC or OLE DB adapter in Scribe Insight and use it as either the source or the target connection in a Scribe DTS.

What is the root cause of this error? It is the date-time format of the machine where the Scribe Insight DTS is being run.

How can we reproduce the error? In the Windows date and time settings, change the time format to use a period (.) as the separator instead of a colon (:). Open the Scribe Workbench, create a SQL connection using ODBC or OLE DB, create a target connection, map some fields, and run the DTS. You will get the error.

Resolution: Simply change the time format back to the colon separator instead of the period. Hope this helps. I have found that a time format with anything other than the colon separator causes errors with other ETL tools as well.
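The sketch below reproduces the underlying symptom in plain Python: a time string written with period separators fails to parse against a colon-based format, which is analogous to the character-string conversion SQL Server rejects when the machine's time separator is changed. It only illustrates the format mismatch, not Scribe's internals.

```python
from datetime import datetime

COLON_FORMAT = "%Y-%m-%d %H:%M:%S"

# Time string produced with the machine's separator set to '.' instead of ':'.
bad_value = "2017-02-16 10.30.00"
good_value = "2017-02-16 10:30:00"

print(datetime.strptime(good_value, COLON_FORMAT))  # parses fine

try:
    datetime.strptime(bad_value, COLON_FORMAT)
except ValueError as err:
    # Comparable in spirit to "Conversion failed when converting date and/or
    # time from character string" on the SQL side.
    print(f"Conversion failed: {err}")
```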


Migrating Files Attached to Notes in CRM to a SQL Table Using Scribe Insight

Posted On January 31, 2017

In CRM, when a file is attached in the Notes section, it gets stored in the Annotation entity. Notes with files attached have isDocument set to 1, and notes without attachments have it set to 0. Since the Annotation entity contains files attached across all other CRM entities, the parent entity is identified by the ObjectTypeCode; for example, the ObjectTypeCode for Account is 1. The ObjectId field in Annotation holds the GUID of the related entity's primary key, so for an Account the AccountId is the ObjectId of the Annotation.

When we create a CRM connection for Annotation, we get two fields, highlighted below: bodybinary and documentbody. When we migrate from CRM to SQL, we need to map the source CRM bodybinary field to vfAttachment in the target. If you map documentbody instead, the files get migrated, but they appear corrupted when you open them. (During a CRM-to-CRM migration, ensure that both the objecttypecode and objectidtypecode fields are mapped, otherwise the migration will fail.)

If we need to pull the attached files from CRM, insert them into a SQL table, and then write the same files out to a Windows folder, things get a bit tricky. What data type should be used for the SQL column that stores the attached file? The data type you need to select is Image. If you select longvarchar, text, or binary, then when the file is written to disk and opened, you get a gibberish file with unreadable content. Please note that even though the data type of the DocumentBody column in the SQL table is defined as Image, Scribe shows the data type as binary. Also, during mapping it is advised to map the documentbody field at the very end. We had encountered this issue of files getting corrupted; the Scribe support team suggested this approach and it worked.

With the above tips, you can ensure successful migration of documents from CRM to SQL and later when writing them to local drives.
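Related to writing the attachments back out to a Windows folder, the sketch below shows the general idea for the documentbody field as returned by the CRM web services: it is a base64-encoded string, so it has to be decoded to raw bytes before the file on disk is readable (which is why storing or writing it without decoding produces the gibberish described above). The output folder and example payload are placeholders, and this is independent of the Scribe mapping itself.

```python
import base64
from pathlib import Path

# Placeholder output folder for the extracted attachments.
OUTPUT_DIR = Path(r"C:\CRM_Attachments")

def write_attachment(filename: str, documentbody_b64: str) -> Path:
    """Decode a CRM annotation's documentbody (base64) and write it to disk."""
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    target = OUTPUT_DIR / filename
    # documentbody is base64 text; writing it without decoding yields an
    # unreadable file, so decode to raw bytes first.
    target.write_bytes(base64.b64decode(documentbody_b64))
    return target

# Example with a tiny made-up payload (the text "hello" encoded as base64).
print(write_attachment("example.txt", base64.b64encode(b"hello").decode()))
```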


Microsoft Visual Studio Team Foundation Server Data inside Power BI

Posted On January 30, 2017

In this blog article, we will explain how to connect Microsoft Visual Studio Team Foundation Server (TFS) to Power BI. Below are the steps to connect:
1. Open your Power BI online account at PowerBI.com.
2. Select the Get Data option from the bottom menu.
3. Click the Get option under Services.
4. Select Visual Studio TFS from the list and click "Get it now".
5. A new connection window appears asking for the account name to connect to TFS. You only need to enter the account name (the part before visualstudio.com) and a specific project name, or * if you want data for all existing projects.
6. After connecting, it asks for authentication, so log in with an ID that has access to TFS.

Finally, you get an out-of-the-box dashboard in Power BI. As the screenshots below show, it provides many individual report pages with information on open bug status, latest version control activity, top updating users, build analysis across projects, and more.

We can also connect to TFS from Power BI Desktop and design reports and charts based on our own requirements. In Power BI Desktop, choose the Visual Studio Team Services option under Online Services. It then asks for the same account and project details of the TFS instance, and we get a list of table objects from which to select data for our graphs and charts.


Moving Data from SQL Server to Excel Using SSIS

Posted On January 16, 2017 by Admin

In this blog article, we will explain how to move data from SQL Server to an Excel file using SQL Server Integration Services (SSIS). To do this, you need to create a package that exports data from SQL Server to Excel.

Step 1: Create a project
Create a project for moving the data. In Visual Studio, go to File > New > Project, choose the Business Intelligence group, select Integration Services, and then click Integration Services Project. Give the project a name and click OK.

Step 2: Create a Data Flow Task
Add a Data Flow Task to the Control Flow tab of your package. Double-clicking the Data Flow Task opens the Data Flow tab.

Step 3: Create the SQL Server source
Our aim is to move data from SQL Server to Excel, so here the source is SQL Server and the destination is Excel. Drag and drop the OLE DB Source component from the Sources section, then right-click the component and click Edit; the editor window opens. Click New, then New again, and select your server name and authentication type. Select the database you are retrieving the data from, then click Test Connection to verify the connection, and click OK twice. Then select the table from the specified database. You can also verify the data by clicking the Preview button, then click OK.

Step 4: Create the Excel destination
Our destination is an Excel file, so drag and drop the Excel Destination component from the Destinations section. Click the OLE DB Source component, drag the blue arrow, and drop it onto the Excel Destination. Right-click the Excel Destination and click Edit. Click New and set the path where you want to create the Excel file, then click OK. Click the other New button to select the Excel sheet name, then click OK twice. From the drop-down, choose the Excel sheet, click Mappings, and then OK.

Your package is now ready to execute. Before executing it, verify the source and destination; at the destination there is now an empty Excel file with the name you specified earlier in the Excel Destination. To execute the package, go to Solution Explorer, and under the project name open SSIS Packages > Package.dtsx. Right-click the .dtsx file and choose Execute Package. It will show the number of rows inserted into the destination, i.e. into the Excel file. Then check the destination: all the data has been moved there.
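For a quick way to spot-check the same move outside SSIS (for example, to verify the source data before building the package), here is a minimal Python sketch that reads a table from SQL Server and writes it to an Excel file. It assumes pandas, SQLAlchemy, the pyodbc ODBC driver, and openpyxl are installed; the server, database, table, and output path are placeholders. This is not part of the SSIS package itself.

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection details; adjust server, database, and driver for your environment.
engine = create_engine(
    "mssql+pyodbc://@MyServer/MyDatabase"
    "?driver=ODBC+Driver+17+for+SQL+Server&trusted_connection=yes"
)

# Read the source table (placeholder name) and write it to an Excel file.
df = pd.read_sql_query("SELECT * FROM dbo.SourceTable", engine)
df.to_excel(r"C:\Exports\SourceTable.xlsx", sheet_name="SourceTable", index=False)

print(f"Exported {len(df)} rows.")
```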

