Form Validation in Power Apps — Part 1
In this blog, we are going to see how to implement form validation in Power Apps. Let's get started. I have designed a Sign-Up page which has the following fields:
1. Email
2. Password and Confirm Password
3. Phone Number

Following is a screenshot of my Sign-Up page built in Power Apps:

Following is the app control structure I followed to create the Sign-Up form:

I have used an HTML text control named "ErrorText" for logging all the validation errors. Now we will set the properties on the ErrorText control so that it displays the error on the form for each input.

Following is an explanation of each validation I have used in this Power App:

Email Validation
Not(IsBlank('ControlName')): validates whether the field is blank.
IsMatch('ControlName'.Text, Match.Email): validates the email format and returns false if the user enters an invalid email address.
Code:

Password and Confirm Password Validation
Not(IsBlank('ControlName')): validates whether the field is blank.
'Confirm Password'.Text = Password.Text: validates whether the Password and Confirm Password values are the same; it returns false if they do not match.
Code:

Phone Number Validation
Not(IsBlank('ControlName')): validates whether the field is blank.
IsMatch('Control'.Text, "your country's phone validation regex"): validates whether the entered phone number is valid using a regular expression. I have used the phone number validation regex for India, "^[6-9]\d{9}$". This regex requires the first digit to be 6, 7, 8, or 9, followed by exactly nine more digits, so the number is ten digits long in total.
Code:

Following is the full code for the validation, which you need to paste in the formula bar of the HtmlText property (a sketch of this formula follows at the end of this post):

Result:
1. Blank field validation for each field: Email, Password, Confirm Password, and Phone Number.
2. Email format validation: after entering a valid email, the error is gone.
3. Password and Confirm Password match validation.
4. Phone number format validation: after entering a valid phone number, the error is gone.

Hope this helps, stay tuned for more blogs.
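The code screenshots from the original post are not reproduced here, so the following is a minimal sketch of what the full ErrorText.HtmlText formula could look like. It assumes the input controls are named EmailInput, PasswordInput, ConfirmPasswordInput, and PhoneInput; adjust these names to match the controls in your own app.

// Sketch of ErrorText.HtmlText. The control names EmailInput, PasswordInput,
// ConfirmPasswordInput and PhoneInput are assumptions, not the ones from the post.
Concatenate(
    If( IsBlank(EmailInput.Text), "Email is required.<br>",
        Not(IsMatch(EmailInput.Text, Match.Email)), "Please enter a valid email address.<br>",
        "" ),
    If( IsBlank(PasswordInput.Text) || IsBlank(ConfirmPasswordInput.Text), "Password and Confirm Password are required.<br>",
        ConfirmPasswordInput.Text <> PasswordInput.Text, "Password and Confirm Password do not match.<br>",
        "" ),
    If( IsBlank(PhoneInput.Text), "Phone number is required.<br>",
        Not(IsMatch(PhoneInput.Text, "^[6-9]\d{9}$")), "Please enter a valid phone number.<br>",
        "" )
)

Each branch returns an empty string when its input is valid, so ErrorText.HtmlText is empty only when the whole form is valid; the formulas in Part 2 rely on that behavior.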
Form Validation in Power Apps — Part 2
In this blog, we will implement the following validations on our Sign-Up form in Power Apps:
1. Disable the Sign-Up button if any invalid input is present on the form.
2. Change the border or fill color of any input that is invalid.

Before moving forward, please check out my previous blog, because this blog is a continuation of it. Click Here

Disable the Sign-Up button if any invalid input is present on the form
1. Select the Sign-Up button on the form.
2. Select the DisplayMode property so that we can change the display mode to Disabled if any invalid input is entered on the form.
3. Add the following in the DisplayMode formula bar: If(ErrorText.HtmlText = "", Edit, Disabled)
This means that when ErrorText is empty the Sign-Up button is enabled; otherwise it is disabled.

Change the border or fill color of any input that is invalid
Now we will add a formula to the Fill or BorderColor property of each input so that the color changes when that input is invalid (a sketch of these BorderColor formulas follows at the end of this post).

Email Input
1. Select the Email input and choose the BorderColor property.
2. In BorderColor, add the following code in the formula bar, so that if the Email input is blank or not in the correct format, the border color of the input changes to red (or whatever color you want).
3. Code:
4. You can see in the following screenshot that when the email is invalid or empty, the input border is red.

Password and Confirm Password Input
1. Select the Password input and choose the BorderColor property.
2. In BorderColor, add the following code in the formula bar, so that if the Password value does not match the Confirm Password input, the border color of the input changes to red.
3. Code:
4. You can see in the following screenshot that when the password and confirm password do not match or are empty, the input border is red.
5. For the Confirm Password field, follow the same steps as for the Password field.

Phone Number Input
1. Select the Phone Number input and choose the BorderColor property.
2. In BorderColor, add the following code in the formula bar, so that if the Phone Number input is blank or not in the correct format, the border color of the input changes to red.
3. Code:
4. You can see in the following screenshot that when the phone number is invalid or empty, the input border is red.

Here we finish our blog. I hope this helps, and stay tuned for more blogs like this. Thank you
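The BorderColor code screenshots are likewise not reproduced here, so below is a minimal sketch of what those formulas could look like, again assuming the controls are named EmailInput, PasswordInput, ConfirmPasswordInput, and PhoneInput (swap in your own control names and colors).

// Email input BorderColor: red when blank or not a valid email (assumed control name EmailInput)
If( IsBlank(EmailInput.Text) || Not(IsMatch(EmailInput.Text, Match.Email)), Color.Red, Color.Black )

// Password and Confirm Password BorderColor: red when blank or when the two values differ (assumed names)
If( IsBlank(PasswordInput.Text) || ConfirmPasswordInput.Text <> PasswordInput.Text, Color.Red, Color.Black )

// Phone Number input BorderColor: red when blank or not matching the India phone regex (assumed name PhoneInput)
If( IsBlank(PhoneInput.Text) || Not(IsMatch(PhoneInput.Text, "^[6-9]\d{9}$")), Color.Red, Color.Black )

The same pattern can be used on the Fill property; just replace Color.Red and Color.Black with the fill colors you want.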
Create Fruit Detection App inside Power Platform — PowerApps and AI Builder
Before you begin with this blog, please read my previous blog, How to Create Object Detection Model inside the Power Platform — PowerApps | AI Builder. In the previous blog, we created the object detection model that we are going to use in this one.

Step 1: Expand the AI Builder section in Power Apps and click on Models. Open the Fruit Detection model that we created in the previous blog. Click on Use model and select Create new app. It will redirect you to Power Apps Studio.

Step 2: You will see that the object detector control is automatically added to the screen. Your application is ready, and you can already run it to detect objects.

Step 3: Now we will insert a gallery control so that we can see the count of each object present in the image. Click on Insert and select Add gallery. If you cannot see the option, expand the ribbon from the top right side. Select Vertical and choose the Title and subtitle layout.

Step 4: Click on the gallery control and set its Items property to ObjectDetector1.VisionObjects. Click on the Title label and set its Text property to ThisItem.displayName. Click on the Subtitle label and set its Text property to ThisItem.count. We will make some changes to the font and alignment of the controls so that the application looks simple, readable, and accessible. (These property settings are summarized at the end of this post.)

Step 5: Click on File and save the Power App. You can give your Power App a name and an icon. Your app is ready; select it and click on Play. Click Detect and it will open your file explorer so you can select an image. If you are using the application on a mobile device, your camera will open instead. Following is a screenshot of the application with the detected objects inside it.

I hope this helps you understand how to create an object detection model and use that model in Power Apps.
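For quick reference, these are the property settings from Step 4 written out as formulas. The control names (ObjectDetector1 and the default gallery labels) are the ones used in the walkthrough above; if your controls are named differently, or your version of the object detector control exposes different output properties, adjust accordingly.

// Gallery Items: the table of objects detected by the object detector control
ObjectDetector1.VisionObjects

// Title label Text inside the gallery: the detected object's name
ThisItem.displayName

// Subtitle label Text inside the gallery: how many instances of that object were detected
ThisItem.count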
Create Object Detection Model inside the Power Platform | Power Apps — AI Builder
In this blog, we are going to see how to create an object detection model that can be used in Power Apps or in MS Flow / Power Automate.

Step 1: Log in to portal.office.com and select PowerApps. If PowerApps is not visible, click on All Apps and you will be able to see it.

Step 2: Expand the AI Builder section in Power Apps and click on Build. Note: please ensure that you have selected the correct instance.

Step 3: Click on the Object Detection model. We are going to create an object detection model that can be used in Power Apps or in MS Flow / Power Automate.

Step 4: Name the AI model and click on Create.

Step 5: Select the domain of the model; we will go with Common objects. Then click on Next.

Step 6: Before moving forward, we will download the data that will be used to train and test the model. Kaggle is a great source of data for training machine learning models. Since we are designing a fruit detection model, we require fruit images: search for fruit images and download the dataset shown in the screenshot. Select the Fruit Images for Object Detection dataset and click on Download.

Step 7: After downloading the zip file, extract it. You will see two folders, train_zip and test_zip. Open the train_zip folder; you can see that there are four categories of images: Apple, Banana, Orange, and Mixed [Apple, Banana, Orange].

Step 8: Let's move back to the Power Apps platform. The second step in creating the object detection model is to define the objects we are going to detect. Here we have three objects: Apple, Banana, and Orange. Click on Next to move to the next step.

Step 9: We require a minimum of 15 images of each category to train our object detection model. Click on Add images and select Upload from local storage.

Step 10: Select the images. Once all images are uploaded, click on Close, then click on Next.

Step 11: Now we are in the most important phase of training, where we tag or label the images we have uploaded. Click on an uploaded image and select the area where the object is present. Once you select the area, you will get the option to choose which object is present in the selected area. If an image is not suitable for the model, click on "Don't use image" and then click on Remove to remove that particular image from the model's data set.

Step 12: Once you are done tagging or labeling the images, click on Done tagging. Note: please ensure that you have enough tagged images for each object so that your model works accurately. The more labeled data you have, the more accurate your machine learning model will be.

Step 13: Click on Train to start training your model on the tagged images, then click on Go to models. Training takes around 5 to 10 minutes, depending on the complexity of the images, the model, and the objects.

Step 14: Publish the model. Once your model has finished training and you have published it, you can use it in Power Apps or MS Flow. We are going to see that in the next part of this blog.
Resolve the dependency between multiple solutions in D365 Customer Engagement / CRM Solution using Solution Component Mover.
You might have a question in your mind: why do we need to move components from one solution to another in D365 Customer Engagement?

Let's consider a scenario: you and your team are working on D365 CRM customization and have created two solutions, 'Solution A' and 'Solution B'. During development, when you try to move 'Solution A' to the Production instance, the import fails because some of its missing components are present in 'Solution B'. You then decide to move 'Solution B' first, but that import also fails because some of its missing components are present in 'Solution A'. It means 'Solution A' and 'Solution B' are dependent on each other, and you cannot move either solution to the Production or target environment.

There are two solutions to the above problem:
1. Add the missing components to one solution and move that solution to production.
2. Merge the dependent solutions into one solution using Solution Components Mover.

The first option is time-consuming and effort-intensive, as developers need to track all the missing components and add them manually. Using Solution Components Mover, you can merge solutions in 10 to 15 minutes just by selecting the source solutions and the target solution to which the components need to move. So, let us see how to do it.

Prerequisites: XrmToolBox. You can download XrmToolBox from https://www.xrmtoolbox.com/

Steps to follow:
1. Open XrmToolBox and connect to your D365 CRM environment.
2. Search for the plugin "Solution Components Mover". Image: Search Solution Components Mover in XrmToolBox
3. Once the plugin has loaded, click on Load solutions; it will load all the solutions present in the environment. Image: Click on Load Solutions. After the solutions are loaded, you can see I have two solutions in my environment, "Solution A" and "Solution B", which have dependent components, and one "Target Solution" into which I am going to copy the components so that "Target Solution" becomes the master solution.
Solution A. Image: Solution A has the Account entity and its subcomponents.
Solution B. Image: Solution B has the Case entity and its subcomponents.
Target Solution. Image: Target Solution does not have any entity or component.
4. Move the solution components by selecting the source solutions and the target solution.
5. Click on Copy components; a popup will open where you can select the component types to move to the target solution.
6. Click on "OK" and the components from both solutions will be moved to the Target Solution. You can see in the following screenshot that the Target Solution now has components from both "Solution A" and "Solution B".

You can see how this XrmToolBox plugin helps reduce the time and effort required to move components from solution to solution manually, one by one.
Test Automation on MS Dynamics CRM using TypeScript Library
Nowadays, test automation is one of the most important requirements for any company that wants to ensure the quality of its product or software. There are very few tools available for test automation of MS Dynamics CRM. In this blog, we are going to see how we can do test automation using the TypeScript library D365-UI-Test. It is an open-source library, so you can edit it as per your requirements, and if you find any issue you can raise it on the GitHub repo; Florian Krönert will resolve the issue as soon as possible.

GitHub repo link: https://github.com/digitalFlow/D365-UI-Test

Setup and Installation:
We will use Visual Studio Code because it is a lightweight code editor with integrated Git support. We will also require Git to clone the repository, and we can run these operations directly from the VS Code editor itself.

Installation process:
1. First, we require the D365-UI-Test source files, so we need to clone the remote GitHub repo to the local machine, into your working project directory. To open the terminal inside VS Code, use the keyboard shortcut Ctrl + `. In the terminal, run the following Git command to clone the repository:
git clone https://github.com/DigitalFlow/D365-UI-Test.git
2. After cloning the repo, we need to install the Node packages required to run the project. To install them, navigate to the project directory in VS Code and run the following command:
npm install
The "npm install" command installs the required Node packages with the specific versions defined in the "package.json" file.
3. After all the required packages are installed, we can see the sample test cases written in TypeScript in the "spec" folder. To write your own test cases, add them there and run the project. Following is the folder structure of the D365-UI-Test project after setup. In the screenshot you can see the "spec" folder, where we can add our own test cases; it also contains sample test cases for the UCI in "xrm-uci-ui-test.spec.ts", written in TypeScript.
4. Now we must add a settings.txt file to the folder structure so that we can pass the CRM login credentials to the test case file, or else you can pass the credentials directly in the test case itself. After adding the settings file, we need to give the relative or absolute path of the settings file, as follows:
5. Following is a test case that logs in to the CRM instance and creates an Account record:
6. To run the tests, open the integrated terminal in Visual Studio Code (press Ctrl + `) and run the following command:
npm run test
It will run all the tests and show which tests passed and which failed. If a test fails, it shows the error explaining why. Once the tests are started from the terminal, a Chromium browser opens and the tests begin to execute.

You can see the documentation of the library on GitHub or at the following link: https://digitalflow.github.io/D365-UI-Test/
How to make the same record available in two different Organizations / Environments? Part 2
Please refer to my previous blog to better understand why the same records are required in two different organizations or environments. Click Here

Import Phase:
1. Log in to the destination environment into which you want to import the data with the same GUIDs.
2. In the destination environment, open the entity view and click on the three dots on the right for the import options.
3. Click on the ">" on the right side of "Import from Excel", then click on "Import from CSV".
4. You will get the screen shown below; select the file and click on Next.
5. You will get the screen shown in the screenshot below. Now click on Review Mapping.
6. After clicking on Review Mapping, you must map the column headers to the fields of the entity. Here comes the most important part of this blog: you have to map the column that holds the GUID of the record to the entity name present in the destination environment. Here I have mapped the Customer header (which holds the GUID) to Customer (the entity name). You can ignore the mapping of the "Created On" date, or else you can map it to "Record Created On".
7. Click on Finish Import and see the magic. All the data will be imported with the same GUIDs as in the source environment. You can see the imported data in the following screenshot.

To check whether the GUIDs of the records present in both environments are the same, just export the data from both environments and match the GUID of each record. Below you can see that the GUIDs of the imported and exported records are the same. Following is the exported data from the source environment. Following is the exported data from the destination environment.
How to make the same record available in two different Organizations / Environments? Part 1
Why is it required? Let's discuss a scenario where we would need the same records in multiple environments. Let's say we have a workflow or flow which is configured to run against specific records. If we move the flow or workflow to another organization or instance, the same records are not available there with the same GUIDs, so to run the flow or workflow we must modify it. Now, if we have 10 to 20 flows or workflows like this, it becomes very time-consuming. So the best solution is to have the same records with the same GUIDs in multiple instances.

Prerequisite: the same entity and fields must be present in both systems between which you are transferring records; otherwise the import to the destination environment will throw an error.

Solution:
Exporting Phase:
1. Log in to the source environment from which you want to export the data.
2. Open the entity whose data needs to be migrated. You can see I have an entity called Customer which has a few records present in the system.
3. Now you need to export the data from the source environment. Export the data from Advanced Find or directly from the view.
4. After exporting the data from the system, an Excel file will be downloaded with the same columns as in the view.
5. Now open the exported data file; the following is a screenshot of the file. You can see there are hidden columns in Excel: A, B, and C.
6. To unhide the hidden columns, select all the data (or press Ctrl + A), go to Format in the Home section, and under the Visibility section click on Hide & Unhide > Unhide Columns. Please see the following screenshot for reference. Now you can see all three columns: (Do Not Modify) Entity Name, (Do Not Modify) Row Checksum, and (Do Not Modify) Modified On. You can see the previously hidden columns in the screenshot after unhiding.
7. Let's move forward. We must delete the (Do Not Modify) Row Checksum and (Do Not Modify) Modified On columns because we do not need them while importing into another environment; if we import with those columns, the import operation will throw an error.
8. Change the header from (Do Not Modify) Entity_name to Entity_name for convenience while mapping during import, and save the file as CSV.

Now your file is ready to import into another system. Let's go to the importing phase.
How to avoid the reposting of Old Email Activities on the Activity Timeline of a Lead when the Lead is Assigned to a new Owner
Customer service is the most important element of today's business. If you do not take good care of your customers, they will not stay loyal to your brand or products, so you need to do everything possible to provide them with the best service. Unfortunately, a lot of companies struggle with this these days. Dynamics 365 for Customer Service from Microsoft is a great product because it helps you, as a business, focus on the right aspects of your business. It will not only streamline the processes in your company but also help you become more efficient in addressing the issues that concern your customers most. Installing and using this product is easy, but there are certain things that can be confusing. Here is a blog explaining how to tackle one problem you may face.

Problem: Whenever we assign a new owner to a Lead present in CRM, the Last Date Modified of its Email activities gets changed to the date and time when the new owner is assigned. Because of this, all the old Email activities with the previous owner get re-posted to the Activity Timeline of the Lead, since email activities are sorted by Last Date Modified by default.

This situation happens because of the 1:N relationship present in CRM between the Lead and Email entities, which has the "Parental" type of behavior. So, when we assign a new owner, some of the fields on the related emails get modified.

Following are the steps to find the Lead-to-Email relationship:
1. Go to the solution and find the Lead entity.
2. Click on the Lead entity and then open 1:N Relationships.
3. Now open the Lead Email relationship; it will look like the following.

Now we will change the current behavior of the relationship so that Email activities are not re-posted whenever a new owner is assigned to the Lead.

Steps to change the relationship behavior:
1. Change the Type of Behavior to Configurable Cascading under the Relationship Behavior section.
2. After changing the behavior, change Cascade All to Cascade None in the Assign field. This prevents the owner of the Email activities, as well as the Last Date Modified of the Lead's old Email activities, from changing when the Lead is assigned to a new owner.

Following is the before and after of the configuration change.

If we don't change the relationship behavior, the following is the scenario:
1.1 Lead Email relationship configuration
1.2 Before assigning the owner
1.3 After assigning a new owner

If we change the relationship behavior from Parental to Configurable Cascading and Assign (Cascade All) to Assign (Cascade None):
1.1 Lead Email relationship configuration
1.2 Before assigning the owner
1.3 After assigning the owner

Hope this helps!