
Category Archives: Blog

Migrating Data from Azure Files Share to Azure Blob Storage Using C#

For growing businesses, efficient data management is as critical as streamlined processes and actionable reporting. As organizations scale, the volume and complexity of data stored in systems like Azure file shares increase, calling for robust, scalable storage solutions like Azure Blob Storage. Are you struggling to manage your file storage efficiently? If you're looking to automate data migration from an Azure file share to Azure Blob Storage using C#, this article is for you.

Research shows that 70% of customers value seamless experiences with efficient systems, impacting brand loyalty. Businesses automating data management processes can reduce retrieval times by up to 90%, while organizations leveraging cloud storage solutions like Azure Blob Storage report a 25% increase in operational productivity and 60% improved satisfaction in data workflows. This article provides a structured guide to migrating data using C#, drawing on practical implementation insights to help Team Leads, CTOs, and CEOs optimize their data storage for scalability and efficiency.

Why Migrate to Azure Blob Storage?

Azure Files offers managed file shares via the Server Message Block (SMB) protocol, suitable for traditional file system needs. Azure Blob Storage, however, excels in scalability, cost efficiency, and integration with advanced Azure services like Azure Data Lake and AI/ML workloads.

Migrating Data Using C#: A Step-by-Step Approach

To migrate data from an Azure file share to Azure Blob Storage programmatically, you can leverage C# with the Azure SDKs. Below is a structured approach, referencing a C# implementation that uses a timer-triggered Azure Function to automate the process.

Step 1: Set Up Your Environment

Step 2: Design the Migration Logic. The C# code uses an Azure Function triggered on a schedule (e.g., every 5 seconds) to process files.

Step 3: Execute the Migration

Step 4: Optimize and Automate

Step 5: Validate and Test

A Glimpse of the C# Implementation

The C# code leverages an Azure Function to automate migration. It connects to the file share, enumerates files, uploads them to a blob container, and deletes them from the source upon successful transfer. This approach ensures minimal manual intervention and robust error handling, aligning with the needs of growing businesses. A minimal sketch of this pattern appears at the end of this post.

Benefits of Programmatic Migration

To conclude, migrating data from an Azure file share to Azure Blob Storage using C# empowers growing businesses to achieve scalable, cost-efficient, and automated data management. By implementing a structured approach with Azure Functions, you can streamline operations and unlock advanced analytics capabilities. Evaluate your current data management processes and identify one area for improvement, such as automating file transfers with C#. Start today to enhance efficiency and customer satisfaction. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
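For illustration, here is a minimal sketch of the timer-triggered function described above. This is not the original implementation: it assumes the file share and blob container live in the same storage account, uses placeholder names ("source-share", "target-container"), walks only the share's root directory, and relies on the Azure.Storage.Files.Shares and Azure.Storage.Blobs SDKs in an in-process Azure Function.

using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Files.Shares;
using Azure.Storage.Files.Shares.Models;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class FileShareToBlobMigration
{
    // Assumption: one connection string covers both the source file share
    // and the target blob container.
    private static readonly string ConnectionString =
        Environment.GetEnvironmentVariable("StorageConnectionString");

    [FunctionName("MigrateFilesToBlob")]
    public static async Task Run(
        [TimerTrigger("*/5 * * * * *")] TimerInfo timer, // every 5 seconds, per the article
        ILogger log)
    {
        var share = new ShareClient(ConnectionString, "source-share");                 // placeholder
        var container = new BlobContainerClient(ConnectionString, "target-container"); // placeholder
        await container.CreateIfNotExistsAsync();

        ShareDirectoryClient root = share.GetRootDirectoryClient();
        await foreach (ShareFileItem item in root.GetFilesAndDirectoriesAsync())
        {
            if (item.IsDirectory) continue; // this sketch walks the root folder only

            ShareFileClient file = root.GetFileClient(item.Name);

            // Stream the file's content into Blob Storage...
            using (var stream = (await file.DownloadAsync()).Value.Content)
            {
                await container.GetBlobClient(item.Name).UploadAsync(stream, overwrite: true);
            }

            // ...and delete it from the share only after a successful upload.
            await file.DeleteAsync();
            log.LogInformation("Migrated {FileName}", item.Name);
        }
    }
}

A production version would recurse into subdirectories, handle large files in chunks, and add retry logic before deleting source files.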


US-Based Non-Profit Organization Partners with CloudFronts for a Managed Services Agreement   

We are pleased to announce that a leading US-based non-profit organization has partnered with CloudFronts for Dynamics 365 support and maintenance under a Managed Services Agreement (MSA).

Founded in 2010, the organization is headquartered in San Francisco, California, with additional offices in Amsterdam, Venlo, and Raleigh, North Carolina. It is dedicated to advancing sustainable product design through its Certified™ program, which emphasizes material health, product circularity, renewable energy, water stewardship, and social fairness. By supporting global organizations, the non-profit plays a key role in creating safer, recyclable, and more circular products that contribute to a sustainable future.

On this occasion, Priyesh Wagh, Practice Manager at CloudFronts, stated: "Our first project with our client established a great way of working together, and we saw how we could take this implementation ahead and generate value through our work together. We look forward to building systems that ease their customer service efforts."

"Discover How We've Enabled Businesses Like Yours – Explore Our Client Testimonials!"

About CloudFronts

CloudFronts is a global AI-first Microsoft Solutions Partner for Business Applications, Data & AI, helping teams and organizations worldwide solve their complex business challenges with Microsoft Cloud, AI, and Azure Integration Services. We have a global presence with offices in the U.S., Singapore, and India.

Since its inception in 2012, CloudFronts has successfully served over 200 small and medium-sized clients across North America, Europe, Australia, MENA, the Maldives, and India, with diverse experience in sectors ranging from Professional Services, Financial Services, Manufacturing, Retail, and Logistics/SCM to Non-profits.

Please feel free to connect with us at transform@cloudfronts.com.



Understanding Legal Entities, Companies, and Organizational Hierarchies in Dynamics 365 Finance and Operations

If you're just starting with Dynamics 365 Finance and Operations and are confused about what Legal Entities, Companies, and Organizational Hierarchies mean, you're not alone! Let's break it down in simple terms.

What is a Legal Entity?

In Dynamics 365, a Legal Entity is an organization that can enter into legal contracts and is required to prepare statements that report on its performance. Think of a Legal Entity as a registered company or business under the law. Microsoft Docs reference: Legal entities overview.

What is a Company in Dynamics 365 Finance & Operations?

Each Legal Entity is also referred to as a Company in the system. In the interface, you switch between Companies (Legal Entities) using a 4-character company ID (like USMF or INMF). Tip: even if you manage multiple companies (e.g., one in India, one in the US), Dynamics 365 can consolidate and report across them — provided they are set up as separate legal entities.

What are Organizational Hierarchies?

This is where the real power lies! Organizational Hierarchies define how different parts of your business interact and report to one another, and you can set up separate hierarchies for different purposes, such as reporting, security, and internal controls. Example: a retail chain may have a parent legal entity, and underneath, different divisions like wholesale, online store, and physical stores — all structured in a hierarchy. Microsoft Docs reference: Organizational hierarchies.

Real-World Example

Let's say you're working for a construction company that operates in three countries. You'd set up each country as a Legal Entity (Company). Now, if you want consolidated reporting and shared oversight across those companies, Organizational Hierarchies let you define that.

What Can Be Shared Across Legal Entities?

Microsoft allows some data to be shared across companies; see the Microsoft Docs topic Data sharing and integration.

To conclude, if you're evaluating Dynamics 365 Finance and Operations and wondering how to structure your organization within the system, we'd love to help you design it the right way. Whether you're a startup expanding internationally or an enterprise optimizing operations, your legal entity and organizational structure are the foundation of your Dynamics 365 system. Let's build that foundation together. You can reach out to us at transform@cloudfronts.com.


Power Pages + Power Automate: Retrieve SharePoint Files via HTTP Trigger Flow

When building a Power Pages site to fetch SharePoint files, I initially relied on the official Power Pages flow trigger—"When Power Pages calls a flow." However, the flow didn't trigger reliably when initiated from the site. Despite proper configuration, the trigger wouldn't execute consistently, leading to broken file-fetch operations. To overcome this, I replaced the unreliable trigger with a Power Automate flow using an HTTP request trigger. This method allowed me to invoke the flow through a JavaScript function executed on page load, passing the required record ID dynamically. The HTTP approach not only worked reliably but also gave me more control over the request and response. This blog post outlines that workaround, from setting up the HTTP-triggered flow to integrating it seamlessly with Power Pages.

Background and the Problem

Power Pages provides a native trigger to call Power Automate flows. While ideal in theory, this approach often fails in practice. After spending time investigating these issues without consistent results, I decided to switch to a more controlled and universally reliable method using an HTTP trigger.

My Workaround – HTTP Trigger Flow

Power Automate flow setup:

1. Trigger: start with the "When an HTTP request is received" trigger. Define the request schema to accept a recordId — in this case, an orderId.
2. Compose (Request Variables): add a Compose action to extract the incoming ID.
3. List Rows – Document Locations: use Dataverse → List rows to retrieve the SharePoint Document Location related to the Order (based on the passed orderId). This assumes your files are stored in folders linked to Dataverse records.
4. Condition – Check If Folder Exists: use a Condition to check whether any document location was found. If a record exists, proceed; if no records are found, terminate the flow (the folder doesn't exist).
5. Compose – Relative URL and Compose – Folder Path: build the relative URL and combine it into the full folder path.
6. Get Files (SharePoint): use the SharePoint "Get files (properties only)" action with the dynamic path set to the DocumentPath variable.
7. Return Response: format the SharePoint file metadata (Name, Link, Type) and send it back using the Response action.

On the site, a JavaScript function executed on page load calls the flow's HTTP endpoint with the record ID; the same call can be made from any HTTP client, as the sketch at the end of this post shows.

To conclude, if you've faced reliability issues with native Power Pages flow triggers, the HTTP request method can be a game-changer. It enabled me to deliver a seamless experience for retrieving SharePoint files, and it can do the same for your project. In future iterations, I plan to enhance this by adding bearer-token authentication and caching metadata for even faster performance. Want to integrate Power Automate flows reliably with Power Pages? Reach out to CloudFronts—we help businesses implement scalable, reliable Power Platform solutions every day. You can contact us directly at transform@cloudfronts.com.
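For illustration, here is a minimal sketch of invoking such an HTTP-triggered flow from a client, shown in C# since the exact JavaScript used on the page is not reproduced in this post. The flow URL is a placeholder, and the request body (JSON carrying recordId) follows the trigger schema described above.

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class SharePointFilesFlowClient
{
    // Placeholder: the invoke URL generated by the
    // "When an HTTP request is received" trigger.
    private const string FlowUrl =
        "https://prod-00.westus.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke?api-version=2016-06-01&sig=<signature>";

    public static async Task<string> GetFilesAsync(string orderId)
    {
        using var client = new HttpClient();

        // The trigger schema expects a JSON body carrying the record ID.
        var payload = new StringContent(
            $"{{\"recordId\":\"{orderId}\"}}", Encoding.UTF8, "application/json");

        HttpResponseMessage response = await client.PostAsync(FlowUrl, payload);
        response.EnsureSuccessStatusCode();

        // The flow's Response action returns file metadata (Name, Link, Type) as JSON.
        return await response.Content.ReadAsStringAsync();
    }
}

In the Power Pages scenario above, the equivalent request is issued from browser JavaScript on page load, with the record ID read from the page context.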


Top 5 Ways to Integrate Microsoft Dynamics 365 with Other Systems 

When it comes to Microsoft Dynamics 365, one of its biggest strengths—and challenges—is how many ways there are to integrate it with other platforms. Whether you're syncing with an ERP, pushing data to a data lake, or triggering notifications in Teams, the real question becomes: which integration method should you choose?

In this blog, we'll break down the top 5 tools used by teams around the world to integrate Dynamics 365 with other systems. Each has its strengths, and each fits a different type of use case.

1. Power Automate – Best for Quick, No-Code Automations

What it is: A low-code platform built into the Power Platform suite.
When to use it: Internal automations, approvals, email notifications, basic integrations.
Lesser-known tip: Power Automate runs on two plans—per user and per flow. If you have dozens of similar flows, the "per flow" plan can be more cost-effective than individual licenses.
Advanced feature: You can call Azure Functions or hosted APIs directly within a flow, effectively turning it into a lightweight integration framework.
Example: When a new lead is created in D365, send an email alert and create a task in Outlook.

2. Azure Logic Apps – Best for Scalable Integrations

What it is: A cloud-based workflow engine for system-to-system integrations.
When to use it: Large-scale or backend integrations, especially when working with APIs.
Lesser-known tip: Logic Apps come in two flavours—Consumption and Standard. The Standard tier offers VNet integration, local development, and built-in connectors at a flat rate, which is ideal for predictable, high-throughput scenarios.
Advanced feature: Use Logic Apps' built-in Integration Account to manage schemas, maps, and certificates for B2B scenarios (AS2, X12).
Example: Sync Dynamics 365 opportunities with a SQL database in real time.

3. Data Export Service / Azure Synapse Link – Best for Analytics

What it is: Tools to replicate D365 data into Azure SQL or Azure Data Lake.
When to use it: Advanced reporting, Power BI, historical data analysis.
Lesser-known tip: Data Export Service is being deprecated in favour of Azure Synapse Link, which provides both near-real-time and "materialized view" patterns. You can even write custom analytics in Spark directly against your live CRM data.
Advanced feature: With Synapse Link, you can enable the change data feed and query Delta tables in Synapse, unlocking time-travel queries for historical analysis.
Example: Export all account and contact data to Azure Synapse and visualize KPIs in Power BI.

4. Dual-write – Best for D365 F&O Integration

What it is: A Microsoft-native framework to connect D365 CE (Customer Engagement) and D365 F&O (Finance & Operations).
When to use it: Bi-directional, real-time sync between CRM and ERP.
Lesser-known tip: Dual-write leverages the Common Data Service pipeline under the covers—so any customization (custom entities, fields) you add to Dataverse automatically flows through to F&O once you map it.
Advanced feature: You can extend dual-write with custom Power Platform flows to handle pre- or post-processing logic before records land in F&O.
Example: Automatically sync customer and invoice records between D365 Sales and Finance.

5. Custom APIs & Webhooks – Best for Complex, Real-Time Needs

What it is: Developer-driven integrations using HTTP APIs or Dynamics 365 webhooks.
When to use it: External systems, fast processing, custom business logic.
Lesser-known tip: Dynamics 365 supports registering multiple webhook subscribers on the same event, so you can chain independent systems (e.g., call your middleware, then a monitoring service) without writing code.
Advanced feature: Combine webhooks with Azure Event Grid for enterprise-grade event routing, retry policies, and dead-lettering.
Example: Trigger an API call to a shipping provider when a case status changes to "Ready to Ship."
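As a hedged illustration of option 5 (not code from this article), the sketch below shows a webhook subscriber as an HTTP-triggered Azure Function. Dynamics 365 webhooks POST the execution context as JSON to the registered endpoint; the function name and the commented shipping-provider call are hypothetical.

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json.Linq;

public static class CaseStatusWebhook
{
    [FunctionName("CaseStatusWebhook")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        // Dynamics 365 posts the RemoteExecutionContext serialized as JSON.
        string body = await new StreamReader(req.Body).ReadToEndAsync();
        JObject context = JObject.Parse(body);

        log.LogInformation("Received {Message} on {Entity}",
            (string)context["MessageName"], (string)context["PrimaryEntityName"]);

        // Hypothetical downstream call: notify a shipping provider when the
        // case status indicates "Ready to Ship".
        // await shippingClient.CreateShipmentAsync(...);

        return new OkResult();
    }
}

Such an endpoint would be registered against the Case entity's Update message (for example, via the Plug-in Registration Tool), optionally with Azure Event Grid in between for retries and dead-lettering, as noted above.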
To conclude, Microsoft Dynamics 365 gives you a powerful set of integration tools, each designed for a different type of business need. Whether you need something quick and simple (Power Automate), enterprise-ready (Logic Apps), or real-time and custom (webhooks), there's a solution that fits.

Take a moment to evaluate your integration scenario. What systems are involved? How much data are you moving? What's your tolerance for latency and failure?

If you're unsure which route to take, or need help designing and implementing your integrations, reach out to our team for a free consultation. Let's make your Dynamics 365 ecosystem work smarter—together.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Automated Email Reminders Based on Date Fields in Power Automate for Dynamics 365 CRM

Managing reminders and deadlines can be tricky, especially when you're juggling multiple tasks in Dynamics 365. But what if you could set up automatic email reminders based on specific dates? In this guide, I'll show you how to use Power Automate with D365 CRM to send automatic email reminders when certain dates are entered, and to follow up at 7, 14, 21, and 28-day intervals if another related date field isn't filled. By the end of this post, you'll know how to create a simple workflow that keeps your team on track by sending timely reminders when needed.

The Use Case: Automatic Email Reminders for Unfilled Dates

Imagine this: your Project Manager fills in a date for a project milestone, but the next milestone isn't updated after a set period. Your system will automatically send reminder emails to the right people. This saves your team from having to manually follow up and ensures that important dates are never overlooked.

Key Components of the Solution

Follow the Power Automate steps outlined below:

1. Select the "When a row is added, modified or deleted" trigger from the Dataverse connector, set the Change Type to Modified, choose the Order Fulfillments table, set the Scope to Organization, specify the columns cf_submittalprotocolprocess and cf_initialshopdrawingssubmitted to trigger only on changes to those fields, and optionally use the Filter Rows field to apply additional conditions if needed.

2. Click the ellipsis (three dots) on the top-right corner of the trigger card, select Settings, scroll to the Trigger Conditions section, and enter the following expression to ensure the flow only triggers when cf_initialshopdrawingssubmitted is not empty:

@and(
    not(empty(triggerOutputs()?['body/cf_initialshopdrawingssubmitted'])),
    equals(triggerOutputs()?['body/cf_submittalprotocolprocess'], 979570001)
)

3. Add the "Get a row by ID" action from the Dataverse connector, set the Table Name to Opportunities, and use the dynamic value Opportunity (Value) for the Row ID to retrieve the Opportunity record related to the modified Order Fulfillment.

4. Add another "Get a row by ID" action from the Dataverse connector, set the Table Name to Order Fulfillments, and use the dynamic value Order Fulfillment for the Row ID to retrieve the full details of the modified Order Fulfillment record.

5. Add a "Compose" action named Comments for Email, and provide a formatted list including Project Manager (Order Fulfillment), Opportunity Contact, Secondary Contact, a static email (e.g., testblog@gmail.com), and again Project Manager (Order Fulfillment) as an example.

6. Add a "Filter array" action, set the From field to a coalesce(...) expression generating the list of participants, and in Basic Mode set the condition to "item is not equal to null" to remove null entries.

7. Use a coalesce(createArray(...)) expression to conditionally construct an array of activity parties based on field availability (Opportunity Contact, Secondary Contact, Owner ID, Project Manager), falling back to a default address (CRMAdmin@gmail.com) if the Project Manager is null:
coalesce(
    createArray(
        if(
            not(equals(outputs('Get_Opportunity_by_ID')?['body/_cf_opportunitycontact_value'], null)),
            json(concat(
                '{"participationtypemask": 2, "partyid@odata.bind": "contacts(', outputs('Get_Opportunity_by_ID')?['body/_cf_opportunitycontact_value'], ')"}'
            )),
            null
        ),
        if(
            not(equals(triggerOutputs()?['body/_ow_secondarycontact_value'], null)),
            json(concat(
                '{"participationtypemask": 2, "partyid@odata.bind": "contacts(', triggerOutputs()?['body/_ow_secondarycontact_value'], ')"}'
            )),
            null
        ),
        if(
            not(equals(triggerOutputs()?['body/_ownerid_value'], null)),
            json('{"participationtypemask": 3, "addressused": "testblog@gmail.com"}'),
            null
        ),
        if(
            not(equals(outputs('Getting_Order_Fulfillment_by_ID')?['body/_cf_projectmanager_value'], null)),
            json(concat(
                '{"participationtypemask": 4, "partyid@odata.bind": "systemusers(', outputs('Getting_Order_Fulfillment_by_ID')?['body/_cf_projectmanager_value'], ')"}'
            )),
            null
        ),
        if(
            not(equals(outputs('Getting_Order_Fulfillment_by_ID')?['body/_cf_projectmanager_value'], null)),
            json(concat(
                '{"participationtypemask": 1, "partyid@odata.bind": "systemusers(', outputs('Getting_Order_Fulfillment_by_ID')?['body/_cf_projectmanager_value'], ')"}'
            )),
            json('{"participationtypemask": 1, "addressused": "CRMAdmin@gmail.com"}')
        )
    )
)

Switch to Advanced Mode in the Filter array and use the expression @not(equals(item(), null)) for better control over null filtering of the dynamic participant list. I also used a "Compose" action to extract the Project Manager's ID, which is then used in the Filter array step, and saved the body of the Filter array in a new Compose action named "Email Participants".

Next, add a "Compose" action to generate the email body using HTML formatting, apply an if() expression to dynamically insert the recipient's name, and use concat() to list the required items for fabrication. Below is the expression I used to render the recipient's name in the email body:

if(
    not(equals(outputs('Get_Opportunity_by_ID')?['body/_cf_opportunitycontact_value@OData.Community.Display.V1.FormattedValue'], null)),
    concat('<strong>', outputs('Get_Opportunity_by_ID')?['body/_cf_opportunitycontact_value@OData.Community.Display.V1.FormattedValue'], '</strong>'),
    null
)

After completing the previous steps, add a parallel branch with four parallel actions, each configured to send the email after a delay of 7, 14, 21, and 28 days, respectively. After the 7-day delay, retrieve the corresponding Order Fulfillment record by ID and check whether both the Drawing Approval Date and Redline Issued Date fields have been updated since the initial trigger.

Then add the "Add a new row" action from the Dataverse connector, set the Table Name to Email Messages, populate the Activity Parties field using the dynamic output outputs('Email_Participants'), and map the Description field with the output value containing the email body content. Set the Regarding (Order Fulfillments) field to the dynamic value Order Fulfilment (cf_orderfulfillment) to associate the email with the corresponding Order Fulfillment record.
This is how the final Power Automate flow looks; you can configure the 14-, 21-, and 28-day branches in the same way. The trigger fires when the Initial Shop Drawings Submitted field contains a date and the Submittal/Protocol Process field equals the specified option. If both the Redline Int Received and Drawing Approved Date fields remain empty after 7, 14, 21, and 28 days, CRM will automatically send a follow-up email on each of those days.

To conclude, setting up this automatic reminder system in Power Automate for D365 CRM will help your team stay on top of project milestones, reduce manual follow-ups, and make sure nothing gets overlooked. It's a simple yet effective way to automate reminders and keep everyone informed without any extra effort. Hope this helps! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Ensuring Audit Compliance with Workflows in Dynamics 365

This blog outlines the steps required to ensure audit compliance within Microsoft Dynamics 365 Finance and Operations using workflow configurations, database logging, and segregation of duties rules. The goal is to provide a comprehensive record of transaction approvals and status changes.

1. Configure workflow approvals
Location: Organization Administration > Workflow > Workflow Editor
Description: This section displays the workflow design screen, highlighting steps like review and approve, including role assignments and conditions.

2. Enable database logs for workflow tracking
Location: System Administration > Links > Database > Database Log Setup
Description: Enables database logging for critical tables and fields related to workflow status changes.

3. View and export workflow history
Location: System Administration > Inquiries > Workflow History and Tracking
Description: Displays workflow instances, status changes, and timestamps, and provides export capabilities.

4. Segregation of duties compliance
Location: System Administration > Security > Segregation of Duties Rules
Description: Shows configured rules and potential role conflicts for review and action.

To conclude, integrating workflows in D365 is not just about meeting audit requirements—it also drives operational efficiency, improves data governance, and strengthens organizational integrity. By embedding compliance into daily business processes, companies can proactively manage risk and build a strong foundation for sustainable growth. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Using Open AI and Logic Apps to develop a Copilot agent for Elevator Pitches & Lead Qualification

In today's competitive landscape, the ability to prepare quickly and deliver relevant, high-impact sales conversations is more critical than ever. Sales teams often spend valuable time gathering case studies, reviewing past opportunities, and preparing client-specific messaging — time that could be better spent engaging prospects. To address this, we developed "Smart Pitch" — a Microsoft Teams-integrated AI Copilot designed to equip our sales professionals with instant, contextual access to case studies, opportunity data, and procedural documentation.

Challenge

Sales professionals routinely face hurdles that not only slow down the sales cycle but also affect the consistency and quality of conversations with prospects.

How It Works

Smart Pitch is built on Microsoft's AI ecosystem and pulls information from several internal and public knowledge sources.

Key Features

MQL – SQL Summary Generator

Users can request an MQL – SQL document. The copilot prompts the user to provide the prospect name, contact person name, and client requirement; this is achieved via an adaptive card for better UX. The request is then sent via HTTP to a Logic App, where we used the ChatGPT API to fetch company and client information (a sketch of an equivalent call appears at the end of this post). From the company information we extract the company location and, similarly, the industry, and render the result back to the custom copilot via the Logic App's response.

A Generative Answers node displays the results with proper formatting via prompt/agent instructions. The generative AI can also be instructed to directly create formatted JSON based on the parsed values. This formatted JSON is then converted to an actual JSON object and used to populate a Liquid template for the MQL-SQL file, dynamically creating an MQL-SQL document for every searched company and contact person. This returns an HTML file populated with company and contact details, similar case studies, and past work with clients in a similar region and industry, and it triggers an automatic download of the generated MQL-SQL as a PDF file on your system.

Content Search

Users can ask questions related to:

1. Case Study FAQ: Helps users ask questions about client success stories and project case studies, retrieves relevant information from a knowledge source, and offers follow-up FAQs before ending the conversation. The CloudFronts official website is used for fetching case study information.
2. Opportunities: Helps users inquire about past projects or opportunities, detailing client names, roles, estimated revenue, and outcomes.
3. SOPs: Provides quick answers and summaries for frequently asked questions related to organizational processes and SOPs.

"Smart Pitch" searches SharePoint documents, public case studies, and the opportunity table to return relevant results — structured and easy to consume.

Security & Governance

Smart Pitch is integrated into Microsoft Teams, so it uses the same authentication as Teams. Access to Dataverse and SharePoint is read-only and scoped to organizational permissions.

To conclude, Smart Pitch reflects our commitment to leveraging AI to drive business outcomes. By combining Microsoft's AI ecosystem with our internal data strategy, we've created a practical and impactful sales assistant that improves productivity, accelerates deal cycles, and enhances client engagement. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
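The Logic App's call to the ChatGPT API is not reproduced in the post; the following is a minimal, hedged sketch of an equivalent request in C#, assuming the standard OpenAI chat completions endpoint and an API key in an environment variable. The model name and prompt are illustrative assumptions.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class CompanyInfoClient
{
    public static async Task<string> FetchCompanyInfoAsync(string companyName)
    {
        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Bearer", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

        // Illustrative prompt: ask for a short company profile, from which
        // location and industry can be extracted downstream.
        // (This sketch assumes companyName contains no characters needing JSON escaping.)
        var body = "{\"model\":\"gpt-4o-mini\",\"messages\":[{\"role\":\"user\",\"content\":\"Give a brief profile of the company "
            + companyName + ", including its location and industry.\"}]}";

        HttpResponseMessage response = await client.PostAsync(
            "https://api.openai.com/v1/chat/completions",
            new StringContent(body, Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();

        // JSON response containing the model's reply.
        return await response.Content.ReadAsStringAsync();
    }
}

In the flow described above, an equivalent request is issued from the Logic App's HTTP action, and the response is parsed before being returned to the copilot.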


How to Perform Data Transformation in Microsoft Dataverse

Microsoft Dataverse is a powerful data platform that supports secure and scalable data storage for business applications. However, raw data imported into Dataverse often needs transformation—cleaning, reshaping, filtering, or merging—to make it useful and reliable for apps and analytics. In this blog, we'll show you how to apply transformations to data before or after it reaches Dataverse using tools like Power Query, Dataflows, and business rules—ensuring you always work with clean, structured, and actionable data.

What is Data Transformation in Dataverse, and Why Does It Matter?

Data transformation refers to modifying data's structure, content, or format before or after it's stored in Dataverse.

Step-by-Step Guide: Connecting a Database to Dataverse

Step 1: Open Power Apps and select the proper environment.
Step 2: Open Dataflows in Power Apps and create a new Dataflow.
Step 3: Connect to the database using the SQL Server Database connector.
Step 4: Add the required credentials to establish the connection between the database and Dataverse.
Step 5: Add the transformations in the Power Query editor.
Step 6: Map the columns correctly and identify the unique ID of the table.
Step 7: Set the scheduled refresh and publish the Dataflow.
Step 8: Once the Dataflow is published, the table is visible in Power Apps.

To conclude, transforming data in Dataverse is key to building reliable and high-performing applications. Whether you use Power Query, calculated columns, or Power Automate, you can ensure your data is clean, structured, and actionable. Ready to improve your Dataverse data quality? Start with a simple dataflow or calculated column today, and empower your business applications with better, transformed data. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.

