Using OpenAI and Logic Apps to develop a Copilot agent for Elevator Pitches & Lead Qualification
In today’s competitive landscape, the ability to prepare quickly and deliver relevant, high-impact sales conversations is more critical than ever. Sales teams often spend valuable time gathering case studies, reviewing past opportunities, and preparing client-specific messaging — time that could be better spent engaging prospects. To address this, we developed “Smart Pitch” — a Microsoft Teams-integrated AI Copilot designed to equip our sales professionals with instant, contextual access to case studies, opportunity data, and procedural documentation.

Challenge

Sales professionals routinely face hurdles in locating the right case studies, opportunity data, and client-specific messaging on short notice. These hurdles not only slow down the sales cycle but also affect the consistency and quality of conversations with prospects.

How It Works

Smart Pitch runs on Microsoft Teams and pulls information from CloudFronts knowledge sources: SharePoint documents, public case studies, and opportunity data.

Key Features

MQL-SQL Summary Generator

Users can request an MQL-SQL document. The copilot prompts the user to provide the prospect name, contact person name, and client requirement; this is achieved via an adaptive card for better UX.

HTTP Request to Logic App

The copilot then sends an HTTP request to a Logic App. In the Logic App, we use the ChatGPT API to fetch company and client information, extract the company’s location and industry from that information, and return the result to the custom copilot. A Generative Answers node displays the results with proper formatting via the prompt/agent instructions. Generative AI can also be instructed to directly create formatted JSON from the parsed values. That JSON is then parsed and used to populate a Liquid template, dynamically creating an MQL-SQL file for every searched company and contact person. The result is an HTML file populated with the company and contact details, similar case studies, and past work with clients in the same region and industry.
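The JSON-to-template step above can be sketched as follows. This is a hypothetical helper, not the actual implementation: it parses the model's JSON output into the fields a Liquid template would consume, and all field names are illustrative assumptions.

```javascript
// Hypothetical sketch: the model is instructed to return JSON with company
// details; this helper parses it and extracts the fields the MQL-SQL
// template needs. Field names are illustrative, not the real schema.
function parseCompanyInfo(modelOutput) {
  const data = JSON.parse(modelOutput);
  return {
    company: data.company ?? "",
    contact: data.contact ?? "",
    location: data.location ?? "",
    industry: data.industry ?? "",
  };
}

const fields = parseCompanyInfo(
  '{"company":"Contoso","contact":"Jane Doe","location":"Seattle","industry":"Retail"}'
);
```

Missing fields default to empty strings so the template never renders `undefined` into the document.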
This triggers an automatic download of the generated MQL-SQL document as a PDF file on your system.

Content Search

Users can ask questions in three areas:

1. Case Study FAQ: helps users ask questions about client success stories and project case studies, retrieves relevant information from a knowledge source, and offers follow-up FAQs before ending the conversation. The official CloudFronts website is used for fetching case study information.

2. Opportunities: helps users inquire about past projects or opportunities, detailing client names, roles, estimated revenue, and outcomes.

3. SOPs: provides quick answers and summaries for frequently asked questions related to organizational processes and SOPs.

For such questions, Smart Pitch searches SharePoint documents, public case studies, and the opportunity table to return relevant results, structured and easy to consume.

Security & Governance

Smart Pitch is integrated into Microsoft Teams, so it uses the same authentication as Teams. Access to Dataverse and SharePoint is read-only and scoped to organizational permissions.

To conclude, Smart Pitch reflects our commitment to leveraging AI to drive business outcomes. By combining Microsoft’s AI ecosystem with our internal data strategy, we’ve created a practical and impactful sales assistant that improves productivity, accelerates deal cycles, and enhances client engagement. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
How to Perform Data Transformation in Microsoft Dataverse
Microsoft Dataverse is a powerful data platform that supports secure and scalable data storage for business applications. However, raw data imported into Dataverse often needs transformation — cleaning, reshaping, filtering, or merging — to make it useful and reliable for apps and analytics. In this blog, we’ll show you how to apply transformations to data before or after it reaches Dataverse using tools like Power Query, dataflows, and business rules, ensuring you always work with clean, structured, and actionable data.

What Is Data Transformation in Dataverse, and Why It Matters

Data transformation refers to modifying data’s structure, content, or format before or after it’s stored in Dataverse. This includes cleaning, reshaping, filtering, and merging records.

Step-by-Step Guide: Transforming Data with a Dataflow

Step 1: Open Power Apps and select the proper environment.
Step 2: Open Dataflows in Power Apps and create a new dataflow.
Step 3: Connect to the database using the SQL Server database connector.
Step 4: Add the required credentials to make the connection between the database and Dataverse.
Step 5: Add the transformations in the dataflow.
Step 6: Add proper mapping of the columns and identify the unique ID of the table.
Step 7: Set the scheduled refresh and publish the dataflow.
Step 8: Once the dataflow is published, you can see the table in Power Apps.

To conclude, transforming data in Dataverse is key to building reliable and high-performing applications. Whether using Power Query, calculated columns, or Power Automate, you can ensure your data is clean, structured, and actionable. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com. Ready to improve your Dataverse data quality? Start with a simple dataflow or calculated column today, and empower your business applications with better, transformed data.
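Inside a dataflow, the transformation step itself is authored in Power Query, but the operations are simple row-level cleanups. As a language-neutral illustration (column names and rules are assumptions, not the actual dataflow), the same cleaning could be expressed like this:

```javascript
// Illustrative only: the kind of cleaning a dataflow performs before rows
// reach Dataverse — drop inactive rows, trim text, normalize casing.
function transformRows(rows) {
  return rows
    .filter(row => row.isActive)              // filter: keep active rows only
    .map(row => ({
      customerId: row.customerId,             // unique ID used for mapping
      customerName: row.customerName.trim(),  // clean: strip stray whitespace
      country: row.country.toUpperCase(),     // reshape: consistent casing
    }));
}

const cleaned = transformRows([
  { customerId: 1, customerName: "  Contoso ", country: "us", isActive: true },
  { customerId: 2, customerName: "Fabrikam", country: "de", isActive: false },
]);
```

The surviving row is clean and consistently shaped, which is what makes the column mapping in Step 6 reliable.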
Comparing Asynchronous Patterns in C# and JavaScript
Asynchronous programming is essential for building responsive applications, especially when dealing with time-consuming operations like API calls, file I/O, or database queries. Both C# and JavaScript provide powerful tools to handle asynchronous code: Promises in JavaScript and Tasks in C#. However, managing these manually can lead to complex, nested code. Enter async/await — syntactic sugar that makes asynchronous code look and behave like synchronous code, improving readability and maintainability.

Async/Await in JavaScript

JavaScript relies heavily on Promises for asynchronous operations. While Promises are powerful, chaining them can lead to callback hell. Async/await simplifies this by allowing us to write asynchronous code in a linear fashion.

Scenario: Fetching User Data from an API

Instead of chaining .then() calls, we can use async/await to make API calls cleaner.

Benefits:
✅ Easier to read – no nested .then() chains.
✅ Better error handling – structured try/catch blocks.

Scenario: Sequential vs. Parallel Execution

Sometimes we need to run tasks one after another, while other times we want them to run in parallel for efficiency. Sequential execution awaits each task in turn; parallel execution starts all tasks together and awaits their combined result, completing faster.

Async/Await in C#

C# uses Tasks for asynchronous operations. Before async/await, developers relied on callbacks or .ContinueWith(), leading to complex code.

Scenario: Downloading Files Asynchronously

Instead of blocking the UI thread, we can use async/await to keep the app responsive.

Benefits:
✅ UI remains responsive – no freezing during downloads.
✅ Clean error handling – try/catch works naturally.

Scenario: Running Multiple Database Queries

If we need to fetch data from multiple sources, async/await makes it easy to manage.
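The two JavaScript styles for the user-fetching scenario can be sketched as follows. A stubbed fetchUser stands in for a real API call, and the user fields are illustrative:

```javascript
// Stub standing in for a real HTTP call (data is illustrative).
function fetchUser(id) {
  return Promise.resolve({ id, name: "Ada", email: "ada@example.com" });
}

// Without async/await: promise chaining.
function getUserWithThen(id) {
  return fetchUser(id)
    .then(user => user.name)
    .catch(err => {
      console.error("Failed:", err);
      throw err;
    });
}

// With async/await: linear flow and a structured try/catch.
async function getUserWithAwait(id) {
  try {
    const user = await fetchUser(id);
    return user.name;
  } catch (err) {
    console.error("Failed:", err);
    throw err;
  }
}
```

Both functions return the same result; the second reads top-to-bottom like synchronous code, which is the whole appeal of async/await.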
You can await each query sequentially, or start them all together and await Task.WhenAll for faster performance.

Key Takeaways

✔ Use async/await to avoid callback hell in JavaScript and blocking calls in C#.
✔ Sequential execution (await one by one) vs. parallel execution (Promise.all / Task.WhenAll).
✔ Error handling is simpler with try/catch instead of .catch() or .ContinueWith().
✔ Performance improves because UIs stay responsive while waiting for I/O operations.

By adopting async/await, you can write cleaner, more maintainable asynchronous code in both JavaScript and C#. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
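The sequential-versus-parallel contrast can be sketched in JavaScript, where Promise.all plays the same role as C#'s Task.WhenAll. The delays below simulate query latency and are purely illustrative:

```javascript
// Simulated query: resolves with `value` after `ms` milliseconds.
const delay = (ms, value) => new Promise(res => setTimeout(() => res(value), ms));

// Sequential: each task starts only after the previous one finishes
// (three 100 ms tasks take roughly 300 ms in total).
async function runSequential() {
  const a = await delay(100, "query A");
  const b = await delay(100, "query B");
  const c = await delay(100, "query C");
  return [a, b, c];
}

// Parallel: all tasks start together; Promise.all resolves when the
// slowest one finishes (roughly 100 ms in total).
async function runParallel() {
  return Promise.all([
    delay(100, "query A"),
    delay(100, "query B"),
    delay(100, "query C"),
  ]);
}
```

Both return the same results; only the wall-clock time differs, which is why parallel execution is preferred for independent queries.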
Getting Started with OData Queries in Microsoft Dynamics 365
Have you ever needed to pull data out of Dynamics 365 but didn’t know where to begin? Whether you’re building a report, wiring up a Power App, or feeding data into another system, OData is your friend. In just a few clicks, you’ll be able to write simple HTTP requests to retrieve exactly the records you want — no complex code required.

What Is OData and Why It Matters

OData (Open Data Protocol) is a standardized way to query RESTful APIs. Microsoft Dynamics 365 exposes its entire data model via OData, so you can retrieve, filter, sort, and expand records with plain HTTP requests. This means faster development and fewer custom endpoints.

1. Finding Your Web API Endpoint

https://yourorg.crm.dynamics.com/api/data/v9.2

That’s your base URL for every OData call.

2. Exploring Entities via Metadata

Append $metadata to your base URL:

GET https://yourorg.crm.dynamics.com/api/data/v9.2/$metadata

You’ll get an XML file listing all entities (contacts, accounts, leads, etc.), their fields, data types, and navigation properties. Tip: press Ctrl + F to search for your entity by name.

3. Core OData Query Options

a. $select – Return Only What You Need

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$select=fullname,emailaddress1,jobtitle

This limits the payload to just those three fields, making responses smaller and faster.

b. $filter – Narrow Down Your Results

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$filter=firstname eq 'Ankit'

Operators: eq (equals), ne (not equals), gt / lt (greater than / less than). Combine with and / or:

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$filter=statecode eq 0 and jobtitle eq 'Consultant'

c. $orderby – Sort Your Data

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$orderby=createdon desc

Newest records appear first.

d. $top – Limit Record Count

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$top=5

Great for previews or testing.
e. $expand – Fetch Related Records

Example: get each contact’s full name and its parent account name in one request:

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$select=fullname,parentcustomerid&$expand=parentcustomerid_account($select=name)

Here, parentcustomerid is the lookup field, parentcustomerid_account is the navigation property, and the nested $select limits the expanded fields.

Another example: expand opportunities with customer account info:

GET https://yourorg.crm.dynamics.com/api/data/v9.2/opportunities?$expand=customerid_account($select=name,accountnumber)

Finding Expandable Names

In your $metadata, look for lines like:

<NavigationProperty Name="parentcustomerid_account" Type="Microsoft.Dynamics.CRM.account" />

Use that Name value in your $expand.

Putting It All Together

Suppose you want all active contacts at “Contoso” and their account names:

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$filter=statecode eq 0&$expand=parentcustomerid_account($filter=name eq 'Contoso';$select=name)&$select=fullname,emailaddress1

Conclusion

OData might sound technical at first, but once you get the hang of it, it becomes one of the most powerful tools in your Dynamics 365 toolbox. Whether you’re building integrations, reports, or simple automations, OData gives you the flexibility to query exactly what you need — without relying on custom development. Start small. Open your environment, locate the Web API URL, and try your first $select or $filter query. Once you’re confident, move on to advanced options like $expand and $orderby.

Call to Action: Need help designing smarter OData-based solutions or integrating with Power Platform tools? Reach out to our team today and we’ll help you build something great.
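The query options above compose into a single URL. A small JavaScript sketch of that composition (the org URL and field names are illustrative, and a real request would add authentication headers):

```javascript
// Minimal sketch: compose OData query options into a request URL.
function buildODataUrl(base, entitySet, options = {}) {
  const parts = [];
  if (options.select) parts.push(`$select=${options.select.join(",")}`);
  if (options.filter) parts.push(`$filter=${options.filter}`);
  if (options.expand) parts.push(`$expand=${options.expand}`);
  if (options.orderby) parts.push(`$orderby=${options.orderby}`);
  if (options.top) parts.push(`$top=${options.top}`);
  const query = parts.length ? `?${parts.join("&")}` : "";
  return `${base}/${entitySet}${query}`;
}

const url = buildODataUrl(
  "https://yourorg.crm.dynamics.com/api/data/v9.2",
  "contacts",
  {
    select: ["fullname", "emailaddress1"],
    filter: "statecode eq 0",
    expand: "parentcustomerid_account($select=name)",
  }
);
```

Building URLs this way keeps the options readable and makes it easy to add or drop a clause while experimenting.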
Transforming Financial Operations: The Strategic Impact of Customer Payment Registration in Dynamics 365 Business Central
When customers make electronic payments to your bank account, use the Register Customer Payments page to reconcile internal accounts using actual cash figures, ensuring all payments are collected accurately. This functionality allows you to quickly verify and post individual or lump-sum payments, handle discounted payments, and identify unpaid documents. For different customers with varying payment dates, payments must be posted individually. However, payments from the same customer with the same payment date can be posted as a lump sum. This is particularly useful when a single payment covers multiple sales invoices.

Pre-requisites

Business Central (cloud)

Steps

Step 1: Search for “Register Customer Payments”. Since different payment types can be posted to different balancing accounts, select a balancing account on the Payment Registration Setup page before processing customer payments. If you consistently use the same balancing account, you can set it as the default to streamline the process and skip this step each time you open the Register Customer Payments page.

Step 2: Check the Payment Made box on the line corresponding to the posted document for which the payment has been made.

Step 3: Use the Post Payment option to post a regular (non-lump) payment. You can use the Preview action to verify entries before posting the payment.

Step 4: Lump payment: payment information is posted for documents on lines where the Payment Made checkbox is checked. The payment entries are recorded in the general ledger, bank, and customer accounts, with each payment applied to its corresponding posted sales document.

To conclude, effectively managing customer payments is crucial for maintaining accurate financial records and ensuring smooth business operations.
Microsoft Dynamics 365 Business Central offers a robust and flexible platform to streamline the payment registration process, empowering businesses to efficiently reconcile accounts, post payments, and handle diverse payment scenarios. By leveraging features like the Register Customer Payments page, businesses can save time, reduce errors, and maintain a clear view of their financial health. Whether it’s managing individual payments, lump-sum transactions, or discounted invoices, Dynamics 365 provides the tools needed to adapt to your organization’s unique requirements. With proper setup and utilization of its payment registration features, businesses can enhance their financial workflows, foster better customer relationships, and drive long-term growth. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Generate Enhanced QR Codes in Business Central Using AL and QuickChart API
QR codes have become a standard tool for sharing data quickly and efficiently — whether it’s for product labeling, document tracking, or digital payments. Now, you can generate customized QR codes and barcodes directly within Microsoft Dynamics 365 Business Central using a simple action. This feature allows users to choose the barcode type and size, embed the image into a record, and optionally download it, all with just a few clicks. It’s an easy way to enhance records with scannable information, without leaving Business Central or needing external tools. In this article, we’ll walk through how this feature works and how it can be used in real business scenarios.

What This Feature Does

The “Generate Enhanced QR Code” action gives users the ability to quickly create and manage barcodes within Business Central:

- Choose an image size (Small, Medium, Large).
- Select a barcode type (QR, Swiss QR, Aztec, Data Matrix, Telepen).
- Store the generated image in the Picture field of the item record.

To conclude, this customization shows how a simple AL code extension can greatly boost efficiency in Microsoft Dynamics 365 Business Central. By enabling quick generation and embedding of QR codes and barcodes, you eliminate manual steps and streamline processes across departments, from inventory to sales and beyond. With support for multiple barcode types, customizable sizes, and built-in download and validation prompts, this feature brings powerful functionality right into the user’s workflow, with no external tools needed. Whether you’re in warehousing, retail, manufacturing, or pharma, this tool helps standardize product labeling and enhances traceability with just a few clicks. Looking ahead? You can extend this further by including additional record fields, customizing encoding logic, or supporting more document types like purchase orders or invoices.
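The AL implementation itself is not reproduced here; as an illustration of the underlying idea, this JavaScript sketch builds the kind of QuickChart QR request URL the extension could call. The pixel values mapped to Small/Medium/Large are assumptions:

```javascript
// Illustrative sketch: map the user's size choice to a pixel size (values
// are assumed) and build a QuickChart QR request URL for the encoded text.
const SIZES = { Small: 100, Medium: 200, Large: 300 };

function quickChartQrUrl(text, sizeName) {
  const size = SIZES[sizeName] || SIZES.Medium;
  return `https://quickchart.io/qr?text=${encodeURIComponent(text)}&size=${size}`;
}

const url = quickChartQrUrl("ITEM-1001", "Medium");
```

In the AL extension, the returned image bytes would then be stored in the record's Picture field.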
We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Bridge Your Database and Dataverse: Complete Integration Guide
Modern applications demand seamless, real-time data access. Microsoft Dataverse — the data backbone of the Power Platform — makes it easier to build and scale low-code apps, but often your enterprise data resides in legacy databases. Connecting a database to Dataverse enables automation, reporting, and app-building capabilities using the Power Platform’s ecosystem. In this blog, we’ll walk you through how to connect a traditional SQL database (Azure SQL or on-premises) to Microsoft Dataverse.

What is Dataverse?

Dataverse is Microsoft’s cloud-based data platform, designed to securely store and manage data used by business applications. It’s highly integrated with Power Apps, Power Automate, and Dynamics 365.

Step-by-Step Guide: Connecting a Database to Dataverse

Step 1: Open Power Apps and select the proper environment.
Step 2: Open Dataflows in Power Apps and create a new dataflow.
Step 3: Connect to the database using the SQL Server database connector.
Step 4: Add the required credentials to make the connection between the database and Dataverse.
Step 5: Add proper mapping of the columns and identify the unique ID of the table.
Step 6: Set the scheduled refresh and publish the dataflow.
Step 7: Once the dataflow is published, you can see the table in Power Apps.

To conclude, connecting your database to Dataverse amplifies the power of your data, enabling app development, automation, and reporting within a unified ecosystem. Whether you need real-time access or periodic data sync, Microsoft offers flexible and secure methods to integrate databases with Dataverse. Start exploring virtual tables or dataflows today to bridge the gap between your existing databases and the Power Platform. Want to learn more? Check out our related guides on Dataverse best practices and virtual table optimization. We hope you found this blog useful. If you would like to discuss anything further, please reach out to us at transform@cloudfronts.com.
Setting Up Workflow Email Alerts in Dynamics 365 Finance & Operations
In today’s fast-paced business environment, staying on top of critical tasks and approvals is vital for maintaining efficiency and ensuring seamless operations. Microsoft Dynamics 365 Finance and Operations (D365 FO) provides a powerful feature — workflow email alerts — to help organizations streamline their processes by automatically notifying the right individuals when certain tasks are completed or conditions are met. In this blog, we will guide you through the step-by-step process of setting up workflow email alerts in D365 FO.

Why Workflow Email Alerts Are Important

Workflow email alerts are a critical tool for keeping business processes on track. With proper configuration, they can help minimize bottlenecks, enhance communication, and improve overall productivity.

Step-by-Step Guide to Setting Up Workflow Email Alerts

Step 1: Configure Email Parameters. Before you begin, verify that your email parameters are set up correctly to enable email communication, then send a test email to ensure the configuration is working.

Step 2: Assign Email Addresses to Users. Each user who will receive workflow email alerts needs to have a registered email address in the system.

Step 3: Create an Email Template. An email template defines the content and layout of the workflow alert emails.

Step 4: Assign the Template to the Workflow. This enables email alerts for specific workflows.

Step 5: Configure the Batch Job for Email Notifications. This ensures workflow email alerts are sent automatically.

Step 6: Monitor Email Sending Status. Check the status of email notifications to confirm delivery.

By following these steps, you can set up workflow email alerts in D365 FO and enhance your organization’s workflow management. With properly configured email alerts, your team will be notified promptly of critical tasks and approvals, ensuring smooth and efficient operations.
Take the time to configure these alerts today and experience the benefits of improved communication and productivity in your organization. Thank you for reading! If you have any questions or need further assistance, feel free to reach out in the comments. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Getting Your Organization’s Data Ready for AI
Since the turn of 2025, AI has come up a lot in conversations, both at an individual and an organizational level. Major technology providers have launched their own suites of tools for building AI agents. While these tools are good enough for simpler use cases like fetching data from systems and presenting it to us, complex use cases — predicting patterns, collating data from multiple systems, and deriving insights from connected systems — are where AI implementations need to be treated like projects, architected and implemented in line with the organization’s vision for AI. Let’s look at how we can make sure AI implementations give us over 95% accuracy, not just answers we assume might be correct.

Is AI Enough by Itself?

A common perception is that AI agents are deployed on top of applications and can interact with the underlying systems to do whatever users need done. This perception stems from our use of AI tools like ChatGPT/Claude/Gemini, which interact with the Internet to answer our queries. Since these tools are available independently, there is no technical setup and they are ready to go. Whether a Copilot is enough on its own depends on where the data is sourced from and what the intent of the agent is. If your custom Copilot / AI agent is only meant to look at some SharePoint files, some websites, and one system within your M365 gated access, you should be able to attach those knowledge sources and let the AI agent give you the information in the format you need. The challenge occurs where you expect AI agents to make sense of data that is stored differently in different systems with different naming conventions. That is where AI agents fall through, because they cannot understand that you are pointing to an “Account” in CRM when the same record is stored as a “Customer” in Business Central.
And this is where something like a Unity Catalog comes into the picture. The term itself describes data coming together in a catalog for common access and for AI agents to source from. Let’s look at how we can imagine this unity catalog in the next section.

Unity Catalog

Unity Catalog can be thought of as an implementation strategy and a collection of connected systems on which AI agents can be based. The diagram above summarizes how AI implementations will scale within organizations, with different variations of the same approach.

To encapsulate: while independent AI agents can be implemented for personal use within the organization, given the appropriate privileges, for AI to make sense of data and enable trusted decision making, implementations need data readiness in place, with clarity. Hopefully, this summarizes the direction in which organizations can think about AI implementation as more than just building agents. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Copy On-Premises SQL Database to Azure SQL Server Using ADF: A Step-by-Step Guide
Migrating an on-premises SQL database to the cloud can streamline operations and enhance scalability. Azure Data Factory (ADF) is a powerful tool that simplifies this process by enabling seamless data transfer to Azure SQL Server. In this guide, we’ll walk you through the steps to copy your on-premises SQL database to Azure SQL Server using ADF, ensuring a smooth and efficient migration.

Prerequisites

Before you begin, ensure you have access to both your on-premises SQL Server and an Azure subscription.

Step 1: Create an Azure SQL Server Database. First, set up your target database in Azure.

Step 2: Configure the Azure Firewall. To allow ADF to access your Azure SQL Database, configure the firewall settings.

Step 3: Connect Your On-Premises SQL Database to ADF. Next, use ADF Studio to link your on-premises database.

Step 4: Set Up a Linked Service. A linked service is required to connect ADF to your on-premises SQL database.

Step 5: Install the Integration Runtime for On-Premises Data. Since your data source is on-premises, you need a self-hosted integration runtime.

Step 6: Verify and Test the Connection. Finally, ensure everything is set up correctly.

To conclude, migrating your on-premises SQL database to Azure SQL Server using ADF is a straightforward process when broken down into these steps. By setting up the database, configuring the firewall, and establishing the necessary connections, you can ensure a secure and efficient data transfer. With your data now in the cloud, you can leverage Azure’s scalability and performance to optimize your workflows. Happy migrating!

Please refer to our case study of the City Council at https://www.cloudfronts.com/case-studies/city-council/ to learn more about how we used Azure Data Factory and other Azure Integration Services to deliver seamless integration. We hope you found this blog post helpful! If you have any questions or want to discuss further, please contact us at transform@cloudfronts.com.
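Under the hood, the steps above produce a pipeline with a Copy activity whose source is the on-premises SQL Server dataset and whose sink is the Azure SQL dataset. A simplified sketch of that pipeline JSON is shown below; the pipeline, activity, and dataset names are illustrative, not what ADF generates for you:

```json
{
  "name": "CopyOnPremToAzureSql",
  "properties": {
    "activities": [
      {
        "name": "CopyCustomers",
        "type": "Copy",
        "inputs": [ { "referenceName": "OnPremCustomersDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "AzureSqlCustomersDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "SqlServerSource" },
          "sink": { "type": "AzureSqlSink", "writeBehavior": "insert" }
        }
      }
    ]
  }
}
```

Each dataset reference points at a linked service (Step 4), and the on-premises dataset resolves through the self-hosted integration runtime installed in Step 5.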