Latest Microsoft Dynamics 365 Blogs | CloudFronts - Page 2

FetchXML Made Simple: Power Pages Tips for Dynamic Data Retrieval

Dynamics 365 Power Apps Portals (formerly Dynamics 365 Portals) allow organizations to securely expose CRM data to external users. However, fetching and displaying CRM records on a portal page requires more than just entity lists; it often needs custom data queries. That's where FetchXML comes in. FetchXML is Dynamics 365's native XML-based query language used to retrieve data, and it is fully supported in Liquid templates within portals.

Introduction

For businesses leveraging Microsoft Power Pages, the ability to pull dynamic data from Dataverse is critical. While out-of-the-box entity lists work for simple scenarios, complex needs such as personalized dashboards and filtered data require custom FetchXML queries embedded in Liquid templates. In this post, we'll walk through how FetchXML works in Power Pages, share an example, and provide best practices so you can deliver efficient, personalized portals.

Why This Matters

For growing businesses, service portals need more than just static lists. As the volume of data increases, the ability to dynamically query and display relevant information becomes essential to maintain performance, improve user experience, and reduce maintenance effort. With FetchXML in Liquid, developers can query any Dataverse table, filter and sort the results, and render them exactly where they are needed on a page.

Prerequisites

Before getting started, ensure you have a Power Pages site connected to your Dataverse environment, table permissions configured for the records you want to expose, and access to edit the web templates or page copy where the Liquid code will live.

Understanding FetchXML

FetchXML is an XML-based query language for Dataverse. It allows you to select attributes, filter and sort records, join related tables, and aggregate data. A typical starting point is a query that retrieves all active contacts.

Using FetchXML in Power Pages (Liquid Templates)

Wrapping a FetchXML query in the Liquid fetchxml tag executes it server-side and exposes the results to your template, so the page can loop over the records and display them dynamically in your portal.

Making FetchXML Dynamic

You can make FetchXML personalized by using Liquid variables inside the query, for example displaying cases only for the logged-in user.

Real-World Example: Recent Cases Dashboard

A common scenario is a dashboard that lists a signed-in customer's most recent cases, combining a dynamic FetchXML query with a small amount of Liquid markup; a sketch of this pattern follows at the end of this post.

Best Practices

Keep queries narrow by selecting only the attributes you display, filter on indexed columns where possible, page large result sets, and always configure table permissions so users see only the records they are entitled to.

To conclude, FetchXML in Power Pages is a powerful tool for creating customized, dynamic, and efficient portals. Start small: add a dynamic list or dashboard to your portal today. If you need expert guidance, CloudFronts can help you implement FetchXML-driven solutions tailored to your business needs.

💡 Want to learn more? Reach out to CloudFronts Technologies at transform@cloudfronts.com to explore FetchXML use cases for your portals and improve your customer experience.
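The example referenced above was not included here, so the following is a minimal sketch of the pattern, assuming the standard Dataverse case table (incident) and columns (ticketnumber, title, createdon, statecode, customerid); adjust the names to your own schema. It combines the Power Pages Liquid fetchxml tag with the signed-in user's id to build a small "recent cases" list.

```liquid
{% comment %} Sketch: five most recent active cases for the signed-in contact. {% endcomment %}
{% fetchxml recent_cases %}
<fetch top="5">
  <entity name="incident">
    <attribute name="ticketnumber" />
    <attribute name="title" />
    <attribute name="createdon" />
    <order attribute="createdon" descending="true" />
    <filter type="and">
      <condition attribute="statecode" operator="eq" value="0" />
      <condition attribute="customerid" operator="eq" value="{{ user.id }}" />
    </filter>
  </entity>
</fetch>
{% endfetchxml %}

<ul>
{% for case in recent_cases.results.entities %}
  <li>{{ case.ticketnumber }}: {{ case.title }} ({{ case.createdon }})</li>
{% endfor %}
</ul>
```

Swapping the entity name and dropping the customerid condition gives the simpler "all active records" style query described earlier in the post.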


From Portal Chaos to Power Pages Zen: My Journey Automating Client Forms

Power Pages, the modern evolution of Power Apps Portals, has redefined how organizations build secure, data-driven web experiences connected to Dynamics 365. But let's be honest: for anyone who has wrestled with the old portal setup, the journey from chaos to clarity isn't always smooth. In this blog, I'll share how I transformed a tangled web of client forms and scripts into a streamlined Power Pages experience using Dynamics 365 forms, Liquid templates, and JavaScript automation, and what I learned along the way.

The Beginning of Portals

My story began with what I thought was a simple request: automate a few client onboarding forms in Power Apps Portals. What followed? I realized I wasn't managing a portal; I was managing chaos. That's when I decided to rebuild everything in Power Pages, the modernized, secure, and design-friendly version of Power Apps Portals.

Why Power Pages Changed Everything

Power Pages offers a low-code, high-control environment that connects directly to Dataverse and Dynamics 365. Here's what made it a game-changer for me:

1. Built-In Dataverse Power. No more juggling SQL tables or external APIs. Dataverse made it simple to store, validate, and update client data directly within Dynamics 365, cutting down my custom integration scripts by almost 60%.

2. Cleaner Authentication. With Azure AD B2C integration, user sign-ins became seamless and secure. I could finally define granular access roles without needing custom web roles or Liquid conditionals scattered across pages.

3. Design That Doesn't Break Your Brain. The Power Pages Design Studio felt like moving from Notepad to Figma: I could visually build layouts, insert lists, and add forms connected to Dynamics data without touching complex HTML.

Automating Client Forms: My Aha Moment

The real "Zen" moment came when I realized that automation in Power Pages didn't need to be messy. Here's how I approached it step by step:

– Used Dynamics 365 forms in Power Pages. I embedded native forms from Dynamics instead of building them from scratch; they respected business rules and validation logic automatically.
– Applied Liquid templates for smart rendering. I used Liquid to conditionally show fields and sections, keeping client forms dynamic and user-friendly.
– Added JavaScript automation. For client-side logic like field dependencies, autofill, and dynamic visibility, JavaScript did the trick. Because Power Pages supports modern script handling, I could isolate my logic cleanly instead of cluttering the HTML (see the sketch after this post).
– Leveraged Power Automate. I integrated flows triggered on form submission to send confirmation emails, update records, and even notify the sales team instantly.

This separation of concerns (frontend in JavaScript and Liquid, backend in flows) made everything more maintainable.

Design Meets Logic: Keeping It Clean

One of my key lessons: separate design from logic. Power Pages Studio handled the look and feel, while all the conditional logic stayed in Liquid templates, JavaScript files, and Power Automate flows. This modular approach made my site easier to maintain and upgrade later.

Security & Permissions Simplified

Earlier, managing web roles in Portals was like untangling a spider web. Now, with Power Pages, web roles, table permissions, and page-level access fit together far more cleanly. The result? A cleaner, safer, and more scalable structure.

The End Result: From Chaos to Zen

After weeks of trial, testing, and caffeine, my new Power Pages site was secure, automated, and easy to maintain. What once required hours of manual fixes now runs seamlessly, freeing me to focus on building rather than babysitting.
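The JavaScript example mentioned above did not survive here, so below is a minimal sketch of the kind of field-dependency automation described, assuming a basic form where jQuery is available and hypothetical field ids cf_country and cf_state; replace them with your form's logical names and markup.

```javascript
// Sketch: show the State field only when the selected Country is "United States".
$(document).ready(function () {
  function toggleStateField() {
    var country = $("#cf_country").val();
    if (country === "United States") {
      // Reveal the field's cell so the user can fill it in.
      $("#cf_state").closest("td").show();
    } else {
      // Clear and hide the field when it is not relevant.
      $("#cf_state").val("").closest("td").hide();
    }
  }

  // Re-evaluate whenever the country changes, and once on initial load.
  $("#cf_country").on("change", toggleStateField);
  toggleStateField();
});
```

The same pattern extends naturally to autofill (copying one field's value into another) and to enabling or disabling whole sections before submission.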
Happy developing! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


GST Implementation Made Easy in Dynamics 365 Business Central

For any Indian business running on Microsoft Dynamics 365 Business Central, tax compliance isn't optional; it's foundational. The Goods and Services Tax (GST) framework is complex, and managing it manually is a high-risk gamble. This guide isn't just a list of steps; it's a blueprint for configuring Business Central's Indian localization features to handle GST seamlessly, turning your ERP from a standard ledger into an automated, compliance-ready system. Ready to banish tax-related data entry errors and audit anxiety? Let's dive in and set up the system correctly, from defining your GSTINs to mastering the G/L posting matrix.

Microsoft Dynamics 365 Business Central offers robust localization features for India, including comprehensive support for the Goods and Services Tax (GST). Properly configuring GST is essential for calculating, recording, and settling taxes on all your inward and outward supplies, ensuring compliance with Indian tax laws. This guide provides a straightforward, step-by-step process for setting up GST in Business Central, based on Microsoft's best practices.

Phase 1: Laying the Foundation (Tax Periods & Registration)

The initial phase involves setting up the legal and temporal frameworks for your GST configuration.

Step 1: Define Tax Accounting Periods (GST Calendar). The GST regime operates on a specific timeline, and you need to define this calendar within Business Central.

Step 2: Establish Your GST Registration Numbers (GSTINs). Your Goods and Services Tax Identification Number (GSTIN) is critical for identifying your tax entity and the state you operate in.

Phase 2: Core Configuration (G/L Accounts and Masters)

This phase links the statutory requirements with your company's general ledger structure.

Step 3: Configure GST Groups and HSN/SAC Codes. These setups classify your goods and services for accurate rate calculation.

Step 4: Define the GST Posting Setup (The Accounting Link). This is perhaps the most crucial step, as it determines which General Ledger (G/L) accounts are used to post GST amounts.

Step 5: Set Up GST Rates. With your groups and HSN/SAC codes defined, you now specify the actual tax percentages.

Phase 3: Master Data Integration (Connecting the Dots)

The final phase ensures that your business entities and locations are linked to the defined GST rules.

Step 6: Update Company and Location Information. Your company's primary details must be GST-compliant.

Step 7: Configure Customer and Vendor Master Data. For every trading partner, you must define their GST status and registration details.

To conclude, by following these seven steps, your Indian company's Business Central environment will be fully configured to handle GST calculations automatically. This setup allows the system to determine the correct tax component (CGST, SGST, or IGST), apply the right rate, and post the amounts to the designated G/L accounts, simplifying your day-to-day transactions and preparing you for GST settlements and reporting. A small illustration of the component determination follows this post.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
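To make the concluding point concrete, here is a small illustration (plain Python, not Business Central code) of the rule the posting setup encodes: an intrastate supply splits the GST rate equally between CGST and SGST, while an interstate supply charges the full rate as IGST. The 18% rate and the state codes below are example values only.

```python
# Illustration of GST component determination, not Business Central logic.
def gst_components(company_state: str, place_of_supply: str,
                   taxable_value: float, rate: float = 0.18) -> dict:
    """Return GST amounts per component for a supply."""
    tax = taxable_value * rate
    if company_state == place_of_supply:
        # Intrastate: the rate is split equally between CGST and SGST.
        return {"CGST": tax / 2, "SGST": tax / 2, "IGST": 0.0}
    # Interstate: the full rate is charged as IGST.
    return {"CGST": 0.0, "SGST": 0.0, "IGST": tax}

# Maharashtra (state code 27) selling to Karnataka (29) vs. within Maharashtra.
print(gst_components("27", "29", 10000.0))  # {'CGST': 0.0, 'SGST': 0.0, 'IGST': 1800.0}
print(gst_components("27", "27", 10000.0))  # {'CGST': 900.0, 'SGST': 900.0, 'IGST': 0.0}
```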


Advantages and Future Scope of the Unified Databricks Architecture – Part 2

Following our unified data architecture implementation using Databricks Unity Catalog, the next step focuses on understanding the advantages and future potential of this Lakehouse-driven ecosystem. The architecture consolidates data from multiple business systems and transforms it into an AI-powered data foundation that will support advanced analytics, automation, and conversational insights.

Key Advantages

– Centralized Governance: Unity Catalog provides complete visibility into data lineage, security, and schema control, eliminating silos.
– Dynamic and Scalable Data Loading: a single Databricks notebook can dynamically load and transform data from multiple systems, simplifying maintenance.
– Enhanced Collaboration: teams across domains can access shared data securely while maintaining compliance and data accuracy.
– Improved BI and Reporting: more than 30 Power BI reports are being migrated to the Gold layer for unified reporting.
– AI & Automation Ready: the architecture supports seamless integration with GenAI tools like Genie for natural language Q&A and predictive insights.

Future Scope

In the next phase, we aim to:
– Integrate Genie for conversational analytics.
– Enable real-time insights through streaming pipelines.
– Extend the Lakehouse to additional business sources.
– Automate AI-based report generation and anomaly detection.

For example, business users will soon be able to ask questions like: "How many hours did a specific resource submit in CRM time entries last week?" Databricks will process this query dynamically and return instant, AI-driven insights (a sketch of the kind of query this resolves to follows this post).

To conclude, the unified Databricks architecture is more than a data pipeline; it is the foundation for AI-powered decision-making. By merging governance, automation, and intelligence, CloudFronts is building the next generation of data-first, AI-ready enterprise solutions.
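As an illustration only, the natural-language question above might resolve to a simple aggregation over a Gold-layer table. The catalog, table, and column names here (main.gold.crm_time_entries, resource_name, duration_hours, entry_date) are assumptions, and the snippet presumes it runs in a Databricks notebook where spark is already available.

```python
from pyspark.sql import functions as F

# "How many hours did Jane Doe submit in CRM time entries last week?"
hours_last_week = (
    spark.table("main.gold.crm_time_entries")            # hypothetical Gold-layer table
         .where(F.col("resource_name") == "Jane Doe")    # the specific resource
         .where(F.col("entry_date") >= F.date_sub(F.current_date(), 7))
         .agg(F.sum("duration_hours").alias("hours_last_week"))
)
hours_last_week.show()
```

The idea is that Genie generates and runs an equivalent query from the plain-English question, so no one has to write this by hand.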


Unified Data Architecture with Databricks Unity Catalog – Part 1

At CloudFronts Technologies, we are implementing a Unified Data Architecture powered by Databricks Unity Catalog to bring together data from multiple business systems into one governed, AI-ready platform. This solution integrates five major systems (Zoho People, Zoho Books, Business Central, Dynamics 365 CRM, and QuickBooks) using Azure Logic Apps, Blob Storage, and Databricks to build a centralized Lakehouse foundation.

Objective

To design a multi-source data architecture that supports:
– Centralized data storage via Unity Catalog.
– Automated ingestion through Azure Logic Apps.
– Dynamic data loading and transformation in Databricks.
– Future-ready integration for AI and BI analytics.

Architecture Overview

Data flow summary:
1. Azure Logic Apps extract data from each of the five sources via APIs.
2. Data is stored in Azure Blob Storage containers.
3. Blob containers are mounted to Databricks for unified access.
4. A dynamic Databricks notebook reads and processes data from all sources (a simplified sketch follows this post).

Each data source operates independently while following a governed and modular design, making the solution scalable and easily maintainable.

Role of Unity Catalog

Unity Catalog enables centralized governance, data lineage, and secure access across teams. Each layer, Bronze (raw), Silver (refined), and Gold (business-ready), is managed under Unity Catalog, ensuring clear visibility into data flow and ownership. This ensures that as data grows, governance and performance remain consistent across all environments.

Implementation Preview

In the upcoming blog, I will demonstrate the end-to-end implementation of one Power BI report using this unified Databricks architecture. This will include connecting the Gold-layer dataset from Databricks to Power BI, building dynamic visuals, and showcasing how the unified data foundation simplifies report creation and maintenance across multiple systems.

To conclude, this architecture lays the foundation for a unified, governed, and scalable data ecosystem. By combining Azure Logic Apps, Blob Storage, and Databricks Unity Catalog, we are enabling a single source of truth that supports analytics, automation, and future AI innovations.
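Below is a simplified sketch of what the dynamic ingestion notebook could look like. It assumes each source's Blob container is mounted at /mnt/<source>, that the Logic Apps land JSON extracts, and that a bronze schema already exists in Unity Catalog; the actual notebook will differ in file formats, paths, and merge logic.

```python
# Sketch: loop over the five sources and land raw extracts in Bronze Delta tables.
sources = ["zoho_people", "zoho_books", "business_central", "dynamics_crm", "quickbooks"]

for source in sources:
    raw_df = (
        spark.read
             .format("json")               # Logic Apps drop JSON extracts (assumption)
             .load(f"/mnt/{source}/")      # Blob container mounted to Databricks
    )
    (
        raw_df.write
              .format("delta")
              .mode("append")
              .saveAsTable(f"main.bronze.{source}_raw")  # governed by Unity Catalog
    )
```

Because the loop is driven purely by the source list, adding a sixth system later becomes a one-line change rather than a new pipeline.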


Optimum Window Partners with CloudFronts for Managed Services Agreement (MSA) Renewal 

We are delighted to announce that Optimum Window, the largest US-based manufacturer of Fire-Rated and Architectural steel windows, is partnering with CloudFronts for a Managed Services Agreement (MSA) renewal.

Optimum Window, established in 1985, is a family-owned business based out of Ellenville, in upstate NY. Since then, Optimum Window has become the largest and most diversified manufacturer of Fire-Rated and Architectural steel windows in the United States and has continued its growth with a series of custom high-tech metal window and door systems designed for commercial, high-end residential, and landmark applications. Learn more about Optimum Window at https://optimumwindow.com/

Optimum Window's partnership with CloudFronts began with the implementation of a CRM system with custom enhancements that automates their end-to-end sales and order processes. Under this MSA, CloudFronts will provide support and maintenance services for the system, which is based on Microsoft Dynamics 365 Sales.

About CloudFronts

CloudFronts is a global AI-First Microsoft Solutions & Databricks Partner for Business Applications, Data & AI, helping teams and organizations worldwide solve their complex business challenges with Microsoft Cloud, AI, and Azure Integration Services. We have a global presence with offices in the U.S., Singapore, and India.

Since its inception in 2012, CloudFronts has successfully served over 200 small and medium-sized clients across North America, Europe, Australia, MENA, the Maldives, and India, with diverse experience in sectors ranging from Professional Services, Financial Services, Manufacturing, Retail, and Logistics/SCM to Non-profits.

Please feel free to connect with us at transform@cloudfronts.com


Connecting Databricks to Power BI: A Step-by-Step Guide for Secure and Fast Reporting

Azure Databricks has become the go-to platform for data engineering and analytics, while Power BI remains the most powerful visualization tool in the Microsoft ecosystem. Connecting Databricks to Power BI bridges the gap between your data lakehouse and business users, enabling real-time insights from curated Delta tables. In this blog, we'll walk through the process of securely connecting Power BI to Databricks, covering both DirectQuery and Import mode, and sharing best practices for performance and governance.

Architecture Overview

The connection involves:
– Azure Databricks: your compute and transformation layer.
– Delta tables: your curated and query-optimized data.
– Power BI Desktop / Service: visualization and sharing platform.

Flow:
1. Databricks processes and stores curated data in Delta format.
2. Power BI connects directly to Databricks using the built-in connector.
3. Users consume dashboards that are either refreshed on schedule (Import) or queried live (DirectQuery).

Step 1: Get Connection Details from Databricks

In your Azure Databricks workspace:
1. Go to the Compute tab and open your cluster (or SQL Warehouse if using Databricks SQL).
2. Click the Advanced → JDBC/ODBC tab.
3. Copy the Server Hostname and HTTP Path; you'll need these for Power BI.

For example:
– Server Hostname: adb-1234567890123456.7.azuredatabricks.net
– HTTP Path: /sql/1.0/endpoints/1234abcd5678efgh

Step 2: Configure a Databricks Personal Access Token (PAT)

Power BI uses this token to authenticate securely.
1. In Databricks, click your profile icon → User Settings → Developer → Access Tokens.
2. Click Generate New Token, provide a name and expiration, and copy the token immediately. (You won't be able to view it again.)

Step 3: Connect from Power BI Desktop

1. Open Power BI Desktop.
2. Go to Get Data → Azure → Azure Databricks.
3. In the connection dialog:
   – Server Hostname: paste from Step 1.
   – HTTP Path: paste from Step 1.
4. Click OK, and when prompted for credentials:
   – Select Azure Databricks Personal Access Token.
   – Enter your token in the Password field.

You'll now see the list of Databricks databases and tables available for import.

To conclude, you've successfully connected Power BI to Azure Databricks, unlocking analytical capabilities over your Lakehouse. This setup provides the flexibility to work in Import mode for speed or DirectQuery mode for live data, all while maintaining enterprise security through Azure AD or personal access tokens. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
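If you want to confirm the hostname, HTTP path, and token before plugging them into Power BI, one option (an extra step, not part of the official setup) is a quick check with the databricks-sql-connector Python package. The values below are the placeholders from Step 1 and should be replaced with your own.

```python
# pip install databricks-sql-connector
from databricks import sql

# Connect with the same details Power BI will use.
with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/endpoints/1234abcd5678efgh",
    access_token="<your-personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_schema()")
        print(cursor.fetchone())  # a row back means the endpoint and token are valid
```

If this succeeds but Power BI still fails to connect, the issue is more likely in the connector configuration than in the credentials.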


How Delta Lake Keeps Your Data Clean, Consistent, and Future-Ready

Delta Lake is a storage layer that brings reliability, consistency, and flexibility to big data lakes. It enables advanced features such as Time Travel, Schema Evolution, and ACID Transactions, which are crucial for modern data pipelines.

At a glance:
– Time Travel: access historical data for auditing, recovery, or analysis.
– Schema Evolution: adapt automatically to changes in the data schema.
– ACID Transactions: guarantee reliable and consistent data with atomic upserts.

1. Time Travel

Time Travel allows you to access historical versions of your data, making it possible to "go back in time" and query past snapshots of your dataset.

Use cases:
– Recover accidentally deleted or updated data.
– Audit and track changes over time.
– Compare dataset versions for analytics.

How it works: Delta Lake maintains a transaction log that records every change made to the table. You can query a previous version using either a timestamp or a version number (see the sketch after this post).

2. Schema Evolution

Schema Evolution allows your Delta table to adapt automatically to changes in the data schema without breaking your pipelines.

Use cases:
– Adding new columns to your dataset.
– Adjusting to evolving business requirements.
– Simplifying ETL pipelines when source data changes.

How it works: when enabled, Delta automatically updates the table schema if the incoming data contains new columns.

3. ACID Transactions (with Atomic Upsert)

ACID Transactions (Atomicity, Consistency, Isolation, Durability) ensure that all data operations are reliable and consistent, even in the presence of concurrent reads and writes. Atomic upsert guarantees that an update or insert operation happens fully or not at all.

Key benefits:
– No partial updates: either all changes succeed or none.
– Safe concurrent updates from multiple users or jobs.
– Consistent data for reporting and analytics.
– Atomic upsert ensures data integrity during merges.

In the MERGE sketch after this post:
– whenMatchedUpdateAll() updates existing rows.
– whenNotMatchedInsertAll() inserts new rows.
– The operation is atomic: either all updates and inserts succeed together, or none do.

To conclude, Delta Lake makes data pipelines modern, maintainable, and error-proof. By leveraging Time Travel, Schema Evolution, and ACID Transactions, you can build robust analytics and ETL workflows with confidence, ensuring reliability, consistency, and adaptability in your data lake operations. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
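The examples referenced in this post are consolidated into the PySpark sketch below, written for a Databricks notebook where spark is available. The table path (/mnt/delta/events), raw input locations, and event_id join key are assumptions for illustration; the APIs themselves (versionAsOf/timestampAsOf, mergeSchema, and the DeltaTable merge builder) are standard Delta Lake.

```python
from delta.tables import DeltaTable

# 1. Time Travel: read an older snapshot by version number or by timestamp.
v0 = spark.read.format("delta").option("versionAsOf", 0).load("/mnt/delta/events")
as_of_jan = (
    spark.read.format("delta")
         .option("timestampAsOf", "2024-01-01 00:00:00")
         .load("/mnt/delta/events")
)

# 2. Schema Evolution: allow new columns in incoming data to extend the table schema.
new_batch = spark.read.json("/mnt/raw/events/")          # may contain extra columns
(
    new_batch.write.format("delta")
             .mode("append")
             .option("mergeSchema", "true")              # evolve the schema on write
             .save("/mnt/delta/events")
)

# 3. Atomic upsert with MERGE: update matched rows, insert the rest, commit atomically.
target = DeltaTable.forPath(spark, "/mnt/delta/events")
updates = spark.read.json("/mnt/raw/event_updates/")
(
    target.alias("t")
          .merge(updates.alias("u"), "t.event_id = u.event_id")
          .whenMatchedUpdateAll()
          .whenNotMatchedInsertAll()
          .execute()
)
```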


Handling Errors and Retries in Dynamics 365 Logic App Integrations

Integrating Dynamics 365 (D365) with external systems using Azure Logic Apps is one of the most common patterns for automation. But in real-world projects, things rarely go smoothly: API throttling, network timeouts, and unexpected data issues are everyday challenges. Without proper error handling and retry strategies, these issues can result in data mismatches, missed transactions, or broken integrations. In this blog, we'll explore how to handle errors and implement retries in D365 Logic App integrations, ensuring your workflows are reliable, resilient, and production-ready.

1. Why Error Handling Matters in D365 Integrations

Without handling failures such as throttling, timeouts, and bad data, your Logic App either fails silently or stops execution entirely, causing broken processes.

2. Built-in Retry Policies in Logic Apps

Every Logic App action comes with a retry policy that can be configured to automatically retry failed requests. As a best practice, use an exponential retry policy for transient failures such as HTTP 429 (throttling) and timeouts, and keep the retry count modest so persistent errors surface quickly instead of being retried indefinitely. A sample retry policy definition is sketched after this post.

3. Handling Errors with Scopes and "Run After"

Scopes in Logic Apps let you group actions and then define what happens if they succeed or fail. Place your main integration actions inside a scope, add a follow-up scope or action configured to run after the first one "has failed" or "has timed out", and use it to log the error, send a notification, or compensate for the failed work.

4. Designing Retry + Error Flow Together

The recommended pattern is to let the retry policy absorb transient failures automatically, catch anything that still fails with a run-after error scope, and persist the failed payload (for example to a queue or a log table) so it can be reprocessed later. This ensures no transaction is silently lost.

5. Handling Dead-lettering with Service Bus (Advanced)

For high-volume integrations, you may need a dead-letter queue (DLQ) approach, where messages that repeatedly fail processing are moved aside for later inspection instead of blocking the rest of the integration. This pattern prevents data loss while keeping integrations lightweight.

6. Monitoring & Observability

Error handling isn't complete without monitoring.

Building resilient integrations between D365 and Logic Apps isn't just about connecting APIs; it's about ensuring reliability even when things go wrong. By configuring retry policies, using scopes for error handling, and adopting dead-lettering for advanced cases, you'll drastically reduce downtime and data mismatches. Next time you design a D365 Logic App, don't just think about the happy path. Build error handling and retry strategies from the start, and you'll thank yourself later when your integration survives the unexpected. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
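For reference, here is what a configured retry policy looks like in a Logic App action's underlying JSON definition (code view). The action name, the HTTP action type, and the Dataverse Web API URL are placeholders for illustration; the retryPolicy block itself follows the workflow definition language schema.

```json
{
  "Create_account_in_Dataverse": {
    "type": "Http",
    "inputs": {
      "method": "POST",
      "uri": "https://yourorg.api.crm.dynamics.com/api/data/v9.2/accounts",
      "body": "@triggerBody()",
      "retryPolicy": {
        "type": "exponential",
        "count": 4,
        "interval": "PT15S",
        "minimumInterval": "PT5S",
        "maximumInterval": "PT1H"
      }
    },
    "runAfter": {}
  }
}
```

With this in place, transient failures (such as HTTP 429 or 5xx responses) are retried up to four times with exponentially increasing delays before the action is marked as failed and any run-after error handling takes over.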
