Automate Azure Functions Flex Consumption Deployments with Azure DevOps and Azure CLI
Building low-latency, VNET-secure APIs with Azure Functions Flex Consumption is only the beginning. The next step toward modernization is setting up a DevOps release pipeline that automatically deploys your Function Apps, even across multiple regions, using Azure CLI. In this blog, we'll explore how to implement a CI/CD pipeline using Azure DevOps and Azure CLI to deploy Azure Functions (Flex Consumption), handle cross-platform deployment scenarios, and ensure global availability.

Step-by-Step Guide: Azure DevOps Pipeline for Azure Functions Flex Consumption

Step 1: Prerequisites
You'll need:

Step 2: Provision Function Infrastructure Using Azure CLI

Step 3: Configure Azure DevOps Release Pipeline
(A CLI sketch covering Steps 2, 3, and 6 appears at the end of this post.)

Important Note: Windows vs Linux in Flex Consumption
While creating your pipeline, you might notice a critical difference: the Azure Functions Flex Consumption plan only supports Linux environments. If your existing Azure Function was originally created on a Windows-based plan, you cannot use the standard "Azure Function App Deploy" DevOps task, as it assumes Windows compatibility and won't deploy successfully to Linux-based Flex Consumption. To overcome this, use Azure CLI commands (config-zip deployment) to upload and deploy your packaged function code. This method works regardless of the OS runtime and ensures smooth deployment to Flex Consumption Functions without compatibility issues.

Tip: Before migration, confirm that your Function's runtime stack supports Linux. Most modern stacks, such as .NET 6+, Node.js, and Python, run natively on Linux in Flex Consumption.

Step 4: Secure Configurations and Secrets
Use Azure Key Vault integration to safely inject configuration values.

Step 5: Enable VNET Integration
If your Function App accesses internal resources, enable VNET integration.

Step 6: Multi-Region Deployment for High Availability
For global coverage, you can deploy your Function Apps to multiple regions using Azure CLI. A dynamic, parameterized version of the deployment script is recommended, as it ensures consistent global rollouts across regions.

Step 7: Rollback Strategy
If deployment fails in a specific region, your pipeline can automatically roll back.

Best Practices
a. Use YAML pipelines for version-controlled CI/CD
b. Use Azure CLI for Flex Consumption deployments (Linux runtime only)
c. Add manual approvals for production
d. Monitor rollouts via Azure Monitor
e. Keep deployment scripts modular and parameterized

To conclude, automating deployments for Azure Functions Flex Consumption using Azure DevOps and Azure CLI gives you consistent, repeatable releases across regions. If your current Azure Function runs on Windows, remember that Flex Consumption supports only Linux-based plans, so CLI-based deployments are the way forward.

Next Step: Start with one Function App pipeline, validate it in a Linux Flex environment, and expand globally. For expert support in automating Azure serverless solutions, connect with CloudFronts, your trusted Azure integration partner.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
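Since the original CLI snippets aren't reproduced above, here is a minimal sketch of what Steps 2, 3, and 6 can look like. All resource names, regions, and runtime values are placeholders, and the Flex Consumption flag (--flexconsumption-location) may differ by Azure CLI version, so verify against az functionapp create --help before using it.

```bash
#!/usr/bin/env bash
# Placeholder names - replace with your own resource group, storage account, and app name.
RG="rg-functions-demo"
STORAGE="stfuncdemo001"
APP="func-flex-demo"
PACKAGE="functionapp.zip"

# Step 2: provision a Flex Consumption Function App (Linux only).
az functionapp create \
  --resource-group "$RG" \
  --name "$APP" \
  --storage-account "$STORAGE" \
  --flexconsumption-location "eastus" \
  --runtime "dotnet-isolated" \
  --runtime-version "8.0"

# Step 3: deploy the packaged code with config-zip (works regardless of the build OS).
az functionapp deployment source config-zip \
  --resource-group "$RG" \
  --name "$APP" \
  --src "$PACKAGE"

# Step 6: repeat the same zip deployment across per-region app instances.
for REGION in eastus westeurope southeastasia; do
  az functionapp deployment source config-zip \
    --resource-group "$RG" \
    --name "$APP-$REGION" \
    --src "$PACKAGE"
done
```

In Azure DevOps, the same script can run from an AzureCLI@2 task against a service connection, which is what keeps the deployment independent of whether the build agent is Windows or Linux.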
Flexible Line Display in Purchase Order Report – Business Central RDLC Layout
When working on report customizations in Microsoft Dynamics 365 Business Central, one common challenge is maintaining a consistent layout regardless of how many lines are present in the data source. This situation often arises in reports like Purchase Orders, Sales Orders, or Invoices, where the line section expands or contracts based on the number of lines in the dataset. However, certain business scenarios demand a fixed or uniform presentation, such as when a client wants consistent spacing or placeholders for manual inputs. This article demonstrates how you can achieve this flexibility purely through RDLC layout design, without making any changes to AL or dataset logic.

Business Requirement
The objective was to design a Purchase Order report where the line area maintains a consistent structure, independent of how many lines exist in the actual data. In other words, the report layout should not necessarily reflect the dataset exactly as it is. The idea was to ensure visual uniformity while keeping the underlying data logic simple.

Proposed Solution
The solution was implemented directly in the RDLC report layout by creating two tables and controlling their visibility through expressions. There was no need to align them in the same position; one table was simply placed above the other. RDLC automatically handled which one to display at runtime based on the visibility conditions.

Table 1 – Actual Purchase Lines: displays the real data from the Purchase Line dataset.
Table 2 – Structured or Blank Layout: displays a predefined structure (for example, blank rows) when fewer lines are available.

This design ensures that whichever table meets the visibility condition is rendered, maintaining layout flow automatically.

Implementation Steps
1. Add two tables in the RDLC layout.
2. Set visibility conditions. To control which table appears at runtime, open each table's properties and go to Table Properties → Visibility → Hidden → Expression, then apply the following expressions (a sketch of the resulting RDL markup appears at the end of this post):

For Table 1 (Actual Purchase Lines):
=IIF(CountRows("DataSet_Result") <= 8, True, False)
This hides the actual data table when the dataset has fewer rows.

For Table 2 (Structured or Blank Layout):
=IIF(CountRows("DataSet_Result") > 8, True, False)
This hides the structured or blank table when enough data rows are available.

Note: The number 8 is just an example threshold. You can set any value that fits your design requirement.

Result
At runtime, the RDLC engine handles the layout adjustment, ensuring the report always looks uniform and visually balanced, without any need for AL code changes or temporary data handling.

Advantages of This Approach
No AL Code Changes: achieved entirely within the RDLC layout.
Upgrade Friendly: dataset and report objects remain unchanged.
Automatic Layout Flow: RDLC adjusts which table is displayed automatically.
Professional Appearance: ensures consistent formatting and structure across all reports.

Key Takeaways
This simple yet effective approach shows that report design in Business Central can be made flexible without altering data logic. By using two tables with visibility expressions, you can create reports that adapt their appearance automatically, keeping the layout professional, stable, and easy to maintain.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
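For readers who edit the .rdlc file directly rather than through the designer, the visibility settings above end up as a Visibility/Hidden element on each tablix. The snippet below is a minimal sketch of that markup; the tablix names are illustrative and only the relevant elements are shown.

```xml
<!-- Table 1: actual purchase lines, hidden when the dataset has 8 rows or fewer. -->
<Tablix Name="Table_PurchaseLines">
  <Visibility>
    <Hidden>=IIF(CountRows("DataSet_Result") &lt;= 8, True, False)</Hidden>
  </Visibility>
  <!-- column and row definitions omitted -->
</Tablix>

<!-- Table 2: structured/blank layout, hidden when more than 8 rows are available. -->
<Tablix Name="Table_BlankLayout">
  <Visibility>
    <Hidden>=IIF(CountRows("DataSet_Result") &gt; 8, True, False)</Hidden>
  </Visibility>
  <!-- column and row definitions omitted -->
</Tablix>
```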
Why Modern Enterprises Are Standardizing on the Medallion Architecture for Trusted Analytics
Enterprises today are collecting more data than ever before, yet most leaders admit they don't fully trust the insights derived from it. Inconsistent formats, missing values, and unreliable sources create what's often called a data swamp: an environment where data exists but can't be used confidently for decision-making. Clean, trusted data isn't just a technical concern; it's a business imperative. Without it, analytics, AI, and forecasting lose credibility, and transformation initiatives stall before they start.

That's where the Medallion Architecture comes in. It provides a structured, layered framework for transforming raw, unreliable data into consistent, analytics-ready insights that executives can trust. At CloudFronts, a Microsoft and Databricks partner, we've implemented this architecture to help enterprises modernize their data estates and unlock the full potential of their analytics investments.

Why Data Trust Matters More Than Ever
CIOs and data leaders today face a paradox: while data volumes are skyrocketing, confidence in that data is shrinking. Poor data quality leads to:

In short, when data can't be trusted, every downstream process, from reporting to machine learning, is compromised. The Medallion Architecture directly addresses this challenge by enforcing data quality, lineage, and governance at every stage.

What Is the Medallion Architecture?
The Medallion Architecture is a modern, layered data design framework introduced by Databricks. It organizes data into three progressive layers, Bronze, Silver, and Gold, each refining data quality and usability. This approach ensures that every layer of data builds upon the last, improving accuracy, consistency, and performance at scale.

Inside Each Layer

Bronze Layer – Raw and Untouched
The Bronze Layer serves as the raw landing zone for all incoming data. It captures data exactly as it arrives from multiple sources, preserving lineage and ensuring that no information is lost. This layer acts as a foundational source for subsequent transformations.

Silver Layer – Cleansing and Transformation
At the Silver Layer, the raw data undergoes cleansing and standardization. Duplicates are removed, inconsistent formats are corrected, and business rules are applied. The result is a curated dataset that is consistent, reliable, and analytics-ready.

Gold Layer – Insights and Business Intelligence
The Gold Layer aggregates and enriches data around key business metrics. It powers dashboards, reporting, and advanced analytics, providing decision-makers with accurate and actionable insights.

Example: Data Transformation Across Layers

Layer | Data Example | Processing Applied | Outcome
Bronze | Customer ID: 123, Name: Null, Date: 12-03-24 / 2024-03-12 | Raw data captured as-is | Unclean, inconsistent
Silver | Customer ID: 123, Name: Alex, Date: 2024-03-12 | Standardization & de-duplication | Clean & consistent
Gold | Customer ID: 123, Name: Alex, Year: 2024 | Aggregation for KPIs | Business-ready dataset

This layered approach ensures data becomes progressively more accurate, complete, and valuable.

Building Reliable, Performant Data Pipelines
By leveraging Delta Lake on Databricks, the Medallion Architecture enables enterprises to unify streaming and batch data, automate validations, and ensure schema consistency, creating an end-to-end, auditable data pipeline. This layered approach turns chaotic data flows into a structured, governed, and performant data ecosystem that scales as business needs evolve.
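To make the layers concrete, here is a minimal PySpark sketch of a Bronze-to-Gold flow on Delta Lake. The table and column names (a customers feed with customer_id and order_date, plus pre-created bronze, silver, and gold schemas) are illustrative assumptions rather than any specific client schema.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land the raw feed exactly as received, preserving lineage.
raw = spark.read.json("/mnt/landing/customers/")
raw.write.format("delta").mode("append").saveAsTable("bronze.customers")

# Silver: standardize mixed date formats, enforce basic rules, remove duplicates.
silver = (
    spark.table("bronze.customers")
    .withColumn(
        "order_date",
        F.coalesce(F.to_date("order_date", "yyyy-MM-dd"), F.to_date("order_date", "dd-MM-yy")),
    )
    .filter(F.col("customer_id").isNotNull())
    .dropDuplicates(["customer_id", "order_date"])
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.customers")

# Gold: aggregate into a business-ready metric, e.g. orders per customer per year.
gold = (
    spark.table("silver.customers")
    .groupBy("customer_id", F.year("order_date").alias("order_year"))
    .agg(F.count("*").alias("order_count"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.customer_orders_by_year")
```

On Databricks the same pattern can also be expressed with scheduled workflows or Delta Live Tables; the key point is that each layer reads only from the layer below it, which keeps lineage auditable.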
Client Example: Retail Transformation in Action
A leading hardware retailer in the Maldives faced challenges managing inventory and forecasting demand across multiple locations. They needed a unified data model that could deliver real-time visibility and predictive insights. CloudFronts implemented the Medallion Architecture using Databricks:

Results:

Key Benefits for Enterprise Leaders

Final Thoughts
Clean, trusted data isn't a luxury; it's the foundation of every successful analytics and AI strategy. The Medallion Architecture gives enterprises a proven, scalable framework to transform disorganized, unreliable data into valuable, business-ready insights. At CloudFronts, we help organizations modernize their data foundations with Databricks and Azure, delivering the clarity, consistency, and confidence needed for data-driven growth.

Ready to move from data chaos to clarity? Explore our Databricks Services or Talk to a Cloud Architect to start building your trusted analytics foundation today.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Boost Productivity with the Search in Company Data Feature in Business Central
In modern business settings, employees spend a significant portion of their time searching for information rather than using it. According to Microsoft, office workers can spend up to 20% of their working time simply looking for data. With the "Search in company data" feature in Business Central, organizations can now give users faster, broader, and more relevant search capabilities, leaving them more time to focus on strategic tasks rather than data retrieval.

Using this feature is straightforward and intuitive. You can either highlight any text within Business Central and open the Tell Me window, or type one or more keywords directly into it. Then select the Search company data option to explore matching information across your system. So instead of opening the Item List page and searching for an item name, you can simply use the option described above.

Once you click Search Company Data, the search results open on a new page. You can simply click a result to open the corresponding item page. You can also enable more tables to be searched by clicking the "Set up where to search" option.

To conclude, the Search in Company Data feature in Microsoft Dynamics 365 Business Central empowers users to find information faster and more efficiently. Instead of navigating through multiple pages or lists, users can now access the data they need directly through the Tell Me window. With the added flexibility to configure which tables and fields are searchable, organizations can tailor the experience to meet their specific needs.

By simplifying the search process and enabling broader data accessibility, this feature not only saves time but also enhances productivity, allowing users to focus on decision-making and value-driven tasks rather than manual data lookups.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Taming the Chaos: A Guide to Dimension Correction in Business Central
We've all been there. You're closing out the month, and you spot it: a General Journal line where the "Department" dimension is set to "Sales" but should have been "Marketing." Or perhaps a purchase invoice was posted with an incorrect "Project" code. In the world of accounting and Microsoft Dynamics 365 Business Central, dimensions are the lifeblood of meaningful reporting, and even a single mistake can ripple through your financial statements, leading to misguided decisions and frantic period-end corrections.

Fortunately, Microsoft Dynamics 365 Business Central offers a powerful, built-in safety net: the Dimension Correction feature. This isn't just a handy tool; it's a game-changer for financial integrity and auditor peace of mind.

What Are Dimensions, and Why Do Mistakes Happen?
Before diving into corrections, let's quickly recap. Dimensions in Business Central are tags like Department, Project, Cost Center, or Region. Instead of creating separate G/L accounts for every possible combination, dimensions allow you to slice and dice your financial data, delivering incredible analytical power.

Common Reasons These Errors Occur:

In the past, fixing mistakes meant reversing entries, posting manual journals, and leaving a messy audit trail. Not anymore.

Enter the Hero: The Dimension Correction Feature
The Dimension Correction feature allows you to change dimensions on already posted entries without creating new transactions or affecting the original amounts. It simply updates the dimensional context of the existing entry.

Key Benefits of Dimension Correction

How to Perform a Dimension Correction: A Step-by-Step Guide
Let's walk through correcting a simple example.

Scenario: A telephone expense was incorrectly posted to the SALES department. It should have been posted to the MARKETING department.

Step 1: Locate the Posted Entry
Step 2: Initiate the Dimension Correction
Step 3: Make the Correction
Step 4: Verify the Change

To conclude, the Dimension Correction feature transforms a once-tedious, error-prone process into a controlled, efficient, and auditable task. It empowers your finance team to maintain the integrity of your financial data without complex accounting workarounds. By understanding how to use this feature and following simple best practices, you ensure that your dimensions, and therefore your management reports, are always accurate, reliable, and ready to guide your business forward.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Don’t Just Delete, TRUNCATE: A Deep Dive into Blazing-Fast Data Clearing in Business Central
If you've worked with data in Business Central, you've undoubtedly used the DELETE or DELETEALL commands. They get the job done, but when you're dealing with massive datasets, like clearing out old ledger entries, archived sales orders, or temporary import tables, they can feel painfully slow. There's a better, faster way. Let's talk about the TRUNCATE TABLE command, the unsung hero of high-performance data purging.

What is TRUNCATE TABLE?
In simple terms, TRUNCATE TABLE is a SQL command that instantly removes all rows from a table. Unlike DELETE, it doesn't log individual row deletions in the transaction log. It's a bulk operation that de-allocates the data pages used by the table, which is why it's so incredibly fast. In the context of Business Central, you can execute this command directly from an AL codeunit. Yes, it's that simple: calling the TruncateTable() method on a record variable targets its corresponding table and empties it completely.

TRUNCATE TABLE vs. DELETE/DELETEALL: What's the Difference?
This is the crucial part. Choosing the right tool is key to performance and data integrity.

Feature | TRUNCATE TABLE | DELETE / DELETEALL
Performance | Extremely fast; operates at the data page level. | Slow; logs every single row deletion individually.
Transaction log | Minimal logging; fills the log with a single "deallocated page" entry. | Heavy logging; fills the log with an entry for every row deleted.
Where clause | No; it's all or nothing, and you cannot add a filter. | Yes; you can use SETFILTER or SETRANGE to delete specific records.
Table triggers | Do not fire; no OnBeforeDelete or OnAfterDelete triggers are executed. | Fire for each row that is deleted.
Referential integrity | Can fail if a FOREIGN KEY constraint exists. | Respects and checks constraints, potentially failing on related records.
Resets identity seed | Yes; the next record inserted will have the first ID in the series (e.g., 1). | No; the identity seed continues from where it left off.
Transaction rollback | Can be rolled back if used inside a transaction, but it's still minimally logged. | Can be rolled back, as all individual deletions are logged.

When Should You Use TRUNCATE TABLE?
Given its power and limitations, TRUNCATE TABLE is perfect for specific scenarios:

A Real-World Business Central Example
Imagine you have a custom "Data Import Staging" table. Every night, a job imports thousands of items from an external system. The first step is always to clear the staging area. The slow way uses DELETEALL; the blazing-fast way uses TRUNCATE TABLE. Both are sketched below, and the performance difference can be staggering, turning a minutes-long operation into one that completes in under a second.
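The two snippets referenced above aren't reproduced in this excerpt, so the following is a minimal AL sketch. The "Data Import Staging" table is the custom table from the example, the codeunit ID is a placeholder, and TruncateTable() is the record method the post describes (available only in recent Business Central runtimes), so check your platform version before relying on it.

```al
codeunit 50100 "Clear Staging Table"
{
    procedure ClearStagingTheSlowWay()
    var
        Staging: Record "Data Import Staging";
    begin
        // DeleteAll logs every row deletion individually, so large tables can take minutes.
        Staging.DeleteAll();
    end;

    procedure ClearStagingTheFastWay()
    var
        Staging: Record "Data Import Staging";
    begin
        // TruncateTable empties the whole table at the data-page level in one operation.
        // Any filters are ignored and OnDelete triggers do not fire.
        Staging.TruncateTable();
    end;
}
```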
Critical Warnings and Best Practices
With great power comes great responsibility. The limitations of TRUNCATE TABLE are not just footnotes; they are critical considerations.

NO FILTERS! This is the biggest "gotcha." You cannot use SETRANGE before calling TruncateTable(). The method will ignore any filters and always delete everything. Double- and triple-check your code to ensure you are targeting the correct table.

Bypasses Business Logic: Because table triggers do not fire, any essential business logic in the OnDelete trigger will be skipped. Do not use TRUNCATE TABLE on tables where the delete triggers perform critical actions (e.g., posting, ledger entry creation, validation). Using it on main transaction tables like "G/L Entry" or "Sales Line" is almost always a bad idea.

Foreign Key Constraints: If another table has a foreign key constraint pointing to the table you're trying to truncate, the command will fail with an error. DELETEALL would also fail in this case, but the error message might be different.

To conclude, TRUNCATE TABLE is a powerful tool that should be in every Business Central developer's arsenal. When used correctly, it can dramatically improve the performance of data maintenance tasks.

The Rule of Thumb: Use DELETEALL when you need to respect business logic, delete specific records, or work with tables that have complex relationships. Use TRUNCATE TABLE when you need to quickly and completely empty a large, standalone table where bypassing business logic is safe and acceptable. Embrace TRUNCATE TABLE for the right jobs and watch your large-scale data operations fly.

Reference: https://yzhums.com/67343/

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Configuring OAuth 2.0 Authentication in Power Automate
In today's automated world, businesses depend on secure, streamlined connections between systems to improve efficiency. Power Automate, a robust tool for building workflows between various services, allows seamless integration of applications and APIs. However, when working with third-party services, ensuring that data access is secure and well managed is critical. This is where OAuth 2.0, a secure and standard protocol for authorization, comes into play.

Are you struggling to configure OAuth 2.0 authentication in your Power Automate flows? If you are considering automating workflows that interact with secured APIs, this article is for you. I will walk you through configuring OAuth 2.0 in Power Automate so you can keep your automation secure while keeping your services accessible.

Why OAuth 2.0?
OAuth 2.0 is the industry-standard protocol for authorization. It allows users to grant third-party applications limited access to their resources without exposing passwords. By using OAuth 2.0 in Power Automate, you ensure that the services and APIs you connect to are secure, and that tokens are used to access data on behalf of the user.

How OAuth 2.0 Enhances Security
OAuth 2.0 significantly improves security by eliminating the need to share sensitive credentials. Instead, access is granted through tokens, which are time-limited and easily revocable. OAuth 2.0 is widely used by many companies, including Microsoft, Google, and Salesforce, to integrate applications securely.

Step-by-Step Guide to Configuring OAuth 2.0 in Power Automate

1. Set Up OAuth 2.0 Credentials
Before configuring OAuth 2.0 in Power Automate, you need to set up OAuth 2.0 credentials in the platform you're working with, for example by registering an application for Microsoft Graph API or any third-party service and noting its client ID and client secret.

2. Initialize OAuth 2.0 Variables in Power Automate
Now that you have your client ID and client secret, it's time to configure them in Power Automate and set up the variables.

3. Configure the OAuth 2.0 Connection in Power Automate
With the client credentials set, it's time to establish the connection to the service using OAuth 2.0 (a sketch of the token request appears at the end of this post).

4. Use the OAuth Token to Access Secure Data
Now that you have the OAuth token, you can use it to authenticate your requests to third-party APIs.

5. Best Practices for OAuth 2.0 in Power Automate

To conclude, OAuth 2.0 authentication provides a secure and effective way to authorize third-party applications in Power Automate. By following the steps outlined in this guide, you can set up OAuth 2.0 authentication, ensure data security, and integrate third-party services into your automation workflows with ease.

If you're ready to secure your Power Automate workflows with OAuth 2.0, follow the steps outlined in this post and start integrating APIs in a secure manner today. For more tips and detailed guides, check out our other blog posts on Power Automate and API integration. Need help with the OAuth 2.0 integration? Feel free to reach out for assistance!

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
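The post does not reproduce the actual HTTP call, so here is a minimal sketch of the client credentials token request you would typically send from a Power Automate HTTP action, assuming Microsoft Entra ID (Azure AD) is the token issuer and Microsoft Graph is the target API. The tenant ID, client ID, and client secret are placeholders.

```
POST https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials&client_id={client-id}&client_secret={client-secret}&scope=https://graph.microsoft.com/.default
```

The JSON response contains an access_token, which later actions pass in an "Authorization: Bearer <token>" header. In practice, keep the client secret in Azure Key Vault or a secured environment variable rather than hard-coding it in the flow.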
Mitigating Implementation Risks Through a Structured Business Assessment
The landscape of digital transformation has never been more complex. Rapid market shifts, rising customer demands, and tightening budgets have made technology decisions more consequential than ever. The challenge isn't adopting new tools; it's leading transformation by ensuring that every investment is grounded in clarity, alignment, and predictability. At CloudFronts, we understand this. That's why our Business Assessment Engagement model has become a proven first step toward successful, low-risk technology implementations.

What Is a Business Assessment?
A Business Assessment is a structured, short-term engagement conducted before signing a full implementation Statement of Work (SoW). It is designed to create complete visibility into your current business processes, desired future state, and the potential risks that could impact your project. Typically spanning three to four weeks, this engagement brings together functional and technical stakeholders from both your organization and CloudFronts. Whenever feasible, we conduct this assessment onsite, ensuring close collaboration and a deep understanding of your business landscape.

During this engagement, our experts:

The result is a detailed Business Requirements Study (BRS), a comprehensive document that translates assessment insights into an actionable implementation roadmap. This BRS becomes the foundation for a precise and mutually agreed Statement of Work, ensuring every phase of your digital transformation is built on validated insights and shared understanding.

Why a Business Assessment Matters
For enterprise technology leaders, the Business Assessment approach delivers tangible benefits:

Ultimately, this process transforms uncertainty into informed decision-making, enabling IT leaders to confidently advance from planning to execution.

Proven Success with CloudFronts
At CloudFronts, we've seen firsthand how Business Assessment engagements set the stage for successful digital transformations. Clients who adopt this model enter implementation phases with greater predictability, stronger governance, and renewed confidence in both the technology and the partnership driving it.

Recently, we partnered with one of the world's largest U.S.-based commercial vehicle manufacturers to conduct an onsite Business Requirements Study (BRS). Our team worked closely with their stakeholders to map existing systems and design a strategic roadmap for migration to Microsoft Dynamics 365 Supply Chain Management (SCM). Following the successful completion of the BRS, we are now leading Phase 1, enabling their inventory, advanced warehouse, and procurement operations to establish a strong operational foundation. In Phase 2, we will enable master planning, production, and quality management to deliver end-to-end operational efficiency, ensuring a seamless and future-ready digital ecosystem.

Our clients consistently tell us that this approach not only de-risks their investment but also enhances alignment between business and IT, a crucial factor in any transformation journey.

To conclude, in today's unpredictable business landscape, a well-executed Business Assessment isn't just a preliminary step; it's a strategic imperative. By partnering with CloudFronts for a Business Assessment, you're not committing to uncertainty; you're investing in clarity, alignment, and long-term success.
If your organization is planning a digital transformation initiative, start with a Business Assessment Engagement and move forward with the confidence of knowing your path is mapped, risks are managed, and success is measurable.

Ready to move from uncertainty to clarity? Connect with CloudFronts at transform@cloudfronts.com to schedule a Business Assessment Engagement and gain a clear, actionable roadmap for your next digital transformation. Contact Us to start your assessment today.
FetchXML Made Simple: Power Pages Tips for Dynamic Data Retrieval
Dynamics 365 Power Apps Portals (formerly Dynamics 365 Portals) allow organizations to securely expose CRM data to external users. However, fetching and displaying CRM records on a portal page requires more than just entity lists; it often needs custom data queries. That's where FetchXML comes in. FetchXML is Dynamics 365's native XML-based query language used to retrieve data, and it's fully supported in Liquid templates within portals.

Introduction
For businesses leveraging Microsoft Power Pages, the ability to pull dynamic data from Dataverse is critical. While out-of-the-box entity lists work for simple scenarios, complex needs, such as personalized dashboards and filtered data, require custom FetchXML queries embedded in Liquid templates. In this post, we'll walk through how FetchXML works in Power Pages, share examples, and provide best practices so you can deliver efficient, personalized portals.

Why This Matters
For growing businesses, service portals need more than just static lists. As the volume of data increases, the ability to dynamically query and display relevant information becomes essential to maintain performance, improve user experience, and reduce maintenance effort. With FetchXML in Liquid, developers can:

Prerequisites
Before getting started, ensure:

Understanding FetchXML
FetchXML is an XML-based query language for Dataverse. It allows you to:

Example: retrieve all active contacts (see the sketch at the end of this post).

Using FetchXML in Power Pages (Liquid Templates)
A basic implementation executes the query and displays the results dynamically in your portal.

Making FetchXML Dynamic
You can personalize FetchXML by using Liquid variables, for example to display cases only for the logged-in user.

Real-World Example: Recent Cases Dashboard

Best Practices

To conclude, FetchXML in Power Pages is a powerful tool for creating customized, dynamic, and efficient portals. Start small: add a dynamic list or dashboard to your portal today. If you need expert guidance, CloudFronts can help you implement FetchXML-driven solutions tailored to your business needs.

💡 Want to learn more? Reach out to CloudFronts Technologies at transform@cloudfronts.com to explore FetchXML use cases for your portals and improve your customer experience.
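The FetchXML and Liquid snippets referenced above aren't reproduced in this excerpt, so here is a minimal sketch of the "active contacts" example. The fetchxml Liquid tag and the user object are standard in Power Pages, but the attribute selection is illustrative, and the query only returns rows the signed-in user has table permissions for.

```liquid
{% fetchxml active_contacts %}
<fetch top="10">
  <entity name="contact">
    <attribute name="fullname" />
    <attribute name="emailaddress1" />
    <filter type="and">
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
{% endfetchxml %}

<ul>
  {% for contact in active_contacts.results.entities %}
    <li>{{ contact.fullname }} - {{ contact.emailaddress1 }}</li>
  {% endfor %}
</ul>
```

To make a query user-specific, as in the "recent cases" dashboard, you can inject the signed-in contact's ID into a condition, for example <condition attribute="customerid" operator="eq" value="{{ user.id }}" /> inside a query against the incident table.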
From Portal Chaos to Power Pages Zen: My Journey Automating Client Forms
Power Pages, the modern evolution of Power Apps Portals, has redefined how organizations build secure, data-driven web experiences connected to Dynamics 365. But let's be honest: for anyone who's wrestled with the old portal setup, the journey from chaos to clarity isn't always smooth. In this blog, I'll share how I transformed a tangled web of client forms and scripts into a streamlined Power Pages experience using Dynamics 365 forms, Liquid templates, and JavaScript automation, and what I learned along the way.

The Beginning of Portals
My story began with what I thought was a simple request: automate a few client onboarding forms in Power Apps Portals. What followed? I realized I wasn't managing a portal; I was managing chaos. That's when I decided to rebuild everything in Power Pages, the modernized, secure, and design-friendly version of Power Apps Portals.

Why Power Pages Changed Everything
Power Pages offers a low-code, high-control environment that connects directly to Dataverse and Dynamics 365. Here's what made it a game-changer for me:

1. Built-In Dataverse Power
No more juggling SQL tables or external APIs. Dataverse made it simple to store, validate, and update client data directly within Dynamics 365, cutting down my custom integration scripts by almost 60%.

2. Cleaner Authentication
With Azure AD B2C integration, user sign-ins became seamless and secure. I could finally define granular access roles without needing custom web roles or Liquid conditionals scattered across pages.

3. Design That Doesn't Break Your Brain
The Power Pages Design Studio felt like moving from Notepad to Figma; I could visually build layouts, insert lists, and add forms connected to Dynamics data without touching complex HTML.

Automating Client Forms: My Aha Moment
The real "Zen" moment came when I realized that automation in Power Pages didn't need to be messy. Here's how I approached it, step by step:

Used Dynamics 365 Forms in Power Pages: I embedded native forms from Dynamics instead of building them from scratch; they respected business rules and validation logic automatically.

Applied Liquid Templates for Smart Rendering: I used Liquid to conditionally show fields and sections, keeping client forms dynamic and user-friendly (an example is sketched at the end of this post).

Added JavaScript Automation: for client-side logic like field dependencies, autofill, and dynamic visibility, JavaScript did the trick. Because Power Pages supports modern script handling, I could isolate my logic cleanly instead of cluttering the HTML.

Leveraged Power Automate: I integrated flows triggered on form submission to send confirmation emails, update records, and even notify the sales team instantly. This separation of concerns (frontend in JavaScript and Liquid, backend in flows) made everything more maintainable.

Design Meets Logic: Keeping It Clean
One of my key lessons: separate design from logic. Power Pages Studio handled the look and feel, while all the conditional logic stayed in:

This modular approach made my site easier to maintain and upgrade later.

Security & Permissions Simplified
Earlier, managing web roles in Portals was like untangling a spider web. Now with Power Pages:

The result? A cleaner, safer, and more scalable structure.

The End Result: From Chaos to Zen
After weeks of trial, testing, and caffeine, my new Power Pages site was:

What once required hours of manual fixes now runs seamlessly, freeing me to focus on building rather than babysitting.
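The Liquid and JavaScript examples mentioned in the steps above aren't reproduced in this excerpt, so here is a minimal sketch of both patterns. The attribute and field names (parentcustomerid, cf_industry, cf_otherindustry) are placeholders; adjust them to your own Dataverse schema and form.

```liquid
{% comment %} Liquid: only render the account section for signed-in users with a parent account. {% endcomment %}
{% if user and user.parentcustomerid %}
  <div class="account-details">
    <p>Account: {{ user.parentcustomerid.name }}</p>
  </div>
{% else %}
  <p>Sign in to see your account details.</p>
{% endif %}
```

```javascript
// JavaScript: hypothetical field dependency on a basic form (jQuery is available on portal pages).
$(document).ready(function () {
  function toggleOtherIndustry() {
    var isOther = $("#cf_industry").val() === "Other";
    // Basic forms render each field inside a table row, so toggle the whole row.
    $("#cf_otherindustry").closest("tr").toggle(isOther);
  }
  $("#cf_industry").on("change", toggleOtherIndustry);
  toggleOtherIndustry();
});
```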
Happy Developing! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
