Optimum Window Partners with CloudFronts for Managed Services Agreement (MSA) Renewal
We are delighted to announce that Optimum Window, the largest US-based manufacturer of Fire-Rated and Architectural steel windows, is partnering with CloudFronts for a Managed Services Agreement (MSA) renewal.

Optimum Window, established in 1985, is a family-owned business based in Ellenville, in upstate New York. Since then, Optimum Window has become the largest and most diversified manufacturer of Fire-Rated and Architectural steel windows in the United States and has continued its growth with a series of custom high-tech metal window and door systems designed for commercial, high-end residential, and landmark applications. Learn more about Optimum Window at https://optimumwindow.com/

Optimum Window's partnership with CloudFronts began with the implementation of a CRM system with custom enhancements that automates their end-to-end sales and order processes. Under this MSA, CloudFronts will provide support and maintenance services for the system, which is based on Microsoft Dynamics 365 Sales.

About CloudFronts

CloudFronts is a global AI-First Microsoft Solutions & Databricks Partner for Business Applications, Data & AI, helping teams and organizations worldwide solve their complex business challenges with Microsoft Cloud, AI, and Azure Integration Services. We have a global presence with offices in the U.S., Singapore, and India. Since its inception in 2012, CloudFronts has successfully served over 200 small and medium-sized clients across North America, Europe, Australia, MENA, the Maldives, and India, with diverse experience in sectors ranging from Professional Services and Financial Services to Manufacturing, Retail, Logistics/SCM, and Non-profits. Please feel free to connect with us at transform@cloudfronts.com
Connecting Databricks to Power BI: A Step-by-Step Guide for Secure and Fast Reporting
Azure Databricks has become the go-to platform for data engineering and analytics, while Power BI remains the most powerful visualization tool in the Microsoft ecosystem. Connecting Databricks to Power BI bridges the gap between your data lakehouse and business users, enabling real-time insights from curated Delta tables. In this blog, we'll walk through the process of securely connecting Power BI to Databricks, covering both DirectQuery and Import modes, and sharing best practices for performance and governance.

Architecture Overview

The connection involves:
– Azure Databricks → your compute and transformation layer.
– Delta Tables → your curated, query-optimized data.
– Power BI Desktop / Service → your visualization and sharing platform.

Flow:
1. Databricks processes and stores curated data in Delta format.
2. Power BI connects directly to Databricks using the built-in connector.
3. Users consume dashboards that are either refreshed on a schedule (Import) or queried live (DirectQuery).

Step 1: Get Connection Details from Databricks

In your Azure Databricks workspace:
1. Go to the Compute tab and open your cluster (or SQL Warehouse if you are using Databricks SQL).
2. Open the Advanced Options → JDBC/ODBC tab.
3. Copy the Server Hostname and HTTP Path; you'll need these for Power BI.

For example:
– Server Hostname: adb-1234567890123456.7.azuredatabricks.net
– HTTP Path: /sql/1.0/endpoints/1234abcd5678efgh

Step 2: Configure a Databricks Personal Access Token (PAT)

Power BI uses this token to authenticate securely.
1. In Databricks, click your profile icon → User Settings → Developer → Access Tokens.
2. Click Generate New Token, provide a name and expiration, and copy the token immediately. (You won't be able to view it again.)

Step 3: Connect from Power BI Desktop

1. Open Power BI Desktop.
2. Go to Get Data → Azure → Azure Databricks.
3. In the connection dialog:
 – Server Hostname: paste from Step 1
 – HTTP Path: paste from Step 1
4. Click OK, and when prompted for credentials:
 – Select Azure Databricks Personal Access Token
 – Enter your token in the Password field.

You'll now see the list of Databricks databases and tables available for import.

To conclude, you've successfully connected Power BI to Azure Databricks, unlocking analytical capabilities over your lakehouse. This setup provides the flexibility to work in Import mode for speed or DirectQuery mode for live data, all while maintaining enterprise security through Azure AD or personal access tokens. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
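If the connection fails, a quick way to isolate whether the problem is Power BI or the endpoint itself is to test the same hostname, HTTP path, and token from Python. This is a minimal sketch, assuming the databricks-sql-connector package (pip install databricks-sql-connector) and the placeholder values from Steps 1 and 2 above:

```python
# Optional sanity check: verify the SQL endpoint and PAT outside Power BI.
# Assumes: pip install databricks-sql-connector
from databricks import sql

# Placeholder values from Steps 1 and 2 above; substitute your own.
SERVER_HOSTNAME = "adb-1234567890123456.7.azuredatabricks.net"
HTTP_PATH = "/sql/1.0/endpoints/1234abcd5678efgh"
ACCESS_TOKEN = "<your-personal-access-token>"

with sql.connect(
    server_hostname=SERVER_HOSTNAME,
    http_path=HTTP_PATH,
    access_token=ACCESS_TOKEN,
) as connection:
    with connection.cursor() as cursor:
        # List the tables Power BI should also be able to see.
        cursor.execute("SHOW TABLES")
        for row in cursor.fetchall():
            print(row)
```

If this script lists your tables, the endpoint and token are fine and the issue lies in the Power BI configuration.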
Enabling Data-Driven Decisions: How Deep Foods Transformed Sales Performance with Power BI
How Delta Lake Keeps Your Data Clean, Consistent, and Future-Ready
Delta Lake is a storage layer that brings reliability, consistency, and flexibility to big data lakes. It enables advanced features such as Time Travel, Schema Evolution, and ACID Transactions, which are crucial for modern data pipelines.

Features and benefits at a glance:
– Time Travel: access historical data for auditing, recovery, or analysis.
– Schema Evolution: adapt automatically to changes in the data schema.
– ACID Transactions: guarantee reliable and consistent data with atomic upserts.

1. Time Travel

Time Travel allows you to access historical versions of your data, making it possible to "go back in time" and query past snapshots of your dataset.

Use Cases:
– Recover accidentally deleted or updated data.
– Audit and track changes over time.
– Compare dataset versions for analytics.

How it works:
Delta Lake maintains a transaction log that records every change made to the table. You can query a previous version using either a timestamp or a version number (see Example 1 at the end of this post).

2. Schema Evolution

Schema Evolution allows your Delta table to adapt automatically to changes in the data schema without breaking your pipelines.

Use Cases:
– Adding new columns to your dataset.
– Adjusting to evolving business requirements.
– Simplifying ETL pipelines when source data changes.

How it works:
When enabled, Delta automatically updates the table schema if the incoming data contains new columns (see Example 2 at the end of this post).

3. ACID Transactions (with Atomic Upsert)

ACID Transactions (Atomicity, Consistency, Isolation, Durability) ensure that all data operations are reliable and consistent, even in the presence of concurrent reads and writes. Atomic Upsert guarantees that an update or insert operation happens fully or not at all.

Key Benefits:
– No partial updates: either all changes succeed or none.
– Safe concurrent updates from multiple users or jobs.
– Consistent data for reporting and analytics.
– Atomic Upsert ensures data integrity during merges.

An atomic upsert is expressed as a MERGE (see Example 3 at the end of this post). There:
– whenMatchedUpdateAll() updates existing rows.
– whenNotMatchedInsertAll() inserts new rows.
– The operation is atomic: either all updates and inserts succeed together or none do.

To conclude, Delta Lake makes data pipelines modern, maintainable, and error-proof. By leveraging Time Travel, Schema Evolution, and ACID Transactions, you can build robust analytics and ETL workflows with confidence, ensuring reliability, consistency, and adaptability in your data lake operations. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
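Example 1 (Time Travel): a minimal PySpark sketch, assuming a Databricks notebook where a SparkSession named spark is available and a hypothetical table path:

```python
# Time Travel: read an older snapshot of a Delta table.
# Assumes a Databricks/Spark session (`spark`); the path is a placeholder.
path = "/mnt/datalake/curated/customers"

# Query a previous version by version number...
df_v5 = spark.read.format("delta").option("versionAsOf", 5).load(path)

# ...or by timestamp.
df_old = (
    spark.read.format("delta")
    .option("timestampAsOf", "2024-01-01")
    .load(path)
)
df_old.show()
```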
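Example 2 (Schema Evolution): a sketch of appending a batch whose schema carries a new column, again assuming `spark` and the same hypothetical path. The mergeSchema option tells Delta to evolve the table schema instead of failing:

```python
# Schema Evolution: append data that introduces a new column.
from pyspark.sql import Row

path = "/mnt/datalake/curated/customers"

# Incoming batch with an extra column (`loyalty_tier`) not yet in the table.
new_batch = spark.createDataFrame([
    Row(id=101, name="Acme Corp", loyalty_tier="Gold"),
])

# mergeSchema lets Delta update the table schema automatically.
(
    new_batch.write.format("delta")
    .mode("append")
    .option("mergeSchema", "true")
    .save(path)
)
```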
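Example 3 (Atomic Upsert with MERGE): a sketch using the DeltaTable API with the whenMatched/whenNotMatched calls described above, assuming `spark`, the delta-spark package, and the same hypothetical path:

```python
# Atomic upsert: update matching rows, insert the rest, in one transaction.
from delta.tables import DeltaTable

path = "/mnt/datalake/curated/customers"
target = DeltaTable.forPath(spark, path)

# Hypothetical incoming changes keyed by `id`.
updates = spark.createDataFrame(
    [(101, "Acme Corporation"), (202, "Globex")], ["id", "name"]
)

(
    target.alias("t")
    .merge(updates.alias("s"), "t.id = s.id")
    .whenMatchedUpdateAll()     # update existing rows
    .whenNotMatchedInsertAll()  # insert new rows
    .execute()                  # the whole operation commits atomically
)
```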
Handling Errors and Retries in Dynamics 365 Logic App Integrations
Integrating Dynamics 365 (D365) with external systems using Azure Logic Apps is one of the most common patterns for automation. But in real-world projects, things rarely go smoothly: API throttling, network timeouts, and unexpected data issues are everyday challenges. Without proper error handling and retry strategies, these issues can result in data mismatches, missed transactions, or broken integrations. In this blog, we'll explore how to handle errors and implement retries in D365 Logic App integrations, ensuring your workflows are reliable, resilient, and production-ready.

1. Why Error Handling Matters in D365 Integrations

Without handling failures such as throttling, timeouts, and bad data, your Logic App either fails silently or stops execution entirely, causing broken processes.

2. Built-in Retry Policies in Logic Apps

What they are: every Logic App action comes with a retry policy that can be configured to automatically retry failed requests (a JSON sketch appears at the end of this post).

Best practice: prefer an exponential retry policy with a bounded retry count, so transient throttling and timeouts recover on their own without hammering the endpoint.

3. Handling Errors with Scopes and "Run After"

Scopes in Logic Apps let you group actions and then define what happens if they succeed or fail. The typical steps are to wrap your main actions in a "Try" scope, add a separate error-handling action or scope, and configure its "run after" settings to trigger on Failed or TimedOut (a sketch of this wiring also appears at the end of this post).

4. Designing Retry + Error Flow Together

Recommended pattern: let the built-in retry policy absorb transient faults first, then route anything that still fails into the error-handling scope, where the failure is logged and someone is notified. This ensures no transaction is silently lost.

5. Handling Dead-lettering with Service Bus (Advanced)

For high-volume integrations, you may need a dead-letter queue (DLQ) approach: messages that repeatedly fail are moved to the queue's dead-letter sub-queue, where they can be inspected, corrected, and replayed later. This pattern prevents data loss while keeping integrations lightweight.

6. Monitoring & Observability

Error handling isn't complete without monitoring.

Building resilient integrations between D365 and Logic Apps isn't just about connecting APIs; it's about ensuring reliability even when things go wrong. By configuring retry policies, using scopes for error handling, and adopting dead-lettering for advanced cases, you'll drastically reduce downtime and data mismatches. Next time you design a D365 Logic App, don't just think about the happy path. Build error handling and retry strategies from the start, and you'll thank yourself later when your integration survives the unexpected. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
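For reference, here is a minimal sketch of what a configured retry policy from section 2 can look like in the Logic App's code view. The action name and URI are placeholders; the retryPolicy shape (type, count, interval) follows the standard workflow definition schema:

```json
"Get_D365_Accounts": {
  "type": "Http",
  "inputs": {
    "method": "GET",
    "uri": "https://contoso.crm.dynamics.com/api/data/v9.2/accounts",
    "retryPolicy": {
      "type": "exponential",
      "count": 4,
      "interval": "PT15S",
      "maximumInterval": "PT1H"
    }
  },
  "runAfter": {}
}
```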
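And a sketch of the "run after" wiring from section 3: a hypothetical Log_Failure action that runs only when a scope named Scope_Try fails or times out, capturing the scope's outcome with the result() expression:

```json
"Log_Failure": {
  "type": "Compose",
  "inputs": "@result('Scope_Try')",
  "runAfter": {
    "Scope_Try": ["Failed", "TimedOut"]
  }
}
```

In the designer, this is the same as setting the action's "Configure run after" options to Failed and Timed out.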
Workspaces in Business Central AL Explained
When developing in Microsoft Dynamics 365 Business Central, you spend a lot of time working in Visual Studio Code. To streamline productivity and keep projects well organized, workspaces in Business Central AL play a critical role. In this blog, we'll explore what workspaces are, why they matter, and how you can use them effectively in your AL development journey.

What is a Workspace in Business Central AL?

A workspace in Visual Studio Code is essentially a container that holds your project's structure, settings, and configurations. In Business Central AL development, a workspace defines which AL projects, dependencies, and settings belong together. In short, a workspace ensures that everything needed to build and deploy an extension is neatly bundled together.

Benefits of Using Workspaces

Creating and Managing Workspaces

Tip: Save your workspace using File > Save Workspace As… so you can reopen it quickly in the future.

Example: Multi-root Workspace

When working with multiple extensions in a workspace, handling dependencies used to mean installing each required app one by one. Now, the development environment can automatically look at the dependency graph in your workspace and publish the necessary projects along with the one you selected. This way, you can focus on building and testing without worrying about missing dependencies.

Imagine you're working with multiple extensions in your Business Central environment: a Base App and three dependent projects, ALProject1, ALProject2, and ALProject3. The workflow looks like this:
– Add each project folder to the workspace.
– Save the workspace (keeping your workspace files in a separate folder makes them easy to find).
– Publish with full dependencies, so the projects your selected extension depends on are published along with it.

This allows you to debug, build, and manage all the extensions from a single VS Code instance. (A sample workspace file appears at the end of this post.)

Best Practices for Workspaces

To conclude, workspaces in Business Central AL are more than just folders; they are the foundation of your development environment. By structuring your projects with well-maintained workspaces, you ensure smoother collaboration, better organization, and efficient extension deployment. If you're just starting with AL, experiment with single-project workspaces, and as you grow, explore multi-root setups to manage larger development scenarios. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
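For illustration, here is a minimal multi-root .code-workspace sketch for the scenario above, assuming the three hypothetical project folders sit side by side and that you want a shared symbol cache via the AL extension's al.packageCachePath setting:

```json
{
  "folders": [
    { "path": "ALProject1" },
    { "path": "ALProject2" },
    { "path": "ALProject3" }
  ],
  "settings": {
    "al.packageCachePath": ".alpackages"
  }
}
```

Save this as something like MyExtensions.code-workspace and open it via File > Open Workspace from File… to work on all three projects in one VS Code instance.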
From Microsoft Dynamics GP to Business Central: Why the Move Is About More Than Just Technology
For years, Microsoft Dynamics GP has been a reliable ERP system, helping businesses streamline financial operations, maintain compliance, and drive efficiency. It became a backbone for thousands of organizations, particularly mid-sized businesses that valued its stability and robustness. But the business landscape has changed dramatically. Markets move faster. Customer expectations are higher. And technology is no longer just a support function; it is the engine of growth, agility, and innovation. This is why the transition from Dynamics GP to Microsoft Dynamics 365 Business Central is not just another software upgrade. It is a strategic leap forward that determines how ready your business is for the next decade.

The Real Question: Maintain or Evolve?

Every business leader faces this decision at some point: continue maintaining what's familiar or evolve into what's next. GP offers stability, but that stability now comes with limitations: manual upgrades, server costs, and restricted scalability. For many companies, these challenges are becoming a bottleneck to innovation. On the other hand, Business Central offers agility. It's a modern, cloud-first ERP that grows with your business, continuously innovates, and seamlessly integrates with the entire Microsoft ecosystem. In today's world, standing still is the same as moving backward. The choice is simple: maintain what works or evolve toward what drives growth.

What Businesses Gain with Business Central

Always Up to Date
No more manual upgrades or disruptive transitions. Business Central runs on the cloud with continuous updates and innovations at no additional cost. This means your team is always using the latest technology, features, and security enhancements without the burden of maintenance.

Faster Decisions, Smarter Moves
In an age where data drives competitive advantage, Business Central integrates seamlessly with Power BI and embedded analytics to deliver real-time insights. Leaders can act on facts, not assumptions, and empower their teams to make faster, data-driven decisions that move the business forward.

Scalability Without Limits
Growth brings complexity: new markets, entities, currencies, and compliance requirements. Business Central scales effortlessly to handle it all. Whether you are expanding into new geographies or diversifying your business model, the system grows with you, not against you.

An Integrated Digital Workplace
Business Central works hand in hand with Microsoft 365, Teams, Power Automate, and AI. The result is a truly connected workplace where data flows freely, collaboration improves, and manual processes give way to automation. This integration not only boosts productivity but also builds a culture of transparency and shared accountability.

Cost Efficiency and Risk Reduction
By eliminating on-premise IT infrastructure, you reduce overheads, lower downtime, and free up valuable resources to focus on innovation. With built-in security, compliance, and automated backups, your business becomes more resilient and future-proof.

A Transformation Story

At CloudFronts, we recently began working with a mid-sized client who had been running Dynamics GP for nearly three decades. GP had been the financial backbone of their operations and had served them well. However, the leadership team recognized an emerging reality: GP will soon reach its end of life, and continuing to rely on it would increase both operational risk and cost.
They made a strategic decision: migrate to Business Central and secure a platform built for the next decade of growth. Their goals were clear: reduce operational risk and cost, and build on a foundation that supports continuous growth. This migration is now underway, and the client views it not as an IT project, but as a business transformation initiative. For them, Business Central represents the foundation of a connected, intelligent enterprise, one where decisions are faster, processes are leaner, and growth is continuous.

Why Now Is the Right Time

Many businesses delay ERP migrations because "things are working fine." But the reality is that postponing the move comes with hidden risks: rising IT maintenance costs, outdated security models, dependency on legacy infrastructure, and the gradual loss of talent familiar with older systems. At the same time, competitors who embrace modern ERP platforms are moving faster, integrating AI, automating workflows, and leveraging real-time insights. The cost of waiting is not just financial; it is strategic. Business Central is more than an ERP. It is a platform for growth, intelligence, and resilience. It enables organizations to future-proof their operations while staying agile in an unpredictable world.

The Takeaway

Migrating from GP to Business Central is not a technical move; it is a business transformation decision. It means staying always current, deciding on real-time data, scaling without limits, and working in one connected Microsoft ecosystem. With Dynamics GP approaching its end of life, the question is not if you should move, but when and how strategically you make that move. The time to act is now. If you are evaluating your options or planning your next steps, let's talk. At CloudFronts, we've helped businesses across industries transition from legacy ERP systems to modern, scalable platforms like Business Central with minimal disruption and maximum value. Reach out at transform@cloudfronts.com. Let's explore how you can evolve confidently into the future of business.
Step-by-Step Year-End Closing Guide for Dynamics 365 F&O Users
This blog provides a comprehensive, step-by-step guide for performing a year-end close in Microsoft Dynamics 365 Finance and Operations (F&O), along with common issues and tips.

1. Pre-Close Activities

2. Performing the Year-End Close

A. Set Up the Fiscal Year-End Parameters

B. Perform the Year-End Close Process

C. Review Results

D. Post-Close Activities

3. Common Issues and Tips

– Profit and loss accounts didn't zero out: check the ledger account settings.
– Retained Earnings entry missing: verify the Retained Earnings account in the ledger setup.
– Incorrect balances after the year-end close: use the "Reverse" option and review the postings.
– User doesn't have access: ensure the role has Financial Period Close privileges.

(Screenshot: Ledger Calendar page)

Performing the year-end close in D365 F&O is a structured process that ensures data integrity across financial periods. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Making the Right Choice: Session Storage or Local Storage in Power Pages
Purpose: to clarify when to use session storage vs local storage in Power Apps Portals (Power Pages), using a real-world use case: syncing a customer dropdown selection across pages in a portal.

Scenario Overview

I was working on a customer dropdown inside a Power Apps Portal dashboard. When a customer was selected, I wanted that selection to carry over to the portal's other pages (for example, the Contracts page). This brought up the key question: "Should I use session storage or local storage?"

Understanding Session Storage vs Local Storage

Before diving into the solution, let's break down the difference:

🔹 Session Storage: scoped to the current browser session and cleared when it ends.
🔹 Local Storage: persists until explicitly cleared, even across browser restarts.

Key difference: sessionStorage forgets everything when the session ends, while localStorage remembers indefinitely.

Problem Statement

When building multi-page Power Apps Portals, we often need to carry user selections (like Account/Customer IDs) across pages. But which storage should we use?

Initial Approach and Issue

I first used localStorage for the customer selection. While it worked across pages, it had one drawback: the stored selection survived sign-out, so it was still applied when users logged back in. This confused users because they expected a "fresh start" after logging back in.

Working Solution: Using Session Storage

The best solution was to use sessionStorage: the selection synced across pages for the duration of the visit and reset cleanly afterwards.

Power Apps Portal Code Example (sketches appear at the end of this post):
1. Store the customer selection in sessionStorage (Dashboard page).
2. Apply the selection on another page (e.g., the Contracts page).
3. Clear the selection on sign-out (Sign-In page).

Benefits of This Approach

In Power Apps Portals, deciding between sessionStorage and localStorage depends on user expectations. In my customer dropdown scenario, sessionStorage was the clear winner, as it ensured selections synced across pages but reset cleanly at logout. Sometimes it's not about picking one, but knowing when to use both to balance persistence with user experience.

Need help implementing storage logic in your Power Pages? Reach out to CloudFronts Technologies, as I've already implemented this use case for syncing dropdowns between the Dashboard and child pages in a portal website. You can contact us directly at transform@cloudfronts.com.
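Here is a minimal sketch of the three snippets. The element IDs (customerDropdown, signOutLink) and the storage key selectedCustomerId are hypothetical; adapt them to your portal's actual markup.

```javascript
// 1. Dashboard page: store the selection whenever the dropdown changes.
document.getElementById("customerDropdown").addEventListener("change", function () {
  sessionStorage.setItem("selectedCustomerId", this.value);
});

// 2. Contracts page (or any child page): re-apply the stored selection on load.
document.addEventListener("DOMContentLoaded", function () {
  var savedId = sessionStorage.getItem("selectedCustomerId");
  var dropdown = document.getElementById("customerDropdown");
  if (savedId && dropdown) {
    dropdown.value = savedId;
  }
});

// 3. Sign-out link: clear the selection so the next visit starts fresh.
document.getElementById("signOutLink").addEventListener("click", function () {
  sessionStorage.removeItem("selectedCustomerId");
});
```

Because sessionStorage is already scoped to the browser session, snippet 3 is a belt-and-braces cleanup for portals where users sign out without closing the tab.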
Submit Attachments Over 1GB Through MS Forms
One limitation of working with MS Forms is the 1 GB cap on files submitted through a form. Many of you are likely using Forms to collect files from users or clients outside your organization, and those files can exceed 1 GB. In this blog, I will show you how to let users submit files over 1 GB through MS Forms and store the response in a SharePoint list. So let's begin.

Approach: MS Forms stores all uploaded files in your OneDrive. OneDrive also offers a feature called "Request Files", which lets you create a shareable link to a OneDrive folder; anyone with the link can upload files to that folder, with no limit on file size. So instead of using the form's built-in file upload question, we will place the Request Files link on the form, and users will be able to submit documents of any size. Let's see how to do this.

Steps:
1. Create a shareable link to a OneDrive folder using the Request Files feature.
2. Copy this link and save it; we will be using it in our MS Form.
3. Create the MS Form and add the link wherever it fits, for example in the form description or in a section subtitle. (Both are just examples of how you can surface the link to users.)
4. Submitted attachments are stored in the OneDrive folder you created.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
