
Category Archives: Blog

From Clean Data to Insights: Integrating Azure Databricks with Power BI and MLflow

Cleaning data is only half the journey. The real value comes when that clean, reliable data powers dashboards for decision-makers and machine learning models for prediction. In this post, we’ll explore two powerful integrations of Azure Databricks: Power BI and MLflow.

Why These Integrations Matter
For growing businesses, these integrations create a bridge from cleaned data → insights → action.

Practical Example 1: Databricks + Power BI
👉 Result: Executives can open Power BI and instantly see up-to-date sales performance across geographies.

Practical Example 2: Databricks + MLflow
👉 Result: Your business can predict customer trends, forecast sales, or identify churn risk directly from cleaned Databricks data.

To conclude, these integrations help organizations move from cleaned data → insights → intelligent action.

✅ Already cleaning data in Databricks? Try connecting your first Power BI dashboard today.
✅ Want to explore AI? Start logging experiments with MLflow to track and deploy models seamlessly (a minimal sketch follows below).

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
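To make the MLflow side concrete, here is a minimal, hedged sketch of logging an experiment run, assuming a Databricks notebook with scikit-learn available; the table name sales_clean and the feature choice are illustrative, not the exact model from this post.

import mlflow
import mlflow.sklearn
import pandas as pd
from pyspark.sql import SparkSession
from sklearn.linear_model import LinearRegression

spark = SparkSession.builder.getOrCreate()   # in a Databricks notebook this returns the built-in session

# Assumption: "sales_clean" is a hypothetical table produced by the cleaning pipeline,
# with Country and Sales columns.
df = spark.table("sales_clean").toPandas()

X = pd.get_dummies(df[["Country"]])   # toy feature: one-hot encoded country
y = df["Sales"]

with mlflow.start_run(run_name="sales_baseline"):
    model = LinearRegression().fit(X, y)
    mlflow.log_param("model_type", "LinearRegression")
    mlflow.log_metric("r2_train", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")   # logged models can later be registered and served

Each run logged this way appears in the Databricks Experiments UI, which is what makes the cleaned data → insights → intelligent action loop traceable.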


Migrating from Dynamics GP to Business Central: A Leap Towards the Future

For years, Microsoft Dynamics GP has been a reliable ERP system, helping businesses streamline financial operations. But the world has changed. Markets move faster, customer expectations are higher, and technology is no longer just a support function – it’s the backbone of growth. This is why the transition from Dynamics GP to Microsoft Dynamics 365 Business Central isn’t just another upgrade. It’s a strategic leap forward.

The Real Question: Maintain or Evolve?
In today’s world, standing still is the same as moving backward. The choice is simple: maintain what works or evolve to what’s next.

What Businesses Gain with Business Central

A Transformation Story
We’re currently working with a mid-sized client who has been running Dynamics GP for nearly three decades. While GP had served them well, the leadership team realized that GP will be obsolete in just a few years. Continuing with GP would only add more risk and cost. That’s why they made a strategic decision: migrate to Business Central, ensuring they move to a platform built for the future. Their goals for the migration are clear, the migration is underway, and the client sees it as the foundation for their next decade of growth.

Why Now Is the Right Time
Postponing migration might feel safe, but it carries hidden risks: increasing IT costs, reliance on outdated processes, and missing out on innovations competitors are already leveraging. Business Central is more than an ERP—it’s a platform for growth, intelligence, and resilience.

The Takeaway
Migrating from GP to Business Central is not a technical move – it’s a business transformation. With GP reaching its end of life in the coming years, now is the time to make the transition confidently and strategically. Feel free to reach out – you can contact us at transform@cloudfronts.com. Let’s work together to find the right step for your success.


A Unified Approach to Developing Finance and Operations Applications

Microsoft’s Unified Developer Experience (UDE) helps developers build solutions that work across both Finance and Operations (F&O) and the Power Platform by providing a common, cloud-based environment.

Challenges Before UDE
Before UDE, developers often faced a number of issues caused by separate tooling and environments for F&O and the Power Platform.

What UDE Changes
With UDE, Microsoft combines these tools into one environment, making development across F&O and the Power Platform easier.

Why UDE Is Useful
Adopting UDE brings several practical benefits for developers and organizations.

Check Access, Licenses, and Capacity
Before starting, make sure your user role, license, and environment capacity are all set up properly. You can check this in the Power Platform Admin Center.

Starting the Setup with PowerShell
To get started, open PowerShell ISE on your laptop. If you haven’t installed the required Power Platform module yet, run this command (skip it if it’s already installed):

# Install the module
Install-Module -Name Microsoft.PowerApps.Administration.PowerShell -Force

Next, sign in to your account and prepare the JSON template that defines your environment settings. Make sure DevToolsEnabled is set to true so developer tools are available. You can also set DemoDataEnabled to true if you want sample Contoso data included by default.

Write-Host "Creating a session against the Power Platform API"
Add-PowerAppsAccount -Endpoint prod

# Construct the JSON object to pass in
$jsonObject = @"
{
  "PostProvisioningPackages": [
    {
      "applicationUniqueName": "msdyn_FinanceAndOperationsProvisioningAppAnchor",
      "parameters": "DevToolsEnabled=true|DemoDataEnabled=true"
    }
  ]
}
"@ | ConvertFrom-Json

Finally, you’re ready to start the environment deployment.

New-AdminPowerAppEnvironment -DisplayName "EnvironmentName" -EnvironmentSku Sandbox -Templates "D365_FinOps_Finance" -TemplateMetadata $jsonObject -LocationName "unitedstates" -ProvisionDatabase

Example:

New-AdminPowerAppEnvironment -DisplayName "Basic_Env" -EnvironmentSku Sandbox -LocationName "unitedstates" -Templates "D365_FinOps_Finance" -TemplateMetadata $jsonObject -ProvisionDatabase

Make sure to use a proper name for your environment — it must be 20 characters or fewer. Also, pick the correct data center location based on your region (for example, I used unitedstates, but you could choose India or another available region).

Alternatively: Install on an Existing Environment
If you already have a Power Platform environment with a Dataverse database, you can use it to install Finance and Operations apps. Simply select the environment, navigate to Resources > Dynamics 365 apps, and then select Dynamics 365 Finance and Operations Provisioning App.

Once your environment is successfully provisioned, you’ll see it listed in the Power Platform Admin Center — just like in the screenshot above. The environment page shows the key information about your environment, along with links to manage settings that control access and structure within it. This confirms your Finance + Power Platform environment is now fully functional and integrated — ready for development, testing, and customization.

Make sure your user account has the System Administrator security role in Dataverse. Once assigned, this role will automatically carry over to the Finance and Operations (F&O) environment — no need to reassign it separately. If you navigate to Dynamics 365 apps, you’ll also find pre-configured and installed solutions available. You can check out the Modules, Packages, and Operation History by simply clicking on the Environment URL.
System Requirements for Setting Up the Development Environment
Before you begin working with the Unified Development Experience (UDE), it’s important to make sure your machine meets the basic hardware and software requirements.

Workstation Requirements
To ensure smooth performance while developing, your workstation should meet the recommended hardware specifications.

Required Software
The following software components are essential for working with UDE in Visual Studio. Once everything is set up, you’re ready to open Visual Studio. Make sure to run it as Administrator and choose the “Continue without code” option when prompted. This ensures all tools load properly and you’re ready to begin your development work.

Install the Power Platform VS Extension
Go to VS > Manage Extensions and search for ‘Power Platform Tools’. Now, navigate to Tools > Options > Power Platform Tools and enable the specified parameters. Then go to Tools > Connect to Dataverse and select the option to always show the full list of organizations. Avoid signing in with your current Windows user if it’s not the same account you’ve already connected to in Visual Studio. You can view the environments you previously created in PowerShell – just select the one you set up earlier. Choose the default option, unless you’re planning to create specific components for D365 CE or Power Platform—in that case, it’s best to create a dedicated solution and publisher for your work.

If the X++ source code for your specific UnO DevBox version (e.g., 10.0.35) hasn’t been downloaded yet, you’ll be prompted to get it locally. After setting up the Power Platform Tools extension and connecting to your Dataverse sandbox, you’ll see an option to install the Finance and Operations extension for Visual Studio, along with the related metadata. If you don’t get this option, you can download it manually from “C:\Users\ShubhamPrajapati\AppData\Local\Microsoft\Dynamics365\10.0.2263.74”.

Meanwhile, in the background, the PackageLocalDirectory is being extracted. You can monitor the progress by going to View > Output. The installation typically takes around 30 minutes. After installation, you’ll see a few prompts the first time you open Visual Studio—just click “Yes” to continue. As you can see, all models have been downloaded successfully. You can switch between Classic View and Model View by right-clicking on the AOT. Once that’s done, navigate to Tools > Options > Power Platform Tools and apply the required changes as shown in the image below.

The final step is to configure the Finance & Operations extension. In my case, I use LocalDB for the Cross Reference (Cross Ref) Database—it’s convenient because it’s already included when you install Visual Studio. If you’re using LocalDB, ensure your connection string is correct. A typical value is: (localdb)\

To set up LocalDB (if not already initialized), open Command Prompt and run:

sqllocaldb create MSSQLLocalDB -s

This command initializes and starts the LocalDB instance. Once LocalDB is running, your Cross Reference Database will be restored. This enables key development features that significantly enhance the development experience by improving code navigation and reference tracking.

If you receive errors when trying to open certain class files (which are XML files under the hood), it’s likely because the Modeling SDK is not installed. This SDK is essential for working with …


Dealing with ISV Extension Updates in Business Central: A Practical Guide

As of August 2025, the number of third-party apps available for Dynamics 365 Business Central on Microsoft AppSource is estimated to be between 4,000 and 6,500. Microsoft regularly publishes marketplace updates, with 200–300 new offers added monthly across all product lines, a significant portion of which are Business Central extensions. It’s clear that the Business Central ecosystem is rapidly growing, making extension update management increasingly critical. The majority of clients we’ve worked with use third-party modules to enhance their business processes, making the management of extension updates a critical part of our environment health checklist.

References
Apps added to MS AppSource
MS Learn | Automatically update AppSource apps with Business Central updates

Configuration
If you don’t have access to the Business Central Admin Center, your only option is to uninstall the extension and then reinstall it from AppSource. However, if you do have access to the Admin Center and prefer to manage app updates manually, navigate to your environment and click on Apps. Here, you can see the installed apps and any available updates.

For those looking to automate this process, Microsoft offers the “App Update Cadence” setting, which controls how and when apps are updated alongside Business Central. There are three available settings.

To conclude, managing third-party extension updates in D365 Business Central is essential to maintaining a stable and reliable environment. Whether updates are handled manually through the Admin Center or automated using the App Update Cadence feature, having a clear process helps minimize disruptions. With the growing number of extensions in AppSource, proactively testing updates, monitoring changes, and coordinating with ISV partners ensures your Business Central environment stays healthy and future-ready. If you need further assistance or have specific questions about your ERP setup, feel free to reach out for personalized guidance. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


From Raw to Reliable: Cleaning Data at Scale with Azure Databricks

Are you struggling with messy spreadsheets full of duplicates, missing values, and inconsistent records? You’re not alone. Data professionals spend nearly 80% of their time cleaning and preparing data before any real analysis begins. The truth is simple: without clean data, business reports are unreliable, AI models fail, and decision-making slows down. In this blog, we’ll show you how Azure Databricks makes data cleaning easier, faster, and scalable—turning raw inputs into reliable insights with just a few lines of code.

Why Clean Data Matters
For business leaders, whether you’re a Team Lead, CTO, or CEO, clean data directly impacts growth. With Azure Databricks, you get a cloud-native, Spark-powered platform that handles big data at scale while integrating seamlessly with Azure Data Lake, Synapse, and Power BI.

Practical Example: Cleaning a Sales Dataset in Azure Databricks
Imagine you have a raw CSV file in Azure Data Lake with customer sales data. The data contains issues such as duplicates, missing values, and inconsistent records. The fix is a short PySpark cleaning job in Databricks (a representative sketch is shown at the end of this post).

Output after cleaning:

CustomerID | Name    | Country | Sales
101        | Alice   | USA     | 500
102        | Bob     | USA     | 300
103        | Unknown | UK      | 450
104        | David   | India   | 0

With just a few lines of Spark code, the dataset is now ready for reporting, visualization, or machine learning.

To conclude, clean data is the foundation of every reliable business insight. With Azure Databricks, you can automate messy, manual processes and create repeatable, scalable pipelines that keep your data reliable—no matter how fast your business grows.

✅ Start small: try building a simple cleaning pipeline in Azure Databricks today.
✅ Save time: focus more on insights, less on manual data prep.
✅ Scale with confidence: as your data grows, Databricks grows with you.

👉 Want to take the next step? Explore how Databricks integrates with Power BI for real-time dashboards or with MLflow for machine learning pipelines. Stay tuned for our next post where we’ll cover these use cases in detail.

✨ With Databricks, your journey from raw to reliable data starts today. Contact us today at Transform@cloudfronts.com to get started.

To learn more about the functionalities of Databricks and other Azure AI services, please refer to my other blogs from the links given below:
1] The Hidden Cost of Bad Data: How Strong Data Management Unlocks Scalable, Accurate AI – CloudFronts
2] Automating Document Vectorization from SharePoint Using Azure Logic Apps and Azure AI Search – CloudFronts
3] Using Open AI and Logic Apps to develop a Copilot agent for Elevator Pitches & Lead Qualification – CloudFronts
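Here is the representative cleaning sketch referenced above, assuming a Databricks notebook or any PySpark environment; the storage path, column names, and output table name are illustrative assumptions that match the sample output, not the exact notebook behind this post.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()   # in Databricks this returns the built-in session

# Read the raw CSV from Azure Data Lake (the path is a placeholder).
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("abfss://raw@yourdatalake.dfs.core.windows.net/sales/customer_sales.csv"))

clean = (raw
         .dropDuplicates(["CustomerID"])                                    # drop duplicate customer rows
         .withColumn("Name", F.coalesce(F.col("Name"), F.lit("Unknown")))   # fill missing names
         .withColumn("Sales", F.coalesce(F.col("Sales"), F.lit(0)))         # default missing sales to 0
         .withColumn("Country", F.trim(F.col("Country"))))                  # tidy inconsistent country values

clean.write.mode("overwrite").saveAsTable("sales_clean")                    # ready for Power BI, MLflow, etc.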


Setting Up Unity Catalog in Databricks for Centralized Data Governance

The fastest way to lose control of enterprise data? Managing governance separately across workspaces. Unity Catalog solves this with one centralized layer for security, lineage, and discovery. Data governance is crucial for any organization looking to manage and secure its data assets effectively. Databricks’ Unity Catalog is a centralized solution that provides a unified interface for managing access control, auditing, data lineage, and discovery. This blog will guide you through the process of setting up Unity Catalog in your Databricks workspace.

What is Unity Catalog?
Unity Catalog is Databricks’ answer to centralized data governance. It enables organizations to enforce standards-compliant security policies, apply fine-grained access controls, and visualize data lineage across multiple workspaces. It ensures compliance and promotes efficient data management.

Key Features:
1] Standards-Compliant Security: ANSI SQL-based access policies that apply across all workspaces in a region.
2] Fine-Grained Access Control: Support for row- and column-level permissions.
3] Audit Logging: Tracks who accessed what data and when.
4] Data Lineage: Provides visualization of data flow and dependencies.

Unity Catalog Object Hierarchy
Before diving into the setup, it’s important to understand the hierarchical structure of Unity Catalog:
1] Catalogs: The top-level container (e.g., Production, Development) that represents an organizational unit or environment.
2] Schemas: Logical groupings of tables, views, and AI models within a catalog.
3] Tables and Views: These include managed tables fully governed by Unity Catalog and external tables referencing existing cloud storage.

Here is the procedure to set up a Unity Catalog metastore in association with Azure Storage, as I have done for one of our products (SmartPitch Sales & Marketing Agent):

1] First, create a storage account with the primary service being “Azure Blob Storage or Azure Data Lake Storage Gen 2”; performance and redundancy can be chosen based on the requirement for which the Databricks service is being used. Here, for my Mosaic AI Agent, I have used Locally Redundant Storage & Data Lake Gen 2.

2] Once the storage account is created, ensure that you have enabled “Hierarchical Namespace”. When creating a Unity Catalog metastore with Azure Blob Storage, Hierarchical Namespace (HNS) is required because Unity Catalog needs:
a] Folder-like structure to organize catalogs, schemas, and tables.
b] Atomic operations (rename, move, delete) on directories and files.
c] POSIX-style access controls for fine-grained permissions.
d] Faster metadata handling for lineage and governance.
HNS turns Azure Blob into ADLS Gen2, which supports these features.

3] Upload any raw/unclean files to your metastore folder in the blob storage, which will be required for your use in Databricks.

4] Create a Unity Catalog connector in the Azure Portal and assign it the “Storage Blob Data Contributor” role.

5] Assign CORS (Cross-Origin Resource Sharing) settings for that storage account. Why is this necessary? In short, without configuring CORS, Databricks cannot communicate with your storage container to read/write managed tables, schema metadata, or logs.

6] Generate a SAS token.

7] Navigate to your workspace and select “Manage Account” – this should be done by the account admin.

8] Select the Catalog tab on the left and then click “Create Metastore”.

9] Assign a name, region (same as the workspace), the path to the storage account, and the connector ID.
10] Once the metastore is created, assign it to a workspace.

11] Once this is done, the catalogs, schemas, and tables within it can be created (a minimal creation sketch is shown at the end of this post).

How does Unity Catalog differ from Hive Metastore?

Feature | Hive Metastore | Unity Catalog
Scope | Workspace or cluster-specific | Centralized, spans multiple workspaces and regions
Architecture | Single metastore tied to Spark/Hive | Cloud-native service integrated with Databricks
Object Hierarchy | Databases → Tables → Partitions | Catalogs → Schemas → Tables/Views/Models
Data Assets Supported | Tables, views | Tables, views, files, ML models, dashboards
Security | Basic GRANT/DENY at database/table level | Fine-grained, ANSI SQL–based (catalog, schema, table, column, row)
Lineage | Not available | Built-in lineage and impact analysis
Auditing | Limited or external | Integrated audit logs across workspaces
Storage Management | Points to storage locations; no governance | Manages external and managed tables with governance
Cloud Integration | Primarily on cluster storage or external path | Secure integration with ADLS Gen2, S3, GCS
Permissions Model | Spark SQL statements | Attribute- and role-based access, unified policies
Use Cases | Basic metadata store for Spark/Hive workloads | Enterprise-wide data governance, sharing, and compliance

To conclude, Unity Catalog is the next-generation governance and metadata solution for Databricks, designed to give organizations a single, secure, and scalable way to manage data and AI assets. Unlike the older Hive Metastore, it centralizes control across multiple workspaces, supports fine-grained access policies, delivers built-in lineage and auditing, and integrates seamlessly with cloud storage like Azure Data Lake, S3, or GCS.

When setting it up, key steps include:
1] Creating a metastore and linking it to your workspaces.
2] Enabling hierarchical namespace on Azure storage for folder-level security and operations.
3] Configuring CORS to allow Databricks domains to interact with storage.
4] Defining catalogs, schemas, and tables for structured governance.

By implementing Unity Catalog, you ensure stronger security, better compliance, and faster data discovery, making your Databricks environment enterprise-ready for analytics and AI.

Business Outcomes of Unity Catalog
By implementing Unity Catalog, organizations can achieve measurable improvements in governance, compliance, and data discovery.

Why now? As data volumes and regulatory requirements grow, organizations can no longer rely on fragmented or legacy governance tools. Unity Catalog offers a future-proof foundation for unified data management and AI governance—essential for any modern data-driven enterprise.

At CloudFronts, we help enterprises implement and optimize Unity Catalog within Databricks to ensure secure, compliant, and scalable enterprise data governance. Book a consultation with our experts to explore how Unity Catalog can simplify compliance and boost productivity for your teams. Contact us today at Transform@cloudfronts.com to get started.

To learn more about the functionalities of Databricks and other Azure AI services, please refer to my other blogs from the links given below:
1] The Hidden Cost of Bad Data: How Strong Data Management Unlocks Scalable, Accurate AI – CloudFronts
2] Automating Document Vectorization from SharePoint Using Azure Logic Apps and Azure AI Search – CloudFronts
3] Using Open AI and Logic Apps to develop a Copilot agent for Elevator Pitches & Lead Qualification – CloudFronts
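Tying back to step 11 above, here is a minimal sketch of creating governed objects once the metastore is attached, assuming a Databricks notebook on a Unity Catalog-enabled cluster; the catalog, schema, table, and group names are illustrative.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()   # in Databricks this returns the built-in session

# Create the object hierarchy: catalog -> schema -> table.
spark.sql("CREATE CATALOG IF NOT EXISTS production")
spark.sql("CREATE SCHEMA IF NOT EXISTS production.sales")
spark.sql("""
    CREATE TABLE IF NOT EXISTS production.sales.customers (
        CustomerID INT,
        Name STRING,
        Country STRING
    )
""")

# Fine-grained access: grant read access to an illustrative account-level group.
spark.sql("GRANT USE CATALOG ON CATALOG production TO `data_analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA production.sales TO `data_analysts`")
spark.sql("GRANT SELECT ON TABLE production.sales.customers TO `data_analysts`")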


Connecting Your MCP Server to Microsoft Copilot Studio – Part 2

In Part 1, we built a simple MCP server in TypeScript that exposed a “getWeather” tool. Now, let’s take the next step: connecting our MCP server to Microsoft Copilot Studio so that Copilot agents can call it directly.

Step 1 — Publish Your MCP Server to Azure
To make your MCP server accessible to Copilot Studio, you’ll need to host it online. There are multiple ways to deploy it — Azure App Service, Azure Container Apps, or even Azure Functions if you prefer serverless. For example, deploy it to Azure App Service, then test it (with curl or a small script) to ensure it responds with MCP-compatible JSON; a hedged test sketch is shown at the end of this post.

Step 2 — Create a New Copilot in Copilot Studio

Step 3 — Add Knowledge Sources
Optionally, you can enrich your Copilot by adding knowledge sources. This gives your Copilot baseline knowledge to answer broader questions, while the MCP server handles specific tasks (like fetching live weather data).

Step 4 — Create a Custom Connector in Dataverse
To let Copilot Studio talk to our MCP server, we need a custom connector inside Dataverse/CRM.

Step 5 — Add the Custom Connector to Copilot Studio
Once added, you’ll see the MCP server in the Tools section of your Copilot. To test the setup, let’s ask Copilot: “What’s the current weather in Mumbai?” On the first attempt, Copilot will prompt you to establish a connection. Simply open the Connection Manager, click Connect, authorize the link to your MCP server, and click Retry in the Test window of your Copilot. Once connected, Copilot will fetch the live weather details for Mumbai directly from your MCP server.

And just like that, your MCP server is live and fully integrated. It can now provide real-time weather updates for any city mentioned in your conversation with Copilot. You can try out different variations of questions or phrasings — Copilot will intelligently interpret your request, extract the city name, and seamlessly call the MCP server to deliver accurate weather details.

Beyond Weather: Business Integrations
The same process works for enterprise systems. Instead of getWeather, you could expose tools backed by your own business data and workflows. By publishing these tools via MCP, your Copilot becomes a true enterprise assistant, capable of pulling structured business data and triggering workflows on demand.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
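As a stand-in for that smoke test, here is a hedged sketch in Python. The App Service URL and the /mcp endpoint path are assumptions about how the Part 1 server is exposed; MCP speaks JSON-RPC 2.0, so tools/list is a reasonable first call.

import requests

# Placeholder URL for the deployed MCP server; adjust to your App Service name and route.
url = "https://my-mcp-server.azurewebsites.net/mcp"

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",   # ask the server which tools it exposes (expect "getWeather")
    "params": {},
}

resp = requests.post(
    url,
    json=payload,
    headers={"Accept": "application/json, text/event-stream"},
    timeout=30,
)
print(resp.status_code)
print(resp.text)   # should be an MCP-compatible JSON-RPC response

Note that some MCP servers require an initialize handshake or a session header before tools/list; if yours does, the response to the request above will say so.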


A Hands-on Guide to Managing Inventory with Microsoft Dynamics 365 Business Central

Inventory is the core of many businesses. Whether you’re selling products, making goods, or managing a supply chain, keeping the right stock at the right time is key. Microsoft Dynamics 365 Business Central helps businesses handle inventory with ease and clarity.

1. Central Item List
Item lists are the backbone of inventory management. Business Central lets you create a structured list of all your products—whether you buy them, sell them, or just store them. This organized list becomes the single source of truth across all departments.

2. Real-Time Inventory Levels
Business Central keeps track of stock levels in real time. This helps businesses plan better and fulfill orders faster without confusion.

3. Multi-Location Tracking
If you manage inventory in multiple places (like stores, warehouses, or branches), Business Central supports that too. You can track stock separately for each location and move items between them.

4. Reorder and Stock Planning
With built-in reorder logic, Business Central tells you when to buy and how much to buy. It considers factors such as reorder points, quantities on hand, and lead times. This reduces guesswork and supports a smooth procurement process.

5. Purchase and Sales Integration
When a purchase order is received or a sales order is shipped, inventory updates automatically. This minimizes the need for manual updates and keeps everyone on the same page.

6. Lot and Serial Number Tracking
Business Central supports lot numbers and serial numbers. This helps with traceability and recalls.

7. Inventory Valuation Methods
You can choose how to value your inventory, such as FIFO, LIFO, Specific, Average, or Standard costing. This supports accurate financial reporting and cost control.

8. Inventory Transfers
Do you need to move items from one location to another? Use transfer orders. You can record shipments and receipts for each transfer.

9. Inventory Adjustments
Sometimes physical counts don’t match system data. Business Central allows easy stock corrections for miscounted, damaged, or lost items.

10. Reports and Insights
With built-in reports and dashboards, you can track stock levels, turnover, and valuation. These insights will assist you in making well-informed decisions and planning ahead.

Why It Matters
Good inventory management helps you keep the right stock at the right time and avoid both stockouts and overstocking. Business Central gives you the tools to manage stock simply and efficiently. If you’re using spreadsheets or disconnected tools to manage inventory, now is a good time to explore Business Central. It gives you more control, better insights, and smoother operations—all in one place.

We hope you found this blog useful. If you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Simplifying File-Based Integrations for Dynamics 365 with Azure Blob and Logic Apps

Integrating external systems with Dynamics 365 often involves exchanging files like CSVs or XMLs between platforms. Traditionally, these integrations require custom code, complex workflows, or manual intervention, which increases maintenance overhead and reduces reliability. Thankfully, leveraging Azure Blob Storage and Logic Apps can streamline file-based integrations, making them more efficient, scalable, and easier to maintain.

Why File-Based Integrations Are Still Common
While APIs are the preferred method for system integration, file-based methods remain popular in many scenarios. The challenge comes in orchestrating file movement, transforming data, and ensuring it reaches Dynamics 365 reliably.

Enter Azure Blob Storage
Azure Blob Storage is a cloud-based object storage solution designed for massive scalability. When used in file-based integrations, it acts as a reliable intermediary between the external system and Dynamics 365.

Orchestrating with Logic Apps
Azure Logic Apps is a low-code platform for building automated workflows. It’s particularly useful for integrating Dynamics 365 with file sources.

Real-Time Example: Automating Sales Order Uploads
In the traditional approach, sales order files are handled manually or through custom code. With Azure Blob and Logic Apps, the external system uploads the file to Blob Storage, a Logic App picks it up, transforms the data, and creates the orders in Dynamics 365 (a minimal upload sketch is shown at the end of this post). The outcome: less manual effort, fewer errors, and a repeatable, maintainable flow.

To conclude, file-based integrations no longer need to be complicated or error-prone. By leveraging Azure Blob Storage for reliable file handling and Logic Apps for automated workflows, Dynamics 365 integrations become simpler, more maintainable, and scalable. The real-time sales order example shows that businesses can save time, reduce errors, and ensure data flows seamlessly between systems, allowing teams to focus on their core operations rather than manual file processing.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
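To make the hand-off concrete, here is a minimal, hedged sketch of the external system’s side of the flow in Python: dropping a CSV into the Blob container that the Logic App watches. The connection string variable, container name, and file name are illustrative, and the Logic App is assumed to use a blob-added trigger.

import os
from azure.storage.blob import BlobServiceClient

# Connection string supplied via an environment variable (placeholder name).
conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]

service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("incoming-sales-orders")   # illustrative container name

# Uploading the file is all the external system needs to do; the Logic App trigger
# picks it up and continues the flow (transform, then create orders in Dynamics 365).
with open("sales_orders.csv", "rb") as data:
    container.upload_blob(name="sales_orders.csv", data=data, overwrite=True)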


How to Enable Recycle Bin in Dynamics 365 CRM

Posted On September 4, 2025 by Vidit Gholam

When working with Dynamics 365 CRM, one common request from users and admins is: “How do we get a Recycle Bin to recover accidentally deleted records?” Unlike SharePoint or Windows, Dynamics 365 doesn’t come with a native Recycle Bin. But that doesn’t mean you’re out of luck! There are a few smart ways to implement soft delete or restore capabilities depending on your organization’s needs. In this blog, we’ll explore all the available options — from built-in Power Platform features to custom approaches — to simulate or enable Recycle Bin-like functionality in Dynamics 365 CRM.

Option 1: Use the Built-in Dataverse Recycle Bin (Preview/GA in Some Regions)
Microsoft is gradually rolling out a Recycle Bin feature for Dataverse environments, which can be enabled from the Power Platform Admin Center.

Option 2: Implement a Custom Recycle Bin (Recommended for Full Control)
Instead of hard-deleting records, flag them as deleted and deactivate them (a hedged sketch follows at the end of this post). You can also schedule a bulk delete job after 15–30 days to actually clear these records from Dataverse.

Option 3: Restore from Environment Backups
If a record is permanently deleted, your last line of defence is a full environment restore. Not ideal for frequent recovery, but lifesaving in major accidents.

Tips and Tools You Can Use
If you also want to track who deleted what and when, Auditing might be helpful. You cannot restore deleted records using this; it is useful only for traceability and compliance, not recovery. XrmToolBox plugins like Recycle Bin Manager simulate soft delete and allow browsing deleted records.

While Dynamics 365 CRM doesn’t provide a built-in Recycle Bin like other Microsoft products, there are several reliable ways to implement soft-delete or recovery mechanisms that fit your organization’s needs. Whether you leverage Dataverse’s native capabilities, create a custom status-based Recycle Bin, or track deletions through auditing and backups, it’s essential to plan ahead for data protection and user experience. By proactively enabling recovery options, you not only safeguard critical business data but also empower users with confidence and control over their CRM operations.

What’s Your Approach?
Have you built your own Recycle Bin experience in Dynamics 365? Share your thoughts or tips in the comments below! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
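For Option 2, here is a hedged sketch of what the soft-delete call can look like against the Dataverse Web API in Python. The org URL, record ID, token, and the new_isdeleted column are all placeholders: the custom flag column would need to be created in your solution first, and token acquisition (for example via MSAL and an Azure AD app registration) is omitted.

import requests

org_url = "https://yourorg.crm.dynamics.com"          # placeholder environment URL
record_id = "00000000-0000-0000-0000-000000000000"    # placeholder account record id
token = "<access token obtained separately>"

# Instead of DELETE, PATCH the record: set a hypothetical soft-delete flag and deactivate it.
resp = requests.patch(
    f"{org_url}/api/data/v9.2/accounts({record_id})",
    headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
    },
    json={
        "new_isdeleted": True,   # hypothetical custom column acting as the Recycle Bin flag
        "statecode": 1,          # 1 = Inactive for the account table
        "statuscode": 2,         # matching inactive status reason
    },
    timeout=30,
)
resp.raise_for_status()

A scheduled bulk delete job, as mentioned above, can then purge records whose flag has been set for longer than the retention window you choose.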

