Blog Archives - Page 4 of 183

Category Archives: Blog

How to use Dynamics 365 CRM Field-Level Security to maintain confidentiality of Intra-Organizational Data

Summary

In most CRM implementations, data exposure must be controlled both inside and outside the organization. Sales, Finance, Operations, and HR all work in the same system. Collaboration increases, and visibility increases, but so does risk. This post is based on real-world project experience: a practical example I implemented for a technology consulting and cybersecurity services firm based in Houston, Texas, USA, specializing in modern digital transformation and enterprise security solutions.

This blog explains:
1. Why Security Roles alone are not enough.
2. How users can still access data through Advanced Find and similar tools.
3. What Field-Level Security offers beyond entity-level restriction.
4. Step-by-step implementation.
5. The business advantages you gain.

Table of Contents
1. The Real Problem: Intra-Organizational Data Exposure
2. Implementation of Field-Level Security
3. Results
4. Why Was a Solution Required?
5. Business Impact

The Real Problem: Intra-Organizational Data Exposure

Let's take a practical cross-department scenario. Both X Department and Y Department work in the same CRM system built on Microsoft Dynamics 365.

Entities Involved
1. Entity 1
2. Entity 2

Working Model

X Department:
- Fully owns and manages Entity 1
- Occasionally needs to refer to specific information in Entity 2

Y Department:
- Fully owns and manages Entity 2
- Occasionally needs to refer to specific information in Entity 1

This is collaborative work; you cannot isolate departments completely. But here is the challenge: each entity contains sensitive fields that should not be editable, or sometimes not even visible, to the other department. Security Roles in Microsoft Dynamics 365 operate at the entity (table) level, not at the field (column) level.
Every workaround available through Security Roles alone falls short:

| Approach | Result |
| --- | --- |
| Remove Write access to Entity 2 for X Dept | X Dept cannot update anything in Entity 2, even non-sensitive fields |
| Remove Read access to sensitive fields in Entity 2 | Not possible at field level using Security Roles |
| Restrict Entity 2 entirely from X Dept | X Dept loses visibility; collaboration breaks |
| Hide fields from the form only | Data still accessible via Advanced Find or exports |

This is the core limitation. Security Roles answer: "Can the user access this record?" They do NOT answer: "Which specific data inside this record can the user access?"

Implementation of Field-Level Security

Step 1: Go to your solution and identify the sensitive fields, usually personal information, facts and figures, etc. (e.g. cf_proficiencyrating).
Step 2: Select the field and enable it for Field-Level Security. (This is not possible for Microsoft out-of-the-box fields.)
Step 3: Go to Settings and select Security.
Step 4: Under Security, select Field Security Profiles.
Step 5: Either create a new Field Security Profile or use an existing one, as required.
Step 6: Here you can see all the fields across Dataverse that are enabled for Field Security. Select your field and set the Create/Read/Update privileges (Yes/No).
Step 7: Select the system users, or the Team containing the stakeholder users, and save.

Results

Assume you are a user from X Department who wants to access an Entity 2 record, and you need to see only the Proficiency Rating and Characteristic Name, but not the Effective Date and Expiration Date. Since all of these fields have Field-Level Security enabled, they display a key icon; the fields for which you or your team lack read/write access show the key icon along with a "---" instead of the value. The same behavior applies in views, subgrids, and Advanced Find.

Why Was a Solution Required?
The organization needed:
1. Cross-functional collaboration
2. Protection of confidential internal data
3. Clear separation of duties
4. No disruption to operational workflows

They required a solution that:
1. Did not block entity access
2. Did not require custom development
3. Enforced true data-level protection

Business Impact

1. Confidential Data Protection: Sensitive internal data was secured without restricting overall entity access, enabling controlled collaboration.
2. Reduced Internal Data Exposure Risk: Unauthorized users could no longer retrieve protected fields via Advanced Find, significantly lowering governance risk.
3. Clear Separation of Duties: Departmental ownership of sensitive fields was enforced without disrupting cross-functional visibility.
4. Improved Audit Readiness: Every modification to protected fields became traceable, strengthening accountability and compliance posture.
5. Reduced Operational Friction: System-enforced field restrictions eliminated the need for entity blocking, duplicate records, and manual approval workarounds.
6. Efficiency Gains: The solution was delivered through configuration, with no custom code, no complex business rules, and minimal maintenance overhead.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Simplifying Data Pipelines with Delta Live Tables in Azure Databricks

From a customer perspective, the hardest part of data engineering isn't building pipelines; it's ensuring that the data customers rely on is accurate, consistent, and trustworthy. When reports show incorrect revenue or missing customer information, confidence drops quickly. This is where Delta Live Tables in Databricks makes a real difference. Instead of customers dealing with broken dashboards, manual fixes in BI tools, or delayed insights, Delta Live Tables enforces data quality at the pipeline level. Using a Bronze-Silver-Gold approach, data validation rules are built directly into the pipeline, and customers gain visibility into data quality through built-in monitoring, without extra tools or manual checks.

Quick Preview

Building data pipelines is not the difficult part. The real challenge is building pipelines that are reliable, monitored, and enforce data quality automatically. That's where Delta Live Tables in Databricks makes a difference. Instead of stitching together notebooks, writing custom validation scripts, and setting up separate monitoring jobs, Delta Live Tables lets you define your transformations once and handles the rest.

Let's look at a simple example. Imagine an e-commerce company storing raw order data in a Unity Catalog table called cf.staging.orders_raw. The problem? The data isn't perfect:
- Some records have negative quantities.
- Some orders have zero amounts.
- Customer IDs may be missing.
- There might even be duplicate order IDs.

If this raw data goes straight into reporting dashboards, revenue numbers will be wrong. And once business users lose trust in reports, it's hard to win it back. Instead of fixing issues later in Power BI or during analysis, we fix them at the pipeline level. In Databricks, we create an ETL pipeline and define a simple three-layer structure: Bronze for raw data, Silver for cleaned data, and Gold for business-ready aggregation. The Bronze layer simply reads from Unity Catalog; nothing complex here.
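In real Delta Live Tables code, the Silver table would be declared with decorators such as `@dlt.table` and `@dlt.expect_or_drop`, which only run inside a Databricks pipeline. As a plain-Python sketch of the filtering those expectations perform (column names like `order_id`, `quantity`, `amount`, and `customer_id` are illustrative):

```python
# Illustrative Bronze -> Silver filtering, mirroring DLT expectations such as:
#   @dlt.expect_or_drop("positive_quantity", "quantity > 0")

def silver_orders(bronze_rows):
    """Drop rows that violate the quality rules and de-duplicate by order_id."""
    rules = [
        lambda r: r.get("quantity", 0) > 0,         # no negative/zero quantities
        lambda r: r.get("amount", 0) > 0,           # no zero-amount orders
        lambda r: r.get("customer_id") is not None, # customer ID must be present
    ]
    seen, clean = set(), []
    for row in bronze_rows:
        if all(rule(row) for rule in rules) and row["order_id"] not in seen:
            seen.add(row["order_id"])               # drop duplicate order IDs
            clean.append(row)
    return clean
```

The advantage of the real DLT version is that dropped-row counts surface automatically in the pipeline UI, so the same rules double as monitoring.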
We're just loading data from Unity Catalog; no manual dependency setup is required. The real value appears in the Silver layer, where we enforce data quality rules directly inside the pipeline. Here's what's happening behind the scenes:
- Invalid rows are automatically removed.
- Duplicate orders are eliminated.
- Data quality metrics are tracked and visible in the pipeline UI.

There's no need for separate validation jobs or manual checks. This is what simplifies pipeline development: you define expectations declaratively, and Delta Live Tables enforces them consistently. Finally, in the Gold layer, we create a clean reporting table. At this point, only validated and trusted data reaches reporting systems, and dashboards become reliable.

Delta Live Tables doesn't replace databases, and it doesn't magically fix bad source systems. What it does is simplify how we build and manage reliable data pipelines. It combines transformation logic, validation rules, orchestration, monitoring, and lineage into one managed framework. Instead of reacting to data issues after reports break, or after customers notice incorrect numbers, we prevent poor-quality data from moving forward in the first place. For customers, trust in data is everything: Delta Live Tables helps organizations ensure that only validated, reliable data reaches customer-facing dashboards and analytics. By unifying transformation logic, data quality enforcement, orchestration, monitoring, and lineage in one framework, it enables teams to deliver consistent, dependable insights. The result for customers is simple: accurate reports, faster decisions, and confidence that the data they see reflects reality.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


How Pharmaceutical Companies Can Move ERPs to the Cloud – Without Risk

Summary

ERP migration in the pharmaceutical industry is not just a technology upgrade; it is a compliance and quality decision. For highly regulated manufacturers, cloud migration must ensure that regulatory processes, audit trails, and product quality controls remain intact. This article explains why pharmaceutical ERP migrations feel risky, how modern cloud platforms such as Microsoft Dynamics 365 Business Central can strengthen compliance controls, and how a compliance-first migration approach helps pharmaceutical organizations modernize safely.

Table of Contents
1. ERP Migration in Pharma Is a Strategic Decision
2. Why Cloud Migrations Feel Risky in Pharma
3. Cloud Does Not Mean Less Control
4. How CloudFronts Approaches Pharma ERP Migration
5. Real-World Example
6. The Outcome

ERP Migration in Pharma Is a Strategic Decision

In pharmaceuticals, ERP migration is never just an IT upgrade. It is a compliance decision, a quality decision, and often a decision that senior leadership and QA teams will remain accountable for long after the system goes live. When pharmaceutical organizations evaluate cloud ERP adoption, the biggest concern is rarely performance or cost. The real question is: "How do we move to the cloud without putting compliance, audits, or product quality at risk?" The answer lies in one core principle: compliance-first migration.

Why Cloud Migrations Feel Risky in Pharma

Pharmaceutical ERP systems support highly regulated manufacturing processes such as:
- Batch manufacturing
- Quality control and approvals
- Quarantine and release processes
- Expiry and retesting
- End-to-end product traceability

Because of these requirements, a generic "lift-and-shift" cloud migration approach rarely works in pharmaceutical environments. In pharma operations, a missed QC step is not just a process gap; it becomes a compliance issue. A broken batch trail is not just an inconvenience; it becomes an audit finding.
This is why many ERP migrations in the pharmaceutical industry stall or exceed expected timelines. The issue is rarely technology; it is usually the absence of compliance as the foundation of the migration strategy.

Cloud Does Not Mean Less Control

In pharmaceutical organizations, cloud ERP adoption is sometimes perceived as a loss of control. In reality, modern cloud ERP platforms such as Microsoft Dynamics 365 Business Central can provide stronger compliance capabilities than many legacy on-premise systems when implemented correctly. Cloud ERP systems enable:
- System-driven audit trails
- Role-based approvals
- Enforced quality and release controls
- End-to-end batch and lot traceability

Cloud technology enables compliance, but it does not automatically guarantee it. Compliance ultimately depends on how processes are designed and enforced within the ERP system.

Real-World Example

One of our customers, an EU-GMP and TGA-approved pharmaceutical company specializing in advanced solutions for pellets, granules, tablets, and capsule manufacturing, modernized its ERP landscape by migrating from Microsoft Dynamics NAV to Microsoft Dynamics 365 Business Central in the cloud. The migration strengthened quality processes, improved operational efficiency, and enhanced regulatory compliance across manufacturing operations. Read the full customer success story here: EU-GMP & TGA Approved Pharmaceutical Company – Dynamics 365 Business Central Case Study.

The Outcome

A compliance-first ERP migration approach builds confidence across the organization. Quality assurance teams trust the system, operational risks are significantly reduced, and regulatory audits become more predictable and easier to manage. When compliance becomes the foundation of the migration strategy, the cloud stops feeling risky and starts becoming a reliable platform for growth.

Final Thought

Pharmaceutical companies do not struggle with cloud ERP migrations because the cloud is unsafe.
They struggle when compliance is treated as a phase instead of a foundation. A compliance-first migration does not slow digital transformation; it protects the organization while allowing the cloud to deliver its full value.

We hope you found this blog useful. If you would like to discuss ERP modernization for pharmaceutical manufacturing, you can reach out to us at transform@cloudfronts.com.


Databricks Notebooks Explained – Your First Steps in Data Engineering

If you're new to Databricks, chances are someone told you, "Everything starts with a Notebook." They weren't wrong. In Databricks, a Notebook is where your entire data engineering workflow begins: reading raw data, transforming it, visualizing trends, and even deploying jobs. It's your coding lab, dashboard, and documentation space all in one.

What Is a Databricks Notebook?

A Databricks Notebook is an interactive environment that supports multiple programming languages such as Python, SQL, R, and Scala. Each Notebook is divided into cells, in which you can write code, add text (Markdown), and visualize data. Unlike local scripts, Notebooks in Databricks run on distributed Spark clusters, which means even a 100 GB dataset can be processed quickly using parallel computation. So Notebooks are more than just code editors; they are collaborative data workspaces for building, testing, and documenting pipelines.

How Databricks Notebooks Work

Under the hood, every Notebook connects to a cluster, a group of virtual machines managed by Databricks. When you run code in a cell, it is sent to Spark running on the cluster, processed there, and the results are sent back to your Notebook. This gives you the scalability of big data without worrying about servers or configurations.

Setting Up Your First Cluster

Before running a Notebook, you must create a cluster; think of it as starting the engine of your car. Here's how:

Step-by-Step: Creating a Cluster in a Standard Databricks Workspace

Once the cluster is active, you'll see a green light next to its name; it is now ready to process your code.

Creating Your First Notebook

Now, let's build your first Databricks Notebook. Once created, your Notebook is live and ready to connect to data and start executing.

Loading and Exploring Data

Let's say you have a sales dataset in Azure Blob Storage or Data Lake.
You can easily read it into Databricks using Spark:

```python
df = spark.read.csv("/mnt/data/sales_data.csv", header=True, inferSchema=True)
display(df.limit(5))
```

Databricks automatically recognizes your file's schema and displays a tabular preview. Now you can transform the data:

```python
# Note: importing `sum` from pyspark.sql.functions shadows Python's built-in sum.
from pyspark.sql.functions import col, sum

summary = df.groupBy("Region").agg(sum("Revenue").alias("Total_Revenue"))
display(summary)
```

Or switch to SQL instantly:

```sql
%sql
SELECT Region, SUM(Revenue) AS Total_Revenue
FROM sales_data
GROUP BY Region
ORDER BY Total_Revenue DESC
```

Visualizing Data

Databricks Notebooks include built-in charting tools. After running your SQL query:
1. Click + → Visualization → choose Bar Chart.
2. Assign Region to the X-axis and Total_Revenue to the Y-axis.

Congratulations: you've just built your first mini-dashboard!

Real-World Example: ETL Pipeline in a Notebook

In many projects, Databricks Notebooks are used to build ETL pipelines. Each stage is often written in a separate cell, making debugging and testing easier. Once tested, you can schedule the Notebook as a Job running daily, weekly, or on demand.

Best Practices

To conclude, Databricks Notebooks are not just a beginner's playground; they're the backbone of real data engineering in the cloud. They combine flexibility, scalability, and collaboration into a single workspace where ideas turn into production pipelines. If you're starting your data journey, learning Notebooks is the best first step. They help you understand data movement, Spark transformations, and the Databricks workflow: everything a data engineer needs.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Advanced Field Control in Power Pages Using JavaScript and Liquid – Real Scenarios from Projects

While working on Dynamics 365 Power Pages implementations, I realized very quickly that portal metadata alone cannot handle complex business requirements. Basic field visibility rules are easy, but real-world forms demand dynamic behavior that depends on user role, record data, multi-select options, and business logic. This is where combining JavaScript (client-side control) with Liquid (server-side logic) becomes powerful. This article shares practical scenarios where this approach made the solution possible.

Why Configuration Alone Was Not Enough

Portal Management covers the basic cases, but in my projects the requirements went further, and they are not achievable with metadata alone.

Role of JavaScript vs Liquid (How I Use Them)

| Purpose | Tool | Why |
| --- | --- | --- |
| Dynamic field behavior | JavaScript | Runs instantly in the browser |
| User role detection | Liquid | The server knows portal roles |
| Record-based decisions | Liquid | Data is available before render |
| UI interactivity | JavaScript | Responds to user actions |

Liquid decides what context the page loads with. JavaScript controls what happens after the page loads.

Scenario 1: Multi-Select Option Controls Multiple Fields

Requirement: If a user selects "Service Issue" AND "Billing Issue" in a multi-select, show an escalation section.

Problem: Metadata rules cannot evaluate multiple selections together.

My Approach: Evaluate the combined selections in JavaScript and toggle the section accordingly. This improved the user experience and prevented unnecessary data entry.

Scenario 2: Role-Based Form Behavior

Requirement: Managers should see approval fields; normal users should not even know those fields exist.

Why Liquid Helped: Portal roles are determined server-side, so I used Liquid to pass a flag to JavaScript, and JavaScript then handled visibility accordingly.

Scenario 3: Locking Fields After Status Change

Requirement: Once a case moves to "Submitted", users should only view, not edit.

Solution Design: Lock the fields client-side based on the record's status. This approach avoided creating multiple forms and kept maintenance simple.
Scenario 4: Dynamic Label Changes

Requirement: The same field label should read differently depending on who is viewing the form. Instead of duplicating forms, JavaScript updated the label text based on the user type passed via Liquid.

Preventing the Common Mistake

JavaScript improves UX, but it does not secure data. I always ensure final validation also exists server-side. Portal scripting is the first layer, not the only layer.

Lessons Learned from Real Implementations

To conclude, Power Pages becomes truly flexible when JavaScript and Liquid are used together with clear responsibility boundaries. Liquid prepares the context; JavaScript handles the interaction. In my experience, this combination bridges the gap between standard configuration and complex business needs without overcomplicating the architecture. Custom portal behavior is not about more code; it's about placing logic at the right layer.

If you found this useful and are working on similar Power Pages scenarios, feel free to connect or reach out to the CloudFronts team at transform@cloudfronts.com for deeper discussions on advanced Dynamics 365 implementations.


Reducing Try-Catch in Dynamics 365 Plugins Using DTO Validation Classes

Dynamics 365 plugins run inside an event-driven execution pipeline where business rules and data validation must be enforced before database operations complete. As systems evolve, plugin code often degrades into heavy defensive programming filled with null checks, type casting, and nested exception handling. Over time, exception handling begins to overshadow the actual business logic. This article introduces a DTO + Validation Layer pattern that restructures plugin design into a clean, testable, and scalable architecture.

What Is "Try-Catch Hell" in Plugins?

Typical causes and symptoms:

| Issue | Impact |
| --- | --- |
| Deep nested try-catch blocks | Hard-to-read code |
| Repeated attribute checks | Code duplication |
| Business logic hidden in error handling | Difficult debugging |
| Validation spread across plugins | Inconsistent rules |

Result: Plugins behave like procedural error-handling scripts instead of structured business components.

Architectural Shift: Entity-Centric → DTO-Centric

Traditional plugins manipulate the CRM Entity directly. The problem: the Entity object is dynamically typed, dictionary-like, and SDK-dependent, which makes failures more likely.

Proposed Flow

The proposed flow separates data extraction, validation, and business logic into distinct layers.

DTO: A Strongly-Typed Contract

DTOs act as a safe bridge between CRM data and business logic.

| Without DTO | With DTO |
| --- | --- |
| Dynamic attributes | Strong typing |
| Runtime failures | Compile-time safety |
| SDK-dependent logic | Business-layer independence |

Mapping Layer: Controlled Data Extraction

All CRM attribute handling is isolated in one place.

Validation Layer: Centralized Rule Logic

| Traditional | DTO Validation Model |
| --- | --- |
| Validation inside plugin | Dedicated validator class |
| Hard to reuse | Reusable |
| Hard to test | Unit-test friendly |

The Clean Plugin Pattern

Now the plugin simply gets the data, converts it to a DTO, validates it, and runs the business logic.

Reduction of Exception Noise

Before: exceptions for missing fields, null references, casting issues, and validation errors. After: only meaningful business validation exceptions remain. The architecture shifts from reactive error handling to structured validation.
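Dynamics 365 plugins are written in C#, but the DTO + validator pattern itself is language-agnostic. As a hedged illustration of the flow (all class, attribute, and rule names below are hypothetical, not the Dataverse SDK), here is a compact Python sketch:

```python
from dataclasses import dataclass

@dataclass
class ContactDto:
    """Strongly typed contract between raw CRM data and business logic."""
    email: str
    credit_limit: float

def map_to_dto(attributes: dict) -> ContactDto:
    """Mapping layer: the only place that touches raw attribute access."""
    return ContactDto(
        email=str(attributes.get("emailaddress1", "")),
        credit_limit=float(attributes.get("creditlimit", 0)),
    )

def validate(dto: ContactDto) -> list:
    """Validation layer: centralized, reusable, unit-testable rules."""
    errors = []
    if "@" not in dto.email:
        errors.append("Email address is invalid.")
    if dto.credit_limit < 0:
        errors.append("Credit limit cannot be negative.")
    return errors

def execute_plugin(attributes: dict) -> str:
    """The 'clean plugin': get data -> map to DTO -> validate -> business logic."""
    dto = map_to_dto(attributes)
    errors = validate(dto)
    if errors:
        raise ValueError("; ".join(errors))  # one meaningful business exception
    return f"Processed {dto.email}"          # actual business logic goes here
```

Because the mapper and validator have no CRM dependency, both can be unit-tested in isolation, which is the testability benefit the pattern promises.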
Design Advantages

| Capability | Benefit |
| --- | --- |
| New fields | Update DTO + Mapper only |
| New rules | Extend validator |
| Shared logic | Reuse across plugins |
| Automated testing | No CRM dependency |

Testability Improvement

Validators and mappers can be unit-tested with no CRM context required.

Performance Considerations

To conclude, the DTO + Validation pattern is not just a coding improvement; it changes the way plugins are built. In many Dynamics 365 projects, plugins slowly become difficult to read because developers keep adding null checks, type conversions, and try-catch blocks. Over time, the actual business logic gets lost inside error handling. Using DTOs and a validation layer fixes this problem in a structured way. Instead of working directly with the CRM Entity object everywhere, we first convert data into a clean, strongly typed DTO. This removes repeated attribute checks and reduces runtime errors. The code becomes clearer because we work with proper properties instead of dictionary-style field access. Then, all business rules are moved into a Validator class.

The plugin now only performs four simple steps: Get data → Convert to DTO → Validate → Run business logic. Because of this, developers can understand the purpose of the plugin faster, instead of trying to follow complex nested try-catch blocks.

If your Dynamics 365 environment is evolving in complexity, reach out to CloudFronts at transform@cloudfronts.com; our specialists regularly design and implement structured plugin architectures that improve performance, maintainability, and long-term scalability.


How to Generate and Use SSL Certificates in Microsoft Dynamics 365 Business Central

Security is a critical aspect of any ERP implementation. When integrating Microsoft Dynamics 365 Business Central with external systems such as APIs, payment gateways, banks, IRIS, VAT systems, or third-party services, SSL/TLS certificates play a key role in securing communication. A common misconception is that Business Central itself generates SSL certificates. In reality, Business Central only consumes certificates; generation and management are handled externally.

What Is an SSL Certificate in Business Central?

An SSL (Secure Sockets Layer)/TLS certificate is used to secure and encrypt communication between systems. In Business Central, certificates are commonly used for integrations with external services. Important: Business Central does not create SSL certificates; it only stores and uses them.

Steps to Generate an SSL Certificate (Self-Signed)

This approach is typically used for development or on-premises environments.

Step 1: Create a self-signed certificate in IIS.
Step 2: Provide the certificate details.
Step 3: Copy the certificate thumbprint. This thumbprint will be required in the next step.
Step 4: Configure the certificate using PowerShell.
Step 5: Verify that all required certificate properties are set to True.
Step 6: Bind the certificate in IIS.
Step 7: Add the certificate using MMC.
Step 8: Verify the certificate installation; the certificate should now be visible in the MMC certificate store.
Step 9: Grant permissions to the Business Central service, so that the service can access the certificate.

To conclude, SSL certificates are a core security component in Business Central integrations. While Business Central does not generate certificates, it provides robust mechanisms to store and consume certificates securely in both cloud and on-premises environments. Understanding the generation, configuration, and usage flow ensures secure, compliant, and reliable integrations.
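As background for Step 3 above: a thumbprint is not stored inside the certificate. Windows computes it as the SHA-1 hash of the certificate's DER-encoded bytes. A small Python sketch makes this concrete (the input bytes here are placeholder data, not a real certificate):

```python
import hashlib

def thumbprint(der_bytes: bytes) -> str:
    """Return a certificate thumbprint the way Windows displays it:
    the SHA-1 hash of the DER-encoded certificate, in uppercase hex."""
    return hashlib.sha1(der_bytes).hexdigest().upper()

# Placeholder bytes stand in for a real DER-encoded certificate file.
print(thumbprint(b"example-der-encoded-certificate"))
```

This is also why copying the thumbprint from the certificate dialog is safe: it uniquely identifies the exact certificate bytes you installed.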
We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Finding the Right Events in Business Central: Payment Journals & Purchase Orders

When working with Payment Journals in Microsoft Dynamics 365 Business Central, one of the most common customization requirements is to trigger custom logic immediately after the user selects the Applies-to Doc. No. In one of my recent client projects, the requirement was very specific: as soon as a payment journal line is applied to an invoice (via the Applies-to Doc. No. lookup), the system should automatically calculate amounts and create additional retained lines (VAT and IRIS). Sounds simple, right? The real challenge was finding the correct event that fires after the lookup completes and after Business Central internally updates the journal line fields. This blog documents how I found it.

Problem Statement

The client wanted the logic to run right after the lookup in the Payment Journal, not during posting and not on page validation.

Why Page Events Were Not Enough

Initially, it is natural to look for page-level triggers. However, in this case they fire before Business Central has finished updating the journal line, so even though the value was visible, the amounts were not reliable yet.

Using Event Recorder to Find the Right Event

This is where the Event Recorder becomes extremely powerful. I recorded a session while applying a payment journal line, and the recorder captured a detailed list of fired events. After analyzing the sequence, one event stood out.

The Key Event That Solved the Problem

The event that fulfilled the exact requirement was:

```al
[EventSubscriber(ObjectType::Table, Database::"Gen. Journal Line",
    'OnLookupAppliestoDocNoOnAfterSetJournalLineFieldsFromApplication', '', false, false)]
local procedure OnAfterLookupAppliesToDocNo(var GenJournalLine: Record "Gen. Journal Line")
```

Why This Event Is Perfect

It fires after the lookup completes and after the journal line fields have been set from the application, which is exactly the moment where custom business logic should run.

Implementing the Business Logic

Below is a simplified version of the logic implemented inside the subscriber (GetUpdatedAmount, CreateRetainedVATLine, and CreateRetainedIRISLine are the project's custom procedures):

```al
local procedure OnAfterLookupAppliesToDocNo(var GenJournalLine: Record "Gen. Journal Line")
begin
    GenJournalLine.GetUpdatedAmount();
    if GenJournalLine."Applies-to Doc. No." <> '' then begin
        GenJournalLine.GetUpdatedAmount_(GenJournalLine);
        AppliestoDocNo := GenJournalLine."Applies-to Doc. No.";
        GenJournalLine.CreateRetainedVATLine(GenJournalLine, AppliestoDocNo);
        GenJournalLine.CreateRetainedIRISLine(GenJournalLine, AppliestoDocNo);
    end;
end;
```

What This Code Does

When an Applies-to Doc. No. is present, it refreshes the amounts and creates the retained VAT and IRIS lines. All of this happens immediately after the lookup, without waiting for posting.

Key Takeaway

Finding the right event is often harder than writing the logic itself. This table event, OnLookupAppliestoDocNoOnAfterSetJournalLineFieldsFromApplication, is a hidden gem for Payment Journal customizations involving Applies-to logic.

Another Real-World Case: Invoice Discount Recalculation on Purchase Orders

In the same project, we faced another tricky requirement related to invoice discounts on Purchase Orders. The problem: the page- and line-level events did not fire reliably when invoice discounts were recalculated by the system. This became an issue because the client wanted custom tax and withholding logic (IR, IS, Withheld VAT, Excise) to be recalculated immediately after invoice discount recalculation.

Why Page and Line Events Failed Again

Business Central recalculates invoice discounts using an internal codeunit, "Purch - Calc Disc. By Type". This recalculation happens outside the page and line triggers, so once again page-level and line-level events were too early or never triggered.

Finding the Right Event (Again) Using Event Recorder

Using the Event Recorder, I traced the execution while invoice discounts were being recalculated. This led to the discovery of another perfectly timed system event.

The Key Event for Invoice Discount Scenarios

```al
[EventSubscriber(ObjectType::Codeunit, Codeunit::"Purch - Calc Disc. By Type",
    'OnAfterResetRecalculateInvoiceDisc', '', false, false)]
local procedure OnAfterResetRecalculateInvoiceDisc(var PurchaseHeader: Record "Purchase Header")
```

Why This Event Works

It fires after the system has reset and recalculated the invoice discount, so the purchase lines are in their final state.

Applying Custom Logic on Purchase Lines

```al
local procedure OnAfterResetRecalculateInvoiceDisc(var PurchaseHeader: Record "Purchase Header")
var
    PurchLine: Record "Purchase Line";
begin
    PurchLine.SetRange("Document Type", PurchaseHeader."Document Type");
    PurchLine.SetRange("Document No.", PurchaseHeader."No.");
    if PurchLine.FindSet() then
        repeat
            PurchLine.UpdateIRandIS();
            PurchLine.CalculateWithHeldVAT();
            PurchLine.CalculateIR();
            PurchLine.CalculateIS();
            PurchLine.CalculateExcise();
            PurchLine.Modify();
        until PurchLine.Next() = 0;
end;
```

What Happens Here

Every purchase line on the document is re-processed: the IR, IS, withheld VAT, and excise amounts are recalculated and saved. All of this happens automatically, without relying on UI triggers.

Key Lessons from Both Scenarios

Final Thoughts

Both of these scenarios reinforce one important principle in Business Central development: finding the right event matters more than writing the logic itself. The solution lies in understanding where Business Central actually performs the work, and subscribing after that point.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Create records in Dynamics CRM using Microsoft Excel Online

Quick Preview

Importing customer data into Dynamics 365 can be simple and efficient. Whether you're transitioning from another system or adding a large batch of new customers, Microsoft Excel Online offers a practical way to create multiple records at once, without any technical configuration. In this blog, I'll walk you through a clear, step-by-step approach to importing customer (or any entity) records directly into your Dynamics 365 environment using Excel Online. By the end, you'll be able to upload bulk data quickly while maintaining accuracy and data consistency. Let's get started.

Step 1: Go to the home page of the entity whose records you want to create (in my case, the Customer entity).

Step 2: On the Active Accounts view (or any view), click Edit Columns and add the columns for the data you want to fill in. (Don't forget to click the Apply button at the bottom.)

Step 3: Once your view is ready, click the Export to Excel button at the top left and select Open in Excel Online.

Step 4: If you are using a system view, as in this example, you will see existing records in the online Excel sheet. You can clear those records or keep them as is. If you change an existing record, its data will be updated, so you can also use this approach to update existing records in bulk. (I will write a separate blog post about updating records; for now, let's focus on creating records.)

Step 5: Add the data you want to create to the online sheet. In this example, I am transferring data from a local Excel sheet to Excel Online.

Step 6: Once you have added your data, click the Apply button.

Step 7: You will get a popup confirming your data has been submitted for import; click Track Progress.

Step 8: You will see that your data has been submitted and is parsing. (This can take anywhere from a couple of minutes to hours depending on the amount of data submitted; keep refreshing to see the progress.)

Step 9: Once the import job is completed, you will see how many records were created successfully and how many failed or partially failed. You can open the import job, check the failed entries, correct them, and re-import. All successfully parsed records will be created in your system.

Importing customer records into Dynamics 365 becomes incredibly seamless with Excel Online. With just a few steps (preparing your view, exporting to Excel, adding your data, and submitting the import) you can create hundreds or even thousands of records in a fraction of the time. This approach not only speeds up data entry but also ensures consistency and reduces manual errors. Hope this helps! 😊

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


SMTP with OAuth 2.0 in Business Central: A Modern Email Setup

Email remains one of the most critical communication tools in Business Central. Whether you're sending invoices, notifications, or workflow approvals, the reliability and security of your email integration matter. With Microsoft officially retiring Basic Authentication in Exchange Online, Business Central users must now embrace OAuth 2.0 for SMTP connections. Let's explore what this means and how to configure it.

Why the Shift to OAuth 2.0?

Setting Up SMTP with OAuth 2.0 in Business Central

The process is simpler than many expect. Here's the streamlined approach:

To conclude, switching to SMTP with OAuth 2.0 in Business Central is not just a technical requirement; it's a strategic move toward secure, modern communication. The setup is straightforward, but the payoff is significant: stronger security, smoother compliance, and reliable email delivery.
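For background on what "OAuth 2.0 for SMTP" means on the wire: SMTP servers that support OAuth generally use the SASL XOAUTH2 mechanism, where the client presents a base64-encoded string carrying the mailbox user and an access token instead of a password. A minimal Python sketch of how that string is built (the user and token values below are placeholders; a real flow obtains the token from the identity provider first):

```python
import base64

def build_xoauth2_string(user: str, access_token: str) -> str:
    """Build the SASL XOAUTH2 initial response used by SMTP AUTH:
    'user=<user>\\x01auth=Bearer <token>\\x01\\x01', base64-encoded."""
    raw = f"user={user}\x01auth=Bearer {access_token}\x01\x01"
    return base64.b64encode(raw.encode("utf-8")).decode("ascii")

# Placeholder values for illustration only.
auth_string = build_xoauth2_string("sender@contoso.com", "<access-token>")
# A client would then issue: AUTH XOAUTH2 <auth_string>
```

Business Central handles this exchange for you once the OAuth connection is configured; the sketch only shows why an access token, not a stored password, is what travels to the server.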

