Blog Archives - Page 12 of 169

Category Archives: Blog

Fixed Asset Depreciation: As per Companies Act and Income Tax Act

Indian companies are required to maintain fixed asset records under both the Companies Act, 2013 and the Income Tax Act, 1961. While the Companies Act focuses on presenting a true and fair view of an organization's financial position for stakeholders, the Income Tax Act is concerned with determining taxable income and ensuring fair tax collection. This blog explores the methodologies and practical considerations for managing fixed asset depreciation under both laws, helping organizations stay compliant.

Because the two frameworks prescribe different useful lives, methods, and rates, organizations must maintain separate records and calculations to meet the distinct requirements of each law. Maintaining these records systematically supports traceability and easy retrieval for compliance. Aligning accounting practices with these regulatory frameworks not only minimizes compliance risk but also improves financial planning and tax efficiency. Regular reviews and updates in line with legislative changes are critical to sustaining accurate asset management and reporting. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
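To see how the two frameworks diverge, here is a simplified worked illustration. The useful life, residual value, and WDV rate below are assumptions for the example only; always refer to Schedule II of the Companies Act, 2013 and the current Income Tax depreciation schedule for the figures applicable to a specific asset class.

Asset: computer purchased for Rs. 300,000 at the start of the year
Companies Act (straight-line, assumed useful life 3 years, assumed residual value 5%):
  Annual depreciation = (300,000 - 15,000) / 3 = Rs. 95,000 per year
Income Tax Act (written-down value, assumed block rate 40%):
  Year 1 = 40% x 300,000 = Rs. 120,000
  Year 2 = 40% x (300,000 - 120,000) = Rs. 72,000

Because book depreciation (Rs. 95,000) and tax depreciation (Rs. 120,000 in year 1) differ, a temporary difference arises that is typically accounted for as deferred tax, which is why separate asset registers are maintained for each law.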


Transforming Development: How Copilot is Revolutionizing Developer Productivity

Software development has been around since the 1940s. We started with punch cards, then machine language, followed by assembly, high-level programming languages, low code, no code, and now AI-assisted coding. Along the way, several tools have been developed to make programmers' jobs easier, from card sorters and verifiers to debuggers and IDEs. Now, with the advent of AI, we have large language models (LLMs) writing code for us, but I don't think it's quite there yet. In this article we'll see how AI assists developers, what it can do for us today, its limitations, and where it's headed.

The concept of AI began in the 1950s, when researchers tried to give machines the ability to think. Early systems followed set rules, but as computers improved and data became more available, smarter methods emerged, such as machine learning, natural language processing, and neural networks. Large Language Models (LLMs) grew from these advances, using huge amounts of data and computing power to understand and create language. This marked a shift from fixed rules to models that learn on their own. By 2025, AI has taken root in most fields, even in places we might not have expected. For example, robotic bees (tiny drones designed to mimic bee behavior) are now being used to assist with pollination in areas where natural bee populations are struggling. These drones combine machine learning and computer vision for navigation, flight control, pollination strategies, and swarm intelligence.

Usage

Copilot is integrated with both Visual Studio Code and Visual Studio, and it comes with a few LLMs built in by default. Currently, these include Claude Sonnet 3.5, GPT-4o, o3-mini, and Gemini Flash 2.0. If you want to add more models, you'll need a Copilot Pro subscription.

We can use Copilot Chat to prompt these models directly in the sidebar chat, whether to generate a specific piece of functionality or create an entirely new file. Here, I asked it to create a simple sales order. Notably, it kept the key details (Customer, Item, and Quantity) as parameters without requiring any input. From here, we can click a button to apply the changes to the open file. At the bottom, we can see which file Copilot is currently using as a reference. If we want to stop Copilot from referencing that file, we can click the eye button.

We can also ask it to make changes to the generated code. Now, I noticed that while it has parameterized the "Customer No." for the sales order, it hasn't actually used it anywhere in the code. If I point this out to Copilot…

Instead of using Copilot Chat, we can also get recommendations directly within the file. Here, I'm trying to write a function to delete a sales order based on the given SO No., and I can just tab my way into writing the method. One common way I've used Copilot is to add guard clauses to methods that I've written (a minimal sketch of this pattern is shown below). In its first suggestion it referred to Customer and Item record variables that didn't exist yet, but once I declared them in the variables section, it understood what I was trying to do and suggested the same.
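To make the examples above concrete, here is a minimal, hand-written AL sketch of the kind of procedure discussed: creating a simple sales order from Customer, Item, and Quantity parameters, with the guard-clause pattern added at the top. This is my own illustrative version rather than Copilot's verbatim output, and the procedure name and error messages are assumptions for the example.

// Illustrative sketch (not Copilot's verbatim output): create a sales order
// for a given customer, item, and quantity, with guard clauses up front.
procedure CreateSimpleSalesOrder(CustomerNo: Code[20]; ItemNo: Code[20]; Qty: Decimal): Code[20]
var
    Customer: Record Customer;
    Item: Record Item;
    SalesHeader: Record "Sales Header";
    SalesLine: Record "Sales Line";
begin
    // Guard clauses: fail fast if the referenced records don't exist
    if not Customer.Get(CustomerNo) then
        Error('Customer %1 does not exist.', CustomerNo);
    if not Item.Get(ItemNo) then
        Error('Item %1 does not exist.', ItemNo);

    // Create the sales order header
    SalesHeader.Init();
    SalesHeader."Document Type" := SalesHeader."Document Type"::Order;
    SalesHeader.Insert(true);
    SalesHeader.Validate("Sell-to Customer No.", CustomerNo);
    SalesHeader.Modify(true);

    // Create a single sales line for the requested item and quantity
    SalesLine.Init();
    SalesLine."Document Type" := SalesHeader."Document Type";
    SalesLine."Document No." := SalesHeader."No.";
    SalesLine."Line No." := 10000;
    SalesLine.Insert(true);
    SalesLine.Validate(Type, SalesLine.Type::Item);
    SalesLine.Validate("No.", ItemNo);
    SalesLine.Validate(Quantity, Qty);
    SalesLine.Modify(true);

    exit(SalesHeader."No.");
end;

Even when Copilot's exact output differs, this is roughly the level of scaffolding such one-line prompts produce, which is why it works well as a starting point.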
Now, if I were to make it handle something complex, that's when the cracks start to show. For example, pulling data from an API and creating customers would require several steps: authenticating with the API, fetching the data, parsing it, handling errors and logging, and finally creating the customers. We get the following as an output – Here, we can see that while it has a surface-level understanding of the code structure and the steps needed to achieve the goal, it struggles with the details. This could be because, unlike open-source languages like Java, Python, or C++, there isn't as much publicly available source code for AL. I believe the Microsoft documentation would have helped to some degree, but instead, it tends to guess what the correct methods or fields should be. To its credit, the generated code isn't far off from being functional, especially considering the simplicity of the input prompt. The structure it provides is still a solid starting point and much better than writing everything from scratch. Another example of these "hallucinations" is when it suggests methods that don't actually exist. However, once you show it the correct method, it picks it up and suggests that instead.

To go one step further, I asked the different models to create an entire project based on the below prompt –

Findings:

o3-mini
1. The objects it generated had the fewest errors.
2. It was the simplest and closest to compiling successfully.
3. It returned all the text in a single response, so I had to manually create files from it.

GPT-4o
1. Created a Readme.md with project requirement details.
2. Automatically generated the necessary project files.
3. Farthest from compiling successfully, with most requirements missed.
4. There were plenty of hallucinations, including methods that don't exist in AL at all.

Gemini Flash 2.0
1. Created a Readme.md with project requirement details.
2. Automatically generated the necessary project files.
3. Added launch.json, settings.json, and app.json.
4. Didn't meet all requirements but managed to lay some groundwork.
5. Struggled with code structure in several places, though still significantly better than GPT-4o.
6. Had at least a couple of pages with zero errors.

Claude Sonnet 3.5
1. Created a Readme.md with project requirement details.
2. Automatically generated the necessary project files.
3. Added launch.json and app.json.
4. Included a test codeunit, though it had errors.
5. Created a permission set for the generated objects.
6. All files had one or more errors.

In my opinion, Claude and o3-mini are the most useful for coding assistance. HumanEval is a test developed by OpenAI to assess how well language models can write code. It includes 164 programming problems where the model must generate accurate and functional Python code. The HumanEval leaderboard aligns with my assessment as well.

Pricing

While all these models offer a free trial with a limited set of tokens, they can become quite expensive if you don't monitor your usage. Below …


How to Connect to a Sandbox (UAT) Database in Dynamics 365 Finance & Operations

Microsoft Dynamics 365 Finance & Operations (D365 F&O) is a powerful enterprise solution that helps businesses streamline their operations. However, troubleshooting issues in D365 F&O can be challenging if the root cause isn't visible on a form. One of the most effective ways to diagnose problems is by connecting to the UAT (Sandbox) database and querying tables directly. This blog will walk you through:
- How to retrieve SQL connection details from LCS (Lifecycle Services)
- How to enable firewall access to allow a secure connection
- How to connect to the D365 UAT database using SQL Server Management Studio (SSMS)

Why Connect to the UAT Database?
- Diagnose Issues: Querying the database allows you to inspect data and troubleshoot errors that aren't visible in the front-end UI.
- Microsoft-Managed Environments: In sandbox/UAT environments, remote desktop access is restricted, making database queries essential for analysis.
- Test Before Deployment: Ensures that all configurations and data changes work as expected before going live.

Step 1: Retrieve SQL Connection Details from LCS
To connect to a D365 F&O UAT database, you must obtain SQL connection details from Lifecycle Services (LCS). Follow these steps:
- Go to Lifecycle Services (LCS)
- Select Your Project
- Find the UAT Environment
- Request Database Access
- Find Database Connection Info

Step 2: Enable Firewall Access for Your IP Address
By default, the D365 UAT database is secured behind a firewall. You must add a rule to allow access from your machine.
- Go to the LCS "Full Details" page for your UAT environment.
- Select: Maintain > Enable Access.
- Add a Firewall Rule.
Note: The firewall rule expires after 8 hours, so you may need to re-add it later.

Step 3: Connect to the UAT Database Using SQL Server Management Studio (SSMS)
The best tool for connecting to the database is Microsoft SQL Server Management Studio (SSMS).
- Launch SSMS and Open the Connection Dialog
- Enter Connection Details from LCS
- Set Database Name in Connection Properties
- Click 'Connect' to Establish the Connection

Key Takeaway
- Direct Access to Data: Enables in-depth troubleshooting by querying database tables directly.
- Secure and Controlled Access: LCS-managed firewall rules ensure data security.
- Easy Setup: The process takes only a few minutes to complete.

By following these steps, you can quickly and efficiently connect to your D365 F&O UAT database and retrieve critical data for testing and issue resolution. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com. Need help troubleshooting your D365 environment? Let us know in the comments!


Cancellation of Old Purchase Orders in D365 F&O

In Dynamics 365 Finance and Operations (D365 F&O), managing the lifecycle of purchase orders (POs) is important for maintaining accurate procurement, inventory, and financial records. Over time, companies may accumulate old or obsolete purchase orders that are no longer valid, whether due to supplier changes, evolving business needs, or operational delays. Cancelling these POs helps keep the system clean, improves reporting accuracy, and prevents unnecessary financial commitments.

Purchase orders can have the following stages and statuses:
Stage: Confirmed, Rejected, Draft, Approved, In review, Finalized
Status: Invoiced, Received, Open order, Cancelled

From a Finance & Accounts point of view, an open PO represents a commitment to order and a contingent financial liability.

Rationale behind cancelling old POs: Cancelling old or unordered POs ensures that your records are up to date and reflect actual business needs, which is important for financial planning, reporting, and auditing. Companies can streamline their procurement processes by maintaining only those purchase orders that are active and required for current business needs. This was an issue faced by one of our clients in the Oil and Gas industry, which was resolved using the method below.

Closing of purchase orders: Purchase orders can be closed only if all the items contained in the purchase order are invoiced and the delivery is completed, i.e. orders with the following stage and status:
Stage: Finalized
Status: Invoiced, Received

Cancelling of purchase orders: Purchase orders with the following stage and status can be cancelled:
Stage: Confirmed, Rejected, Draft, Approved, In review
Status: Open order

In principle, in these cases the PO is no longer required and the requirements are not fulfilled through that particular PO. Hence, it is justified to cancel the PO rather than close it. Click on Cancel quantity. In the case of approved and draft purchase orders, deactivate the workflow and continue with the same process. Purchase orders in Draft can be deleted; however, they would then not be traceable in the system and the number sequences would be disrupted.

By following the above process, companies can maintain only active purchase orders, thereby showing the actual committed value of the organization. Effectively cancelling old purchase orders in D365 F&O is crucial for maintaining clean procurement records, improving reporting accuracy, and ensuring better control over open financial commitments. By following systematic cancellation processes and adhering to best practices, organizations can avoid confusion, prevent overstatement of liabilities, and streamline operational workflows. Regularly reviewing and closing obsolete purchase orders not only enhances system performance but also supports better decision-making for purchasing, budgeting, and inventory management. A disciplined approach to managing old POs ultimately leads to greater efficiency, improved compliance, and stronger financial governance within D365 F&O. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.


From Commit to Inbox: Automating Change Summaries with Azure AI

In our small development team, we usually merge code without formal pull requests. Instead, changes are committed directly by the developer responsible for the project, and while I don't need to approve every change in my role as the senior developer, I still need to stay aware of what's being merged. Manually reviewing each commit was becoming too time-consuming, so I built an automated process using Power Automate, Azure DevOps, and Azure AI. Now, whenever a commit is made, it triggers a workflow that summarizes the changes and sends me an email. This simple system keeps me informed without slowing down the team's work.

Although I kept the automation straightforward, it could easily be extended further. For example, it could be improved to allow me to reply directly to the committer from the email, or even display file changes in detail using a text comparison feature in Outlook. We didn't need that level of detail, but it's a good option if deeper insights are ever required.

Journey

We get started with the Azure DevOps trigger "When code is pushed". Here we specify the organization name, project name, and repository name. We can also specify a branch if we want to limit tracking to just that branch; otherwise, it tracks all branches available to the user.

Then we have a for-each loop that iterates over the "Ref Updates" object array. It contains a list of all the changes but not the exact details. This action also pops up automatically when we configure the next action.

Then we set up an "Azure DevOps REST API request to invoke" action. This connects to Azure DevOps directly, so it is a better choice than a generic REST API action. We specify the relative URL as {Repository Name}/_apis/git/repositories/{Repository ID}/commits/{Commit ID}/changes?api-version=6.0. The Commit ID shows up as newObjectId in the "When code is pushed" trigger.

Then we pass the output of this action to a "Create Text with GPT using a prompt" action under the AI Builder group. I've passed the prompt as below, but it took several trials and errors to get exactly what I wanted.

The last action is a simple "Send an email" one, where I've kept myself as the recipient and added a subject and a body. Now to put it all together and run it – And here is the final output – When the hyperlinks are clicked, they take me straight to Azure DevOps, pointing to the referenced file. For instance, if I click on the Events Codeunit –

Conclusion

Summarizing commit changes is just one way automation can make life easier. This same idea can be applied to other tasks, like summarizing meeting notes, project updates, or customer feedback. With a bit of creativity, we can use tools like this to cut down on repetitive work and free up time to focus on learning new skills or tackling more challenging projects. By finding smart ways to streamline our workflows, we can work more efficiently and open up more time for growth and development. If you need further assistance or have specific questions about your ERP setup, feel free to reach out for personalized guidance. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.


Data-Driven Project Oversight: Selecting the Right Reports for Your Business

In today's fast-paced business landscape, data-driven decision-making is essential for project success. Organizations must navigate vast amounts of data and determine which reports provide the most valuable insights. Effective project oversight relies on selecting the right reports that align with business objectives, operational efficiency, and strategic growth.

The Importance of Data-Driven Oversight

Data-driven project oversight ensures that organizations make informed decisions based on real-time and historical data. It enhances accountability, improves resource allocation, and mitigates risks before they become significant issues. The key to success lies in choosing reports that offer relevant, actionable insights rather than being overwhelmed by excessive, unnecessary data.

Identifying the Right Reports for Your Business

1. Define Your Business Objectives: Before selecting reports, clarify your project goals. Are you monitoring financial performance, tracking project timelines, evaluating team productivity, or assessing risk factors? Each objective requires different metrics and key performance indicators (KPIs).

2. Categorize Reports Based on Project Needs: Reports can be categorized into various types based on their function.

3. Leverage Real-Time and Historical Data: A balanced mix of real-time dashboards and historical trend analysis ensures a comprehensive understanding of project performance. Real-time reports help in immediate decision-making, while historical data provides context and trends for long-term strategy.

4. Customize Reports to Stakeholder Needs: Different stakeholders require different levels of detail. Executives may prefer high-level summaries, while project managers need granular insights. Tailoring reports ensures that each stakeholder receives relevant and actionable information.

5. Automate and Visualize Reports for Better Insights: Leveraging automation tools can streamline report generation and reduce human error. Data visualization tools such as Power BI, Tableau, or built-in reporting features in project management software can enhance comprehension and decision-making.

Real-World Examples of Data-Driven Reports

To illustrate the importance of selecting the right reports, here are two examples:

1. Return Management Dashboard: This dashboard provides an overview of product returns, highlighting trends in return reasons, active cases, and return processing efficiency. By analyzing such reports, businesses can identify common product issues, improve quality control, and streamline return processes.

2. Billable Allocation Report: This report tracks resource allocation in a project, helping businesses monitor utilization rates, availability, and staffing forecasts. By using such reports, companies can optimize workforce planning and reduce underutilization or overallocation of resources.

To conclude, selecting the right reports for project oversight is crucial for achieving business success. By aligning reports with business objectives, categorizing them effectively, leveraging both real-time and historical data, and customizing insights for stakeholders, organizations can enhance efficiency and drive strategic growth. A well-structured reporting framework ensures that project oversight remains proactive, insightful, and results-driven. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.


How to Use the Debugger in Dynamics 365 Finance and Operations

Debugging is an essential skill for developers working with Dynamics 365 Finance and Operations (D365 F&O). The built-in debugger helps you identify and fix issues in your X++ code efficiently. In this blog post, we'll walk through how to use the debugger effectively in D365 F&O.

Prerequisites

Before you can start debugging, you'll need:
- Access to a D365 F&O development environment
- Appropriate permissions (developer role)
- Visual Studio installed (for some debugging scenarios)

Enabling Debugging

Set up debugging permissions:
- Navigate to System administration > Setup > License configuration
- Ensure the "Debugger" privilege is enabled for your user role

Configure debugging options:
- Go to Tools > Options > Development > Debugging
- Configure your preferred debugging settings

Starting a Debug Session

There are several ways to start debugging in D365 F&O:

1. Attaching to a Process
- Open the Debugger workspace
- Click on Attach debugger
- Select the process you want to debug (user session)
- Click Attach

2. Debugging from Visual Studio
- Open your X++ project in Visual Studio
- Set breakpoints in your code
- Press F5 to start debugging (or use the Debug menu)

3. Using Conditional Breakpoints
- Navigate to the form or process you want to debug
- After adding the breakpoint, right-click it in the breakpoints list
- Select "Edit breakpoint"
- In the "Condition" field, enter your X++ expression (example: custAccount == "US-001")

Key Debugging Features

Breakpoints: Breakpoints pause execution at specific lines of code. You can:
- Set conditional breakpoints that only trigger when certain conditions are met
- Set hit count breakpoints that trigger after a specified number of hits
- Enable/disable breakpoints as needed

Stepping Through Code: When execution is paused, you can:
- Step Over (F10): Execute the current line and move to the next
- Step Into (F11): Dive into method calls
- Step Out (Shift+F11): Complete the current method and return to the caller

Examining Variables: The debugger allows you to:
- View local variables in the Locals window
- Add watches for specific variables
- Quickly evaluate expressions in the Immediate window

Call Stack: The call stack shows the hierarchy of method calls that led to the current execution point and allows navigation to different levels of the call stack.

Debugging Different Scenarios

Batch Jobs: To debug batch jobs, set breakpoints in the batch job code, submit the batch job, and attach the debugger to the batch process.

Business Events: To debug business events, set breakpoints in the event handler code and trigger the business event. The debugger will pause when the event is processed.

Tips for Effective Debugging

- Use the debugger's data tips (hover over variables to see their values)

Common Debugging Challenges

- Solution: Use thread debugging and pay attention to execution order
- Solution: Replicate the production environment configuration as closely as possible
- Solution: Use targeted debugging rather than broad breakpoints

To conclude, the D365 F&O debugger is a powerful tool that can save you hours of troubleshooting time. By mastering breakpoints, variable inspection, and call stack navigation, you can quickly identify and resolve issues in your X++ code. Remember to use debugging judiciously in production environments and always follow your organization's guidelines for debugging in live systems. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.


Method of Depreciation – Consumption Depreciation – D365 F&O

A company's financial records should show a true and fair value of its assets and liabilities. In some circumstances, for asset values to be shown correctly, assets have to be depreciated according to their use, for example a car (kilometres run) or a production machine (number of hours run). In such cases, the straight-line or reducing-balance method of depreciation is not appropriate.

How to set up consumption depreciation in D365 F&O:
1. Create a depreciation profile with the method set to Consumption (Fixed assets => Setup => Depreciation profiles).
2. Set up consumption units under Fixed assets => Setup => Consumption depreciation => Consumption units.
3. Set up the consumption factor, either as a percentage or in units.
4. Assign the depreciation method to the specific asset.
5. Run the depreciation proposal by selecting the consumption depreciation proposal.

This depreciation method is applicable to Manufacturing, Transportation & Logistics, Mining & Oil and Gas, Utilities & Energy, Agriculture, and Printing and Publishing, and it is useful for performance-based maintenance and replacement planning. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.
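As a quick illustration of how consumption (units-of-production) depreciation behaves, consider the following worked example; the figures are assumptions for illustration only, not from any specific setup.

Machine cost: 1,000,000; estimated total output over its life: 100,000 units; units produced this period: 8,000
Depreciation per unit = 1,000,000 / 100,000 = 10 per unit
Period depreciation   = 8,000 units x 10 = 80,000

The charge rises and falls with actual usage, which is why this method supports performance-based maintenance and replacement planning.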


Restoring a Deleted Posted Bank Reconciliation in Business Central: A Comprehensive Guide

Are you having trouble restoring a deleted posted bank reconciliation in Microsoft Dynamics 365 Business Central? In this blog, I'm going to guide you through the process of effectively restoring a deleted posted bank reconciliation and ensuring the accuracy of your financial records. You'll learn the step-by-step procedure to re-post a deleted bank reconciliation, along with best practices to prevent future errors and maintain the integrity of your financial data. Let's get started!

Steps to Achieve Goal: The approach below uses two AL objects. The first is a custom list page over the Bank Account Ledger Entry table that exposes the statement-related fields so they can be corrected, allowing the reconciliation to be recreated and re-posted with the same details. The second is a page extension on the Bank Account Statement List that blocks deletion of posted bank reconciliations so the problem cannot recur.

// Editable list over Bank Account Ledger Entry, used to correct the statement fields
page 50100 BankLedgerEntryEditable
{
    ApplicationArea = All;
    Caption = 'Bank Ledger Entry Editable';
    PageType = List;
    SourceTable = "Bank Account Ledger Entry";
    UsageCategory = Lists;
    Permissions = tabledata "Bank Account Ledger Entry" = RIMD;

    layout
    {
        area(Content)
        {
            repeater(General)
            {
                field("Document No."; Rec."Document No.")
                {
                    ToolTip = 'Specifies the document number on the bank account entry.';
                }
                field("Statement No."; Rec."Statement No.")
                {
                    ToolTip = 'Specifies the bank account statement that the ledger entry has been applied to, if the Statement Status is Bank Account Ledger Applied.';
                }
                field("Statement Line No."; Rec."Statement Line No.")
                {
                    ToolTip = 'Specifies the number of the statement line that has been applied to by this ledger entry line.';
                }
                field("Statement Status"; Rec."Statement Status")
                {
                    ToolTip = 'Specifies the statement status of the bank account ledger entry.';
                }
                field(Amount; Rec.Amount)
                {
                    ToolTip = 'Specifies the amount of the entry denominated in the applicable foreign currency.';
                }
                field("Amount (LCY)"; Rec."Amount (LCY)")
                {
                    ToolTip = 'Specifies the amount of the entry in LCY.';
                }
                field("Posting Date"; Rec."Posting Date")
                {
                    ToolTip = 'Specifies the posting date for the entry.';
                }
            }
        }
    }
}

// Prevents posted bank reconciliations (bank account statements) from being deleted in the future
pageextension 50101 PostedBankAccRecon extends "Bank Account Statement List"
{
    trigger OnDeleteRecord(): Boolean
    begin
        Error('You cannot delete a bank account reconciliation entry.');
    end;
}

To conclude, by following these steps you can successfully restore a deleted posted bank reconciliation in Business Central. The process involves editing the Bank Ledger Entry, recreating the bank reconciliation with the same details, and ensuring through AL that posted reconciliations cannot be deleted again. With this approach, you maintain accurate financial records and ensure that your bank account reconciliation process runs smoothly without any discrepancies. If you need further assistance or have specific questions about your Business Central setup, feel free to reach out for personalized guidance. Happy Reconciliation! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.


Using Copilot for simplifying Sales Quote and Order Lines creation in Dynamics 365 Business Central

Microsoft is rapidly integrating Copilot across its ecosystem, empowering users with AI-driven assistance in various business processes. As enterprise systems become more connected, AI gains deeper access to data, enabling automation that eliminates tedious tasks and lets users focus on strategic decisions. In Dynamics 365 Business Central, Copilot can help sales teams generate Sales Quote Lines or Sales Order Lines from a rough prompt. In this blog, we'll explore how to leverage Copilot for a more efficient sales workflow using a sample use case.

References: Copilot in Business Central Overview; Sales Line Suggestions with Copilot

Scenario

One fine morning your sales team receives an email from a customer who's looking to try out your product. Your team goes to Business Central and creates a Sales Quote. In the lines section, they click on the Copilot button and then on "Suggest lines". They can add the text the customer sent them directly or with some minor changes, and Copilot will find the best matching items and suggest lines to the user. You can adjust the matching criteria as required:
- Permissive means that all keywords are optional. This option typically generates the most suggestions.
- Balanced is a blend of required and optional keywords. This option typically generates fewer suggestions.
- Precise means that all keywords are required. This option typically generates the fewest suggestions.

Fast-forward a few days: the customer is happy with your product and sends a bigger order. We can paste the entire description again into the suggest lines box. Copilot handles minor mistakes like spelling errors and mismatched totals without any intervention, and your new sales quote is ready!

The accuracy of the sales lines suggested by Copilot relies heavily on the quality of the data present in the system. I'm not sure why Microsoft hasn't included the same functionality for the Purchase side of things, but I'm sure it's not too far off in the future. There's already a BC Idea raised for this (please vote for it!). If you need further assistance or have specific questions about your ERP setup, feel free to reach out for personalized guidance. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfonts.com.

