Category Archives: Azure and Office 365
Smarter Data Integrations Across Regions with Dynamic Templates
At CloudFronts Technologies, we understand that growing organizations often operate across multiple geographies and business units. Whether you are working with Dynamics 365 CRM or Finance & Operations (F&O), syncing data between systems can quickly become complex, especially when different legal entities follow different formats, rules, or structures. To solve this, our team developed a powerful yet simple approach: Dynamic Templates for Multi-Entity Integration.

The Business Challenge
When a global business operates in multiple regions (such as India, the US, or Europe), each location may have different formats for project codes, financial categories, customer naming, or compliance requirements. Traditional integrations hardcode these rules, making them expensive to maintain and difficult to scale as your business grows.

Our Solution: Dynamic Liquid Templates
We built a flexible, reusable template system that automatically adjusts to each legal entity's specific rules, without the need to rebuild integrations for each one. Here's how it works: a single Liquid template receives each record together with its legal-entity code, and entity-specific rules inside the template decide how the fields are mapped (a simplified sketch appears at the end of this post).

Real-World Success Story
One of our clients needed to integrate project data from CRM to F&O across three different regions. Instead of building three separate integrations, we implemented a single solution with dynamic templates.

What Makes CloudFronts Different
At CloudFronts, we build future-ready integration frameworks. Our approach ensures you don't just solve today's problems, but also prepare your business for tomorrow's growth. We specialize in Microsoft Dynamics 365, Azure, and enterprise-grade automation solutions. "Smart integrations are the key to global growth. Let's build yours."

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
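For readers who want to see what such a template can look like, here is a minimal, illustrative Liquid sketch. It assumes the incoming message carries a legalEntity code; the entity codes and field names are assumptions, not our production rules.

{% comment %} Illustrative only: entity codes and field names are assumptions {% endcomment %}
{% if content.legalEntity == "IN01" %}
{
  "ProjectCode": "IN-{{ content.projectNumber }}",
  "FinancialCategory": "{{ content.indiaCategory }}"
}
{% else %}
{
  "ProjectCode": "US-{{ content.projectNumber }}",
  "FinancialCategory": "{{ content.defaultCategory }}"
}
{% endif %}

Because only the branches differ per legal entity, the same map can be reused for every region, which is what keeps the integration maintainable as new entities are added.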
How We Used Azure Blob Storage and Logic Apps to Centralize Dynamics 365 Integration Configurations
Managing multiple Dynamics 365 integrations across environments often becomes complex when each integration depends on static or hardcoded configuration values such as API URLs, headers, secrets, or custom parameters. We faced similar challenges until we centralized our configuration strategy, using Azure Blob Storage to host the configs and Logic Apps to dynamically fetch and apply them during execution. In this blog, we'll walk through how we implemented this architecture and simplified config management across our D365 projects.

Why We Needed Centralized Config Management
In projects with multiple Logic Apps and D365 endpoints, every workflow carried its own copy of values such as API URLs, headers, and secrets, so a single change had to be repeated in each workflow.

Solution Architecture Overview
The key components are an Azure Blob Storage account that hosts the configuration files and Logic Apps that fetch and parse the configuration during execution.

Step-by-Step Implementation
Step 1: Store the config in Azure Blob Storage. Example JSON:
{
  "apiUrl": "https://externalapi.com/v1/",
  "apiKey": "xyz123abc",
  "timeout": 60
}
Step 2: Build a Logic App to read the config.
Step 3: Parse and use the config (see the sketch at the end of this post).
Step 4: Apply the same pattern to all Logic Apps.

Benefits of This Approach
To conclude, centralizing D365 integration configs using Azure Blob and Logic Apps transformed our integration architecture. It made our systems easier to maintain, more scalable, and resilient to changes. Are you still hardcoding configs in your Logic Apps or Power Automate flows? Start organizing your integration configs in Azure Blob today, and build workflows that are smart, scalable, and maintainable. I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
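As an illustrative sketch of Steps 2 and 3 (the action names here are assumptions), the Logic App can read the blob with the Azure Blob Storage "Get blob content" action and feed it into a Parse JSON action with a schema like the one below; later actions then read values with expressions such as body('Parse_config')?['apiUrl'] instead of hardcoded literals.

{
  "type": "object",
  "properties": {
    "apiUrl": { "type": "string" },
    "apiKey": { "type": "string" },
    "timeout": { "type": "integer" }
  }
}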
Creating and Accessing Blob Storage with Azure Data Factory: A Complete Guide
Introduction: This guide will walk you through creating and accessing Azure Blob Storage and integrating it with Azure Data Factory to automate data pipelines. From setting up a storage account and managing containers to configuring pipelines and transferring data to an Azure SQL Database, this step-by-step tutorial ensures you gain a comprehensive understanding of the process.

Steps:
3. Click + Create to initiate the creation of a new storage account.
4. Fill in the required fields such as subscription, resource group, and region, and review all the settings before proceeding.
5. Click Create to create the storage account.
6. Once the storage account is created, go to the resource by clicking Go to Resource.
7. In the storage account, navigate to the Containers section and click + Container to create a new container for storing your files.
8. Click on the container you just created to access its contents.
9. Upload the desired JSON file into the container by clicking Upload and selecting the file from your local system.
10. Ensure that the uploaded file is now listed in the container.
11. Go back to the Azure Portal and search for Azure Data Factory to open the ADF service.
12. From the ADF home screen, go to Author > Datasets and click + New Dataset to create a new dataset for your Blob Storage.
13. Select the Azure Blob Storage dataset type, as you are working with data stored in Blob Storage.
14. Choose the data format that matches the file you uploaded, such as JSON, and click Continue.
15. Enter the necessary details for your dataset, including the file path and format settings. Select the appropriate authentication type, specify the storage account where the Blob Storage resides, and click Create to finalize the dataset creation.
16. Verify the settings and click OK to confirm the dataset configuration.
17. Navigate to the Pipelines section and click + New Pipeline to create a pipeline that will define your data flow.
18. The pipeline is created successfully.
19. In the pipeline, select the dataset type as Azure SQL Database and click Continue to set up the SQL Database dataset.
20. Provide the necessary linked service details for your SQL database and click Create.
21. After configuring the source dataset, the target dataset, and the pipeline, publish all the elements to save your work.
22. Once the pipeline runs successfully, verify its functionality by querying the destination database to ensure data is being transferred properly:
a. Go to the SQL Database service and select the relevant database.
b. Select the database on which you want to run the query.
c. Log in with your credentials.
d. Write a simple test query to verify that data has been transferred from Blob Storage to the SQL Database (a sample query follows these steps), then execute it and confirm that the expected output is returned.
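For step 22(d), a minimal verification query might look like the following; the table name dbo.ProjectData is only an example, so substitute the table your pipeline writes to.

SELECT TOP (10) * FROM dbo.ProjectData;
-- or simply count the loaded rows
SELECT COUNT(*) AS LoadedRows FROM dbo.ProjectData;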
Conclusion: Integrating Azure Blob Storage with Azure Data Factory is a powerful way to manage and automate data workflows in the cloud. This guide walked you through creating a storage account, configuring containers, uploading data, and designing a pipeline to process and transfer data to Azure SQL Database. By following these steps, you can efficiently handle large-scale data integration and ensure seamless communication between your data sources and destinations. Azure Data Factory not only simplifies the process of orchestrating data pipelines but also provides robust options for monitoring and optimizing workflows. Whether you are managing JSON files, processing transactional data, or setting up complex ETL processes, Azure's ecosystem offers a reliable and scalable solution. Start exploring these tools today to unlock new possibilities in data-driven operations! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Integrating Project Operations with Financial Platforms
Introduction
Dynamics 365 Project Operations (PO) is a project management application within the Dynamics 365 suite. It is designed to manage project-related tasks, schedules, resources, and budgets. While it includes some financial functionality, it lacks the comprehensive financial management capabilities that dedicated financial platforms offer. In this article, we will explore several functions that Project Operations cannot perform as effectively as financial platforms like QuickBooks (QB) or Dynamics 365 Business Central (BC). We will also discuss how to bridge this gap and create a seamless integration between Project Operations and these financial platforms. Let's first look at where Project Operations falls short and what financial platforms like QuickBooks or Dynamics 365 Business Central can offer.

Accounting Functionalities
General Ledger Management: Financial platforms provide robust general ledger management, allowing for detailed tracking and reporting of all financial transactions across the entire organization.
Accounts Payable and Receivable: They manage accounts payable (AP) and accounts receivable (AR) efficiently, including invoicing, bill payments, and collections.
Tax Compliance: Financial platforms are equipped with tools to manage tax calculations, filings, and compliance with local and international tax regulations.
Financial Reporting: Financial platforms offer extensive reporting capabilities, including profit and loss statements, balance sheets, and customizable financial reports.
Audit Trails: Financial platforms maintain detailed audit trails of all financial transactions, which are crucial for internal audits and external regulatory audits.

To leverage the project management features of Project Operations and the above-discussed features of financial platforms, businesses often choose to integrate both systems.

Integration Approach
Custom integration offers the utmost flexibility when connecting Project Operations with QuickBooks or Business Central. Several key considerations and entities are important to ensure a seamless integration:

Data Mapping:
Tables: Identify the key entities (tables) such as projects, expenses, invoices, customers, vendors, contacts, and accounts that need to be synchronized between Project Operations and the financial platform.
Mapping: Map the fields and attributes of these entities between the two systems to ensure accurate data transfer and synchronization.
Tip: The best practice is to create a mapping Excel sheet for maintaining the table and column mappings between the systems; a simplified illustration follows below.
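As a simplified illustration of such a table and column mapping (the entity and field names below are examples rather than a prescribed schema), the mapping sheet can also be kept in a machine-readable form:

{
  "sourceTable": "msdyn_project",
  "targetTable": "Job",
  "direction": "Project Operations -> Business Central",
  "fields": [
    { "source": "msdyn_subject", "target": "Description" },
    { "source": "msdyn_customer", "target": "Bill-to Customer No." }
  ]
}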
Chart of Accounts (COA):
Chart of Accounts: Proper alignment between the chart of accounts in Project Operations and the financial platform is necessary to facilitate accurate financial reporting and reconciliation.
Tip: Creating custom tables for your Chart of Accounts (COAs) and designating the financial system as the source of truth for COAs is recommended. This approach offers the flexibility to associate COAs with expenses, materials, roles, and so on.

API Integration:
API Access: Check whether the financial platforms offer APIs for integration.
Integration Points: Determine the integration points where data will be exchanged between the two systems, such as project creation, expense tracking, invoice generation, and payment reconciliation.

Data Flow:
Data Direction: Define the direction of data flow between Project Operations and the financial platform, ensuring consistency and integrity of data. The source and target systems should be defined.
Real-Time Sync: Decide whether data synchronization will occur in real time or through scheduled batch processes to meet business requirements.

Currency:
Currency Conversion: Consider currency conversion requirements when dealing with contracts or transactions in multiple currencies.

Error Handling and Logging:
Error Handling: Implement mechanisms to handle data validation errors, inconsistencies, and exceptions during data transfer between systems.
Logging: Maintain logs of integration activities and errors for troubleshooting, audit trails, and compliance purposes.

Security:
Authentication: Implement secure authentication mechanisms to ensure data privacy and integrity during data exchange between systems.
Access Control: Define roles and permissions to restrict access to sensitive data and functionalities based on user roles and responsibilities.

Testing:
Set up a dedicated testing phase to validate the integration setup, data mappings, and synchronization processes before deploying to production.

Integration Process Flow Diagrams:
Create a process flow diagram for each entity; for example, an integration process flow diagram for integrating Accounts, Contacts, and Vendors from Project Operations to QuickBooks.

In conclusion, while Project Operations is essential for managing the operational aspects of projects, it lacks the depth and breadth of functionality offered by dedicated financial platforms. Financial platforms provide accounting, regulatory compliance, advanced financial reporting, cash flow management, and more, which are crucial for the overall financial health and strategic planning of an organization. Integrating these platforms with Project Operations leverages the strengths of both, ensuring efficient project management and robust financial oversight.

Here is our featured customer success story: Armexa, a leading US-based industrial cybersecurity company, partnered with CloudFronts for services automation with Microsoft Dynamics 365 Project Operations and Business Central.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Setting Up Business Central On-Premises (BC230) on a Virtual Machine
Introduction: Microsoft Dynamics 365 Business Central (formerly known as Microsoft Dynamics NAV) is a comprehensive business management solution that helps organizations streamline their financials, supply chain, sales, and customer service processes. It offers robust features for managing various aspects of your business, from inventory control to financial reporting. In this blog post, we'll focus on the steps to download and install Business Central on-premises on a virtual machine. Whether you're a developer, IT administrator, or business user, understanding this process is essential for setting up a local environment to explore and work with Business Central.

Pre-requisites:

Steps:
2. Choose the region for Business Central.
3. Extract the downloaded file.
4. Go to the extracted folder and click Setup.
5. Choose Advanced Installation Options -> Choose an Installation Option -> Custom.
6. Make all the listed components available (Run from My Computer or Run all from My Computer).
7. Make the necessary changes.
8. Go to the Azure Portal and assign the DNS name to the virtual machine.
9. After the installation completes successfully, open Windows PowerShell ISE with "Run as Administrator" and execute the commands below line by line:
Set-ExecutionPolicy Unrestricted -Force
Import-Module 'C:\Program Files\Microsoft Dynamics 365 Business Central\230\Service\NavAdminTool.ps1'
Get-NAVServerConfiguration -ServerInstance BC230
Set-NAVServerConfiguration -ServerInstance BC230 -KeyName EnableDebugging -KeyValue true
Set-NAVServerConfiguration -ServerInstance BC230 -KeyName DeveloperServicesEnabled -KeyValue true
Restart-NAVServerInstance -ServerInstance BC230
Get-NAVServerUser BC230
Set-NAVServerUser BC230 -Company 'CRONUS International Ltd.'
Note: Executing the New-SelfSignedCertificate command generates a thumbprint. Please retain the thumbprint ID for your reference, as it is needed in step 12.
New-SelfSignedCertificate -DnsName "www.shubhazure.eastus.cloudapp.azure.com" -CertStoreLocation "Cert:\LocalMachine\My"
10. After the certificate is created, go to Manage Computer Certificates, right-click the certificate, choose All Tasks > Manage Private Keys, add NETWORK SERVICE and allow access to all the users, and copy the certificate to the Enterprise Trust, Trusted People, Trusted Publishers, and Trusted Devices folders.
11. Go to IIS Manager, select the BC230 site, and click Browse *:8080 (http).
12. Change the credential type and add the thumbprint:
Set-NAVServerConfiguration -ServerInstance BC230 -KeyName ServicesCertificateThumbprint -KeyValue <Thumbprint>
Set-NAVServerConfiguration -ServerInstance BC230 -KeyName ClientServicesCredentialType -KeyValue NavUserPassword
13. Change the credential type in the navsettings.json file; this tells the Business Central web client which credential type to use. Go to C:\inetpub\wwwroot\<WEB SERVER INSTANCE>\navsettings.json.
14. Go to Users in Business Central and set the password for the user.
15. Bind your web server instance to the SSL / self-signed certificate in IIS (a sample PowerShell sketch is included at the end of this post).
16. Restart the server instance in Business Central Administration and the web server instance in IIS.

Result: After entering the credentials, you will get access to Business Central.

Conclusion: Thus, in this blog we saw how to download and install Business Central (BC230) on a virtual machine. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
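For step 15, the binding can also be scripted. The following is a minimal sketch, assuming the default Business Central web client site name and the self-signed certificate created in step 9; the site name and port are assumptions, so adjust them to your environment.

Import-Module WebAdministration
# Create an HTTPS binding on the Business Central web client site (site name and port are examples)
New-WebBinding -Name "Microsoft Dynamics 365 Business Central Web Client" -Protocol https -Port 443
# Find the self-signed certificate created earlier and attach it to the new binding
$cert = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Subject -like "*shubhazure*" }
$binding = Get-WebBinding -Name "Microsoft Dynamics 365 Business Central Web Client" -Protocol https
$binding.AddSslCertificate($cert.Thumbprint, "My")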
How to create a Virtual Machine on the Azure Portal?
Introduction: A virtual machine (VM) is like a computer within a computer. It is software that emulates a physical computer, allowing you to run multiple operating systems and applications on a single physical machine. VMs are useful for testing software, running different operating systems simultaneously, isolating applications, and consolidating hardware resources. They provide flexibility, scalability, and cost savings by reducing the need for physical hardware and allowing for more efficient resource utilization.

Pre-requisites:

Configuration: To create a Virtual Machine using the Azure Portal:
1. Go to the Azure Portal and sign in with your credentials. Once you are signed in, you will see the Azure Portal dashboard, which is customizable and can be tailored to your needs.
2. To create a new resource, click the Create a resource button on the left-hand side of the dashboard. You will be taken to a page where you can select the type of resource you want to create.
3. Choose the appropriate resource type and follow the prompts to create it. To create a Virtual Machine, select Virtual Machine and enter the required details and credentials.
4. After the deployment is complete, click Go to resource.
5. Click Connect and download the RDP file. The Virtual Machine is now created.
6. To start the Virtual Machine, select it and click Start VM.
7. Go to your downloads, open the downloaded RDP file, and click Connect. Enter the password you set during the creation of the Virtual Machine.

If you prefer scripting over the portal, a short PowerShell sketch follows this post.

Conclusion: Thus, in this blog we saw how to create a virtual machine, which can then be used to set up Business Central. Thank you!
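For readers who prefer scripting, the same VM can be created with Azure PowerShell. This is a minimal sketch under assumed names and values (resource group, VM name, region, image alias, and size are all examples):

# Sign in and create a Windows VM (all names and values below are examples)
Connect-AzAccount
$cred = Get-Credential   # local admin username and password for the new VM
New-AzVM -ResourceGroupName "rg-demo" `
         -Name "vm-bc-demo" `
         -Location "eastus" `
         -Image "Win2019Datacenter" `
         -Size "Standard_D2s_v3" `
         -Credential $cred `
         -OpenPorts 3389

Once the deployment finishes, you can connect over RDP on port 3389, just as in the portal-based steps above.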
Secure your Environment Variable inside Azure Function using Key Vault
In the previous blog, we learned how to use environment variables in an Azure Function. Environment variables are often confidential, as they may contain system credentials or configuration values that you do not want anyone to access directly. Storing them in plain text in the Azure Function's Configuration is not secure, so in this blog we will see how to store environment variables in Azure Key Vault and use a Key Vault reference in the Configuration. To create an Azure Function that uses environment variables, refer to my previous blog.

Step 1: Create a Key Vault
Log in to the Azure Portal, click + Create a resource, search for "Key Vault", and click Create. You can create a new resource group or select an existing one based on your preference (keeping all resources for a single project in one resource group makes them easier to manage project-wise). You can set up the access policy during creation or after the Key Vault is created; we will set up an access policy for the Azure Function later in this blog.

Step 2: Set an Access Policy for the Azure Function
Open the Azure Function in which you want to use the Key Vault reference. Navigate to the Identity tab and set the System assigned status to On. Copy the Object (principal) ID, as we will use it when adding the access policy. Open the Key Vault, navigate to the Access policies section, and click + Add Access Policy. Add only the Get permission in the Secret permissions section; you should grant only the permissions your Azure Function actually needs. Select the principal by pasting the copied Object ID into the search box to find it easily, then click Add. Once you add the resource in the access policy, your Azure Function is allowed to perform Get operations on the secrets in the Key Vault. Do not forget to click Save after adding the policy.

Step 3: Add a Secret and Configure It inside the Azure Function
Navigate to the Secrets section and click + Generate/Import. Enter the name and value (credentials/configuration). Open the current version and copy the secret URL, as it will be required when configuring the reference. Navigate to the Configuration section of the Azure Function, change the setting, and add the Key Vault reference as below:
@Microsoft.KeyVault(SecretUri=<Copied SecretURL>)
Do not forget to Save after changing the configuration.

Step 4: Testing Using Postman
We will need an API testing tool; here I am using Postman, which you can download from https://www.postman.com/downloads/
Copy the Function URL and send a POST request. As a result, you will notice the username now comes from Key Vault, since we changed the configuration.

Conclusion: This is how you can secure your environment variables using Azure Key Vault. You can set up multiple Key Vaults for different deployments and access policies depending on your requirements. When you store credentials and configuration in Key Vault, you just need to set up the access policy. All configuration is managed by the Azure admin; if the organization has a policy of not sharing credentials, it can follow this process and share only the Key Vault secret URL with the developer, who then only needs to configure it. (If you prefer to script this setup, a short PowerShell sketch follows this post.)
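The Key Vault side of this setup can also be scripted with Azure PowerShell. A minimal sketch, assuming a vault named kv-demo and the function app's system-assigned identity; the vault name, secret name, secret value, and object ID are placeholders:

# Grant the Function App's managed identity permission to read secrets (names are examples)
Set-AzKeyVaultAccessPolicy -VaultName "kv-demo" `
    -ObjectId "<function-app-object-id>" `
    -PermissionsToSecrets get

# Create the secret that the app setting will reference
$value = ConvertTo-SecureString "<your-secret-value>" -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName "kv-demo" -Name "ApiUsername" -SecretValue $value

The app setting then points at the secret with the @Microsoft.KeyVault(SecretUri=...) reference described above.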
How to call a Logic App from an Azure Function App
In this blog, we will see how to trigger a Logic App from an Azure Function App. Let's start by creating a Logic App (HTTP triggered) that sends an email when its URL is called from the Function App.

Step 1: Create a Logic App resource and start by adding an HTTP GET request trigger. The URL is generated automatically after you save the Logic App.
Step 2: Add a send email action after the trigger so that you receive an email when the Logic App runs.
Step 3: Now create a Function App project in Visual Studio by creating a new project.
Step 4: Enter the project name, select the type of function (an HTTP trigger, since we will call it from a browser), and click Create.
Step 5: Add the code that calls the Logic App URL to the auto-generated code template in Visual Studio (an illustrative sketch is included at the end of this post).
Step 6: Run the function locally; you will get a URL in the command prompt. Copy it and paste it into your browser.
Step 7: After browsing to the URL, you should receive the email configured in the Logic App.

In this way, the Logic App is triggered successfully using the Function App. Hope this blog helps you. Thank you!!
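Here is a minimal sketch of what the function code in step 5 can look like, assuming the in-process C# HTTP-trigger template; the function name and the Logic App URL placeholder are assumptions:

using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class TriggerLogicApp
{
    // Reuse a single HttpClient instance across invocations
    private static readonly HttpClient client = new HttpClient();

    [FunctionName("TriggerLogicApp")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequest req,
        ILogger log)
    {
        // Paste the HTTP GET URL generated by your Logic App here
        const string logicAppUrl = "<your-logic-app-http-get-url>";

        // Calling the URL fires the Logic App, which then sends the email
        HttpResponseMessage response = await client.GetAsync(logicAppUrl);
        log.LogInformation($"Logic App responded with {response.StatusCode}");

        return new OkObjectResult("Logic App triggered.");
    }
}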
How to create a table using Azure Logic Apps with proper formatting
In this blog, we will see how to create a table in HTML using an Azure Logic App. Logic Apps does include a built-in "Create HTML table" action, but it does not give much formatting flexibility, so in this blog I will explain how to use a Compose action to build a table with your own HTML syntax.

Step 1: To start the Logic App I used a Recurrence trigger, set to run once a day; you can use any trigger as per your requirement.
Step 2: The Compose action is the important one, as this is where we write the HTML that formats the table. For the demo I used sample data, but you can insert dynamic content fields as well (a sample of the HTML is shown at the end of this post).
Step 3: The output of the Compose action is then sent as the body of the email.

Hope this blog helps you. Thank you!!
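For reference, this is the kind of HTML that can go inside the Compose action; the columns and values are sample data, and dynamic content from earlier actions can be inserted in place of the literal values:

<table style="border-collapse:collapse; width:100%;">
  <tr style="background-color:#4472C4; color:#ffffff;">
    <th style="border:1px solid #999999; padding:6px;">Name</th>
    <th style="border:1px solid #999999; padding:6px;">Status</th>
  </tr>
  <tr>
    <td style="border:1px solid #999999; padding:6px;">Sample record 1</td>
    <td style="border:1px solid #999999; padding:6px;">Active</td>
  </tr>
  <tr>
    <td style="border:1px solid #999999; padding:6px;">Sample record 2</td>
    <td style="border:1px solid #999999; padding:6px;">Inactive</td>
  </tr>
</table>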
Update records in Dynamics CRM using Azure Logic Apps
In this blog, we will see how to update records in Dynamics CRM with the help of a Logic App workflow.

Step 1: Add the Recurrence trigger to the Logic App and set it to run at a one-day interval (you can set any interval). Without a trigger, you cannot create a Logic App.
Step 2: Add a new step after the Recurrence trigger.
Step 3: Add the List records action from the Dynamics 365 connector, connect to CRM with your credentials, and select the Accounts entity.
Step 4: For testing purposes, I created a test account (account number = 1001) in the UAT environment.
Step 5: Initialize a variable with the account number 1001, the account whose data you want to change or update.
Step 6: Filter the list of accounts where the account number equals 1001, as mentioned in the step above (a sample filter expression is shown at the end of this post).
Step 7: After finding the record in the account list, update it; here I updated the account name and city. (Note: the account number should be unique.)

Result: The account record is updated with the new name and city.

Hope this blog helps you. Thank you!
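As a small illustration for step 6, the filter can be applied either in the List records action's Filter Query field (OData) or with a Filter array action; the variable name below is an assumption:

Filter Query (OData): accountnumber eq '1001'
Filter array condition: @equals(item()?['accountnumber'], variables('AccountNumber'))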