
Category Archives: Azure Function

Smarter Data Integrations Across Regions with Dynamic Templates

At CloudFronts Technologies, we understand that growing organizations often operate across multiple geographies and business units. Whether you're working with Dynamics 365 CRM or Finance & Operations (F&O), syncing data between systems can quickly become complex, especially when different legal entities follow different formats, rules, or structures. To solve this, our team developed a powerful yet simple approach: Dynamic Templates for Multi-Entity Integration.

The Business Challenge

When a global business operates in multiple regions (like India, the US, or Europe), each location may have different formats for project codes, financial categories, customer naming, or compliance requirements. Traditional integrations hardcode these rules, making them expensive to maintain and difficult to scale as your business grows.

Our Solution: Dynamic Liquid Templates

We built a flexible, reusable template system that automatically adjusts to each legal entity's specific rules, without the need to rebuild integrations for each one. Here's how it works: each entity's formatting rules live in their own template, and the integration selects and renders the right one at runtime (a code sketch at the end of this post illustrates the idea).

Why This Matters for Your Business

Real-World Success Story

One of our clients needed to integrate project data from CRM to F&O across three different regions. Instead of building three separate integrations, we implemented a single solution with dynamic templates. The result?

What Makes CloudFronts Different

At CloudFronts, we build future-ready integration frameworks. Our approach ensures you don't just solve today's problems but prepare your business for tomorrow's growth. We specialize in Microsoft Dynamics 365, Azure, and enterprise-grade automation solutions. "Smart integrations are the key to global growth. Let's build yours."

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
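As a closing illustration, here is a minimal sketch of entity-specific Liquid rendering in C#, using the open-source DotLiquid library. The entity codes, field names, and formatting rules below are invented for illustration; they are not the actual templates from this project.

using System;
using System.Collections.Generic;
using DotLiquid; // NuGet: DotLiquid

class DynamicTemplateDemo
{
    // One Liquid template per legal entity. In a real integration these
    // would live in storage or configuration rather than in code.
    static readonly Dictionary<string, string> TemplatesByEntity =
        new Dictionary<string, string>
        {
            ["IN"] = "IN-{{ project_code }} | {{ project_name | upcase }}",
            ["US"] = "US_{{ project_code }} | {{ project_name }}"
        };

    static string RenderForEntity(string entityCode, string code, string name)
    {
        // Select the template for this legal entity and render the payload
        // through it; adding a region means adding a template, not code.
        Template template = Template.Parse(TemplatesByEntity[entityCode]);
        return template.Render(Hash.FromAnonymousObject(
            new { project_code = code, project_name = name }));
    }

    static void Main()
    {
        Console.WriteLine(RenderForEntity("IN", "1042", "Rollout")); // IN-1042 | ROLLOUT
        Console.WriteLine(RenderForEntity("US", "1042", "Rollout")); // US_1042 | Rollout
    }
}

Because the entity-specific rules live in data rather than code, onboarding a new region means adding one more template instead of rebuilding the integration.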


Create No Code Powerful AI Agents – Azure AI Foundry

An AI agent is a smart program that can think, make decisions, and do tasks. Sometimes it works alone, and sometimes it works with people or other agents. The main difference between an agent and a regular assistant is that agents can act on their own. They don't just help; you can give them a goal, and they'll try to reach it. Every AI agent has three main parts. Agents can take input like a message or a prompt and respond with answers or actions. For example, they might look something up or start a process based on what you asked. Azure AI Foundry is a platform that brings all these things together so you can build, train, and manage AI agents easily.

References

- What is Azure AI Foundry Agent Service? – Azure AI Foundry | Microsoft Learn
- Understanding deployment types in Azure AI Foundry Models – Azure AI Foundry | Microsoft Learn
- https://learn.microsoft.com/en-us/azure/ai-foundry/how-to/index-add

Usage

First, we create a project in Azure AI Foundry. Click Next, give your project a name, and wait until the setup finishes. Once project creation finishes, we are greeted with this screen. Click on the Agents tab, then click Next to choose the model. I'm currently using GPT-4o Mini; the picker also includes descriptions for all the available models.

Then we configure the deployment details. There are multiple deployment types available, such as:

- Global Deployments
- Data Zone Standard Deployments
- Standard Deployments

Standard deployments [Standard] follow a pay-per-use model, perfect for getting started quickly. They're best for low to medium usage with occasional traffic spikes. However, for high and steady loads, performance may vary. Provisioned deployments [ProvisionedManaged] let you pre-allocate the amount of processing power you need. This is measured in Provisioned Throughput Units (PTUs). Each model and version requires a different number of PTUs and offers different performance levels. Provisioned deployments ensure predictable and stable performance for large or mission-critical workloads.

This is how the deployment details look for Global Standard. I'll be choosing the Standard deployment for our use case. Click Deploy and wait a few seconds.

Once the deployment is complete, you can give your agent a name and some instructions for its behavior. You should specify the tone, end goal, verbosity, and so on. You can also set the Temperature and Top P values, which both control the randomness or creativity of the model (a short code sketch of these two settings appears a little further below).

Temperature controls how bold or cautious the model is. Lower temperature = safer, more predictable answers (factual Q&A, code summarization). Higher temperature = more creative or surprising answers (poetry, creative writing).

Top P (nucleus sampling) controls how wide the model's word choices are. Lower Top P = only picks from the most likely words (legal or financial writing). Higher Top P = includes less likely, more diverse words (brainstorming names).

Next, I'll add a knowledge base to my bot. For this example, I'll just upload a single file. However, you also have the option to add a SharePoint folder or files, or connect it to Bing Search, Microsoft Fabric, Azure AI Search, and so on, as required.
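For developers who later want the same Temperature and Top P controls outside the portal, here is a minimal sketch that calls a deployed model through the Azure OpenAI chat completions REST endpoint. The resource endpoint, deployment name, key variable, and api-version are placeholders you would replace with your own values.

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class TemperatureTopPDemo
{
    static async Task Main()
    {
        // Placeholders: substitute your own resource endpoint, deployment
        // name, and key; the api-version may need updating over time.
        var endpoint   = "https://<your-resource>.openai.azure.com";
        var deployment = "gpt-4o-mini";
        var apiKey     = Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY");

        var url = $"{endpoint}/openai/deployments/{deployment}/chat/completions" +
                  "?api-version=2024-02-01";

        // temperature and top_p are the two sampling knobs described above:
        // low values keep answers predictable, high values diversify them.
        var body = @"{
            ""messages"": [ { ""role"": ""user"", ""content"": ""Explain vector stores in one line."" } ],
            ""temperature"": 0.2,
            ""top_p"": 0.9
        }";

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Add("api-key", apiKey);
        var response = await http.PostAsync(url,
            new StringContent(body, Encoding.UTF8, "application/json"));
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}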
A vector store in Azure AI Foundry helps your AI agent retrieve relevant information based on meaning rather than just keywords. It works by breaking your content (like a PDF) into smaller parts, converting them into numerical representations (embeddings), and storing them. When a user asks a question, the AI finds the most semantically similar parts from the vector store and uses them to generate accurate, context-aware responses.

Once you select the file, click Upload and Save. At this point, you can start to interact with your model: to "play around" with it, click the "Try in Playground" button. And here, we can see the output based on our provided knowledge base. One more example, just because it is kind of fun.

Every input that you provide to the agent is called a "message". Every time the agent is invoked to process the provided input, that is called a "run". Every interaction session with the agent is called a "thread". We can see all the open threads in the Threads section.

To conclude, Azure AI Foundry makes it easy to build and use AI agents without writing any code. You can choose models, set how they behave, and connect your data, all through a simple interface. Whether you're testing ideas, automating tasks, or building custom bots, Foundry gives you the tools to do it. If you're curious about AI or want to try building your own agent, Foundry is a great place to begin.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Struggling with Siloed Systems? Here’s How CloudFronts Gets You Connected

In today's world, we use many different applications for our daily work. One single application can't handle everything, because some apps are designed for specific tasks. That's why organizations use multiple applications, which often leads to data being stored separately or in isolation. In this blog, we'll take you on a journey from siloed systems to connected systems through a customer success story.

About BÜCHI

Büchi Labortechnik AG is a Swiss company renowned for providing laboratory and industrial solutions for R&D, quality control, and production. Founded in 1939, Büchi specializes in a range of laboratory technologies. Their equipment is widely used in pharmaceuticals, chemicals, food & beverage, and academia for sample preparation, formulation, and analysis. Büchi is known for its precision, innovation, and strong customer support worldwide.

Systems Used by BÜCHI

To streamline operations and ensure seamless collaboration, BÜCHI leverages a variety of enterprise systems. Infor and SAP Business One are utilized for managing critical business functions such as finance, supply chain, manufacturing, and inventory.

Reporting Challenges Due to Siloed Systems

Organizations often rely on multiple disconnected systems across departments, such as ERP, CRM, marketing platforms, spreadsheets, and legacy tools. These siloed systems result in fragmented, inconsistent reporting.

The Need for a Single Source of Truth

To solve these challenges, it's critical to establish a Single Source of Truth (SSOT): a central, trusted data platform where all key business data is consolidated, consistent, and up to date.

How We Helped Büchi Connect Their Systems

To build a seamless and scalable integration framework, we leveraged the following Azure services:

>Azure Logic Apps – Enabled no-code/low-code automation for integrating applications quickly and efficiently.
>Azure Functions – Provided serverless computing for lightweight data transformations and custom logic execution.
>Azure Service Bus – Ensured reliable, asynchronous communication between systems with FIFO message processing and decoupling of sender/receiver availability.
>Azure API Management (APIM) – Secured and simplified access to backend services by exposing only the required APIs, enforcing policies like authentication and rate limiting, and unifying multiple APIs under a single endpoint.

BÜCHI's case study was published on the Microsoft website, highlighting how CloudFronts helped connect their systems and prepare their data for insights and AI-driven solutions.

Why a Single Source of Truth (SSOT) Is Important

A Single Source of Truth means having one trusted location where your business stores consistent, accurate, and up-to-date data.

Key Reasons It Matters:

How We Did This

We used Azure Function Apps, Service Bus, and Logic Apps to seamlessly connect the systems (a minimal sketch of one such function appears at the end of this post). Databricks was implemented to build a Unity Catalog, establishing a Single Source of Truth (SSOT). On top of this unified data layer, we enabled advanced analytics and reporting using Power BI.

In May, we hosted an event with BÜCHI at the Microsoft office in Zurich. During the session, one of the attending customers remarked, "We are five years behind BÜCHI." Another added, "If we don't start now, we'll be out of the race in the future." This clearly reflects the urgent need for businesses to evolve. Today, Connected Systems, a Single Source of Truth (SSOT), Advanced Analytics, and AI are not optional; they are essential for sustainable growth and improved human efficiency.

The pace of transformation has accelerated: tasks that once took months can now be achieved in days, and soon, perhaps, with just a prompt. To conclude, if you're operating with multiple disconnected systems and relying heavily on manual processes, it's time to rethink your approach. System integration and automation free your teams from repetitive work and empower them to focus on high-impact, strategic activities.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
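As a flavor of what that plumbing looks like, here is a minimal sketch of an Azure Function that fires when a message lands on a Service Bus queue. The queue name, connection setting, and payload handling are invented for illustration; they are not BÜCHI's actual implementation.

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ErpSyncFunction
{
    // Service Bus trigger: the queue decouples sender and receiver, so the
    // source system can publish even while the consumer is offline.
    // "erp-sync-queue" and "ServiceBusConnection" are hypothetical names.
    [FunctionName("ErpSyncFunction")]
    public static void Run(
        [ServiceBusTrigger("erp-sync-queue", Connection = "ServiceBusConnection")]
        string message,
        ILogger log)
    {
        log.LogInformation("Received message for transformation: {message}", message);
        // Lightweight transformation or custom logic goes here, after which
        // the result is forwarded to the downstream system or data platform.
    }
}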


How to Use Webhooks for Real-Time CRM Integrations in Dynamics 365 

Are you looking for a reliable way to integrate Dynamics 365 CRM with external systems in real time? Polling APIs or scheduling batch jobs can delay updates and increase complexity. What if you could instantly notify external services when key events happen inside Dynamics 365?

This blog will show you how to use webhooks, an event-driven mechanism, to trigger real-time updates and data exchange with external services, making your integrations faster and more efficient.

A webhook is a user-defined HTTP callback that is triggered by specific events. Instead of your external system repeatedly asking Dynamics 365 for updates, Dynamics 365 pushes updates to your service instantly. Dynamics 365 supports registering webhooks through plugin steps that execute when specific messages (create, update, delete, etc.) occur. This approach provides low latency and ensures that your external systems always have fresh data. This walkthrough covers the end-to-end process of configuring webhooks in Dynamics 365, registering them via plugins, and securely triggering external services.

What You Need to Get Started

Step 1: Create Your Webhook Endpoint

(A minimal receiver sketch appears at the end of this post.)

Step 2: Register Your Webhook in Dynamics 365

Step 3: Create a Plugin to Trigger the Webhook

public void Execute(IServiceProvider serviceProvider)
{
    // Retrieve the execution context and the notification service.
    var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
    var notificationService = (IServiceEndpointNotificationService)serviceProvider.GetService(typeof(IServiceEndpointNotificationService));

    // Post the execution context (which carries the Target entity) to the
    // webhook's service endpoint registration; replace the GUID with your
    // own webhook registration ID.
    notificationService.Execute(
        new EntityReference("serviceendpoint", new Guid("<your-webhook-registration-id>")),
        context);
}

Register this plugin step for your message (Create, Update, Delete) on the entity you want to monitor.

Step 4: Test Your Webhook Integration

Step 5: Security and Best Practices

Real-World Use Case

A company wants to notify its external billing system immediately when a new invoice is created in Dynamics 365. By registering a webhook triggered by the Invoice creation event, the billing system receives data instantly and processes payment without delays or manual intervention.

To conclude, webhooks offer a powerful way to build real-time, event-driven integrations with Dynamics 365, reducing latency and complexity in your integration solutions. We encourage you to start by creating a simple webhook endpoint and registering it with Dynamics 365 using plugins. This can transform how your CRM communicates with external systems. For deeper technical support and advanced integration patterns, explore CloudFronts' resources or get in touch with our experts to accelerate your Dynamics 365 integration project.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
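Referencing Step 1 above, here is a minimal sketch of a webhook receiver implemented as an HTTP-triggered Azure Function. The function name and payload handling are illustrative assumptions; any HTTPS endpoint that can accept a POST will do.

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class CrmWebhookReceiver
{
    // Dynamics 365 posts the execution context to this endpoint when the
    // registered event fires. Function-level authorization means callers
    // must present the function key, which pairs naturally with the
    // query-string authentication option used when registering webhooks.
    [FunctionName("CrmWebhookReceiver")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        string payload = await new StreamReader(req.Body).ReadToEndAsync();
        log.LogInformation("Dynamics 365 webhook payload: {payload}", payload);

        // Hand the payload off to your downstream system (billing, queue, etc.).
        return new OkResult();
    }
}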


Using OpenAI and Logic Apps to develop a Copilot agent for Elevator Pitches & Lead Qualification

In today's competitive landscape, the ability to prepare quickly and deliver relevant, high-impact sales conversations is more critical than ever. Sales teams often spend valuable time gathering case studies, reviewing past opportunities, and preparing client-specific messaging, time that could be better spent engaging prospects.

To address this, we developed "Smart Pitch", a Microsoft Teams-integrated AI Copilot designed to equip our sales professionals with instant, contextual access to case studies, opportunity data, and procedural documentation.

Challenge

Sales professionals routinely face challenges such as time-consuming research across scattered case studies, past opportunities, and client-specific collateral. These hurdles not only slow down the sales cycle but also affect the consistency and quality of conversations with prospects.

How It Works

Platform

Data Sources

CloudFronts SmartPitch pulls information from SharePoint documents, public case studies, and the opportunity table.

AI Integration

Key Features

MQL–SQL Summary Generator

Users can request an MQL–SQL document, which contains company and contact details along with similar case studies. The copilot prompts the user to provide the prospect name, contact person name, and client requirement; this is achieved via an adaptive card for better UX.

HTTP Request to Logic App

In the Logic App, we used the ChatGPT API to fetch company and client information. We extract the company location from the company information and, similarly, the industry. The result is rendered back to the custom copilot via a request to the Logic App. A Generative Answers node displays the results with proper formatting, driven by the prompt/agent instructions. The generative AI can also be instructed to directly produce formatted JSON from the parsed values (a small parsing sketch follows at the end of this post). This formatted JSON is then converted into an actual JSON object and used to populate a Liquid template for the MQL–SQL file, dynamically creating an MQL–SQL document for every searched company and contact person. This returns an HTML file with dynamically populated company and contact details, similar case studies, and past work with clients in a similar region and industry. Finally, this triggers an automatic download of the generated MQL–SQL document as a PDF file on your system.

Content Search

Users can ask questions related to case studies, opportunities, and procedural documentation. "Smart Pitch" searches SharePoint documents, public case studies, and the opportunity table to return relevant results, structured and easy to consume.

Security & Governance

Smart Pitch is integrated into Microsoft Teams, so it uses the same authentication as Teams. Access to Dataverse and SharePoint is read-only and scoped to organizational permissions.

To conclude, Smart Pitch reflects our commitment to leveraging AI to drive business outcomes. By combining Microsoft's AI ecosystem with our internal data strategy, we've created a practical and impactful sales assistant that improves productivity, accelerates deal cycles, and enhances client engagement.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
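As an illustration of the parsing step described above, here is a minimal C# sketch that pulls the location and industry out of a JSON response shaped like the one the Logic App gets back from the model. The field names are assumptions for illustration; the real shape depends on the prompt.

using System;
using System.Text.Json;

class CompanyInfoParser
{
    static void Main()
    {
        // Response shape assumed for illustration only.
        string json = @"{
            ""company"": ""Contoso AG"",
            ""location"": ""Zurich, Switzerland"",
            ""industry"": ""Laboratory Equipment""
        }";

        using JsonDocument doc = JsonDocument.Parse(json);
        JsonElement root = doc.RootElement;

        // These extracted values feed the Liquid template that renders the
        // MQL-SQL document for the searched company and contact person.
        string location = root.GetProperty("location").GetString();
        string industry = root.GetProperty("industry").GetString();
        Console.WriteLine($"Location: {location}, Industry: {industry}");
    }
}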


Automating File Transfers from Azure File Share to Blob Storage with a Function App

Efficient file management is essential for businesses leveraging Azure cloud storage. Automating file transfers between Azure File Share and Azure Blob Storage enhances scalability, reduces manual intervention, and ensures data availability. This blog provides a step-by-step guide to setting up an Azure Timer Trigger Function App to automate the transfer process.

Why Automate File Transfers?

Steps to Implement the Solution

1. Prerequisites

To follow this guide, ensure you have an Azure subscription, a storage account containing the source file share and target blob container, and a Function App development setup (C# or Python).

2. Create a Timer Trigger Function App

3. Install Required Packages

For C#: the Azure.Storage.Files.Shares and Azure.Storage.Blobs NuGet packages. For Python: the azure-storage-file-share and azure-storage-blob packages.

4. Implement the File Transfer Logic

C# implementation: a minimal sketch appears at the end of this post.

5. Deploy and Monitor the Function

To conclude, automating file transfers from Azure File Share to Blob Storage using a Timer Trigger Function streamlines operations and enhances reliability. Implementing this solution optimizes file management, improves cost efficiency, and ensures compliance with best practices. Begin automating your file transfers today! Need expert assistance? Reach out for tailored Azure solutions to enhance your workflow.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
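Here is a minimal sketch of the transfer logic referenced in step 4, written as an in-process C# Timer Trigger function. The connection setting, share name, and container name are hypothetical; error handling, paging, and subdirectory recursion are omitted for brevity.

using System;
using Azure.Storage.Blobs;
using Azure.Storage.Files.Shares;
using Azure.Storage.Files.Shares.Models;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class FileShareToBlobFunction
{
    // Runs every 5 minutes (NCRONTAB: {second} {minute} {hour} {day} {month} {day-of-week}).
    [FunctionName("FileShareToBlob")]
    public static void Run([TimerTrigger("0 */5 * * * *")] TimerInfo timer, ILogger log)
    {
        string connection = Environment.GetEnvironmentVariable("StorageConnection");

        var share = new ShareClient(connection, "source-share");                  // hypothetical name
        var container = new BlobContainerClient(connection, "target-container");  // hypothetical name
        container.CreateIfNotExists();

        ShareDirectoryClient rootDir = share.GetRootDirectoryClient();
        foreach (ShareFileItem item in rootDir.GetFilesAndDirectories())
        {
            if (item.IsDirectory) continue; // this sketch copies top-level files only

            ShareFileClient file = rootDir.GetFileClient(item.Name);
            ShareFileDownloadInfo download = file.Download();

            // Stream the file into Blob Storage, overwriting any duplicate.
            container.GetBlobClient(item.Name).Upload(download.Content, overwrite: true);
            log.LogInformation("Transferred {name} to blob storage.", item.Name);
        }
    }
}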


How to Recover Azure Function App Code

Azure Function Apps are a powerful tool for creating serverless applications, but losing the underlying code can be a stressful experience. Whether due to a missing backup, accidental deletion, or unclear deployment pipelines, the need to recover code becomes critical. Thankfully, even without backups, there are ways to retrieve and reconstruct your Azure Function App code using the right tools and techniques. In this blog, we'll guide you through a step-by-step process to recover your code, explore the use of decompilation tools, and share preventive tips to help you avoid similar challenges in the future.

Step 1: Understand Your Function App Configuration

Step 2: Retrieve the DLL File

To recover your code, you need access to the compiled assembly file (DLL). From Kudu (Advanced Tools), navigate to the site/wwwroot/bin directory where the YourFunctionApp.dll file resides, and download it (a scripted alternative appears at the end of this post).

Step 3: Decompile the DLL File

Once you have the DLL file, use a .NET decompiler to extract the source code: open the .dll file in the decompiler and run it. The decompiler I have used here is dotPeek, a free .NET decompiler.

To conclude, recovering a Function App without backups might seem daunting, but by understanding its configuration, retrieving the compiled DLL, and using decompilation tools, you can successfully reconstruct your code. To prevent such situations in the future, enable source control to integrate your Function App with GitHub or Azure DevOps, or set up backups. Please refer to our customer success story Customer Success Story – BUCHI | CloudFronts to learn more about how we used Function Apps and other Azure Integration Services (AIS) to deliver seamless integration.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
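If you would rather script the download than click through Kudu, the sketch below pulls the DLL via Kudu's VFS API. The app name, deployment credentials (from the app's publish profile), and DLL path are placeholders.

using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class KuduDllDownloader
{
    static async Task Main()
    {
        // Placeholders: your function app's SCM (Kudu) endpoint and its
        // deployment credentials from the publish profile.
        var kuduBase = "https://<your-function-app>.scm.azurewebsites.net";
        var user     = "<deployment-username>";
        var password = "<deployment-password>";

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes($"{user}:{password}")));

        // Kudu's VFS API serves files from the app's file system.
        byte[] bytes = await http.GetByteArrayAsync(
            $"{kuduBase}/api/vfs/site/wwwroot/bin/YourFunctionApp.dll");
        await File.WriteAllBytesAsync("YourFunctionApp.dll", bytes);
        Console.WriteLine($"Downloaded {bytes.Length} bytes.");
    }
}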


Understanding Azure Function Trigger Methods and Recurrence Syntax in Dynamics 365

Azure Functions are a vital component of serverless computing, offering the flexibility to run event-driven code without the need to manage infrastructure. When integrated with Dynamics 365, they provide a robust mechanism for automating processes and extending the platform's functionality. This blog explores Azure Function trigger methods and recurrence syntax, highlighting their relevance in Dynamics 365 scenarios.

Azure Function Trigger Methods

Azure Functions can be triggered by various events. These triggers determine how and when the function executes. Here are some commonly used trigger methods in Dynamics 365 integrations (code sketches for the most common ones follow at the end of this post):

1. HTTP Trigger
2. Queue Storage Trigger
3. Timer Trigger
4. Service Bus Trigger

Recurrence Syntax for Timer Triggers

Timer Triggers in Azure Functions rely on CRON expressions to define their schedule. Understanding this syntax is crucial for scheduling Dynamics 365-related tasks.

CRON Expression Format: Azure Functions uses six-field NCRONTAB expressions of the form {second} {minute} {hour} {day} {month} {day-of-week}.

Examples:

- Run daily at 2:30 AM: 0 30 2 * * *
- Run every Monday at 9:00 AM: 0 0 9 * * 1

Key Points:

Integrating Azure Functions with Dynamics 365

To integrate Azure Functions with Dynamics 365: for asynchronous processes, leverage Azure Storage Queues or Service Bus to manage workload distribution.

To conclude, Azure Functions, with their diverse trigger options, provide unmatched flexibility for extending Dynamics 365 capabilities. The recurrence syntax in Timer Triggers ensures that tasks are executed precisely when needed, enabling efficient process automation. By combining these tools, organizations can unlock the full potential of Dynamics 365 in their digital transformation journey.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
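To make the trigger methods concrete, here is a minimal sketch of the two signatures most often used with Dynamics 365: an HTTP trigger and a Timer trigger. Function names and schedules are illustrative; the NCRONTAB expressions match the examples above.

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class TriggerExamples
{
    // HTTP trigger: handy for on-demand calls from Dynamics 365 plugins,
    // webhooks, or Power Automate flows.
    [FunctionName("HttpExample")]
    public static IActionResult HttpExample(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("HTTP trigger fired.");
        return new OkObjectResult("Processed.");
    }

    // Timer trigger: "0 30 2 * * *" runs daily at 2:30 AM;
    // "0 0 9 * * 1" would run every Monday at 9:00 AM.
    [FunctionName("DailyCleanup")]
    public static void DailyCleanup(
        [TimerTrigger("0 30 2 * * *")] TimerInfo timer, ILogger log)
    {
        log.LogInformation("Daily 2:30 AM job fired.");
    }
}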


Azure Integration with Dynamics 365 Finance & Operations

Introduction: Businesses in the digital age depend on cloud platforms and ERP systems integrating seamlessly. Dynamics 365 Finance & Operations (F&O) and Azure integration is one such potent combination, enabling numerous advantages such as improved scalability, agility, and data-driven decision-making. This blog provides step-by-step instructions for connecting Azure with Dynamics 365 F&O.

Steps to achieve the goal:

Step 1: Setting up Azure Services

a. Create an Azure account: Sign up for an Azure account if you don't have one already.
b. Provision Azure resources: Set up the required Azure resources such as virtual machines, databases, storage accounts, and other services according to your needs.

Below are a few links to create an Azure account:
https://learn.microsoft.com/en-us/answers/questions/433827/how-to-get-an-azure-account-without-credit-card
https://azure.microsoft.com/en-in/free/students

Step 2: Configure Azure Active Directory (AAD)

a. Click on New on the App Registration page. Set the name and type as shown in the screenshots below.
b. Once you click OK, you will see a notification like the one below.
c. Now go to API Permissions and click Add a permission.
d. Select Dynamics ERP.
e. Select Delegated permissions.
f. Select all permissions and then click Add permissions.
g. After granting these, add permissions again on the same screen, this time selecting Application permissions.
h. Now we have to generate a client secret value: select Certificates & secrets.
i. You will see the screen below, where you can generate a new client secret.
j. Once you click New, you will see the screen below, where you can set the date until which this secret is valid. The maximum validity is 2 years.
k. This is how the secret value looks; just copy the Value.
l. Now copy the Directory ID and Application ID.

Step 3: Connect Azure Services to F&O

a. Go to Finance and Operations and search globally for Azure Active Directory/Microsoft Entra ID.
b. Then click New, add your client ID here, and set the User ID as Admin. Please note that you must have admin access rights; otherwise this won't work.

(A token-acquisition sketch using these values appears at the end of this post.)

Conclusion: Azure integration with Dynamics 365 Finance & Operations empowers businesses to streamline processes, unlock data insights, and achieve operational excellence. The next blog will cover how to call the standard APIs in Postman and perform GET and POST operations. Stay tuned! We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
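Once the app registration exists, the IDs and secret captured in Step 2 can be exchanged for a token and used to call F&O. Below is a minimal sketch using the MSAL library (Microsoft.Identity.Client); the org URL and IDs are placeholders, and the entity name in the OData call may vary by environment.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Microsoft.Identity.Client; // NuGet: Microsoft.Identity.Client

class FnoTokenDemo
{
    static async Task Main()
    {
        // Placeholders: the values copied in Step 2 (steps k and l above).
        string tenantId     = "<directory-id>";
        string clientId     = "<application-id>";
        string clientSecret = "<client-secret-value>";
        string fnoUrl       = "https://<your-org>.operations.dynamics.com";

        // Client-credentials flow: the app authenticates as itself using
        // the Application permissions granted to the registration.
        var app = ConfidentialClientApplicationBuilder.Create(clientId)
            .WithClientSecret(clientSecret)
            .WithAuthority($"https://login.microsoftonline.com/{tenantId}")
            .Build();

        AuthenticationResult token = await app
            .AcquireTokenForClient(new[] { $"{fnoUrl}/.default" })
            .ExecuteAsync();

        // Call a standard F&O OData endpoint with the bearer token.
        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", token.AccessToken);
        var response = await http.GetAsync($"{fnoUrl}/data/CustomersV3?$top=1");
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}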


Import an API and all its operations using its documentation in Azure API Management

Most of the time, you need to import an entire API collection with all of its supported operations in order to mask them or set policies on them. This can be done easily using the Azure API Management service. To start, log in to your Azure Portal, head over to your API Management resource, and go to the APIs section on the left. From the options, select OpenAPI definition. Here I will use the Pet Store API, https://petstore.swagger.io/. Go to the site where your API publishes the collection of all supported operations in JSON format; you can use either the JSON file of the collection or a link to it. In the OpenAPI specification field, put the link to your JSON collection or upload the JSON file, fill in the rest of the details, and click the Create button. You can then see the list of operations appear. Hope you enjoyed this blog!

