Tag Archives: Azure
Introduction to Azure Service Bus and Its Use Case
Introduction

Azure Service Bus is a fully managed, multi-tenant cloud messaging service that functions as a brokered messaging system. In a service-oriented architecture (SOA), application components interact through communication protocols over a network, and the Service Bus facilitates this communication. This article provides an overview of Azure Service Bus, highlighting its role in integrating systems like Microsoft Dynamics 365 CRM with third-party e-commerce platforms.

Real-World Scenario: Integrating Dynamics 365 CRM with an E-commerce Platform

Azure Service Bus is instrumental in enabling seamless interaction between Dynamics 365 CRM and external e-commerce applications, enhancing data management and operational efficiency.
– Customer Data Synchronization: Customer data from the e-commerce platform is transferred to Dynamics 365 CRM using Service Bus queues, ensuring the CRM system reflects the latest information (see the sketch below).
– Order Processing: When an order is placed, it triggers a message to Dynamics 365 CRM through Service Bus topics and subscriptions, streamlining order fulfilment and tracking.
– Inventory Management: Inventory levels are updated in real time across both systems. Messages sent through Service Bus ensure accurate stock levels, preventing overselling.
– Customer Support Integration: Customer support tickets from the e-commerce platform are channelled to Dynamics 365 CRM, providing a comprehensive view of customer interactions and improving support quality.

Use Case: Real-Time Data Synchronization Between Dynamics 365 CRM and Finance & Operations

Scenario: Imagine a company that uses Microsoft Dynamics 365 CRM for customer relationship management and Dynamics 365 Finance & Operations (F&O) for financial and inventory management. To keep data consistent and accurate across these systems, especially inventory levels, real-time data synchronization is essential.

Solution: The goal of this integration is to synchronize inventory levels between Dynamics 365 CRM and F&O in real time. The process starts in Dynamics 365 CRM, where a change in inventory, such as a sale or restocking, triggers an event. This event generates a message containing the updated inventory details, which is sent via Azure Service Bus. Azure Service Bus serves as a reliable messaging service that decouples the CRM and F&O systems, facilitating smooth communication between them.

Once the message reaches Azure Service Bus, it is picked up by an Azure Logic App. The Logic App orchestrates the integration process, potentially using Azure Functions for tasks such as data transformation, validation, or enrichment. For instance, it may convert the message into a format compatible with the F&O system, such as OData, a standard protocol for data exchange. After processing, the transformed data is sent to the F&O system, where the inventory levels are updated accordingly.

This setup keeps inventory records synchronized in real time across both systems, preventing issues like overselling by maintaining up-to-date stock levels. The combination of Azure Service Bus and Logic Apps not only supports real-time communication but also offers a scalable, flexible integration solution that can adapt to evolving business needs. Key benefits of this approach include real-time updates, fault tolerance through message persistence and retry logic, and the freedom to scale and integrate systems without tight coupling.
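To make the customer data synchronization bullet concrete, here is a minimal Python sketch of the sending side, using the azure-servicebus SDK. The connection-string environment variable, queue name, and record fields are assumptions for illustration, not details from the actual integration.

```python
# pip install azure-servicebus
import json
import os

from azure.servicebus import ServiceBusClient, ServiceBusMessage

# Assumed names for illustration: the connection string is read from an
# environment variable and the queue is called "customer-sync".
CONNECTION_STR = os.environ["SERVICEBUS_CONNECTION_STR"]
QUEUE_NAME = "customer-sync"

def publish_customer_update(customer: dict) -> None:
    """Send one customer record from the e-commerce platform to the queue."""
    with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
        with client.get_queue_sender(QUEUE_NAME) as sender:
            message = ServiceBusMessage(
                json.dumps(customer),
                content_type="application/json",
                # Using the customer id as the MessageId lets Service Bus
                # discard accidental resends when duplicate detection is on.
                message_id=str(customer["customerId"]),
            )
            sender.send_messages(message)

if __name__ == "__main__":
    publish_customer_update(
        {"customerId": 1001, "email": "jane@example.com", "status": "active"}
    )
```

Setting the customer id as the MessageId is a deliberate design choice: with duplicate detection enabled on the queue, Service Bus can drop repeated deliveries of the same update rather than writing it to the CRM twice.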
Azure Service Bus Queues, Topics, and Subscriptions

Azure Service Bus offers queues, and topics with subscriptions, as its core messaging features, enabling different messaging patterns to suit various use cases. Queues facilitate point-to-point communication, while topics and subscriptions support a publish-subscribe model (see the sketch at the end of this post). This flexibility allows for efficient data transfer and processing across applications. Stay tuned for my next post, where we’ll explore the specific scenarios in which to use queues versus topics and subscriptions.

Conclusion

Azure Service Bus provides a versatile and reliable messaging solution for building scalable, decoupled distributed applications. By integrating seamlessly with the broader Azure ecosystem, Service Bus empowers developers to create efficient communication channels, enhancing the performance and reliability of their applications. Whether you’re modernizing existing systems or developing new cloud-native applications, Azure Service Bus is an essential tool for delivering an excellent user experience.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
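As referenced in the queues-versus-topics paragraph above, here is a minimal sketch of the publish-subscribe pattern with the same azure-servicebus SDK. The topic and subscription names are assumed for illustration; in practice each downstream system (CRM, fulfilment, analytics) would own its own subscription and receive its own copy of every published message.

```python
# pip install azure-servicebus
import os

from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONNECTION_STR = os.environ["SERVICEBUS_CONNECTION_STR"]
TOPIC_NAME = "orders"            # assumed topic name
SUBSCRIPTION_NAME = "crm-sync"   # assumed subscription name

with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
    # Publish once to the topic ...
    with client.get_topic_sender(TOPIC_NAME) as sender:
        sender.send_messages(ServiceBusMessage('{"orderId": 42}'))

    # ... and each subscription receives its own copy of the message.
    with client.get_subscription_receiver(TOPIC_NAME, SUBSCRIPTION_NAME) as receiver:
        for msg in receiver.receive_messages(max_message_count=10, max_wait_time=5):
            print(str(msg))
            receiver.complete_message(msg)  # remove from the subscription
```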
Automating Access Token and Refresh Token Generation Using ADF and Azure Key Vault – Part 1
Introduction

In this blog, I will explain how we can automate generating access tokens and refresh tokens. When working with APIs, a common problem is the expiration of the access token or refresh token after some time. We solved this issue using Azure Data Factory and Azure Key Vault. Azure Key Vault stores the API credentials, as it is one of the most secure ways to store keys and secrets in Azure, and Azure Data Factory automates the process of generating access tokens for the APIs. We are dividing this blog into two parts: part 1 covers setting up Azure Key Vault and Azure Data Factory, and part 2 covers the ADF pipeline that automates access token generation.

Before we proceed, please test your API in Postman to identify what it requires to generate an access token. For me, it is a client ID, a client secret, and a refresh token.

Steps to Set Up Azure Key Vault and Azure Data Factory
– Go to the Azure portal and create a Key Vault resource. Make sure that your Key Vault and Azure Data Factory are in the same region.
– Create a secret via “Generate/Import” and enter the required details. I have already created the secrets I need.
– For the access token, you can keep the initial value as anything you want; we will update it later using an ADF pipeline.
– Set up an access policy for Azure Data Factory to access the Key Vault. To do this, go to “Access Policy” and select the appropriate options.
– Click “Next” and select the Azure Data Factory in which you will create the pipeline for refreshing the access token.
– Now, go to Azure Data Factory Studio and set up the linked service for your API.
– The dataset is also straightforward; I prefer to use a parameter for the relative URL so that I can reuse the same dataset and simply set the URL of the API I want to call at runtime.

Conclusion

That’s all for the setup in part 1. We’ve covered the essential steps to set up Azure Key Vault and Azure Data Factory for securely managing API credentials, laying the groundwork for automating access token generation. These tools provide a reliable and secure way to handle token expiration, ensuring smooth API operations without manual intervention. In part 2, we will discuss in detail how to automate access token generation using Azure Data Factory; the sketch below previews that flow.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
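As previewed in the conclusion, here is a minimal Python sketch of the flow the ADF pipeline will automate in part 2: read the stored credentials, exchange the refresh token for a new access token, and write the token back to Key Vault. The vault URL, secret names, and token endpoint are assumptions for illustration only; substitute whatever your API’s Postman test revealed.

```python
# pip install azure-identity azure-keyvault-secrets requests
import requests
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Assumed names for illustration only: vault URL, secret names, token endpoint.
VAULT_URL = "https://<your-key-vault>.vault.azure.net"
TOKEN_URL = "https://api.example.com/oauth/token"  # hypothetical API endpoint

credential = DefaultAzureCredential()
secrets = SecretClient(vault_url=VAULT_URL, credential=credential)

# Read the API credentials stored during the setup in part 1.
client_id = secrets.get_secret("api-client-id").value
client_secret = secrets.get_secret("api-client-secret").value
refresh_token = secrets.get_secret("api-refresh-token").value

# Exchange the refresh token for a fresh access token.
response = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    },
    timeout=30,
)
response.raise_for_status()

# Overwrite the placeholder access-token secret, as the ADF pipeline will do.
secrets.set_secret("api-access-token", response.json()["access_token"])
```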
BÜCHI Labortechnik AG partners with CloudFronts to renew the Contract Update – AIS Managed Services Agreement
We are delighted to announce that BÜCHI Labortechnik AG, headquartered in Switzerland and widely regarded as a global pioneer in laboratory equipment and technology solutions, has renewed its longstanding partnership with CloudFronts through a Contract Update – AIS Managed Services Agreement (MSA).

BÜCHI Labortechnik AG is one of the world’s largest solution providers for R&D, quality control and production in laboratory technology. For more than 80 years, Switzerland-headquartered BÜCHI has been delivering world-class solutions for laboratory, industrial and parallel evaporation, spray drying, melting point, preparative chromatography, extraction, distillation & digestion, Dumas, and near-infrared spectroscopy to meet the needs of customers around the globe. It operates through a network of branches, subsidiaries and affiliates spread across the globe. Learn more about them at https://www.buchi.com/en

The foundation of BÜCHI’s partnership with CloudFronts was laid with the creation of a harmonious data integration mechanism and insightful data analytics reports, achieved through the use of Microsoft Azure Integration Services (AIS) and Power BI. With this agreement, CloudFronts will focus on providing a dedicated Azure Developer and ensuring that the necessary elements and commitments are in place to provide BÜCHI Labortechnik AG with proactive monitoring, rapid issue resolution, ongoing maintenance, and support.

About CloudFronts

CloudFronts is a Dynamics 365 focused Microsoft Solutions Partner helping teams and organizations worldwide solve their complex business challenges with the Microsoft Cloud. Our head office and robust delivery centre are based out of Mumbai, India, along with branch offices in Singapore and the U.S. Since its inception in 2012, CloudFronts has successfully served over 500 small and medium-sized clients across North America, Europe, Australia, MENA, the Maldives and India, with diverse experience in sectors ranging from Professional Services and Financial Services to Manufacturing, Retail, Logistics/SCM, and Non-profits.

Please feel free to connect with us at transform@cloudfronts.com
How to implement Azure Blob Lifecycle Management Policy
Introduction

Azure Blob Storage lifecycle management allows you to manage and optimize the storage lifecycle of your data. You can define policies that automatically transition blobs to different access tiers or delete them after a specified period, which helps reduce costs and manage data efficiently. This blog shows how to set up and manage lifecycle policies.

Steps to Create a Lifecycle Management Policy

Access the Azure Portal: Sign in to your Azure account and navigate to the Azure Portal.
Navigate to your storage account:
– Go to “Storage accounts”.
– Select the storage account where you want to apply the lifecycle policy.
Configure lifecycle management:
– In the storage account menu, under the “Blob service” section, select “Lifecycle management”.
Add a rule:
– Click on “+ Add rule” to create a new lifecycle management rule.
– Provide a name for the rule.
Define filters: You can specify filters to apply the rule to a subset of blobs. Filters can be based on:
– Blob prefix (to apply the rule to blobs with a specific prefix).
– Blob type (block blobs, append blobs, page blobs).
Set actions:
– Define the actions for the rule, such as moving blobs to a cooler access tier (Cool or Archive) or deleting them after a certain number of days.
– You can trigger the action a specified number of days after the blob’s last modification date or its creation date.
Review and save:
– Review the policy settings.
– Save the policy.

Key Points to Remember
– Access tiers: Azure Blob Storage has different access tiers (Hot, Cool, Archive), and lifecycle policies help optimize costs by moving data to the appropriate tier based on its access patterns.
– JSON configuration: Policies can be defined using JSON, which provides flexibility and allows for complex rules (see the sample policy at the end of this post).
– Automation: Lifecycle management automates data management, reducing manual intervention and operational costs.

Conclusion

By setting up these policies, you can ensure that your data is stored cost-effectively while meeting your access and retention requirements.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
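As referenced in the key points above, here is a sample of the JSON a lifecycle policy boils down to, written as a Python dict for readability. The rule name, prefix, and day thresholds are illustrative assumptions: blobs under a hypothetical "logs/" prefix move to Cool after 30 days, to Archive after 90, and are deleted after 365 days since last modification.

```python
import json

# A sample lifecycle policy expressed as the JSON structure the portal accepts.
policy = {
    "rules": [
        {
            "enabled": True,
            "name": "age-out-logs",  # assumed rule name
            "type": "Lifecycle",
            "definition": {
                "filters": {
                    "blobTypes": ["blockBlob"],
                    "prefixMatch": ["logs/"],  # assumed prefix filter
                },
                "actions": {
                    "baseBlob": {
                        "tierToCool": {"daysAfterModificationGreaterThan": 30},
                        "tierToArchive": {"daysAfterModificationGreaterThan": 90},
                        "delete": {"daysAfterModificationGreaterThan": 365},
                    }
                },
            },
        }
    ]
}

print(json.dumps(policy, indent=2))
```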
Integrating Salesforce with InforLN using Azure Integration Services
Introduction

Integrating Salesforce with InforLN is a critical task for organizations looking to streamline their sales and billing processes. With the AIS interface, businesses can efficiently manage data flow between these two platforms, reducing manual effort, enhancing visibility, and improving overall organizational performance. This blog describes the integration between Salesforce and InforLN in detail. The AIS interface is intended to extract, transform, and route data from Salesforce to InforLN, and the integration steps are the same across different entities. Many organizations need Salesforce to InforLN integration for the reasons below.

Event Scenario
Pre-Requisites:
Process Steps:

On Demand Load Scenario
Pre-Requisites:
Process Steps:

Conclusion

Based on the above integration scenarios, an Azure developer can easily navigate the integration implementation and choose between the event-driven or on-demand approach based on the business requirement (a generic sketch of the transform-and-route step follows below). This integration not only simplifies complex processes but also eliminates redundant tasks, allowing teams to focus on more strategic initiatives. Whether your organization requires event-driven or on-demand integration, this guide equips you with the knowledge to implement a solution that enhances efficiency and supports your business goals.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
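The process steps above are specific to the AIS implementation, but the core extract-transform-route idea can be sketched generically. The following Python fragment shows a hypothetical transform-and-route step; the field names and InforLN endpoint are invented for illustration and are not the actual interface.

```python
# pip install requests
import requests

# Hypothetical delivery endpoint; the real InforLN interface will differ.
INFOR_LN_ENDPOINT = "https://<infor-ln-gateway>/api/salesorders"

def route_sales_order(sf_record: dict) -> None:
    """Transform a Salesforce record and route it to the target system."""
    # Transform: map Salesforce fields to the shape the target expects
    # (all field names here are illustrative assumptions).
    payload = {
        "OrderNumber": sf_record["OrderId"],
        "CustomerCode": sf_record["AccountNumber"],
        "Amount": sf_record["TotalAmount"],
    }
    # Route: deliver the transformed record.
    response = requests.post(INFOR_LN_ENDPOINT, json=payload, timeout=30)
    response.raise_for_status()
```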
Read data from Blob using Logic App
In this blog post, we will create an Azure Logic App that reads blob content from Azure Storage and responds with specific data. We will walk through the entire process, from setting up the Logic App in the Azure Portal to configuring actions and testing the workflow. This Logic App provides a seamless way to automate the retrieval and processing of data stored in Azure Blob Storage, showcasing the flexibility and power of Azure Logic Apps in building serverless workflows.

Use Cases

Data Processing Pipeline
– Scenario: A company collects data from various sources and stores it in Azure Blob Storage for processing and insights.
– Solution: Use a Logic App that triggers on new data uploads, processes the data, and sends it to downstream applications.
– Benefits: Automates data processing, reduces manual effort, and ensures timely data availability.

Configuration Management
– Scenario: An organization needs to fetch and apply configuration files from Azure Blob Storage dynamically.
– Solution: Use a Logic App to handle HTTP requests for configuration data and respond with the necessary settings.
– Benefits: Centralizes configuration management, ensuring consistency and reducing errors.

Customer Support Automation
– Scenario: A support system needs to fetch specific information from stored documents to respond to customer queries.
– Solution: Use a Logic App triggered by API queries to retrieve relevant documents from Blob Storage and send responses.
– Benefits: Automates common customer query responses, improving support efficiency.

Prerequisites

Note:
– To learn how to obtain a free Azure account, click on Azure free account to create a free trial account.
– To learn how to create an Azure Blob Storage account and container, refer to the blog: How to create: Azure Blob Storage, Container and Blob – CloudFronts

Steps to Create a Logic App in Azure

Step 1: Create a Logic App
Step 2: Fill in the necessary details:

Note:
– Consumption plan: Ideal for scenarios with unpredictable or low to moderate workloads, where you only pay for what you use.
– Standard plan: Best for high-usage, mission-critical applications that require consistent performance, dedicated resources, and enhanced development capabilities.
Choosing between the Consumption and Standard plans depends on your specific requirements regarding cost, performance, scaling, and development preferences.

Steps to Upload a File to the Blob

Create a Logic App to Read Data from Blob: Step-by-Step Guide
Step 1: Set up the Logic App designer
Step 2: Add the Blob Storage action
Step 3: Configure the Blob Storage action
Step 4: Add and configure the Response action
Step 5: Save and test the Logic App
Step 6: Test your Logic App

Conclusion

With the help of Azure Logic Apps, you can easily build automated processes that connect to a wide range of services and applications. By following this guide, you have learned how to build a Logic App that reads data from Azure Blob Storage and responds with specific information. This foundational knowledge can be expanded to create more complex workflows, offering endless possibilities for automation and integration in your business processes (for comparison, a Python sketch of the same blob read follows below).

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
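For comparison, the retrieval that the Logic App performs in Steps 2 and 3 can also be expressed in a few lines of Python with the azure-storage-blob SDK. The account, container, and blob names below are assumptions for illustration.

```python
# pip install azure-storage-blob azure-identity
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

# Assumed account, container, and blob names for illustration.
blob = BlobClient(
    account_url="https://<storageaccount>.blob.core.windows.net",
    container_name="configs",
    blob_name="settings.json",
    credential=DefaultAzureCredential(),
)

# Download the blob and decode it, the same payload the Logic App
# would return in its Response action.
content = blob.download_blob().readall().decode("utf-8")
print(content)
```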
Set Up a Log Analytics Workspace in Azure
Creating a Log Analytics Workspace in Azure is an essential step for monitoring and analyzing data from the various sources in your Azure environment. Azure Log Analytics Workspace is a powerful tool that collects and analyzes this data, providing insights that help you monitor the performance, availability, and health of your resources, which makes setting one up crucial for effective cloud management and optimization. This guide will walk you through the process, providing clear instructions and tips to help you set up your workspace efficiently.

Access the Azure Portal
– Log in: Start by logging into the Azure Portal.
– Look for Log Analytics: Type “Log Analytics Workspaces” into the top search bar and choose it from the drop-down menu.

Create a New Workspace
– Initiate creation: Click on the “Create” button to start the process.
– Resource group: To organize your resources, select an existing group or create a new one.
– Name: Enter a unique name for your Log Analytics Workspace.

Review and Create
– Review details: Check all the details you have entered to ensure they are correct.
– Create workspace: Click “Review + create,” and after validation, click “Create” to deploy your workspace.
– If you go to Logs, you will be able to query the logs (see the query sketch at the end of this post).

Benefits of using Azure Log Analytics Workspace

Practical Applications

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
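Once the workspace is deployed, you can query it programmatically as well as through the Logs blade. Here is a minimal sketch using the azure-monitor-query SDK; the workspace ID placeholder and the KQL query are illustrative.

```python
# pip install azure-monitor-query azure-identity
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

client = LogsQueryClient(DefaultAzureCredential())

# Assumed workspace ID and a simple illustrative KQL query: the five most
# frequent operations recorded in the activity log over the last day.
WORKSPACE_ID = "<your-workspace-id>"
QUERY = "AzureActivity | summarize count() by OperationNameValue | top 5 by count_"

response = client.query_workspace(WORKSPACE_ID, QUERY, timespan=timedelta(days=1))
if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for row in table.rows:
            print(row)
```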
Integrating Azure Logic Apps with Common Data Service
Integrating Azure Logic Apps with the Common Data Service (CDS) opens up a world of possibilities for automating business processes and enhancing productivity within your organization. This blog will guide you through the steps to set up this integration, explaining the benefits and practical applications along the way.

Azure Logic Apps is a powerful cloud-based service that allows you to automate workflows and integrate apps, data, and services across organizations. The Common Data Service, now known as Dataverse, provides a secure and scalable data storage solution that supports integration with various Microsoft and third-party applications. By integrating these two services, you can streamline data flow and automate complex workflows with ease.

Prerequisites
Before you begin, ensure you have the following:
– An active Azure subscription.
– Access to a Common Data Service (Dataverse) environment.
– The necessary permissions to create and manage Logic Apps and CDS.

Log in to the Azure Portal: Go to the Azure Portal.

Create a New Logic App:
– Search for “Logic Apps” in the search bar.
– Click on “Add” to create a new Logic App.
– Fill in the required details (name, resource group) and click “Create”.

Add a CDS Connector:
– Once the Logic App is created, open Workflows and add a new workflow.
– Click on “When a row is added” under the Common Data Service triggers.
– Sign in to your CDS environment and grant the necessary permissions.

Configure the Trigger:
– Select the relevant entity (Accounts) and specify the trigger condition (when a row is added).
– Then, when an account record is added, create a contact record (a sketch of the equivalent Web API call appears at the end of this post).
– Save the Logic App.

Testing Out:
Scenario – Create a new account in your Dynamics 365 / Power App. After a few moments, refresh, and we see the contact has been created and assigned to the new account.
– So, we know our Logic App has run. Now let’s look at it in the Azure portal. Under Metrics, we see the Logic App has run.

Why Integrate Azure Logic Apps with Common Data Service?

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
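As referenced in the trigger configuration above, here is a minimal sketch of the equivalent “create a contact” call against the Dataverse Web API, for readers who want to see what the Logic App action does under the hood. The environment URL, bearer token, and field values are assumptions for illustration.

```python
# pip install requests
import requests

# Assumed values for illustration: environment URL and a pre-acquired
# OAuth bearer token for the Dataverse Web API.
ENV_URL = "https://<yourorg>.crm.dynamics.com"
ACCESS_TOKEN = "<bearer-token>"

def create_contact_for_account(account_id: str, first: str, last: str) -> str:
    """Create a contact bound to the new account, mirroring the Logic App step."""
    response = requests.post(
        f"{ENV_URL}/api/data/v9.2/contacts",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0",
        },
        json={
            "firstname": first,
            "lastname": last,
            # Bind the contact's parent customer to the triggering account.
            "parentcustomerid_account@odata.bind": f"/accounts({account_id})",
        },
        timeout=30,
    )
    response.raise_for_status()
    # Dataverse returns the new record's URI in the OData-EntityId header.
    return response.headers["OData-EntityId"]
```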
BÜCHI Labortechnik AG partners with CloudFronts to renew the BI Developer Managed Services Agreement
We are delighted to announce that BÜCHI Labortechnik AG, headquartered in Switzerland and widely regarded as a global pioneer in laboratory equipment and technology solutions, has renewed its longstanding partnership with CloudFronts through a BI Developer Managed Services Agreement (MSA).

BÜCHI Labortechnik AG is one of the world’s largest solution providers for R&D, quality control and production in laboratory technology. For more than 80 years, Switzerland-headquartered BÜCHI has been delivering world-class solutions for laboratory, industrial and parallel evaporation, spray drying, melting point, preparative chromatography, extraction, distillation & digestion, Dumas, and near-infrared spectroscopy to meet the needs of customers around the globe. It operates through a network of branches, subsidiaries and affiliates spread across the globe. Learn more about them at https://www.buchi.com/en

The foundation of BÜCHI’s partnership with CloudFronts was laid with the creation of a harmonious data integration mechanism and insightful data analytics reports, achieved through the use of Microsoft Azure Integration Services (AIS) and Power BI. With this agreement, CloudFronts will focus on providing a dedicated BI Developer and ensuring that the necessary elements and commitments are in place to provide BÜCHI Labortechnik AG with proactive monitoring, rapid issue resolution, ongoing maintenance, and support. This agreement extends the previous MSA and is a testament to the quality, commitment and passion of the CloudFronts team.

About CloudFronts

CloudFronts is a Dynamics 365 focused Microsoft Solutions Partner helping teams and organizations worldwide solve their complex business challenges with the Microsoft Cloud. Our head office and robust delivery centre are based out of Mumbai, India, along with branch offices in Singapore and the U.S. Since its inception in 2012, CloudFronts has successfully served over 500 small and medium-sized clients across North America, Europe, Australia, MENA, the Maldives and India, with diverse experience in sectors ranging from Professional Services and Financial Services to Manufacturing, Retail, Logistics/SCM, and Non-profits.

Please feel free to connect with us at transform@cloudfronts.com