
Category Archives: Azure

Understanding When to Use Azure Service Bus Queues or Topics

Posted On October 14, 2024 by Tanu Prajapati

If you're finding it challenging to decide when to use Azure Service Bus Queues or Topics, this blog is for you! In our previous blog, we explored Azure Service Bus Queues, Topics, and Subscriptions. To recap, Azure Service Bus is a fully managed messaging service provided by Microsoft Azure. It helps decouple and scale applications by allowing different components to communicate with each other through messages. In this blog, we will take a deeper look at Azure Service Bus Queues vs. Topics, examining their differences, their use cases, and how to choose between them based on your application's needs. By understanding these core concepts, we'll be better equipped to design scalable and efficient messaging solutions using Azure Service Bus.

Azure Service Bus Queues vs. Topics

Service Bus Queues

Queues work on a First In, First Out (FIFO) basis. A client receives messages from the queue and processes them in the order in which they were added, and it is the only consumer that processes each message. The queue stores each message until a client is able to process it; to do so, the client pulls the message off the queue.

– Purpose: Queues are designed for point-to-point communication. They are ideal when a single consumer needs to process messages from a single sender.
– Message Handling: Messages are stored in a queue and processed by a single consumer in a first-in, first-out (FIFO) manner.
– Use Case: Best suited for scenarios where a specific task needs to be handled one at a time, for example, an order processing system where each order must be managed sequentially.

Fig – Message Queue with Messages

One of the benefits of using queues is that producers and consumers do not need to exchange messages simultaneously. Messages are stored in the queue and are processed only when the consumer retrieves them. This setup enables producers to continue sending messages to the queue independently. Consequently, components within our architecture can be decoupled, as producers and consumers are not required to synchronize their actions. If there is a high volume of messages entering the queue, we can scale out the consumers without needing to scale the producers.

Service Bus Topics

Topics differ from Queues in that, instead of working with a single consumer, we can have multiple subscribers to a topic, each of which receives its own copy of the message. This follows the publish/subscribe (pub/sub) pattern: messages are published to the topic, and multiple clients subscribe to that topic.

– Purpose: Topics are designed for publish-subscribe communication. They allow messages to be sent to a topic and processed by multiple consumers.
– Message Handling: Messages sent to a topic are delivered to multiple subscriptions. Each subscription can have its own filter and process messages independently.
– Use Case: Ideal for broadcasting messages to multiple systems. For instance, a CRM system might need to notify various departments (e.g., sales, marketing) about a new customer record.

Fig – Topic with three Subscriptions with Messages

In Topics, consumers don't consume messages directly from the Topic. Instead, we create subscriptions on the topic, and our consumers receive a copy of each message through those subscriptions. In Azure Service Bus, we can define filters on these subscriptions that determine the conditions under which messages are delivered to a subscription, as well as actions that modify the message metadata. A minimal code sketch of both patterns follows below.
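To make the two patterns concrete, here is a minimal sketch using the azure-servicebus Python SDK. It assumes a namespace connection string plus a queue named "orders", a topic named "customer-events", and a subscription named "sales" that already exist; all of these names are illustrative, not values from this post.

```python
# A minimal sketch using the azure-servicebus SDK (pip install azure-servicebus).
# The connection string, queue, topic, and subscription names are illustrative
# placeholders and assume those entities already exist in your namespace.
from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONN_STR = "<your-service-bus-connection-string>"

with ServiceBusClient.from_connection_string(CONN_STR) as client:
    # Point-to-point: send one message to a queue; a single consumer processes it.
    with client.get_queue_sender(queue_name="orders") as sender:
        sender.send_messages(ServiceBusMessage("Order #1001 placed"))

    with client.get_queue_receiver(queue_name="orders", max_wait_time=5) as receiver:
        for msg in receiver:
            print("Queue consumer received:", str(msg))
            receiver.complete_message(msg)  # remove the processed message from the queue

    # Publish-subscribe: send one message to a topic; every subscription gets a copy.
    with client.get_topic_sender(topic_name="customer-events") as sender:
        sender.send_messages(ServiceBusMessage("New customer record created"))

    with client.get_subscription_receiver(
        topic_name="customer-events", subscription_name="sales", max_wait_time=5
    ) as receiver:
        for msg in receiver:
            print("Sales subscription received:", str(msg))
            receiver.complete_message(msg)
```

Note how the queue message is consumed once, while every subscription on the topic would receive its own copy of the published message.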
Conclusion

In this post, we discussed the differences between Queues and Topics in Azure Service Bus. To summarize, Azure Service Bus Queues are ideal for point-to-point communication in which messages must be handled sequentially by a single consumer. Topics, on the other hand, are suitable for scenarios that need publish-subscribe patterns, as they enable several consumers to process the same message independently. Choosing the right option depends on your application's individual requirements, ensuring that your messaging system is both scalable and efficient. If your system requires sequential processing by a single consumer, queues are the best option. However, if your system needs to broadcast messages to several consumers, Topics will give you the necessary flexibility and scalability. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Automating Access Token and Refresh Token Generation Using ADF and Azure Key Vault – Part 2

Posted On September 11, 2024 by Deepak Chauhan

In continuation of Part 1, welcome to Part 2 of the blog on Automating Access Token and Refresh Token Generation Using ADF and Azure Key Vault. We have already completed the necessary setup in Part 1, so if you haven't read it yet, please do so before proceeding with this part.

Assumptions – Before going further, let's first discuss the assumptions we made.

Now, let's discuss the steps to create a pipeline to refresh the access token:

– Create a web activity to pull the client ID, client secret, and refresh token you created in Part 1.
– In the settings, use this setup; the URI is your Azure Key Vault secret's Secret Identifier.
– Similarly, set up web activities for the client ID, client secret, and refresh token.
– For the refresh token, I have used the setup shown below, but you may need to change it according to your API's requirements.

Body:
grant_type=refresh_token&refresh_token=@{activity('Get Refresh Token').output.value}

Authorization:
Basic @{base64(concat(activity('Get Client Id').output.value, ':', activity('Get Client Secret').output.value))}

– After this, use another web activity to refresh the access token using the refresh token and save it to Azure Key Vault.

Body:
{
  "value": "@{activity('Refresh Access Token').output.access_token}"
}

Conclusion: This blog provides a comprehensive guide to automating the access token and refresh token generation process using Azure Data Factory and Azure Key Vault. By following the steps outlined, you can ensure seamless token management, reduce manual intervention, and maintain secure access to your resources. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
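For readers who prefer to see the flow end to end, here is a rough Python sketch of the same logic the pipeline implements: read the credentials from Key Vault, exchange the refresh token for a new access token, and write the token back. The vault URL, secret names, and token endpoint are hypothetical placeholders; in ADF this work is done by the web activities described above.

```python
# A rough sketch of the flow the pipeline implements, shown in Python for clarity.
# The vault URL, secret names, and token endpoint are hypothetical placeholders;
# in ADF the equivalent calls are made by the web activities above.
import base64
import requests
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://<your-key-vault-name>.vault.azure.net"
TOKEN_URL = "https://example.com/oauth2/token"  # your API's token endpoint

secrets = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())
client_id = secrets.get_secret("client-id").value
client_secret = secrets.get_secret("client-secret").value
refresh_token = secrets.get_secret("refresh-token").value

# Exchange the refresh token for a new access token (Basic auth, as in the pipeline).
basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
resp = requests.post(
    TOKEN_URL,
    headers={
        "Authorization": f"Basic {basic}",
        "Content-Type": "application/x-www-form-urlencoded",
    },
    data={"grant_type": "refresh_token", "refresh_token": refresh_token},
)
resp.raise_for_status()
access_token = resp.json()["access_token"]

# Save the new access token back to Key Vault, mirroring the final web activity.
secrets.set_secret("access-token", access_token)
```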


Introduction to Azure Service Bus and Its Use Case

Posted On September 6, 2024 by Tanu Prajapati

Introduction

Azure Service Bus is a fully managed, multi-tenant cloud messaging service functioning as a brokered messaging system. In a service-oriented architecture (SOA), application components interact through communication protocols over a network, facilitated by the Service Bus. This article provides an overview of Azure Service Bus, highlighting its role in integrating systems like Microsoft Dynamics 365 CRM with third-party e-commerce platforms.

Real-World Scenario: Integrating Dynamics 365 CRM with an E-commerce Platform

Azure Service Bus is instrumental in enabling seamless interaction between Dynamics 365 CRM and external e-commerce applications, enhancing data management and operational efficiency.

– Customer Data Synchronization: Customer data from the e-commerce platform is transferred to Dynamics 365 CRM using Service Bus queues, ensuring the CRM system reflects the latest information.
– Order Processing: When an order is placed, it triggers a message to Dynamics 365 CRM, streamlining order fulfilment and tracking through Service Bus topics and subscriptions.
– Inventory Management: Inventory levels are updated in real time across both systems. Messages sent through Service Bus ensure accurate stock levels, preventing overselling.
– Customer Support Integration: Customer support tickets from the e-commerce platform are channelled to Dynamics 365 CRM, providing a comprehensive view of customer interactions and improving support quality.

Use Case: Real-Time Data Synchronization Between Dynamics 365 CRM and Finance & Operations

Scenario: Imagine a company that uses Microsoft Dynamics 365 CRM for customer relationship management and Dynamics 365 Finance & Operations (F&O) for financial and inventory management. To ensure consistent and accurate data across these systems, especially regarding inventory levels, real-time data synchronization is essential.

Solution: In this integration scenario, the goal is to synchronize inventory levels between Microsoft Dynamics 365 CRM and Finance & Operations (F&O) to ensure real-time accuracy. The process starts in Dynamics 365 CRM, where changes in inventory, such as sales or restocking, trigger an event. This event generates a message containing the updated inventory details, which is then sent via Azure Service Bus. Azure Service Bus serves as a reliable messaging service that decouples the CRM and F&O systems, facilitating smooth communication between them. Once the message reaches Azure Service Bus, it is picked up by an Azure Logic App. The Logic App orchestrates the integration process, potentially using Azure Functions for tasks such as data transformation, validation, or enrichment. For instance, it may convert the message into a format compatible with the F&O system, such as OData, a standard protocol for data exchange. After processing, the transformed data is sent to the F&O system, where the inventory levels are updated accordingly. This setup ensures that inventory records are synchronized in real time across both systems, preventing issues like overselling by maintaining up-to-date stock levels. The use of Azure Service Bus and Logic Apps not only supports real-time communication but also offers a scalable and flexible integration solution that can adapt to evolving business needs. Key benefits of this approach include real-time updates, fault tolerance through message persistence and retry logic, and the flexibility to scale and integrate systems without tight coupling. A small sketch of the publishing side of this flow follows below.
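As a sketch of the publishing side of this flow, the snippet below sends an inventory-update message to a Service Bus topic using the azure-servicebus SDK. The topic name, message shape, and connection string are illustrative assumptions; the Logic App, Azure Functions, and the F&O update are outside the scope of this snippet.

```python
# A small sketch of the publishing side of the inventory-sync flow, using the
# azure-servicebus SDK. The topic name, message shape, and connection string are
# illustrative assumptions; the downstream Logic App and F&O update are not shown.
import json
from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONN_STR = "<your-service-bus-connection-string>"

inventory_event = {
    "itemId": "ITEM-001",
    "warehouse": "MAIN",
    "quantityOnHand": 42,
    "source": "D365-CRM",
}

with ServiceBusClient.from_connection_string(CONN_STR) as client:
    with client.get_topic_sender(topic_name="inventory-updates") as sender:
        msg = ServiceBusMessage(
            json.dumps(inventory_event),
            content_type="application/json",
            # Application properties can be used by subscription filters,
            # e.g. to route only inventory events to the F&O subscription.
            application_properties={"eventType": "InventoryChanged"},
        )
        sender.send_messages(msg)
```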
Azure Service Bus Queues, Topics, and Subscriptions

Azure Service Bus offers Queues, Topics, and Subscriptions as core features, enabling different messaging patterns to suit various use cases. Queues facilitate point-to-point communication, while Topics and Subscriptions support a publish-subscribe model. This flexibility allows for efficient data transfer and processing across applications. Stay tuned for my next post, where we'll explore the specific scenarios in which to use queues versus topics and subscriptions.

Conclusion

Azure Service Bus provides a versatile and reliable messaging solution for building scalable, decoupled distributed applications. By integrating seamlessly with the broader Azure ecosystem, Service Bus empowers developers to create efficient communication channels, enhancing the performance and reliability of their applications. Whether you're modernizing existing systems or developing new cloud-native applications, Azure Service Bus is an essential tool for delivering an excellent user experience. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Automating Access Token and Refresh Token Generation Using ADF and Azure Key Vault – Part 1 

Posted On September 5, 2024 by Deepak Chauhan

Introduction

In this blog, I will explain how we can automate generating access tokens and refresh tokens. When working with APIs, a common problem is the expiration of the access token or refresh token after some time. We solved this issue by using Azure Data Factory and Azure Key Vault. Azure Key Vault is used for storing API credentials, as it is one of the most secure ways to store keys and secrets in Azure. Azure Data Factory is used to automate the process of generating access tokens for the APIs. We are dividing this blog into two parts.

Before we proceed, please test your API in Postman to identify the API's requirements for generating access tokens. In my case, these are the client ID, client secret, and refresh token.

Steps to Set Up Azure Key Vault and Azure Data Factory:

– Go to the Azure portal and create a Key Vault resource. Please make sure that your Key Vault and Azure Data Factory are in the same region.
– Create a secret via Generate/Import and enter the required details. I have already created the secrets I need.
– For the access token, you can keep the initial value as anything you want; we will update it later using an ADF pipeline.
– Set up an access policy for the Azure Data Factory to access our Key Vault. To do this, go to "Access Policy" and select the appropriate options.
– Click "Next" and select your Azure Data Factory, where you will be creating the pipeline for refreshing the access token.
– Now, go to Azure Data Factory Studio and set up the linked service for your API in Azure Data Factory.
– The dataset is also pretty straightforward, and I prefer to use a parameter for the relative URL so that I can reuse the same dataset and just set the URL of the API I want to call at runtime.

Conclusion

That's all for the setup in Part 1. We've covered the essential steps to set up Azure Key Vault and Azure Data Factory for securely managing API credentials, laying the groundwork for automating access token generation. These tools provide a reliable and secure way to handle token expiration, ensuring smooth API operations without manual intervention. In Part 2, we will discuss in detail how we can automate access token generation using Azure Data Factory. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
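As a complement to the portal steps above, the same secrets can also be created or updated programmatically. The sketch below uses the azure-keyvault-secrets SDK; the vault URL and the secret names and values are illustrative placeholders rather than values from this post.

```python
# A complementary sketch of the portal steps above: creating the initial secrets in
# Key Vault with the azure-keyvault-secrets SDK. The vault URL and secret names and
# values are illustrative placeholders, not values from the post.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://<your-key-vault-name>.vault.azure.net"
client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())

# The access-token value is just a placeholder; the ADF pipeline in Part 2 overwrites it.
client.set_secret("client-id", "<your-client-id>")
client.set_secret("client-secret", "<your-client-secret>")
client.set_secret("refresh-token", "<your-refresh-token>")
client.set_secret("access-token", "placeholder")

print("Secrets in vault:", [s.name for s in client.list_properties_of_secrets()])
```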


How to implement Azure Blob Lifecycle Management Policy

Introduction

Azure Blob Storage Lifecycle Management allows you to manage and optimize the storage lifecycle of your data. You can define policies that automate the transition of blobs to different access tiers or delete them after a specified period. This can help reduce costs and manage data efficiently. This blog shows how to set up and manage lifecycle policies.

Steps to Create a Lifecycle Management Policy

Access the Azure Portal: Sign in to your Azure account and navigate to the Azure Portal.

Navigate to Your Storage Account:
– Go to "Storage accounts".
– Select the storage account where you want to apply the lifecycle policy.

Configure Lifecycle Management:
– In the storage account menu, under the "Blob service" section, select "Lifecycle management".

Add a Rule:
– Click on "+ Add rule" to create a new lifecycle management rule.
– Provide a name for the rule.

Define Filters: You can specify filters to apply the rule to a subset of blobs. Filters can be based on:
– Blob prefix (to apply the rule to blobs with a specific prefix).
– Blob types (block blobs, append blobs, page blobs).

Set Actions:
– Define the actions for the rule, such as moving blobs to a cooler storage tier (Hot, Cool, Archive) or deleting them after a certain number of days.
– You can specify the number of days after the blob's last modification date or its creation date to trigger the action.

Review and Save:
– Review the policy settings.
– Save the policy.

Key Points to Remember
– Access Tiers: Azure Blob Storage has different access tiers (Hot, Cool, Archive), and lifecycle policies help optimize costs by moving data to the appropriate tier based on its access patterns.
– JSON Configuration: Policies can be defined using JSON, which provides flexibility and allows for complex rules; a sample rule is shown below.
– Automation: Lifecycle management helps automate data management, reducing manual intervention and operational costs.

Conclusion

By setting up these policies, you can ensure that your data is stored cost-effectively while meeting your access and retention requirements. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
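For reference, here is what such a rule can look like when expressed in the policy's JSON structure, written as a Python dict and applied with the azure-mgmt-storage SDK. The resource group, storage account, blob prefix, and day thresholds are illustrative assumptions.

```python
# A sample lifecycle rule expressed in the policy's JSON structure (as a Python dict)
# and applied with the azure-mgmt-storage SDK. Resource group, account name, prefix,
# and day thresholds are illustrative assumptions.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"
RESOURCE_GROUP = "<your-resource-group>"
ACCOUNT_NAME = "<your-storage-account>"

policy = {
    "policy": {
        "rules": [
            {
                "enabled": True,
                "name": "archive-old-logs",
                "type": "Lifecycle",
                "definition": {
                    "filters": {"blobTypes": ["blockBlob"], "prefixMatch": ["logs/"]},
                    "actions": {
                        "baseBlob": {
                            # Move to Cool after 30 days, Archive after 90, delete after 365.
                            "tierToCool": {"daysAfterModificationGreaterThan": 30},
                            "tierToArchive": {"daysAfterModificationGreaterThan": 90},
                            "delete": {"daysAfterModificationGreaterThan": 365},
                        }
                    },
                },
            }
        ]
    }
}

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
# The management policy name is "default"; this call creates or updates the policy.
client.management_policies.create_or_update(RESOURCE_GROUP, ACCOUNT_NAME, "default", policy)
```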


Integrating Salesforce with InforLN using Azure Integration Services

Introduction

Integrating Salesforce with InforLN is a critical task for organizations looking to streamline their sales and billing processes. With the AIS interface, businesses can efficiently manage data flow between these two platforms, reducing manual effort, enhancing visibility, and improving overall organizational performance. This blog shows detailed information for the integration between Salesforce and InforLN. The AIS interface is intended to extract, transform, and route data from Salesforce to InforLN. The steps for the integration would be the same for different entities. Many organizations need Salesforce to InforLN integration for the reasons below:

Event Scenario

Pre-Requisites:

Process Steps:

On-Demand Load Scenario

Pre-Requisites:

Process Steps:

Conclusion

Based on the above integration scenarios, an Azure developer can easily navigate the integration implementation and choose between event-driven or on-demand integration based on the business requirement. This integration not only simplifies complex processes but also eliminates redundant tasks, allowing teams to focus on more strategic initiatives. Whether your organization requires event-driven or on-demand integration, this guide equips you with the knowledge to implement a solution that enhances efficiency and supports your business goals. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Read data from Blob using Logic App 

Posted On August 13, 2024 by Bhavika Shetty

In this blog post, we are going to create an Azure Logic App that reads blob content from Azure Storage and responds with specific data. We will walk through the entire process, from setting up the Logic App in the Azure Portal to configuring actions and testing the workflow. This Logic App provides a seamless way to automate the retrieval and processing of data stored in Azure Blob Storage, showcasing the flexibility and power of Azure Logic Apps in building serverless workflows.

Use Cases

Data Processing Pipeline
– Scenario: A company collects data from various sources and stores it in Azure Blob Storage for processing and insights.
– Solution: Use a Logic App to trigger on new data uploads, process the data, and send it to downstream applications.
– Benefits: Automates data processing, reduces manual effort, and ensures timely data availability.

Configuration Management
– Scenario: An organization needs to fetch and apply configuration files from Azure Blob Storage dynamically.
– Solution: Use a Logic App to handle HTTP requests for configuration data and respond with the necessary settings.
– Benefits: Centralizes configuration management, ensuring consistency and reducing errors.

Customer Support Automation
– Scenario: A support system needs to fetch specific information from stored documents to respond to customer queries.
– Solution: Use a Logic App to trigger on API queries, retrieve relevant documents from Blob Storage, and send responses.
– Benefits: Automates common customer query responses, improving support efficiency.

Prerequisites

Note:
– To learn more about how to obtain a free Azure account, click on Azure free account to create a Free Trial account.
– To learn how to create an Azure Blob Storage account and container, refer to the blog: How to create: Azure Blob Storage, Container and Blob – CloudFronts

Steps to Create a Logic App in Azure

Step 1: Create a Logic App

Step 2: Fill in the necessary details.

Note:
– Consumption Plan: Ideal for scenarios with unpredictable or low to moderate workloads, where you only pay for what you use.
– Standard Plan: Best for high-usage, mission-critical applications that require consistent performance, dedicated resources, and enhanced development capabilities.
Choosing between the Consumption and Standard plans depends on your specific requirements regarding cost, performance, scaling, and development preferences.

Steps to Upload a File to the Blob

Create a Logic App to Read Data from Blob: Step-by-Step Guide

Step 1: Set Up Logic App Designer
Step 2: Add Blob Storage Action
Step 3: Configure Blob Storage Action
Step 4: Add Response Action & Configure
Step 5: Save and Test the Logic App
Step 6: Test your Logic App

Conclusion

With the help of Azure Logic Apps, you can easily build automated processes that connect to a wide range of services and applications. By following this guide, you have learned how to build a Logic App that reads data from Azure Blob Storage and responds with specific information. This foundational knowledge can be expanded to create more complex workflows, offering endless possibilities for automation and integration in your business processes. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
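For comparison only, the retrieval step the Logic App performs ("Get blob content") can also be expressed in code with the azure-storage-blob SDK. This is not the Logic App workflow definition itself; the connection string, container, and blob names below are illustrative.

```python
# For comparison only: the retrieval step the Logic App performs ("Get blob content")
# sketched with the azure-storage-blob SDK. This is not the Logic App workflow
# definition; the connection string, container, and blob names are illustrative.
from azure.storage.blob import BlobServiceClient

CONN_STR = "<your-storage-connection-string>"

service = BlobServiceClient.from_connection_string(CONN_STR)
blob_client = service.get_blob_client(container="demo-container", blob="sample.json")

# Download the blob and decode it, ready to be returned in an HTTP response.
content = blob_client.download_blob().readall().decode("utf-8")
print("Blob content:", content)
```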


Set Up a Log Analytics Workspace in Azure

Posted On August 12, 2024 by Richie Jacob

Creating a Log Analytics Workspace in Azure is an essential step for monitoring and analyzing data from various sources within your Azure environment. This guide will walk you through the process, providing clear instructions and tips to help you set up your workspace efficiently. Azure Log Analytics Workspace is a powerful tool that allows you to collect and analyze data from various sources within your Azure environment. It provides insights that help you monitor the performance, availability, and health of your resources. Setting up a Log Analytics Workspace is crucial for effective cloud management and optimization.

Access the Azure Portal
– Log In: Start by logging into the Azure Portal.
– Look for Log Analytics: Type "Log Analytics Workspaces" into the top search bar and choose it from the drop-down menu.

Create a New Workspace
– Initiate Creation: Click on the "Create" button to start the process.
– Resource Group: Select an existing resource group or create a new one to organize your resources.
– Name: Enter a unique name for your Log Analytics Workspace.

Review and Create
– Review Details: Check all the details you have entered to ensure they are correct.
– Create Workspace: Click "Review + create," and after validation, click "Create" to deploy your workspace.
– If you go to Logs, you will have the ability to query the logs, as sketched below.

Benefits of using Azure Log Analytics Workspace

Practical Applications

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
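Once the workspace is deployed, you can query it from code as well as from the Logs blade. The sketch below uses the azure-monitor-query SDK; the workspace ID and the KQL query (the Heartbeat table) are illustrative assumptions.

```python
# A sketch of querying the new workspace from code with the azure-monitor-query SDK,
# as an alternative to the Logs blade in the portal. The workspace ID and the KQL
# query (the Heartbeat table) are illustrative assumptions.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "<your-log-analytics-workspace-id>"

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(
    workspace_id=WORKSPACE_ID,
    query="Heartbeat | summarize count() by Computer | top 5 by count_",
    timespan=timedelta(days=1),
)

# Print each returned row; tables/rows mirror the result grid shown in the portal.
for table in response.tables:
    for row in table.rows:
        print(row)
```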


Integrating Azure Logic Apps with Common Data Service

Integrating Azure Logic Apps with the Common Data Service (CDS) opens up a world of possibilities for automating business processes and enhancing productivity within your organization. This blog will guide you through the steps to set up this integration, explaining the benefits and practical applications along the way. Azure Logic Apps is a powerful cloud-based service that allows you to automate workflows and integrate apps, data, and services across organizations. The Common Data Service, now known as Dataverse, provides a secure and scalable data storage solution that supports integration with various Microsoft and third-party applications. By integrating these two services, you can streamline data flow and automate complex workflows with ease.

Prerequisites

Before you begin, ensure you have the following:
– An active Azure subscription.
– Access to the Common Data Service (Dataverse) environment.
– The necessary permissions to create and manage Logic Apps and CDS.

Log in to the Azure Portal: Go to the Azure Portal.

Create a New Logic App:
– Search for "Logic Apps" in the search bar.
– Click on "Add" to create a new Logic App.
– Fill in the required details (name, resource group) and click "Create".

Add a CDS Connector:
– Once the Logic App is created, open the Workflow section and add one.
– Click on "When a row is added" under the Common Data Service triggers.
– Sign in to your CDS environment and grant the necessary permissions.

Configure the Trigger:
– Select the relevant entity – Accounts – and specify the trigger condition – When a row is added.
– Accordingly, when an account record is added, create a Contact record (the equivalent Dataverse Web API call is sketched below).
– Save the Logic App.

Testing It Out:
Scenario – Create a new account in your Dynamics 365 / Power App. After a few moments, refresh and we see the contact has been created and assigned to the new Account.
– So, we know our Logic App has run. Now let's look at it in the Azure portal. Under Metrics, we see the Logic App has run.

Why Integrate Azure Logic Apps with Common Data Service?

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
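For reference, the create step the Logic App performs can also be expressed as a direct Dataverse Web API call. The environment URL, account GUID, and the way the access token is acquired are hypothetical placeholders in the sketch below.

```python
# For reference: the "create a Contact linked to an Account" step that the Logic App
# performs, expressed as a direct Dataverse Web API call. The environment URL,
# account GUID, and the access token acquisition are hypothetical placeholders.
import requests

ENV_URL = "https://<your-org>.api.crm.dynamics.com"
ACCESS_TOKEN = "<bearer-token-for-dataverse>"  # e.g. acquired via MSAL
ACCOUNT_ID = "00000000-0000-0000-0000-000000000000"  # GUID of the new account

contact = {
    "firstname": "Primary",
    "lastname": "Contact",
    # Bind the contact to its parent account using the OData bind annotation.
    "parentcustomerid_account@odata.bind": f"/accounts({ACCOUNT_ID})",
}

resp = requests.post(
    f"{ENV_URL}/api/data/v9.2/contacts",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        "Content-Type": "application/json",
    },
    json=contact,
)
resp.raise_for_status()
# Dataverse returns the URL of the created record in the OData-EntityId header.
print("Created:", resp.headers.get("OData-EntityId"))
```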


Azure Integration with Dynamics 365 Finance & Operations

Introduction:

Businesses in the digital age depend on cloud platforms and ERP systems integrating seamlessly. Dynamics 365 Finance & Operations (F&O) and Azure integration is one such potent combination. Numerous advantages, such as improved scalability, agility, and data-driven decision-making capabilities, are made possible by this integration. Step-by-step instructions for connecting Azure with Dynamics 365 F&O are provided in this blog.

Steps to achieve the goal:

Step 1: Setting up Azure Services
a. Create an Azure account: Sign up for an Azure account if you don't have one already.
b. Provision Azure resources: Set up the required Azure resources such as virtual machines, databases, storage accounts, and other services according to your needs.
Below are a few links for creating an Azure account:
https://learn.microsoft.com/en-us/answers/questions/433827/how-to-get-an-azure-account-without-credit-card
https://azure.microsoft.com/en-in/free/students

Step 2: Configure Azure Active Directory (AAD)
a. Click on New on the App Registration page. Set the name and the type as shown in the screenshots below.
b. Once you click on the OK button, you will get a notification like the one below.
c. Now go to API Permissions and click on Add a permission.
d. Select Dynamics ERP.
e. Select Delegated Permissions.
f. Select all permissions and then click on Add Permissions.
g. After selecting these permissions, add permissions again on the same screen, this time selecting Application Permissions.
h. Now we have to generate a client secret value. Select Certificates & secrets.
i. You will see the screen below, where you can generate a new client secret.
j. Once you click on New, you will see the screen below, where you can set the date until which this secret key will be valid. The maximum validity is 2 years.
k. This is how the secret value will look; just copy the Value.
l. Now copy the Directory ID and Application ID.

Step 3: Connect Azure Services to F&O
a. Go to Finance and Operations and search globally for Azure Active Directory/Microsoft Entra ID.
b. Then click on New, add your client ID here, and set the User ID as Admin. Please note that you need admin access rights; otherwise, this won't work.

Conclusion:

Azure integration with Dynamics 365 Finance & Operations empowers businesses to streamline processes, unlock data insights, and achieve operational excellence. The next blog will cover how to connect to the standard API in Postman and perform GET and POST operations. Stay tuned! We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
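As a follow-up to the app registration above, the sketch below shows one way to use it: acquiring a client-credentials token with MSAL for Python and calling a standard F&O OData entity. The tenant ID, client ID and secret, environment URL, and the entity queried are placeholders, not values from this post.

```python
# A hedged sketch of using the app registration above to call the F&O OData API with
# a client-credentials token (MSAL for Python). Tenant ID, client ID/secret, and the
# F&O environment URL are placeholders; the data entity queried is illustrative.
import msal
import requests

TENANT_ID = "<directory-(tenant)-id>"
CLIENT_ID = "<application-(client)-id>"
CLIENT_SECRET = "<client-secret-value>"
FNO_URL = "https://<your-environment>.operations.dynamics.com"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=[f"{FNO_URL}/.default"])

# Call a standard OData entity; this is the same request you would later test in Postman.
resp = requests.get(
    f"{FNO_URL}/data/CustomersV3?$top=1",
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
resp.raise_for_status()
print(resp.json())
```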

