
Category Archives: Azure

Creating and Accessing Blob Storage with Azure Data Factory: A Complete Guide

Introduction: This guide walks you through creating and accessing Azure Blob Storage and integrating it with Azure Data Factory (ADF) to automate data pipelines. From setting up a storage account and managing containers to configuring pipelines and transferring data to an Azure SQL Database, this step-by-step tutorial gives you a comprehensive understanding of the process.

Steps:
1. In the Azure Portal, go to Storage accounts and click + Create to initiate the creation of a new storage account.
2. Fill in the required fields such as subscription, resource group, and region, and review all the settings before proceeding.
3. Create the storage account.
4. Once the storage account is created, go to the resource by clicking Go to Resource.
5. In the storage account, navigate to the Containers section and click + Container to create a new container for storing your files.
6. Click on the container you just created to access its contents.
7. Upload the desired JSON file into the container by clicking Upload and selecting the file from your local system.
8. Confirm that the uploaded file is now listed in the container.
9. Go back to the Azure Portal and search for Azure Data Factory to open the ADF service.
10. From the ADF home screen, go to Author > Datasets and click + New Dataset to create a new dataset for your Blob Storage.
11. Select the Azure Blob Storage dataset type, since you are working with data stored in Blob Storage.
12. Choose the data format that matches the file you uploaded, such as JSON, and click Continue.
13. Enter the necessary details for your dataset, including the file path and format settings. Select the appropriate authentication type, specify the storage account where the Blob Storage resides, and click Create to finalize the dataset creation.
14. Verify the settings and click OK to confirm the dataset configuration.
15. Navigate to the Pipelines section and click + New Pipeline to create a pipeline that will define your data flow.
16. The pipeline is created successfully.
17. In the pipeline, select the dataset type as Azure SQL Database and click Continue to set up the SQL Database dataset.
18. Provide the necessary linked service details for your SQL database and click Create.
19. After configuring the source dataset, the target dataset, and the pipeline, publish all the elements to save your work.
20. Once the pipeline runs successfully, verify its functionality by querying the destination database to ensure data is being transferred properly:
a. Go to the SQL Database service and select the relevant database.
b. Select the database on which you want to run the query.
c. Log in with your credentials.
d. Write a simple test query to verify that data has been transferred from Blob Storage to the SQL Database, execute it, and confirm that the expected output is returned.

Conclusion: Integrating Azure Blob Storage with Azure Data Factory is a powerful way to manage and automate data workflows in the cloud. This guide walked you through creating a storage account, configuring containers, uploading data, and designing a pipeline to process and transfer data to an Azure SQL Database. By following these steps, you can efficiently handle large-scale data integration and ensure seamless communication between your data sources and destinations. Azure Data Factory not only simplifies the process of orchestrating data pipelines but also provides robust options for monitoring and optimizing workflows.
Whether you are managing JSON files, processing transactional data, or setting up complex ETL processes, Azure’s ecosystem offers a reliable and scalable solution. Start exploring these tools today to unlock new possibilities in data-driven operations! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
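
To make the pipeline portion of the walkthrough more concrete, here is a minimal sketch of the kind of Copy activity that sits behind such a pipeline in Azure Data Factory's JSON (code) view. The pipeline, activity, and dataset names are hypothetical placeholders, and the exact JSON your pipeline produces will depend on the options you pick in the designer.

```json
{
  "name": "CopyBlobJsonToSqlPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyJsonToSql",
        "type": "Copy",
        "inputs": [
          { "referenceName": "BlobJsonDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "AzureSqlTableDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "JsonSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

In practice you would create the two datasets in the designer as described in the steps above, and ADF fills in the type properties for you; the snippet is only meant to show how the Blob source dataset and the SQL sink dataset hang together inside a single Copy activity.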


Connecting Application Insights Logs and Query Through Logic Apps

Application Insights is a powerful monitoring tool within Azure that provides insights into application performance and diagnostics. Logic Apps, on the other hand, enable workflow automation for integrating various Azure services. By combining these tools, you can automate querying Application Insights logs and take actions based on the results. This blog explains how to set up this connection step by step.

Prerequisites
Before proceeding, ensure you have the following:

Step 1: Enable Logs in Application Insights
To ensure Application Insights data is accessible:

Step 2: Create a KQL Query
KQL (Kusto Query Language) is used to query Application Insights logs:

Step 3: Set Up a Logic App
Create a Logic App that will query Application Insights:

Step 4: Configure Logic App Actions
To execute and process the query:
2. Add a body for the request:
```json
{
  "query": "traces | where timestamp >= ago(1h) | summarize Count=count() by severityLevel"
}
```
3. Add actions to handle the response, such as sending an email or creating an alert based on the query results.

Step 5: Test the Workflow

Use Cases

Conclusion
Integrating Application Insights logs with Logic Apps is a straightforward way to automate log queries and responses. By leveraging the power of KQL and Azure's automation capabilities, you can create robust workflows that monitor and react to your application's performance metrics in real time. Explore these steps to maximize the synergy between Application Insights and Logic Apps for a more proactive and automated approach to application monitoring and management. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
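
For Step 4, one common pattern (a sketch, not necessarily the exact action used in the original walkthrough) is to call the Application Insights REST API from an HTTP action and pass the KQL query in the body. The snippet below shows what that looks like in the workflow's code view; the action name is a hypothetical placeholder, and the app ID and API key are values you would take from your own Application Insights resource.

```json
{
  "Query_Application_Insights": {
    "type": "Http",
    "inputs": {
      "method": "POST",
      "uri": "https://api.applicationinsights.io/v1/apps/<your-app-id>/query",
      "headers": {
        "Content-Type": "application/json",
        "x-api-key": "<your-api-key>"
      },
      "body": {
        "query": "traces | where timestamp >= ago(1h) | summarize Count=count() by severityLevel"
      }
    },
    "runAfter": {}
  }
}
```

The response comes back as tables of columns and rows that the follow-up actions (email, alert) can parse; if you prefer a managed connector over a raw HTTP call, the Azure Monitor Logs connector is an alternative way to run the same KQL.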


Building Real-Time Dashboards with Azure Stream Analytics and Power BI

Real-time dashboards are essential for monitoring live data and gaining instant insights into business operations. Azure Stream Analytics and Power BI provide an efficient way to process and visualize streaming data. In this blog, we will walk through the steps to build a real-time dashboard using these tools.

Why Real-Time Dashboards Are Needed
In today's fast-paced world, businesses need to make decisions quickly based on live data. Real-time dashboards enable organizations to:

Use Cases for Real-Time Dashboards
Real-time dashboards can be applied across various industries, including:

Prerequisites
Before we begin, ensure you have the following:

Step 1: Set Up Your Data Source
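
A typical data source for this kind of dashboard is a streaming input such as an Azure Event Hub or IoT Hub. Once an input and a Power BI output are configured on the Stream Analytics job, the heart of the job is its query. The following is a minimal, hypothetical sketch of such a query; the input and output aliases and the field names are placeholder assumptions and will differ in your own job.

```sql
-- Hypothetical Stream Analytics query: read telemetry from an Event Hub input,
-- aggregate it into 30-second tumbling windows, and write the results to a
-- Power BI output. The aliases [eventhub-input] / [powerbi-output] and the
-- columns deviceId, temperature, eventTime are placeholders.
SELECT
    System.Timestamp() AS WindowEnd,
    deviceId,
    AVG(temperature) AS AvgTemperature,
    COUNT(*) AS EventCount
INTO
    [powerbi-output]
FROM
    [eventhub-input] TIMESTAMP BY eventTime
GROUP BY
    deviceId,
    TumblingWindow(second, 30)
```

On the Power BI side, the output then surfaces as a streaming dataset that dashboard tiles can be pinned to, refreshing as each new window is emitted.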


BizTalk vs. Azure Logic Apps: Choosing the Right Integration Platform

Integration platforms are critical to modern business operations, allowing different applications, data, and systems to communicate effectively. While both BizTalk Server and Azure Logic Apps serve the purpose of integration, they cater to different needs and scenarios. In this blog, we'll compare BizTalk and Azure Logic Apps, helping you choose the right platform for your business.

Key Differences Between BizTalk and Azure Logic Apps:

When to Choose BizTalk Server:

When to Choose Azure Logic Apps:

Conclusion and CTA:
In conclusion, BizTalk Server and Azure Logic Apps cater to different integration needs. While BizTalk excels in enterprise-grade, on-premises scenarios, Azure Logic Apps shines in cloud-native, modern workflows. Choosing the right platform depends on your organization's integration requirements, scalability goals, and budget.

CTA: If you're still unsure which platform aligns best with your needs, our team of integration experts can help. Contact us for a detailed assessment and tailored recommendations for your business integration journey. Let's streamline your operations and drive growth together.


Streamlining Build Pipelines with YAML Template Extension: A Practical Guide

In modern development workflows, maintaining consistency across build pipelines is crucial. A well-organized build process ensures reliability and minimizes repetitive configuration. For developers using YAML-based pipelines (e.g., Azure DevOps or GitHub Actions), template extension is a powerful approach to achieve this. This blog explores how to use YAML templates effectively to manage build stages for multiple functions in your project.

What is Template Extension in YAML?
Template extension allows you to define reusable configurations in one place and extend them for specific use cases. Instead of repeating the same build steps for every function or service, you can create a single template with customizable parameters.

Why Use Templates in Build Pipelines?
– Scalability: Add new services or functions without duplicating code.
– Maintainability: Update logic in one place instead of modifying multiple files.
– Consistency: Ensure uniform processes across different builds.

Step-by-Step Implementation
Here's how you can set up a build pipeline using template extension.

1. Create a Reusable Template
A template defines the common steps in your build process. For example, consider the following file named buildsteps-template.yml:

```yaml
parameters:
  - name: buildSteps   # the name of the parameter is buildSteps
    type: stepList     # data type is StepList
    default: []        # default value of buildSteps

stages:
  - stage: secure_buildstage
    pool:
      name: Azure Pipelines
      demands:
        - Agent.Name -equals Azure Pipelines x
    jobs:
      - job:
        steps:
          - task: UseDotNet@2
            inputs:
              packageType: 'sdk'
              version: '8.x'
              performMultiLevelLookup: true
          - ${{ each step in parameters.buildSteps }}:
              - ${{ each pair in step }}:
                  ${{ pair.key }}: ${{ pair.value }}
```

2. Reference the Template in the Main Pipeline
This is your main pipeline file:

```yaml
trigger:
  branches:
    include:
      - TEST {Branch name}
  paths:
    include:
      - {Repository Name}/{Function Name}

variables:
  buildConfiguration: 'Release'

extends:
  template: ..\buildsteps-template.yml   # {Template file name}
  parameters:
    buildSteps:
      - script: dotnet build {Repository Name}/{Function Name}/{Function Name}.csproj --output build_output --configuration $(buildConfiguration)
        displayName: 'Build {Function Name} Project'
      - script: dotnet publish {Repository Name}/{Function Name}/{Function Name}.csproj --output $(build.artifactstagingdirectory)/publish_output --configuration $(buildConfiguration)
        displayName: 'Publish {Function Name} Project'
      - script: (cd $(build.artifactstagingdirectory)/publish_output && zip -r {Function Name}.zip .)
        displayName: 'Zip Files'
      - script: echo "##vso[artifact.upload artifactname={Function Name}]$(build.artifactstagingdirectory)/publish_output/{Function Name}.zip"
        displayName: 'Publish Artifact: {Function Name}'
        condition: succeeded()
```

Benefits in Action
1. Simplified Updates: When you need to modify the build process (e.g., change the .NET SDK version), you only update the template file. The changes automatically apply to all functions.
2. Customization: Each function can have its own build configuration without duplicating the pipeline logic.
3. Improved Collaboration: By centralizing common configurations, teams can work independently on their functions while adhering to the same build standards.

Best Practices

Final Thoughts
YAML template extension is a game-changer for developers managing multiple services or functions in a project. It simplifies pipeline creation, reduces duplication, and enhances scalability.
By adopting this approach, you can focus on building great software while your pipelines handle the heavy lifting. If you haven’t already, try applying template extension in your next project—it’s a small investment with a big payoff. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Real-Life Use Case of CRUD Operations with Postman and Azure Logic Apps 

Posted On November 18, 2024 by Bhavika Shetty

Having a robust Customer Relationship Management (CRM) system is crucial for managing customer data and interactions effectively. One way to enhance your CRM capabilities is through seamless integration with Azure Logic Apps, allowing for efficient CRUD (Create, Read, Update, Delete) operations via OData endpoints. In this blog post, we'll dive into a real-life business use case that demonstrates how to perform CRUD operations on a CRM system using Postman and Azure Logic Apps.

What Are CRUD Operations?
CRUD operations form the backbone of any data-driven application. They enable you to:

The Setup: Using Postman for API Requests
Postman is an incredibly useful tool for testing APIs, and in our case, it will help us interact with our CRM's OData endpoints. Before we begin, ensure that you have the necessary API access and permissions set up.

Creating a New Record in CRM
Step 1: Prepare Your Request
To create a new record, you'll need to set up a POST request in Postman. Here's how to do it:
Step 2: Set the Request Body
In the body of your POST request, include the necessary details for the new record. For example, if you're creating a customer record, it might look something like the sample bodies sketched at the end of this post.
Step 3: Send the Request
Hit the Send button. You should receive a response containing the payload of the newly created entry (e.g., CustomersV3).
Step 4: Verify Creation in CRM
Next, navigate to your CRM dashboard to verify that the new customer entry has been successfully created.

Updating an Existing Record
Step 1: Prepare Your Update Request
To update an existing record, you'll be sending a PATCH or PUT request. Here's how to set it up in Postman:
Step 2: Set the Request Body
Include the changes you wish to make in the request body, for example, updating John Doe's phone number (again, see the sketch at the end of this post).
Step 3: Send the Request
Once you send the request, you should see a response indicating the payload of the updated account.
Step 4: Verify Update in CRM
Check your CRM to confirm that the changes were applied correctly.

Future Topics: Logic App Creation
In our next blog, we'll dive deeper into the creation of Azure Logic Apps and how they can automate these CRUD operations further, enhancing your CRM's functionality. We'll cover:
– Setting up triggers and actions within Azure Logic Apps.
– Automating data flow between systems.
– Best practices for managing CRM data efficiently.

Conclusion
By leveraging Postman for CRUD operations and integrating with Azure Logic Apps, businesses can significantly enhance their CRM capabilities, streamline operations, and ensure that their customer data remains accurate and accessible. Stay tuned for our upcoming blog, where we'll explore how to create Azure Logic Apps to automate these processes, making your CRM experience even more efficient.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
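
As a reference for Step 2 of each flow above, here is a hedged sketch of what the create (POST) and update (PATCH) payloads can look like against a Dynamics 365 Finance and Operations OData endpoint such as CustomersV3. The base URL, entity key, and field names are illustrative assumptions only; the exact schema depends on your environment and the entity you target, and the two requests are grouped into one JSON document purely for illustration.

```json
{
  "create_request": {
    "method": "POST",
    "url": "https://<your-environment>.operations.dynamics.com/data/CustomersV3",
    "body": {
      "dataAreaId": "usmf",
      "CustomerAccount": "CUST-0001",
      "OrganizationName": "John Doe",
      "SalesCurrencyCode": "USD",
      "CustomerGroupId": "10"
    }
  },
  "update_request": {
    "method": "PATCH",
    "url": "https://<your-environment>.operations.dynamics.com/data/CustomersV3(dataAreaId='usmf',CustomerAccount='CUST-0001')",
    "body": {
      "PrimaryContactPhone": "+1-555-010-0000"
    }
  }
}
```

In Postman, only the inner body objects are sent as the raw JSON payload; the method and URL go into the request line, and authentication (typically an Azure AD bearer token) is configured separately on the request or collection.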


Sending and Receiving Messages from Azure Service Bus Using Logic Apps

Azure Service Bus, paired with Logic Apps, offers a powerful combination for sending, receiving, and managing messages between different applications and services. In this blog, we'll walk through the process of sending and receiving messages using Azure Service Bus and Logic Apps.

Steps to send and receive messages from Service Bus using a Logic App

Step 1: Create an Azure Service Bus Namespace
– Navigate to the Azure Portal: go to portal.azure.com and log in with your credentials.
– Create a Service Bus namespace: in the search bar at the top, type "Service Bus" and select Service Bus from the results. Click + Create to start the creation process, fill in the required details, then click Review + Create and Create to deploy the namespace.

Step 2: Create a Queue or Topic in the Service Bus Namespace
– Access the Service Bus namespace: after the namespace is deployed, navigate to it by clicking on the resource in the portal.
– Create a Queue or a Topic depending on your use case; in this walkthrough we will create a queue.

Step 3: Create a Logic App to Send Messages to the Service Bus
– Navigate to Logic Apps: in the Azure portal, use the search bar to find and select Logic Apps, then click + Create to start a new Logic App.
– Configure your Logic App: in the Basics tab, provide the required details, then click Review + Create and Create.
– Design the Logic App: once the Logic App is created, open the Logic Apps Designer and add the trigger "When a HTTP request is received" with the POST method.
– Add a Compose action and pass the input parameters.
– Go to the Service Bus namespace -> Shared access policies -> copy the connection string.
– Add the Service Bus Send Message action and paste the copied connection string.
– Pass the output of the Compose action as the message content.
– Add a Response action and save the Logic App workflow.
– Copy the URL from the trigger, paste it into Postman, and call the URL.
– As soon as you call the URL, you will get the customer ID as a response in the Postman body.
– Go back to the Azure portal and check the run history; you will see that the date and status have been added for that particular customer ID.
– Now let's verify whether the message has actually reached the queue.
– Go to the queue (in this case the queue name is "receivingqueue") -> Service Bus Explorer -> click Peek from Start.
– To see the content of the message, select its sequence number.

Step 4: Create a Logic App to Receive Messages from the Service Bus
– Create a new Logic App by repeating the steps above, then open the Logic App Designer.
– Add the trigger "When a message is received in a queue".
– Add a Compose action.
– Add a Terminate action set to Succeeded.
– To verify, check the run history of the Logic App; you will see the content arrives in Base64 format.
– You can decode it and confirm it is the same data that was sent (see the sketch after this post's conclusion).

Conclusion
We've successfully set up a messaging system with Logic Apps and Azure Service Bus by following these steps. This configuration makes it possible to automate workflows, integrate apps seamlessly, and create reliable cloud solutions. Whether you're working with batch processing or real-time data, Azure's tools give you the strength and flexibility you need to scale your business effectively.
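
As a companion to the Base64 verification step above, here is a hedged sketch of what the receiving Logic App's Compose action can look like in code view, using the base64ToString() expression to turn the Service Bus message's ContentData back into readable text. The action name is a hypothetical placeholder.

```json
{
  "Decode_Message_Content": {
    "type": "Compose",
    "inputs": "@base64ToString(triggerBody()?['ContentData'])",
    "runAfter": {}
  }
}
```

In the designer, the same expression can be entered through the expression editor on the Compose action's Inputs field.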


Posting – Document processing – The remote certificate is invalid according to the validation procedure Error in D365 FNO

Introduction
Encountering errors while working with Sales Orders in Dynamics 365 Finance and Operations (D365FO) can disrupt your workflow, especially in development environments. One common issue is packing slip posting failing due to an expired SSL certificate in cloud-hosted environments. SSL certificates in D365FO cloud-hosted setups are valid for one year, after which they need to be renewed for continued security and functionality.

I faced this issue while trying to post the packing slip for a Sales Order on a Dev environment. To resolve it, follow the process below.

To maintain security, these certificates must be renewed through rotation. Credential rotation is a critical aspect of enterprise-level cybersecurity, and this process can be managed via LCS.
– Log into the LCS environment.
– Select the Implementation Project and then click on the Full details option.
– Click on the Maintain drop-down button and then select Rotate Secrets.
– Click on the Rotate SSL Secrets Certificates option. This process may take a few minutes to complete and will resolve the issue. After completion, you can see that the status changes to Deployed.
– The final step is to click on the Apply updates option; this will apply all the changes and updates.

Conclusion
Rotating SSL certificates in Dynamics 365 Finance and Operations is essential to maintain security and functionality in cloud-hosted environments. By following these steps in LCS, you can ensure that your environment remains secure and that tasks like posting packing slips proceed smoothly. Regularly checking and updating your SSL certificates will help prevent future disruptions and keep your operations running efficiently. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Integrating CRM and FNO Using Azure Logic Apps

Posted On October 22, 2024 by Bhavika Shetty

Introduction
Seamless integration between systems is essential for efficient operations and data accuracy. One of the common integration challenges is syncing data between Customer Relationship Management (CRM) systems and Finance and Operations (FNO) systems. Traditionally, dual-write has been a solution for this integration, but it comes with limitations. In this blog, we'll explore a real-life business use case where we replace dual-write with Azure Logic Apps to enable real-time data synchronization between CRM and FNO systems.

Understanding Dual-Write
Dual-write is a framework provided by Microsoft that ensures data consistency between Dynamics 365 Finance and Operations (FNO) and Dynamics 365 Customer Engagement (CRM) applications. It facilitates real-time and bi-directional data synchronization, maintaining records of table and field mappings between FNO and CRM. This ensures that any change made in one system is reflected in the other, providing a unified experience across the enterprise. However, dual-write has its limitations, such as complex setup, limited customization options, and potential performance issues in high-transaction environments. These limitations prompt businesses to seek more flexible and scalable integration solutions.

The Business Use Case: Replacing Dual-Write with Azure Logic Apps
Scenario: A manufacturing company uses Dynamics 365 CRM to manage customer interactions and Dynamics 365 FNO to handle finance and operations. The company relies on dual-write to keep customer data synchronized between the two systems. However, they face issues with the dual-write setup, including occasional synchronization lags and difficulties in customizing data mappings. To overcome these challenges, they decide to implement Azure Logic Apps for real-time data synchronization between CRM and FNO.

Objective: Create a Logic App that enables real-time data synchronization between CRM and FNO, replacing the existing dual-write setup. This Logic App will ensure that any changes in customer data in CRM are immediately reflected in FNO and vice versa, without the complexities and limitations of dual-write.

Steps to Implement the Solution

Benefits of Using Azure Logic Apps

Conclusion
By replacing dual-write with Azure Logic Apps, the manufacturing company can achieve a more reliable and customizable integration between their CRM and FNO systems. This solution not only enhances data consistency and real-time synchronization but also provides the flexibility to adapt to future business requirements. Azure Logic Apps empower businesses to streamline their operations, improve data accuracy, and ultimately deliver better customer experiences.

In our next blog, we will explore in detail how this business use case can be fully implemented using Azure Logic Apps. Stay tuned for a step-by-step guide on setting up the Logic App, configuring connectors, and ensuring seamless real-time data synchronization between CRM and FNO. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


JSON to JSON Transformation using Azure Logic Apps and Liquid

Posted On October 18, 2024 by Deepak Chauhan

Introduction
In this blog post, I'll walk you through the process of transforming JSON to JSON using Azure Logic Apps and the Liquid template language. This step-by-step guide will demonstrate how you can use Azure Integration Services to achieve your transformation goals.

What is Liquid Template Language?
The Liquid template language (commonly referred to as "Liquid") is a flexible, open-source template language developed by Shopify. It is widely used to render dynamic content in platforms such as Shopify themes, Jekyll websites, and web applications. Liquid uses placeholders, loops, and conditional statements to pull dynamic data into a web template, making it an effective tool for JSON transformation.

Prerequisites
To complete this tutorial, you'll need:

Sample Input JSON
We will use the following sample JSON file for this tutorial:

```json
{
  "FirstName": "Deepak",
  "LastName": "Ch",
  "Add1": "T square, Saki Vihar Road, Andheri East",
  "Add2": "Mumbai",
  "Landmark": "Near Car Showroom",
  "PhoneNo1": 9812727261,
  "PhoneNo2": 2121233322
}
```

Desired Output JSON
The client's requirement is to transform the input JSON into the following format:

```json
{
  "Full Name": "Deepak Ch",
  "Address": "T square, Saki Vihar Road, Andheri East, Mumbai, Near Car Showroom",
  "Phone": "9812727261, 2121233322"
}
```

Step-by-Step Guide
Step 1: Create a Free Azure Integration Account
Step 2: Add the Liquid Template Map
Step 3: Create a Logic App
Step 4: Transform JSON to JSON using Liquid

Here's the Liquid template used for this transformation:

```liquid
{
  "Full Name": "{{content.FirstName}} {{content.LastName}}",
  "Address": "{{content.Add1}}, {{content.Add2}}, {{content.Landmark}}",
  "Phone": "{{content.PhoneNo1}}, {{content.PhoneNo2}}"
}
```

Step 5: Test with Postman

Final Output
The output JSON will be:

```json
{
  "Full Name": "Deepak Ch",
  "Address": "T square, Saki Vihar Road, Andheri East, Mumbai, Near Car Showroom",
  "Phone": "9812727261, 2121233322"
}
```

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com

