
Category Archives: Azure Blob Storage

Automating Document Vectorization from SharePoint Using Azure Logic Apps and Azure AI Search

In modern enterprises, documents stored across platforms like SharePoint often remain underutilized due to the lack of intelligent search capabilities. What if your organization could automatically extract meaning from those documents, turning them into searchable vectors for advanced retrieval systems? That's exactly what we've achieved by integrating Azure Logic Apps with Azure AI Search.

Workflow Overview

Whenever a user uploads a file to a designated SharePoint folder, a scheduled Azure Logic App is triggered to fetch the file content and replicate it to Azure Blob Storage. Once stored, a scheduled Azure AI Search indexer kicks in. This indexer runs a skillset pipeline that extracts, transforms, and vectorizes the document content.

Technologies / resources used:

-> SharePoint: A common document repository for enterprise users, ideal for collaborative uploads.
-> Azure Logic Apps: Provides low-code automation to monitor SharePoint for changes and sync files to Blob Storage. It ensures a reliable, scheduled trigger mechanism with minimal overhead.
-> Blob Storage: Serves as the staging ground where documents are centrally stored for indexing; cheaper and more scalable than relying solely on SharePoint connectors.
-> Azure AI Search (Cognitive Search): The intelligence layer that runs a skillset pipeline to extract, transform, and vectorize the content, enabling semantic search, multimodal RAG (Retrieval Augmented Generation), and other AI-enhanced scenarios.

Why Not Vectorize Directly from SharePoint?

Staging the files in Blob Storage lets us index from the standard blob data source; the references below cover both the SharePoint indexer and the Blob Storage indexer approaches.

Reference:
1. https://learn.microsoft.com/en-us/azure/search/search-howto-index-sharepoint-online
2. https://learn.microsoft.com/en-us/azure/search/search-howto-indexing-azure-blob-storage

How to achieve this?

Stage 1: Logic App to sync SharePoint files to blob

Firstly, create a designated SharePoint directory to upload the required documents for vectorization. Then create the Logic App to replicate the files, along with their format and properties, to the associated blob storage:

1] In the trigger action "When an item is created or modified", assign the site address and the name of the SharePoint directory where the documents are uploaded.
2] Assign a recurrence frequency, start time, and time zone to check for new documents and keep the blob container updated.
3] Add an action component, "Get file content using path", and dynamically provide the full path (including the file extension) from the trigger.
4] Finally, add an action to create blobs in the designated container that will be vectorized: provide the storage account name, the directory path, the blob name (select the dynamic file name with extension from the trigger), and the blob content (from the "Get file content" action).
5] On successfully saving and running this Logic App, either manually or on trigger, the files are replicated in their exact form to the blob storage.

Stage 2: Azure AI Search resource to vectorize the files in blob storage

In the Azure Portal, search for the Azure AI Search service, provide the necessary details, and select a pricing tier based on your requirement. Once the resource is successfully created, select "Import and vectorize data". From the two options, RAG and Multimodal RAG index, select the latter. RAG combines a retriever (to fetch relevant documents) with a generative language model (to generate answers) using text-only data. Multimodal RAG extends the RAG architecture to include multiple data types such as text, images, tables, PDFs, diagrams, audio, or video.
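Before proceeding with the index creation steps below, it is worth confirming that the Logic App actually staged the files. The snippet below is a minimal sketch using the azure-storage-blob Python SDK; the connection string and the container name documents-to-vectorize are placeholders for whatever your Logic App writes to.

# Sanity check: list the blobs staged by the Logic App before the indexer runs.
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<storage-account-connection-string>"   # placeholder
CONTAINER_NAME = "documents-to-vectorize"                   # hypothetical container name

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client(CONTAINER_NAME)

# Each SharePoint file should appear here with its original name and extension.
for blob in container.list_blobs():
    print(f"{blob.name}\t{blob.size} bytes\tlast modified {blob.last_modified}")

If the listing matches what was uploaded to SharePoint, the indexer configured below will pick the documents up on its next scheduled run.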
Workflow: Now follow the steps and provide the necessary details for the index creation:

-> Enable deletion tracking, to remove the records of deleted documents from the index.
-> Provide a Document Intelligence resource to enable OCR and to get location metadata for multiple document types.
-> Select image verbalization (to verbalize text in images) or multimodal embedding to vectorize the whole image.
-> Assign the LLM model for generating the embeddings for the text/images.
-> Provide an image output location to store images extracted from the files.
-> Assign a schedule to refresh the indexer and keep the search index up to date with new documents.

Once the index is successfully created, search for keywords in the Search explorer of the index to verify the vectorization; the results are returned ranked by relevance and score/distance against the user's search query (a scripted version of this check is sketched at the end of this post). Let us test this index in a custom Copilot agent by importing it as an Azure AI Search knowledge source. When fetching document-specific information, the index is searched for the most appropriate content, and the result is rendered in a readable format by generative AI.

We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
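For readers who prefer to script the relevance check mentioned above, here is a minimal sketch using the azure-search-documents Python SDK. The endpoint, key, index name multimodal-rag-index, and the field names text_vector and title are placeholders; adjust them to the index the wizard generated, and note that the vector query class and its parameters can vary slightly between SDK versions.

# Smoke test for the vectorized index: run the same query you would type in Search explorer.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizableTextQuery

ENDPOINT = "https://<your-search-service>.search.windows.net"   # placeholder
INDEX_NAME = "multimodal-rag-index"                              # hypothetical index name
QUERY = "keywords you expect to find in the uploaded documents"

client = SearchClient(ENDPOINT, INDEX_NAME, AzureKeyCredential("<query-key>"))

# Hybrid query: keyword search plus a vector query that the service vectorizes
# with the embedding model configured during "Import and vectorize data".
results = client.search(
    search_text=QUERY,
    vector_queries=[VectorizableTextQuery(text=QUERY, k_nearest_neighbors=5, fields="text_vector")],
    top=5,
)

for doc in results:
    # The score reflects the relevance/distance discussed above.
    print(doc["@search.score"], doc.get("title"))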


How We Used Azure Blob Storage and Logic Apps to Centralize Dynamics 365 Integration Configurations

Managing multiple Dynamics 365 integrations across environments often becomes complex when each integration depends on static or hardcoded configuration values like API URLs, headers, secrets, or custom parameters. We faced similar challenges until we centralized our configuration strategy, using Azure Blob Storage to host the configs and Logic Apps to dynamically fetch and apply them during execution. In this blog, we'll walk through how we implemented this architecture and simplified config management across our D365 projects.

Why We Needed Centralized Config Management

In projects with multiple Logic Apps and D365 endpoints, every workflow carried its own copy of values such as API URLs, headers, secrets, and custom parameters. Key problems: the same values were duplicated and hardcoded across workflows, any change to an endpoint or secret meant editing every Logic App, and promoting integrations between environments was error-prone.

Solution Architecture Overview

Key components: Azure Blob Storage (hosts the config files), Logic Apps (fetch and parse the config at run time), and the D365/external endpoints that consume those values. Workflow: each Logic App run reads the config blob, parses the JSON, and uses the resulting values in its subsequent actions.

Step-by-Step Implementation

Step 1: Store Config in Azure Blob Storage

Example JSON:

{
  "apiUrl": "https://externalapi.com/v1/",
  "apiKey": "xyz123abc",
  "timeout": 60
}

Step 2: Build Logic App to Read Config
Add a "Get blob content" action that points at the config file in the container.

Step 3: Parse and Use Config
Parse the JSON and reference its values (apiUrl, apiKey, timeout) in the later actions; a scripted sketch of this read-and-parse pattern follows at the end of this post.

Step 4: Apply to All Logic Apps
Reuse the same read-and-parse pattern in every Logic App so all integrations consume the single, central config.

Benefits of This Approach

To conclude, centralizing D365 integration configs using Azure Blob and Logic Apps transformed our integration architecture. It made our systems easier to maintain, more scalable, and resilient to changes. Are you still hardcoding configs in your Logic Apps or Power Automate flows? Start organizing your integration configs in Azure Blob today, and build workflows that are smart, scalable, and maintainable. I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
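As referenced in Step 3 above, here is a minimal end-to-end sketch of the read-and-parse pattern, written in Python rather than as a Logic App so it fits in a few lines. The connection string, container and blob names, the downstream URL path, and the header name are placeholders for illustration only.

# Mirror of the Logic App pattern: fetch the central config blob, parse it,
# and use its values instead of hardcoding them in the workflow.
import json
import urllib.request

from azure.storage.blob import BlobClient

config_blob = BlobClient.from_connection_string(
    conn_str="<storage-account-connection-string>",   # placeholder
    container_name="integration-configs",             # hypothetical container
    blob_name="d365-integration-config.json",         # hypothetical blob
)

# Equivalent of "Get blob content" + "Parse JSON" in the Logic App.
config = json.loads(config_blob.download_blob().readall())

# Use the parsed values when calling the external API.
request = urllib.request.Request(
    url=config["apiUrl"] + "orders",                  # "orders" path is illustrative
    headers={"api-key": config["apiKey"]},            # header name is illustrative
)
with urllib.request.urlopen(request, timeout=config["timeout"]) as response:
    print(response.status)

In the Logic Apps themselves, the same three steps map to a "Get blob content" action, a "Parse JSON" action, and dynamic content references in the downstream HTTP or D365 actions.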


Building a Scalable Integration Architecture for Dynamics 365 Using Logic Apps and Azure Functions

If you've worked with Dynamics 365 CRM for any serious integration project, you've probably used Azure Logic Apps. They're great: visual, no-code, and fast to deploy. But as your integration needs grow, you quickly hit complexity: multiple entities, large volumes, branching logic, error handling, and reusability. That's when architecture becomes critical. In this blog, I'll share how we built a modular, scalable, and reusable integration architecture using Logic Apps + Azure Functions + Azure Blob Storage, with a config-driven approach. Whether you're syncing data between D365 and Finance & Operations, or automating CRM workflows with external APIs, this post will help you avoid bottlenecks and stay maintainable.

Architecture Components

-> Parent Logic App: Entry point; reads config from blob and iterates over entities.
-> Child Logic App(s): Handle each entity sync (Project, Task, Team, etc.).
-> Azure Blob Storage: Hosts configuration files, Liquid templates, and checkpoint data.
-> Azure Function: Performs advanced transformation via Liquid templates.
-> CRM & F&O APIs: Source and target systems.

Step-by-Step Breakdown

1. Configuration-Driven Logic

We didn't hardcode URLs, fields, or entities. Everything lives in a central config.json in Blob Storage:

{
  "integrationName": "ProjectToFNO",
  "sourceEntity": "msdyn_project",
  "targetEntity": "ProjectsV2",
  "liquidTemplate": "projectToFno.liquid",
  "primaryKey": "msdyn_projectid"
}

2. Parent–Child Logic App Model

Instead of one massive workflow, we created a parent Logic App that reads the config from Blob Storage and iterates over the entities defined there, invoking a child Logic App for each one. Each child handles the sync for its own entity: fetching the source records from CRM, calling the transformation, and pushing the result to F&O.

3. Azure Function for Transformation

Why not use Logic App's Compose or Data Operations? Because complex mapping (especially D365 → F&O) quickly becomes unreadable. Instead, the child Logic App calls an Azure Function that applies a Liquid template such as:

{
  "ProjectName": "{{ msdyn_subject }}",
  "Customer": "{{ customerid.name }}"
}

4. Handling Checkpoints

For batch integration (daily/hourly), we store the last run timestamp in Blob:

{
  "entity": "msdyn_project",
  "modifiedon": "2025-07-28T22:00:00Z"
}

This allows delta fetches like:

?$filter=modifiedon gt 2025-07-28T22:00:00Z

After each run, we update the checkpoint blob (a scripted sketch of this pattern appears at the end of this post).

5. Centralized Logging & Alerts

We configured run logging to Blob Storage and alerting through Azure Monitor, which helped us track down integration mismatches fast.

Why This Architecture Works

-> Reusability: Config-based logic + modular templates.
-> Maintainability: Each Logic App has one job.
-> Scalability: Add new entities via config, not code.
-> Monitoring: Blob + Monitor integration.
-> Transformation complexity: Handled via Azure Functions + Liquid.

Key Takeaways

To conclude, this architecture has helped us deliver scalable Dynamics 365 integrations, including syncing Projects, Tasks, Teams, and Time Entries to F&O, all without rewriting Logic Apps every time a client asks for a tweak. If you're working on medium to complex D365 integrations, consider going config-driven and breaking your workflows into modular components. It keeps things clean, reusable, and much easier to maintain in the long run. I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
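To make the checkpoint handling in section 4 concrete, here is a minimal Python sketch of the read, filter, and update cycle. The connection string, container and blob names, and the Dataverse URL are placeholders; in our implementation the same steps are performed by Logic App actions rather than a script.

# Sketch of the checkpoint pattern: read the last-run timestamp, build the delta
# filter, and move the checkpoint forward after a successful run.
import json
from datetime import datetime, timezone

from azure.storage.blob import BlobClient

checkpoint_blob = BlobClient.from_connection_string(
    conn_str="<storage-account-connection-string>",   # placeholder
    container_name="integration-checkpoints",         # hypothetical container
    blob_name="msdyn_project-checkpoint.json",         # hypothetical blob, one per entity
)

# 1. Read the checkpoint written by the previous run.
checkpoint = json.loads(checkpoint_blob.download_blob().readall())
run_started = datetime.now(timezone.utc)

# 2. Build the delta filter used when querying the source entity in CRM.
source_url = (
    "https://<org>.crm.dynamics.com/api/data/v9.2/msdyn_projects"   # placeholder org URL
    f"?$filter=modifiedon gt {checkpoint['modifiedon']}"
)
print("Delta fetch URL:", source_url)
# ... fetch the records, transform them via the Liquid template, push to F&O ...

# 3. On success, advance the checkpoint so the next run only sees new changes.
checkpoint["modifiedon"] = run_started.strftime("%Y-%m-%dT%H:%M:%SZ")
checkpoint_blob.upload_blob(json.dumps(checkpoint, indent=2), overwrite=True)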

