How We Used Azure Blob Storage and Logic Apps to Centralize Dynamics 365 Integration Configurations

Managing multiple Dynamics 365 integrations across environments often becomes complex when each integration depends on static or hardcoded configuration values like API URLs, headers, secrets, or custom parameters.

We faced similar challenges until we centralized our configuration strategy using Azure Blob Storage to host the configs and Logic Apps to dynamically fetch and apply them during execution. In this blog, we’ll walk through how we implemented this architecture and simplified config management across our D365 projects.

Why We Needed Centralized Config Management

In projects with multiple Logic Apps and D365 endpoints:

  • Configs were scattered across workflows.
  • Changing values (like API tokens or keys) required modifying each Logic App manually.
  • The lack of centralized management increased the chances of mismatches and errors.

Key problems:

  • High maintenance overhead
  • No version control for configuration values
  • Hardcoded secrets and endpoints are risky

Solution Architecture Overview

Key Components:

  • Azure Blob Storage: Hosts configuration files (JSON format) per environment/project.
  • Logic Apps: Read the correct config file and dynamically assign values to headers, URIs, etc.
  • Dynamics 365: The application whose integrations run through these Logic App workflows.

Workflow:

  1. Trigger Logic App via schedule or HTTP.
  2. Read the correct config JSON from Azure Blob (based on environment, client, or operation).
  3. Parse JSON and apply values to Logic App actions (URIs, keys, payload templates).
  4. Continue with integration logic using dynamic values.
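Under the hood, this is just an ordinary Logic App definition. As a trimmed skeleton (action inputs omitted here and shown in the steps below; names like Call_external_API are our own placeholders), the code view of such a workflow looks roughly like this:

```json
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "triggers": {
      "Recurrence": {
        "type": "Recurrence",
        "recurrence": { "frequency": "Day", "interval": 1 }
      }
    },
    "actions": {
      "Get_blob_content": { "type": "ApiConnection", "runAfter": {} },
      "Parse_JSON": { "type": "ParseJson", "runAfter": { "Get_blob_content": [ "Succeeded" ] } },
      "Call_external_API": { "type": "Http", "runAfter": { "Parse_JSON": [ "Succeeded" ] } }
    }
  }
}
```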

Step-by-Step Implementation

Step 1: Store Config in Azure Blob Storage

  • Create a storage account and container (e.g., integration-configs).
  • Upload a JSON file like client1-prod-config.json.

Example JSON:

```json
{
  "apiUrl": "https://externalapi.com/v1/",
  "apiKey": "xyz123abc",
  "timeout": 60
}
```

Step 2: Build Logic App to Read Config

  • Add an HTTP trigger or a Recurrence trigger.
  • Add the Azure Blob Storage → Get Blob Content action (a sketch of the generated action definition follows this list):
    1. Provide a connection to the storage account.
    2. Use a dynamic path (e.g., /integration-configs/client1-prod-config.json).
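For reference, this is roughly what the Logic App code view generates for the Get Blob Content action. Treat it as a sketch rather than copy-paste-ready: the connection name azureblob and the double URL-encoding of the path are what the designer typically produces, but the exact shape depends on the connector version.

```json
"Get_blob_content": {
  "type": "ApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "connection": {
        "name": "@parameters('$connections')['azureblob']['connectionId']"
      }
    },
    "method": "get",
    "path": "/datasets/default/files/@{encodeURIComponent(encodeURIComponent('/integration-configs/client1-prod-config.json'))}/content",
    "queries": { "inferContentType": true }
  }
}
```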

Step 3: Parse and Use Config

  • Add a Parse JSON action to parse the blob output.
  • Reference the parsed config in downstream steps (see the combined sketch after this list):
    1. API calls: @body('Parse_JSON')?['apiUrl']
    2. Headers, timeout values, or even authentication fields.
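Putting Steps 2 and 3 together, here is a minimal sketch of the Parse JSON action and a downstream HTTP call that consumes it. The schema mirrors the example config above; the orders endpoint and x-api-key header are hypothetical examples, not part of our actual config.

```json
"Parse_JSON": {
  "type": "ParseJson",
  "runAfter": { "Get_blob_content": [ "Succeeded" ] },
  "inputs": {
    "content": "@body('Get_blob_content')",
    "schema": {
      "type": "object",
      "properties": {
        "apiUrl":  { "type": "string" },
        "apiKey":  { "type": "string" },
        "timeout": { "type": "integer" }
      }
    }
  }
},
"Call_external_API": {
  "type": "Http",
  "runAfter": { "Parse_JSON": [ "Succeeded" ] },
  "inputs": {
    "method": "GET",
    "uri": "@{body('Parse_JSON')?['apiUrl']}orders",
    "headers": { "x-api-key": "@{body('Parse_JSON')?['apiKey']}" }
  }
}
```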

Step 4: Apply to All Logic Apps

  • Use a consistent structure across all config files.
  • Deploy the Logic Apps via ARM templates or other automation.
  • Switch environments simply by changing the config filename or path, as sketched below.
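One way to make that environment switch explicit is to move the blob path into a workflow parameter and override it per environment at deployment time. A minimal sketch, where configBlobPath is a name we made up for illustration:

```json
{
  "definition": {
    "parameters": {
      "configBlobPath": {
        "type": "String",
        "defaultValue": "/integration-configs/client1-prod-config.json"
      }
    }
  }
}
```

The Get Blob Content path can then reference @parameters('configBlobPath'), so promoting a Logic App from dev to prod becomes a parameter change in the ARM template rather than an edit to the workflow itself.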

Benefits of This Approach

  • No Hardcoding: Everything is dynamic.
  • Environment Agnostic: Easily switch between dev, test, and prod.
  • Version Control: Use GitHub or Azure DevOps to track JSON changes.
  • Scalable: Supports multiple clients/projects via a unified pattern.


To conclude, centralizing D365 integration configs using Azure Blob and Logic Apps transformed our integration architecture. It made our systems easier to maintain, more scalable, and resilient to changes.
Are you still hardcoding configs in your Logic Apps or Power Automate flows? Start organizing your integration configs in Azure Blob today, and build workflows that are smart, scalable, and maintainable.

I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.

