Building a Scalable Integration Architecture for Dynamics 365 Using Logic Apps and Azure Functions
If you’ve worked with Dynamics 365 CRM for any serious integration project, you’ve probably used Azure Logic Apps. They’re great — visual, no-code, and fast to deploy. But as your integration needs grow, you quickly hit complexity: multiple entities, large volumes, branching logic, error handling, and reusability.
That’s when architecture becomes critical. In this blog, I’ll share how we built a modular, scalable, and reusable integration architecture using Logic Apps + Azure Functions + Azure Blob Storage — with a config-driven approach.
Whether you’re syncing data between D365 and Finance & Operations, or automating CRM workflows with external APIs, this post will help you avoid bottlenecks and stay maintainable.
Architecture Components
| Component | Purpose |
|---|---|
| Parent Logic App | Entry point; reads config from Blob, iterates over entities |
| Child Logic App(s) | Handles each entity sync (Project, Task, Team, etc.) |
| Azure Blob Storage | Hosts configuration files, Liquid templates, checkpoint data |
| Azure Function | Performs advanced transformation via Liquid templates |
| CRM & F&O APIs | Source and target systems |
Step-by-Step Breakdown
1. Configuration-Driven Logic
We didn’t hardcode URLs, fields, or entities. Everything lives in a central config.json in Blob Storage:
```json
{
  "integrationName": "ProjectToFNO",
  "sourceEntity": "msdyn_project",
  "targetEntity": "ProjectsV2",
  "liquidTemplate": "projectToFno.liquid",
  "primaryKey": "msdyn_projectid"
}
```
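Since the parent Logic App iterates over every integration, the individual entries can be kept together, for example as an array in a single blob. The shape below is just an illustrative assumption (the second entry is hypothetical), not a fixed schema:

```json
[
  {
    "integrationName": "ProjectToFNO",
    "sourceEntity": "msdyn_project",
    "targetEntity": "ProjectsV2",
    "liquidTemplate": "projectToFno.liquid",
    "primaryKey": "msdyn_projectid"
  },
  {
    "integrationName": "TaskToFNO",
    "sourceEntity": "msdyn_projecttask",
    "targetEntity": "ProjectTasks",
    "liquidTemplate": "taskToFno.liquid",
    "primaryKey": "msdyn_projecttaskid"
  }
]
```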
2. Parent–Child Logic App Model
Instead of one massive workflow, we created a parent Logic App that:
- Loads all configs from Blob
- Loops through each integration entity
- Triggers the relevant child Logic App, passing that entity's config as the request body (sketched after the next list)
Each child handles:
- Data pull from CRM
- Transform via Azure Function (with Liquid)
- Existence checks in F&O
- Create/update logic
- Checkpoint update
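Concretely, the body the parent posts to a child Logic App's HTTP trigger can simply be that entity's config entry, optionally enriched with run-level values. The `lastRun` field below is a hypothetical example of such enrichment, not part of our actual contract:

```json
{
  "integrationName": "ProjectToFNO",
  "sourceEntity": "msdyn_project",
  "targetEntity": "ProjectsV2",
  "liquidTemplate": "projectToFno.liquid",
  "primaryKey": "msdyn_projectid",
  "lastRun": "2025-07-28T22:00:00Z"
}
```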
3. Azure Function for Transformation
Why not use Logic App’s Compose or Data Operations?
Because complex mapping (especially D365 → F&O) quickly becomes unreadable. Instead:
- The child Logic App sends raw CRM JSON and config to an Azure Function.
- The function applies a Liquid template hosted in Blob Storage.
- Output is a valid F&O payload.
```liquid
{
  "ProjectName": "{{ msdyn_subject }}",
  "Customer": "{{ customerid.name }}"
}
```
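To make this concrete, here is a minimal sketch of such a function using the Node.js v4 programming model and the liquidjs library. The function name, the "templates" container, and the STORAGE_CONNECTION setting are assumptions for illustration, not our exact implementation:

```typescript
// A minimal sketch, not the production function: it assumes liquidjs for templating,
// a "templates" container in Blob Storage, and a STORAGE_CONNECTION app setting.
import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";
import { BlobServiceClient } from "@azure/storage-blob";
import { Liquid } from "liquidjs";

const engine = new Liquid();
const templates = BlobServiceClient
  .fromConnectionString(process.env.STORAGE_CONNECTION!)
  .getContainerClient("templates");

app.http("transform", {
  methods: ["POST"],
  authLevel: "function",
  handler: async (req: HttpRequest, _ctx: InvocationContext): Promise<HttpResponseInit> => {
    // The child Logic App posts the raw CRM record together with its config entry.
    const { record, config } = (await req.json()) as {
      record: Record<string, unknown>;
      config: { liquidTemplate: string };
    };

    // Load the Liquid template named in the config from Blob Storage.
    const template = (
      await templates.getBlobClient(config.liquidTemplate).downloadToBuffer()
    ).toString();

    // Render the F&O payload and return it to the Logic App.
    const payload = await engine.parseAndRender(template, record);
    return { status: 200, body: payload, headers: { "Content-Type": "application/json" } };
  },
});
```

Keeping the templates in Blob Storage means a mapping tweak is a template edit rather than a function redeployment.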
4. Handling Checkpoints
For batch integrations (daily or hourly), we store the last run timestamp in Blob:
```json
{
  "entity": "msdyn_project",
  "modifiedon": "2025-07-28T22:00:00Z"
}
```
This allows delta fetches like:
```
?$filter=modifiedon gt 2025-07-28T22:00:00Z
```
After each run, we update the checkpoint blob.
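As a rough sketch of that read–fetch–update loop (the "checkpoints" container, blob name, entity set, and token handling are assumptions rather than the actual implementation):

```typescript
// A rough sketch of the checkpoint pattern; container, blob name, entity set,
// and token handling are illustrative assumptions.
import { BlobServiceClient } from "@azure/storage-blob";

const checkpointBlob = BlobServiceClient
  .fromConnectionString(process.env.STORAGE_CONNECTION!)
  .getContainerClient("checkpoints")
  .getBlockBlobClient("msdyn_project.json");

export async function runDeltaSync(orgUrl: string, token: string): Promise<unknown[]> {
  // 1. Read the last successful run timestamp from the checkpoint blob.
  const checkpoint = JSON.parse((await checkpointBlob.downloadToBuffer()).toString());

  // 2. Delta fetch: only records modified since the checkpoint.
  const url = `${orgUrl}/api/data/v9.2/msdyn_projects?$filter=modifiedon gt ${checkpoint.modifiedon}`;
  const response = await fetch(url, { headers: { Authorization: `Bearer ${token}` } });
  const page: any = await response.json();

  // 3. In the real flow the child Logic App processes the records first;
  //    here we simply show advancing the checkpoint afterwards.
  const next = JSON.stringify({ entity: "msdyn_project", modifiedon: new Date().toISOString() });
  await checkpointBlob.upload(next, Buffer.byteLength(next), {
    blobHTTPHeaders: { blobContentType: "application/json" },
  });

  return page.value;
}
```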
5. Centralized Logging & Alerts
We configured:
- Custom error handling scopes in each child app
- Azure Monitor alerts on failures
- Blob-based logs for raw input/output of each transaction
This helped us track down integration mismatches fast.
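For reference, each per-transaction log blob can be a small JSON document along these lines (all field names and values are illustrative placeholders):

```json
{
  "integrationName": "ProjectToFNO",
  "recordId": "<crm record guid>",
  "status": "Failed",
  "request": "<raw CRM payload>",
  "response": "<F&O error body>",
  "loggedOn": "2025-07-28T22:05:13Z"
}
```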
Why This Architecture Works
| Need | How It's Solved |
|---|---|
| Reusability | Config-based logic + modular templates |
| Maintainability | Each Logic App has one job |
| Scalability | Add new entities via config, not code |
| Monitoring | Blob logs + Azure Monitor integration |
| Transformation complexity | Handled via Azure Functions + Liquid |
Key Takeaways
- Don’t overuse a single Logic App. You’ll regret it when you need to debug.
- Avoid hardcoded field mappings. Templates in Blob make life easier.
- Paginate and use $expand carefully. The CRM Web API caps how much data a single request returns (see the paging sketch after this list).
- Test with real-world volume. What works on 5 records may fail on 500.
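Here is that paging sketch: a minimal example using the `Prefer: odata.maxpagesize` header and the `@odata.nextLink` continuation returned by the Dataverse Web API. The entity set, columns, page size, and token handling are illustrative assumptions:

```typescript
// A minimal paging sketch; entity set, $select columns, page size, and token
// handling are illustrative assumptions, not the actual integration code.
export async function fetchAllPages(orgUrl: string, token: string): Promise<unknown[]> {
  const records: unknown[] = [];
  let url: string | undefined =
    `${orgUrl}/api/data/v9.2/msdyn_projects?$select=msdyn_subject,modifiedon`;

  while (url) {
    const response = await fetch(url, {
      headers: {
        Authorization: `Bearer ${token}`,
        Prefer: "odata.maxpagesize=500", // keep pages small; one call will not return everything
      },
    });
    const page: any = await response.json();
    records.push(...page.value);     // accumulate this page
    url = page["@odata.nextLink"];   // undefined once the last page is reached
  }
  return records;
}
```

The same loop applies when you add $expand; just keep the expanded columns to the minimum you actually map.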
To conclude, this architecture has helped us deliver scalable Dynamics 365 integrations, including syncing Projects, Tasks, Teams, and Time Entries to F&O, all without rewriting Logic Apps every time a client asks for a tweak.
If you’re working on medium to complex D365 integrations, consider going config-driven and breaking your workflows into modular components. It keeps things clean, reusable, and much easier to maintain in the long run.
I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.