How to Integrate AutoGen for retail banking with Docker for production AI
AutoGen for retail banking gives you the orchestration layer for multi-agent banking workflows. Docker gives you repeatable runtime isolation, which is what you want when those agents touch customer data, internal APIs, or model endpoints in production.
Put them together and you get a clean pattern: AutoGen handles the conversation and tool routing, while Docker keeps each agent service pinned to a known image, dependency set, and network boundary.
Prerequisites
- Python 3.10+
- Docker Engine 24+
- `pip` and `venv`
- Docker SDK for Python: `pip install docker`
- AutoGen packages for your retail banking agent stack, for example: `pip install pyautogen`
- A running local model endpoint or an API key if your AutoGen agents call an LLM
- Access to your bank sandbox APIs, if you plan to connect account lookup or payment tools
- A Docker image that contains your agent runtime and dependencies
Integration Steps
1. Build a containerized banking agent runtime

Start by packaging the agent code into a Docker image. This image should contain the AutoGen dependencies, your banking tools, and any internal certificates or config files needed for sandbox access.

```python
from pathlib import Path

dockerfile = Path("Dockerfile")
dockerfile.write_text(
    """
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "banking_agent.py"]
"""
)
print("Dockerfile written")
```

Keep the image deterministic. If your compliance team asks what changed between deployments, you want an immutable image digest, not a mutable laptop environment.
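In the same spirit, you can sanity-check the Dockerfile before building: a floating base image like `python:latest` defeats the "immutable digest" goal. This is a minimal sketch of such a check; the helper function is an illustration, not part of Docker's tooling:

```python
def base_image_is_pinned(dockerfile: str) -> bool:
    """Return True if every FROM line pins a tag or digest (not :latest or untagged)."""
    for line in dockerfile.splitlines():
        line = line.strip()
        if line.upper().startswith("FROM "):
            image = line.split()[1]
            if "@sha256:" in image:
                continue  # pinned by digest: fully immutable
            if ":" not in image or image.endswith(":latest"):
                return False  # floating reference: builds are not reproducible
    return True

print(base_image_is_pinned("FROM python:3.11-slim\nCOPY . ."))  # True: pinned tag
print(base_image_is_pinned("FROM python:latest\nCOPY . ."))     # False: floating tag
```

Run this in CI against the Dockerfile written above, and pin by digest (`python@sha256:...`) when compliance requires byte-identical builds.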
2. Define the AutoGen banking agent

Use AutoGen's `AssistantAgent` to represent the banking assistant and wire in a tool for customer lookup or policy checks. In production, this tool should call a real service, not hardcoded sample data.

```python
import os

import autogen

llm_config = {
    "model": os.getenv("OPENAI_MODEL", "gpt-4o-mini"),
    "api_key": os.environ["OPENAI_API_KEY"],
    "temperature": 0,
}


def get_customer_profile(customer_id: str) -> dict:
    # Replace with internal CRM/banking API call
    return {
        "customer_id": customer_id,
        "segment": "retail",
        "kyc_status": "verified",
        "risk_band": "low",
    }


banker = autogen.AssistantAgent(
    name="retail_banker",
    llm_config=llm_config,
    system_message="You assist retail banking operations with policy-safe answers.",
)
banker.register_for_llm(
    name="get_customer_profile", description="Fetch customer profile"
)(get_customer_profile)
```

This is the core pattern: keep business logic in tools, not in prompt text.
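Because the tool's return value flows straight into the agent's reply, it pays to validate it at the boundary. A minimal sketch, assuming the field names of the sample profile above; the validator itself is an illustration, not part of AutoGen:

```python
REQUIRED_FIELDS = {"customer_id", "segment", "kyc_status", "risk_band"}
ALLOWED_KYC = {"verified", "pending", "failed"}


def validate_profile(profile: dict) -> dict:
    """Reject malformed tool output before it reaches the LLM."""
    missing = REQUIRED_FIELDS - profile.keys()
    if missing:
        raise ValueError(f"profile missing fields: {sorted(missing)}")
    if profile["kyc_status"] not in ALLOWED_KYC:
        raise ValueError(f"unexpected kyc_status: {profile['kyc_status']}")
    return profile


profile = {"customer_id": "12345", "segment": "retail",
           "kyc_status": "verified", "risk_band": "low"}
print(validate_profile(profile)["kyc_status"])  # verified
```

Call the validator inside the tool function itself, so a bad upstream response fails loudly instead of becoming a confident but wrong answer.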
3. Run the agent inside Docker with the Python SDK

Use the Docker SDK to start the container from Python so your orchestrator can manage lifecycle, logs, and health checks. This is better than shelling out to `docker run` because you can inspect status and recover programmatically.

```python
import os

import docker

client = docker.from_env()
image_tag = "retail-banking-agent:latest"

container = client.containers.run(
    image_tag,
    detach=True,
    name="retail-banking-agent",
    environment={
        "OPENAI_API_KEY": os.environ["OPENAI_API_KEY"],
        "OPENAI_MODEL": os.getenv("OPENAI_MODEL", "gpt-4o-mini"),
    },
    ports={"8000/tcp": 8000},
    auto_remove=True,  # remove=True cannot be combined with detach=True in docker-py
)
print(container.id)
```

If your agent exposes an HTTP endpoint, map it here. If it runs as a worker, skip ports and use logs plus exit codes for observability.
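The `environment` argument is where secrets leak. One defensive pattern is to build the container environment from an explicit allowlist rather than forwarding everything from the host; this is a sketch, and the variable names simply match the run call above:

```python
import os

ALLOWED_ENV = ("OPENAI_API_KEY", "OPENAI_MODEL")


def container_env(defaults=None):
    """Forward only allowlisted variables into the container."""
    env = dict(defaults or {})
    for key in ALLOWED_ENV:
        if key in os.environ:
            env[key] = os.environ[key]
    return env


# Anything not allowlisted never reaches the container.
os.environ["OPENAI_MODEL"] = "gpt-4o-mini"
os.environ["INTERNAL_DB_PASSWORD"] = "should-not-be-forwarded"

env = container_env()
print("INTERNAL_DB_PASSWORD" in env)  # False
```

Pass `environment=container_env()` to `containers.run`, and the audit question "what did the agent container see?" has a one-line answer.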
4. Connect AutoGen orchestration to the containerized service

In production systems, I prefer keeping AutoGen orchestration outside the container that runs the worker. That lets one controller talk to multiple isolated services across different environments.
```python
import autogen
import requests

user_proxy = autogen.UserProxyAgent(
    name="ops_user",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=3,
    code_execution_config=False,
)


def call_containerized_agent(message: str) -> str:
    response = requests.post(
        "http://localhost:8000/chat",
        json={"message": message},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["reply"]


user_proxy.register_for_execution(name="call_containerized_agent")(
    call_containerized_agent
)

# `banker` is the AssistantAgent defined in step 2
result = user_proxy.initiate_chat(
    banker,
    message="Check customer profile for ID 12345 and summarize KYC status.",
)
print(result)
```
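Calls into the containerized service should tolerate transient failures such as container restarts and cold starts. A small retry-with-backoff wrapper, sketched here with a stand-in function rather than a live HTTP call; the exception type, attempt count, and delays are assumptions to adapt:

```python
import time


def with_retries(fn, attempts=3, base_delay=0.1):
    """Call fn, retrying with exponential backoff on ConnectionError."""
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))


calls = {"n": 0}


def flaky_agent_call():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("agent not ready")  # simulate a cold start
    return "reply"


print(with_retries(flaky_agent_call))  # reply
```

Wrapping `call_containerized_agent` this way keeps transient network noise out of the AutoGen conversation itself.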
5. Add container health checks before sending banking tasks
Don’t send regulated workloads into a dead container. Check status first, then route work only when the service is healthy.
```python
import time


def wait_for_container(container, timeout=60):
    deadline = time.time() + timeout
    while time.time() < deadline:
        container.reload()
        if container.status == "running":
            return True
        time.sleep(2)
    return False


if not wait_for_container(container):
    raise RuntimeError("Banking agent container failed health check")

logs = container.logs(tail=20).decode("utf-8")
print(logs)
```
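A `running` status only proves the process started; for an HTTP agent, a TCP probe on the mapped port is a stronger readiness signal. A minimal sketch, where a throwaway local listener stands in for the agent's port 8000:

```python
import socket


def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Throwaway listener standing in for the agent's mapped port.
listener = socket.socket()
listener.bind(("localhost", 0))
listener.listen(1)
port = listener.getsockname()[1]

print(port_open("localhost", port))  # True while the listener is up
listener.close()
```

In practice, combine both checks: wait for `running`, then probe the port, then send the first banking task.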
Testing the Integration
Use a simple smoke test that confirms three things:
- The container starts
- The agent endpoint responds
- The AutoGen flow returns a structured answer
```python
import requests

payload = {"message": "Retrieve customer profile for ID 12345"}
resp = requests.post("http://localhost:8000/chat", json=payload, timeout=30)
resp.raise_for_status()
data = resp.json()
print(data["reply"])
```
Expected output:

```
Customer 12345 is verified under retail segment with low risk band.
```
If you want stronger validation, assert on fields instead of free text:

```python
assert data["customer_id"] == "12345"
assert data["kyc_status"] == "verified"
```
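The three checks can be folded into one smoke-test function that fails fast with a useful message. This sketch injects the transport as a callable so it runs against a stub here; in CI you would pass a thin wrapper around `requests.post(...).json()` instead. The endpoint shape and field names follow the examples above:

```python
def smoke_test(post, customer_id="12345"):
    """post: callable taking a payload dict, returning the parsed JSON reply."""
    data = post({"message": f"Retrieve customer profile for ID {customer_id}"})
    assert data["customer_id"] == customer_id, data
    assert data["kyc_status"] == "verified", data
    return data["reply"]


# Stub standing in for requests.post(...).json() against the live container.
def fake_post(payload):
    return {
        "customer_id": "12345",
        "kyc_status": "verified",
        "reply": "Customer 12345 is verified under retail segment with low risk band.",
    }


print(smoke_test(fake_post))
```

Injecting the transport also lets you unit-test the assertions without a running container.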
Real-World Use Cases
- Customer service triage: Route balance inquiries, card disputes, and account-status questions through separate AutoGen agents running in isolated Docker containers.
- Fraud ops assistant: Let one agent query transaction streams while another checks policy rules and escalation thresholds in a locked-down runtime.
- Branch operations copilot: Build an internal assistant that helps staff verify customer identity, summarize account history, and generate next-step actions without exposing local developer machines to sensitive data.
Keep learning
- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit, a PDF checklist plus starter code
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit