How to Integrate LangGraph with Kubernetes for Wealth Management Startups

By Cyprian Aarons · Updated 2026-04-21
Tags: langgraph-for-wealth-management, kubernetes, startups

Wealth management agents need two things that usually fight each other: structured decision logic and predictable infrastructure. LangGraph gives you the orchestration layer for portfolio workflows, risk checks, and human approvals; Kubernetes gives you repeatable deployment, scaling, and isolation for startup environments where one bad rollout can break client-facing flows.

Prerequisites

  • Python 3.10+
  • A Kubernetes cluster:
    • local: kind, minikube, or k3d
    • cloud: EKS, GKE, or AKS
  • kubectl configured and pointing at your cluster
  • A container registry your cluster can pull from
  • LangGraph installed in your app environment
  • Kubernetes Python client installed
  • Basic knowledge of:
    • workflow state machines
    • Docker images and deployments
    • ConfigMaps and Secrets

Install the Python packages:

pip install langgraph kubernetes fastapi uvicorn pydantic requests

Integration Steps

  1. Define the wealth management workflow in LangGraph

    Model the agent as a graph with explicit steps: ingest request, assess risk, propose allocation, and route to approval if needed. Keep state typed so you can validate inputs before anything hits production APIs.

from typing import TypedDict, Literal, Optional
from langgraph.graph import StateGraph, END

class WealthState(TypedDict):
    client_id: str
    risk_score: int
    requested_action: str
    recommendation: Optional[str]
    approval_status: Optional[Literal["pending", "approved", "rejected"]]

def assess_risk(state: WealthState) -> WealthState:
    score = state["risk_score"]
    state["recommendation"] = "conservative" if score < 40 else "balanced"
    return state

def route_for_approval(state: WealthState) -> str:
    return "human_review" if state["risk_score"] >= 70 else "auto_approve"

def human_review(state: WealthState) -> WealthState:
    state["approval_status"] = "pending"
    return state

def auto_approve(state: WealthState) -> WealthState:
    state["approval_status"] = "approved"
    return state

graph = StateGraph(WealthState)
graph.add_node("assess_risk", assess_risk)
graph.add_node("human_review", human_review)
graph.add_node("auto_approve", auto_approve)

graph.set_entry_point("assess_risk")
graph.add_conditional_edges("assess_risk", route_for_approval, {
    "human_review": "human_review",
    "auto_approve": "auto_approve",
})
graph.add_edge("human_review", END)
graph.add_edge("auto_approve", END)

app = graph.compile()
  2. Wrap the graph in a service that Kubernetes can run

    Expose the compiled graph behind a small HTTP API. In startups, this is the cleanest boundary between orchestration logic and infrastructure concerns.

from fastapi import FastAPI
from pydantic import BaseModel

# `app` is the compiled LangGraph graph from step 1
app_api = FastAPI()

class RequestPayload(BaseModel):
    client_id: str
    risk_score: int
    requested_action: str

@app_api.post("/wealth/recommendation")
def get_recommendation(payload: RequestPayload):
    initial_state = {
        "client_id": payload.client_id,
        "risk_score": payload.risk_score,
        "requested_action": payload.requested_action,
        "recommendation": None,
        "approval_status": None,
    }
    result = app.invoke(initial_state)
    return result
  3. Load cluster configuration from Kubernetes inside the service

    Use the Kubernetes Python client to read runtime config like namespace, feature flags, or downstream service endpoints. This keeps environment-specific values out of code.

from kubernetes import client, config

def load_k8s_context():
    # Prefer in-cluster config; fall back to local kubeconfig for development
    try:
        config.load_incluster_config()
    except config.ConfigException:
        config.load_kube_config()

def read_runtime_settings(namespace="wealth-ai"):
    load_k8s_context()
    v1 = client.CoreV1Api()
    cm = v1.read_namespaced_config_map("wealth-agent-config", namespace)
    return cm.data

settings = read_runtime_settings()
print(settings.get("RISK_THRESHOLD"))
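
The hardcoded threshold of 70 in `route_for_approval` can be driven by the ConfigMap instead of living in code. A minimal sketch, assuming the ConfigMap data has already been fetched into a dict as above (`route_for_approval_configurable` is a hypothetical helper name):

```python
def route_for_approval_configurable(state: dict, settings: dict) -> str:
    # ConfigMap values are always strings, so parse with a safe default
    threshold = int(settings.get("RISK_THRESHOLD", "70"))
    return "human_review" if state["risk_score"] >= threshold else "auto_approve"
```

When registering it as a conditional edge, bind the settings dict first, e.g. with `functools.partial(route_for_approval_configurable, settings=settings)`, so the graph signature stays unchanged.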
  4. Use Kubernetes APIs to trigger or observe workflow execution

    A practical pattern is to run one agent service per namespace and use Kubernetes metadata to tag requests by tenant or product line. You can also emit events back into the cluster for ops visibility.

from kubernetes import client, config

config.load_kube_config()
batch_v1 = client.BatchV1Api()

job_manifest = client.V1Job(
    metadata=client.V1ObjectMeta(name="wealth-agent-batch"),
    spec=client.V1JobSpec(
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "wealth-agent"}),
            spec=client.V1PodSpec(
                restart_policy="Never",
                containers=[
                    client.V1Container(
                        name="agent",
                        image="your-registry/wealth-agent:latest",
                        env=[
                            client.V1EnvVar(name="NAMESPACE", value="wealth-ai"),
                        ],
                    )
                ],
            ),
        )
    ),
)

batch_v1.create_namespaced_job(namespace="wealth-ai", body=job_manifest)
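
To tag workloads by tenant as suggested above, the Job can carry tenant labels and a unique name. A sketch using a plain dict manifest (which the Python client also accepts as `body`); `build_tenant_job_manifest` is a hypothetical helper name:

```python
import re
import uuid

def build_tenant_job_manifest(tenant: str, image: str) -> dict:
    """Build a Job manifest labeled by tenant for per-tenant ops visibility."""
    # Job names must be DNS-1123 compliant: lowercase alphanumerics and dashes
    safe_tenant = re.sub(r"[^a-z0-9-]", "-", tenant.lower())
    name = f"wealth-agent-{safe_tenant}-{uuid.uuid4().hex[:8]}"
    labels = {"app": "wealth-agent", "tenant": safe_tenant}
    return {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "restartPolicy": "Never",
                    "containers": [
                        {
                            "name": "agent",
                            "image": image,
                            "env": [{"name": "TENANT", "value": safe_tenant}],
                        }
                    ],
                },
            }
        },
    }
```

The tenant label lets you filter with `kubectl get jobs -l tenant=acme-advisors`, and the random suffix avoids name collisions when the same tenant triggers multiple runs.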
  5. Deploy with ConfigMaps and Secrets for production settings

    Keep model endpoints, thresholds, and credentials in Kubernetes primitives. The LangGraph app reads them at startup and during refresh cycles.

from kubernetes import client, config

config.load_kube_config()
v1 = client.CoreV1Api()

config_map = client.V1ConfigMap(
    metadata=client.V1ObjectMeta(name="wealth-agent-config"),
    data={
        "RISK_THRESHOLD": "70",
        "APPROVAL_QUEUE": "advisor-review",
        "MODEL_PROVIDER": "openai",
    },
)

v1.create_namespaced_config_map(namespace="wealth-ai", body=config_map)

secret = client.V1Secret(
    metadata=client.V1ObjectMeta(name="wealth-agent-secret"),
    string_data={
        "OPENAI_API_KEY": "replace-me",
    },
)

v1.create_namespaced_secret(namespace="wealth-ai", body=secret)
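
For the app to actually read these values at startup, the Deployment needs to inject them into the container. A sketch as a dict manifest, assuming the Deployment is named wealth-agent and the API listens on port 8000 (`build_agent_deployment` is a hypothetical helper name):

```python
def build_agent_deployment(namespace: str = "wealth-ai") -> dict:
    """Deployment that exposes the ConfigMap and Secret created above
    as environment variables via envFrom."""
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": "wealth-agent", "namespace": namespace},
        "spec": {
            "replicas": 2,
            "selector": {"matchLabels": {"app": "wealth-agent"}},
            "template": {
                "metadata": {"labels": {"app": "wealth-agent"}},
                "spec": {
                    "containers": [
                        {
                            "name": "agent",
                            "image": "your-registry/wealth-agent:latest",
                            "ports": [{"containerPort": 8000}],
                            # envFrom turns every key in the ConfigMap and
                            # Secret into an env var inside the container
                            "envFrom": [
                                {"configMapRef": {"name": "wealth-agent-config"}},
                                {"secretRef": {"name": "wealth-agent-secret"}},
                            ],
                        }
                    ]
                },
            },
        },
    }
```

With envFrom, the service can read `os.environ["RISK_THRESHOLD"]` and `os.environ["OPENAI_API_KEY"]` directly, without extra API calls at runtime.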

Testing the Integration

Run a local invocation first, then verify Kubernetes objects exist in-cluster.

from kubernetes import client, config

test_state = {
    "client_id": "c-10021",
    "risk_score": 82,
    "requested_action": "rebalance_portfolio",
    "recommendation": None,
    "approval_status": None,
}

# `app` is the compiled LangGraph graph from step 1
result = app.invoke(test_state)
print(result)

config.load_kube_config()
v1 = client.CoreV1Api()
cm = v1.read_namespaced_config_map("wealth-agent-config", namespace="wealth-ai")
print(cm.data["RISK_THRESHOLD"])

Expected output:

{'client_id': 'c-10021', 'risk_score': 82, 'requested_action': 'rebalance_portfolio', 'recommendation': 'balanced', 'approval_status': 'pending'}
70
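
Beyond the end-to-end check, the node and routing functions are plain Python and can be unit-tested in isolation, which catches threshold regressions before a deploy. A sketch, with the step 1 logic re-declared so the snippet stands alone:

```python
# Routing and risk logic from step 1, re-declared for a standalone test
def route_for_approval(state: dict) -> str:
    return "human_review" if state["risk_score"] >= 70 else "auto_approve"

def assess_risk(state: dict) -> dict:
    state["recommendation"] = "conservative" if state["risk_score"] < 40 else "balanced"
    return state

# Boundary checks: 69 auto-approves, 70 goes to a human
assert route_for_approval({"risk_score": 69}) == "auto_approve"
assert route_for_approval({"risk_score": 70}) == "human_review"
assert assess_risk({"risk_score": 39})["recommendation"] == "conservative"
assert assess_risk({"risk_score": 40})["recommendation"] == "balanced"
```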

Real-World Use Cases

  • Advisor-assisted portfolio rebalancing

    • LangGraph handles policy checks and approval routing.
    • Kubernetes runs isolated tenant workloads per advisory team or region.
  • Client onboarding automation

    • Use the graph to collect KYC inputs, score suitability, and branch to manual review.
    • Use Kubernetes Jobs for batch verification tasks like document parsing.
  • Risk monitoring agents

    • Run continuous monitoring workflows that react to market events.
    • Scale consumers horizontally in Kubernetes when event volume spikes.
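
The horizontal-scaling pattern in the last use case maps naturally to a HorizontalPodAutoscaler. A sketch as a dict manifest, assuming the Deployment is named wealth-agent; CPU utilization stands in here for whatever metric actually tracks event volume (`build_agent_hpa` is a hypothetical helper name):

```python
def build_agent_hpa(namespace: str = "wealth-ai") -> dict:
    """HorizontalPodAutoscaler (autoscaling/v2) for the wealth-agent
    Deployment; swap the CPU metric for a custom event-volume metric
    if your cluster exposes one."""
    return {
        "apiVersion": "autoscaling/v2",
        "kind": "HorizontalPodAutoscaler",
        "metadata": {"name": "wealth-agent", "namespace": namespace},
        "spec": {
            "scaleTargetRef": {
                "apiVersion": "apps/v1",
                "kind": "Deployment",
                "name": "wealth-agent",
            },
            "minReplicas": 2,
            "maxReplicas": 10,
            "metrics": [
                {
                    "type": "Resource",
                    "resource": {
                        "name": "cpu",
                        "target": {"type": "Utilization", "averageUtilization": 70},
                    },
                }
            ],
        },
    }
```

Keeping minReplicas at 2 preserves availability during rollouts, while maxReplicas caps spend, which matters when a startup's cluster budget is tight.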

Keep learning

By Cyprian Aarons, AI Consultant at Topiax.

Want the complete 8-step roadmap?

Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
