How to Integrate Azure OpenAI with Azure Cosmos DB for Pension Fund AI Agents

By Cyprian Aarons · Updated 2026-04-21

Why this integration matters

If you’re building AI agents for pension funds, you need two things to work together: a model that can reason over policy, member queries, and document context, and a datastore that can hold structured pension records, retrieval chunks, and agent memory. Azure OpenAI gives you the reasoning layer; Cosmos DB gives you low-latency persistence for member profiles, case state, and retrieval data.

That combination is useful for things like member self-service agents, retirement planning assistants, document Q&A over scheme rules, and internal ops copilots that need to remember prior interactions without stuffing everything into the prompt.

Prerequisites

  • An Azure subscription with:
    • Azure OpenAI resource deployed
    • A chat model deployment name, for example gpt-4o-mini
    • Azure Cosmos DB for NoSQL account
  • Python 3.10+
  • Installed packages:
    • openai
    • azure-cosmos
    • python-dotenv
  • Azure OpenAI endpoint and API key
  • Cosmos DB endpoint and key
  • A Cosmos database and container already created
  • Basic familiarity with Python async or sync HTTP clients

Install dependencies:

pip install openai azure-cosmos python-dotenv

Set environment variables:

export AZURE_OPENAI_ENDPOINT="https://your-openai-resource.openai.azure.com/"
export AZURE_OPENAI_API_KEY="your-key"
export AZURE_OPENAI_DEPLOYMENT="gpt-4o-mini"
export COSMOS_ENDPOINT="https://your-account.documents.azure.com:443/"
export COSMOS_KEY="your-cosmos-key"
export COSMOS_DATABASE="pension-agents"
export COSMOS_CONTAINER="agent-memory"

Integration Steps

1) Initialize Azure OpenAI and Cosmos DB clients

Start by creating both clients in one place. Keep configuration outside code so you can rotate keys without redeploying.

import os
from openai import AzureOpenAI
from azure.cosmos import CosmosClient

azure_openai_client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

cosmos_client = CosmosClient(
    url=os.environ["COSMOS_ENDPOINT"],
    credential=os.environ["COSMOS_KEY"],
)

database = cosmos_client.get_database_client(os.environ["COSMOS_DATABASE"])
container = database.get_container_client(os.environ["COSMOS_CONTAINER"])

deployment_name = os.environ["AZURE_OPENAI_DEPLOYMENT"]
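The prerequisites install python-dotenv, so instead of exporting shell variables you can keep these settings in a local .env file. A minimal sketch, assuming a .env file in the working directory; the require_env helper is illustrative, not part of either SDK:

```python
import os

try:
    from dotenv import load_dotenv  # python-dotenv, listed in the prerequisites
    load_dotenv()  # merge a local .env file into os.environ; no-op if absent
except ImportError:
    pass  # fall back to variables already exported in the shell

def require_env(name: str) -> str:
    """Fail fast with a clear message instead of a bare KeyError at call time."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value
```

Calling require_env("AZURE_OPENAI_API_KEY") and friends at startup surfaces misconfiguration before the first model call rather than midway through a conversation.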

2) Store pension member context in Cosmos DB

For an agent system, keep member state in Cosmos DB as JSON documents. That lets you store current intent, last conversation turn, risk flags, and retrieval metadata.

from datetime import datetime, timezone

member_context = {
    "id": "member-10027",
    "memberId": "10027",
    "scheme": "retirement-income-plan",
    "lastIntent": "benefit_projection",
    "riskFlag": False,
    # timezone-aware replacement for the deprecated datetime.utcnow()
    "updatedAt": datetime.now(timezone.utc).isoformat(),
    "notes": [
        {"role": "user", "content": "Can I retire at 60?"},
        {"role": "assistant", "content": "I need your current balance and contribution history."}
    ]
}

container.upsert_item(member_context)
print("Saved member context")

3) Retrieve the record before calling Azure OpenAI

Before generating a response, pull the latest member data from Cosmos DB. This is the part that makes the agent stateful instead of stateless.

query = """
SELECT * FROM c
WHERE c.memberId = @memberId
"""

params = [{"name": "@memberId", "value": "10027"}]

items = list(
    container.query_items(
        query=query,
        parameters=params,
        enable_cross_partition_query=True,
    )
)

if not items:
    raise ValueError("No member context found")

member = items[0]
print(member["lastIntent"])
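If the container is partitioned on /memberId (an assumption; adjust to your partition key), you can skip the cross-partition query entirely and do a point read, which is the cheapest lookup Cosmos DB offers. A sketch with a hypothetical get_member_context helper, relying on the "member-<memberId>" id convention from step 2:

```python
def get_member_context(container, member_id: str) -> dict:
    """Point read: cheapest lookup when both id and partition key are known.

    Assumes the container is partitioned on /memberId and documents use the
    id convention "member-<memberId>" from step 2.
    """
    return container.read_item(item=f"member-{member_id}", partition_key=member_id)
```

In production, wrap the call in a try/except for azure.cosmos.exceptions.CosmosResourceNotFoundError so a missing member produces a clear error instead of an unhandled exception.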

4) Send retrieved context to Azure OpenAI

Now combine the stored context with the user request. For pension fund workflows, keep the prompt constrained: ask for grounded answers and avoid inventing policy details.

import json

user_question = "Based on my profile, what information do you still need to estimate retirement income?"

messages = [
    {
        "role": "system",
        "content": (
            "You are a pension fund assistant. "
            "Use only the provided member context. "
            "If information is missing, ask for it clearly."
        ),
    },
    {
        "role": "user",
        "content": f"""
Member context:
{json.dumps(member, indent=2, default=str)}

Question:
{user_question}
""".strip(),
    },
]

response = azure_openai_client.chat.completions.create(
    model=deployment_name,
    messages=messages,
    temperature=0.2,
)

answer = response.choices[0].message.content
print(answer)

5) Persist the agent response back into Cosmos DB

Write the model output back into Cosmos DB so future turns can reuse it. In production, store both the answer and metadata like model name, latency, and correlation ID.

from datetime import datetime, timezone

conversation_event = {
    "id": f"event-{datetime.now(timezone.utc).timestamp()}",
    "memberId": member["memberId"],
    "type": "assistant_response",
    "question": user_question,
    "answer": answer,
    "model": deployment_name,
    "createdAt": datetime.now(timezone.utc).isoformat(),
}

container.upsert_item(conversation_event)
print("Saved assistant response")
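The metadata mentioned above (model, latency, correlation ID) can be captured with a small helper. A sketch; the field names and the build_response_event helper are illustrative, not a fixed schema, and a UUID-based id avoids the collision risk of timestamp-only ids:

```python
import time
import uuid
from datetime import datetime, timezone

def build_response_event(member_id: str, question: str, answer: str,
                         model: str, latency_ms: float) -> dict:
    """Assemble a conversation event with audit metadata for Cosmos DB."""
    return {
        "id": f"event-{uuid.uuid4()}",       # uuid avoids timestamp collisions
        "memberId": member_id,
        "type": "assistant_response",
        "question": question,
        "answer": answer,
        "model": model,
        "latencyMs": round(latency_ms, 1),
        "correlationId": str(uuid.uuid4()),  # ties app logs to stored events
        "createdAt": datetime.now(timezone.utc).isoformat(),
    }

# Timing the model call (sketch):
# start = time.perf_counter()
# response = azure_openai_client.chat.completions.create(...)
# event = build_response_event(member["memberId"], user_question, answer,
#                              deployment_name,
#                              (time.perf_counter() - start) * 1000)
# container.upsert_item(event)
```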

Testing the Integration

Run a single end-to-end test: write a record to Cosmos DB, read it back, call Azure OpenAI, then persist the response.

def test_pension_agent_flow():
    test_member_id = "10027"

    records = list(container.query_items(
        query="SELECT * FROM c WHERE c.memberId = @memberId",
        parameters=[{"name": "@memberId", "value": test_member_id}],
        enable_cross_partition_query=True,
    ))

    assert len(records) > 0, f"No record found for {test_member_id}"

    prompt = [
        {"role": "system", "content": "You are a pension support assistant."},
        {"role": "user", "content": f"Summarize this record: {records[0]}"},
    ]

    result = azure_openai_client.chat.completions.create(
        model=deployment_name,
        messages=prompt,
        temperature=0.1,
        max_tokens=200,
    )

    text = result.choices[0].message.content
    assert text is not None and len(text) > 0

    print("Integration OK")
    print(text[:300])

test_pension_agent_flow()

Expected output:

Integration OK
The member record indicates...

Real-World Use Cases

  • Member servicing agent

    • Answer retirement eligibility questions using live member context from Cosmos DB.
    • Store conversation history so follow-up questions stay coherent across sessions.
  • Policy document assistant

    • Index scheme rules or trustee guidance in Cosmos DB.
    • Use Azure OpenAI to summarize clauses, extract action items, or answer policy questions grounded in stored content.
  • Operations copilot

    • Help pension admin teams triage cases.
    • Read case state from Cosmos DB, generate next-step recommendations with Azure OpenAI, then write decisions back for auditability.
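For the cross-session coherence described in the first use case, one approach is to reload recent conversation events and replay them as chat turns ahead of the new question. A sketch, assuming the event shape from step 5; the helper names are illustrative:

```python
def load_recent_events(container, member_id: str, limit: int = 5) -> list:
    """Fetch the newest assistant_response events for a member."""
    return list(container.query_items(
        query=(
            "SELECT TOP @limit c.question, c.answer FROM c "
            "WHERE c.memberId = @memberId AND c.type = 'assistant_response' "
            "ORDER BY c.createdAt DESC"
        ),
        parameters=[
            {"name": "@limit", "value": limit},
            {"name": "@memberId", "value": member_id},
        ],
        enable_cross_partition_query=True,
    ))

def history_to_messages(events: list) -> list:
    """Replay stored Q&A pairs as chat turns, oldest first."""
    messages = []
    for event in reversed(events):  # query returned newest first
        messages.append({"role": "user", "content": event["question"]})
        messages.append({"role": "assistant", "content": event["answer"]})
    return messages
```

Prepending the result of history_to_messages to the messages list (after the system message) keeps follow-up questions coherent without re-sending the entire member record on every turn.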

Keep learning

By Cyprian Aarons, AI Consultant at Topiax.
