How to Integrate Azure OpenAI for Pension Funds with Cosmos DB for Production AI

By Cyprian Aarons · Updated 2026-04-21
Tags: azure-openai-for-pension-funds, cosmosdb, production-ai

Opening

If you are building AI for pension funds, you need two things: a model that can reason over policy-heavy questions, and a datastore that can keep member context, case history, and audit trails. Azure OpenAI gives you the reasoning layer; Cosmos DB gives you the operational memory for production AI.

That combination is useful for pension admin assistants, document Q&A over scheme rules, member servicing workflows, and regulated case triage. The pattern is simple: retrieve the right member or policy context from Cosmos DB, send it to Azure OpenAI, then persist the model output back into Cosmos DB with traceability.

Prerequisites

  • An Azure subscription with:
    • Azure OpenAI resource deployed
    • Cosmos DB account created
  • A deployed Azure OpenAI model:
    • Example: gpt-4o-mini or another chat-completions deployment name
  • A Cosmos DB database and container:
    • Example database: pension_ai
    • Example container: cases
    • Partition key: /memberId
  • Python 3.10+
  • Installed packages:
    pip install openai azure-cosmos python-dotenv
    
  • Environment variables set (see the loader sketch after this list):
    • AZURE_OPENAI_ENDPOINT
    • AZURE_OPENAI_API_KEY
    • AZURE_OPENAI_DEPLOYMENT
    • COSMOS_ENDPOINT
    • COSMOS_KEY
    • COSMOS_DATABASE
    • COSMOS_CONTAINER
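
If you keep these in a local .env file, a small loader can fail fast when something is missing. A minimal sketch, assuming python-dotenv from the prerequisites and a .env file in the working directory; the variable names match the list above.

import os
from dotenv import load_dotenv

load_dotenv()  # read the .env file into the process environment

REQUIRED_VARS = [
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_DEPLOYMENT",
    "COSMOS_ENDPOINT",
    "COSMOS_KEY",
    "COSMOS_DATABASE",
    "COSMOS_CONTAINER",
]

missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
if missing:
    raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")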

Integration Steps

  1. Initialize both clients

    Start by wiring up the Azure OpenAI client and the Cosmos DB client in one place. Keep this in a shared module so your agent services do not duplicate connection logic.

    import os
    from dotenv import load_dotenv
    from openai import AzureOpenAI
    from azure.cosmos import CosmosClient

    # Load settings from a local .env file (python-dotenv is in the prerequisites).
    load_dotenv()
    
    azure_openai_client = AzureOpenAI(
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-02-15-preview",
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    )
    
    cosmos_client = CosmosClient(
        url=os.environ["COSMOS_ENDPOINT"],
        credential=os.environ["COSMOS_KEY"]
    )
    
    database = cosmos_client.get_database_client(os.environ["COSMOS_DATABASE"])
    container = database.get_container_client(os.environ["COSMOS_CONTAINER"])
    
  2. Fetch pension member context from Cosmos DB

    In production AI, do not send raw user input straight to the model. Pull the member record, recent cases, and policy metadata first.

    def get_member_context(member_id: str) -> dict:
        # memberId is the container's partition key (see prerequisites), so the
        # query can stay inside a single partition instead of a cross-partition scan.
        query = """
        SELECT TOP 5 * FROM c
        WHERE c.memberId = @memberId
        ORDER BY c.updatedAt DESC
        """

        items = list(container.query_items(
            query=query,
            parameters=[{"name": "@memberId", "value": member_id}],
            partition_key=member_id
        ))

        return {
            "memberId": member_id,
            "records": items
        }
    
    context = get_member_context("M12345")
    print(context)
    
  3. Call Azure OpenAI with retrieved context

    Build a prompt that includes only the minimum necessary pension data. For regulated workflows, keep the prompt structured and the temperature low so responses stay consistent and easy to review.

    def generate_response(member_context: dict, user_question: str) -> str:
        system_message = (
            "You are a pension administration assistant. "
            "Answer using only the provided context. "
            "If information is missing, say what is missing."
        )
    
        response = azure_openai_client.chat.completions.create(
            model=os.environ["AZURE_OPENAI_DEPLOYMENT"],
            messages=[
                {"role": "system", "content": system_message},
                {
                    "role": "user",
                    "content": f"""
                    Member context:
                    {member_context}
    
                    Question:
                    {user_question}
                    """
                }
            ],
            temperature=0.2,
            max_tokens=500,
        )
    
        return response.choices[0].message.content
    
    answer = generate_response(context, "What is the status of this member's transfer request?")
    print(answer)
    
  4. Persist the AI result back into Cosmos DB

    Store both the input and output so your operations team can audit decisions later. Use an explicit document shape with timestamps and model metadata.

    from datetime import datetime, timezone
    
    def save_ai_case(member_id: str, question: str, answer: str) -> None:
        doc = {
            "id": f"{member_id}-{datetime.now(timezone.utc).timestamp()}",
            "memberId": member_id,
            "question": question,
            "answer": answer,
            "model": os.environ["AZURE_OPENAI_DEPLOYMENT"],
            "createdAt": datetime.now(timezone.utc).isoformat(),
            "type": "ai_response"
        }
    
        container.upsert_item(doc)
    
    save_ai_case(
        member_id="M12345",
        question="What is the status of this member's transfer request?",
        answer=answer
    )
    
  5. Wrap it in one production-friendly function

    This is the pattern your agent service should expose: retrieve context, call the model, persist output.

    def handle_member_query(member_id: str, question: str) -> str:
        member_context = get_member_context(member_id)
        answer = generate_response(member_context, question)
        save_ai_case(member_id, question, answer)
        return answer

    final_answer = handle_member_query(
        "M12345",
        "Summarize any outstanding actions on this pension case."
    )

    print(final_answer)
    

Testing the Integration

Use a known member ID with at least one document in Cosmos DB. Then run a direct end-to-end call and confirm that both the model response and persistence work.
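
If the container is empty, you can seed a test document first. This is a minimal sketch: the field names mirror what the Step 2 query reads (memberId, updatedAt), and the values are illustrative.

seed_doc = {
    "id": "M12345-case-001",
    "memberId": "M12345",
    "type": "case",
    "caseType": "transfer_request",
    "status": "pending_verification",
    "updatedAt": "2026-04-20T10:00:00+00:00"
}

container.upsert_item(seed_doc)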

test_member_id = "M12345"
test_question = "Summarize this member's latest pension case status."

result = handle_member_query(test_member_id, test_question)
print("MODEL OUTPUT:")
print(result)

saved_docs = list(container.query_items(
    query="SELECT * FROM c WHERE c.memberId = @memberId AND c.type = 'ai_response'",
    parameters=[{"name": "@memberId", "value": test_member_id}],
    enable_cross_partition_query=True
))

print(f"Saved AI docs: {len(saved_docs)}")

Expected output:

MODEL OUTPUT:
The latest case shows a transfer request pending verification...
Saved AI docs: 1

If Saved AI docs stays at 0, your write path is broken. If the model returns empty or generic text, your prompt is probably missing usable context from Cosmos DB.
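
To see which side is failing, inspect the retrieval step on its own. A minimal sketch reusing get_member_context from Step 2:

debug_context = get_member_context("M12345")

if not debug_context["records"]:
    # Nothing retrieved: the prompt will contain no usable context, so check the
    # member ID, the partition key value, and the container contents first.
    print("No documents found for this member - check the Cosmos DB read path.")
else:
    print(f"Retrieved {len(debug_context['records'])} documents for the prompt.")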

Real-World Use Cases

  • Member servicing assistant

    • Answer questions about contributions, transfers, retirement options, and case status using live data from Cosmos DB.
  • Case triage for operations teams

    • Classify incoming pension requests by urgency or topic, then store routing decisions and reasoning for audit (a sketch of this pattern follows the list).
  • Policy-aware document Q&A

    • Let staff ask questions over scheme rules or benefit guides while keeping source documents indexed in Cosmos DB for retrieval and traceability.
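
As an illustration of the triage pattern, here is a minimal sketch that reuses the azure_openai_client and container from the integration steps. The category list, field names, and the json_object response format are assumptions for this example, not part of the core flow above.

import json
import os
from datetime import datetime, timezone

TRIAGE_CATEGORIES = ["transfer", "contribution", "retirement", "complaint", "other"]

def triage_request(member_id: str, request_text: str) -> dict:
    # Ask the model for a structured routing decision with a short rationale.
    response = azure_openai_client.chat.completions.create(
        model=os.environ["AZURE_OPENAI_DEPLOYMENT"],
        messages=[
            {
                "role": "system",
                "content": (
                    "You triage pension service requests. Reply with JSON containing "
                    f"category (one of {TRIAGE_CATEGORIES}), urgency (low, medium, high), "
                    "and reasoning (one sentence)."
                ),
            },
            {"role": "user", "content": request_text},
        ],
        temperature=0,
        response_format={"type": "json_object"},  # requires a deployment/API version that supports it
    )

    decision = json.loads(response.choices[0].message.content)

    # Store the routing decision next to the member's other documents for audit.
    container.upsert_item({
        "id": f"{member_id}-triage-{datetime.now(timezone.utc).timestamp()}",
        "memberId": member_id,
        "type": "triage_decision",
        "request": request_text,
        "decision": decision,
        "model": os.environ["AZURE_OPENAI_DEPLOYMENT"],
        "createdAt": datetime.now(timezone.utc).isoformat(),
    })

    return decision

routing = triage_request("M12345", "I want to transfer my pension to another provider urgently.")
print(routing)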

Keep learning

By Cyprian Aarons, AI Consultant at Topiax.

Want the complete 8-step roadmap?

Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.

Get the Starter Kit
