How to Integrate Azure OpenAI with CosmosDB for Production Lending AI

By Cyprian Aarons · Updated 2026-04-21
Tags: azure-openai-for-lending, cosmosdb, production-ai

Why this integration matters

If you’re building lending workflows, the hard part is not generating text. It’s grounding every response in customer data, application state, and policy rules so the agent can explain decisions without hallucinating.

Azure OpenAI gives you the reasoning layer for borrower-facing and internal assistant flows. CosmosDB gives you low-latency storage for applications, documents, conversation state, and retrieval-ready context. Together, they let you build lending agents that answer with context, persist state across sessions, and support production-grade auditability.

Prerequisites

  • An Azure subscription with:
    • Azure OpenAI resource deployed
    • Cosmos DB account provisioned
  • An Azure OpenAI deployment name for a chat model
  • A Cosmos DB database and container ready for:
    • loan applications
    • customer profiles
    • conversation memory or case notes
  • Python 3.10+
  • Installed packages:
    • openai
    • azure-cosmos
    • python-dotenv
  • Environment variables set:
    • AZURE_OPENAI_ENDPOINT
    • AZURE_OPENAI_API_KEY
    • AZURE_OPENAI_DEPLOYMENT
    • COSMOS_ENDPOINT
    • COSMOS_KEY
    • COSMOS_DATABASE
    • COSMOS_CONTAINER

Install dependencies:

pip install openai azure-cosmos python-dotenv
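
A sample .env file can make the variable list above concrete. The values below are placeholders to substitute with your own resource details; the database and container names are only examples:

```shell
# .env — placeholder values, substitute your own resource details
AZURE_OPENAI_ENDPOINT=https://<your-resource>.openai.azure.com/
AZURE_OPENAI_API_KEY=<your-azure-openai-key>
AZURE_OPENAI_DEPLOYMENT=<your-chat-deployment-name>
COSMOS_ENDPOINT=https://<your-account>.documents.azure.com:443/
COSMOS_KEY=<your-cosmos-primary-key>
COSMOS_DATABASE=lending
COSMOS_CONTAINER=applications
```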

Integration Steps

  1. Set up configuration and clients

Use environment variables so your code stays deployable across dev, staging, and prod.

import os
from dotenv import load_dotenv
from openai import AzureOpenAI
from azure.cosmos import CosmosClient

load_dotenv()

azure_openai_client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-15-preview",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

cosmos_client = CosmosClient(
    url=os.environ["COSMOS_ENDPOINT"],
    credential=os.environ["COSMOS_KEY"],
)

database_name = os.environ["COSMOS_DATABASE"]
container_name = os.environ["COSMOS_CONTAINER"]

database = cosmos_client.get_database_client(database_name)
container = database.get_container_client(container_name)

deployment_name = os.environ["AZURE_OPENAI_DEPLOYMENT"]

  2. Define a lending record schema in CosmosDB

For lending agents, store one document per application or case. Keep the partition key stable, usually customerId or applicationId.

from datetime import datetime, timezone

def upsert_loan_application(application: dict):
    doc = {
        "id": application["applicationId"],
        "customerId": application["customerId"],
        "productType": application["productType"],
        "requestedAmount": application["requestedAmount"],
        "income": application["income"],
        "debtToIncome": application["debtToIncome"],
        "status": application.get("status", "received"),
        "createdAt": datetime.now(timezone.utc).isoformat()
    }
    container.upsert_item(doc)
    return doc


sample_application = {
    "applicationId": "loan-1001",
    "customerId": "cust-7788",
    "productType": "personal-loan",
    "requestedAmount": 15000,
    "income": 72000,
    "debtToIncome": 0.31
}

saved_doc = upsert_loan_application(sample_application)
print(saved_doc)
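
Before upserting, it can also help to validate the record so malformed applications never reach the container. A minimal sketch; the field list mirrors the schema above, and the helper name and rules are illustrative, not part of any SDK:

```python
# Required business fields for a loan application document (mirrors the schema above).
REQUIRED_FIELDS = (
    "applicationId", "customerId", "productType",
    "requestedAmount", "income", "debtToIncome",
)

def validate_application(application: dict) -> list:
    """Return a list of problems; an empty list means the record is safe to upsert."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS if f not in application]
    dti = application.get("debtToIncome")
    # Debt-to-income is a ratio, so flag values outside [0, 1].
    if isinstance(dti, (int, float)) and not 0 <= dti <= 1:
        problems.append("debtToIncome must be between 0 and 1")
    return problems
```

Call it before `upsert_loan_application` and skip the write (or raise) when the list is non-empty.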

  3. Retrieve application context before calling Azure OpenAI

This is the core pattern: fetch authoritative data from CosmosDB first, then send only relevant fields to the model.

def get_application_context(application_id: str):
    query = """
    SELECT * FROM c
    WHERE c.id = @application_id
    """
    items = list(container.query_items(
        query=query,
        parameters=[{"name": "@application_id", "value": application_id}],
        enable_cross_partition_query=True
    ))
    return items[0] if items else None


context = get_application_context("loan-1001")
print(context)
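
Documents read back from Cosmos DB include system properties (`_rid`, `_self`, `_etag`, `_attachments`, `_ts`) that add no value in a prompt. One way to honor "send only relevant fields" is to strip them first; the helper name is illustrative:

```python
# Cosmos DB attaches these system properties to every stored document;
# strip them so only business fields reach the model.
SYSTEM_PROPERTIES = {"_rid", "_self", "_etag", "_attachments", "_ts"}

def trim_context(doc: dict) -> dict:
    """Return a copy of the document without Cosmos DB system properties."""
    return {k: v for k, v in doc.items() if k not in SYSTEM_PROPERTIES}
```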

  4. Generate a lending response with Azure OpenAI using stored context

Keep prompts short and structured, and instruct the model to reason from the provided fields only.

def generate_lending_summary(application_context: dict):
    messages = [
        {
            "role": "system",
            "content": (
                "You are a lending operations assistant. "
                "Use only the provided application data. "
                "Do not invent missing facts."
            )
        },
        {
            "role": "user",
            "content": f"""
Summarize this loan application for an underwriter.

Application data:
{application_context}
"""
        }
    ]

    response = azure_openai_client.chat.completions.create(
        model=deployment_name,
        messages=messages,
        temperature=0.2,
        max_tokens=300
    )

    return response.choices[0].message.content


if context:
    summary = generate_lending_summary(context)
    print(summary)
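
Interpolating the raw Python dict into the prompt works, but serializing it as JSON gives the model a more predictable structure and keeps field order stable across runs. A small sketch; the helper name is illustrative:

```python
import json

def format_context(application_context: dict) -> str:
    # sort_keys makes the prompt deterministic across runs; default=str
    # defensively handles datetimes or other non-JSON-native values.
    return json.dumps(application_context, indent=2, sort_keys=True, default=str)
```

Use `format_context(application_context)` in place of the bare f-string interpolation inside the user message.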

  5. Persist the model output back into CosmosDB

In production AI systems, don’t just return the answer. Store it for audit trails, human review, and later retrieval.

from datetime import datetime, timezone

def save_model_output(application_id: str, summary: str):
    record = {
        "id": f"{application_id}-summary",
        "applicationId": application_id,
        "type": "underwriter-summary",
        "summary": summary,
        "updatedAt": datetime.now(timezone.utc).isoformat()
    }
    container.upsert_item(record)
    return record


if context:
    # `summary` comes from the previous step's generate_lending_summary call
    saved_summary = save_model_output("loan-1001", summary)
    print(saved_summary)
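
For stronger audit trails, the stored record can also carry a fingerprint of the exact context the model saw, so reviewers can later verify which data produced a given summary. A hedged sketch using SHA-256; the helper name and any extra record field are illustrative:

```python
import hashlib
import json

def context_fingerprint(application_context: dict) -> str:
    """Hash a canonical JSON form of the context so identical inputs match."""
    canonical = json.dumps(application_context, sort_keys=True, default=str)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()
```

Store the result alongside the summary (for example in a `contextHash` field) when calling `save_model_output`.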

Testing the Integration

Run a single end-to-end check: write an application to CosmosDB, read it back, send it to Azure OpenAI, then store the result.

def test_integration():
    app = upsert_loan_application({
        "applicationId": "loan-test-01",
        "customerId": "cust-test-01",
        "productType": "auto-loan",
        "requestedAmount": 18000,
        "income": 85000,
        "debtToIncome": 0.27,
        "status": "pending-review"
    })

    ctx = get_application_context("loan-test-01")
    assert ctx is not None

    result = generate_lending_summary(ctx)
    assert isinstance(result, str) and len(result) > 0

    saved = save_model_output("loan-test-01", result)
    assert saved["applicationId"] == "loan-test-01"

test_integration()
print("Integration test passed")

Expected output:

Integration test passed

If you want a quick manual check, print the summary string as well. You should see a grounded underwriting note that references the loan amount, income, debt-to-income ratio, and status from CosmosDB.

Real-World Use Cases

  • Loan pre-screening assistant

    • Pull applicant data from CosmosDB
    • Ask Azure OpenAI to generate a concise eligibility summary for ops teams
  • Customer-facing status bot

    • Store conversation history and case state in CosmosDB
    • Use Azure OpenAI to answer “What’s missing from my application?” based on current records
  • Underwriter copilot

    • Retrieve income docs, bureau notes, and prior decisions from CosmosDB
    • Have Azure OpenAI draft decision rationales that reviewers can edit before submission
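
For the status-bot case, conversation memory fits the same pattern: one document per turn, partitioned by customer. A minimal sketch of such a record builder; the function name and field layout are illustrative:

```python
from datetime import datetime, timezone

def build_turn_record(customer_id: str, role: str, content: str) -> dict:
    """Build one conversation-turn document for Cosmos DB."""
    ts = datetime.now(timezone.utc).isoformat()
    return {
        "id": f"{customer_id}-{ts}",   # unique per turn
        "customerId": customer_id,     # partition key keeps one customer's turns together
        "type": "conversation-turn",
        "role": role,                  # "user" or "assistant"
        "content": content,
        "createdAt": ts,
    }
```

Upsert each turn with `container.upsert_item(...)` and query by `customerId` to rebuild the chat history before the next model call.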

Keep learning

By Cyprian Aarons, AI Consultant at Topiax.

Want the complete 8-step roadmap?

Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
