CrewAI Tutorial (TypeScript): connecting to PostgreSQL for beginners
This tutorial shows you how to connect a CrewAI TypeScript project to PostgreSQL, store agent outputs in a database, and read them back for verification. You’d use this when you want your agents to persist state, log runs, or share structured data with other services instead of keeping everything in memory.
What You'll Need
- Node.js 18+ and npm
- A PostgreSQL instance running locally or in the cloud
- A PostgreSQL connection string, for example: `postgresql://postgres:postgres@localhost:5432/crewai_demo`
- A CrewAI TypeScript project already initialized
- These packages: `crewai`, `pg`, `dotenv`, `typescript`, and `tsx` or `ts-node` for running TypeScript directly
- Basic familiarity with CrewAI agents, tasks, and crews
- A `.env` file for secrets
Step-by-Step
- Install the dependencies and set up your environment variables. Keep the database URL out of source control and make sure your PostgreSQL user can create tables in the target database.

```bash
npm install crewai pg dotenv
npm install -D typescript tsx @types/node @types/pg
```

Then create a `.env` file in the project root:

```env
OPENAI_API_KEY=your_openai_key_here
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/crewai_demo
```
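A missing `DATABASE_URL` tends to surface later as a confusing connection error, so it can help to fail fast at startup. A minimal sketch; the `requireEnv` helper name is illustrative, not part of CrewAI or pg:

```typescript
// Hypothetical helper: fail fast if a required environment variable is missing.
export function requireEnv(
  name: string,
  env: Record<string, string | undefined> = process.env
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example: const databaseUrl = requireEnv("DATABASE_URL");
```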
- Create a small PostgreSQL helper that opens a connection pool and creates a table for agent results. This keeps database logic separate from your CrewAI code and makes it easy to reuse later.

```typescript
// src/db.ts
import { Pool } from "pg";
import "dotenv/config";

export const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
});

export async function initDb() {
  await pool.query(`
    CREATE TABLE IF NOT EXISTS crew_runs (
      id SERIAL PRIMARY KEY,
      task_name TEXT NOT NULL,
      result TEXT NOT NULL,
      created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
    )
  `);
}
```
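Before running a full crew, you may want to verify that the pool can actually reach PostgreSQL. A minimal sketch; `SELECT 1` is a conventional no-op query, and the `Queryable` interface is only there so the function accepts either a real pg `Pool` or a test fake:

```typescript
// Anything with a query() method works here: a pg Pool, a Client, or a fake in tests.
interface Queryable {
  query(text: string): Promise<unknown>;
}

// Returns true if a trivial query succeeds, false on any connection error.
export async function checkConnection(db: Queryable): Promise<boolean> {
  try {
    await db.query("SELECT 1");
    return true;
  } catch {
    return false;
  }
}

// Usage: if (!(await checkConnection(pool))) throw new Error("PostgreSQL unreachable");
```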
- Define your agent, task, and crew using real CrewAI imports. The key idea is simple: let the crew generate structured output, then save that output to PostgreSQL after the run completes.

```typescript
// src/crew.ts
import { Agent, Task, Crew } from "crewai";

export const analyst = new Agent({
  role: "PostgreSQL Analyst",
  goal: "Summarize customer support data clearly",
  backstory: "You turn raw operational data into short business summaries.",
});

export const summarizeTask = new Task({
  description: "Write a one-paragraph summary of the latest support trends.",
  expectedOutput: "A concise summary of support trends.",
  agent: analyst,
});

export const crew = new Crew({
  agents: [analyst],
  tasks: [summarizeTask],
});
```
- Run the crew, then insert the result into PostgreSQL. This is the part beginners usually miss: CrewAI handles the reasoning, while your application code handles persistence.

```typescript
// src/main.ts
import "dotenv/config";
import { initDb, pool } from "./db";
import { crew } from "./crew";

async function main() {
  await initDb();
  const result = await crew.kickoff();
  const output = String(result);
  await pool.query(
    "INSERT INTO crew_runs (task_name, result) VALUES ($1, $2)",
    ["support_summary", output]
  );
  console.log("Saved crew output to PostgreSQL:", output);
}

main()
  .catch(console.error)
  .finally(() => pool.end());
```
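If you later need the inserted row's id (for example, to link tool calls or logs to a specific run), PostgreSQL's `RETURNING` clause hands it back from the same `INSERT`. A sketch under that idea; `insertRun` is an illustrative helper name, and the `Queryable` interface just abstracts over a pg `Pool` so the logic can be tested with a fake:

```typescript
interface Queryable {
  query(
    text: string,
    values?: unknown[]
  ): Promise<{ rows: Array<{ id: number }> }>;
}

// Insert one crew result and return the generated primary key.
export async function insertRun(
  db: Queryable,
  taskName: string,
  result: string
): Promise<number> {
  const { rows } = await db.query(
    "INSERT INTO crew_runs (task_name, result) VALUES ($1, $2) RETURNING id",
    [taskName, result]
  );
  return rows[0].id;
}

// Usage with the real pool: const id = await insertRun(pool, "support_summary", output);
```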
- Add a reader script so you can verify what was stored. If you can write and read rows cleanly, your PostgreSQL connection is working correctly.

```typescript
// src/read.ts
import "dotenv/config";
import { pool } from "./db";

async function main() {
  const { rows } = await pool.query(
    "SELECT id, task_name, result, created_at FROM crew_runs ORDER BY id DESC LIMIT 5"
  );
  console.table(rows);
}

main()
  .catch(console.error)
  .finally(() => pool.end());
```
Testing It
Run your main script first with `npx tsx src/main.ts`. If everything is wired correctly, you should see the crew output printed in the terminal and a new row inserted into `crew_runs`.
Then run your reader script with `npx tsx src/read.ts` and confirm that the row appears in the table. If you get a connection error, check `DATABASE_URL`, verify PostgreSQL is running, and make sure the target database exists.
If the insert fails because of permissions, your database user likely cannot create tables or write rows. Fix that at the PostgreSQL level before debugging anything in TypeScript.
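As a rough sketch of that PostgreSQL-side fix, you can run statements like the following as a superuser in `psql`. The role name `crewai_app` is illustrative; substitute your own application user:

```sql
-- Allow the application role to use the schema and create tables in it.
GRANT USAGE, CREATE ON SCHEMA public TO crewai_app;

-- If the table already exists, grant read/write access on it as well.
GRANT SELECT, INSERT ON crew_runs TO crewai_app;

-- SERIAL columns draw ids from a sequence, which needs its own grant for INSERTs.
GRANT USAGE ON SEQUENCE crew_runs_id_seq TO crewai_app;
```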
Next Steps
- Store structured JSON instead of plain text by changing the column type to `JSONB`
- Add retries and transaction handling around inserts for production workloads
- Use separate tables for runs, tasks, tool calls, and audit logs
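If you do switch the `result` column to `JSONB`, you can bind plain objects directly: node-postgres serializes object parameters to JSON for you. A sketch under that assumption; `buildRunRecord` and `saveStructuredRun` are illustrative names, not CrewAI or pg APIs:

```typescript
// Assumes you have run:
//   ALTER TABLE crew_runs ALTER COLUMN result TYPE JSONB USING result::jsonb;

interface Queryable {
  query(text: string, values?: unknown[]): Promise<unknown>;
}

// Pure helper: shape the crew output into a JSON-serializable record.
export function buildRunRecord(taskName: string, summary: string) {
  return {
    task: taskName,
    summary,
    savedAt: new Date().toISOString(),
  };
}

export async function saveStructuredRun(
  db: Queryable,
  taskName: string,
  summary: string
) {
  // node-postgres JSON.stringifies plain-object parameters bound to jsonb columns.
  await db.query(
    "INSERT INTO crew_runs (task_name, result) VALUES ($1, $2)",
    [taskName, buildRunRecord(taskName, summary)]
  );
}
```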
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.