Haystack Tutorial (TypeScript): deploying to AWS Lambda for beginners
This tutorial shows how to package a Haystack pipeline written in TypeScript and run it on AWS Lambda behind an API Gateway endpoint. You need this when you want a lightweight, serverless inference or retrieval endpoint without managing servers.
What You'll Need
- Node.js 20+
- AWS account with permission to create:
  - Lambda functions
  - IAM roles
  - API Gateway HTTP APIs
  - CloudWatch logs
- AWS CLI configured locally
- A Haystack project in TypeScript
- An OpenAI API key if you want to use OpenAIChatGenerator
- These npm packages:
  - @haystack-ai/core
  - @haystack-ai/integrations/openai
  - esbuild
  - aws-lambda
  - @types/aws-lambda
Step-by-Step
1. Start with a minimal Haystack pipeline that can answer a question from a single prompt. For Lambda, keep the pipeline stateless and fast to initialize so cold starts stay predictable.
import { Pipeline } from "@haystack-ai/core";
import { OpenAIChatGenerator } from "@haystack-ai/integrations/openai";

// Build the pipeline once at module load so warm invocations reuse it
// and only cold starts pay the initialization cost.
const pipeline = new Pipeline();

const llm = new OpenAIChatGenerator({
  model: "gpt-4o-mini",
  apiKey: process.env.OPENAI_API_KEY!,
});

pipeline.addComponent("llm", llm);

export async function runPrompt(question: string): Promise<string> {
  const result = await pipeline.run({
    llm: {
      messages: [{ role: "user", content: question }],
    },
  });
  return result.llm.replies[0].content;
}
2. Add an AWS Lambda handler that reads JSON from API Gateway, calls the pipeline, and returns a plain HTTP response. Keep the handler thin; all business logic should stay in your Haystack code.
import type { APIGatewayProxyHandlerV2 } from "aws-lambda";
import { runPrompt } from "./pipeline";
export const handler: APIGatewayProxyHandlerV2 = async (event) => {
  try {
    // Parse the body inside the try block so malformed JSON also gets a
    // controlled error response instead of an unhandled throw.
    const body = event.body ? JSON.parse(event.body) : {};
    const question = body.question ?? "What is Haystack?";
    const answer = await runPrompt(question);
    return {
      statusCode: 200,
      headers: { "content-type": "application/json" },
      body: JSON.stringify({ answer }),
    };
  } catch (error) {
    console.error(error);
    return {
      statusCode: 500,
      headers: { "content-type": "application/json" },
      body: JSON.stringify({ error: "Internal Server Error" }),
    };
  }
};
3. Set up a small project structure and install dependencies. This keeps local development and Lambda packaging aligned, which matters when you start debugging runtime differences. A suggested layout follows the commands below.
mkdir haystack-lambda-ts
cd haystack-lambda-ts
npm init -y
npm install @haystack-ai/core @haystack-ai/integrations/openai aws-lambda
npm install -D typescript esbuild @types/aws-lambda @types/node
npx tsc --init --rootDir src --outDir dist --module nodenext --target es2022 \
--moduleResolution nodenext --esModuleInterop true --strict true
mkdir src
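The imports in the earlier snippets assume the pipeline and the handler live side by side in src/. The exact file names are a convention rather than a requirement, but a layout like this lines up with the build script in the next step:

haystack-lambda-ts/
  src/
    pipeline.ts    (the Haystack pipeline and runPrompt)
    handler.ts     (the Lambda handler, esbuild entry point)
  dist/            (bundled output created by the build)
  package.json
  tsconfig.json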
4. Add a build script that bundles your handler for Lambda. Use esbuild so you ship one file instead of a large node_modules tree.
{
"name": "haystack-lambda-ts",
"version": "1.0.0",
"type": "module",
"scripts": {
"build": "esbuild src/handler.ts --bundle --platform=node --target=node20 --format=esm --outfile=dist/index.mjs",
"zip": "cd dist && zip lambda.zip index.mjs"
}
}
5. Build the bundle and prepare the deployment artifact. At this point, your Lambda package is just the bundled handler plus environment variables for secrets. A note on redeploying later changes follows the commands.
npm run build
npm run zip
aws lambda create-function \
--function-name haystack-ts-demo \
--runtime nodejs20.x \
--handler index.handler \
--role arn:aws:iam::<YOUR_ACCOUNT_ID>:role/<YOUR_LAMBDA_ROLE> \
--zip-file fileb://dist/lambda.zip \
--environment Variables="{OPENAI_API_KEY=$OPENAI_API_KEY}"
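When you change the code later, there is no need to recreate the function; rebuilding and pushing the new bundle with update-function-code should be enough:

npm run build && npm run zip
aws lambda update-function-code \
  --function-name haystack-ts-demo \
  --zip-file fileb://dist/lambda.zip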
6. Connect the function to an HTTP API and test it with curl. For beginner setups, an HTTP API is simpler than a REST API and is enough for most agent endpoints. Besides the integration, the commands below also create a route, a default stage, and the permission API Gateway needs to invoke the function.
API_ID=$(aws apigatewayv2 create-api \
--name haystack-ts-api \
--protocol-type HTTP \
--query 'ApiId' \
--output text)
INVOKE_ARN=$(aws lambda get-function \
--function-name haystack-ts-demo \
--query 'Configuration.FunctionArn' \
--output text)
aws apigatewayv2 create-integration \
--api-id "$API_ID" \
--integration-type AWS_PROXY \
--integration-uri "$INVOKE_ARN" \
--payload-format-version '2.0'
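# The integration alone cannot be reached from the internet: the API still needs a
# route, a deployed stage, and permission to invoke the function. A setup along
# these lines should be enough for the curl test that follows.
INTEGRATION_ID=$(aws apigatewayv2 get-integrations \
  --api-id "$API_ID" \
  --query 'Items[0].IntegrationId' \
  --output text)
aws apigatewayv2 create-route \
  --api-id "$API_ID" \
  --route-key 'POST /' \
  --target "integrations/$INTEGRATION_ID"
aws apigatewayv2 create-stage \
  --api-id "$API_ID" \
  --stage-name '$default' \
  --auto-deploy
aws lambda add-permission \
  --function-name haystack-ts-demo \
  --statement-id apigw-invoke \
  --action lambda:InvokeFunction \
  --principal apigateway.amazonaws.com \
  --source-arn "arn:aws:execute-api:<REGION>:<YOUR_ACCOUNT_ID>:$API_ID/*"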
curl -X POST "https://$API_ID.execute-api.<REGION>.amazonaws.com/" \
-H 'content-type: application/json' \
-d '{"question":"Give me one sentence about Haystack."}'
Testing It
First, test the handler locally before deploying by running the compiled file with Node and mocking an event if needed. Then check CloudWatch logs after your first real invocation; Lambda failures usually show up there immediately.
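One way to do that local check, assuming the bundle produced by the build step, is a small driver script at the project root (test-local.mjs is just an illustrative name) that feeds the handler a fake API Gateway v2 event:

// test-local.mjs -- run with: OPENAI_API_KEY=... node test-local.mjs
import { handler } from "./dist/index.mjs";

// Minimal stand-in for an API Gateway v2 proxy event; only the fields the handler reads.
const event = {
  body: JSON.stringify({ question: "Give me one sentence about Haystack." }),
};

const response = await handler(event, {}, () => {});
console.log(response.statusCode, response.body);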
If you get a timeout, increase the function timeout to at least 15 seconds while testing. If you get a module import error, verify that your bundle output matches the Lambda handler name and that you built for Node.js ESM correctly.
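Raising the timeout is a single CLI call; 15 seconds is only a testing value, so adjust it once you know your model's real latency:

aws lambda update-function-configuration \
  --function-name haystack-ts-demo \
  --timeout 15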
A good smoke test is sending the same question three times and confirming you get consistent JSON responses with status code 200. If the response shape changes, fix it before adding more components like retrievers or tools.
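One way to run that smoke test from the shell, reusing the same endpoint as the earlier curl call:

for i in 1 2 3; do
  curl -s -w '\n%{http_code}\n' \
    -X POST "https://$API_ID.execute-api.<REGION>.amazonaws.com/" \
    -H 'content-type: application/json' \
    -d '{"question":"Give me one sentence about Haystack."}'
done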
Next Steps
- Add a retriever component and move from single-prompt answers to RAG.
- Store secrets in AWS Secrets Manager instead of Lambda environment variables (a sketch follows this list).
- Put API Gateway behind authentication if this endpoint will be used by internal teams or customers.
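For the Secrets Manager idea, a minimal sketch with the AWS SDK for JavaScript v3 could look like the following. The secret name haystack/openai-api-key is an assumption, and the function's IAM role needs secretsmanager:GetSecretValue on that secret; you would then pass the returned value to OpenAIChatGenerator instead of reading process.env.

import {
  SecretsManagerClient,
  GetSecretValueCommand,
} from "@aws-sdk/client-secrets-manager";

const client = new SecretsManagerClient({});

// Cache the key after the first lookup so warm invocations skip the extra call.
let cachedKey: string | undefined;

export async function getOpenAIKey(): Promise<string> {
  if (!cachedKey) {
    const result = await client.send(
      new GetSecretValueCommand({ SecretId: "haystack/openai-api-key" }) // assumed secret name
    );
    cachedKey = result.SecretString;
  }
  if (!cachedKey) {
    throw new Error("The OpenAI API key secret is empty");
  }
  return cachedKey;
}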
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.