LlamaIndex Tutorial (TypeScript): building prompt templates for advanced developers

By Cyprian Aarons · Updated 2026-04-21
Tags: llamaindex, building-prompt-templates-for-advanced-developers, typescript

This tutorial shows you how to build reusable prompt templates in LlamaIndex TypeScript and wire them into an index/query pipeline. You need this when the default prompts are too generic and you want tighter control over output format, tone, retrieval behavior, or domain-specific constraints.

What You'll Need

  • Node.js 18+
  • TypeScript 5+
  • An OpenAI API key set as OPENAI_API_KEY
  • Packages:
    • llamaindex
    • typescript
    • tsx or ts-node for running TypeScript directly
  • A project with package.json and a working TypeScript config

Install the dependencies:

npm install llamaindex
npm install -D typescript tsx @types/node

Set your API key:

export OPENAI_API_KEY="your-key-here"
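
It can help to fail fast if the key never made it into the environment, instead of getting an opaque auth error mid-query. A minimal guard (the helper name `requireEnv` is my own, not part of LlamaIndex):

```typescript
// Fail fast if a required environment variable is missing.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```

Call `requireEnv("OPENAI_API_KEY")` once at startup so misconfiguration surfaces immediately.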

Step-by-Step

  1. Start by defining a prompt template that accepts variables and produces a predictable instruction block. In LlamaIndex TypeScript, PromptTemplate is the right primitive when you want to keep prompt text versioned and reusable across retrieval flows.
import { PromptTemplate } from "llamaindex";

const qaPrompt = new PromptTemplate({
  template: `
You are a senior assistant for bank operations.
Use only the context below.

Context:
{context}

Question:
{question}

Return:
- A direct answer
- A short rationale
- If the context is insufficient, say "I don't know"
`,
});

console.log(
  qaPrompt.format({
    context: "The chargeback window is 120 days.",
    question: "What is the chargeback window?",
  })
);
  2. Build a prompt that enforces structured output. This is useful when downstream code expects stable fields instead of free-form prose, especially in regulated environments where you need deterministic parsing.
import { PromptTemplate } from "llamaindex";

// Literal { } in the template body can collide with {placeholder}
// substitution, so the JSON shape is described without raw braces.
const jsonPrompt = new PromptTemplate({
  template: `
You are extracting policy details.

Policy text:
{policyText}

Return valid JSON with exactly these keys, all string-valued:
- "policyName"
- "effectiveDate"
- "coverageLimit"
`,
});

const rendered = jsonPrompt.format({
  policyText: "Premier Home Cover effective 2024-01-01 with coverage limit of $500,000.",
});

console.log(rendered);
  3. Use your template inside a query engine. The point here is not just rendering strings; it is injecting your own prompt into retrieval so the model answers in your preferred shape and tone.
import {
  Document,
  PromptTemplate,
  VectorStoreIndex,
} from "llamaindex";

async function main() {
  const docs = [
    new Document({ text: "The chargeback window is 120 days for card disputes." }),
    new Document({ text: "Wire transfer reversals require manager approval." }),
  ];

  const index = await VectorStoreIndex.fromDocuments(docs);

  const queryEngine = index.asQueryEngine({
    textQaTemplate: new PromptTemplate({
      template: `
You are answering questions for an internal banking assistant.
Use only the context.

Context:
{context_str}

Question:
{query_str}

Answer in one paragraph.
`,
    }),
  });

  const response = await queryEngine.query({
    query: "What is the chargeback window?",
  });

  console.log(response.toString());
}

main().catch(console.error);
  4. Create separate templates for different tasks instead of overloading one giant prompt. In production, I usually keep answer generation, extraction, and summarization as distinct templates so each one stays easy to test and change.
import { PromptTemplate } from "llamaindex";

const summarizePrompt = new PromptTemplate({
  template: `
Summarize this insurance claim note in three bullets.

Note:
{text}

Bullets:
`,
});

const classifyPrompt = new PromptTemplate({
  template: `
Classify the message into one label:
- fraud
- billing
- support

Message:
{text}

Label:
`,
});

console.log(summarizePrompt.format({ text: "Customer reported water damage after pipe burst." }));
console.log(classifyPrompt.format({ text: "I was charged twice for my premium." }));
  5. Wrap prompt creation in a small factory so your app can swap templates by use case or tenant. This keeps prompt logic out of route handlers and makes it easier to A/B test prompts without touching retrieval code.
import { PromptTemplate } from "llamaindex";

type PromptKind = "qa" | "summary" | "classification";

function getPrompt(kind: PromptKind): PromptTemplate {
  switch (kind) {
    case "qa":
      return new PromptTemplate({
        template: `Context:\n{context_str}\n\nQuestion:\n{query_str}\n\nAnswer concisely.`,
      });
    case "summary":
      return new PromptTemplate({
        template: `Summarize this text in two sentences:\n{text}`,
      });
    case "classification":
      return new PromptTemplate({
        template: `Classify this message as fraud, billing, or support:\n{text}`,
      });
  }
}

console.log(getPrompt("qa").format({ context_str: "Policy allows refunds within 30 days.", query_str: "What is the refund window?" }));

Testing It

Run each file with tsx and confirm that .format() substitutes exactly the variables you expect into the template. For query-engine usage, check that answers follow your formatting rules and do not drift into generic responses.
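
A cheap way to automate that check, independent of LlamaIndex: scan the rendered string for any leftover {placeholder} tokens after calling .format(). The helper below is my own sketch and assumes your variable names use word characters only:

```typescript
// Post-render sanity check: throws if any {placeholder}-style variables
// survived .format(). Assumes variable names use word characters only.
function assertFullyRendered(rendered: string): void {
  const leftover = rendered.match(/\{[A-Za-z_][A-Za-z0-9_]*\}/g);
  if (leftover) {
    throw new Error(`Unsubstituted template variables: ${leftover.join(", ")}`);
  }
}
```

Run it on the output of each template's .format() call in a test file; a typo in a variable name fails loudly instead of shipping a prompt with a literal {question} in it.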

If you are using structured output prompts, validate the model output with JSON.parse() before sending it further down your pipeline. If parsing fails, tighten the prompt or add a post-processing retry path.
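
That retry path can be as small as a loop around JSON.parse(). In this sketch, callModel is a hypothetical stand-in for your actual LLM call (query engine, chat completion, etc.):

```typescript
// Parse-or-retry guard for structured-output prompts. `callModel` is a
// hypothetical stand-in for your real LLM call.
async function parseWithRetry(
  callModel: () => Promise<string>,
  maxAttempts = 2,
): Promise<unknown> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const raw = await callModel();
    try {
      return JSON.parse(raw); // success: hand the object downstream
    } catch (err) {
      lastError = err; // invalid JSON: retry (or tighten the prompt)
    }
  }
  throw new Error(`No valid JSON after ${maxAttempts} attempts: ${String(lastError)}`);
}
```

If retries keep failing for the same prompt, that is a signal to tighten the template itself rather than raise maxAttempts.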

For retrieval prompts, test with documents that contain conflicting facts so you can see whether your instructions correctly constrain the answer to retrieved context only. That is where prompt quality shows up fast.

Next Steps

  • Add output validation with Zod so your prompt contracts become enforceable at runtime.
  • Learn how to override more LlamaIndex response synthesis prompts for custom RAG behavior.
  • Build a prompt registry so product teams can version templates per use case and environment.
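
Before you add Zod, the same idea can be sketched with a hand-rolled type guard (this is a stand-in for a real schema, not Zod itself), enforcing the contract from the policy-extraction prompt above:

```typescript
// Hand-rolled stand-in for a Zod schema: validates the policy-extraction
// contract (three string fields) before the object moves downstream.
interface PolicyDetails {
  policyName: string;
  effectiveDate: string;
  coverageLimit: string;
}

function isPolicyDetails(value: unknown): value is PolicyDetails {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.policyName === "string" &&
    typeof v.effectiveDate === "string" &&
    typeof v.coverageLimit === "string"
  );
}
```

With Zod the equivalent is a z.object() of three z.string() fields plus .safeParse(), which also gives you readable error messages for free.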

By Cyprian Aarons, AI Consultant at Topiax.