Haystack Tutorial (TypeScript): building conditional routing for beginners
This tutorial shows you how to build a simple conditional router in Haystack TypeScript that sends each query to the right branch based on the input. You need this when one pipeline should handle multiple request types, like FAQ lookup, summarization, or fallback chat, without forcing every query through the same components.
What You'll Need
- Node.js 18+ and npm
- A TypeScript project with a `tsconfig.json`
- The Haystack TypeScript package installed
- An OpenAI API key for the example generator component
- Basic familiarity with Haystack pipelines and components
Install the dependencies first:
```bash
npm install @haystack/core @haystack/openai dotenv
npm install -D typescript tsx @types/node
```
Step-by-Step
- Start by creating a small pipeline with one classifier component that decides which route to take. The classifier returns a label such as `faq`, `summarize`, or `chat` based on simple keyword checks.
```typescript
import { defineComponent } from "@haystack/core";

export const RouteClassifier = defineComponent({
  inputs: ["query"],
  outputs: ["route", "query"],
  run({ query }: { query: string }) {
    const q = query.toLowerCase();
    // Default to "chat"; later keyword checks override earlier ones.
    let route = "chat";
    if (q.includes("faq") || q.includes("policy")) route = "faq";
    if (q.includes("summarize") || q.includes("summary")) route = "summarize";
    return { route, query };
  },
});
```
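One subtlety in the classifier: the keyword checks run in order, so a query that matches both keyword sets lands on the `summarize` branch, because that check runs last. You can verify this with a standalone replica of the logic (plain TypeScript, no Haystack imports, so it runs anywhere):

```typescript
// Standalone replica of the classifier's keyword logic, useful for
// checking routing behavior in isolation before wiring the pipeline.
function classifyRoute(query: string): "faq" | "summarize" | "chat" {
  const q = query.toLowerCase();
  let route: "faq" | "summarize" | "chat" = "chat";
  if (q.includes("faq") || q.includes("policy")) route = "faq";
  // Runs last, so it wins when a query matches both keyword sets.
  if (q.includes("summarize") || q.includes("summary")) route = "summarize";
  return route;
}

console.log(classifyRoute("Show me the FAQ"));           // "faq"
console.log(classifyRoute("Summarize the FAQ section")); // "summarize" (last check wins)
console.log(classifyRoute("Hello there"));               // "chat"
```

If you want FAQ matches to take priority instead, reverse the order of the two checks.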
- Next, create one component per branch. In a real app these would call retrievers, prompts, or tools; here we keep them explicit so you can see how the routing works end to end.
```typescript
import { defineComponent } from "@haystack/core";

export const FaqHandler = defineComponent({
  inputs: ["query"],
  outputs: ["answer"],
  run({ query }: { query: string }) {
    return { answer: `FAQ answer for: ${query}` };
  },
});

export const SummarizeHandler = defineComponent({
  inputs: ["query"],
  outputs: ["answer"],
  run({ query }: { query: string }) {
    return { answer: `Summary for: ${query}` };
  },
});

export const ChatHandler = defineComponent({
  inputs: ["query"],
  outputs: ["answer"],
  run({ query }: { query: string }) {
    return { answer: `Chat response for: ${query}` };
  },
});
```
- Now wire the branches into a pipeline using conditional connections. The key idea is that the classifier output drives which downstream component receives the input.
```typescript
import { Pipeline } from "@haystack/core";
import { RouteClassifier } from "./RouteClassifier";
import { FaqHandler, SummarizeHandler, ChatHandler } from "./handlers";

const pipeline = new Pipeline();
pipeline.addComponent("classifier", RouteClassifier);
pipeline.addComponent("faq", FaqHandler);
pipeline.addComponent("summarize", SummarizeHandler);
pipeline.addComponent("chat", ChatHandler);

// The classifier's query output fans out to every branch; which branch
// actually runs is decided by the routing logic in the next step.
pipeline.connect("classifier.query", "faq.query");
pipeline.connect("classifier.query", "summarize.query");
pipeline.connect("classifier.query", "chat.query");
```
- Add the actual routing logic by branching on the classifier output before executing the final handler. This keeps routing decisions in one place and makes the behavior easy to test.
```typescript
async function runQuery(query: string) {
  // First pass: run the classifier to get the route label.
  const classified = await pipeline.run({ classifier: { query } });
  const route = classified.classifier.route as string;
  // Second pass: feed the query directly to the chosen branch.
  if (route === "faq") return pipeline.run({ faq: { query } });
  if (route === "summarize") return pipeline.run({ summarize: { query } });
  return pipeline.run({ chat: { query } });
}
```
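As the number of branches grows, the if/else chain becomes awkward; a table-driven dispatch keeps route-to-handler mapping in one object. Here is a minimal sketch using plain functions as stand-ins for the pipeline branches (the handler bodies mirror the components above; this is an illustration, not the tutorial's pipeline API):

```typescript
// Table-driven dispatch: route labels map to handler functions, and
// unknown labels fall back to the chat handler.
type Handler = (query: string) => string;

const handlers: Record<string, Handler> = {
  faq: (q) => `FAQ answer for: ${q}`,
  summarize: (q) => `Summary for: ${q}`,
  chat: (q) => `Chat response for: ${q}`,
};

function dispatch(route: string, query: string): string {
  // Nullish coalescing picks the chat fallback for unknown routes.
  const handler = handlers[route] ?? handlers.chat;
  return handler(query);
}

console.log(dispatch("faq", "claims"));    // "FAQ answer for: claims"
console.log(dispatch("unknown", "hello")); // "Chat response for: hello"
```

Adding a branch then means adding one entry to `handlers` rather than another if statement.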
- Put it together in a runnable file and print the result for a few sample inputs. This is enough to verify that each branch is being selected correctly before you swap in real LLM-backed components.
```typescript
async function main() {
  const samples = [
    "Show me the FAQ for claims",
    "Summarize this policy document",
    "Can you help me with my account?",
  ];
  for (const sample of samples) {
    const result = await runQuery(sample);
    console.log(sample);
    console.log(result);
    console.log("---");
  }
}

// Surface any pipeline errors instead of silently dropping the promise.
main().catch(console.error);
```
Testing It
Run the file with `tsx` (e.g. `npx tsx main.ts`, adjusting the filename to match your project) so TypeScript executes directly without a build step. You should see each sample routed to a different handler based on keywords in the input.
If all three queries produce different outputs, your conditional routing is working. If every request goes to one branch, check your classifier rules first, then verify that your pipeline inputs match the component names exactly.
For production use, replace the keyword classifier with an LLM-based router or a lightweight intent model. Keep the branch handlers separate so you can test and deploy them independently.
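The fallback idea can be sketched without any Haystack APIs: score each route by keyword hits and drop to the chat branch when no route clears a confidence threshold. The scoring scheme and the 0.5 threshold below are hypothetical illustrations, not part of the tutorial's pipeline:

```typescript
// Hypothetical confidence-based router: counts keyword hits per route
// and falls back to "chat" when no route is confident enough.
function classifyWithConfidence(query: string): { route: string; confidence: number } {
  const q = query.toLowerCase();
  const scores: Record<string, number> = {
    faq: ["faq", "policy"].filter((k) => q.includes(k)).length,
    summarize: ["summarize", "summary"].filter((k) => q.includes(k)).length,
  };
  // Pick the route with the most keyword hits.
  const [route, hits] = Object.entries(scores).sort((a, b) => b[1] - a[1])[0];
  // Two hits saturate confidence at 1.0 (arbitrary choice for the sketch).
  const confidence = Math.min(1, hits / 2);
  // Below the threshold, send the query to the chat fallback branch.
  return confidence >= 0.5 ? { route, confidence } : { route: "chat", confidence };
}

console.log(classifyWithConfidence("What is your refund policy FAQ?")); // { route: "faq", confidence: 1 }
console.log(classifyWithConfidence("Good morning"));                    // { route: "chat", confidence: 0 }
```

An LLM router would replace the keyword scoring with a model call but keep the same threshold-and-fallback shape.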
Next Steps
- Replace keyword routing with an LLM router using `@haystack/openai`
- Add a fallback branch for low-confidence classifications
- Chain routed branches into retrievers and generators instead of static handlers
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.