Guide: How to Branch AI Conversations
A step-by-step guide to branch AI conversations using the Langbase SDK.
When conversations with LLMs get long, they often become messy. Mixing multiple topics in a single thread leads to context rot and degraded response quality. The solution is to branch conversations.
By branching conversations, you can:
- Prevent context drift between different discussion paths
- Reduce token usage by avoiding repeated long histories
- Explore in parallel without polluting the main thread
- Merge insights back when ready
Branching is context engineering in action. With Langbase, it takes just a few lines of code.
In this guide, you will learn how to branch conversations.
Install dependencies
Start by installing the Langbase SDK.
Install dependencies
npm i langbase dotenv
Set up by importing necessary packages:
Initial setup
import dotenv from 'dotenv';
import { Langbase, ThreadMessage } from 'langbase';

dotenv.config();

const langbase = new Langbase({
  apiKey: process.env.LANGBASE_API_KEY!,
});
Step #1: Get Langbase API Key
Every request you send to Langbase needs an API key. This guide assumes you already have one. If you do not, follow the instructions below.
Create a .env file in the root of your project and add your Langbase API key.
.env
LANGBASE_API_KEY=xxxxxxxxx
Replace xxxxxxxxx with your Langbase API key.
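Because a missing key only surfaces as a failed API call later, it can help to fail fast at startup. Here is a minimal sketch; `requireApiKey` is a hypothetical helper for illustration, not part of the Langbase SDK.

```typescript
// Hypothetical helper (not part of the Langbase SDK): throw a clear
// error at startup when the API key is missing from the environment.
function requireApiKey(env: Record<string, string | undefined>): string {
  const key = env.LANGBASE_API_KEY;
  if (!key) {
    throw new Error('LANGBASE_API_KEY is missing. Add it to your .env file.');
  }
  return key;
}

// Usage: const apiKey = requireApiKey(process.env);
```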
Step #2: Create the Initial Conversation
Start a conversation about choosing a React state management library.
index.ts
// Helper to create the initial conversation
async function createConversation() {
  const thread = await langbase.threads.create({
    messages: [
      {
        role: 'user',
        content: 'I need to add state management to my React app',
      },
      {
        role: 'assistant',
        content:
          'I can help you with state management in React. How complex is your app and what are your main requirements?',
      },
      {
        role: 'user',
        content: "It's a medium-sized app with user data, API calls, and real-time updates",
      },
      {
        role: 'assistant',
        content:
          'For your needs, you could use Redux for its mature ecosystem, or Zustand for a simpler, more modern approach. Which direction interests you?',
      },
    ],
  });

  return thread.id;
}
Step #3: Branch the Conversation
Fork the conversation at a decision point and create a new branch.
index.ts
async function branchThread(threadId: string, branchAt: number) {
  // Fetch the full message history of the parent thread
  const messages = await langbase.threads.messages.list({ threadId });

  // Keep only the messages before the branch point
  const messagesToKeep = messages.slice(0, branchAt);

  // Create a new thread seeded with those messages, tagged with its parent
  const branch = await langbase.threads.create({
    messages: messagesToKeep as ThreadMessage[],
    metadata: {
      parent: threadId,
      branchedAt: branchAt.toString(),
    },
  });

  return branch.id;
}
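The core of `branchThread` is the slicing step: a branch keeps only the messages before the branch point. A minimal sketch of that logic in isolation, with a hypothetical transcript:

```typescript
// The slicing logic a branch relies on: keep messages [0, branchAt),
// i.e. everything up to (but not including) the decision point.
type Message = { role: 'user' | 'assistant'; content: string };

function messagesForBranch(history: Message[], branchAt: number): Message[] {
  return history.slice(0, branchAt);
}

const history: Message[] = [
  { role: 'user', content: 'I need state management' },
  { role: 'assistant', content: 'How complex is your app?' },
  { role: 'user', content: 'Medium-sized, with real-time updates' },
  { role: 'assistant', content: 'Redux or Zustand?' },
  { role: 'user', content: "Let's go with Redux" },
];

// Branching at 4 drops the Redux choice, so the branch is free to explore Zustand.
const kept = messagesForBranch(history, 4);
console.log(kept.length); // 4
```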
Step #4: Continue Each Branch
Now continue the original thread with Redux and the branch with Zustand.
index.ts
async function main() {
  // Create the original conversation
  const originalId = await createConversation();

  // Branch at the decision point (after the state management options were presented)
  const branchId = await branchThread(originalId, 4);

  // Continue the original thread with Redux
  await langbase.threads.append({
    threadId: originalId,
    messages: [
      { role: 'user', content: "Let's go with Redux" },
      {
        role: 'assistant',
        content:
          'Great choice for a robust solution! Redux with Redux Toolkit makes it much easier. Let me show you the setup...',
      },
    ],
  });

  // The branch explores Zustand
  await langbase.threads.append({
    threadId: branchId,
    messages: [
      { role: 'user', content: 'Tell me about Zustand' },
      {
        role: 'assistant',
        content:
          "Zustand is lightweight and simple! It's only 2KB and doesn't need providers. Here's how to get started...",
      },
    ],
  });

  console.log('Original thread:', await langbase.threads.messages.list({ threadId: originalId }));
  console.log('Branched thread:', await langbase.threads.messages.list({ threadId: branchId }));
}

main();
Step #5: Run the Example
Run your branching conversation example:
Run project
npx tsx index.ts
You should see two independent threads logged:
- Original → continues with Redux
- Branch → explores Zustand
Choosing between two state management libraries is just an example. You can apply the concept to any number of topics. Think of branching like Git for conversations:
- Fork at decision points
- Explore in parallel without polluting context
- Merge or summarize back to main when ready
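One way to merge a branch back is to condense it into a single summary message and append that to the parent thread. A sketch of that idea, where `summarizeBranch` is a stand-in for your own summarization step (for example, another LLM call); neither helper is part of the Langbase SDK.

```typescript
// Hedged sketch: condense a branch into one message that can be appended
// to the parent thread. `summarizeBranch` is a placeholder; in practice
// you would generate a real summary (e.g. with another LLM call).
type Message = { role: 'user' | 'assistant'; content: string };

function summarizeBranch(messages: Message[]): string {
  // Placeholder summary: join the assistant turns.
  return messages
    .filter(m => m.role === 'assistant')
    .map(m => m.content)
    .join(' ');
}

function mergeMessage(branchMessages: Message[]): Message {
  return {
    role: 'assistant',
    content: `Summary from branch: ${summarizeBranch(branchMessages)}`,
  };
}

// In the real flow you would then merge back with:
// await langbase.threads.append({ threadId: originalId, messages: [mergeMessage(branchMsgs)] });
```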
This makes AI interactions modular, adaptive, and efficient.
Next Steps
- Build something cool with Langbase APIs and SDK.
- Join our Discord community for feedback, requests, and support.
- Build a composable RAG chatbot on your company docs
- Learn more about the Threads primitive, which we use to manage conversations and context.