RAG Chatbot lets you upload your documents and chat about their content. Enjoy personalized insights and detailed responses based on your specific documents, enhancing your understanding and productivity.
Build a RAG Chatbot with Pipes on Langbase
A RAG Chatbot example to help you build and deploy a chatbot that talks to your documents. The chatbot is built using an AI Pipe and Memory on Langbase. It works with 30+ LLMs (OpenAI, Gemini, Mistral, Llama, Gemma, etc.), any data (10M+ context with Memory sets), and any framework (a standard web API you can use with any software).
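To illustrate the "standard web API" point, here is a minimal sketch of calling a Pipe over HTTP from TypeScript. The endpoint URL, request body, and response shape below are assumptions, not taken from this example; copy the exact values from your Pipe's API tab on Langbase.

```ts
// Minimal sketch: call a Langbase Pipe over plain HTTP from any runtime with fetch.
// The endpoint and payload shape are assumptions; verify them in your Pipe's API tab.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

export async function askPipe(question: string): Promise<string> {
  const messages: ChatMessage[] = [{ role: "user", content: question }];

  const res = await fetch("https://api.langbase.com/v1/pipes/run", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // The Pipe API key must stay on the server; never expose it in browser code.
      Authorization: `Bearer ${process.env.NEXT_LB_PIPE_API_KEY}`,
    },
    body: JSON.stringify({ messages, stream: false }),
  });

  if (!res.ok) throw new Error(`Pipe request failed with status ${res.status}`);

  // Assumed response shape: a JSON object whose `completion` field holds the answer.
  const data = await res.json();
  return data.completion ?? "";
}
```

Because the Pipe has your Memory attached, the response is grounded in the documents you uploaded.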
Create a Memory on Langbase, upload the document you want to talk to, and attach it to the Pipe you just forked.
Go to the API tab and copy the Pipe's API key (to be used server-side only; see the route handler sketch after these steps).
Download the example project folder from here or clone the repository.
cd into the project directory and open it in your code editor.
Duplicate the .env.example file in this project and rename it to .env.local.
Add the following environment variables:
```sh
# Replace `PIPE_API_KEY` with the copied API key.
NEXT_LB_PIPE_API_KEY="PIPE_API_KEY"

# Install the dependencies using the following command:
npm install

# Run the project using the following command:
npm run dev
```
Your app template should now be running on localhost:3000.
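Since the Pipe API key is meant to be used server-side only, a common pattern in a Next.js project like this one is to proxy chat requests through a route handler so the key never reaches the browser. The file path (`app/api/chat/route.ts`), the Langbase endpoint, and the payload shape below are assumptions for illustration; the example project may organize this differently.

```ts
// app/api/chat/route.ts: a hypothetical route handler that proxies chat requests
// to the Pipe so NEXT_LB_PIPE_API_KEY never leaves the server.
import { NextResponse } from "next/server";

export async function POST(req: Request) {
  // Expect the client to send { messages: [{ role, content }, ...] }.
  const { messages } = await req.json();

  // Assumed endpoint; copy the exact URL from your Pipe's API tab.
  const res = await fetch("https://api.langbase.com/v1/pipes/run", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.NEXT_LB_PIPE_API_KEY}`,
    },
    body: JSON.stringify({ messages, stream: false }),
  });

  if (!res.ok) {
    return NextResponse.json({ error: "Pipe request failed" }, { status: res.status });
  }

  return NextResponse.json(await res.json());
}
```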
NOTE: This is a Next.js project, so you can build and deploy it to any platform of your choice, such as Vercel, Netlify, or Cloudflare.
Authors
This project was created by Langbase team members, with contributions from: