Learn about vectors and embeddings and what a Langbase vector (memory) store is.
Learn about OpenAI’s moderation tools and how Langbase supports them for user safety.
Multi-agent infrastructure is an effective approach to building AI apps. In a multi-agent setup, each agent focuses on a specific task, improving performance and reliability. In this guide, we will build a multi-agent AI support system that leverages AI pipes, tools, and memory to answer support queries from docs and connect users to a live agent.
Learn to create an AI memory in Langbase using the API.
Learn to upload documents to an AI memory using the API.
Learn how an AI agent pipe on Langbase differs from a typical AI agent. Also, understand how a pipe overcomes the limitations of LLMs, then create a serverless AI agent pipe locally.
Learn what tool calls are and how you can create self-healing agents with them.
A step-by-step guide to creating a GeoCities-style web page using Langbase AI agent pipes, Netlify, and modern LLMs.