FAQs
Let's take a look at some frequently asked questions about Memory.
Memory allows you to store, organize, and retrieve information. It can be used to build powerful Retrieval Augmented Generation (RAG) based AI agents that can use your data to assist with your queries.
RAG is a technique that combines a large language model (LLM) with your data (Memory) to provide more accurate and relevant responses to user queries.
LLMs are powerful at understanding text, but they don't store your information. Memory is used to store and organize your information, and RAG lets the LLM draw on that information to provide more accurate and relevant responses. This makes RAG ideal for building AI agents personalized to your data.
Querying LLMs directly can be expensive and slow, and their limited context window caps how much information they can process at once. Memory allows you to store and organize your information in a way that is optimized for retrieval. In a RAG system, only the relevant pieces of information are retrieved from Memory and passed to the LLM to generate a response.
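For illustration, here is a minimal sketch of that retrieve-then-generate flow. The `retrieveChunks` and `generateAnswer` functions are hypothetical placeholders for your vector search over Memory and your LLM call; they are not part of the Langbase SDK.

```ts
// A minimal sketch of the RAG flow described above.
// `retrieveChunks` and `generateAnswer` are hypothetical placeholders.

type Chunk = { text: string; score: number };

async function retrieveChunks(query: string, topK: number): Promise<Chunk[]> {
  // In a real system: embed the query, run a similarity search over Memory,
  // and return only the top-K most relevant chunks.
  return []; // placeholder
}

async function generateAnswer(query: string, context: string): Promise<string> {
  // In a real system: call an LLM with the query plus the retrieved context.
  return ''; // placeholder
}

async function answerWithRag(query: string): Promise<string> {
  const chunks = await retrieveChunks(query, 5);
  // Only the retrieved chunks (not the whole dataset) go into the prompt,
  // which keeps the request small, fast, and within the context window.
  const context = chunks.map((c) => c.text).join('\n\n');
  return generateAnswer(query, context);
}
```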
Currently, we support .txt, .pdf, .md, .csv, and all the major plain-text code file formats.
You can convert your data into a supported format before importing it into Memory. There are many online tools available to convert files into different formats. We are continuously working on adding support for more file formats. If you have a specific file format you would like us to support, please let us know and we will prioritize it.
A chunk is a piece of information given to the LLM. The size of these chunks affects how well RAG performs. During retrieval, the query is converted into a vector, and a search returns only the relevant chunks. If chunks are too small, they might not have enough information to be useful. If they are too large, they might include irrelevant information. You can adjust the chunk size based on your needs to optimize RAG's performance.
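As a rough illustration, here is a simple fixed-size chunker with overlap. The size and overlap values are arbitrary examples, not Langbase defaults.

```ts
// A simple fixed-size chunker with overlap (illustrative only;
// the numbers are arbitrary examples, not Langbase defaults).
function chunkText(text: string, chunkSize = 1000, overlap = 200): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    const end = Math.min(start + chunkSize, text.length);
    chunks.push(text.slice(start, end));
    if (end === text.length) break;
    // Overlap keeps context that would otherwise be cut at a chunk boundary.
    start = end - overlap;
  }
  return chunks;
}

// Smaller chunks are more precise but may lack context;
// larger chunks carry more context but can pull in irrelevant text.
const chunks = chunkText('...your document text...', 500, 100);
console.log(`${chunks.length} chunks`);
```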
Yes, you can attach multiple Memory Sets to a single Pipe.
The maximum file size that can be imported into Memory is 10MB.
Yes, the Langbase Memory API gives you programmatic access to manage memories in your Langbase account. Since documents are stored in memories, you can also manage documents using the Memory API.
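Here is a minimal sketch of calling the Memory API over HTTP. The endpoint path and response shape shown are assumptions for illustration; check the Langbase API reference for the exact paths, payloads, and authentication details.

```ts
// A minimal sketch of calling a Memory API endpoint over HTTP.
// The path below is an assumption for illustration; consult the
// Langbase API reference for the exact endpoint and schema.
const LANGBASE_API_KEY = process.env.LANGBASE_API_KEY;

async function listMemories() {
  const res = await fetch('https://api.langbase.com/v1/memory', {
    headers: { Authorization: `Bearer ${LANGBASE_API_KEY}` },
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}

listMemories().then((memories) => console.log(memories));
```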