Memory: Create v1

The Create Memory API endpoint allows you to create a new memory on Langbase dynamically via the API. This endpoint requires a User or Org API key.


Generate a User/Org API key

You will need to generate an API key to authenticate your requests. For more information, visit the User/Org API key documentation.


POST /v1/memory

Create a new memory

Create a new memory by sending the memory data inside the request body.

Headers

  • Content-Type (string, required)
    Request content type. Needs to be application/json.

  • Authorization (string, required)
    Replace <YOUR_API_KEY> with your User/Org API key.
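
For reference, these two headers can be assembled in TypeScript as a plain object. This is a sketch; the LANGBASE_API_KEY environment variable matches the .env example shown later on this page.

// Request headers for the Create Memory endpoint.
const headers = {
  'Content-Type': 'application/json',
  Authorization: `Bearer ${process.env.LANGBASE_API_KEY}`, // your User/Org API key
};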

Body Parameters

  • name (string, required)
    Name of the memory.

  • description (string)
    Short description of the memory.

    Default: ''

Optional Parameters

  • embedding_model (string)
    Name of the embedding model to use for the memory.

    Default: openai:text-embedding-3-large

    Supported models:

    • openai:text-embedding-3-large
    • cohere:embed-v4.0
    • cohere:embed-multilingual-v3.0
    • cohere:embed-multilingual-light-v3.0
    • google:text-embedding-004

  • chunk_size (number)
    Maximum number of characters in a single chunk.

    Default: 10000

    Maximum: 30000

    Note: Cohere models have a limit of 512 tokens per chunk (1 token ≈ 4 characters in English), so lower chunk_size accordingly if you use a Cohere model; see the sketch after this list. For most use cases, the default values work fine.

  • chunk_overlap (number)
    Number of characters to overlap between chunks.

    Default: 2048

    Maximum: less than chunk_size

  • top_k (number)
    Number of chunks to return.

    Default: 10

    Minimum: 1

    Maximum: 100
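
To make the Cohere note above concrete, here is a short TypeScript sketch that derives a Cohere-safe chunk_size from the 512-token limit and the rough 4-characters-per-token estimate, then builds a request body around it. The memory name, description, and overlap value are illustrative, not prescribed by the API.

// Rough character budget for Cohere embedding models:
// 512 tokens * ~4 characters per token ≈ 2048 characters per chunk.
const COHERE_TOKEN_LIMIT = 512;
const APPROX_CHARS_PER_TOKEN = 4;
const cohereChunkSize = COHERE_TOKEN_LIMIT * APPROX_CHARS_PER_TOKEN; // 2048

// Illustrative request body for a Cohere-backed memory.
const createMemoryBody = {
  name: 'multilingual-docs',                          // hypothetical name
  description: 'Docs embedded with a Cohere model.',  // hypothetical description
  embedding_model: 'cohere:embed-multilingual-v3.0',
  chunk_size: cohereChunkSize, // stays within the 512-token limit
  chunk_overlap: 256,          // illustrative; must be less than chunk_size
  top_k: 10,
};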

Usage example

Install the SDK

npm i langbase

Environment variables

.env file

LANGBASE_API_KEY="<YOUR_API_KEY>"

Create memory

POST /v1/memory

curl https://api.langbase.com/v1/memory \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer <YOUR_API_KEY>" \
  -d '{
    "name": "knowledge-base",
    "description": "An AI memory for storing company internal docs.",
    "embedding_model": "openai:text-embedding-3-large",
    "chunk_size": 10000,
    "chunk_overlap": 2048,
    "top_k": 10
  }'
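
If you would rather call the endpoint from TypeScript than curl, a minimal sketch using the built-in fetch of Node 18+ and the LANGBASE_API_KEY variable from the .env step could look like this. It relies only on the endpoint, headers, and body parameters documented above; the error handling is illustrative.

// Create a memory by POSTing to the documented endpoint.
async function createMemory() {
  const response = await fetch('https://api.langbase.com/v1/memory', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Assumes LANGBASE_API_KEY is available in the environment (see .env above).
      Authorization: `Bearer ${process.env.LANGBASE_API_KEY}`,
    },
    body: JSON.stringify({
      name: 'knowledge-base', // required
      description: 'An AI memory for storing company internal docs.',
      embedding_model: 'openai:text-embedding-3-large',
      chunk_size: 10000,
      chunk_overlap: 2048,
      top_k: 10,
    }),
  });

  if (!response.ok) {
    throw new Error(`Create memory failed: ${response.status} ${await response.text()}`);
  }

  return response.json();
}

createMemory().then((memory) => console.log(memory));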

Response

  • Memory (object)
    The response object returned by the API endpoint.

    interface Memory {
      name: string;
      description: string;
      owner_login: string;
      url: string;
      chunk_size: number;
      chunk_overlap: number;
      embedding_model:
        | 'openai:text-embedding-3-large'
        | 'cohere:embed-multilingual-v3.0'
        | 'cohere:embed-multilingual-light-v3.0'
        | 'google:text-embedding-004';
    }

    • name (string)
      Name of the memory.

    • description (string)
      Description of the AI memory.

    • owner_login (string)
      Login of the memory owner.

    • url (string)
      Memory studio URL.

    • embedding_model (string)
      The embedding model used by the AI memory.

      • openai:text-embedding-3-large
      • cohere:embed-multilingual-v3.0
      • cohere:embed-multilingual-light-v3.0
      • google:text-embedding-004

API Response

{ "name": "knowledge-base", "description": "Advanced memory with multilingual support.", "chunk_size": 10000, "chunk_overlap": 2048, "owner_login": "user123", "url": "https://langbase.com/memorysets/user123/knowledge-base", "embedding_model": "openai:text-embedding-3-large", }