Create Memory: langbase.memories.create()

Create a new memory on Langbase using the langbase.memories.create() function.


Generate a User/Org API key

You will need to generate an API key to authenticate your requests. For more information, visit the User/Org API key documentation.


API reference

langbase.memories.create()

Function Signature

langbase.memories.create(options);

// With types.
langbase.memories.create(options: MemoryCreateOptions);

options

  • Name
    options
    Type
    MemoryCreateOptions
    Description

    MemoryCreateOptions Object

    name: string;
    description?: string;
    top_k?: number;
    chunk_size?: number;
    chunk_overlap?: number;
    embedding_model?:
      | 'openai:text-embedding-3-large'
      | 'cohere:embed-v4.0'
      | 'cohere:embed-multilingual-v3.0'
      | 'cohere:embed-multilingual-light-v3.0'
      | 'google:text-embedding-004';

    The options object has the following properties.


  • Name
    name
    Type
    string
    Required
    Description

    Name of the memory.

  • Name
    description
    Type
    string
    Description

    Description of the memory.

  • Name
    top_k
    Type
    number
    Description

    Number of chunks to return.

    Default: 10

    Minimum: 1 Maximum: 100

  • Name
    chunk_size
    Type
    number
    Description

    Maximum number of characters in a single chunk.

    Default: 10000

    Maximum: 30000

    Appropriate value

    Cohere embedding models have a limit of 512 tokens (1 token ~= 4 characters in English, so roughly 2048 characters). If you are using a Cohere model, lower chunk_size accordingly. For most use cases, the default values work fine.

  • Name
    chunk_overlap
    Type
    number
    Description

    Number of characters to overlap between chunks.

    Default: 2048

    Maximum: Less than chunk_size

  • Name
    embedding_model
    Type
    string
    Description

    The model to use for text embeddings. Available options:

    • openai:text-embedding-3-large
    • cohere:embed-v4.0
    • cohere:embed-multilingual-v3.0
    • cohere:embed-multilingual-light-v3.0
    • google:text-embedding-004

    Default: openai:text-embedding-3-large
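As a sketch of how these options fit together, the snippet below builds a full options object for a Cohere-backed memory. The memory name, description, and overlap value are illustrative assumptions; the chunk_size is derived from Cohere's 512-token limit noted above (512 tokens × ~4 characters ≈ 2048 characters).

```typescript
// Sketch: building a MemoryCreateOptions object for a Cohere-backed memory.
// Field names follow the reference above; the name/description are hypothetical.
const COHERE_TOKEN_LIMIT = 512; // Cohere embedding context limit, in tokens
const CHARS_PER_TOKEN = 4;      // rough English average noted above

const options = {
  name: 'multilingual-docs',                          // hypothetical memory name
  description: 'Multilingual docs embedded with Cohere.',
  top_k: 10,                                          // 1–100, default 10
  chunk_size: COHERE_TOKEN_LIMIT * CHARS_PER_TOKEN,   // 2048 characters
  chunk_overlap: 256,                                 // must stay below chunk_size
  embedding_model: 'cohere:embed-multilingual-v3.0' as const,
};

console.log(options.chunk_size); // 2048

// The actual call (requires a LANGBASE_API_KEY and the langbase SDK):
// const memory = await langbase.memories.create(options);
```

Keeping chunk_overlap well below chunk_size avoids re-embedding most of each chunk twice while still preserving context across chunk boundaries.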

Usage example

Install the SDK

npm i langbase

Environment variables

LANGBASE_API_KEY="<USER/ORG-API-KEY>"

Create memory

Create memory on Langbase

import {Langbase} from 'langbase';

const langbase = new Langbase({
	apiKey: process.env.LANGBASE_API_KEY!,
});

async function main() {
	const memory = await langbase.memories.create({
		name: 'knowledge-base',
		description: 'An AI memory for storing company internal docs.',
	});

	console.log('Memory created:', memory);
}

main();

Response

  • Name
    MemoryCreateResponse
    Type
    object
    Description

    The response object returned by the langbase.memories.create() function.

    MemoryCreateResponse

    name: string;
    description: string;
    owner_login: string;
    url: string;
    embedding_model:
      | 'openai:text-embedding-3-large'
      | 'cohere:embed-v4.0'
      | 'cohere:embed-multilingual-v3.0'
      | 'cohere:embed-multilingual-light-v3.0';
    chunk_size: number;
    chunk_overlap: number;
    top_k: number;
    • Name
      name
      Type
      string
      Description

      Name of the memory.

    • Name
      description
      Type
      string
      Description

      Description of the AI memory.

    • Name
      owner_login
      Type
      string
      Description

      Login of the memory owner.

    • Name
      url
      Type
      string
      Description

      Memory access URL.

    • Name
      embedding_model
      Type
      string
      Description

      The embedding model used by the AI memory.

      • openai:text-embedding-3-large
      • cohere:embed-v4.0
      • cohere:embed-multilingual-v3.0
      • cohere:embed-multilingual-light-v3.0

Response of langbase.memories.create()

{
  "name": "knowledge-base",
  "description": "An AI memory for storing company internal docs.",
  "owner_login": "user123",
  "url": "https://langbase.com/user123/knowledge-base",
  "embedding_model": "openai:text-embedding-3-large",
  "chunk_size": 10000,
  "chunk_overlap": 2048,
  "top_k": 10
}
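For illustration, the response shape above can be expressed as a local TypeScript type. The type name MemoryCreateResponse mirrors the reference; the SDK may export its own type under a different name, so treat this as a sketch rather than the SDK's definition.

```typescript
// Sketch: the MemoryCreateResponse shape, with field names taken from the
// reference above. The SDK's own exported type may differ.
type MemoryCreateResponse = {
  name: string;
  description: string;
  owner_login: string;
  url: string;
  embedding_model: string;
  chunk_size: number;
  chunk_overlap: number;
  top_k: number;
};

// The example response from this section, typed locally.
const memory: MemoryCreateResponse = {
  name: 'knowledge-base',
  description: 'An AI memory for storing company internal docs.',
  owner_login: 'user123',
  url: 'https://langbase.com/user123/knowledge-base',
  embedding_model: 'openai:text-embedding-3-large',
  chunk_size: 10000,
  chunk_overlap: 2048,
  top_k: 10,
};

console.log(`${memory.owner_login}/${memory.name} uses ${memory.embedding_model}`);
```

Typing the response this way lets downstream code (e.g. storing the memory url or checking top_k) fail at compile time if a field name is misspelled.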