Chat API

For chat-style LLM integration with OpenAI, Mistral, etc., use the chat endpoint with a chat pipe. It supports thread creation, history tracking, and seamless conversation continuation. If desired, Langbase can store all messages and threads, making it easy to build chat apps.


POST /beta/chat

Generate a chat completion

Generate a chat completion by sending a messages array in the request body.

Required headers

  • Name: Content-Type
    Type: string
    Description: Request content type. Needs to be application/json.

  • Name: Authorization
    Type: string
    Description: Replace {PIPE_API_KEY} with your Pipe API key.

Required attributes

  • Name: messages
    Type: array
    Description: An array containing message objects.

  • Name: role
    Type: string
    Description: The role of the message. Can be system, assistant, or user.

  • Name: content
    Type: string
    Description: The content of the message.

Optional attributes

  • Name: variables
    Type: array
    Description: An array containing variable objects (see the example after this list). Each variable object has:

    • Name: name
      Type: string
      Description: The name of the variable.

    • Name: value
      Type: string
      Description: The value of the variable.

  • Name: threadId
    Type: string
    Description: The ID of an existing chat thread. The conversation will continue in this thread (see the continuation example at the end of this page).
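
The request below is a sketch of sending variables. It assumes a pipe whose prompt references a variable named topic; the variable name, its value, and the message content are hypothetical placeholders:

curl https://api.langbase.com/beta/chat \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer {PIPE_API_KEY}" \
-d '{
  "messages": [
    {
      "role": "user",
      "content": "Summarize this for me."
    }
  ],
  "variables": [
    {
      "name": "topic",
      "value": "quantum computing"
    }
  ]
}'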

Response headers

  • Name: lb-thread-id
    Type: string
    Description: The ID of the new chat thread. Use this ID in the next request to continue the conversation.

Example request

POST /beta/chat
curl https://api.langbase.com/beta/chat \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer {PIPE_API_KEY}" \
-d '{
  "messages": [
    {
      "role": "user",
      "content": "Hello!"
    }
  ]
}'

Response

Hello! How can I assist you today?

Response Header

lb-thread-id: "{new_chat_thread_id}"
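
To continue this conversation, pass the returned thread ID as threadId in the next request. A minimal sketch, using the placeholder thread ID from the response header above and an illustrative follow-up message:

curl https://api.langbase.com/beta/chat \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer {PIPE_API_KEY}" \
-d '{
  "messages": [
    {
      "role": "user",
      "content": "What was my previous message?"
    }
  ],
  "threadId": "{new_chat_thread_id}"
}'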