Pipe API: Chat beta

For chat-style LLM integration with providers such as OpenAI and Mistral, use the chat endpoint with a chat pipe. It supports thread creation, history tracking, and seamless conversation continuation. Langbase can optionally store all messages and threads, which makes building chat apps straightforward.


Deprecation Notice

This API endpoint has been deprecated. Please use the new run pipe API endpoint.


Deprecated: POST /beta/chat

Generate a chat completion

Generate a chat completion by sending a messages array in the request body.

Required headers

  • Name
    Content-Type
    Type
    string
    Description

    Request content type. Needs to be application/json

  • Name
    Authorization
    Type
    string
    Description

Bearer authentication header. Replace PIPE_API_KEY with your Pipe API key (see the example after this list)
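
With both required headers in place, a minimal request looks like the sketch below. It mirrors the full example at the end of this page; the message content is only a placeholder.

# Minimal POST with the two required headers and a small JSON body.
curl https://api.langbase.com/beta/chat \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer PIPE_API_KEY" \
  -d '{ "messages": [{ "role": "user", "content": "Hello!" }] }'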

Required attributes

  • Name
    messages
    Type
    array
    Description

An array containing message objects (see the sketch after this list)

  • Name
    messages[0].role
    Type
    string
    Description

The role of the message, i.e., system | user | assistant | tool

  • Name
    messages[0].content
    Type
    string
    Description

    The content of the message
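
Because messages is an array, you can also pass earlier turns alongside the new one if you manage conversation history yourself rather than relying on threads. A sketch with placeholder content:

# The messages array can carry earlier turns (system, user, assistant)
# followed by the new user message. All message content here is placeholder text.
curl https://api.langbase.com/beta/chat \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer PIPE_API_KEY" \
  -d '{
    "messages": [
      { "role": "system", "content": "You are a helpful assistant." },
      { "role": "user", "content": "What can you help me with?" },
      { "role": "assistant", "content": "I can answer questions about your docs." },
      { "role": "user", "content": "Great, summarize the last answer." }
    ]
  }'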

Optional attributes

  • Name
    messages[0].tool_call_id
    Type
    string
    Description

The ID of the called LLM tool when the role is tool

  • Name
    messages[0].name
    Type
    string
    Description

    The name of the called tool if the role is tool

  • Name
    variables
    Type
    array
    Description

    An array containing different variable objects (see the combined example after this list)

  • Name
    variables[0].name
    Type
    string
    Description

    The name of the variable

  • Name
    variables[0].value
    Type
    string
    Description

    The value of the variable

  • Name
    threadId
    Type
    string
    Description

    The ID of an existing chat thread. The conversation will continue in this thread.
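
The optional attributes can be combined in a single request body. In the sketch below, the variable name and value, the tool name, and the tool_call_id are placeholders; a role of tool is only meaningful when the message answers a tool call the model made earlier in the conversation.

# Hypothetical body combining threadId, variables, and a tool-result message.
curl https://api.langbase.com/beta/chat \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer PIPE_API_KEY" \
  -d '{
    "threadId": "<lb-thread-id>",
    "variables": [
      { "name": "city", "value": "Berlin" }
    ],
    "messages": [
      {
        "role": "tool",
        "tool_call_id": "call_abc123",
        "name": "get_weather",
        "content": "Sunny, 24°C"
      }
    ]
  }'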

Response headers

  • Name
    lb-thread-id
    Type
    string
    Description

The ID of the new chat thread. Use this ID in the next request to continue the conversation (see the sketch after this list).
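
One way to capture this header from the command line is to let curl dump the response headers to a file and read lb-thread-id from there. A sketch; the file name is arbitrary:

# First request: no threadId. `-D` writes the response headers to a file.
curl -s -D response-headers.txt https://api.langbase.com/beta/chat \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer PIPE_API_KEY" \
  -d '{ "messages": [{ "role": "user", "content": "Hello!" }] }'

# Read the thread ID, then send its value as `threadId` in the next request
# to continue the same conversation.
grep -i '^lb-thread-id' response-headers.txt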

Learn how to use function calling with the Chat API.

Chat API with stream on

POST
/beta/chat
# NOTE: How chat threads work
# 1. You send the first request without a threadId.
# 2. The response headers include an `lb-thread-id`.
# 3. To stay in the same chat thread, send that value as `threadId` in every following request.
# NOTE: To start a new thread, send a request without a `threadId`.

curl https://api.langbase.com/beta/chat \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer PIPE_API_KEY" \
  -d '{
    "threadId": "<lb-thread-id>",
    "messages": [{ "role": "user", "content": "Hello!" }]
  }'

Response

{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1719848588,"model":"gpt-4o-mini","system_fingerprint":"fp_44709d6fcb","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}

{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1719848588,"model":"gpt-4o-mini","system_fingerprint":"fp_44709d6fcb","choices":[{"index":0,"delta":{"content":"Hello"},"logprobs":null,"finish_reason":null}]}

...

{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1719848588,"model":"gpt-4o-mini","system_fingerprint":"fp_44709d6fcb","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}

Response Header

HTTP/2 200
lb-thread-id: "…-…-…-…-… ID of chat thread"
… … … rest of the headers … : … … …
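
If you only want the generated text out of the stream, and assuming the chunks arrive as the newline-delimited JSON objects shown above (not wrapped in an SSE data: prefix), you can filter them with jq. A rough sketch, not part of the API itself:

# Print only the streamed tokens by extracting delta.content from each chunk.
# Assumes one JSON chunk per line, as in the response shown above.
curl -s https://api.langbase.com/beta/chat \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer PIPE_API_KEY" \
  -d '{ "messages": [{ "role": "user", "content": "Hello!" }] }' \
  | jq -rj '.choices[0].delta.content // empty'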