Pipe API: Create (beta)

The create pipe API endpoint lets you create a new pipe on Langbase programmatically, with any custom configuration the pipe needs. This endpoint requires a User or Org API key. To generate a User or Org API key, visit your profile or organization settings page on Langbase.


Deprecation Notice

This API endpoint has been deprecated. Please use the new create pipe API endpoint.


Generate a User/Org API key

You will need to generate an API key to authenticate your requests. For more information, visit the User/Org API key documentation.
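
Once you have a key, every request to this API passes it as a Bearer token in the Authorization header. A minimal sketch, using the deprecated org endpoint documented on this page (the key value, org name, and pipe name are placeholders):

export LANGBASE_API_KEY="<YOUR_API_KEY>"

curl https://api.langbase.com/beta/org/{org}/pipes \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer $LANGBASE_API_KEY" \
-d '{"name": "Test Pipe"}'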


Deprecated: /beta/org/{org}/pipes

Create a new org pipe

Create a new organization pipe by sending the pipe configuration inside the request body.

Required headers

  • Name
    Content-Type
    Type
    string
    Description

    Request content type. Needs to be application/json.

  • Name
    Authorization
    Type
    string
    Description

    Replace YOUR_API_KEY with your Organization API key.

Required path parameters

  • Name
    org
    Type
    string
    Description

    The organization name.

    Replace {org} with your organization name.

Required attributes

  • Name
    name
    Type
    string
    Description

    Name of the pipe.

Optional attributes

    • Name
      description
      Type
string
      Description

      Short description of the pipe.

      Default: ''

    • Name
      status
      Type
      string
      Description

      Status of the pipe.

      Default: public

      Can be one of: public, private

    • Name
      type
      Type
      string
      Description

      Type of the pipe.

      Default: generate

      Can be one of: generate, chat

    • Name
      config
      Type
      object
      Description

Configuration object of the pipe. An example request that sends a full config object follows the response example below.

      Default: {}

    • Name
      config.meta.stream
      Type
      boolean
      Description

If enabled, the output will be streamed in real-time, like ChatGPT. This is helpful if the user is reading the output directly.

      Default: true

    • Name
      config.meta.json
      Type
      boolean
      Description

      Enforce the output to be in JSON format.

      Default: false

    • Name
      config.meta.store
      Type
      boolean
      Description

If enabled, both the prompt and the completions will be stored in the database. Otherwise, only the system prompt and few-shot messages will be saved.

      Default: true

    • Name
      config.meta.moderate
      Type
      boolean
      Description

      If enabled, Langbase blocks flagged requests automatically.

      Default: false

    • Name
      config.model.name
      Type
      string
      Description

      ID of the LLM model. You can copy the ID of a model from the list of supported LLM models at Langbase.

      Default: gpt-4o-mini

    • Name
      config.model.provider
      Type
      string
      Description

      Name of the LLM model provider. Check out the list of all the supported LLM providers at Langbase.

      Default: OpenAI

      Can be one of the supported providers: OpenAI, Together, Anthropic, Groq, Google, Cohere.

    • Name
      config.model.tool_choice
      Type
      'auto' | 'required' | 'object'
      Description

      Controls which (if any) tool is called by the model.

      • auto - the model can pick between generating a message or calling one or more tools.
      • required - the model must call one or more tools.
      • object - Specifying a particular tool via {"type": "function", "function": {"name": "my_function"}} forces the model to call that tool.

      Default: auto

    • Name
      config.model.parallel_tool_calls
      Type
      boolean
      Description

Whether to call multiple tools in parallel, allowing the effects and results of these tool calls to be resolved in parallel.

      Default: true

    • Name
      config.model.params.top_p
      Type
      number
      Description

      An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.

      Default: 1

    • Name
      config.model.params.max_tokens
      Type
      number
      Description

      Maximum number of tokens in the response message returned.

      Default: 1000

    • Name
      config.model.params.temperature
      Type
      number
      Description

      What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random. Lower values like 0.2 will make it more focused and deterministic.

      Default: 0.7

    • Name
      config.model.params.presence_penalty
      Type
      number
      Description

Penalizes a token based on whether it already appears in the text so far, nudging the model toward new topics.

      Default: 1

    • Name
      config.model.params.frequency_penalty
      Type
      number
      Description

Penalizes a token based on how frequently it already appears in the text so far, reducing verbatim repetition.

      Default: 1

    • Name
      config.model.params.stop
      Type
      array
      Description

      Up to 4 sequences where the API will stop generating further tokens.

      Default: []

    • Name
      config.prompt.system
      Type
      string
      Description

System prompt. Insert variables in the prompt with syntax like {{variableName}}.

      Default: You're a helpful AI assistant.

    • Name
      config.prompt.opening
      Type
      string
      Description

      Chat opening prompt.

      Default: Welcome to Langbase. Prompt away!

    • Name
      config.prompt.safety
      Type
      string
      Description

      AI Safety prompt.

      Default: ''

    • Name
      config.prompt.messages
      Type
      array
      Description

      An array containing message objects.

      Default: []

    • Name
      config.prompt.variables
      Type
      array
      Description

      An array containing different variable objects.

      Default: []

    • Name
      config.prompt.json
      Type
      string
      Description

      Use this prompt to define the JSON output format, schema, and more. It will be appended to the system prompt.

      Default: ''

    • Name
      config.prompt.rag
      Type
      string
      Description

      Use this prompt to make the LLM answer questions from Memoryset documents.

      Default: ''

    • Name
      config.tools
      Type
      array
      Description

      An array of objects with valid tool definitions.

Read more about valid tool definitions.

      Default: []

    • Name
      config.memorysets
      Type
      array
      Description

      An array of memoryset names.

      Default: []

Basic Pipe

POST
/beta/org/{org}/pipes
curl https://api.langbase.com/beta/org/{org}/pipes \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer <YOUR_API_KEY>" \
-d '{
  "name": "Test Pipe",
  "description": "This is a test pipe",
  "status": "public",
  "type": "chat"
}'

Response

{
  "name": "test-pipe",
  "type": "chat",
  "description": "This is a create Pipe test from API",
  "status": "private",
  "api_key": "pipe_4FVBn2DgrzfJf...",
  "owner_login": "langbase",
  "url": "https://langbase.com/langbase/test-pipe"
}
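
The Basic Pipe example above only sets the top-level attributes. The sketch below also sends a config object. It assumes that the dotted attribute names listed above (config.meta.stream, config.model.params.temperature, config.prompt.variables, and so on) map onto nested JSON objects, that each variable object takes a name/value shape, and that prompt variables use the {{city}} style syntax; these are assumptions, so adjust them to whatever your pipe actually accepts.

Pipe with custom config

POST
/beta/org/{org}/pipes
curl https://api.langbase.com/beta/org/{org}/pipes \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer <YOUR_API_KEY>" \
-d '{
  "name": "City Guide Pipe",
  "description": "Answers questions about a given city",
  "status": "private",
  "type": "generate",
  "config": {
    "meta": {
      "stream": true,
      "json": false,
      "store": true,
      "moderate": false
    },
    "model": {
      "name": "gpt-4o-mini",
      "provider": "OpenAI",
      "params": {
        "temperature": 0.7,
        "max_tokens": 500,
        "top_p": 1,
        "stop": []
      }
    },
    "prompt": {
      "system": "You are a helpful AI assistant. Answer questions about {{city}}.",
      "opening": "Ask me anything about your city.",
      "variables": [
        { "name": "city", "value": "Paris" }
      ]
    }
  }
}'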

Deprecated: /beta/user/pipes

Create a new user pipe

Create a new user pipe by sending the pipe configuration inside the request body.

Required headers

  • Name
    Content-Type
    Type
    string
    Description

    Request content type. Needs to be application/json.

  • Name
    Authorization
    Type
    string
    Description

    Replace YOUR_API_KEY with your User API key.

Required attributes

  • Name
    name
    Type
    string
    Description

    Name of the pipe.

Optional attributes

    • Name
      description
      Type
string
      Description

      Short description of the pipe.

      Default: ''

    • Name
      status
      Type
      string
      Description

      Status of the pipe.

      Default: public

      Can be one of: public, private

    • Name
      type
      Type
      string
      Description

      Type of the pipe.

      Default: generate

      Can be one of: generate, chat

    • Name
      config
      Type
      object
      Description

      Configuration object of the pipe.

      Default: {}

    • Name
      config.meta.stream
      Type
      boolean
      Description

If enabled, the output will be streamed in real-time, like ChatGPT. This is helpful if the user is reading the output directly.

      Default: true

    • Name
      config.meta.json
      Type
      boolean
      Description

      Enforce the output to be in JSON format.

      Default: false

    • Name
      config.meta.store
      Type
      boolean
      Description

If enabled, both the prompt and the completions will be stored in the database. Otherwise, only the system prompt and few-shot messages will be saved.

      Default: true

    • Name
      config.meta.moderate
      Type
      boolean
      Description

      If enabled, Langbase blocks flagged requests automatically.

      Default: false

    • Name
      config.model.name
      Type
      string
      Description

      ID of the LLM model. You can copy the ID of a model from the list of supported LLM models at Langbase.

      Default: gpt-3.5-turbo

    • Name
      config.model.provider
      Type
      string
      Description

      Name of the LLM model provider. Check out the list of all the supported LLM providers at Langbase.

      Default: OpenAI

Can be one of the supported providers: OpenAI, Together, Anthropic, Groq, Google, Cohere.

    • Name
      config.model.tool_choice
      Type
      'auto' | 'required' | 'object'
      Description

      Controls which (if any) tool is called by the model.

      • auto - the model can pick between generating a message or calling one or more tools.
      • required - the model must call one or more tools.
      • object - Specifying a particular tool via {"type": "function", "function": {"name": "my_function"}} forces the model to call that tool.

      Default: auto

    • Name
      config.model.parallel_tool_calls
      Type
      boolean
      Description

Whether to call multiple tools in parallel, allowing the effects and results of these tool calls to be resolved in parallel.

      Default: true

    • Name
      config.model.params.top_p
      Type
      number
      Description

      An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.

      Default: 1

    • Name
      config.model.params.max_tokens
      Type
      number
      Description

      Maximum number of tokens in the response message returned.

      Default: 1000

    • Name
      config.model.params.temperature
      Type
      number
      Description

      What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random. Lower values like 0.2 will make it more focused and deterministic.

      Default: 0.7

    • Name
      config.model.params.presence_penalty
      Type
      number
      Description

Penalizes a token based on whether it already appears in the text so far, nudging the model toward new topics.

      Default: 1

    • Name
      config.model.params.frequency_penalty
      Type
      number
      Description

Penalizes a token based on how frequently it already appears in the text so far, reducing verbatim repetition.

      Default: 1

    • Name
      config.model.params.stop
      Type
      array
      Description

      Up to 4 sequences where the API will stop generating further tokens.

      Default: []

    • Name
      config.prompt.system
      Type
      string
      Description

System prompt. Insert variables in the prompt with syntax like {{variableName}}.

      Default: You're a helpful AI assistant.

    • Name
      config.prompt.opening
      Type
      string
      Description

      Chat opening prompt.

      Default: Welcome to Langbase. Prompt away!

    • Name
      config.prompt.safety
      Type
      string
      Description

      AI Safety prompt.

      Default: ''

    • Name
      config.prompt.messages
      Type
      array
      Description

      An array containing message objects.

      Default: []

    • Name
      config.prompt.variables
      Type
      array
      Description

      An array containing different variable objects.

      Default: []

    • Name
      config.prompt.json
      Type
      string
      Description

      Use this prompt to define the JSON output format, schema, and more. It will be appended to the system prompt.

      Default: ''

    • Name
      config.prompt.rag
      Type
      string
      Description

      Use this prompt to make the LLM answer questions from Memoryset documents.

      Default: ''

    • Name
      config.tools
      Type
      array
      Description

An array of objects with valid tool definitions. An example request that sends tools follows the response example below.

Read more about valid tool definitions.

      Default: []

    • Name
      config.memorysets
      Type
      array
      Description

      An array of memoryset names.

      Default: []

Basic Pipe

POST
/beta/user/pipes
curl https://api.langbase.com/beta/user/pipes \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer <YOUR_API_KEY>" \
-d '{
  "name": "Test Pipe",
  "description": "This is a test pipe",
  "status": "public",
  "type": "chat"
}'

Response

{
  "name": "test-pipe",
  "type": "chat",
  "description": "This is a create Pipe test from API",
  "status": "private",
  "api_key": "pipe_4FVBn2DgrzfJf...",
  "owner_login": "langbase",
  "url": "https://langbase.com/langbase/test-pipe"
}
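
For pipes that call tools, the request can also send config.tools together with config.model.tool_choice. The sketch below assumes the same nested config shape as the org example above and an OpenAI-style function tool definition (the tool_choice description on this page uses that same {"type": "function", ...} shape); get_weather and its parameters are hypothetical.

Pipe with tools

POST
/beta/user/pipes
curl https://api.langbase.com/beta/user/pipes \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer <YOUR_API_KEY>" \
-d '{
  "name": "Weather Pipe",
  "description": "Chat pipe that can call a weather tool",
  "status": "private",
  "type": "chat",
  "config": {
    "model": {
      "name": "gpt-4o-mini",
      "provider": "OpenAI",
      "tool_choice": "auto",
      "parallel_tool_calls": true
    },
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "get_weather",
          "description": "Get the current weather for a city",
          "parameters": {
            "type": "object",
            "properties": {
              "city": { "type": "string", "description": "City name" }
            },
            "required": ["city"]
          }
        }
      }
    ]
  }
}'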