Pipe: Run v1
The Run API allows you to execute any pipe and receive its response. It supports all use cases of Pipes, including chat interactions, single generation tasks, and function calls.
The Run API supports:
- Single generation requests for straightforward tasks.
- Dynamic variables to create adaptable prompts in real-time.
- Thread management for handling multi-turn conversations.
- Seamless conversation continuation, ensuring smooth transitions across interactions.
If needed, Langbase can store messages and conversation threads, allowing for persistent conversation history for chat use cases.
Run a pipe
Run a pipe by sending the required data with the request. For a basic request, send a `messages` array in the request body.
Headers
- `Content-Type` (string, required) — Request content type. Must be `application/json`.
- `Authorization` (string, required) — Replace `PIPE_API_KEY` with your Pipe API key.
- `LB-LLM-Key` (string) — LLM API key for the request. If not provided, the LLM key from the Pipe/User/Organization keyset will be used.
Body Parameters
- `messages` (Array<Message>, required) — An array containing message objects.

Message Object

```ts
interface Message {
  role: string;
  content?: string | ContentType[] | null;
  tool_call_id?: string;
  name?: string;
}
```
- `role` (string, required) — The role of the message, i.e., `system` | `user` | `assistant` | `tool`.
- `content` (string | ContentType[] | null) — The content of the message.
  - String — for text generation, it's a plain string.
  - Null or undefined — tool call messages can have no content.
  - ContentType[] — array used in vision and audio models, where content consists of structured parts (e.g., text, image URLs).

ContentType Object

```ts
interface ContentType {
  type: string;
  text?: string;
  image_url?: {
    url: string;
    detail?: string;
  };
}
```
- `tool_call_id` (string) — The ID of the called LLM tool if the role is `tool`.
- `name` (string) — The name of the called tool if the role is `tool`.
- `variables` (Record<string, string>) — An object containing pipe variables. The key is the variable name and the value is the variable value. Default: `{}`.
- `threadId` (string) — The ID of an existing chat thread. The conversation will continue in this thread.
- `tools` (Array<Tools>) — A list of tools the model may call.

Tools Object

```ts
interface ToolsOptions {
  type: 'function';
  function: FunctionOptions;
}
```

- `type` ('function') — The type of the tool. Currently, only `function` is supported.
- `function` (FunctionOptions) — The function that the model may call.

FunctionOptions Object

```ts
export interface FunctionOptions {
  name: string;
  description?: string;
  parameters?: Record<string, unknown>;
}
```

- `name` (string) — The name of the function to call.
- `description` (string) — The description of the function.
- `parameters` (Record<string, unknown>) — The parameters of the function.
- `memory` (Array<Memory>) — An array of memory objects that specify the memories your pipe should use at run time. If memories are defined here, they override the default pipe memories, which are then ignored. All referenced memories must exist in your account. If this property is not set or is empty, the pipe falls back to its default memories. Default: `undefined`.

Run-time memory array example

```json
"memory": [
  { "name": "runtime-memory-1" },
  { "name": "runtime-memory-2" }
]
```

Each memory in the array follows this structure:

Memory Object

```ts
interface Memory {
  name: string;
}
```

- `name` (string) — The name of the memory.
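Put together, a complete request body combining these parameters might look like the sketch below. All names here (the `topic` variable, the `get_weather` tool, the memory name) are hypothetical, and the variable only has an effect if the pipe's prompt actually references `{{topic}}`:

```typescript
// Hypothetical request body combining the parameters above.
const requestBody = {
  messages: [{ role: "user", content: "Summarize the topic in one line." }],
  variables: { topic: "vector databases" }, // fills {{topic}} in the pipe prompt
  threadId: undefined, // omitted from the JSON, so a new thread is started
  tools: [
    {
      type: "function",
      function: {
        name: "get_weather", // hypothetical tool
        description: "Get the current weather for a city.",
        parameters: {
          type: "object",
          properties: { city: { type: "string" } },
          required: ["city"],
        },
      },
    },
  ],
  memory: [{ name: "runtime-memory-1" }], // overrides the pipe's default memories
};

// JSON.stringify drops undefined properties such as threadId:
const payload = JSON.stringify(requestBody);
```

Note that leaving `threadId` as `undefined` (or omitting it) starts a new thread, while setting `memory` overrides the pipe's default memories entirely.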
Usage example
Install the SDK
```shell
npm i langbase
```
Run an agent pipe
Basic request without streaming
```shell
curl https://api.langbase.com/v1/pipes/run \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer <PIPE_API_KEY>' \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": "Hello!"
      }
    ]
  }'
```
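The same request can be sketched in TypeScript with `fetch`. Only the URL and headers below come from this page; the `buildRunRequest` helper is illustrative:

```typescript
// Build the request for the basic (non-streaming) call.
interface Message {
  role: string;
  content?: string | null;
}

function buildRunRequest(apiKey: string, messages: Message[]) {
  return {
    url: "https://api.langbase.com/v1/pipes/run",
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({ messages }),
    },
  };
}

// Usage (not executed here; needs a valid Pipe API key and network access):
// const { url, init } = buildRunRequest(PIPE_API_KEY, [
//   { role: "user", content: "Hello!" },
// ]);
// const res = await fetch(url, init);
// const data = await res.json(); // RunResponse when stream is off
```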
Response Headers
- `lb-thread-id` (string) — The ID of the new/existing thread. If you want to continue the conversation in this thread, send it as `threadId` in the next request.
Response Header
```
HTTP/2 200
lb-thread-id: "…-…-…-…-… ID of the thread"
… … … rest of the headers … : … … …
```
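To continue a conversation, read `lb-thread-id` from the response headers and send it back as `threadId` in the next request. A minimal sketch, using a constructed `Headers` object in place of a real response (the thread ID is hypothetical):

```typescript
// Build the follow-up request body, carrying the thread ID forward if present.
function followUpBody(
  responseHeaders: Headers,
  messages: { role: string; content: string }[]
): { messages: { role: string; content: string }[]; threadId?: string } {
  const threadId = responseHeaders.get("lb-thread-id") ?? undefined;
  return { messages, threadId };
}

// Simulated response headers for illustration:
const previousResponseHeaders = new Headers({ "lb-thread-id": "thread_abc123" });
const nextBody = followUpBody(previousResponseHeaders, [
  { role: "user", content: "And what did I just ask you?" },
]);
```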
Response Body
The response of the endpoint is a `Promise<RunResponse | RunResponseStream>` object.

RunResponse Object

```ts
interface RunResponse {
  completion: string;
  raw: RawResponse;
}
```
- `completion` (string) — The generated text completion.
- `raw` (RawResponse) — The raw response object.

RawResponse Object

```ts
interface RawResponse {
  id: string;
  object: string;
  created: number;
  model: string;
  choices: ChoiceGenerate[];
  usage: Usage;
  system_fingerprint: string | null;
}
```
- `id` (string) — The ID of the raw response.
- `object` (string) — The object type name of the response.
- `created` (number) — The timestamp of the response creation.
- `model` (string) — The model used to generate the response.
- `choices` (ChoiceGenerate[]) — A list of chat completion choices. Can contain more than one element if `n` is greater than 1.

Choice Object for `langbase.pipes.run()` with stream off

```ts
interface ChoiceGenerate {
  index: number;
  message: Message;
  logprobs: boolean | null;
  finish_reason: string;
}
```
- `usage` (Usage) — The usage object, including the following properties.

Usage Object

```ts
interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}
```
- `system_fingerprint` (string) — This fingerprint represents the backend configuration that the model runs with.
RunResponseStream Object

The response of the endpoint with `stream: true` is a `Promise<RunResponseStream>`.

```ts
interface RunResponseStream {
  id: string;
  object: string;
  created: number;
  model: string;
  system_fingerprint: string | null;
  choices: ChoiceStream[];
}
```
- `id` (string) — The ID of the response.
- `object` (string) — The object type name of the response.
- `created` (number) — The timestamp of the response creation.
- `model` (string) — The model used to generate the response.
- `system_fingerprint` (string) — This fingerprint represents the backend configuration that the model runs with.
- `choices` (ChoiceStream[]) — A list of chat completion choices. Can contain more than one element if `n` is greater than 1.

Choice Object with stream true

```ts
interface ChoiceStream {
  index: number;
  delta: Delta;
  logprobs: boolean | null;
  finish_reason: string;
}
```
RunResponse type
```json
{
  "completion": "AI Engineer is a person who designs, builds, and maintains AI systems.",
  "raw": {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "created": 1720131129,
    "model": "gpt-4o-mini",
    "choices": [
      {
        "index": 0,
        "message": {
          "role": "assistant",
          "content": "AI Engineer is a person who designs, builds, and maintains AI systems."
        },
        "logprobs": null,
        "finish_reason": "stop"
      }
    ],
    "usage": {
      "prompt_tokens": 28,
      "completion_tokens": 36,
      "total_tokens": 64
    },
    "system_fingerprint": "fp_123"
  }
}
```
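With streaming off, the generated text is available both as the top-level `completion` and inside `raw.choices[0].message.content`. A sketch that reads both from a response shaped like the example above:

```typescript
// Minimal shape of a non-streaming RunResponse, mirroring the example above.
interface RunResponseLite {
  completion: string;
  raw: {
    choices: { message: { role: string; content: string | null } }[];
    usage: { prompt_tokens: number; completion_tokens: number; total_tokens: number };
  };
}

const response: RunResponseLite = {
  completion: "AI Engineer is a person who designs, builds, and maintains AI systems.",
  raw: {
    choices: [
      {
        message: {
          role: "assistant",
          content: "AI Engineer is a person who designs, builds, and maintains AI systems.",
        },
      },
    ],
    usage: { prompt_tokens: 28, completion_tokens: 36, total_tokens: 64 },
  },
};

// Both fields carry the same generated text:
const fromCompletion = response.completion;
const fromChoices = response.raw.choices[0].message.content;
```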
RunResponseStream type with stream true
```
// A stream chunk looks like this …
{
  "id": "chatcmpl-123",
  "object": "chat.completion.chunk",
  "created": 1719848588,
  "model": "gpt-4o-mini",
  "system_fingerprint": "fp_44709d6fcb",
  "choices": [{
    "index": 0,
    "delta": { "content": "Hi" },
    "logprobs": null,
    "finish_reason": null
  }]
}

// More chunks as they come in...
{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1719848588,"model":"gpt-4o-mini","system_fingerprint":"fp_44709d6fcb","choices":[{"index":0,"delta":{"content":"there"},"logprobs":null,"finish_reason":null}]}
…
{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1719848588,"model":"gpt-4o-mini","system_fingerprint":"fp_44709d6fcb","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
```