Stream Text: pipe.streamText()

You can use a pipe to get any LLM to stream text based on a user prompt. Streaming provides a better user experience: the moment the LLM starts generating text, the user sees words stream onto the screen, just like ChatGPT.

For example, you can ask a pipe to stream a text completion based on a user prompt like "Who is an AI Engineer?" or give it an entire doc and ask it to summarize it.

The Langbase AI SDK provides a streamText() function to stream text using pipes with any LLM.

Deprecation Notice

This SDK method has been deprecated. Please use the new run SDK method with `stream: true`.


Deprecated

Stream a text completion using the streamText() function.

Function Signature

```ts
streamText(options)

// With types.
streamText(options: StreamOptions)
```

  • Name
    options
    Type
    StreamOptions
    Description

    StreamOptions Object

```ts
interface StreamOptions {
	messages?: Message[];
	variables?: Variable[];
	chat?: boolean;
	threadId?: string | null;
}
```

    Following are the properties of the options object.


messages

  • Name
    messages
    Type
    Array<Message>
    Description

An array of Message objects with the following properties. Optional if variables are provided.

    Message Object

```ts
interface Message {
	role: 'user' | 'assistant' | 'system' | 'tool';
	content: string;
	name?: string;
	tool_call_id?: string;
}
```
    • Name
      role
      Type
'user' | 'assistant' | 'system' | 'tool'
      Description

      The role of the author of this message.

    • Name
      content
      Type
      string
      Description

The contents of the message.

    • Name
      name
      Type
      string
      Description

The name of the tool called by the LLM.

    • Name
      tool_call_id
      Type
      string
      Description

The ID of the tool call made by the LLM.
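To make the Message shape concrete, here is a small illustrative sketch (not taken from the SDK docs; the tool name, tool_call_id, and contents are hypothetical) of a messages array mixing a system prompt, a user question, and a tool result that references a tool call by its ID:

```typescript
// Message type matching the interface above.
interface Message {
	role: 'user' | 'assistant' | 'system' | 'tool';
	content: string;
	name?: string;
	tool_call_id?: string;
}

// A sketch of a messages array: a system prompt, a user question,
// and a tool result message referencing a tool call by its ID.
const messages: Message[] = [
	{role: 'system', content: 'You are a helpful assistant.'},
	{role: 'user', content: 'What is the weather in Paris?'},
	{
		role: 'tool',
		content: '{"temperature": "18C"}',
		name: 'get_current_weather',
		tool_call_id: 'call_123',
	},
];

console.log(messages.length);
```

The `name` and `tool_call_id` fields are only set on the tool message; the other roles omit the optional fields.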


variables

  • Name
    variables
    Type
    Array<Variable>
    Description

An array of variables, each with name and value params. Optional if messages are provided.

    Variable Object

```ts
interface Variable {
	name: string;
	value: string;
}
```
    • Name
      name
      Type
      string
      Description

      The name of the variable.

    • Name
      value
      Type
      string
      Description

      The value of the variable.
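Variables fill `{{name}}` placeholders in the prompt. The actual substitution happens on the Langbase side when the pipe runs; the helper below is a local sketch, for illustration only, of the templating idea:

```typescript
interface Variable {
	name: string;
	value: string;
}

// Illustration only: pipes resolve {{name}} placeholders server-side.
// This local helper just demonstrates the substitution concept.
function fillTemplate(template: string, variables: Variable[]): string {
	return variables.reduce(
		(text, v) => text.split(`{{${v.name}}}`).join(v.value),
		template,
	);
}

const prompt = fillTemplate('Who is {{question}}?', [
	{name: 'question', value: 'an AI Engineer'},
]);

console.log(prompt); // "Who is an AI Engineer?"
```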


chat

  • Name
    chat
    Type
    boolean
    Description

    For a chat pipe, set chat to true.

    This is useful when you want to use a chat pipe to generate text as it returns a threadId. Defaults to false.


threadId

  • Name
    threadId
    Type
    string | null
    Description

    The ID of the thread. Useful for a chat pipe to continue the conversation in the same thread. Optional.

    • If threadId is not provided, a new thread will be created, e.g., the first message of a new chat will not have a threadId.
    • After the first message, a new threadId will be returned.
    • Use this threadId to continue the conversation in the same thread from the second message onwards.

Install the SDK

```shell
npm i langbase
pnpm i langbase
yarn add langbase
```

.env file

```shell
# Add your Pipe API key here.
LANGBASE_MY_PIPE_API_KEY="pipe_12345"

# … add more keys if you have more pipes.
```

Generate Pipe: Use streamText()

```ts
import { Pipe } from 'langbase';

// 1. Initiate your Pipes. `myPipe` as an example.
const myPipe = new Pipe({
	apiKey: process.env.LANGBASE_MY_PIPE_API_KEY!,
});

// 2. SIMPLE example. Stream text by asking a question.
const {stream} = await myPipe.streamText({
	messages: [{role: 'user', content: 'Who is an AI Engineer?'}],
});

// 3. Print the stream
// NOTE: This is a Node.js only example.
// Stream works differently in browsers.
// For browsers, Next.js, and more examples:
// https://langbase.com/docs/sdk/examples
for await (const chunk of stream) {
	// Streaming text part — a single word or several.
	const textPart = chunk.choices[0]?.delta?.content || '';

	// Demo: Print the stream to shell output — you can use it however.
	process.stdout.write(textPart);
}
```

Variables with streamText()

```ts
// 1. Initiate the Pipe.
// … same as above

// 2. Stream text by asking a question.
const {stream} = await myPipe.streamText({
	messages: [{role: 'user', content: 'Who is {{question}}?'}],
	variables: [{name: 'question', value: 'AI Engineer'}],
});

// 3. Print the stream
// … same as above
```

Chat Pipe: Use streamText()

```ts
import { Pipe } from 'langbase';

// 1. Initiate the Pipe.
const myPipe = new Pipe({
	apiKey: process.env.LANGBASE_MY_PIPE_API_KEY!,
});

// 2. Stream text by asking a question.
const {stream, threadId} = await myPipe.streamText({
	messages: [{role: 'user', content: 'My company is called Langbase'}],
	chat: true,
});

// 3. Print the stream
// NOTE: This is a Node.js only example.
// Stream works differently in browsers.
// For browsers, Next.js, and more examples:
// https://langbase.com/docs/sdk/examples
for await (const chunk of stream) {
	// Streaming text part — a single word or several.
	const textPart = chunk.choices[0]?.delta?.content || '';

	// Demo: Print the stream to shell output — you can use it however.
	process.stdout.write(textPart);
}

// 4. Continue the conversation in the same thread by sending
// `threadId` from the second message onwards.
const {stream: nextStream} = await myPipe.streamText({
	messages: [{role: 'user', content: 'Tell me the name of my company?'}],
	chat: true,
	threadId,
});

// You'll see any LLM will know the company is `Langbase`
// since it's the same chat thread. This is how you can
// continue a conversation in the same thread.
```

streamText() returns a Promise<StreamResponse>: an object with a stream and, for chat pipes, a threadId.

StreamResponse Object

```ts
interface StreamResponse {
	threadId: string | null;
	stream: StreamText;
}
```

threadId

  • Name
    threadId
    Type
string | null
    Description

    The ID of the thread. Useful for a chat pipe to continue the conversation in the same thread. Optional.


stream

  • Name
    stream
    Type
    StreamText
    Description

    Stream is a StreamText object with a streamed sequence of StreamChunk objects.

StreamText Object

```ts
type StreamText = Stream<StreamChunk>;
```

StreamChunk

  • Name
    StreamChunk
    Type
    StreamChunk
    Description

Represents a streamed chunk of a completion response returned by the model, based on the provided input.

StreamChunk Object

```ts
interface StreamChunk {
	id: string;
	object: string;
	created: number;
	model: string;
	choices: ChoiceStream[];
}
```

A StreamChunk object has the following properties.

  • Name
    id
    Type
    string
    Description

    The ID of the response.

  • Name
    object
    Type
    string
    Description

    The object type name of the response.

  • Name
    created
    Type
    number
    Description

    The timestamp of the response creation.

  • Name
    model
    Type
    string
    Description

    The model used to generate the response.

  • Name
    choices
    Type
    ChoiceStream[]
    Description

    A list of chat completion choices. Can contain more than one element if n is greater than 1.

    Choice Object for streamText()

```ts
interface ChoiceStream {
	index: number;
	delta: Delta;
	logprobs: boolean | null;
	finish_reason: string;
}
```

Response of streamText()

```ts
// Response of a streamText() call is a Promise<StreamResponse>.
{
	threadId: 'string-uuid-123',
	stream: StreamText // example of streamed chunks below.
}
```

StreamText has stream chunks

```ts
// A stream chunk looks like this …
{
	"id": "chatcmpl-123",
	"object": "chat.completion.chunk",
	"created": 1719848588,
	"model": "gpt-4o-mini",
	"system_fingerprint": "fp_44709d6fcb",
	"choices": [{
		"index": 0,
		"delta": {"content": "Hi"},
		"logprobs": null,
		"finish_reason": null
	}]
}

// More chunks as they come in...
{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1719848588,"model":"gpt-4o-mini","system_fingerprint":"fp_44709d6fcb","choices":[{"index":0,"delta":{"content":"there"},"logprobs":null,"finish_reason":null}]}

{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1719848588,"model":"gpt-4o-mini","system_fingerprint":"fp_44709d6fcb","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
```
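Concatenating the `delta.content` of each chunk reconstructs the full completion text. The sketch below mocks a few chunks locally (the contents are invented for the example); in real code the chunks come from the `stream` returned by streamText():

```typescript
// Minimal chunk shape covering only the fields used here.
interface MockChunk {
	choices: {delta: {content?: string}; finish_reason: string | null}[];
}

// Mocked chunks standing in for a real stream.
const chunks: MockChunk[] = [
	{choices: [{delta: {content: 'Hi'}, finish_reason: null}]},
	{choices: [{delta: {content: ' there'}, finish_reason: null}]},
	{choices: [{delta: {}, finish_reason: 'stop'}]}, // final chunk: empty delta
];

// Accumulate each delta's content into the full completion text.
let fullText = '';
for (const chunk of chunks) {
	fullText += chunk.choices[0]?.delta?.content || '';
}

console.log(fullText); // "Hi there"
```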

Response stream with tool fn calls

```ts
// Stream chunks with tool fn calls have content null and include a `tool_calls` array.
{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1723757387,"model":"gpt-4o-mini","system_fingerprint":null,"choices":[{"index":0,"delta":{"role":"assistant","content":null,"tool_calls":[{"index":0,"id":"call_123","type":"function","function":{"name":"get_current_weather","arguments":""}}]},"logprobs":null,"finish_reason":null}]}

{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1723757387,"model":"gpt-4o-mini","system_fingerprint":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}]}

{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1723757387,"model":"gpt-4o-mini","system_fingerprint":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"location"}}]},"logprobs":null,"finish_reason":null}]}

...

{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1723757387,"model":"gpt-4o-mini","system_fingerprint":null,"choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}]}
```
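As the chunks above show, a tool call's `arguments` string arrives in fragments spread across many chunks. A common pattern is to concatenate the fragments per tool-call index and JSON-parse the result once the stream finishes. The sketch below mocks the deltas locally (the fragment boundaries and IDs are invented for the example) rather than reading from a live stream:

```typescript
// Assumed shape of a tool_calls delta entry, mirroring the chunks above.
interface ToolCallDelta {
	index: number;
	id?: string;
	function?: {name?: string; arguments?: string};
}

// Mocked per-chunk deltas: the first carries the id and name,
// later ones carry argument fragments, the last chunk has no delta.
const toolDeltas: {tool_calls?: ToolCallDelta[]}[] = [
	{tool_calls: [{index: 0, id: 'call_123', function: {name: 'get_current_weather', arguments: ''}}]},
	{tool_calls: [{index: 0, function: {arguments: '{"'}}]},
	{tool_calls: [{index: 0, function: {arguments: 'location'}}]},
	{tool_calls: [{index: 0, function: {arguments: '": "Paris"}'}}]},
	{}, // final chunk: finish_reason "tool_calls", empty delta
];

// Accumulate fragments per tool-call index.
const calls: {id?: string; name?: string; args: string}[] = [];
for (const delta of toolDeltas) {
	for (const tc of delta.tool_calls ?? []) {
		calls[tc.index] ??= {args: ''};
		if (tc.id) calls[tc.index].id = tc.id;
		if (tc.function?.name) calls[tc.index].name = tc.function.name;
		calls[tc.index].args += tc.function?.arguments ?? '';
	}
}

// Once the stream ends, the concatenated arguments form valid JSON.
const parsedArgs = JSON.parse(calls[0].args);
console.log(calls[0].name, parsedArgs);
```

Parsing only after the stream ends matters: intermediate concatenations like `{"location` are not valid JSON yet.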