What is Langbase Agent?

Agent, an AI Primitive by Langbase, is a runtime LLM agent: you specify all parameters at runtime and get the response back from the agent.

Agent uses our unified LLM API to provide a consistent interface for interacting with 100+ LLMs across all the top LLM providers. See the list of supported models and providers here.

All cutting-edge LLM features are supported, including streaming, JSON mode, tool calling, structured outputs, vision, and more. It is designed for a wide range of use cases, including agentic workflows, chatbots, virtual assistants, and other AI-powered applications.
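Because the API is unified, switching providers comes down to changing the model string (and the provider API key) passed to agent.run. Below is a minimal sketch; the Anthropic model ID and the ANTHROPIC_API_KEY variable are illustrative assumptions, so check the supported models list for the exact names.

Switch providers (sketch)

import { Langbase } from 'langbase';

const langbase = new Langbase({ apiKey: process.env.LANGBASE_API_KEY! });

async function main() {
  // Same call shape for every provider: only the model string and provider key change.
  const fromOpenAI = await langbase.agent.run({
    model: 'openai:gpt-4.1',
    apiKey: process.env.LLM_API_KEY!, // your OpenAI key
    instructions: 'You are a helpful assistant.',
    input: 'Hello!',
    stream: false,
  });

  const fromAnthropic = await langbase.agent.run({
    model: 'anthropic:claude-sonnet-4', // illustrative model ID, see the supported models list
    apiKey: process.env.ANTHROPIC_API_KEY!, // illustrative env var for your Anthropic key
    instructions: 'You are a helpful assistant.',
    input: 'Hello!',
    stream: false,
  });

  console.log(fromOpenAI.output, fromAnthropic.output);
}

main();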



In this guide, we'll use the Langbase SDK to create an AI agent that can summarize user support queries.


Step #1: Get your Langbase API key

Every request you send to Langbase needs an API key. This guide assumes you already have one. If not, please check the instructions below.


Step #2: Set up the project

Create a new directory for your project and navigate to it.

Project setup

mkdir agent && cd agent

Initialize the project

Create a new Node.js project.

Initialize project

npm init -y

Install dependencies

You will use the Langbase SDK to run the agent and dotenv to manage environment variables.

Install dependencies

npm i langbase dotenv

Create an env file

Create a .env file in the root of your project. You will need two environment variables:

  1. LANGBASE_API_KEY: Your Langbase API key.
  2. LLM_API_KEY: Your LLM provider API key.

.env

LANGBASE_API_KEY=your_api_key_here
LLM_API_KEY=your_llm_api_key_here

Step #3: Create and run the agent

Now let's create a new file called agent.ts in the root of your project. This file will contain the code and configuration of the agent.

We will use the OpenAI GPT-4.1 model, but you can use any other supported model listed here.

In instructions, which act like a system prompt, we will specify that the agent is a support agent and should summarize user support queries. Finally, we will pass the user query as input to the agent.

agent.ts

import { Langbase } from 'langbase';
import dotenv from 'dotenv';

dotenv.config();

// Initialize the Langbase client
const langbase = new Langbase({
  apiKey: process.env.LANGBASE_API_KEY!,
});

async function main() {
  const response = await langbase.agent.run({
    model: 'openai:gpt-4.1',
    stream: false,
    apiKey: process.env.LLM_API_KEY!,
    instructions: 'You are an AI agent that summarizes user support queries for a support agent.',
    input: 'I am having trouble logging into my account. I keep getting an error message that says "Invalid credentials." I have tried resetting my password, but it still does not work. Can you help me?',
  });

  console.log('Agent Response:', response.output);
}

main();

Run the agent by executing the agent.ts script.

Run the script

npx tsx agent.ts

You should see an output similar to:

Agent Response: User can't log in. Gets "Invalid credentials" error even after password reset. Needs help.
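From here, you can switch the agent to streaming mode so the summary arrives token by token instead of as a single response. The sketch below assumes the SDK's getRunner helper and an event-style 'content' callback on the returned stream, as used in other Langbase streaming examples; check the SDK docs for the exact return shape.

agent-stream.ts (sketch)

import { getRunner, Langbase } from 'langbase';
import dotenv from 'dotenv';

dotenv.config();

const langbase = new Langbase({
  apiKey: process.env.LANGBASE_API_KEY!,
});

async function main() {
  // With stream: true, the call is assumed to return a readable stream instead of a final output.
  const { stream } = await langbase.agent.run({
    model: 'openai:gpt-4.1',
    stream: true,
    apiKey: process.env.LLM_API_KEY!,
    instructions: 'You are an AI agent that summarizes user support queries for a support agent.',
    input: 'I am having trouble logging into my account. Can you help me?',
  });

  // getRunner wraps the stream and emits content chunks as they arrive.
  const runner = getRunner(stream);
  runner.on('content', content => {
    process.stdout.write(content);
  });
}

main();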

Now that you have a basic understanding of how to create and run an agent, you can explore more advanced features and configurations. Here are some suggestions: