Tools
Tools, an AI Primitive by Langbase, let you extend the capabilities of your AI applications. They integrate functionality such as web search, web crawling, and other specialized tasks into your AI workflows.
By using tools, you can make your AI agents more capable of handling complex tasks and surfacing valuable insights.
Quickstart: Using Langbase Tools
Let's get started
In this guide, we'll use the Langbase SDK to interact with the Tools API, specifically focusing on web crawling and web search capabilities:
Step #1: Generate a Langbase API key
Every request you send to Langbase needs an API key. This guide assumes you already have one. If not, please check the instructions below.
Step #2: Set up your project
Create a new directory for your project and navigate to it.
Project setup
mkdir langbase-tools && cd langbase-tools
Initialize the project
Create a new Node.js project.
Initialize project
npm init -y
Install dependencies
You will use the Langbase SDK, along with dotenv to manage environment variables.
Install dependencies
npm i langbase dotenv
Create an env file
Create a .env file in the root of your project and add your API keys:
.env
LANGBASE_API_KEY=your_langbase_api_key_here
SPIDER_CLOUD_API_KEY=your_spider_cloud_api_key_here
EXA_API_KEY=your_exa_api_key_here
Step #3: Web crawling with Langbase Tools
Let's create a file named web-crawler.ts that demonstrates how to use the web crawling tool:
web-crawler.ts
import 'dotenv/config';
import { Langbase } from 'langbase';

const langbase = new Langbase({
	apiKey: process.env.LANGBASE_API_KEY!,
});

async function main() {
	// Use the crawl tool to extract content from these URLs
	const crawlResults = await langbase.tools.crawl({
		url: ['https://langbase.com', 'https://langbase.com/about'],
		apiKey: process.env.SPIDER_CLOUD_API_KEY!,
		maxPages: 1, // Limit the crawl to 1 page per URL
	});

	// Display the results
	console.log(crawlResults);
}

main();
Step #4: Run the web crawler
Run the script to crawl the specified websites:
Run the crawler
npx tsx web-crawler.ts
You should see output showing the crawled URLs and extracted content:
[
	{
		"url": "https://langbase.com/about",
		"content": "⌘Langbase –Serverless AI Agents platform# # ⌘Langbase –Serverless AI Agents platformThe most powerful serverless platform for building AI agents. Build. Deploy. Scale."
	},
	{
		"url": "https://langbase.com",
		"content": "⌘Langbase –Serverless AI Agents platform# # ⌘Langbase –Serverless AI Agents platformThe most powerful serverless platform for building AI agents. Build. Deploy. Scale."
	}
]
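Each crawl result is a plain `{ url, content }` object, so you can post-process the array with ordinary TypeScript before feeding it into the rest of your pipeline. The sketch below (an illustrative helper, not part of the Langbase SDK) normalizes whitespace and indexes the content by URL, using sample data shaped like the output above; in a real script you would pass `crawlResults` instead.

```typescript
// Sketch: post-process crawl results of shape { url, content }[].
// `indexByUrl` is a hypothetical helper for this guide.
type CrawlResult = { url: string; content: string };

function indexByUrl(results: CrawlResult[]): Map<string, string> {
	const index = new Map<string, string>();
	for (const { url, content } of results) {
		// Collapse runs of whitespace so downstream chunking is cleaner.
		index.set(url, content.replace(/\s+/g, ' ').trim());
	}
	return index;
}

// Sample data mirroring the crawl output shape shown above.
const sample: CrawlResult[] = [
	{ url: 'https://langbase.com', content: '  Serverless AI  Agents platform ' },
];

const index = indexByUrl(sample);
console.log(index.get('https://langbase.com')); // → "Serverless AI Agents platform"
```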
Step #5: Web search with Langbase Tools
Now, let's create a file named web-search.ts that demonstrates how to use the web search tool:
web-search.ts
import 'dotenv/config';
import { Langbase } from 'langbase';

const langbase = new Langbase({
	apiKey: process.env.LANGBASE_API_KEY!,
});

async function main() {
	// Perform a web search query
	const results = await langbase.tools.webSearch({
		service: 'exa',
		totalResults: 2,
		query: 'What is Langbase?',
		domains: ['https://langbase.com'],
		apiKey: process.env.EXA_API_KEY!, // Find Exa key: https://dashboard.exa.ai/api-keys
	});

	console.log(results);
}

main();
Step #6: Run the web search
Run the script to perform web searches:
Run the search
npx tsx web-search.ts
You should see output showing the search results:
[
	{
		"url": "https://langbase.com/",
		"content": "The most powerful serverless platform for building AI products. BaseAI: The first Web AI Framework for developers Build agentic ( pipes memory tools )"
	},
	{
		"url": "https://langbase.com/about",
		"content": "The most powerful serverless platform for building AI products. BaseAI: The first Web AI Framework for developers Build agentic ( pipes memory tools )"
	}
]
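Search results share the same `{ url, content }` shape, which makes them easy to fold into a prompt for retrieval-augmented generation. As a minimal sketch, the hypothetical `buildContext` helper below (not part of the Langbase SDK) numbers each result and joins them into one context block you could prepend to an LLM prompt:

```typescript
// Sketch: turn webSearch results ({ url, content }[]) into a single
// context string for a RAG prompt. `buildContext` is illustrative only.
type SearchResult = { url: string; content: string };

function buildContext(results: SearchResult[]): string {
	return results
		.map((r, i) => `[${i + 1}] ${r.url}\n${r.content}`)
		.join('\n\n');
}

// Sample data mirroring the search output shape shown above.
const sample: SearchResult[] = [
	{ url: 'https://langbase.com/', content: 'The most powerful serverless platform.' },
];

console.log(buildContext(sample));
// [1] https://langbase.com/
// The most powerful serverless platform.
```

Numbered sources also let the model cite `[1]`, `[2]`, etc. in its answer, which is a common RAG convention.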
Next Steps
- Combine tools with other Langbase primitives like Embed and Chunk to build more powerful AI apps
- Use these tools to enhance your RAG (Retrieval-Augmented Generation) systems with real-time web data
- Build something cool with the Langbase SDK and APIs
- Join our Discord community for feedback, requests, and support