MCP Servers
Langbase provides comprehensive support for Model Context Protocol (MCP). Connect AI models to both Langbase-provided and third-party MCP servers.
Table of contents
- What is MCP?
- Langbase Remote MCP Server
- Langbase Docs MCP Server
- Third-party MCP servers with Langbase
What is MCP?

Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs. It defines a common way to connect AI models to different data sources and tools. You can deploy servers locally or remotely depending on your use case. Here are some useful resources:
- Official MCP Documentation - The official documentation of the Model Context Protocol by Anthropic.
- MCP GitHub Repository - The official GitHub repository with SDKs, server implementations, and more.
- Cloudflare Remote MCP Server Guide - Step-by-step guide for deploying the Remote MCP Server.
Langbase Remote MCP Server

Langbase provides a fully compliant Remote MCP server that enables direct interaction with your Langbase pipes, memories, and agents. Create new pipe agents, upload documents to memory, run existing pipes, and manage your entire Langbase workspace seamlessly from your IDE.
This server transforms your development environment into a powerful AI workspace where you can build, test, and deploy AI agents without leaving your code editor.
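As a sketch, a remote MCP server like this one is typically registered in your IDE's MCP configuration file. The example below uses Cursor's `.cursor/mcp.json` format; the endpoint URL is a placeholder, not the actual Langbase endpoint, so substitute the URL from your Langbase dashboard.

```json
{
  "mcpServers": {
    "langbase": {
      "url": "https://YOUR-LANGBASE-REMOTE-MCP-URL"
    }
  }
}
```

Other MCP-capable editors use a similar JSON shape, usually differing only in the config file's location.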
Langbase Docs MCP Server

The Langbase Docs MCP server allows IDEs like Cursor, Windsurf, etc., to access Langbase documentation. It provides LLMs with up-to-date context about Langbase APIs, SDK, and other resources present in the docs, enabling LLMs to deliver accurate and relevant answers to your Langbase-related queries.
Perfect for getting instant help with Langbase SDK, API, and troubleshooting directly in your IDE.
Third-party MCP servers with Langbase

Langbase provides comprehensive support for third-party MCP servers. Leverage external tools and data sources with our platform. Integrate popular MCP servers like Slack, Cloudflare Browser Rendering, Intercom and more.
MCP servers work with the agent primitive. This enables you to create powerful AI agents that access external tools through a unified interface.

To use third-party MCP servers, include them in the `mcp_servers` parameter:
Using third-party MCP servers with agent
```ts
import { Langbase } from 'langbase';

const langbase = new Langbase({
  apiKey: process.env.LANGBASE_API_KEY!,
});

const response = await langbase.agent.run({
  model: 'openai:gpt-4.1-mini',
  instructions: 'You are a helpful assistant.',
  input: [
    {
      role: 'user',
      content: 'What transport protocols does the MCP spec support?',
    },
  ],
  mcp_servers: [
    {
      type: 'url',
      name: 'deepwiki',
      url: 'https://mcp.deepwiki.com/sse',
    },
  ],
  stream: false,
});
```
This approach combines Langbase infrastructure with specialized MCP servers.
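The `mcp_servers` array can hold more than one entry, letting a single agent reach tools from several servers at once. A minimal sketch of composing such a list (the `internal-tools` name and its URL are hypothetical placeholders, not real endpoints):

```typescript
// Sketch: building an mcp_servers list to pass to langbase.agent.run().
// Every entry follows the same { type, name, url } shape as the deepwiki example.
type McpServerConfig = { type: 'url'; name: string; url: string };

const mcpServers: McpServerConfig[] = [
  { type: 'url', name: 'deepwiki', url: 'https://mcp.deepwiki.com/sse' },
  // Hypothetical placeholder for a second server; swap in a real endpoint.
  { type: 'url', name: 'internal-tools', url: 'https://example.com/mcp/sse' },
];

// Pass the array as-is: langbase.agent.run({ ..., mcp_servers: mcpServers })
console.log(mcpServers.map(s => s.name).join(','));
// → deepwiki,internal-tools
```

Keeping the list in a typed variable like this makes it easy to reuse the same server set across multiple agent runs.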
- Build something cool with Langbase Remote MCP Server from your IDEs.
- Integrate Langbase Docs MCP Server inside your IDEs and build with our Langbase SDK.
- Explore Popular MCP Servers that work with Langbase.
- Join our Discord community for feedback, requests, and support.