vercel-ai-docs
A Model Context Protocol (MCP) server that provides AI-powered search and querying capabilities for the Vercel AI SDK documentation. This project enables developers to ask questions about the Vercel AI SDK and receive accurate, contextualized responses based on the official documentation.
claude mcp add --transport stdio ivanamador-vercel-ai-docs-mcp \
  --env GOOGLE_GENERATIVE_AI_API_KEY="your-google-api-key-here" \
  -- node ABSOLUTE_PATH_TO_PROJECT/dist/main.js
How to use
The Vercel AI SDK Documentation MCP Agent provides an AI-powered interface to search, query, and summarize the official Vercel AI SDK documentation. It exposes three main tools: agent-query for natural-language questions that leverage the built index and an AI model to generate contextual answers, direct-query for fast semantic search against the documentation index, and clear-memory to reset conversation history for one or all sessions. The agent-query tool can synthesize information from multiple docs and present structured responses, while direct-query returns precise passages or snippets from the docs. Session memory ensures your follow-up questions stay grounded in the prior context of the conversation.
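Over MCP, each of these tools is invoked with a standard JSON-RPC tools/call request. A minimal sketch of what a client sends for agent-query is shown below; the argument names (query, sessionId) are illustrative assumptions, not confirmed parameter names from this server:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "agent-query",
    "arguments": {
      "query": "How do I stream text with streamText in the Vercel AI SDK?",
      "sessionId": "session-123"
    }
  }
}
```

direct-query and clear-memory are invoked the same way, substituting their tool names and arguments.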
How to install
Prerequisites:
- Node.js 18+ with npm
- Access to a Google Gemini API key (Google Generative AI)
- Git (optional, for cloning the repo)
Install and run locally:
- Clone the repository:
  git clone https://github.com/IvanAmador/vercel-ai-docs-mcp.git
  cd vercel-ai-docs-mcp-agent
- Install dependencies: npm install
- Build the project: npm run build
- Build the documentation index: npm run build:index
- Start the MCP server: npm run start
Environment setup:
- Create a .env file at the project root (as needed by the project) and provide your Google Gemini API key: GOOGLE_GENERATIVE_AI_API_KEY=your-google-api-key-here
If you plan to integrate with Claude Desktop or other MCP clients, configure the client with the server path and the same environment variable for API access.
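For Claude Desktop, that typically means adding an entry to claude_desktop_config.json. A sketch, assuming the server key "vercel-ai-docs" is your own choice of name and the path points at the built entry file:

```json
{
  "mcpServers": {
    "vercel-ai-docs": {
      "command": "node",
      "args": ["ABSOLUTE_PATH_TO_PROJECT/dist/main.js"],
      "env": {
        "GOOGLE_GENERATIVE_AI_API_KEY": "your-google-api-key-here"
      }
    }
  }
}
```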
Additional notes
Tips and considerations:
- Ensure the Google Generative AI API key has access to the Gemini model you intend to use and that API quotas are sufficient for your testing.
- If you encounter index loading issues, run npm run build:index to regenerate the FAISS index before starting the server.
- When wiring the MCP server with clients (Claude Desktop, Cursor, etc.), provide the absolute path to the built main.js in dist and propagate the GOOGLE_GENERATIVE_AI_API_KEY through environment variables as shown in the example configuration.
- Keep your dependencies up to date to maintain compatibility with MCP protocol changes and the Gemini model endpoints.
- For production deployments, consider securing environment variables and using a process manager to keep the server running (e.g., PM2) and to handle restarts gracefully.
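As a sketch of the PM2 approach, an ecosystem.config.js along the following lines (names and env handling are illustrative, not part of this project) keeps the server supervised and restarted on failure:

```javascript
// ecosystem.config.js -- PM2 process file (illustrative sketch)
module.exports = {
  apps: [
    {
      name: 'vercel-ai-docs-mcp',   // process name shown in `pm2 ls`
      script: './dist/main.js',     // built MCP server entry point
      env: {
        // prefer injecting the real key from a secrets store, not the repo
        GOOGLE_GENERATIVE_AI_API_KEY: 'your-google-api-key-here'
      },
      autorestart: true             // restart the process if it crashes
    }
  ]
};
```

Start it with pm2 start ecosystem.config.js and inspect output with pm2 logs vercel-ai-docs-mcp.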
Related MCP Servers
iterm
A Model Context Protocol server that executes commands in the current iTerm session - useful for REPL and CLI assistance
mcp
Octopus Deploy Official MCP Server
furi
CLI & API for MCP management
editor
MCP Server for Phaser Editor
DoorDash
MCP server from JordanDalton/DoorDash-MCP-Server
mcp
MCP server for automatically creating and deploying applications on Timeweb Cloud