
vercel-ai-docs

A Model Context Protocol (MCP) server that provides AI-powered search and querying capabilities for the Vercel AI SDK documentation. This project enables developers to ask questions about the Vercel AI SDK and receive accurate, contextualized responses based on the official documentation.

Installation
Run this command in your terminal to add the MCP server to Claude Code (flags must come before the `--` separator, otherwise they are passed to the server command itself):

claude mcp add --transport stdio ivanamador-vercel-ai-docs-mcp \
  --env GOOGLE_GENERATIVE_AI_API_KEY="your-google-api-key-here" \
  -- node ABSOLUTE_PATH_TO_PROJECT/dist/main.js

How to use

The Vercel AI SDK Documentation MCP Agent provides an AI-powered interface to search, query, and summarize the official Vercel AI SDK documentation. It exposes three main tools:

  • agent-query — answers natural-language questions by combining the built documentation index with an AI model. It can synthesize information from multiple docs and return structured, contextual responses.
  • direct-query — runs a fast semantic search against the documentation index and returns precise passages or snippets from the docs.
  • clear-memory — resets conversation history for one session or for all sessions.

Session memory ensures your follow-up questions stay grounded in the prior context of the conversation.
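For orientation, a tool call over MCP is a JSON-RPC `tools/call` request. The sketch below assumes the `agent-query` tool takes a question string; the exact argument names (`query`, `sessionId`) are assumptions, so check the server's tool schema (listed via `tools/list`) for the real parameter names.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "agent-query",
    "arguments": {
      "query": "How do I stream text with the Vercel AI SDK?",
      "sessionId": "example-session"
    }
  }
}
```

MCP clients such as Claude Code construct these requests for you; the shape is only shown here to clarify what the three tools look like on the wire.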

How to install

Prerequisites:

  • Node.js 18+ with npm
  • Access to a Google Gemini API key (Google Generative AI)
  • Git (optional, for cloning the repo)

Install and run locally:

  1. Clone the repository and enter the project directory (the clone creates a folder named after the repo):
     git clone https://github.com/IvanAmador/vercel-ai-docs-mcp.git
     cd vercel-ai-docs-mcp
  2. Install dependencies: npm install

  3. Build the project: npm run build

  4. Build the documentation index: npm run build:index

  5. Start the MCP server: npm run start

Environment setup:

  • Create a .env file at the project root and provide your Google Gemini API key:
    GOOGLE_GENERATIVE_AI_API_KEY=your-google-api-key-here

If you plan to integrate with Claude Desktop or other MCP clients, configure the client with the server path and the same environment variable for API access.
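For Claude Desktop, that configuration typically lives in claude_desktop_config.json under an `mcpServers` entry. A minimal sketch, assuming the standard stdio-launch shape (the server name key and the placeholder path are illustrative, not prescribed by this project):

```json
{
  "mcpServers": {
    "vercel-ai-docs": {
      "command": "node",
      "args": ["ABSOLUTE_PATH_TO_PROJECT/dist/main.js"],
      "env": {
        "GOOGLE_GENERATIVE_AI_API_KEY": "your-google-api-key-here"
      }
    }
  }
}
```

Replace ABSOLUTE_PATH_TO_PROJECT with the real path to your clone, then restart the client so it picks up the new server.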

Additional notes

Tips and considerations:

  • Ensure the Google Generative AI API key has access to the Gemini model you intend to use and that API quotas are sufficient for your testing.
  • If you encounter index loading issues, run npm run build:index to regenerate the FAISS index before starting the server.
  • When wiring the MCP server with clients (Claude Desktop, Cursor, etc.), provide the absolute path to the built main.js in dist and propagate the GOOGLE_GENERATIVE_AI_API_KEY through environment variables as shown in the example configuration.
  • Keep your dependencies up to date to maintain compatibility with MCP protocol changes and the Gemini model endpoints.
  • For production deployments, consider securing environment variables and using a process manager to keep the server running (e.g., PM2) and to handle restarts gracefully.
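As a sketch of the PM2 suggestion above, an ecosystem file can declare the server process, its entry script, and the environment variable in one place. The app name and dist path below mirror this project's build output; the key value is a placeholder you must replace.

```javascript
// ecosystem.config.js — minimal PM2 sketch for running the MCP server.
// Assumption: you run `pm2 start ecosystem.config.js` from the project root.
const config = {
  apps: [
    {
      name: "vercel-ai-docs-mcp",
      // Entry point produced by `npm run build`.
      script: "./dist/main.js",
      env: {
        // Replace with your real Google Gemini API key.
        GOOGLE_GENERATIVE_AI_API_KEY: "your-google-api-key-here",
      },
      // Restart the process automatically if it crashes.
      autorestart: true,
    },
  ],
};

module.exports = config;
```

PM2 then handles restarts on crash and can be wired to system boot via `pm2 startup`, keeping the server available to long-running MCP clients.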
