
context

Self-hosted MCP server for your documentation

Installation
Run the following command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio dodopayments-context-mcp \
  --env ENVIRONMENT="optional-env-or-placeholder" \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env PINECONE_API_KEY="your-pinecone-api-key" \
  -- npx -y contextmcp

How to use

ContextMCP builds a searchable knowledge base from your documentation and exposes it via the Model Context Protocol (MCP) so AI assistants can query it. The server scaffolds a documentation-focused pipeline: it collects content from your docs, APIs, and READMEs, splits it into semantically meaningful chunks, embeds those chunks with OpenAI embeddings, and stores them in Pinecone for fast similarity search. The deployment workflow also includes a Cloudflare Worker that serves the MCP API from a serverless edge environment.

To get started, install and configure the CLI scaffolding, point the system at your documentation sources, and run the reindex command to index the content. Once indexed, deploy the Cloudflare Worker to expose the MCP endpoints publicly and integrate with chat assistants that use MCP for context.
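The chunking stage described above can be sketched in TypeScript. This is an illustrative outline, not the contextmcp source: splitMarkdownByHeading is a hypothetical helper showing one common approach (splitting on Markdown headings so each embedded chunk covers a single coherent section); the embedding and Pinecone upsert stages are omitted.

```typescript
// Illustrative sketch of the chunking stage: split a Markdown document into
// heading-delimited chunks so each embedding covers one coherent section.
// NOT the contextmcp implementation — names and structure are hypothetical.

interface Chunk {
  heading: string; // section title the chunk belongs to
  text: string;    // section body, embedded as one unit
}

function splitMarkdownByHeading(markdown: string): Chunk[] {
  const chunks: Chunk[] = [];
  let current: Chunk = { heading: "(preamble)", text: "" };
  for (const line of markdown.split("\n")) {
    const m = line.match(/^#{1,6}\s+(.*)$/); // ATX heading: "# Title"
    if (m) {
      if (current.text.trim()) chunks.push(current); // flush previous section
      current = { heading: m[1], text: "" };
    } else {
      current.text += line + "\n";
    }
  }
  if (current.text.trim()) chunks.push(current); // flush final section
  return chunks;
}

const doc = "# Install\nRun npm install.\n# Usage\nCall the API.\n";
console.log(splitMarkdownByHeading(doc).map((c) => c.heading));
// → [ 'Install', 'Usage' ]
```

In a real pipeline each returned chunk would then be passed to the embedding model and upserted into the vector index with its heading kept as metadata for retrieval.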

Key tools and capabilities available in this server setup include:

  • npx contextmcp: Initialize and scaffold a new MCP project, install dependencies, and manage config.
  • reindex: Trigger the indexing pipeline to parse, chunk, embed, and store content in Pinecone.
  • Environment configuration: Set your OpenAI API key and Pinecone API key to enable embedding and vector storage.
  • Cloudflare deployment: Deploy an edge-serving MCP API via a Cloudflare Worker for low-latency access.
  • Documentation sources support: MDX, Markdown, and OpenAPI content types for building a rich knowledge base.

How to install

Prerequisites:

  • Node.js 18+ installed on your system
  • npm or corepack available
  • Access to Pinecone (API key) and OpenAI (API key) for embeddings
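Since the pipeline fails mid-run if a key is missing, it can help to check the prerequisites up front. A minimal sketch of that pattern (checkPrerequisites is a hypothetical helper, not part of contextmcp):

```typescript
// Hypothetical pre-flight check: verify the Node version and required API
// keys before indexing. Not part of contextmcp; shown only as a pattern.

function checkPrerequisites(
  env: Record<string, string | undefined>, // e.g. process.env
  nodeVersion: string                      // e.g. process.versions.node
): string[] {
  const problems: string[] = [];
  const major = Number(nodeVersion.split(".")[0]);
  if (major < 18) problems.push(`Node 18+ required, found ${nodeVersion}`);
  for (const key of ["OPENAI_API_KEY", "PINECONE_API_KEY"]) {
    if (!env[key]) problems.push(`${key} is not set`);
  }
  return problems;
}

// Reports the outdated Node version and the missing PINECONE_API_KEY:
console.log(checkPrerequisites({ OPENAI_API_KEY: "sk-test" }, "16.4.0"));
```

Calling it with process.env and process.versions.node at startup and exiting when the returned list is non-empty gives a clearer error than a failed embedding request later.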

Installation steps:

  1. Scaffold a new MCP project with npx (or install the CLI globally first):

     npx contextmcp init my-docs-mcp

  2. Change into the project directory and install dependencies:

     cd my-docs-mcp
     npm install

  3. Configure API keys and environment variables:

    • Copy the example env file if provided: cp .env.example .env
    • Edit .env and set:

      PINECONE_API_KEY=your-pinecone-api-key
      OPENAI_API_KEY=your-openai-api-key

  4. Edit configuration and sources:

    • Edit config.yaml to point to your documentation sources
    • Add any additional environment or deployment settings as needed

  5. Build and index content:

     npm run reindex

  6. Deploy the Cloudflare Worker (if using Cloudflare):

     cd cloudflare-worker
     npm install
     npm run deploy
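The exact schema of config.yaml depends on the contextmcp version you scaffold; the sketch below is hypothetical, showing only the kind of settings step 4 refers to (source paths and chunking), not a verified schema. Consult the file generated by the init command for the real keys.

```yaml
# Hypothetical config.yaml sketch — field names are illustrative only,
# not the verified contextmcp schema.
sources:
  - path: ./docs          # Markdown / MDX documentation
  - path: ./openapi.yaml  # OpenAPI spec
chunking:
  max_tokens: 512         # upper bound per embedded chunk
  overlap: 64             # tokens shared between adjacent chunks
```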

Optional:

  • Run development servers locally to test before deploying
  • Use npm run dev:* scripts to run individual components in isolation

Additional notes

Notes and tips:

  • Keep your API keys secure and do not commit them to version control. Use a .env file or a secrets manager in production.
  • The reindex step may take time depending on the size of your documentation and the OpenAI/Pinecone quotas; monitor usage to avoid exceeding limits.
  • Ensure your OpenAI embeddings model is suitable for your content type and length; adjust chunking settings in config.yaml if needed.
  • If you encounter deployment issues with the Cloudflare Worker, verify environment compatibility and Cloudflare credentials, and consult the deployment logs.
  • The npm package name for the MCP scaffolding is contextmcp; use npx contextmcp to initialize projects and manage configurations.
