company-docs

AI-powered company knowledge MCP. Unified place for internal policies, values, documentation, and governance. Agents can search, cite, and answer questions using real company docs.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio southleft-company-docs-mcp npx -y company-docs-mcp \
  --env SUPABASE_URL="https://your-project.supabase.co" \
  --env SUPABASE_ANON_KEY="your-anon-key" \
  --env SUPABASE_SERVICE_KEY="your-service-key" \
  --env CLOUDFLARE_ACCOUNT_ID="your-account-id"

How to use

Company Docs MCP turns your documentation into an AI-searchable knowledge base. You ingest Markdown, HTML, PDFs, URLs, or even crawl an entire website, publish the content to a Supabase database with vector search, and then allow your team to query it through MCP-compatible AI tools like Claude, Cursor, or Slack. The system is designed so team members can publish updates and maintain a centralized knowledge source that supports semantic search rather than simple keyword matching. To use it, first ingest your content using the CLI, publish the resulting vectors to your database, and then connect your AI tool to the MCP server URL to start asking questions. The architecture relies on Cloudflare Workers AI for vectorization and Supabase for storage, with the CLI providing a straightforward workflow for ingestion and publishing.

Two primary roles exist: admins who set up and maintain the server and standard users who query the knowledge base. Admins publish content by running ingest and publish commands, while end users simply point their AI tool at the server URL and ask questions. The CLI supports multiple ingestion formats (Markdown, HTML, PDFs, URLs, and site crawling), making it easy to convert existing documentation into searchable content. Tools like Claude or Cursor can be configured to query the MCP server directly, returning answers sourced from your docs rather than generic web results.
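As an illustration, most MCP-compatible clients (Claude Desktop, Cursor) accept a server entry in the standard `mcpServers` JSON format. A minimal sketch for the stdio transport, reusing the same environment variables as the installation command above (the placeholder values are yours to fill in):

```json
{
  "mcpServers": {
    "southleft-company-docs-mcp": {
      "command": "npx",
      "args": ["-y", "company-docs-mcp"],
      "env": {
        "SUPABASE_URL": "https://your-project.supabase.co",
        "SUPABASE_ANON_KEY": "your-anon-key",
        "SUPABASE_SERVICE_KEY": "your-service-key",
        "CLOUDFLARE_ACCOUNT_ID": "your-account-id"
      }
    }
  }
}
```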

How to install

Prerequisites:

  • Node.js 18+ installed on your machine
  • npm (comes with Node.js)
  • Access to a Supabase project and a Cloudflare account for hosting and AI features
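Before installing, you can verify the Node.js requirement with a quick shell check (the parsing assumes the usual `vMAJOR.MINOR.PATCH` output of `node --version`):

```shell
# Check that Node.js 18+ and npm are available before installing.
if command -v node >/dev/null 2>&1; then
  major=$(node --version | sed 's/^v//' | cut -d. -f1)
  if [ "$major" -ge 18 ]; then
    echo "Node.js $(node --version) OK"
  else
    echo "Node.js 18+ required, found $(node --version)" >&2
  fi
else
  echo "Node.js not found; install it from nodejs.org" >&2
fi
```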

Step 1: Install the MCP package locally (or globally)

  • In your project directory:
npm install company-docs-mcp

Or install globally:

npm install -g company-docs-mcp

Step 2: Prepare your Supabase database

  • Create a Supabase project
  • Retrieve the API URL, anon key, and service role key from Settings > API
  • Create a local .env file (see Step 4) with SUPABASE_URL, SUPABASE_ANON_KEY, and SUPABASE_SERVICE_KEY
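As an optional sanity check, you can hit Supabase's standard REST endpoint (`/rest/v1/`) with your anon key before ingesting anything; a 200 response means the URL and key are valid. The variable names match the .env file from Step 4:

```shell
# Optional: confirm the Supabase URL and anon key work.
# Skips with a reminder if the variables are not exported yet.
if [ -n "${SUPABASE_URL:-}" ] && [ -n "${SUPABASE_ANON_KEY:-}" ]; then
  http_code=$(curl -s -o /dev/null -w "%{http_code}" \
    -H "apikey: $SUPABASE_ANON_KEY" \
    "$SUPABASE_URL/rest/v1/")
  echo "Supabase responded with HTTP $http_code"
else
  http_code="skipped"
  echo "Export SUPABASE_URL and SUPABASE_ANON_KEY first"
fi
```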

Step 3: Connect to Cloudflare (CLI login)

  • If prompted, log in via Wrangler to enable Cloudflare AI:
npx wrangler login
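Once logged in, `wrangler whoami` (a standard Wrangler command) prints the authenticated account, which is a quick way to confirm the session is still valid, since logins can expire:

```shell
# Confirm the Cloudflare login is active.
if command -v wrangler >/dev/null 2>&1; then
  wrangler whoami
else
  echo "wrangler not on PATH; use 'npx wrangler whoami' from the project root"
fi
```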

Step 4: Configure environment variables

  • Create a .env file in your project root and add:
# Supabase credentials
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your-anon-key
SUPABASE_SERVICE_KEY=your-service-key

# Cloudflare account
CLOUDFLARE_ACCOUNT_ID=your-account-id
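The CLI reads these values from the environment, so before running commands you can load the file into your shell; a minimal sketch (`set -a` exports every variable assigned while sourcing):

```shell
# Export everything in .env into the current shell, if the file exists.
if [ -f .env ]; then
  set -a
  . ./.env
  set +a
  echo "Loaded Supabase project: ${SUPABASE_URL:-unset}"
fi
```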

Step 5: Run ingestion and publishing

  • Ingest content (examples follow; adjust paths/formats as needed):
# Ingest Markdown directory
npx company-docs ingest markdown --dir=./docs

# Ingest a single HTML file
npm run ingest -- html ./docs/page.html

# Ingest a PDF
npm run ingest -- pdf ./policies/summary.pdf

# Ingest a URL
npm run ingest -- url https://example.com/guide
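If you have many standalone files, a hypothetical batch loop over a docs directory can generate the per-file commands (the per-file `markdown` form here is an assumption, so confirm it on a single file first; the loop prints each command for review rather than executing it):

```shell
# Print an ingest command for every Markdown file under ./docs.
# Remove the leading 'echo' to actually run them.
if [ -d ./docs ]; then
  find ./docs -name '*.md' -print | while IFS= read -r f; do
    echo npm run ingest -- markdown "$f"
  done
fi
```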

Step 6: Start querying once your deployed server URL is available

  • Connect your MCP-compatible AI tool to the published server URL and begin asking questions.

Note: The MCP server architecture relies on three services (Cloudflare Workers AI, Supabase, and your client tooling). You typically do not need to run all services locally when using a hosted server; the CLI workflow focuses on ingesting and publishing content to your hosted database.

Additional notes

Tips and common issues:

  • Keep your .env file private and add it to .gitignore to avoid leaking credentials.
  • If you encounter authentication issues with Cloudflare, re-run npx wrangler login as the login session can expire.
  • When ingesting large sites, consider using website crawl ingestion to index all pages, then run a targeted publish to create vectors.
  • Ensure your Supabase schema is initialized by applying the SQL from the package’s provided schema.sql file if you’re self-hosting.
  • The npm package name is company-docs-mcp; the command examples assume you’re using npx or npm scripts from within a project that has the package installed.
