gnosis
Zero-config MCP server for searchable documentation (SQLite default, PostgreSQL optional)
claude mcp add --transport stdio nicholasglazer-gnosis-mcp python -m gnosis_mcp serve

Environment variables (all optional):
- GNOSIS_MCP_REST: set to true to enable the REST endpoints
- GNOSIS_MCP_API_KEY: bearer token for authenticated REST access
- GNOSIS_MCP_CORS_ORIGINS: comma-separated list of allowed origins for REST
How to use
Gnosis MCP turns your documentation into a searchable, AI-friendly knowledge base. It ingests a variety of document formats (markdown, text, notebooks, JSON, CSV, etc.) and builds a hybrid search index that combines keyword search with local embeddings for semantic retrieval. Use the ingest command to load your docs, then run the server to enable searching, viewing document excerpts with highlights, and exposing a REST API for integration with editors and agents. You also get git history ingestion, web crawling for site-wide content, and optional PostgreSQL support for larger deployments. A typical workflow is: ingest your docs, optionally enable embeddings for semantic search, and serve the MCP locally or remotely for your AI agents to query.
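The hybrid retrieval described above can be pictured as blending a keyword score with an embedding similarity. The sketch below is purely illustrative: the scoring functions, the `alpha` weight, and all names are assumptions, not gnosis-mcp internals.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query, text):
    """Toy keyword score: fraction of query terms present in the text."""
    terms = query.lower().split()
    hits = sum(1 for t in terms if t in text.lower())
    return hits / len(terms) if terms else 0.0

def hybrid_score(query, text, q_vec, d_vec, alpha=0.5):
    """Blend keyword and semantic scores; alpha weights the keyword side."""
    return alpha * keyword_score(query, text) + (1 - alpha) * cosine(q_vec, d_vec)
```

Real engines use BM25-style ranking and learned embeddings, but the shape of the combination is the same: one lexical signal, one semantic signal, merged into a single ranking score.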
Available tools and capabilities include:
- ingest: load local docs into the index (SQLite by default) and auto-create a searchable corpus.
- serve: run the MCP server to expose search_docs, get_doc, and related tools to AI agents.
- ingest-git: turn commit histories into searchable context to explain why changes happened.
- crawl: perform web crawling with sitemap or link-based discovery to ingest external docs.
- embeddings (optional): generate local embeddings to enable semantic search without external API keys.
- REST API (optional): expose endpoints for health, search, docs, and categories that can be queried by web apps.
- editor integrations: plug the MCP server config into your editor to expose the tools to agents automatically.
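When the optional REST API is enabled, a client only needs the base URL, a query string, and (optionally) the bearer token from GNOSIS_MCP_API_KEY. The sketch below assembles such a request; the `/search` path and `q` parameter are assumptions for illustration, so check the server's REST documentation for the actual routes.

```python
def build_search_request(base_url, query, api_key=None):
    """Assemble URL, headers, and params for a hypothetical /search endpoint."""
    url = base_url.rstrip("/") + "/search"  # endpoint path is an assumption
    headers = {}
    if api_key:
        # Matches the bearer token configured via GNOSIS_MCP_API_KEY
        headers["Authorization"] = f"Bearer {api_key}"
    params = {"q": query}
    return url, headers, params

# Example (no server needed just to build the request):
url, headers, params = build_search_request("http://localhost:8000",
                                            "ingest git history",
                                            api_key="my-token")
```

The returned pieces can be passed to any HTTP client (`requests`, `httpx`, or the standard library's `urllib`).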
How to install
Prerequisites:
- Python 3.8+ (recommended latest patch)
- pip (comes with Python)
- Optional: a virtual environment tool (venv, pyenv, etc.)
Step-by-step installation:
- Create and activate a virtual environment (optional but recommended):

      python -m venv venv

      # Windows
      venv\Scripts\activate.bat

      # macOS/Linux
      source venv/bin/activate

- Upgrade pip and install gnosis-mcp from PyPI:

      python -m pip install --upgrade pip
      pip install gnosis-mcp

- (Optional) Install with embedding support if you plan to use semantic search:

      pip install gnosis-mcp[embeddings]

- Verify the installation by printing the CLI help:

      gnosis-mcp --help

- Start ingesting docs and serving:

      gnosis-mcp ingest ./docs/   # load docs into SQLite (auto-created)
      gnosis-mcp serve            # start the MCP server
If you plan to crawl websites or enable REST, ensure any required environment variables are set as described in the mcp_config section.
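GNOSIS_MCP_CORS_ORIGINS is documented as a comma-separated list, so a server-side parse of it plausibly looks like the sketch below. The helper name is hypothetical; this only illustrates the expected format of the variable.

```python
import os

def parse_cors_origins(raw=None):
    """Split a comma-separated origin list, dropping blanks and whitespace."""
    if raw is None:
        raw = os.environ.get("GNOSIS_MCP_CORS_ORIGINS", "")
    return [o.strip() for o in raw.split(",") if o.strip()]
```

For example, `GNOSIS_MCP_CORS_ORIGINS="https://app.example.com, http://localhost:5173"` would allow those two origins and nothing else.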
Additional notes
Tips and common considerations:
- Default backend is SQLite with full-text search. For larger datasets, consider using the PostgreSQL backend and enabling embeddings for improved semantic search.
- Embeddings are local (no API keys required) when enabled via gnosis-mcp[embeddings].
- The REST API provides health, search, and doc endpoints; you can enable it by starting the server with REST support and setting GNOSIS_MCP_REST=true.
- Ingest-git and crawl features help keep your knowledge base up to date with contextual history and external docs.
- If you encounter encoding or formatting issues in certain files, ensure the files are properly encoded (UTF-8) and that the MCP supports the document format you’re ingesting.
- For editor integrations, add a config block pointing to the same server and the serve command to expose tools like search_docs, get_doc, and get_related.
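For the editor integration mentioned above, a typical MCP client config block looks like the following. The key names (`mcpServers`, `command`, `args`, `env`) follow the common MCP client convention and may differ per editor, so treat the exact schema as an assumption and check your editor's documentation.

```json
{
  "mcpServers": {
    "gnosis": {
      "command": "gnosis-mcp",
      "args": ["serve"],
      "env": {
        "GNOSIS_MCP_REST": "false"
      }
    }
  }
}
```

Once registered, the agent should see tools such as search_docs, get_doc, and get_related without further setup.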
Related MCP Servers
Remote
A type-safe solution for remote MCP communication, enabling effortless integration and centralized management of Model Context Protocol servers.
boilerplate
TypeScript Model Context Protocol (MCP) server boilerplate providing IP lookup tools/resources. Includes CLI support and extensible structure for connecting AI systems (LLMs) to external data sources like ip-api.com. Ideal template for creating new MCP integrations via Node.js.
asterisk
Asterisk Model Context Protocol (MCP) server.
brainstorm
MCP server for multi-round AI brainstorming debates between multiple models (GPT, DeepSeek, Groq, Ollama, etc.)
vscode-context
MCP Server to Connect with VS Code IDE
Email MCP server with full IMAP + SMTP support — read, search, send, manage, and organize email from any AI assistant via the Model Context Protocol