
knowledge-base

This MCP server provides tools for listing and retrieving content from different knowledge bases.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio jeanibarz-knowledge-base-mcp-server \
  --env OLLAMA_MODEL="dengcao/Qwen3-Embedding-0.6B:Q8_0" \
  --env OLLAMA_BASE_URL="http://localhost:11434" \
  --env EMBEDDING_PROVIDER="ollama" \
  --env KNOWLEDGE_BASES_ROOT_DIR="/path/to/knowledge_bases" \
  -- node /path/to/knowledge-base-mcp-server/build/index.js

How to use

Knowledge Base MCP Server provides two tools to work with your knowledge bases: list_knowledge_bases and retrieve_knowledge. Use list_knowledge_bases to see which knowledge bases are available on the server, and retrieve_knowledge to perform a semantic search across one or more knowledge bases and return the most relevant chunks.

Results are returned as markdown-formatted content showing the most similar chunks and their contexts. By default, up to 10 chunks with a similarity score below the threshold are returned; both the chunk count and the threshold can be adjusted when calling retrieve_knowledge.

The server indexes text files found under each knowledge base directory, splitting content into chunks and storing them in a FAISS index for fast similarity search. You can configure embedding providers (Ollama, OpenAI, or HuggingFace) and environment variables to tailor indexing and retrieval to your setup.
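For illustration, a client-side call to retrieve_knowledge might look like the following. The exact argument names are defined by the server's tool schema; the names and values shown here are assumptions, not the documented interface:

```json
{
  "tool": "retrieve_knowledge",
  "arguments": {
    "query": "How do I configure the embedding provider?",
    "knowledge_base": "project-docs"
  }
}
```

Under the same assumption, the chunk count and similarity threshold mentioned above would be passed as additional arguments in the same object.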

How to install

Prerequisites:

  • Node.js (version 16 or higher)
  • npm (Node Package Manager)

Manual installation steps:

  1. Clone the repository:
git clone <repository_url>
cd knowledge-base-mcp-server
  2. Install dependencies:
npm install
  3. Configure environment variables (choose one embedding provider):

Option 1: Ollama (recommended)

export EMBEDDING_PROVIDER=ollama
export OLLAMA_BASE_URL=http://localhost:11434
export OLLAMA_MODEL=dengcao/Qwen3-Embedding-0.6B:Q8_0
export KNOWLEDGE_BASES_ROOT_DIR=$HOME/knowledge_bases

Option 2: OpenAI

export EMBEDDING_PROVIDER=openai
export OPENAI_API_KEY=your_api_key_here
export OPENAI_MODEL_NAME=text-embedding-ada-002
export KNOWLEDGE_BASES_ROOT_DIR=$HOME/knowledge_bases

Option 3: HuggingFace

export EMBEDDING_PROVIDER=huggingface
export HUGGINGFACE_API_KEY=your_hf_api_key
export HUGGINGFACE_MODEL_NAME=sentence-transformers/all-MiniLM-L6-v2
export KNOWLEDGE_BASES_ROOT_DIR=$HOME/knowledge_bases
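Whichever option you pick, a small shell function (illustrative, not part of the server) can sanity-check that the provider-specific variables are set before you start the server:

```shell
# Illustrative sanity check: confirm that the variables required by the
# chosen EMBEDDING_PROVIDER are present. Variable names match the
# options above.
check_embedding_env() {
  case "$EMBEDDING_PROVIDER" in
    ollama)
      [ -n "$OLLAMA_BASE_URL" ] && [ -n "$OLLAMA_MODEL" ] \
        || { echo "missing OLLAMA_BASE_URL or OLLAMA_MODEL"; return 1; } ;;
    openai)
      [ -n "$OPENAI_API_KEY" ] \
        || { echo "missing OPENAI_API_KEY"; return 1; } ;;
    huggingface)
      [ -n "$HUGGINGFACE_API_KEY" ] \
        || { echo "missing HUGGINGFACE_API_KEY"; return 1; } ;;
    *)
      echo "unknown EMBEDDING_PROVIDER: $EMBEDDING_PROVIDER"; return 1 ;;
  esac
  echo "env looks OK for $EMBEDDING_PROVIDER"
}
```

Run `check_embedding_env` after sourcing your exports; it prints a confirmation or names the first missing variable.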

Additional configuration:

  • Optional: FAISS index path via FAISS_INDEX_PATH (default: $HOME/knowledge_bases/.faiss)
  • Optional: LOG_FILE and LOG_LEVEL for logging
  4. Build the server:
npm run build
  5. Add the MCP server to your MCP settings (example shown):
"knowledge-base-mcp-ollama": {
  "command": "node",
  "args": [
    "/path/to/knowledge-base-mcp-server/build/index.js"
  ],
  "disabled": false,
  "autoApprove": [],
  "env": {
    "KNOWLEDGE_BASES_ROOT_DIR": "/path/to/knowledge_bases",
    "EMBEDDING_PROVIDER": "ollama",
    "OLLAMA_BASE_URL": "http://localhost:11434",
    "OLLAMA_MODEL": "dengcao/Qwen3-Embedding-0.6B:Q8_0"
  },
  "description": "Retrieves similar chunks from the knowledge base based on a query using Ollama."
}
  6. Create knowledge base directories and index files as described in the README: subdirectories under KNOWLEDGE_BASES_ROOT_DIR containing text files, with hidden files ignored. The server computes SHA256 hashes, builds a FAISS index, and keeps it updated on changes.
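The knowledge base layout described above can be sketched as follows (directory and file names are made up for illustration; any text files will do):

```shell
# Illustrative layout: each subdirectory under the root is one knowledge
# base; hidden (dot-prefixed) files and directories are skipped.
KB_ROOT="${KNOWLEDGE_BASES_ROOT_DIR:-$HOME/knowledge_bases}"
mkdir -p "$KB_ROOT/project-docs" "$KB_ROOT/runbooks"
printf 'How to deploy the service...\n' > "$KB_ROOT/project-docs/deploy.md"
printf 'On-call checklist...\n' > "$KB_ROOT/runbooks/oncall.txt"
touch "$KB_ROOT/project-docs/.notes"  # hidden: will not be indexed
ls "$KB_ROOT"
```

After the server starts, list_knowledge_bases should report one knowledge base per subdirectory.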

Additional notes

Tips and common issues:

  • Ensure the knowledge bases root directory is writable by the MCP server process.
  • If embeddings are not indexing, check that the embedding provider is reachable (Ollama must be running if using Ollama).
  • For large knowledge bases, monitor memory usage and consider increasing the FAISS index update interval or manually triggering rebuilds.
  • The server skips hidden files and directories (those starting with a dot).
  • If you change embedding providers, update EMBEDDING_PROVIDER and the related provider-specific environment variables accordingly.
  • Logging can be redirected to a file using LOG_FILE; adjust LOG_LEVEL to debug for troubleshooting.
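The first two checkpoints above can be scripted as a pre-flight check. This is a sketch: the URL and path defaults mirror the configuration examples earlier, and the Ollama check only applies if that is your provider:

```shell
# Illustrative pre-flight check: verify the knowledge bases root is
# writable and (when using Ollama) that the Ollama API answers.
preflight() {
  kb_root="${KNOWLEDGE_BASES_ROOT_DIR:-$HOME/knowledge_bases}"
  mkdir -p "$kb_root"
  if [ -w "$kb_root" ]; then
    echo "kb root: writable ($kb_root)"
  else
    echo "kb root: NOT writable ($kb_root)"
  fi
  base_url="${OLLAMA_BASE_URL:-http://localhost:11434}"
  if curl -sf --max-time 3 "$base_url/api/tags" >/dev/null 2>&1; then
    echo "ollama: reachable at $base_url"
  else
    echo "ollama: not reachable at $base_url (is 'ollama serve' running?)"
  fi
}
preflight
```

Run this before starting the server; a "NOT writable" or "not reachable" line points at the corresponding fix above.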
