
obsidian-notebook

MCP server to let Claude connect to my Obsidian notes for vector and full-text search

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio jarmentor-obsidian-notebook-mcp node /path/to/ai-note-searcher-5000/mcp-server.js \
  --env MCP_SERVER="true" \
  --env OLLAMA_URL="http://127.0.0.1:11434" \
  --env QDRANT_URL="http://127.0.0.1:6333" \
  --env NOTEBOOK_PATH="/path/to/your/notebook"
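
Alternatively, for Claude Desktop you can add the server by hand in its MCP configuration file. A sketch of the JSON entry, using the same environment variables as the command above (the server name and paths are placeholders — adjust them to your setup):

```json
{
  "mcpServers": {
    "obsidian-notebook": {
      "command": "node",
      "args": ["/path/to/ai-note-searcher-5000/mcp-server.js"],
      "env": {
        "MCP_SERVER": "true",
        "OLLAMA_URL": "http://127.0.0.1:11434",
        "QDRANT_URL": "http://127.0.0.1:6333",
        "NOTEBOOK_PATH": "/path/to/your/notebook"
      }
    }
  }
}
```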

How to use

This MCP server powers an Obsidian-oriented semantic search system. It exposes a set of tools that let an LLM run high-quality searches over your Obsidian notes, retrieve full note contents, and perform file and directory operations as needed by complex prompts. The server relies on a local vector store (Qdrant) populated with embeddings generated from your Obsidian vault using Ollama, and it uses the MCP protocol to make its capabilities accessible to an LLM client.

Typical usage involves starting the stack with Docker, ensuring Ollama is available for embeddings, and pointing the MCP client (e.g., Claude Desktop) at the server via the provided configuration snippet.

Available MCP tools:

  • search_notes — semantic search over your notes
  • get_note_content — retrieve the complete text of a note
  • additional file management tools to support broader prompts

Integrating this with Claude Desktop or another LLM interface requires configuring the MCP server entry with the path to mcp-server.js and environment variables that point at your Qdrant and Ollama instances as well as your Obsidian notebook path.
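
The search flow the server implements — embed the query, then rank notes by vector similarity — can be illustrated with a toy sketch. This is not the server's actual code: the real system gets embeddings from Ollama's nomic-embed-text model and stores them in Qdrant, while embed() below is a deliberately crude letter-count stand-in:

```javascript
// Toy illustration of semantic search: rank notes by cosine similarity
// between a query embedding and each note's embedding.

// Stand-in embedding: a bag-of-letters vector, purely for demonstration.
function embed(text) {
  const vec = new Array(26).fill(0);
  for (const ch of text.toLowerCase()) {
    const i = ch.charCodeAt(0) - 97;
    if (i >= 0 && i < 26) vec[i] += 1;
  }
  return vec;
}

// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// What a search_notes-style tool does conceptually: embed the query,
// score every note, return notes sorted best-first.
function searchNotes(query, notes) {
  const q = embed(query);
  return notes
    .map(note => ({ note, score: cosine(q, embed(note)) }))
    .sort((x, y) => y.score - x.score);
}

const ranked = searchNotes("docker compose setup", [
  "How to configure docker-compose volumes",
  "Gardening notes for spring",
]);
console.log(ranked[0].note); // → "How to configure docker-compose volumes"
```

In the real pipeline the vectors come from a neural embedding model, so "similarity" captures meaning rather than letter overlap — but the rank-by-cosine step is the same idea.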

How to install

Prerequisites:

  • Docker and Docker Compose installed on your machine
  • Ollama installed and running locally with the nomic-embed-text:latest model
  • Access to your Obsidian vault/notebook folder

Installation steps:

  1. Clone the repository and navigate to the project directory:
git clone <repository-url>
cd ai-note-searcher-5000
  2. Point the Docker Compose file at your notebook path. Update the docker-compose.yml volumes to mount your Obsidian notebook, for example:
volumes:
  - /path/to/your/obsidian/notebook:/app/notebook:ro
  3. Pull the embedding model with Ollama:
ollama pull nomic-embed-text:latest
  4. Start the services:
docker-compose up
  5. Verify that the services are running and that your notes are being indexed.
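
The verification step can be done with a couple of curl checks, assuming the default ports used in the setup command (these hit live local services, so exact output depends on your environment):

```shell
# Qdrant: should respond with a JSON listing of collections
curl http://127.0.0.1:6333/collections

# Ollama: should list pulled models, including nomic-embed-text
curl http://127.0.0.1:11434/api/tags
```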

For local development without Docker, install dependencies and run the dev server per the repository’s package.json scripts, ensuring Qdrant and Ollama are available locally.
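
A rough sketch of that local flow, assuming a standard Node.js layout (the exact script names live in the repository's package.json, so check there first):

```shell
# Install dependencies
npm install

# Point the server at local Qdrant/Ollama and your vault
export MCP_SERVER=true
export OLLAMA_URL=http://127.0.0.1:11434
export QDRANT_URL=http://127.0.0.1:6333
export NOTEBOOK_PATH=/path/to/your/notebook

# Run the server directly
node mcp-server.js
```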

Additional notes

Tips and common considerations:

  • If you see "fetch failed" errors in MCP responses, use 127.0.0.1 instead of localhost for service URLs in your client configuration.
  • If you don’t see search results, check Docker logs and ensure that the file watcher is processing your Obsidian vault and that embeddings have been generated.
  • MCP JSON parsing errors can occur if the MCP_SERVER flag isn’t enabled; enabling MCP_SERVER=true can help with clean console logging and MCP message handling.
  • Ensure the Qdrant data directory is persisted (e.g., docker-compose volumes) to avoid data loss when restarting containers.
  • For production, consider configuring additional environment variables for security and performance tuning, such as enabling authentication for Qdrant or restricting accessible endpoints.
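
For the persistence tip above, a docker-compose fragment might look like this (the service name is an assumption about this project's layout; the official Qdrant image stores its data under /qdrant/storage):

```yaml
services:
  qdrant:
    image: qdrant/qdrant
    volumes:
      - qdrant_storage:/qdrant/storage

volumes:
  qdrant_storage:
```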
