
living-knowledge

Privacy-first RAG over Obsidian notes using LibreChat + MCP

Installation
Run this command in your terminal to add the MCP server to Claude Code. Note that `--env` flags belong to `claude mcp add` and must come before the `--` separator; everything after `--` is the command that launches the server:

claude mcp add --transport stdio sahlstra-living-knowledge \
  --env DATABASE_URL="mongodb://username:password@localhost:27017" \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env MEILISEARCH_HOST="http://localhost:7700" \
  --env ANTHROPIC_API_KEY="your-anthropic-api-key" \
  -- docker compose -p living-knowledge up -d --build

How to use

Living Knowledge turns your local Obsidian vault into a private, AI-enabled memory that can read, search, and write your notes. The system uses devrag for semantic search over your Markdown files and a filesystem MCP server to read and write notes directly within your vault. When you enable external access, it is secured via a Cloudflare Tunnel.

To use it, start the Docker-based MCP stack locally, then open LibreChat to interact with the AI. The AI can run semantic searches across your notes, summarize conversations, create new notes, and update existing ones, all while keeping your data on your own hardware.

For outside access, configure the Cloudflare Tunnel and use Zero Trust for authentication so that only authorized users can reach your notes.
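As a rough illustration of the search layer, Meilisearch (the MEILISEARCH_HOST setting above) exposes a simple HTTP search API. The sketch below only builds the request; the index name "notes" and the API key are assumptions for illustration, not values from this project:

```python
# Sketch: the shape of a Meilisearch search request.
# The index name "notes" and the API key are hypothetical placeholders.
import json

MEILISEARCH_HOST = "http://localhost:7700"

def build_search_request(query: str, index: str = "notes", limit: int = 5):
    """Return (url, headers, body) for a Meilisearch POST /search call."""
    url = f"{MEILISEARCH_HOST}/indexes/{index}/search"
    headers = {
        "Authorization": "Bearer your-meilisearch-api-key",  # placeholder
        "Content-Type": "application/json",
    }
    body = json.dumps({"q": query, "limit": limit})
    return url, headers, body

url, headers, body = build_search_request("project roadmap")
print(url)
```

Sending that request with any HTTP client returns ranked hits from the index; semantic (embedding-based) search over the vault is handled separately by devrag.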

How to install

Prerequisites:

  • Docker and Docker Compose installed on your machine
  • Access to an Anthropic API key (or an OpenAI API key) for the AI backend
  • A Cloudflare account if you plan to enable external access
  • An Obsidian vault or folder of Markdown files to index

Step-by-step:

  1. Clone the repository and navigate to the project folder (as described in the Quick Start guide in the README).
  2. Set environment variables: create a .env file or copy the provided example. Include keys such as ANTHROPIC_API_KEY and OPENAI_API_KEY as needed, and point vault-related settings to your Obsidian vault.
  3. Copy and customize the docker-compose override to point at your vault path:
       cp docker-compose.override.yml.living-knowledge.example docker-compose.override.yml
     Then edit the vault path in the copied file to your local Obsidian vault location, e.g. /home/user/Documents/ObsidianVault.
  4. Build and run the MCP stack: docker compose -p living-knowledge up -d --build
  5. Verify startup by checking logs for the api service: docker compose -p living-knowledge logs -f api
  6. Open LibreChat in your browser at http://localhost:3080 and create your account to begin interacting with the AI.
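The override file in step 3 might look roughly like the sketch below. The service name, mount point, and VAULT_PATH variable are illustrative assumptions, not the project's actual file; use the shipped example as the source of truth:

```yaml
# docker-compose.override.yml -- illustrative sketch only
services:
  api:
    volumes:
      # Mount your local Obsidian vault into the container
      - /home/user/Documents/ObsidianVault:/vault
    environment:
      - VAULT_PATH=/vault   # hypothetical variable name
```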

Optional external access (Cloudflare Tunnel): follow the Cloudflare Tunnel setup steps in the README to expose the service securely outside your local network.
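For orientation, a minimal cloudflared config.yml for such a tunnel might look like this sketch; the tunnel ID and hostname are placeholders, and port 3080 is the LibreChat port from step 6:

```yaml
# cloudflared config.yml -- sketch; tunnel ID and hostname are placeholders
tunnel: <your-tunnel-id>
credentials-file: /etc/cloudflared/<your-tunnel-id>.json
ingress:
  - hostname: notes.example.com
    service: http://localhost:3080   # LibreChat
  - service: http_status:404         # catch-all rule (required)
```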

Additional notes

Tips and common issues:

  • Ensure your vault path is accessible with correct permissions from Docker.
  • If the AI cannot access your keys, verify environment variables are loaded in the container (check .env and docker-compose overrides).
  • For semantic search to work well, keep your Markdown notes well-structured with headings and clear content.
  • If you use Cloudflare Tunnel, consider enabling Zero Trust to restrict access to authenticated users only.
  • Monitor resource usage (CPU/RAM) on the host, as large vaults and embedding workloads can be memory-intensive.
  • Regularly back up your Obsidian vault and database (MongoDB/MeiliSearch) data.
  • If you need to update the stack, run docker compose -p living-knowledge pull && docker compose -p living-knowledge up -d --build to refresh images.
