opencode-personal-knowledge
🧠 Personal knowledge MCP server with vector database for Opencode. Store and retrieve knowledge using semantic search, powered by local embeddings.
Add to Claude Code:

```shell
claude mcp add --transport stdio nocturnlabs-opencode-personal-knowledge \
  bunx opencode-personal-knowledge \
  --env OPENCODE_PK_DATA_DIR=<path>  # optional; defaults to ~/.local/share/opencode-personal-knowledge/
```
How to use
This MCP server provides a personal knowledge base backed by local embeddings and a vector store. Its tools cover semantic search, plain-text search, and entry management (store, update, retrieve, list, and delete), plus session-based memory for OpenCode sessions, so you can log messages and search across sessions. All processing runs locally against an embedded vector database, so no external API keys are required.
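Like any MCP server, it speaks JSON-RPC over stdio: a client discovers the available tools with `tools/list` and invokes one with `tools/call`. A minimal sketch of the exchange (the tool name `store_knowledge` below is illustrative, not confirmed; run `tools/list` against the server to see the real names and schemas):

```jsonc
// client → server: list the tools the server exposes
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

// client → server: store an entry via a hypothetical "store_knowledge" tool
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "store_knowledge",
    "arguments": { "content": "Bun stores its global cache under ~/.bun" }
  }
}
```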
How to install
Prerequisites:
- Bun installed (bun.sh)
- Basic Node.js/Bun environment
Install and run the MCP server:
```shell
# 1. Clone the repository
git clone https://github.com/NocturnLabs/opencode-personal-knowledge.git
cd opencode-personal-knowledge

# 2. Install dependencies
bun install

# 3. Start the MCP server
bun run mcp
```
Usage (during development):
- The command above starts the MCP server and exposes its tools to any MCP client. By default, data is stored under ~/.local/share/opencode-personal-knowledge/; embeddings are generated locally with FastEmbed, and vectors are stored in LanceDB.
Environment configuration (optional):
- You can customize the data directory by setting OPENCODE_PK_DATA_DIR, e.g.:

```shell
export OPENCODE_PK_DATA_DIR=/custom/path
```
Additional notes
- Data location: By default, knowledge and vector data are stored under ~/.local/share/opencode-personal-knowledge/; override with OPENCODE_PK_DATA_DIR.
- Embeddings: Uses BGE-small-en-v1.5 with FastEmbed (auto-downloads on first use).
- Runtime: The MCP server is run via Bun (bunx). Ensure Bun is installed and available in your PATH.
- Embeddings and vector store are fully local; no external API keys are required.
- If you encounter permission issues writing to the data directory, check directory permissions and adjust OPENCODE_PK_DATA_DIR accordingly.
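A quick way to work around a permission problem is to point the server at a directory you own (the path below is illustrative):

```shell
# Choose a writable location and make sure it exists
export OPENCODE_PK_DATA_DIR="$HOME/opencode-pk-data"
mkdir -p "$OPENCODE_PK_DATA_DIR"

# Verify the directory is writable before starting the server
touch "$OPENCODE_PK_DATA_DIR/.write-test" && rm "$OPENCODE_PK_DATA_DIR/.write-test"
```

Remember to export the variable in the same environment that launches the MCP server, or the default path is used.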