mcp-snippets
MCP server from micro-agent/mcp-snippets-server
claude mcp add --transport stdio micro-agent-mcp-snippets-server go run main.go \
  --env LIMIT="Similarity threshold (default: 0.6)" \
  --env MAX_RESULTS="Maximum search results (default: 2)" \
  --env MCP_HTTP_PORT="HTTP server port (default: 9090)" \
  --env EMBEDDING_MODEL="Embedding model name (default: ai/mxbai-embed-large:latest)" \
  --env JSON_STORE_FILE_PATH="Vector store file path (default: rag-memory-store.json)" \
  --env MODEL_RUNNER_BASE_URL="OpenAI-compatible API endpoint (default: http://localhost:12434/engines/llama.cpp/v1/)"
How to use
This MCP server processes Markdown files containing code snippets, builds a vector store from them, and exposes an MCP tool named search_snippet. It uses semantic embeddings to perform retrieval-augmented search over the code snippets and returns the most relevant matches. To use it, start the server and then call the MCP tool via the provided /mcp endpoint. The available tool is search_snippet, which accepts a topic string and returns code snippets that semantically relate to that topic.
How to install
Prerequisites:
- Go installed (go1.20+ recommended)
- Access to a starting directory containing a snippets/ folder and Markdown files
Installation steps:
- Clone the repository or download the source code.
- Ensure your environment variables are set (see the configuration listed above) or create a .env file.
- Build/run the server:
# From the project root
go run main.go
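If you use a .env file instead of exporting variables, a sketch with the defaults listed in the configuration above would look like this:

```shell
# .env — values shown are the documented defaults
LIMIT=0.6
MAX_RESULTS=2
MCP_HTTP_PORT=9090
EMBEDDING_MODEL=ai/mxbai-embed-large:latest
JSON_STORE_FILE_PATH=rag-memory-store.json
MODEL_RUNNER_BASE_URL=http://localhost:12434/engines/llama.cpp/v1/
```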
If you prefer Docker, see the Docker instructions in the README and use the provided docker-compose setup with the mcp-snippets-server image. The server will process Markdown files on first run and persist embeddings to the configured JSON store.
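As a rough orientation, a Docker Compose service for this server might look like the sketch below. The image name, volume path, and store location are assumptions; the repository's own compose.yml is authoritative.

```yaml
services:
  mcp-snippets-server:
    image: mcp-snippets-server        # assumed image name; see the README
    ports:
      - "9090:9090"                   # host mapping must match MCP_HTTP_PORT
    environment:
      - MCP_HTTP_PORT=9090
      - MODEL_RUNNER_BASE_URL=http://localhost:12434/engines/llama.cpp/v1/
    volumes:
      - ./snippets:/app/snippets      # assumed container path for Markdown snippets
```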
Additional notes
Tips and common considerations:
- The vector store is persisted to JSON (default rag-memory-store.json) to speed up subsequent startups.
- Update the snippets/ directory by adding or removing Markdown files and restart the server to reprocess and refresh embeddings.
- Configure OpenAI-compatible embeddings via EMBEDDING_MODEL and MODEL_RUNNER_BASE_URL for local or cloud embeddings.
- If using Docker, ensure the MCP_HTTP_PORT mapping matches your environment and consider using the provided compose.yml for quick deployment.
- The MCP tool endpoint is exposed at /mcp, and you can call search_snippet with a topic string to retrieve relevant code snippets.
Related MCP Servers
trpc-agent-go
trpc-agent-go is a powerful Go framework for building intelligent agent systems using large language models (LLMs) and tools.
station
Station is our open-source runtime that lets teams deploy agents on their own infrastructure with full control.
tiger-cli
Tiger CLI is the command-line interface for Tiger Cloud. It includes an MCP server for helping coding agents write production-level Postgres code.
gopls
MCP server for Go project development: extends an AI code agent's capabilities with semantic understanding and deterministic information about Go projects.
kubernetes
A Model Context Protocol (MCP) server for the Kubernetes API.
gcp-cost
💰 An MCP server that enables AI assistants to estimate Google Cloud costs, powered by Cloud Billing Catalog API and built with Genkit for Go