
fs

An MCP server for file and document search

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio boleyn-fs-mcp-server python main.py \
  --env SAFE_DIRECTORY="/your/project/directory" \
  --env DEFAULT_ENCODING="utf-8" \
  --env MAX_FILE_SIZE_MB="100" \
  --env EMBEDDING_CHUNK_SIZE="1000" \
  --env EMBEDDING_MODEL_NAME="BAAI/bge-m3" \
  --env OPENAI_EMBEDDINGS_API_KEY="<your-api-key>" \
  --env OPENAI_EMBEDDINGS_BASE_URL="<embedding-service-url>"

How to use

FS-MCP is a Python-based MCP server that provides intelligent file reading and semantic search across multiple document formats. It auto-detects text content, converts documents to Markdown for streamlined viewing, and builds a vector index to enable fast, AI-powered searches across your file corpus. You can access the server via Server-Sent Events (SSE) at http://localhost:3002 and use the included MCP tools to explore, read, and search your files. The built-in tools include directory visualization, line-range reading, semantic search, and index management, making it easy to scope large codebases or document collections and retrieve relevant results quickly.
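The vector index described above is built by splitting each file into fixed-size chunks before embedding. A minimal sketch of that chunking idea, assuming character-based chunks of EMBEDDING_CHUNK_SIZE with a small overlap (the overlap value and function name are illustrative assumptions, not taken from the server's code):

```python
def chunk_text(text: str, chunk_size: int = 1000, overlap: int = 100) -> list[str]:
    """Split text into overlapping character chunks for embedding.

    chunk_size mirrors EMBEDDING_CHUNK_SIZE; the overlap preserves context
    that would otherwise be cut off at a chunk boundary.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks
```

Each chunk is then embedded and stored in the index, so a query can match a relevant passage even deep inside a large file.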

To get started, launch the server through its Python entry point (main.py). Then use the MCP tools such as view_directory_tree, read_file_content, search_documents, rebuild_document_index, get_document_stats, list_files, and preview_file to interact with your files. The vector search uses AI embeddings to interpret natural-language queries and map them to relevant code or text across languages, including English and Chinese. For integration, connect clients over SSE to receive updates and results in real time.
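Under the hood, search_documents-style semantic search typically ranks indexed chunks by cosine similarity between the query embedding and each chunk embedding. A toy sketch of that ranking step (the vectors below are made up for illustration; in the real server they come from the configured embedding service, e.g. BAAI/bge-m3):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity: dot product divided by the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def rank_chunks(query_vec: list[float], chunk_vecs: list[list[float]]) -> list[int]:
    # Return chunk indices ordered from most to least similar to the query.
    scores = [(cosine(query_vec, v), i) for i, v in enumerate(chunk_vecs)]
    return [i for _, i in sorted(scores, reverse=True)]
```

This is why natural-language queries work: the query and the documents live in the same embedding space, so similarity in that space approximates relevance.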

How to install

Prerequisites:

  • Python 3.12 or higher
  • Git
  • Optional: a GPU-enabled environment for faster embedding generation
  1. Clone the repository
git clone https://github.com/yourusername/fs-mcp.git
cd fs-mcp
  2. Create and activate a Python virtual environment (recommended)
python -m venv venv
# On Windows
venv\Scripts\activate
# On macOS/Linux
source venv/bin/activate
  3. Install dependencies
pip install -r requirements.txt
# Or install core dependencies directly if requirements.txt is not present
pip install fastmcp>=2.0.0 langchain>=0.3.0 python-dotenv>=1.1.0
  4. Prepare environment configuration

Create a .env file or set environment variables as needed (SAFE_DIRECTORY, MAX_FILE_SIZE_MB, DEFAULT_ENCODING, OPENAI_EMBEDDINGS_API_KEY, etc.). Example:

export SAFE_DIRECTORY=/path/to/your/files
export MAX_FILE_SIZE_MB=100
export DEFAULT_ENCODING=utf-8
export OPENAI_EMBEDDINGS_API_KEY=your-api-key
export OPENAI_EMBEDDINGS_BASE_URL=https://your-embedding-service/v1
export EMBEDDING_MODEL_NAME=BAAI/bge-m3
export EMBEDDING_CHUNK_SIZE=1000
  5. Run the server
python main.py

The server will start and listen on http://localhost:3002, automatically building the vector index.
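At startup the server reads the variables set above (python-dotenv is listed among the core dependencies, which suggests .env support). A stdlib-only sketch of how such configuration might be parsed into typed settings — the Settings class and its defaults are illustrative assumptions, not the server's actual code:

```python
import os
from dataclasses import dataclass

@dataclass
class Settings:
    safe_directory: str
    max_file_size_mb: int
    default_encoding: str
    embedding_chunk_size: int

def load_settings() -> Settings:
    # Fall back to the documented defaults when a variable is unset.
    return Settings(
        safe_directory=os.environ.get("SAFE_DIRECTORY", "."),
        max_file_size_mb=int(os.environ.get("MAX_FILE_SIZE_MB", "100")),
        default_encoding=os.environ.get("DEFAULT_ENCODING", "utf-8"),
        embedding_chunk_size=int(os.environ.get("EMBEDDING_CHUNK_SIZE", "1000")),
    )
```

Numeric variables like MAX_FILE_SIZE_MB arrive as strings from the environment, so they must be converted before use.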

Additional notes

Environment variables and configuration options can be customized for production deployments. Common considerations include:

  • SAFE_DIRECTORY should be a restricted path to prevent unauthorized access
  • MAX_FILE_SIZE_MB controls the maximum readable file size
  • Ensure your embedding service keys and URL (OPENAI_EMBEDDINGS_API_KEY, OPENAI_EMBEDDINGS_BASE_URL) are valid
  • For large datasets, consider external vector databases or enhanced caching
  • If you use a different model or chunk size, adjust EMBEDDING_MODEL_NAME and EMBEDDING_CHUNK_SIZE accordingly
  • The MCP tools (view_directory_tree, read_file_content, search_documents, etc.) can be invoked from your client or script to interact with the server
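The SAFE_DIRECTORY restriction mentioned above is usually enforced by resolving every requested path and rejecting anything that escapes the allowed root. A sketch of such a check (is_within_safe_directory is a hypothetical helper for illustration, not a documented server function):

```python
from pathlib import Path

def is_within_safe_directory(requested: str, safe_root: str) -> bool:
    # Resolve symlinks and ".." segments before comparing, so a path like
    # "/safe/../etc/passwd" cannot slip past a naive string-prefix check.
    root = Path(safe_root).resolve()
    target = Path(requested).resolve()
    return target == root or root in target.parents
```

Resolving before comparing is the important part: a plain startswith() check on the raw string is vulnerable to traversal via ".." or symlinks.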
