# mle_kit_mcp

MLE kit MCP server: tools for working with files and code, locally and with a remote GPU.
Register the server with Claude Code:

```shell
claude mcp add --transport stdio ilyagusev-mle_kit_mcp -- uv run python -m mle_kit_mcp --port 5057
```
## How to use
MLE kit MCP provides a multi-tool development environment for ML workflows inside a containerized server. It exposes a bash environment running in a Docker container bound to your workspace, a text editor with undo support, fast file search (glob and ripgrep-backed grep), remote GPU helpers built on vast.ai, and an OpenRouter-backed OpenAI-compatible LLM proxy.

Start the server and point your MCP client at the `/mcp` endpoint to issue commands. Most tools come in local and remote variants: `remote_bash` executes commands on the remote GPU machine, `remote_text_editor` edits files there and syncs them back to your workspace, and `llm_proxy_local` / `llm_proxy_remote` run the LLM proxy either inside the local bash container or on the remote GPU.
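To make the `/mcp` endpoint concrete: MCP clients speak JSON-RPC 2.0, and the first message in a session is an `initialize` request. The sketch below only builds and prints such a payload (the protocol version string and client name are illustrative); in practice your MCP client sends it to `http://localhost:5057/mcp` for you.

```python
import json

# Illustrative MCP "initialize" request (JSON-RPC 2.0). A client would POST
# this to the server's /mcp endpoint; the field values below are examples.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # an MCP protocol revision string
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}
print(json.dumps(initialize, indent=2))
```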
Once running, you can access the following capabilities:
- Bash: Run commands inside an isolated Docker container with your WORKSPACE_DIR mounted, enabling repeatable experiments and isolated environments.
- Text editor: View and edit files in your workspace with undo support, directly through the MCP interface.
- Glob / grep: Quickly glob files or search file contents using ripgrep-backed utilities; useful for code exploration and refactoring.
- Remote GPU tools: Launch remote computations on vast.ai GPUs, and synchronize files to/from the remote machine for faster data processing and model training.
- LLM proxy: Start an OpenAI-compatible proxy backed by OpenRouter, enabling local or remote LLM access with customizable routing.
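To give a feel for the search capabilities, here is a minimal local sketch of what glob and grep do: match file paths by pattern, then scan matched files with a regex. This is illustrative only; the real tools are ripgrep-backed and called through MCP, and the function names below are hypothetical.

```python
import pathlib
import re
import tempfile

def glob_files(root: str, pattern: str) -> list[str]:
    """Return workspace-relative file paths matching a glob pattern."""
    base = pathlib.Path(root)
    return sorted(str(p.relative_to(base)) for p in base.glob(pattern) if p.is_file())

def grep_files(root: str, pattern: str, regex: str) -> list[tuple[str, int, str]]:
    """Return (path, line_number, line) for regex hits in matching files."""
    hits = []
    rx = re.compile(regex)
    base = pathlib.Path(root)
    for rel in glob_files(root, pattern):
        for i, line in enumerate((base / rel).read_text().splitlines(), start=1):
            if rx.search(line):
                hits.append((rel, i, line))
    return hits

# Demo on a throwaway workspace
with tempfile.TemporaryDirectory() as ws:
    (pathlib.Path(ws) / "train.py").write_text("def train():\n    pass\n")
    (pathlib.Path(ws) / "eval.py").write_text("def evaluate():\n    pass\n")
    print(glob_files(ws, "*.py"))                 # ['eval.py', 'train.py']
    print(grep_files(ws, "*.py", r"def train"))   # [('train.py', 1, 'def train():')]
```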
## How to install

### Prerequisites
- Python 3.12+ installed on your host
- Docker daemon running if you plan to use the local bash tool
- ripgrep (rg) installed on the host for grep tool usage
- Network access to install Python packages (online environment)
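A quick way to check these prerequisites from Python (a convenience sketch, not part of the repo):

```python
import shutil
import sys

# Each check maps a prerequisite to whether it is satisfied on this host.
checks = {
    "python 3.12+": sys.version_info >= (3, 12),
    "docker (needed for the bash tool)": shutil.which("docker") is not None,
    "ripgrep (needed for the grep tool)": shutil.which("rg") is not None,
}
for name, ok in checks.items():
    print(f"{name}: {'ok' if ok else 'MISSING'}")
```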
### Installation steps

- Create and activate a virtual environment (optional but recommended):

  ```shell
  python -m venv .venv
  source .venv/bin/activate
  ```

- Install the MCP package in editable mode (from the repo root):

  ```shell
  python -m pip install -e .
  ```
- Optionally install uv (the `uvx` command ships with it; it is not a separate package):

  ```shell
  pip install uv
  ```
- Run the MCP server (example port 5057):

  ```shell
  WORKSPACE_DIR=/absolute/path/to/workdir python -m mle_kit_mcp --port 5057
  ```
- Optional: run via Docker instead (same behavior):

  ```shell
  docker build -t mle_kit_mcp .
  docker run --rm -p 5057:5057 \
    -e PORT=5057 \
    -e WORKSPACE_DIR=/workspace \
    -v "$PWD/workdir:/workspace" \
    mle_kit_mcp
  ```
Notes:
- The server defaults to port 5057 if you do not specify --port.
- Ensure WORKSPACE_DIR is set to an absolute path where you want the workspace mounted inside the container.
- For remote GPU features, configure the required environment variables such as VAST_AI_KEY and related settings as documented in the README.
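Pulling the notes together, a typical environment for a full-featured run might look like the fragment below (the variable names come from the notes above; the values are placeholders you must fill in yourself):

```shell
export WORKSPACE_DIR=/absolute/path/to/workdir  # mounted into the bash container
export VAST_AI_KEY=...                          # required for the remote GPU tools
export OPENROUTER_API_KEY=...                   # required for the LLM proxy tools
# export OPENROUTER_BASE_URL=...                # only if you use a custom OpenRouter endpoint
```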
## Additional notes
Tips and common issues:
- If you encounter permission or filesystem issues, ensure the host user has access to the workspace directory and Docker is allowed to bind mount the workspace.
- Remote GPU tools rely on vast.ai; ensure you have a valid VAST_AI_KEY and that you understand the cost implications of creating remote instances.
- For LLM proxy tools, you may need an OPENROUTER_API_KEY; set OPENROUTER_BASE_URL if you use a custom OpenRouter endpoint.
- The bash tool runs inside a Docker container with the workspace mounted at /workdir; commands should reference paths relative to the workspace, not the host filesystem.
- If you edit the server or dependencies, re-install in editable mode to pick up changes, or rebuild the Docker image.
- The MCP endpoint is served at /mcp by default; you can specify different transports if supported by your frontend client.
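Once one of the LLM proxy tools is running, any OpenAI-compatible client can talk to it. Here is a hedged sketch using only the standard library; the base URL, port, model name, and helper function are assumptions for illustration, so check the proxy's startup output for the real address.

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request in the OpenAI /chat/completions wire format."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Assumed proxy address and model; adjust to what the proxy actually reports.
req = build_chat_request("http://localhost:8000/v1", "dummy-key", "openai/gpt-4o-mini", "Hello!")
print(req.full_url)  # http://localhost:8000/v1/chat/completions
# urllib.request.urlopen(req) would send it once the proxy is up.
```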