ml
A comprehensive MCP (Model Context Protocol) server for ML model training, fine-tuning, and experimentation. Transform your AI assistant into a full ML engineering environment.
claude mcp add --transport stdio pushpullcommitpush-ml-mcp ml-lab serve
How to use
This ML-focused MCP server provides an integrated environment for model training, fine-tuning, dataset management, experiment tracking, and cloud/infrastructure orchestration. It exposes tools to manage credentials securely, register and transform datasets, configure and run experiments, and coordinate multi-backend training workflows (local, Mistral API, Together AI, and OpenAI fine-tuning). With the built-in MCP tools you can register data, set up provider credentials, estimate training costs, launch training runs, monitor their status, and manage cloud or VPS infrastructure. The Ollama and Open WebUI integrations let you deploy and test models both locally and through web interfaces, with model management and knowledge-base capabilities. The server operates via the ml-lab-mcp CLI (installed with pip) and can provision pipelines across providers, track experiments in SQLite, and store credentials in an encrypted vault.
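The cost-estimation step mentioned above comes down to simple token arithmetic. A minimal sketch of the idea — the function name and the per-million-token price are illustrative placeholders, not ml-lab-mcp's actual API or any real provider's rates:

```python
# Hypothetical back-of-envelope estimate for a fine-tuning run.
# Prices here are made-up placeholders, not real provider rates.

def estimate_cost(dataset_tokens: int, epochs: int, price_per_million: float) -> float:
    """Rough cost: total trained tokens times a per-million-token price."""
    trained_tokens = dataset_tokens * epochs
    return trained_tokens / 1_000_000 * price_per_million

# e.g. a 2M-token dataset, 3 epochs, at a placeholder $8 per 1M trained tokens
cost = estimate_cost(2_000_000, 3, 8.0)
print(f"estimated cost: ${cost:.2f}")  # estimated cost: $48.00
```

Real estimates from the server's tools will also account for provider-specific pricing tiers and any validation-split overhead.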
How to install
Prerequisites:
- Python 3.8+ installed on your system.
- pip available in PATH.
- Optional: a CUDA-capable GPU and NVIDIA drivers if you plan to run local training.

Install the MCP package with optional features:

```
pip install ml-lab-mcp

# With training dependencies (GPU/accelerated)
pip install ml-lab-mcp[training]

# With cloud provider support (for provisioning/usage)
pip install ml-lab-mcp[cloud]

# All features (training, cloud, dev)
pip install ml-lab-mcp[training,cloud,dev]
```

Run the server (example using the recommended CLI):

```
ml-lab serve
```
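For MCP clients configured through a JSON file (for example Claude Desktop's claude_desktop_config.json), a roughly equivalent entry might look like the following — the `ml-lab` key is an assumption; adjust the server name to your client's conventions:

```json
{
  "mcpServers": {
    "ml-lab": {
      "command": "ml-lab",
      "args": ["serve"]
    }
  }
}
```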
Additional notes
Tips and common issues:
- Ensure your Python environment is clean (virtualenv/conda) to avoid dependency conflicts.
- When using the credentials vault, regularly rotate keys and enable expirations where supported.
- If you encounter network issues while provisioning GPUs, check firewall rules and API quotas on your cloud provider.
- For Open WebUI or Ollama integrations, confirm that the respective services are reachable (URLs, ports, and auth).
- The system stores experiments in SQLite; for long-term usage, consider configuring a more robust database if needed.
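Because experiments live in SQLite, you can inspect or prototype against the data with Python's standard library alone. A minimal sketch assuming an illustrative schema — the actual table layout used by ml-lab-mcp may differ:

```python
import sqlite3

# Illustrative schema only; ml-lab-mcp's real tables may differ.
conn = sqlite3.connect(":memory:")  # use a file path to read a real database
conn.execute("""
    CREATE TABLE IF NOT EXISTS experiments (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        backend TEXT,
        status TEXT DEFAULT 'pending',
        created_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")
conn.execute(
    "INSERT INTO experiments (name, backend) VALUES (?, ?)",
    ("mistral-7b-lora-run1", "together"),
)
conn.commit()
row = conn.execute("SELECT name, status FROM experiments").fetchone()
print(row)  # ('mistral-7b-lora-run1', 'pending')
```

Pointing the same queries at the server's database file is a quick way to audit run history before migrating to a heavier database.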
Related MCP Servers
mcp-odoo
A Model Context Protocol (MCP) server that enables AI assistants to securely interact with Odoo ERP systems through standardized resources and tools for data retrieval and manipulation.
mcp-pinecone
Model Context Protocol server that allows reading from and writing to Pinecone. Rudimentary RAG support.
Gitingest
MCP server for gitingest.
blender-open
Open Models MCP for Blender Using Ollama
microsoft_fabric_mcp
MCP server wrapping the Microsoft Fabric REST API.
mcp-memos-py
A Python package enabling LLM models to interact with the Memos server via the MCP interface for searching, creating, retrieving, and managing memos.