langtools
Supercharge AI agents with programming language tools
claude mcp add --transport stdio flothjl-langtools-mcp uvx langtools-mcp
How to use
langtools-mcp provides a unified interface for running static analysis tools and LSP-backed clients to help AI agents validate, lint, and debug code across Python and Go. It exposes both batch CLI checkers (like Ruff for Python and go vet for Go) and language servers (such as Pyright and gopls) through a lightweight HTTP daemon. This allows an AI agent or IDE to request analysis, linting, or go-to-definition capabilities via MCP requests, receive structured results, and apply fixes or explanations in context. You can configure which tools are enabled and which commands are run by adjusting the server's environment and language/tool mappings.
To use the server, run it with the provided MCP configuration (uvx in this case) and then query it for the tools you need. For Python, you'll typically have Ruff and Pyright available; for Go, vet is exposed. The server can run as a long-lived daemon for extended analysis sessions or be invoked in batch mode for quick checks. As you integrate it with your Go or Python projects, the MCP server helps the agent catch issues, explain them, and suggest fixes by delegating to the same tools developers use, keeping analysis consistent with standard workflows.
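The "structured results" mentioned above can be pictured as a list of per-tool diagnostics. The sketch below is an illustration only: the actual response schema is defined by MCP and langtools-mcp, and the `Diagnostic` shape and `summarize` helper here are hypothetical, modeled on the kind of output Ruff and go vet produce.

```python
from dataclasses import dataclass

# Hypothetical shape of a structured diagnostic, for illustration only;
# langtools-mcp's real wire format is defined by the server, not here.
@dataclass
class Diagnostic:
    tool: str      # e.g. "ruff" or "vet"
    file: str
    line: int
    code: str      # tool-specific rule ID, e.g. Ruff's "F401"
    message: str

def summarize(diags: list[Diagnostic]) -> dict[str, int]:
    """Count diagnostics per tool, as an agent might before triaging fixes."""
    counts: dict[str, int] = {}
    for d in diags:
        counts[d.tool] = counts.get(d.tool, 0) + 1
    return counts

diags = [
    Diagnostic("ruff", "app.py", 1, "F401", "'os' imported but unused"),
    Diagnostic("vet", "main.go", 12, "printf", "Sprintf call has wrong arg type"),
]
print(summarize(diags))  # {'ruff': 1, 'vet': 1}
```

An agent consuming results of this general shape can group findings by file or tool before deciding which fixes to apply in-context.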
How to install
Prerequisites:
- Python 3.10+ and a working Go toolchain in PATH (for Python and Go tool support)
- Internet access to install Python packages and language servers
Installation steps:
- Clone the repository:
  git clone https://github.com/flothjl/langtools-mcp.git
  cd langtools-mcp
- Install the package, either directly via uvx or in editable mode with dev dependencies: pip install -e .[dev] (as described in the project README)
- Ensure language tools are installed and available in PATH:
  - Python: Ruff, Pyright
  - Go: go vet (and ensure Go tooling is installed)
  - Optional language servers: pyright, gopls, etc.
- Run the MCP server: uvx langtools-mcp
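Since the server depends on finding the tools on PATH, a quick sanity check before starting it can save debugging time. A minimal sketch (POSIX shell; the tool list here assumes the Python/Go setup described above):

```shell
# Report which of the expected tools are discoverable on PATH.
for tool in ruff pyright go; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING from PATH"
  fi
done
```

Any line reporting MISSING points at a prerequisite to install before the server can delegate to that tool.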
Notes:
- The exact commands may vary if you choose to run in a different environment (e.g., npx or Docker).
- If you need to customize tool mappings or environment variables, adjust the environment and tool configuration accordingly.
Additional notes
Tips and caveats:
- Ensure Ruff, Pyright, and Go tooling are installed and available in your shell PATH for Python and Go support.
- When using Goose or other agents, provide the correct configuration so the MCP server can surface the expected CLI and LSP capabilities.
- If you encounter issues with tool discovery, verify that environment variables (LANGTOOLS_PYTHON_TOOLS and LANGTOOLS_GO_GO_TOOLS) are set to include the desired tools (e.g., ["ruff"] and ["vet"]).
- The roadmap indicates planned Rust and JS/TS tool support; current focus is Python and Go analysis through CLI tools.
- Running in daemon mode provides faster iterations for long-running sessions, while batch mode is suitable for quick checks.
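The tool-discovery tip above can be put into practice as a shell setup sketch. The variable names and values are taken directly from that tip; the exact expected format is an assumption here, so check the project README for the authoritative syntax before relying on it:

```shell
# Hypothetical environment setup, using the variable names and
# JSON-list values quoted in the tips above (format unverified).
export LANGTOOLS_PYTHON_TOOLS='["ruff"]'
export LANGTOOLS_GO_GO_TOOLS='["vet"]'
# Then start the server as usual, e.g.: uvx langtools-mcp
```

With these set, the server should surface Ruff for Python and vet for Go; omit a tool from the list to disable it.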