lc2mcp
Convert LangChain tools to FastMCP tools
claude mcp add --transport stdio xiaotonng-lc2mcp python -m lc2mcp
How to use
lc2mcp is a Python-based MCP server that adapts existing LangChain tools into FastMCP tools, enabling you to expose them to MCP clients such as Claude and Cursor with minimal boilerplate. The server acts as an adapter layer: it takes LangChain tools (defined with @tool or via tool-like wrappers), converts them into MCP-compatible tools, and serves them through the FastMCP framework. This allows you to leverage LangChain's extensive tool ecosystem while providing standardized MCP endpoints that external clients can discover and invoke. You can also inject authentication context and request metadata into tools, receive real-time progress updates, and rely on automatic JSON schema generation for tool inputs.
To use lc2mcp, install the package, expose your LangChain tools with lc2mcp’s register_tools helper, and run the MCP server. Your tools will then be available to MCP clients under a namespace you provide (for example, a server named weather-server or knowledge-server). The server supports features such as context injection (auth, user info, request context) and progress logging, enabling richer interactions with clients and better observability.
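To make the adapter idea concrete, here is a minimal stdlib sketch of the kind of conversion such a layer performs: taking a plain Python function's signature and docstring and producing an MCP-style tool spec with a JSON input schema. This is an illustration of the concept, not lc2mcp's actual implementation; `to_mcp_spec` and `search_docs` are made-up names for this sketch.

```python
import inspect

# Map a few common Python annotations to JSON schema type names.
PYTHON_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}

def to_mcp_spec(fn):
    """Build an MCP-style tool spec (name, description, input schema)
    from a plain Python function's signature and docstring."""
    sig = inspect.signature(fn)
    properties = {}
    required = []
    for name, param in sig.parameters.items():
        json_type = PYTHON_TO_JSON.get(param.annotation, "string")
        properties[name] = {"type": json_type}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # parameters without defaults are required
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "inputSchema": {
            "type": "object",
            "properties": properties,
            "required": required,
        },
    }

def search_docs(query: str, limit: int = 5) -> str:
    """Search the documentation index."""
    return f"{limit} results for {query}"

spec = to_mcp_spec(search_docs)
# spec["inputSchema"]["required"] == ["query"]; limit has a default, so it is optional
```

In the real package, lc2mcp derives this schema automatically from your LangChain tool definitions, so MCP clients can discover parameter names and types without extra work.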
How to install
Prerequisites:
- Python 3.10+ (or as required by lc2mcp and LangChain dependencies)
- pip (Python package manager)
Steps:
- Create and activate a Python virtual environment (optional but recommended):
  python -m venv venv
  source venv/bin/activate  # On Windows use: venv\Scripts\activate.bat
- Install lc2mcp, plus any LangChain components you plan to expose:
  pip install lc2mcp
  pip install langchain langchain-core  # optional; adjust to the tools you plan to expose
- Prepare a Python script that defines and registers your tools: import register_tools, define your LangChain tools, register them with a FastMCP instance, and start the server. Example usage is provided in the README.
- Run the server:
  python -m lc2mcp
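The script in the third step might look something like the sketch below. The exact `register_tools` call shape is an assumption based on this README's description, so treat it as a starting point and check the lc2mcp documentation for the real signature.

```python
# Hypothetical tool-serving script; the register_tools usage below is an
# assumption based on this README, not a verified lc2mcp signature.

def get_weather(city: str) -> str:
    """Return a short weather summary for a city.

    Args:
        city: Name of the city to look up.
    """
    # Placeholder logic; a real tool would call a weather API here.
    return f"Sunny in {city}"

def serve() -> None:
    # Imports live here so the plain function above stays usable on its own.
    from langchain_core.tools import tool   # LangChain's @tool decorator
    from fastmcp import FastMCP             # FastMCP server
    from lc2mcp import register_tools       # helper described in this README

    weather = tool(parse_docstring=True)(get_weather)
    mcp = FastMCP("weather-server")         # namespace your clients will see
    register_tools(mcp, [weather])          # assumed call shape
    mcp.run()                               # stdio transport by default

# Call serve() from your entry point to start the MCP server.
```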
Notes:
- If you prefer containerized deployment, build a Docker image that installs Python and lc2mcp, then run python -m lc2mcp as the container's entry point.
- You can customize authentication context or pass environment variables to configure the server's behavior (see Additional notes below).
Additional notes
Tips and common considerations:
- Environment variables: You can pass authentication or runtime context through your MCP client calls; lc2mcp supports injecting a runtime context into tools. Consider configuring environment-based defaults for user IDs, tenants, or API keys if your deployment requires them.
- Tool descriptions and input schemas: To ensure parameter descriptions are exposed in the MCP JSON schema, use parse_docstring=True on your tool or supply an explicit args_schema. Without these, tool parameter documentation may not be exposed automatically.
- Logging and progress: lc2mcp supports progress notifications and logging to help clients understand long-running tool calls. Use the provided Context or mcp_ctx utilities to emit updates.
- Namespace and conflicts: lc2mcp offers namespace support to prefix tool names and avoid conflicts when exposing multiple tools. Plan naming to minimize collisions across your MCP server ecosystem.
- Version compatibility: Ensure your LangChain version and lc2mcp version are compatible with your Python environment, especially if you rely on newer LangChain features.
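To illustrate the docstring tip above: parse_docstring=True works because parameter descriptions can be recovered from a Google-style Args: section. The stdlib sketch below shows the kind of information that gets surfaced; extract_arg_descriptions is an illustrative helper, not part of lc2mcp or LangChain.

```python
def extract_arg_descriptions(docstring: str) -> dict:
    """Pull parameter descriptions out of a Google-style Args: section,
    the same information parse_docstring=True surfaces for LangChain tools."""
    descriptions = {}
    in_args = False
    for line in (docstring or "").splitlines():
        stripped = line.strip()
        if stripped == "Args:":
            in_args = True
            continue
        if in_args:
            if not stripped:  # a blank line ends the Args: section
                break
            if ":" in stripped:
                name, _, desc = stripped.partition(":")
                descriptions[name.strip()] = desc.strip()
    return descriptions

doc = """Look up current weather.

Args:
    city: Name of the city.
    units: 'metric' or 'imperial'.
"""
arg_docs = extract_arg_descriptions(doc)
# → {'city': 'Name of the city.', 'units': "'metric' or 'imperial'."}
```

Without a parseable docstring or an explicit args_schema, MCP clients see parameter names and types but no human-readable descriptions, which makes tools harder for LLMs to use correctly.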
Related MCP Servers
mcp-telegram
MCP Server for Telegram
fal
MCP server for Fal.ai - Generate images, videos, music and audio with Claude
skill-to
Convert AI Skills (Claude Skills format) to MCP server resources - Part of BioContextAI
mcp-mattermost
MCP server for Mattermost — let Claude, Cursor, and other AI assistants work with channels, messages, and files
memory
An MCP (Model Context Protocol) server providing long-term memory for LLMs
openapi-to
Transform OpenAPI specifications into production-ready MCP servers with AI-powered evaluation and enhancement. Leverages LLMs to analyze, improve, and generate Model Context Protocol implementations from your existing API documentation.