langchain-mcp-client
🦜🔗 LangChain Model Context Protocol (MCP) Client
claude mcp add --transport stdio datalayer-langchain-mcp-client python -m langchain_mcp_client \
  --env API_KEYS="Your API keys for LLM providers (if required)"
How to use
The LangChain MCP Client is a Python-based client that connects to MCP servers and converts their available tools into LangChain-compatible tools. It lets you integrate multiple MCP servers and use them within LangChain workflows, with a LangChain-compatible LLM of your choice handling decision making. The core feature is convert_mcp_to_langchain_tools(), which initializes multiple MCP servers in parallel and exposes their tools as a List[BaseTool] that LangChain can consume in your agent or tool pipelines. The client also provides a CLI workflow, so you can interactively issue queries and test tool availability against connected MCP servers.
To use it, install the package, configure your MCP servers, and call convert_mcp_to_langchain_tools() in your code. You can then plug the resulting tools into a LangChain ReAct agent (or another agent framework), experiment with tool-invocation sequences, and iterate on the example queries defined in your llm_mcp_config.json5 configuration. This makes it straightforward to swap MCP endpoints or add new servers without changing your core LangChain integration.
How to install
Prerequisites:
- Python 3.11 or higher
- pip (comes with Python)
Installation steps:
# Optional: create a virtual environment
python -m venv venv
source venv/bin/activate # on Linux/macOS
venv\Scripts\activate.bat # on Windows
# Install the LangChain MCP Client
pip install langchain_mcp_client
Configuration steps:
- Prepare environment variables and API keys as needed for your LLMs or MCPs. Create a .env file if desired and load them in your application.
- Create or modify llm_mcp_config.json5 to specify LLM settings, MCP server connections, and example prompts that demonstrate how to invoke MCP tools. Refer to your MCP server configuration for compatibility.
- In your Python script, import and use convert_mcp_to_langchain_tools() to obtain tools, then build your LangChain agent or workflow as you normally would.
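For the .env approach mentioned in the configuration steps, a minimal loader can be sketched with the standard library alone. Note that load_dotenv_file is a hypothetical helper written for illustration, not part of langchain_mcp_client; the python-dotenv package provides a more robust equivalent.

```python
import os

def load_dotenv_file(path: str = ".env") -> None:
    """Load KEY=value pairs from a .env file into os.environ.

    Minimal sketch: skips blank lines and '#' comments; does not handle
    quoting, export prefixes, or multi-line values.
    """
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Variables already set in the environment win over .env entries
            os.environ.setdefault(key.strip(), value.strip())
```

Call load_dotenv_file() early in your script, before constructing LLM clients or MCP connections, so the keys are visible to them.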
# Example usage in code (conceptual)
from langchain_mcp_client import convert_mcp_to_langchain_tools

tools = convert_mcp_to_langchain_tools(
    mcp_server_names=["langchain-mcp-client"]
)
# Use `tools` with a LangChain agent as you would with any list of BaseTool instances
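The llm_mcp_config.json5 file referenced above could be structured along these lines. This is an illustrative sketch only: the field names here (llm, mcp_servers, example_queries) and the server entries are assumptions, not a documented schema, so check them against your version of the client.

```json5
{
  // LLM settings for the agent
  llm: {
    provider: "anthropic",
    model: "claude-sonnet-4-5",
  },
  // MCP servers to connect to; the keys are the names you pass in code
  mcp_servers: {
    "server-name": {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
      // Credentials required by this particular server
      env: { API_KEY: "your-key-here" },
    },
  },
  // Example prompts to try from the interactive CLI
  example_queries: [
    "List the files in the current directory",
  ],
}
```

Because JSON5 allows comments and trailing commas, the file can document itself as it grows to cover more servers.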
Additional notes
Tips and considerations:
- Ensure Python 3.11+ is active in your environment when running the MCP client.
- The mcp_config entry names (e.g., server-name) can be customized; you’ll reference them in your code when selecting which MCPs to connect to.
- If your MCP servers require API keys or environment-specific credentials, provide them via the env field in the mcp_config or load from a .env file at runtime.
- Configuration lives in llm_mcp_config.json5; this file should define LLM parameters, MCP server connections, and example prompts that guide tool invocation. Adjust the configuration to reflect your environment and desired tool sets.
- If you encounter network or authentication errors, verify that MCP server endpoints are reachable and that any required API keys are correctly provided.
- This is a Python-only client with no corresponding npm package; Node.js users should look for an equivalent npm package if one exists.
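For the network-troubleshooting tip above, a quick TCP reachability check can be sketched with the standard library. endpoint_reachable is a hypothetical helper, not part of langchain_mcp_client, and it only confirms that the host accepts connections, not that an MCP server (or valid credentials) is behind them; it applies to network transports rather than stdio servers.

```python
import socket

def endpoint_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout.

    A cheap first check before digging into MCP authentication errors:
    if this fails, the problem is connectivity, not credentials.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Run it against each configured server host before retrying failed tool calls, so you can separate unreachable endpoints from rejected API keys.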
Related MCP Servers
AgentChat
AgentChat is an LLM-based agent communication platform that ships with a default Agent and supports user-defined Agents. Through multi-turn dialogue and task collaboration, Agents can understand and help complete complex tasks. The project integrates LangChain, Function Calling, the MCP protocol, RAG, Memory, Milvus, and ElasticSearch for efficient knowledge retrieval and tool invocation, and uses FastAPI to build a high-performance backend service.
mcp-use-ts
mcp-use is the framework for MCP with the best DX - Build AI agents, create MCP servers with UI widgets, and debug with built-in inspector. Includes client SDK, server SDK, React hooks, and powerful dev tools.
mcp-toolbox-sdk-python
Python SDK for interacting with the MCP Toolbox for Databases.
langgraph-ai
LangGraph AI Repository
furi
CLI & API for MCP management
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.