mcp-server-with-langchain-client
A LangChain + LangGraph client that uses tools from an MCP server
claude mcp add --transport stdio viniciusfinger-mcp-server-with-langchain-client uvx viniciusfinger-mcp-server-with-langchain-client
How to use
This project pairs a FastMCP-based server with a LangChain + LangGraph client. The client implements a ReAct-style agent: a graph-based state machine manages the conversation, and tying state to a unique thread_id gives the agent memory across turns. The server exposes tools via FastMCP; the client reasons over the user's request, selects and executes tools, and returns the results. In practice, you run the Python server, optionally expose it behind an ngrok tunnel, and then interact with the client's /ask endpoint to query data, perform actions, or retrieve information through the agent's toolset.
To use it, first start the client with uv, then call the /ask endpoint to begin a conversation or query the agent. The client orchestrates reasoning and actions, using LangChain + LangGraph under the hood to maintain state and keep the dialogue coherent. The MCP server supplies the tools and endpoints the client needs to perform tasks in a controlled environment, with per-thread_id tracing to preserve context across requests.
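As a concrete sketch, a minimal Python caller for the /ask endpoint could look like the following. Only the /ask path and the thread_id concept come from this README; the payload field names and the localhost URL are assumptions, so adjust them to match the client's actual request schema:

```python
import json
import urllib.request

ASK_URL = "http://localhost:8000/ask"  # assumed local address; use your ngrok URL if tunneled

def build_payload(question: str, thread_id: str) -> dict:
    # thread_id ties the request to one conversation so the agent keeps memory across turns
    return {"question": question, "thread_id": thread_id}

def ask(question: str, thread_id: str, url: str = ASK_URL) -> str:
    # POST the question as JSON and return the agent's raw response body
    data = json.dumps(build_payload(question, thread_id)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# Reuse the same thread_id across calls to preserve context:
# ask("What tools do you have?", thread_id="demo-thread")
# ask("Use the first one", thread_id="demo-thread")
```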
How to install
Prerequisites:
- Python 3.9+ installed
- uv (the Astral package manager, which provides the uvx runner used in this project) available for running Python modules
- Access to a network path to install dependencies (if behind a firewall)
Installation steps:
- Clone the repository:
git clone https://github.com/your-org/viniciusfinger-mcp-server-with-langchain-client.git
cd viniciusfinger-mcp-server-with-langchain-client
- Install dependencies using uv (per project convention):
uv sync
- Start the ngrok tunnel if you need an externally accessible URL (optional):
ngrok http --url={ngrok_url} 8000
- Run the MCP server (port 8000 by default):
uv run python main.py --port 8000
- Run the client the same way (via uv run), making sure its configuration points at the server endpoint (the ngrok URL or http://localhost:8000).
Note: Replace placeholders like {ngrok_url} with the actual URL provided by ngrok or your reverse proxy. If you update dependencies, re-run uv sync to install required packages.
Additional notes
Tips and common considerations:
- The client uses a thread_id to maintain conversation state. Ensure you pass a consistent thread_id across related requests to preserve memory and context.
- Consider adding a security layer (token or API key) to protect the MCP server from unauthorized access.
- Enable logging with thread_id traces to simplify debugging and monitoring of conversations.
- If you modify tools or prompts, update tests to cover both reasoning and action steps to prevent regressions.
- For production, configure a Redis- or database-backed checkpointer with TTL and conversation summaries to keep memory usage bounded.
- Ensure the ngrok or reverse proxy configuration forwards headers properly if you rely on them for tracing or authentication.
Related MCP Servers
AgentChat
AgentChat is an LLM-based agent communication platform with a built-in default Agent and support for user-defined Agents. Through multi-turn dialogue and task collaboration, Agents can understand and help complete complex tasks. The project integrates LangChain, Function Calling, the MCP protocol, RAG, Memory, Milvus, and ElasticSearch for efficient knowledge retrieval and tool invocation, with a high-performance backend built on FastAPI.
langgraph-ai
LangGraph AI Repository
MCP-MultiServer-Interoperable-Agent2Agent-LangGraph-AI-System
This project demonstrates a decoupled real-time agent architecture that connects LangGraph agents to remote tools served by custom MCP (Model Context Protocol) servers. Each tool can be hosted independently (via SSE or stdio), yielding a flexible, modular, and cloud-deployable multi-agent system.
ai-learning
AI Learning: A comprehensive repository for Artificial Intelligence and Machine Learning resources, primarily using Jupyter Notebooks and Python. Explore tutorials, projects, and guides covering foundational to advanced concepts in AI, ML, DL, and generative/agentic AI.
mcp-templates
A flexible platform that provides Docker & Kubernetes backends, a lightweight CLI (mcpt), and client utilities for seamless MCP integration. Spin up servers from templates, route requests through a single endpoint with load balancing, and support both deployed (HTTP) and local (stdio) transports — all with sensible defaults and YAML-based configs.
alris
Alris is an AI automation tool that transforms natural language commands into task execution.