Model-Context-Protocol-MCP-Demo-with-Langchain-MCP-Adapters-Ollama
A demo implementation of the Model Context Protocol (MCP) using the LangChain MCP Adapters and Ollama.
claude mcp add --transport stdio ishaanlabs-model-context-protocol-mcp-demo-with-langchain-mcp-adapters-ollama python mathserver.py
How to use
This MCP demo exposes two simple servers: a math server and a weather server. The math server offers basic arithmetic operations, while the weather server returns simulated weather data. A client script (client.py) demonstrates how an LLM-powered agent discovers and invokes these tools over MCP. To go further, run the two servers in separate processes and use the multi-client script (multiclient.py) to see how a single agent orchestrates across multiple tools to complete a task. The LangChain MCP adapters bridge these servers into LangChain workflows and tool catalogs, so you can plug them into an existing agent pipeline and let the LLM decide which tool to call based on the user's request.
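As an illustration of that multi-server flow, here is a hedged sketch in the style of multiclient.py. It assumes the `MultiServerMCPClient` API from `langchain-mcp-adapters`, `create_react_agent` from LangGraph, and `ChatOllama` from `langchain-ollama`; the model name, file path, and URL are placeholders, not taken from the repository:

```python
import asyncio

def build_server_config(math_script: str, weather_url: str) -> dict:
    """Connection entries in the shape MultiServerMCPClient expects."""
    return {
        # stdio: the client spawns the math server as a subprocess
        "math": {"command": "python", "args": [math_script], "transport": "stdio"},
        # streamable_http: the weather server runs separately and is reached over HTTP
        "weather": {"url": weather_url, "transport": "streamable_http"},
    }

async def main() -> None:
    # Imports deferred so build_server_config is usable without these packages installed.
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langgraph.prebuilt import create_react_agent
    from langchain_ollama import ChatOllama

    client = MultiServerMCPClient(
        build_server_config("mathserver.py", "http://localhost:8000/mcp")
    )
    tools = await client.get_tools()  # MCP tools surfaced as LangChain tools
    agent = create_react_agent(ChatOllama(model="llama3.2"), tools)
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "What is (3 + 5) * 12?"}]}
    )
    print(result["messages"][-1].content)

# To run the demo: asyncio.run(main())
```

The agent reads each tool's name and schema from the connected servers, so routing a request to the math server versus the weather server is left entirely to the LLM.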
How to install
Prerequisites:
- Python 3.8+ installed on your system
- Basic understanding of running Python scripts
- (Optional) virtual environment management tool (venv) for isolation
Step-by-step:
- Clone the repository and navigate to the project directory.
- Install Python dependencies if a requirements file exists (adjust as needed):
- python -m venv venv
- source venv/bin/activate # macOS/Linux
- .\venv\Scripts\activate # Windows
- pip install -r requirements.txt # if a requirements file is provided
- Ensure you have the demo scripts available: mathserver.py, weatherserver.py, client.py, and multiclient.py in the working directory.
- Run the math server in one terminal:
- python mathserver.py
- In another terminal, run the weather server:
- python weatherserver.py
- Run the client/demo orchestrator to see MCP in action (optional):
- python client.py
- If you want to exercise multi-server orchestration, run:
- python multiclient.py
Additional notes
Tips and considerations:
- The demo servers are intentionally lightweight to illustrate the MCP interaction pattern. They are suitable for local testing and demonstrations.
- If you modify the servers, ensure the ports or IPC mechanisms they use do not conflict with other running services on your machine.
- The LangChain MCP adapters help bridge these servers with LangChain/LLM workflows; you can expand the tool catalog by adding more servers following the same pattern.
- In a real deployment, consider securing the MCP communication channel, handling authentication, and managing credentials for external services.
- If you encounter issues with Python environments, try running within a virtual environment and ensure your Python path is correctly set.
- For troubleshooting, check server logs for startup messages and any exceptions raised by mathserver.py or weatherserver.py.