# mcp

This repository shows how to use the Model Context Protocol (MCP).
```bash
claude mcp add --transport stdio kyopark2014-mcp python -m application.mcp_server
```
## How to use
This repository implements a lightweight, Python-based MCP server that exposes a small toolset for LangGraph-driven agents. It uses the MCP protocol to describe and serve capabilities (tools), resources, and prompts to MCP clients. Through the LangChain MCP Adapter, the server's tools can be wrapped as LangGraph-compatible nodes, letting agents query local data, call external APIs, and orchestrate RAG-style knowledge retrieval. Because the server can be driven over stdio, clients can launch and invoke the Python process directly without keeping a long-running server running.
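The shape of such a server can be sketched schematically. The snippet below is plain Python, not the actual `mcp` SDK, and the `add` tool is hypothetical; it only illustrates how a tool registry pairs an advertised JSON-Schema description with a handler:

```python
# Schematic sketch of an MCP-style tool registry (not the real mcp SDK).
# Each entry pairs a JSON-Schema description, which the server advertises
# to clients, with the Python callable that actually runs.
TOOLS = {
    "add": {
        "description": "Add two integers.",
        "inputSchema": {
            "type": "object",
            "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
            "required": ["a", "b"],
        },
        "handler": lambda args: args["a"] + args["b"],
    },
}

def list_tools():
    """Return the tool descriptions a client would see (handlers omitted)."""
    return [
        {"name": name, "description": t["description"], "inputSchema": t["inputSchema"]}
        for name, t in TOOLS.items()
    ]

def call_tool(name, arguments):
    """Dispatch a tool invocation to its registered handler."""
    return TOOLS[name]["handler"](arguments)

print(call_tool("add", {"a": 2, "b": 3}))  # → 5
```

In the real server, the SDK handles the wire protocol; this sketch only shows the registry/dispatch split that the advertised tool list reflects.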
To use it, configure your MCP client to read the server configuration (mcp.config) and connect over the chosen transport (stdio in the example). The server exposes tools (capabilities) that the client can list and invoke, and it can coordinate with Lambda-based or remote knowledge sources for retrieval, grading, and generation steps, as described in the repository documentation. Once connected, you can extend the agent by adding or annotating tools, resources, and prompts to support your application's data sources and workflows.
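Under the hood, MCP traffic is JSON-RPC 2.0; over stdio the client writes requests to the server's stdin and reads responses from its stdout. A minimal sketch of the request shapes a client sends (method names follow the MCP spec; the `add` tool and the exact payloads are simplified examples):

```python
import json

def make_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request as a single line of JSON."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# A client sends these, in order, over the server's stdin:
init = make_request(1, "initialize", {"protocolVersion": "2024-11-05"})
listing = make_request(2, "tools/list")
call = make_request(3, "tools/call", {"name": "add", "arguments": {"a": 2, "b": 3}})

for line in (init, listing, call):
    parsed = json.loads(line)
    assert parsed["jsonrpc"] == "2.0"
    print(parsed["method"])  # prints: initialize, tools/list, tools/call
```

The server replies with matching `id` values, so the client can correlate responses to requests even though everything shares one byte stream.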
## How to install
Prerequisites:
- Python 3.8+ installed on your system
- pip (Python package manager) available
- Optional: virtualenv for isolated environments
Step-by-step installation:

```bash
# 1. Create a virtual environment (recommended)
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# 2. Upgrade pip
pip install --upgrade pip

# 3. Install MCP and the LangChain MCP Adapters
pip install mcp langchain-mcp-adapters
```
If you plan to use the AWS Lambda/RAG integrations described in the README, also install the AWS SDK:

```bash
pip install boto3
```
To run the MCP server (as defined in this repository):

```bash
# Ensure you are in the project root and the module path matches the server entry
python -m application.mcp_server
```
Environment configuration (example):
- You can customize log level, transport type, and other runtime options via environment variables or a config file as needed.
Notes:
- The server is designed to work with the LangChain MCP Adapter; ensure compatibility with your LangGraph setup.
- If you use stdio transport, the client starts and communicates with the Python process via standard input/output streams.
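For stdio, a typical client configuration looks roughly like the following `mcpServers` fragment (field names vary slightly between MCP clients, and the server name here mirrors the `claude mcp add` example rather than any required value):

```json
{
  "mcpServers": {
    "kyopark2014-mcp": {
      "transport": "stdio",
      "command": "python",
      "args": ["-m", "application.mcp_server"]
    }
  }
}
```

With this in place, the client spawns the Python process itself and exchanges MCP messages over its stdin/stdout.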
## Additional notes
Tips and common issues:
- Ensure the Python entry point matches the module path used in mcpServers (e.g., -m application.mcp_server).
- If using SSE transport, confirm network reachability and proper port exposure for client connections.
- When upgrading dependencies, verify compatibility with langchain-mcp-adapters to avoid API mismatches.
- Set MCP_LOG_LEVEL to DEBUG during development to capture verbose traces for troubleshooting.
- For Lambda-based knowledge sources, ensure IAM permissions and region configurations are correct, and that the Lambda function name matches the payload construction in your server code.
- If you encounter tool loading issues, verify that the server exposes the expected tool list via the MCP protocol (list_tools) before client invocation.
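One quick sanity check for tool-loading issues is to inspect the `tools/list` result before invoking anything. The sketch below validates a response payload; the `retrieve` tool and the response body are hypothetical examples of what a client might receive:

```python
import json

# A hypothetical tools/list response, as delivered to the client.
response = json.loads("""
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "tools": [
      {"name": "retrieve",
       "description": "RAG-style knowledge retrieval",
       "inputSchema": {"type": "object"}}
    ]
  }
}
""")

tools = response["result"]["tools"]
# Verify every advertised tool carries the fields clients rely on.
for tool in tools:
    assert {"name", "description", "inputSchema"} <= tool.keys()
print([t["name"] for t in tools])  # → ['retrieve']
```

If an expected tool name is missing here, the problem is on the server side (registration or module path), not in the client's invocation code.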