mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.
To register the bundled Weather server with Claude Code over stdio:

claude mcp add --transport stdio --env PYTHONUNBUFFERED=1 elkhn-mcp-playground -- python servers/weather/main.py
How to use
MCP Playground is a Streamlit-based interface that lets you chat with large language models and dynamically invoke external MCP tools. It spins up two FastMCP servers (Weather Service and Currency Exchange) alongside a Streamlit client, all orchestrated via Docker Compose. The MCP tools register with the agent so that, as you chat, the ReAct agent automatically detects the available MCP tools and routes relevant calls to them. You can inspect tool invocations as YAML blocks streamed back during a session, and you can add more MCP servers through the UI if desired. The client is provider-agnostic, supporting providers such as OpenAI, Bedrock, Anthropic, Google Gemini, and Groq through LangChain and LangGraph.
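The tool routing described above can be sketched as a minimal, framework-free registry. This is purely illustrative: the names (`Tool`, `register`, `dispatch`) and the stand-in weather/currency functions are hypothetical, not the actual LangChain/LangGraph or FastMCP API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str
    func: Callable[..., str]

# Each MCP server contributes tools; the agent looks one up by name
# when the model requests a call.
REGISTRY: dict[str, Tool] = {}

def register(tool: Tool) -> None:
    """Make a tool discoverable by the agent."""
    REGISTRY[tool.name] = tool

def dispatch(name: str, **kwargs) -> str:
    """Route a model-requested call to the matching registered tool."""
    if name not in REGISTRY:
        return f"unknown tool: {name}"
    return REGISTRY[name].func(**kwargs)

# Stand-ins for the bundled Weather and Currency tools.
register(Tool("get_weather", "Current weather for a city",
              lambda city: f"weather({city})"))
register(Tool("convert", "Convert an amount between currencies",
              lambda amount, frm, to: f"{amount} {frm}->{to}"))

print(dispatch("get_weather", city="Oslo"))  # weather(Oslo)
```

In the real app this lookup happens inside the ReAct loop, and each invocation is echoed back to the chat as a YAML block.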
How to install
Prerequisites:
- Docker (version 24+ recommended) and Docker Compose
- Access/credentials for an LLM provider (e.g., OPENAI_API_KEY) or AWS credentials for Bedrock
- Clone the repository:
  git clone https://github.com/your-org/mcp-playground.git
  cd mcp-playground
- Build and run the stack with Docker Compose:
  docker compose up --build
- Access the applications:
  - Streamlit Client: http://localhost:8501
  - Weather MCP: http://localhost:8000
  - Currency MCP: http://localhost:8001
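The three services above map to three Compose entries. A sketch of what such a file could look like follows; the service names and build paths are assumptions, not the project's actual docker-compose.yaml, though the ports match those listed above.

```yaml
services:
  client:
    build: ./client          # Streamlit chat UI
    ports:
      - "8501:8501"
    depends_on: [weather, currency]
  weather:
    build: ./servers/weather # FastMCP Weather Service
    ports:
      - "8000:8000"
  currency:
    build: ./servers/currency # FastMCP Currency Exchange
    ports:
      - "8001:8001"
```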
Note: The project uses uv for dependency installation inside the containers to speed up builds.
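As a rough idea of how uv typically speeds up container builds (the actual Dockerfiles may differ; the base image, file names, and entrypoint here are assumptions):

```dockerfile
FROM python:3.12-slim
# Copy the uv binary from the official image.
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
WORKDIR /app
# Copy only dependency manifests first so this layer is cached
# until pyproject.toml or uv.lock actually changes.
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen
COPY . .
CMD ["uv", "run", "streamlit", "run", "app.py"]
```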
Additional notes
Tips and considerations:
- Ensure your LLM provider keys are available in the environment (e.g., OPENAI_API_KEY) or that you have AWS credentials for Bedrock.
- The UI allows dynamic MCP server management; you can add/remove servers through the sidebar without editing config files.
- If you run into port conflicts, adjust the docker-compose.yaml ports or disable conflicting services.
- The included MCP servers (Weather and Currency) expose endpoints on 8000 and 8001 respectively; you can implement additional MCP servers by following the same pattern and updating servers_config or the UI.
- For best performance, keep the uv-based dependency management; Docker layer caching keeps rebuilds fast, since dependencies are reinstalled only when pyproject.toml or the lockfile changes.
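When adding a server outside the UI, a new entry in the server configuration is typically all that is needed. The snippet below is a guess at what such an entry could look like, modeled on the common MCP client config shape; the key names, URLs, and the exact structure of this project's servers_config are assumptions.

```json
{
  "mcpServers": {
    "weather":  { "transport": "streamable_http", "url": "http://localhost:8000/mcp" },
    "currency": { "transport": "streamable_http", "url": "http://localhost:8001/mcp" },
    "my-new-server": { "transport": "streamable_http", "url": "http://localhost:8002/mcp" }
  }
}
```

The new server itself would follow the same FastMCP pattern as the bundled Weather and Currency services, exposed on its own port.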
Related MCP Servers
AgentChat
AgentChat is an LLM-based agent communication platform with a built-in default Agent and support for user-defined Agents. Through multi-turn dialogue and task collaboration, Agents can understand and help complete complex tasks. The project integrates LangChain, Function Calling, the MCP protocol, RAG, Memory, Milvus, and ElasticSearch for efficient knowledge retrieval and tool invocation, with a high-performance backend built on FastAPI.
mcp-manager
CLI tool for managing Model Context Protocol (MCP) servers in one place and using them across different clients
mcp-community
Easily run, deploy, and connect to MCP servers
knowledgebase
BioContextAI Knowledgebase MCP server for biomedical agentic AI
Ai_agents
A collection of agents built on top of LangChain, LangGraph, MCP, and other tools
mcp
MCP server from gaurisharan/mcp