WebwithMCP
An intelligent assistant web application built on FastAPI with a vanilla front end, supporting real-time conversation, MCP tool invocation, and chat-history management. It works out of the box, is easy to extend, and suits AI tool integration and conversational AI scenarios.
```
claude mcp add --transport stdio guangxiangdebizi-webwithmcp uvx main:app \
  --env BACKEND_PORT="8003" \
  --env OPENAI_MODEL="your-model" \
  --env OPENAI_API_KEY="your-api-key-here" \
  --env OPENAI_TIMEOUT="60" \
  --env OPENAI_BASE_URL="https://api.example.com/v1" \
  --env OPENAI_TEMPERATURE="0.2"
```
How to use
WebwithMCP is a Python-based MCP server that powers a web chat interface with real-time messaging, tool invocation, and history management. The backend exposes an MCP-enabled FastAPI application that communicates with MCP tool services over the MCP protocol and serves as the core intelligence for multi-tool orchestration. The frontend connects via WebSocket to the backend for streaming AI responses and tool progress, while REST endpoints provide access to tools and conversation history. To get started, run the backend server (the MCP-enabled FastAPI app) and load the frontend configuration so the client can connect to the correct backend. The system will automatically load configured MCP tools from backend/mcp.json and enable dynamic tool usage in conversations.
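To make the client/server flow above concrete, here is a minimal sketch of how a frontend might build the WebSocket URL and a chat payload. It assumes the backend runs on localhost:8003 and the chat endpoint is /ws/chat (as described above); the JSON field names (`type`, `content`, `conversation_id`) are illustrative assumptions, not taken from the project's actual message schema.

```python
import json
from typing import Optional

# Assumed backend address; matches the default BACKEND_PORT=8003 in the docs.
BACKEND_HOST = "localhost:8003"


def chat_ws_url(host: str = BACKEND_HOST) -> str:
    """Build the WebSocket URL a client would connect to for streaming chat."""
    return f"ws://{host}/ws/chat"


def build_chat_message(text: str, conversation_id: Optional[str] = None) -> str:
    """Serialize a chat message as JSON; field names here are hypothetical."""
    payload = {"type": "chat", "content": text}
    if conversation_id is not None:
        payload["conversation_id"] = conversation_id
    return json.dumps(payload, ensure_ascii=False)
```

A real client would send the serialized payload over the WebSocket and read streamed response frames (AI tokens and tool-progress events) until the turn completes.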
How to install
Prerequisites:
- Python 3.8+ with pip
- Optional: Node.js for frontend development (not required for basic usage)
- Clone the repository
- git clone https://github.com/guangxiangdebizi/WebwithMCP.git
- cd WebwithMCP
- Install backend dependencies
- cd backend
- python -m venv venv
- source venv/bin/activate (on macOS/Linux) or venv\Scripts\activate (on Windows)
- pip install -r requirements.txt
- Prepare environment variables
- Create a .env file in the repository root (or use the example) and set:
```
OPENAI_API_KEY=your-api-key-here
OPENAI_BASE_URL=https://api.example.com/v1
OPENAI_MODEL=your-model
OPENAI_TEMPERATURE=0.2
OPENAI_TIMEOUT=60
BACKEND_PORT=8003
```
- Configure MCP servers (optional)
- Edit backend/mcp.json to define your MCP tool servers. Example:
```
{
  "servers": {
    "finance-data-server": {
      "url": "http://106.14.205.176:3101/sse",
      "transport": "sse"
    },
    "your-custom-server": {
      "url": "http://your-server-url:port",
      "transport": "sse"
    }
  }
}
```
- Start the backend (MCP-enabled FastAPI app)
- uvicorn main:app --reload --host 0.0.0.0 --port 8003
- (Optional) Start the frontend
- If you have the frontend locally:
- Ensure frontend/config.json points to the backend at http://localhost:8003
- Serve frontend via a static server or any static hosting solution
After these steps, open the frontend in a browser and connect to the backend WebSocket at /ws/chat. The MCP-enabled server will handle conversations, trigger tool calls as needed, and store chat history in SQLite as configured by the project.
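The environment variables prepared above can be read by the backend at startup. Here is a minimal sketch, assuming the variable names from the .env example; the `Settings` dataclass, the defaults, and the idea that only OPENAI_API_KEY is strictly required are assumptions for illustration, not the project's actual configuration code.

```python
import os
from dataclasses import dataclass


@dataclass
class Settings:
    """Hypothetical container for the backend's environment-driven config."""
    api_key: str
    base_url: str
    model: str
    temperature: float
    timeout: int
    port: int


def load_settings() -> Settings:
    return Settings(
        # Assumed to be required; the others fall back to the documented defaults.
        api_key=os.environ["OPENAI_API_KEY"],
        base_url=os.getenv("OPENAI_BASE_URL", "https://api.example.com/v1"),
        model=os.getenv("OPENAI_MODEL", "your-model"),
        temperature=float(os.getenv("OPENAI_TEMPERATURE", "0.2")),
        timeout=int(os.getenv("OPENAI_TIMEOUT", "60")),
        port=int(os.getenv("BACKEND_PORT", "8003")),
    )
```

Loading configuration this way keeps secrets out of the repository and lets the same code run unchanged across development and deployment environments.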
Additional notes
Tips and caveats:
- The MCP tool configuration lives in backend/mcp.json; adding or updating tools requires restarting the backend so MCP tools are reloaded.
- Environment variables control the AI API integration; ensure your API key and endpoint are correct for OpenAI-compatible interfaces.
- The backend uses SQLite for chat history; ensure the backend process has write permissions to the project directory.
- If using Docker, you can containerize the Python backend and expose port 8003 to your reverse proxy; update the MCP configuration in backend/mcp.json accordingly.
- The frontend is optional for development; you can serve static HTML/JS directly as described in the README.
- If you modify MCP configurations or add new tools, the frontend does not require changes—the server will expose new tools automatically after a restart.
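Since chat history is stored in SQLite (as noted above), it can be inspected with Python's standard sqlite3 module. The snippet below is a sketch only: the database path, the `messages` table, and its `id`/`role`/`content` columns are assumptions; check the backend code for the actual schema before querying a real database.

```python
import sqlite3


def recent_messages(db_path: str, limit: int = 10):
    """Return the most recent (role, content) rows from an assumed 'messages' table."""
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT role, content FROM messages ORDER BY id DESC LIMIT ?",
            (limit,),
        ).fetchall()
    finally:
        conn.close()
    return rows
```

Read-only inspection like this is useful for debugging tool-call turns without going through the web UI; remember that the backend process needs write access to the same file.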
Related MCP Servers
PPTAgent
An Agentic Framework for Reflective PowerPoint Generation
excel
A Model Context Protocol server for Excel file manipulation
mcp-neo4j
Neo4j Labs Model Context Protocol servers
weather
A lightweight Model Context Protocol (MCP) server that enables AI assistants like Claude to retrieve and interpret real-time weather data.
mcp-odoo
A Model Context Protocol (MCP) server that enables AI assistants to securely interact with Odoo ERP systems through standardized resources and tools for data retrieval and manipulation.
AI-web mode
An intelligent conversational assistant web application based on MCP (Model Context Protocol), supporting real-time chat, tool invocation, and conversation-history management.