
WebwithMCP

An intelligent assistant web application built on FastAPI with a vanilla-JS frontend, supporting real-time conversation, MCP tool invocation, and conversation history management. Works out of the box, is easy to extend, and suits AI tool integration and conversational AI scenarios.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio guangxiangdebizi-webwithmcp uvx main:app \
  --env BACKEND_PORT="8003" \
  --env OPENAI_MODEL="your-model" \
  --env OPENAI_API_KEY="your-api-key-here" \
  --env OPENAI_TIMEOUT="60" \
  --env OPENAI_BASE_URL="https://api.example.com/v1" \
  --env OPENAI_TEMPERATURE="0.2"

How to use

WebwithMCP is a Python-based MCP server that powers a web chat interface with real-time messaging, tool invocation, and history management. The backend exposes an MCP-enabled FastAPI application that communicates with MCP tool services over the MCP protocol and serves as the core intelligence for multi-tool orchestration. The frontend connects via WebSocket to the backend for streaming AI responses and tool progress, while REST endpoints provide access to tools and conversation history. To get started, run the backend server (the MCP-enabled FastAPI app) and load the frontend configuration so the client can connect to the correct backend. The system will automatically load configured MCP tools from backend/mcp.json and enable dynamic tool usage in conversations.
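The automatic tool loading described above can be sketched in Python. The actual schema of backend/mcp.json is not documented here, so the `mcpServers` layout and the server names below are assumptions modeled on common MCP client configurations:

```python
import json
from pathlib import Path

# Hypothetical mcp.json contents -- the project's real schema may differ.
# Many MCP configs use an "mcpServers" mapping of name -> launch spec.
EXAMPLE_CONFIG = {
    "mcpServers": {
        "time": {"command": "uvx", "args": ["mcp-server-time"]},
        "fetch": {"command": "uvx", "args": ["mcp-server-fetch"]},
    }
}


def load_mcp_servers(path: Path) -> dict:
    """Parse an mcp.json file and return the configured server specs."""
    with path.open(encoding="utf-8") as f:
        config = json.load(f)
    return config.get("mcpServers", {})


if __name__ == "__main__":
    path = Path("mcp.json")
    path.write_text(json.dumps(EXAMPLE_CONFIG), encoding="utf-8")
    servers = load_mcp_servers(path)
    print(sorted(servers))  # names of the configured tool servers
```

At startup the backend would iterate over these entries and launch each tool server, which is why adding a tool requires a restart (see the notes below).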

How to install

Prerequisites:

  • Python 3.8+ with pip
  • Optional: Node.js for frontend development (not required for basic usage)

Steps:

  1. Clone the repository
  2. Install backend dependencies
     • cd backend
     • python -m venv venv
     • source venv/bin/activate (on macOS/Linux) or venv\Scripts\activate (on Windows)
     • pip install -r requirements.txt
  3. Prepare environment variables
     • Create a .env file in the repository root (or use the example) and set:
       OPENAI_API_KEY=your-api-key-here
       OPENAI_BASE_URL=https://api.example.com/v1
       OPENAI_MODEL=your-model
       OPENAI_TEMPERATURE=0.2
       OPENAI_TIMEOUT=60
       BACKEND_PORT=8003
  4. Configure MCP servers in backend/mcp.json (optional)
  5. Start the backend (MCP-enabled FastAPI app)
     • uvicorn main:app --reload --host 0.0.0.0 --port 8003
  6. (Optional) Start the frontend
     • If you have the frontend locally:
       • Ensure frontend/config.json points to the backend at http://localhost:8003
       • Serve the frontend via any static server (e.g. python -m http.server 8080 from the frontend directory) or any static hosting solution
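A minimal stdlib sketch of how the backend might read the .env values from step 3 (real projects typically use python-dotenv or pydantic-settings; the parser and defaults here are illustrative):

```python
import os


def parse_env_file(text: str) -> dict:
    """Parse simple KEY=value lines, skipping blanks and comments."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values


def get_settings(env: dict) -> dict:
    """Apply defaults matching the variables listed in the steps above."""
    return {
        "api_key": env.get("OPENAI_API_KEY", ""),
        "base_url": env.get("OPENAI_BASE_URL", "https://api.example.com/v1"),
        "model": env.get("OPENAI_MODEL", "your-model"),
        "temperature": float(env.get("OPENAI_TEMPERATURE", "0.2")),
        "timeout": int(env.get("OPENAI_TIMEOUT", "60")),
        "port": int(env.get("BACKEND_PORT", "8003")),
    }


if __name__ == "__main__":
    sample = "OPENAI_API_KEY=sk-test\nBACKEND_PORT=8003\n"
    print(get_settings(parse_env_file(sample)))
```

Unset variables fall back to defaults, so a .env containing only OPENAI_API_KEY is enough for a first run against the default endpoint.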

After these steps, open the frontend in a browser and connect to the backend WebSocket at /ws/chat. The MCP-enabled server will handle conversations, trigger tool calls as needed, and store chat history in SQLite as configured by the project.
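Chat-history persistence in SQLite might look like the following sketch; the table layout and column names are assumptions, not the project's actual schema:

```python
import sqlite3

# Illustrative schema -- the project's real tables may differ.
SCHEMA = """
CREATE TABLE IF NOT EXISTS messages (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    conversation_id TEXT NOT NULL,
    role TEXT NOT NULL,          -- 'user', 'assistant', or 'tool'
    content TEXT NOT NULL,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
"""


def save_message(conn, conversation_id, role, content):
    """Append one message to a conversation."""
    conn.execute(
        "INSERT INTO messages (conversation_id, role, content) VALUES (?, ?, ?)",
        (conversation_id, role, content),
    )
    conn.commit()


def load_history(conn, conversation_id):
    """Return a conversation's messages in insertion order."""
    rows = conn.execute(
        "SELECT role, content FROM messages WHERE conversation_id = ? ORDER BY id",
        (conversation_id,),
    )
    return [{"role": r, "content": c} for r, c in rows]


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
    save_message(conn, "conv-1", "user", "Hello")
    save_message(conn, "conv-1", "assistant", "Hi! How can I help?")
    print(load_history(conn, "conv-1"))
```

Because SQLite writes to a file in the project directory, this is also why the backend process needs write permissions there (see the notes below).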

Additional notes

Tips and caveats:

  • The MCP tool configuration lives in backend/mcp.json; adding or updating tools requires restarting the backend so MCP tools are reloaded.
  • Environment variables control the AI API integration; ensure your API key and endpoint are correct for OpenAI-compatible interfaces.
  • The backend uses SQLite for chat history; ensure the backend process has write permissions to the project directory.
  • If using Docker, you can containerize the Python backend and expose port 8003 to your reverse proxy; update mcp_config accordingly.
  • The frontend is optional for development; you can serve static HTML/JS directly as described in the README.
  • If you modify MCP configurations or add new tools, the frontend does not require changes—the server will expose new tools automatically after a restart.
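The Docker tip above could start from a sketch like this; the file paths assume requirements.txt, main.py, and mcp.json all live under backend/, which may not match the actual repository layout:

```dockerfile
# Assumed layout: backend/ contains requirements.txt, main.py, and mcp.json.
FROM python:3.11-slim

WORKDIR /app
COPY backend/requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY backend/ .

EXPOSE 8003
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8003"]
```

Run with `docker run -p 8003:8003 --env-file .env <image>` so the environment variables from the install steps reach the container, and point your reverse proxy at port 8003.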
