MCP_Host_Client_Server_SSE
A hand-rolled MCP example using SSE mode to access local data, with a complete code walkthrough from host to server.
claude mcp add --transport stdio --env API_KEY="your_api_key_here" hanshan006-mcp_host_client_server_sse -- python mcp_server.py
How to use
This project implements an MCP host-client-server architecture that uses Server-Sent Events (SSE) for real-time communication. The host integrates OpenAI to translate natural language questions into SQL queries, executes them against a local SQLite database, and returns the results along with natural-language explanations. The server sits between the client and the database, orchestrating calls to the OpenAI API (or a compatible endpoint) and streaming incremental updates to the client over SSE.
To use it, start the MCP server (mcp_server.py), then run the host/client application (mcp_host.py). The client sends a natural language query; the server forwards it through the MCP framework to the host, which asks OpenAI to generate SQL, executes that SQL against the SQLite database, and streams the results back to the client via SSE. Interactions are logged for traceability and debugging. Make sure a valid OpenAI API key is configured before running, or SQL generation will fail.
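The natural-language-to-SQL loop described above can be sketched as follows. This is a minimal illustration, not the repo's actual code: the function names, the prompt, and the `fake_llm` stand-in (which replaces the real OpenAI chat-completions call so the sketch runs offline) are all assumptions.

```python
import sqlite3

def question_to_sql(question, llm_call):
    # llm_call abstracts the LLM request (in the real system, an OpenAI
    # chat-completions call) so the pipeline can be exercised without a key.
    prompt = "Translate the user's question into a single SQLite SELECT statement."
    return llm_call(prompt, question).strip().rstrip(";")

def run_query(conn, sql):
    # Execute the generated SQL and return all rows.
    return conn.execute(sql).fetchall()

# Demo against an in-memory database with a canned "LLM" response.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.executemany("INSERT INTO users VALUES (?)", [("bob",), ("alice",)])
fake_llm = lambda system, user: "SELECT name FROM users ORDER BY name;"
sql = question_to_sql("List all user names", fake_llm)
print(run_query(conn, sql))  # [('alice',), ('bob',)]
```

In the real system the rows would be streamed back to the client over SSE rather than printed.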
How to install
Prerequisites:
- Python 3.8+
- pip (Python package manager)
- OpenAI API key
Installation steps:
- Clone the repository and navigate into it:
git clone https://github.com/HanShan006/MCP_Host_Client_Server_SSE.git
cd MCP_Host_Client_Server_SSE
- Create and activate a Python environment (optional but recommended):
python -m venv venv
# On Windows:
venv\Scripts\activate
# On macOS/Linux:
source venv/bin/activate
- Install dependencies:
pip install openai mcp-framework starlette uvicorn
- Configure your OpenAI API key by creating config.ini, or set it as an environment variable:
# config.ini
[secrets]
API_KEY=your_api_key_here
Or set in the environment:
export API_KEY=your_api_key_here
- Prepare the database using the script included in the repo:
python create_database.py
- Run the MCP server and host client:
python mcp_server.py
python mcp_host.py
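Loading the key from either source can be sketched with the standard-library configparser. The `[secrets]` section and `API_KEY` option names follow the config.ini snippet above, but how the repo's code actually reads them is an assumption.

```python
import configparser
import os

def load_api_key(path="config.ini"):
    # Prefer config.ini; fall back to the API_KEY environment variable.
    parser = configparser.ConfigParser()
    parser.read(path)  # a missing file is silently ignored
    return parser.get("secrets", "API_KEY", fallback=os.environ.get("API_KEY"))
```

With this fallback order, a key in config.ini wins over the environment, which matches the two configuration options listed above.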
Additional notes
- The configuration expects an API key for OpenAI access. If the key is not provided, requests to the OpenAI API will fail.
- The system uses SQLite; ensure the database file path (database.db by default) is writable and present.
- SSE streams will deliver incremental results; ensure your client supports EventSource or equivalent.
- If you modify config.ini, you may need to restart the MCP server to pick up changes.
- When debugging, enable logging to capture request/response traces for OpenAI and database queries.