mcp-bridge
This is a pure MCP-based database access platform that provides secure SQL execution and schema introspection via the Model Context Protocol (MCP) using HTTP+SSE transport. The server acts as a data access layer while clients handle NLP-to-SQL conversion using LLMs.
claude mcp add --transport stdio nandeshkanagaraju-mcp-server-bridge \
  python -m uvicorn api.main:app --reload --port 8000 \
  --env DB_HOST="Database host" \
  --env DB_USER="Database user" \
  --env DB_PASSWORD="Database password" \
  --env OPENAI_API_KEY="Your OpenAI API key" \
  --env LOG_LEVEL="DEBUG|INFO|WARNING|ERROR (optional)"
How to use
Universal MCP: An Intelligent Database Gateway exposing a secure REST API and MCP protocol tools. This bridge server provides a natural-language-to-SQL translation endpoint, a conversation-aware memory store, and traditional MCP tooling for secure, validated SQL execution across multiple databases. Core capabilities:
- Natural Language API for converting questions into SQL queries
- Conversation history that persists across sessions
- Tooling to inspect schemas, describe tables, paginate data, and validate queries
Start the REST API with the integrated FastAPI server (via uvicorn) and interact with both the natural language endpoints and the MCP-style query endpoints through the same server instance.
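As a rough sketch of how a client could call the natural-language endpoint described above (the endpoint path is the one mentioned later in this README; the payload field names `question` and `session_id` are illustrative assumptions, not a confirmed API schema):

```python
import json
import urllib.request

def build_nl_query(question: str, session_id: str) -> dict:
    """Build a request payload for the NL endpoint.
    Field names are assumptions; check api/main.py for the real schema."""
    return {"question": question, "session_id": session_id}

def post_nl_query(base_url: str, payload: dict) -> bytes:
    """POST the payload to the natural-language endpoint and return the raw body."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/query/natural-language",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Build a payload locally; post_nl_query would be called against a running server,
# e.g. post_nl_query("http://localhost:8000", payload)
payload = build_nl_query("How many orders shipped last week?", "demo-session")
print(json.dumps(payload))
```

The `session_id` field reflects the conversation-aware memory store: reusing the same ID across requests is presumably how follow-up questions stay in context.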
How to install
Prerequisites:
- Python 3.11+
- MySQL server (or any supported database) with credentials
- Git
- Clone the repository:
git clone <repository-url>
cd universal-mcp
- Set up a Python virtual environment and install dependencies:
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
- Configure environment variables (example):
cp .env.example .env
nano .env
Ensure OPENAI_API_KEY and database credentials (DB_HOST, DB_USER, DB_PASSWORD) are set.
- Initialize and start the server:
# Start the REST API (uvicorn)
uvicorn api.main:app --reload --port 8000
- Optional: run the interactive chat client if one is available in scripts/:
# If a chat client exists, activate virtual environment and run
source venv/bin/activate
python scripts/chat.py
- Verify MCP tools are accessible at the REST endpoints and via MCP protocol as described in the README.
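Since the server reads its credentials from the environment, it can help to fail fast on a missing variable before starting uvicorn. A minimal pre-flight check (the variable list mirrors the ones named above; this helper is not part of the project):

```python
import os

# Variables this README says the server requires.
REQUIRED_VARS = ["OPENAI_API_KEY", "DB_HOST", "DB_USER", "DB_PASSWORD"]

def missing_vars(env) -> list:
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Demo with a sample mapping; in a real startup script pass os.environ instead.
sample = {"OPENAI_API_KEY": "sk-...", "DB_HOST": "localhost"}
print("missing:", missing_vars(sample))
```

Running `missing_vars(os.environ)` before launching uvicorn gives a clearer error than a database connection failure deep inside the server.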
Additional notes
Tips and considerations:
- The server uses a persistent file-based conversation store for memory across sessions; ensure the filesystem is writable.
- Security: only allow SELECT-like queries from LLM-generated inputs; enable the provided validation pipeline to prevent unsafe statements.
- Logging can be tuned with the LOG_LEVEL environment variable; by default, INFO-level logs are emitted to mcp_server.log.
- If connecting to a non-local database, ensure network access and proper credentials; update DB_HOST, DB_USER, and DB_PASSWORD accordingly.
- The REST API endpoint for natural language queries is typically at /api/v1/query/natural-language; use this to translate NL questions into SQL and execute them via the secure executor.
- For debugging, tail the log file: tail -f mcp_server.log
- If you modify the code, consider reloading uvicorn with --reload or restarting the server to pick up changes.
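The project's validation pipeline is its own code, but the "SELECT-like queries only" rule from the security tip above can be sketched roughly as follows (a simplified illustration, not the project's actual validator; a production check should parse the SQL rather than pattern-match it):

```python
import re

# Statement types that should never reach the database from LLM-generated input.
FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|truncate|create|grant|revoke)\b",
    re.IGNORECASE,
)

def is_safe_select(sql: str) -> bool:
    """Rough check: single statement, starts with SELECT (or WITH for CTEs),
    and contains no write/DDL keywords."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # reject multi-statement input outright
        return False
    if not re.match(r"(?i)^(select|with)\b", stripped):
        return False
    return not FORBIDDEN.search(stripped)

print(is_safe_select("SELECT id, name FROM users LIMIT 10"))  # True
print(is_safe_select("DROP TABLE users"))                     # False
```

Keyword blocklists like this are easy to bypass (comments, vendor-specific syntax), which is why the README's advice to run the provided validation pipeline, rather than relying on an ad-hoc filter, is the safer path.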
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) server for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding coding agents through an end-to-end requirements-to-implementation-plan pipeline.
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data.
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers, written using FastMCP.