turbular
An MCP server allowing LLM agents to easily connect to and retrieve data from any database
claude mcp add --transport stdio raeudigerraeffi-turbular -- python -m uvicorn app.main:app --reload
How to use
Turbular is a Python-based MCP server that exposes a FastAPI application to manage and query multiple databases through a single unified API. It provides endpoints for retrieving database schemas, executing queries with optional normalization, and uploading credentials (e.g., BigQuery keys) as part of its data source management. This makes it easier for AI agents or LLM-powered tools to interact with PostgreSQL, MySQL, SQLite, BigQuery, Oracle, MS SQL, and Redshift through a consistent interface. The server supports fast iteration in development (via uvicorn with reload) as well as Docker deployments as described in the repository, making it quick to add new data sources or providers.
To use the API, start the server and call the exposed endpoints: POST /get_schema to retrieve a connected database schema, POST /execute_query to run SQL against a connected database (with optional normalization and result limits), and endpoints like POST /upload-bigquery-key or POST /upload-sqlite-file to provision credentials or data sources. The API also offers health checks at GET /health and a listing of supported databases at GET /supported-databases. These tools enable AI-assisted data access, transformation, and retrieval through a single, consistent interface across multiple database backends.
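As a minimal sketch of the query flow described above, the snippet below assembles a request body for POST /execute_query and sends it with the standard library. The field names (connection_id, query, normalize, max_rows) are illustrative assumptions, not the server's confirmed schema; check the running server's /docs page for the actual request format.

```python
import json
from urllib import request

BASE_URL = "http://localhost:8000"  # default uvicorn address

def build_query_request(connection_id: str, sql: str,
                        normalize: bool = True, max_rows: int = 100) -> dict:
    """Assemble a body for POST /execute_query (field names are assumptions)."""
    return {
        "connection_id": connection_id,
        "query": sql,
        "normalize": normalize,
        "max_rows": max_rows,
    }

def execute_query(body: dict) -> dict:
    """POST the body to /execute_query and decode the JSON response."""
    req = request.Request(
        f"{BASE_URL}/execute_query",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

# Build a request body; actually sending it requires a running server:
body = build_query_request("analytics-db", "SELECT 1", max_rows=10)
```

The same pattern applies to the other endpoints (e.g., POST /get_schema), swapping in the appropriate path and body.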
How to install
Prerequisites:
- Python 3.11 or higher
- git
- Access to install Python packages (pip)
Step-by-step installation:
- Clone the repository
git clone https://github.com/raeudigerRaeffi/turbular.git
cd turbular
- Create and activate a Python environment (recommended)
# On macOS/Linux
python -m venv .venv
source .venv/bin/activate
# On Windows
python -m venv .venv
.\.venv\Scripts\activate
- Install dependencies
pip install -r requirements.txt
- Run the server (development mode with auto-reload)
uvicorn app.main:app --reload
Optional Docker-based development:
- If you prefer Docker, follow the repository's Docker usage instructions (docker-compose.dev.yml) to start the development environment and PostgreSQL instance.
Note: When running locally, ensure any required environment variables (such as database credentials or credential file paths) are set for your deployment.
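For illustration only, a local environment might export variables like the following; these names are assumptions for the sketch, not names the repository defines, so check its configuration docs for the real ones.

```shell
# Illustrative environment variables -- the actual names your deployment
# expects may differ; consult the repository's configuration docs.
export POSTGRES_USER=turbular
export POSTGRES_PASSWORD=change-me
export POSTGRES_DB=turbular_dev
export BIGQUERY_KEY_PATH=/path/to/service-account.json
```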
Additional notes
Tips and common considerations:
- The server exposes multiple endpoints for database interactions (e.g., /get_schema, /execute_query, /upload-bigquery-key, /upload-sqlite-file). Use these to provision data sources and run LLM-assisted queries.
- For Docker deployments, use the recommended docker-compose.dev.yml during development to spin up the API plus a test database.
- When connecting to remote databases, ensure SSL and authentication are configured correctly; the README highlights the server's security-focused connection handling.
- If you extend the data sources, you can implement new connectors by following the BaseDBConnector interface described in the repository.
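The repository's actual BaseDBConnector interface is not reproduced here; as a rough sketch, a new connector might subclass an interface shaped like the one below (the method names are assumptions), illustrated with the stdlib sqlite3 driver:

```python
import sqlite3
from abc import ABC, abstractmethod

class BaseDBConnector(ABC):
    """Sketch of a connector interface; the real one in the repo may differ."""

    @abstractmethod
    def get_schema(self) -> dict:
        """Return {table_name: [column_names]} for the connected database."""

    @abstractmethod
    def execute_query(self, sql: str) -> list:
        """Run SQL and return the result rows."""

class SQLiteConnector(BaseDBConnector):
    def __init__(self, path: str = ":memory:"):
        self.conn = sqlite3.connect(path)

    def get_schema(self) -> dict:
        tables = [r[0] for r in self.conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")]
        # PRAGMA table_info rows are (cid, name, type, ...); keep the names.
        return {
            t: [c[1] for c in self.conn.execute(f"PRAGMA table_info({t})")]
            for t in tables
        }

    def execute_query(self, sql: str) -> list:
        return list(self.conn.execute(sql))

db = SQLiteConnector()
db.execute_query("CREATE TABLE users (id INTEGER, name TEXT)")
```

A real connector would also handle connection pooling, credentials, and dialect quirks per backend.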
- Access the API docs via the automatic OpenAPI/Swagger UI once the server is running (typically at http://localhost:8000/docs).
- Use the health endpoint GET /health to quickly verify that the API is running in your environment.
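A quick way to script that health check, assuming the default uvicorn address; the helper returns None instead of raising when the server is unreachable:

```python
import json
from urllib import request

def check_health(url: str = "http://localhost:8000/health",
                 timeout: float = 2.0):
    """Return the decoded /health response, or None if unreachable."""
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            return json.load(resp)
    except OSError:
        return None  # connection refused or timed out -> server not running
```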
Related MCP Servers
npcpy
The python library for research and development in NLP, multimodal LLMs, Agents, ML, Knowledge Graphs, and more.
time
⏰ Time MCP Server: Giving LLMs Time Awareness Capabilities
hyperterse
The MCP framework. Connect your data to your agents.
mcp-konnect
A Model Context Protocol (MCP) server for interacting with Kong Konnect APIs, allowing AI assistants to query and analyze Kong Gateway configurations, traffic, and analytics.
agentic-commerce-protocol-demo
Reference implementation of OpenAI's Agentic Commerce Protocol (ACP)
Python-Runtime-Interpreter
PRIMS is a lightweight, open-source Model Context Protocol (MCP) server that lets LLM agents safely execute arbitrary Python code in a secure, throw-away sandbox.