nlp2sql
Enterprise-ready Natural Language to SQL converter with multi-provider AI support (OpenAI, Anthropic, Gemini). Built for production-scale databases (1000+ tables) with Clean Architecture.
claude mcp add --transport stdio luiscarbonel1991-nlp2sql python /path/to/nlp2sql/mcp_server/server.py \
--env OPENAI_API_KEY="${OPENAI_API_KEY}" \
--env NLP2SQL_DEFAULT_DB_URL="postgresql://user:pass@localhost:5432/mydb"
How to use
nlp2sql exposes a Model Context Protocol (MCP) server that lets an AI assistant query and manipulate a PostgreSQL (or compatible) database via natural language. The server provides a small set of built-in tools designed for conversational data access:
- ask_database: execute ad hoc questions against the database structure and data
- explore_schema: inspect tables, columns, and relationships
- run_sql: execute arbitrary SQL with a given context
- list_databases: enumerate available databases
- explain_sql: generate human-friendly explanations of SQL queries
These tools let a chat-based interface understand the schema, craft queries, and reason about results without exposing raw SQL to end users. To use the MCP server, start it with the correct environment variables (API keys for your AI providers and a database connection string). Your MCP-enabled AI assistant can then invoke the tools as needed to answer questions such as "Show active users" or "Explain the revenue trend by month."
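Under the hood, an MCP client invokes these tools with JSON-RPC `tools/call` requests over stdio. A minimal sketch of such a request for ask_database follows; the `question` argument name is an assumption for illustration, so check the tool's actual input schema:

```python
import json

# Sketch of the JSON-RPC message an MCP client sends to invoke ask_database.
# The "question" argument name is an assumption; nlp2sql's real tool schema
# may use a different parameter name.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "ask_database",
        "arguments": {"question": "Show active users"},
    },
}
print(json.dumps(request))
```

In practice your MCP client (e.g. Claude) constructs and sends these messages for you; the sketch only shows the wire shape.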
How to install
Prerequisites:
- Python 3.9+ installed on the host machine
- Access to an AI provider API (e.g., OpenAI) and a valid API key
- A PostgreSQL or compatible database reachable from the host
Installation steps:
1. Install Python and a virtual environment tool (optional but recommended):
   - On macOS/Linux: ensure Python 3.9+ is installed
   - On Windows: install Python 3.9+ from the official installer
2. Create and activate a virtual environment (optional but recommended):
   - python3 -m venv venv
   - source venv/bin/activate  # Unix
   - venv\Scripts\activate     # Windows
3. Install the nlp2sql package (or clone the repository and install locally):
   - pip install nlp2sql                   # if provided as a PyPI package
   - or: pip install -e /path/to/nlp2sql   # install from source
4. Prepare environment variables and configuration:
   - Set OPENAI_API_KEY (and any other provider keys you plan to use)
   - Set NLP2SQL_DEFAULT_DB_URL to your database connection string
5. Run the MCP server (example using Python):
   - python /path/to/nlp2sql/mcp_server/server.py
6. Verify the MCP server is up by checking logs or attempting a test invocation via MCP client tooling.
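Before running the server, it can help to confirm the environment is complete. Below is a minimal sketch of such a preflight check; `preflight` is a hypothetical helper, not part of nlp2sql, and the variable names match the steps above:

```python
import os
from urllib.parse import urlsplit

# Environment variables the server expects, per the installation steps above.
REQUIRED = ("OPENAI_API_KEY", "NLP2SQL_DEFAULT_DB_URL")

def preflight(env):
    """Return a list of configuration problems; an empty list means ready to launch."""
    problems = [f"missing {name}" for name in REQUIRED if not env.get(name)]
    url = env.get("NLP2SQL_DEFAULT_DB_URL", "")
    if url and urlsplit(url).scheme not in ("postgresql", "postgres"):
        problems.append("NLP2SQL_DEFAULT_DB_URL is not a postgresql:// URL")
    return problems

if __name__ == "__main__":
    # Check the real process environment before starting the server.
    for problem in preflight(os.environ):
        print(problem)
```

Run it in the same shell (and virtual environment) where you intend to start the server, so it inspects the same process environment.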
Additional notes
- The MCP server expects certain environment variables to be present (for AI providers and DB connection). Ensure OPENAI_API_KEY (and others as needed) are exported in the environment where the server runs.
- The mcp_config shown here maps the nlp2sql MCP server to the path /path/to/nlp2sql/mcp_server/server.py. Replace with the actual path in your deployment.
- If you are behind a corporate network or firewall, ensure outbound access to the AI provider endpoints and your database is allowed.
- For large schemas or expensive embeddings, evaluate which provider (OpenAI, Anthropic, or Gemini) best fits your workload, and apply schema_filters in your app logic to limit how much of the schema is sent to the model.
- Monitor the MCP server logs for tool invocations (ask_database, explore_schema, run_sql, etc.) to tune performance and catch errors early.
- If you update API keys, restart the MCP server to ensure new credentials are picked up.
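For MCP clients that read a JSON configuration file (such as Claude Desktop's claude_desktop_config.json), an entry for this server might look like the following sketch; the server name, path, and credential values are placeholders to replace for your deployment:

```json
{
  "mcpServers": {
    "nlp2sql": {
      "command": "python",
      "args": ["/path/to/nlp2sql/mcp_server/server.py"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "NLP2SQL_DEFAULT_DB_URL": "postgresql://user:pass@localhost:5432/mydb"
      }
    }
  }
}
```

This is equivalent to the claude mcp add command shown at the top of this page, expressed as a config file instead of a CLI invocation.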
Related MCP Servers
gemini-cli
An open-source AI agent that brings the power of Gemini directly into your terminal.
deep-research
Use any LLMs (Large Language Models) for Deep Research. Support SSE API and MCP server.
SearChat
Search + Chat = SearChat (AI chat with search). Supports OpenAI/Anthropic/VertexAI/Gemini, DeepResearch, the SearXNG metasearch engine, and one-click Docker deployment.
pty
pty-mcp-server
sequel
MCP Database servers for Claude, Cursor and Windsurf
oura
Oura Ring Model Context Protocol (MCP) server.