
langchain_data_agent

NL2SQL - Ask questions in plain English, get SQL queries and results. Powered by LangGraph.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio eosho-langchain_data_agent python -m data_agent \
  --env AZURE_OPENAI_API_KEY="your-api-key" \
  --env AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/" \
  --env AZURE_OPENAI_DEPLOYMENT="gpt-4o"

How to use

This MCP server provides a natural language to SQL (NL2SQL) agent built on LangGraph and Azure OpenAI. It orchestrates multiple data backends and data agents, automatically routing user questions to the appropriate database dialect and returning optimized SQL queries and results. Users can query across PostgreSQL, Azure SQL, Synapse, Cosmos DB, Databricks SQL, and BigQuery, with safe query generation, dialect validation, and optional data visualization. The system supports multi-turn conversations, intent detection, and A2A agent interoperability, making it suitable for powering analytics assistants or business intelligence chatbots.

To use it, start the server (via the Python module entry point) and leverage the included CLI and UI tools. The CLI lets you query data agents in natural language, list available configurations, and validate configuration files. You can also use the Chainlit-based web interface for interactive exploration, which exposes profiles like Contoso, Amex, and Adventure Works for exploring representative data scenarios. Configure your environment by supplying Azure OpenAI credentials and deployment names, then run queries or start interactive chats to see how questions are translated into SQL and executed against the configured backends.
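Before starting the server, it can help to confirm the required credentials are actually set. The variable names below come from this page's install command; the check itself is a minimal sketch, not part of the project's own validation logic:

```python
# Pre-flight check for the Azure OpenAI settings this README requires.
# (Illustrative only; data_agent performs its own configuration handling.)
import os

REQUIRED_VARS = [
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_DEPLOYMENT",
]

def missing_azure_settings(env=None):
    """Return the names of required Azure OpenAI variables that are unset or empty."""
    if env is None:
        env = os.environ
    return [name for name in REQUIRED_VARS if not env.get(name)]

if __name__ == "__main__":
    missing = missing_azure_settings()
    if missing:
        raise SystemExit(f"Missing required settings: {', '.join(missing)}")
    print("Azure OpenAI environment looks complete.")
```

Running this before `python -m data_agent` (or `uv run -m data_agent`) surfaces a missing key immediately instead of at the first model call.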

How to install

Prerequisites:

  • Python 3.12+
  • Git
  • Access to an Azure OpenAI deployment (endpoint and API key)
  • Optional: uv package manager for installation and extras

Install steps:

  1. Clone the repository:

    git clone https://github.com/eosho/langchain_data_agent
    cd langchain_data_agent

  2. Install Python dependencies (via uv or directly with Python):

    uv sync --all-extras

    or if you prefer pip:

    python -m pip install -r requirements.txt

  3. Create and configure environment variables:

    cp .env.example .env

    Edit .env with your Azure OpenAI endpoint, API key, and deployment name.

  4. Run the server

    Using uv (recommended for this project):

    uv run -m data_agent

  5. (Optional) Start the Chainlit UI for interactive exploration:

    chainlit run src/data_agent/ui/app.py

Notes:

  • Ensure your Azure OpenAI deployment supports the required model (e.g., gpt-4o).
  • The repository includes multiple YAML configs for contoso, amex, and adventure_works; you can validate or load specific configs using the data-agent CLI.
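The bundled `data-agent` CLI is the supported way to validate those YAML configs. If you want a quick standalone sanity check first, a basic parse can be sketched with PyYAML; the file path and the top-level-mapping expectation below are illustrative assumptions, not the project's real schema:

```python
# Basic YAML sanity check (a sketch; the project's `data-agent validate`
# command applies its own, stricter schema).
import yaml

def load_config(path: str) -> dict:
    """Load a YAML config file and confirm it parses to a mapping."""
    with open(path, encoding="utf-8") as fh:
        data = yaml.safe_load(fh)
    if not isinstance(data, dict):
        raise ValueError(f"{path}: expected a YAML mapping at the top level")
    return data
```

A check like this catches indentation or syntax mistakes before handing the file to the agent.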

Additional notes

Tips and caveats:

  • Environment variables AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY, and AZURE_OPENAI_DEPLOYMENT are required for Azure OpenAI integration.
  • Use the data-agent CLI to explore available configurations with data-agent configs and to validate YAML config files with data-agent validate.
  • The system supports safe SQL generation and dialect validation via sqlglot across multiple dialects; enable verbose output to inspect agent, SQL, and message history during debugging.
  • The Chainlit UI profiles (Contoso, Amex, Adventure Works) provide ready-made test scenarios for exploring NL2SQL capabilities against representative backends.
  • If you encounter authentication or connectivity issues with backends, verify network access, credentials, and the appropriate SQL dialect configuration in your config files.
