steampipe
Enable AI assistants to explore and query your Steampipe data!
claude mcp add --transport stdio turbot-steampipe-mcp \
  --env STEAMPIPE_MCP_LOG_LEVEL=info \
  --env STEAMPIPE_MCP_WORKSPACE_DATABASE=postgresql://steampipe@localhost:9193/steampipe \
  -- npx -y @turbot/steampipe-mcp
How to use
This MCP server enables natural-language querying and analysis of Steampipe data. It exposes a set of Tools that let your AI assistant run SQL queries against Steampipe, list available Steampipe tables, inspect table schemas, and manage Steampipe plugins. Use the best_practices prompt to tune the LLM's behavior for Steampipe data, then ask questions such as which AWS accounts are visible, or request a report on EC2 instances and their attached storage. The included tools steer the model toward read-only SQL operations, giving safe access to your cloud data while enabling security, cost, and compliance insights across AWS, Azure, GCP, and many other providers.
Available Tools:
- steampipe_query: Run SQL queries (PostgreSQL syntax) against Steampipe. Provide a string input with your SQL query. Use CTEs for performance and to structure complex analytics.
- steampipe_table_list: List tables, with optional schema and filter parameters to narrow the results.
- steampipe_table_show: Show details for a specific table, including columns and data types. Optionally specify a schema.
- steampipe_plugin_list: List installed Steampipe plugins (data sources like AWS, GCP, Azure).
- steampipe_plugin_show: Show details for a specific plugin installation (version, memory, config).
Prompts and Capabilities:
- best_practices: Guidance for working with Steampipe data, including SQL style, caching, and when to use WITH clauses vs joins.
- Status resource: Check the current Steampipe connection status and the active connection string.
To use locally, ensure Steampipe is running and the MCP server is reachable by your AI assistant. The default connection is to a local Steampipe instance at postgresql://steampipe@localhost:9193/steampipe. For Turbot Pipes, provide a workspace connection string as an argument when starting the server. After updating the MCP server configuration, restart your assistant so the changes take effect.
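As a sketch, the Turbot Pipes connection string passed to the server is a standard PostgreSQL URL; assembled from its parts it looks like this (every value below is a placeholder taken from the examples in this document, not a real credential):

```shell
# Assemble a Turbot Pipes connection string from its parts.
# All values are placeholders matching the examples in this document.
user="my_name"
password="my_pw"
host="workspace-name.usea1.db.pipes.turbot.com"
port="9193"
dbname="abc123"
conn="postgresql://${user}:${password}@${host}:${port}/${dbname}"
echo "$conn"
```

The local default follows the same URL shape, with no password and steampipe as both the user and the database name.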
How to install
Prerequisites:
- Node.js v16 or higher (includes npx)
- For local use: Steampipe installed and running (steampipe service start)
- For Turbot Pipes: A Turbot Pipes workspace and a connection string
Installation steps:
- Install dependencies and build the MCP server (from project root):
npm install
npm run build
- Run the MCP server using npx (as documented):
# Example (default local Steampipe connection)
npx -y @turbot/steampipe-mcp
- If you plan to point to a Turbot Pipes workspace, pass the connection string as an argument:
npx -y @turbot/steampipe-mcp "postgresql://my_name:my_pw@workspace-name.usea1.db.pipes.turbot.com:9193/abc123"
- Alternatively, during development, run directly via Node.js with the built dist/index.js:
node dist/index.js postgresql://steampipe@localhost:9193/steampipe
- Verify environment variables:
export STEAMPIPE_MCP_LOG_LEVEL=info
export STEAMPIPE_MCP_WORKSPACE_DATABASE=postgresql://steampipe@localhost:9193/steampipe
- Optional: Validate MCP server with the Inspector:
npx @modelcontextprotocol/inspector dist/index.js
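The override behavior of STEAMPIPE_MCP_WORKSPACE_DATABASE can be sketched in shell: when no override is exported, the documented local default applies. This snippet only illustrates the fallback; the server performs the equivalent internally.

```shell
# Fall back to the documented local default when no override is exported.
DB="${STEAMPIPE_MCP_WORKSPACE_DATABASE:-postgresql://steampipe@localhost:9193/steampipe}"
echo "Connecting to: $DB"
```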
Additional notes
Environment variables:
- STEAMPIPE_MCP_LOG_LEVEL: Sets the logging verbosity (default: info).
- STEAMPIPE_MCP_WORKSPACE_DATABASE: Overrides the default Steampipe connection string when connecting to a Turbot Pipes workspace.
Common issues:
- Ensure Steampipe is running locally if using the default connection string.
- When using Turbot Pipes, ensure the workspace connection string is correct and the network allows access.
- If you encounter permission or read-only errors, verify that the Steampipe user has access to the required data sources and that the MCP server is configured to limit operations to read-only queries.
- Restart the AI assistant after updating the MCP configuration.
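When checking network access, it can help to pull the host and port out of the connection string and test them separately (e.g. with `nc -z "$host" "$port"`). A small sketch using shell parameter expansion, with the documented local default as the input:

```shell
# Extract host and port from a Steampipe connection string for reachability checks.
conn="postgresql://steampipe@localhost:9193/steampipe"
hostport="${conn#*@}"       # drop scheme and credentials -> localhost:9193/steampipe
hostport="${hostport%%/*}"  # drop the database name      -> localhost:9193
host="${hostport%:*}"
port="${hostport##*:}"
echo "$host $port"
```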
Tips:
- Use steampipe_query with CTEs for complex analytics to improve readability and performance.
- Prefer steampipe_table_list and steampipe_table_show to explore the data model before building queries.
Related MCP Servers
mcp-graphql
Model Context Protocol server for GraphQL
mcp-google-map
A powerful Model Context Protocol (MCP) server providing comprehensive Google Maps API integration with LLM processing capabilities.
systemprompt-code-orchestrator
MCP server for orchestrating AI coding agents (Claude Code CLI & Gemini CLI). Features task management, process execution, Git integration, and dynamic resource discovery. Full TypeScript implementation with Docker support and Cloudflare Tunnel integration.
pluggedin-app
The Crossroads for AI Data Exchanges. A unified, self-hostable web interface for discovering, configuring, and managing Model Context Protocol (MCP) servers—bringing together AI tools, workspaces, prompts, and logs from multiple MCP sources (Claude, Cursor, etc.) under one roof.
rohlik
MCP server that lets you shop groceries across the Rohlik Group platforms (Rohlik.cz, Knuspr.de, Gurkerl.at, Kifli.hu, Sezamo.ro)
mcp-js
MCP server that exposes YepCode processes as callable tools for AI platforms. Securely connect AI assistants to your YepCode workflows, APIs, and automations.