polarbase
Extensible Open-source Data Backend for PostgreSQL. Features a multi-view UI (Spreadsheet, Dashboard, Calendar), AI-Agent integration via MCP, and zero vendor lock-in.
claude mcp add --transport stdio polarbase-team-polarbase docker run -i polarbase-team/polarbase:latest \
  --env POSTGRES_DB="PostgreSQL database name" \
  --env POSTGRES_HOST="PostgreSQL host" \
  --env POSTGRES_PORT="PostgreSQL port" \
  --env POSTGRES_USER="PostgreSQL user" \
  --env POSTGRES_PASSWORD="PostgreSQL password" \
  --env GEMINI_API_KEY="Your Gemini API key (optional if using Gemini via PolarBase AI)" \
  --env LOCAL_LLM_BASE_URL="URL for local LLM server (e.g., http://localhost:1234/v1) or leave default" \
  --env SUPER_ADMIN_API_KEY="Super admin API key for initial login and admin tasks"
How to use
PolarBase is an extensible data backend built directly on PostgreSQL, with a focus on a multi-view UI, API access, and an AI-assisted workflow.
The MCP server implemented by PolarBase exposes a REST API for database operations, an AI agent interface for database reasoning and operational tasks, and real-time features for live updates. You can run the server via Docker or locally with Bun/Node tooling as described in the installation guide.
After starting the server, you can manage API keys, connect to your PostgreSQL instance, and use the AI agent to draft queries, automate routine admin tasks, or generate data-backed insights. The UI (multi-view workspace) lets you edit data in a spreadsheet-like view, visualize it in dashboards, and use operational views such as forms and calendars, all backed by the same MCP protocol.
Usage highlights include:
- REST API endpoints to read/write data and manage schema via the MCP server.
- AI agent capability to reason about your data and perform actions through natural language prompts or structured intents.
- Real-time updates via WebSocket/SSE to keep clients synchronized with database changes.
- API key management to grant controlled access to items like the REST API or AI agent features.
To get started, deploy the server (Docker is recommended for production), point the UI at the server, and configure your PostgreSQL credentials in the environment. Then log in with the super admin key to generate API keys and begin using the rest of the features.
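As a rough illustration of that first-login flow, the sketch below issues two requests with curl. The endpoint paths (/api/keys, /api/tables/…/rows), the request body fields, and the bearer-token header are hypothetical placeholders, not documented PolarBase routes; consult the server's actual REST reference for the real ones.

```shell
# Hypothetical sketch: use the super admin key to mint a regular API key,
# then read rows with the new key. All paths and fields are placeholders.
curl -X POST http://localhost:3000/api/keys \
  -H "Authorization: Bearer $SUPER_ADMIN_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name": "ci-reader", "scopes": ["read"]}'

curl http://localhost:3000/api/tables/customers/rows \
  -H "Authorization: Bearer $NEW_API_KEY"
```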
How to install
Prerequisites:
- Docker (recommended for production usage) or Bun/Node for local development
- PostgreSQL database to connect to
- Optional: Gemini/OpenAI API keys if you plan to use AI providers
Install with Docker (recommended):
- Ensure Docker is running
- Pull and run the PolarBase image: docker run -i polarbase-team/polarbase:latest
- Publish ports as needed via docker-compose or docker run options, and configure environment variables (e.g., POSTGRES_*, GEMINI_API_KEY, SUPER_ADMIN_API_KEY) in the container or in docker-compose.yml.
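The steps above can be combined into a single docker run invocation; a sketch, assuming the environment variables listed in this page's run command (all values are placeholders you must replace, and the published port 3000 matches the local-development default, so adjust it if your deployment differs):

```shell
# Run PolarBase in the background with PostgreSQL credentials
# supplied as environment variables. Values are placeholders.
docker run -d \
  -p 3000:3000 \
  --env POSTGRES_HOST="db.example.internal" \
  --env POSTGRES_PORT="5432" \
  --env POSTGRES_DB="polarbase" \
  --env POSTGRES_USER="polarbase" \
  --env POSTGRES_PASSWORD="change-me" \
  --env SUPER_ADMIN_API_KEY="change-me-too" \
  polarbase-team/polarbase:latest
```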
Alternative: Local development (if you clone the repo and have Bun/Node):
- Install dependencies (Bun is recommended): bun install
- Copy the example environment file and customize:
cp server/.env.example server/.env
edit server/.env with your credentials
- Run development server: bun run dev
- Access the server at http://localhost:3000 (or as configured in your environment).
Production build (Docker):
- Prepare docker-compose.yml or a docker run command using the PolarBase image and expose the required ports.
- Ensure environment variables are provided (POSTGRES_*, API keys, SUPER_ADMIN_API_KEY).
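Putting those pieces together, a minimal docker-compose.yml might look like the following. The service names, published port, Postgres image tag, and all values are illustrative assumptions rather than shipped defaults:

```yaml
# Sketch of a two-service deployment: PolarBase plus a PostgreSQL container.
services:
  polarbase:
    image: polarbase-team/polarbase:latest
    ports:
      - "3000:3000"          # adjust to the port your deployment expects
    environment:
      POSTGRES_HOST: db
      POSTGRES_PORT: "5432"
      POSTGRES_DB: polarbase
      POSTGRES_USER: polarbase
      POSTGRES_PASSWORD: change-me
      SUPER_ADMIN_API_KEY: change-me-too
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_DB: polarbase
      POSTGRES_USER: polarbase
      POSTGRES_PASSWORD: change-me
```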
Additional notes
Tips and common considerations:
- PolarBase requires a PostgreSQL database. Make sure the database user has appropriate permissions for schema migrations.
- Do not store credentials in the codebase. Use environment variables or a secure secret manager to provide POSTGRES_*/API keys.
- If using LM Studio or other local models, set LOCAL_LLM_BASE_URL in the .env to point to your local model server.
- When running via Docker, you can customize the environment in docker-compose.yml to suit development or production needs.
- The Super Admin API key is required for initial login and for creating/generating API keys for regular users.
- The MCP server supports multiple AI providers (Gemini, OpenAI, etc.). Configure GEMINI_API_KEY or other provider keys in the environment.
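The provider-related settings from the notes above can be collected in server/.env; a sketch, assuming only the variable names that appear in this page's run command (values are placeholders):

```shell
# server/.env — AI provider and admin configuration (placeholder values)
GEMINI_API_KEY="your-gemini-key"               # optional if using Gemini via PolarBase AI
LOCAL_LLM_BASE_URL="http://localhost:1234/v1"  # e.g., LM Studio's OpenAI-compatible endpoint
SUPER_ADMIN_API_KEY="generate-a-long-random-secret"
```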
- Privacy note: PolarBase does not store your database credentials in logs or external services; operations run in your controlled environment.
Related MCP Servers
openops
The batteries-included, No-Code FinOps automation platform, with the AI you trust.
flyto-core
The open-source execution engine for AI agents. 412 modules, MCP-native, triggers, queue, versioning, metering.
mcpresso
TypeScript framework to build robust, agent-ready MCP servers around your APIs.
octagon-vc-agents
An MCP server that runs AI-driven venture capitalist agents (Fred Wilson, Peter Thiel, etc.), whose thinking is continuously enriched by Octagon Private Markets' real-time deals, valuations, and deep research intelligence. Use it to spin up programmable "VC brains" for pitch feedback, diligence simulations, term sheet negotiations, and more.
mode-manager
MCP Memory Agent Server - A VS Code chatmode and instruction manager with library integration
agent-configs
Control Claude Code, Cursor & Gemini CLI remotely — answer agent questions from your phone via Slack