neurondb
PostgreSQL extension for vector search, embeddings, and ML, plus NeuronAgent runtime and NeuronMCP server.
claude mcp add --transport stdio neurondb-neurondb -- \
  docker run -i \
    --env NEURONDB_CONFIG="Path to configuration or environment overrides (if applicable)" \
    neurondb/neurondb:latest
How to use
NeuronDB is an AI database extension for PostgreSQL that adds vector search, ML model inference, hybrid retrieval, and RAG capabilities directly inside the database. This MCP server exposes NeuronDB’s functionality so you can leverage high-performance vector indexing (HNSW and related techniques), in-database ML inference, and hybrid search pipelines without leaving your PostgreSQL environment. Typical workflows include storing embeddings, performing similarity search across large corpora, running in-database ML models, and composing retrieval-augmented generation (RAG) pipelines. You can combine vector semantics with traditional SQL queries to build advanced search and analytics inside your database.
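As a toy illustration of the similarity search that an HNSW index accelerates, here is a brute-force cosine-similarity lookup in plain Python. This is a sketch only: NeuronDB performs the equivalent in-database over indexed embedding columns, and the document IDs and vectors below are made up.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def nearest(query, corpus, k=2):
    # Exact top-k scan; an HNSW index replaces this O(n) loop
    # with an approximate graph traversal.
    scored = sorted(corpus.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

embeddings = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}
print(nearest([1.0, 0.05, 0.0], embeddings))
```

The exact scan is fine for small corpora; the index matters once the corpus is large enough that a linear pass per query becomes the bottleneck.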
Once running, you can access NeuronDB capabilities through your PostgreSQL client as you would with any standard extension. The MCP server orchestrates the containerized NeuronDB components, letting you configure indexes, embeddings, and ML inference endpoints. Use the provided tools to manage vector indexes, tune retrieval parameters, and compose hybrid queries that blend dense vector similarity with traditional text search. The extension also offers background processing workers for asynchronous tasks and GPU-accelerated operations (where supported).
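The blending step in a hybrid query can be pictured as a weighted combination of a dense (vector) score and a lexical (keyword) score. The sketch below is illustrative only; the score names, weight, and normalization are assumptions, not NeuronDB's API.

```python
def hybrid_score(dense, lexical, alpha=0.7):
    # Weighted blend of a dense (vector) similarity score and a
    # lexical (keyword) score, both assumed normalized to [0, 1].
    # alpha controls how much the dense signal dominates.
    return alpha * dense + (1 - alpha) * lexical

candidates = {
    "doc_a": {"dense": 0.95, "lexical": 0.20},  # semantically close, few keyword hits
    "doc_b": {"dense": 0.60, "lexical": 0.90},  # strong keyword match, weaker embedding
}
ranked = sorted(candidates, key=lambda d: hybrid_score(**candidates[d]), reverse=True)
print(ranked)
```

Tuning the weight shifts results between "means the same thing" and "contains the same words"; real pipelines often expose exactly this kind of knob.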
How to install
Prerequisites:
- Docker installed and running on your host (or your deployment environment).
- PostgreSQL instance with extension/adapter support for NeuronDB MCP integration (as provided by the MCP server setup).
- Sufficient CPU/GPU resources for your vector workloads and models.
Installation steps:
- Pull the MCP server image (example): docker pull neurondb/neurondb:latest
- Run the MCP server (example configuration): docker run -d --name neurondb-mcp -p 8080:8080 neurondb/neurondb:latest
- If you have environment-specific config, mount a config file or pass environment variables as needed. Example with a config file: docker run -d --name neurondb-mcp -v /path/to/neurondb/config:/app/config neurondb/neurondb:latest
- Connect to your PostgreSQL instance and enable the NeuronDB extension per your environment’s instructions. Refer to the NeuronDB docs for enabling and configuring it within PostgreSQL.
- Validate the setup by running a quick in-database vector operation or an ML inference request via the provided MCP tools.
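When several environment overrides and ports are involved, it can help to render the docker run invocation from an env map rather than hand-editing a long command. This helper is a sketch; the variable name and config path mirror the examples above, not a documented NeuronDB contract.

```python
def docker_run_cmd(image, name, env=None, ports=None):
    # Compose a `docker run -d` command line. Note that --env and -p
    # flags must precede the image name, which is why the image goes last.
    parts = ["docker", "run", "-d", "--name", name]
    for key, value in (env or {}).items():
        parts += ["--env", f"{key}={value}"]
    for host, container in (ports or {}).items():
        parts += ["-p", f"{host}:{container}"]
    parts.append(image)
    return " ".join(parts)

cmd = docker_run_cmd(
    "neurondb/neurondb:latest",
    "neurondb-mcp",
    env={"NEURONDB_CONFIG": "/app/config/neurondb.conf"},
    ports={8080: 8080},
)
print(cmd)
```

Keeping the env map in one place also makes it easier to diff your settings against a new release's expected variables.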
Notes:
- The exact image tags and startup flags may vary; consult the project’s release notes for adjustments to environment variables and ports.
- If you are integrating with Kubernetes or cloud runtimes, adapt the docker run steps to Kubernetes manifests or Helm charts as appropriate.
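For Kubernetes, the docker run example above translates roughly to a Deployment. This manifest is a sketch under assumed names: the labels, ConfigMap name, and mount path mirror the Docker example rather than any official chart.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: neurondb-mcp
spec:
  replicas: 1
  selector:
    matchLabels:
      app: neurondb-mcp
  template:
    metadata:
      labels:
        app: neurondb-mcp
    spec:
      containers:
        - name: neurondb
          image: neurondb/neurondb:latest
          ports:
            - containerPort: 8080
          volumeMounts:
            - name: config
              mountPath: /app/config   # same mount point as the docker -v example
      volumes:
        - name: config
          configMap:
            name: neurondb-config      # hypothetical ConfigMap holding your overrides
```

A Service (and, for GPU nodes, resource requests such as nvidia.com/gpu) would be layered on top as your cluster requires.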
Additional notes
Tips and considerations:
- Environment variables: you may need to configure embeddings sources, model endpoints, and GPU allocation. Use placeholder descriptions in the env map and replace with your production values.
- Performance tuning: tune ef_search and related parameters for your recall/latency requirements. Higher recall often means higher latency.
- GPU considerations: if you plan to use GPU acceleration, ensure your container has access to GPU resources and the appropriate drivers, and enable CUDA/ROCm/Metal backends per your hardware.
- Compatibility: check PostgreSQL version compatibility and NeuronDB extension compatibility with your data types and index configurations.
- Security: manage access controls for embeddings and model inferences; use encryption and proper authentication for in-database ML endpoints.
- Troubleshooting: if the MCP server container fails to start, inspect logs for missing dependencies, image mismatches, or port conflicts, and verify that PostgreSQL can reach the NeuronDB services as needed.
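The recall/latency trade-off behind ef_search can be seen in a toy model: limit an "approximate" search to the first ef candidates of a fixed random visiting order and measure how often the true nearest neighbor is found. This is illustrative only; real HNSW traverses a graph rather than a prefix, but the shape of the trade-off is the same: more candidates examined means higher recall and higher latency.

```python
import random

def true_nn(q, points):
    # Exact nearest neighbour by squared distance (the ground truth).
    return min(range(len(points)), key=lambda i: (points[i] - q) ** 2)

def approx_nn(q, points, order, ef):
    # Toy "approximate" search: examine only the first `ef` candidates
    # of a fixed random visiting order, loosely mimicking ef_search.
    return min(order[:ef], key=lambda i: (points[i] - q) ** 2)

def recall_at(ef, points, order, n_queries=200):
    queries = [random.random() for _ in range(n_queries)]
    hits = sum(true_nn(q, points) == approx_nn(q, points, order, ef) for q in queries)
    return hits / n_queries

random.seed(0)
points = [random.random() for _ in range(1000)]
order = list(range(1000))
random.shuffle(order)

for ef in (10, 100, 1000):
    print(ef, recall_at(ef, points, order))
```

At ef equal to the corpus size the search is exact (recall 1.0); below that, recall degrades, which is why the parameter is tuned against your latency budget rather than maximized outright.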
Related MCP Servers
ai-trader
Backtrader-powered backtesting framework for algorithmic trading, featuring 20+ strategies, multi-market support, CLI tools, and an integrated MCP server for professional traders.
robloxstudio
Create agentic AI workflows in ROBLOX Studio
mcp
🤖 Taskade MCP · Official MCP server and OpenAPI to MCP codegen. Build AI agent tools from any OpenAPI API and connect to Claude, Cursor, and more.
mcpcat-python-sdk
MCPcat is an analytics platform for MCP server owners 🐱.
sequel
MCP Database servers for Claude, Cursor and Windsurf
packt-netops-ai-workshop
🔧 Build Intelligent Networks with AI