
mcp-server-mariadb-vector

MCP server for MariaDB

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio davidramossal-mcp-server-mariadb-vector uv run --directory path/to/mcp-server-mariadb-vector/ --env-file path/to/mcp-server-mariadb-vector/.env mcp_server_mariadb_vector \
  --env MARIADB_HOST="host of the running MariaDB database (default: 127.0.0.1)" \
  --env MARIADB_PORT="port of the running MariaDB database (default: 3306)" \
  --env MARIADB_USER="user of the running MariaDB database" \
  --env OPENAI_API_KEY="API key for OpenAI's platform" \
  --env EMBEDDING_MODEL="model of the embedding provider (default: text-embedding-3-small)" \
  --env MARIADB_DATABASE="name of the running MariaDB database" \
  --env MARIADB_PASSWORD="password of the running MariaDB database" \
  --env EMBEDDING_PROVIDER="provider of the embedding models (default: openai)"

How to use

This MCP server exposes a set of tools that let a client or agent interact with a MariaDB database that supports vector operations. It provides vector store management, document storage, and semantic search, enabling natural language queries over stored documents and conversations. The server follows the Model Context Protocol (MCP), so it can be consumed by MCP clients such as Claude Desktop, Cursor, Windsurf, LangGraph, or PydanticAI. You can run the server with uv or as a Docker container, supply a MariaDB connection, and use the included MCP tools to create vector stores, insert documents, and perform semantic searches.

Key capabilities include:

  • Create, list, and delete vector stores in MariaDB.
  • Add documents with optional metadata to a vector store.
  • Run semantic searches against a vector store using embeddings (with a provider such as OpenAI).
  • Interact with the server via the provided MCP tools (e.g., mariadb_create_vector_store, mariadb_insert_documents, mariadb_search_vector_store).
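
To make the semantic-search capability concrete, here is a minimal, self-contained sketch of the ranking idea behind it (illustrative only, not the server's actual code): documents and the query are embedded as vectors, and results are ordered by similarity to the query vector.

```python
# Illustrative sketch of vector-based semantic search (hypothetical code,
# not part of mcp-server-mariadb-vector): rank documents by cosine
# similarity between their embedding and the query embedding.
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def search(query_vec: list[float], store: list[tuple[str, list[float]]], k: int = 2) -> list[str]:
    """Return the k documents whose embeddings are most similar to the query."""
    ranked = sorted(store, key=lambda item: cosine_similarity(query_vec, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]


# Toy 2-dimensional "embeddings" (real embeddings have hundreds of dimensions).
store = [
    ("doc about cats", [0.9, 0.1]),
    ("doc about dogs", [0.8, 0.3]),
    ("doc about finance", [0.1, 0.95]),
]
print(search([0.85, 0.2], store, k=2))  # → ['doc about cats', 'doc about dogs']
```

In the real server, embeddings come from the configured provider (e.g., OpenAI) and the similarity ranking is done by MariaDB's vector functions rather than in Python.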

To use it, configure the environment with your MariaDB connection details and embedding provider settings, start the server, and then connect your MCP client to the server using either stdio (for Claude Desktop, Cursor, Windsurf) or SSE transport (for web-based clients).
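
For stdio transport, MCP clients such as Claude Desktop are typically configured with a JSON entry along these lines (a sketch only — the `mariadb-vector` server name and the paths are placeholders to adjust to your setup, and the exact file location varies by client):

```json
{
  "mcpServers": {
    "mariadb-vector": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "path/to/mcp-server-mariadb-vector/",
        "--env-file",
        "path/to/mcp-server-mariadb-vector/.env",
        "mcp_server_mariadb_vector"
      ]
    }
  }
}
```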

How to install

Prerequisites:

  • Docker or uv (a Python package and project manager)
  • Access to a MariaDB instance with vector support (11.7+)
  • Optional: OpenAI API key if using OpenAI embeddings
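
To confirm the vector-support prerequisite, you can compare the output of MariaDB's VERSION() against 11.7 (the first release with MariaDB Vector). A small helper along these lines (hypothetical, not part of the server):

```python
# Hypothetical helper: check whether a MariaDB VERSION() string indicates
# vector support (MariaDB 11.7 or newer).
def supports_vector(version_string: str) -> bool:
    """Return True if the version is 11.7 or newer."""
    numeric = version_string.split("-")[0]  # "11.7.1-MariaDB" -> "11.7.1"
    major, minor = (int(part) for part in numeric.split(".")[:2])
    return (major, minor) >= (11, 7)


print(supports_vector("11.7.1-MariaDB"))   # → True
print(supports_vector("10.11.6-MariaDB"))  # → False
```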

Option A: Run with uv (Python-based runtime)

  1. Install uv (if not already installed):
    • with pip or pipx: pip install uv
  2. Clone the repository and enter it:
    git clone https://github.com/DavidRamosSal/mcp-server-mariadb-vector.git
    cd mcp-server-mariadb-vector
  3. Create a .env file in the project root with the required variables, for example:
    MARIADB_HOST=127.0.0.1
    MARIADB_PORT=3306
    MARIADB_USER=root
    MARIADB_PASSWORD=yourpassword
    MARIADB_DATABASE=database_name
    EMBEDDING_PROVIDER=openai
    EMBEDDING_MODEL=text-embedding-3-small
    OPENAI_API_KEY=your-openai-api-key
  4. Run the server via uv (as described in the README):
    uv run --directory path/to/mcp-server-mariadb-vector/ --env-file path/to/mcp-server-mariadb-vector/.env mcp_server_mariadb_vector

Option B: Run with Docker

  1. Build the Docker image from the repository root:
    docker build -t mcp-server-mariadb-vector .
  2. Run the container with your environment variables set (adjust values):
    docker run -p 8000:8000 \
      --add-host host.docker.internal:host-gateway \
      -e MARIADB_HOST="host.docker.internal" \
      -e MARIADB_PORT="3306" \
      -e MARIADB_USER="youruser" \
      -e MARIADB_PASSWORD="yourpassword" \
      -e MARIADB_DATABASE="database" \
      -e EMBEDDING_PROVIDER="openai" \
      -e EMBEDDING_MODEL="text-embedding-3-small" \
      -e OPENAI_API_KEY="your-openai-api-key" \
      mcp-server-mariadb-vector
  3. Access the server at http://localhost:8000/sse (SSE transport).
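
Some MCP clients accept an SSE endpoint directly in their configuration; the entry often looks like the following sketch (field names vary by client, so check your client's documentation — `mariadb-vector` is a placeholder name):

```json
{
  "mcpServers": {
    "mariadb-vector": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```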

Additional notes

Tips and common considerations:

  • Ensure your MariaDB instance is accessible from the host running the MCP server and that the database has vector capabilities (MariaDB Vector). For Docker, you may need to adjust MARIADB_HOST to host.docker.internal when the MariaDB container runs on the host.
  • The EMBEDDING_PROVIDER defaults to OpenAI. If you switch providers, update EMBEDDING_PROVIDER and EMBEDDING_MODEL accordingly, and provide any required API keys or credentials.
  • The environment variable OPENAI_API_KEY should be kept secure. Do not commit it to version control.
  • If you encounter connectivity issues, verify network access to MariaDB (host, port, user, password) and confirm that the required MariaDB extension for vector operations is installed.
  • When using Claude Desktop, Cursor, or Windsurf, you can use stdio transport by adding a mariadb-vector entry under mcpServers in the client's configuration, or switch to SSE transport by pointing the client at http://localhost:8000/sse if you're running the server locally.
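
For the connectivity checks above, a quick TCP probe can rule out host/port problems before digging into credentials or the vector extension. A hypothetical helper, not part of the server:

```python
# Hypothetical connectivity probe: can we open a TCP connection to the
# MariaDB host and port at all?
import socket


def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Example: probe the default MariaDB port before blaming credentials.
print(can_reach("127.0.0.1", 3306))
```

If this returns False, fix the network path (host, port, firewall, Docker host mapping) first; only then troubleshoot user, password, and the MariaDB vector feature.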
