langconnect-client
A Modern GUI Interface for Vector Database Management (supports MCP integration)
```bash
claude mcp add --transport stdio teddynote-lab-langconnect-client uv run mcpserver/mcp_sse_server.py
```
How to use
LangConnect Client provides a modern GUI for managing vector databases built on PostgreSQL with pgvector, and includes MCP (Model Context Protocol) integration so AI assistants can query and operate on the data through a predefined set of tools. The MCP server component exposed by this project runs as a Python server (managed with uv) that hosts an MCP SSE endpoint, allowing assistants to perform semantic searches, list and manage collections and documents, and run administrative actions through the bundled tools. Available tools include `search_documents`, `list_collections`, `get_collection`, `create_collection`, `delete_collection`, `list_documents`, `add_documents`, `delete_document`, `get_health_status`, and `multi_query`. Together these let an assistant execute workflows such as semantic/hybrid search, document ingestion, and collection management, streaming results over SSE when the client setup supports it.
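As a concrete illustration, MCP clients invoke tools like these with JSON-RPC 2.0 `tools/call` requests. The sketch below builds such a request for `search_documents`; the argument names (`query`, `collection_id`, `limit`) are illustrative assumptions, not the server's confirmed schema — consult the tool listing returned by the server for the real parameters.

```python
import json

def make_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 request body for an MCP tools/call invocation."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical arguments -- check the server's advertised tool schema for the real ones.
payload = make_tool_call("search_documents", {
    "query": "vector similarity search",
    "collection_id": "my-collection",
    "limit": 5,
})
print(payload)
```

In practice an MCP client library handles this framing for you over stdio or SSE; the snippet only shows the wire-level shape of a tool call.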
How to install
Prerequisites:
- Docker and Docker Compose (for running the full LangConnect Client stack)
- Node.js 20+ (for MCP inspector tooling)
- Python 3.11+ with uv (the Python package and project manager)
- PostgreSQL with pgvector (used by the LangConnect backend)
- A Supabase account (for authentication and token management)

1. Clone the repository and navigate into the project:

```bash
git clone https://github.com/teddynote-lab/langconnect-client.git
cd langconnect-client
```

2. Create and configure environment variables:

```bash
cp .env.example .env
```
Edit `.env` with your Supabase URL and key, as well as any other required service credentials (e.g., database connection details).

3. Build and start the services (the Docker-based workflow is recommended):

```bash
make build
make up
```

This builds the frontend, backend, and MCP-related components and brings up the required services.

4. Generate the MCP configuration (as needed by the MCP tooling):

```bash
make mcp
```

This prompts for credentials, obtains an access token, updates `.env`, and produces `mcpserver/mcp_config.json`.

5. Access the MCP-enabled server:
- Start the MCP SSE server (Python/uv) as configured by the generated `mcp_config.json`. You can also run the MCP inspector tooling to validate the integration.
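After `make up`, it can be useful to confirm the backend is reachable before wiring up MCP clients. A minimal sketch, assuming the API listens on localhost:8080 and exposes a `/health` endpoint (the endpoint path and response shape are assumptions based on the README's health-check notes):

```python
import json
import os
import urllib.request

# Base URL is an assumption; override via environment if your port differs.
API_BASE = os.environ.get("LANGCONNECT_API_URL", "http://localhost:8080")

def health_url(base: str = API_BASE) -> str:
    """Build the health-check URL (the /health path is an assumption)."""
    return base.rstrip("/") + "/health"

def check_health(base: str = API_BASE) -> dict:
    """GET the health endpoint and return the parsed JSON body."""
    with urllib.request.urlopen(health_url(base), timeout=5) as resp:
        return json.loads(resp.read().decode())

if __name__ == "__main__":
    # Only works while the Docker stack is up.
    print(check_health())
```

The `get_health_status` MCP tool should report the same information once the MCP server is connected.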
Additional notes
Tips and notes:
- The MCP configuration is generated by the repo's Makefile target (`make mcp`). The server entry point in this setup typically runs via uv (Python) and is exposed over an SSE endpoint for efficient streaming of results to assistants.
- Ensure your `.env` contains valid Supabase credentials and any necessary tokens; the setup can auto-refresh tokens as described in the MCP/SSE integration docs.
- If you encounter port conflicts, verify that the health and docs endpoints (as described in the README) are reachable and that the frontend (localhost:3000) and API (localhost:8080) ports do not collide with other services.
- Commonly used environment variables include SUPABASE_URL and SUPABASE_KEY, plus any database connection details required by the backend. The MCP tooling expects an accessible token and a proper authentication flow, which the Makefile automation normally handles during `make mcp`.
- When integrating with Claude Desktop or Cursor, copy the generated `mcp_config.json` contents into the respective MCP settings. The client supports 9+ tools for AI assistants and can exchange results via stdio or SSE transport, depending on the client configuration.
- If you modify environment variables or tokens, re-run `make mcp` to regenerate the configuration.
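For reference, a Claude Desktop/Cursor MCP entry for this server typically looks something like the following. The keys, token variable, and paths here are illustrative assumptions; always take the actual values from your generated `mcpserver/mcp_config.json` rather than copying this verbatim:

```json
{
  "mcpServers": {
    "langconnect": {
      "command": "uv",
      "args": ["run", "mcpserver/mcp_sse_server.py"],
      "env": {
        "API_BASE_URL": "http://localhost:8080",
        "SUPABASE_ACCESS_TOKEN": "<token from make mcp>"
      }
    }
  }
}
```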
Related MCP Servers
Security-Detections
MCP to help defender detection engineers work harder and smarter
mcp-use-ts
mcp-use is the framework for MCP with the best DX - Build AI agents, create MCP servers with UI widgets, and debug with built-in inspector. Includes client SDK, server SDK, React hooks, and powerful dev tools.
mcp
Octopus Deploy Official MCP Server
furi
CLI & API for MCP management
fullstack-langgraph-nextjs-agent
Production-ready Next.js template for building AI agents with LangGraph.js. Features MCP integration for dynamic tool loading, human-in-the-loop tool approval, persistent conversation memory with PostgreSQL, and real-time streaming responses. Built with TypeScript, React, Prisma, and Tailwind CSS.
us-census-bureau-data-api
The U.S. Census Bureau Data API MCP connects AI Assistants with official Census Bureau statistics.