nebulagraph
Model Context Protocol Server for NebulaGraph 3.x
claude mcp add --transport stdio nebula-contrib-nebulagraph-mcp-server python -m nebulagraph_mcp_server \
  --env NEBULA_HOST="<your-nebulagraph-host>" \
  --env NEBULA_PORT="<your-nebulagraph-port>" \
  --env NEBULA_USER="<your-nebulagraph-username>" \
  --env NEBULA_VERSION="v3" \
  --env NEBULA_PASSWORD="<your-nebulagraph-password>"
How to use
NebulaGraph MCP Server exposes NebulaGraph operations through the Model Context Protocol (MCP), enabling LLM-assisted graph exploration and querying. The server reads its configuration from environment variables (or a .env file) and provides a CLI entry point, nebulagraph-mcp-server, that initializes and runs the MCP service. You can connect MCP-enabled tooling (such as LlamaIndex or other MCP clients) to the server to fetch graph schemas, run queries, and execute lightweight graph algorithms. The environment variables map directly to your NebulaGraph instance, so you can point the MCP server at any NebulaGraph v3 deployment and expose it to your tooling stack. Because the server follows the MCP protocol, it can be plugged into larger pipelines for reasoning over graph data, including schema discovery, vertex/edge querying, and basic analytics.
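As a minimal sketch of the configuration contract described above: the NEBULA_* variable names come from this README, while the helper function, its name, and its defaults are illustrative assumptions, not package API.

```python
import os

# Illustrative helper (not part of nebulagraph-mcp-server): gather the
# NEBULA_* environment variables this README documents into one dict.
def load_nebula_config(env=os.environ):
    config = {
        "host": env.get("NEBULA_HOST", "localhost"),
        # 9669 is NebulaGraph's default graphd port.
        "port": int(env.get("NEBULA_PORT", "9669")),
        "user": env.get("NEBULA_USER", "root"),
        "password": env.get("NEBULA_PASSWORD", ""),
        "version": env.get("NEBULA_VERSION", "v3"),
    }
    if config["version"] != "v3":
        # The server targets NebulaGraph 3.x, so v3 is the only valid value.
        raise ValueError("nebulagraph-mcp-server requires NEBULA_VERSION=v3")
    return config
```

A client or wrapper script could call such a helper once at startup and fail fast on a missing or mismatched NEBULA_VERSION instead of surfacing an opaque connection error later.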
How to install
Prerequisites:
- Python 3.8+ (recommended 3.9+)
- pip (bundled with Python)
- Access to a NebulaGraph v3 cluster (as required by the server)
- Create a Python virtual environment (optional but recommended):
python -m venv venv
source venv/bin/activate # On Windows use: venv\Scripts\activate
- Install the NebulaGraph MCP Server package from PyPI:
pip install nebulagraph-mcp-server
- Prepare environment variables for NebulaGraph connectivity. You can set them in a .env file or export them in your shell. Example:
export NEBULA_VERSION=v3
export NEBULA_HOST=localhost
export NEBULA_PORT=9669
export NEBULA_USER=root
export NEBULA_PASSWORD=secret
- Run the MCP server (the package provides the nebulagraph-mcp-server CLI entry point; the module can also be invoked directly):
python -m nebulagraph_mcp_server
- Verify the server starts by checking its logs. Because the server communicates over the MCP stdio transport, your MCP client launches and talks to the process directly rather than polling a fixed network port. For containerized deployments, wrap the command in your preferred orchestration tooling and make sure the environment variables are passed into the container.
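If the server starts but cannot reach NebulaGraph, a quick TCP reachability check can rule out networking problems before you debug the MCP layer. The helper below is an illustrative sketch, not part of the package:

```python
import socket

def can_reach(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout.

    Useful as a pre-flight check against the NebulaGraph graphd endpoint
    (NEBULA_HOST / NEBULA_PORT) before starting the MCP server.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Running `can_reach("localhost", 9669)` from the same host as the MCP server distinguishes a firewall or address problem from a misconfigured server.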
Additional notes
Tips and notes:
- The server relies on NebulaGraph v3; ensure NEBULA_VERSION is set to v3 as required.
- If you encounter connection errors, verify NEBULA_HOST and NEBULA_PORT are correct and that NebulaGraph is reachable from the host running the MCP server.
- You can store configuration in a .env file and load it with your runtime environment; many systems automatically load dotenv files.
- If the CLI name or module path changes in your installation, consult the package's entry points or run a quick search in site-packages to locate the correct -m module to invoke.
- When integrating with LLM tooling, ensure your MCP client uses the MCP protocol as documented by the NebulaGraph MCP server to retrieve schema, queries, and algorithms.
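For the .env approach mentioned in the notes above, real deployments would typically `pip install python-dotenv` and call its `load_dotenv()`; as a self-contained illustration of what that loading amounts to, a tiny stdlib-only parser might look like this (the helper name is hypothetical):

```python
import os

def load_dotenv_file(path, env=os.environ):
    """Minimal .env loader: KEY=VALUE lines, '#' comments, no quoting rules."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault keeps variables already exported in the shell.
            env.setdefault(key.strip(), value.strip())
```

Note that existing shell exports win over the file here, matching the common dotenv convention of not overriding the process environment.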
Related MCP Servers
PPTAgent
An Agentic Framework for Reflective PowerPoint Generation
open-ptc-agent
An open source implementation of code execution with MCP (Programmatic Tool Calling)
mcp-redis
The official Redis MCP Server is a natural language interface designed for agentic applications to manage and search data in Redis efficiently
TradingAgents-MCPmode
TradingAgents-MCPmode is an innovative multi-agent trading analysis system that integrates Model Context Protocol (MCP) tools to deliver an intelligent workflow for stock analysis and trading decisions. Through the collaboration of multiple specialized agents, the system provides comprehensive market analysis, investment recommendations, and risk management.
python -client
An MCP server for querying the technical documentation of mainstream agent frameworks (supports both the stdio and SSE transports), covering langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai
ultimate_mcp_server
Comprehensive MCP server exposing dozens of capabilities to AI agents: multi-provider LLM delegation, browser automation, document processing, vector ops, and cognitive memory systems