GraphMemory-IDE
An MCP server for AI-assisted development that provides long-term, on-device "AI memory" for IDEs, powered by Kuzu GraphDB and exposed via the Model Context Protocol.
claude mcp add elementalcollision-graphmemory-ide
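The one-liner above registers the server with the Claude CLI. For MCP clients configured through a JSON file, an equivalent manual entry might look like the following sketch (the server name and launch command here are illustrative assumptions, not taken from this page):

```json
{
  "mcpServers": {
    "graphmemory-ide": {
      "command": "npx",
      "args": ["-y", "elementalcollision-graphmemory-ide"]
    }
  }
}
```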
How to use
GraphMemory IDE is an integrated development environment focused on graph-based memory management and collaborative development. It centralizes project documentation, architecture overviews, deployment guides, and developer resources to streamline onboarding and day-to-day work. The documentation is modular, with dedicated sections for project architecture, API references, development workflows, deployment procedures, testing, security, analytics, and user guides, and tools and resources are organized by topic so teams can quickly find the guides relevant to configuration, deployment, plugin development, testing, and operations. To begin, explore the Documentation hub to understand the project scope, architecture, and recommended development workflow, then use the Deployment and Development guides to set up a local environment and contribute changes.
When using GraphMemory IDE, you'll find resources for running deployments (Docker/Kubernetes guidance), development guidelines (quality checks, testing setups, and contributing standards), and API references. Tutorials cover plugin development and collaborative features, so teams can extend the platform with custom tools while staying aligned with the security and architecture guidelines. References to monitoring, operations, and analytics help you manage performance and gain insight from usage data as you build graph-based memory features.
How to install
Prerequisites:
- Access to the repository and any required org/project permissions
- Node.js and npm/yarn if you plan to run Node-based tooling (optional, depending on your setup)
- Docker and/or Kubernetes if you plan to deploy locally via containerization
Step 1: Review Documentation
- Browse docs/ directories to identify the deployment and development guides relevant to your environment (Docker deployment, development setup, API references, etc.).
Step 2: Set up your environment
- If using Docker: ensure Docker is installed and running on your machine.
- If using Kubernetes: ensure kubectl and a local/remote cluster are configured as per docs/deployment guidelines.
- If integrating with a local development workflow: install any required CLI tools referenced in the Development guides (e.g., code quality tools, testing frameworks).
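The environment checks above can be sketched as a small shell script that reports which optional tools are on your PATH (nothing here is strictly required; each tool matters only for its corresponding workflow):

```shell
# Report which optional tools are installed; missing ones are fine
# unless you need the matching workflow (Docker, Kubernetes, Node tooling).
found=""
missing=""
for tool in docker kubectl node npm; do
  if command -v "$tool" >/dev/null 2>&1; then
    found="$found $tool"
  else
    missing="$missing $tool"
  fi
done
echo "found:$found"
echo "missing (optional):$missing"
```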
Step 3: Install dependencies (if applicable)
- For Node-based tooling (if you encounter any in your workflow):
  - npm install
  - npm run build (if a build step is described in the development docs)
Step 4: Run the deployment locally (example for Docker-based setup)
- Follow the Docker Deployment Guide in docs/deployment/DOCKER_DEPLOYMENT_GUIDE.md to pull images and start containers.
- Verify services are up using the Monitoring and Operations guides.
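As a rough sketch of the kind of Compose file the Docker Deployment Guide describes, assuming a single MCP server container with a persistent volume for the Kuzu database (the image name, port, and paths below are illustrative assumptions, not taken from the guide):

```yaml
# Illustrative only -- follow docs/deployment/DOCKER_DEPLOYMENT_GUIDE.md for the real file
services:
  graphmemory-mcp:
    image: graphmemory-ide/mcp-server:latest   # hypothetical image name
    ports:
      - "8080:8080"                            # hypothetical service port
    volumes:
      - kuzu-data:/data/kuzu                   # persist the on-device Kuzu graph
    restart: unless-stopped
volumes:
  kuzu-data:
```

With a file like this in place, docker compose up -d starts the stack and docker compose ps lets you confirm that the services described in the Monitoring and Operations guides are running.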
Step 5: Access the IDE and docs
- Navigate to the local URL or port specified in the deployment guide to access GraphMemory IDE.
- Explore documentation sections (Project & Architecture, Development, Deployment & Operations, Analytics & UX, Project Management) to begin contributing or using the tools.
Additional notes
Tips:
- If you can’t find a document, refer to the link-based navigation in the README and the top-level docs directories listed under Documentation.
- Ensure you keep an eye on Security guidelines when enabling plugins or extending the IDE with new tools.
- Check the Troubleshooting and Phase/Summary documents in the Summaries section for common issues and progress notes.
- When running locally, use the Deployment and Operations guides to align with your production practices and monitoring setup.
Environment variables and configuration options are specified in the Deployment and Security documents; adjust them to match your environment and security requirements.
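For illustration, a local environment file might carry settings like these (the variable names and values are hypothetical; the authoritative list lives in the Deployment and Security documents):

```shell
# Hypothetical variable names -- consult the Deployment and Security docs for the real ones
GRAPHMEMORY_PORT=8080          # port the MCP server listens on
KUZU_DB_PATH=/data/kuzu        # where the on-device Kuzu graph is stored
LOG_LEVEL=info                 # raise to debug when troubleshooting
```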
Related MCP Servers
mcp-qdrant
An official Qdrant Model Context Protocol (MCP) server implementation
mcp-neo4j
Neo4j Labs Model Context Protocol servers
browser-use
Browse the web, directly from Cursor etc.
MCPBench
The evaluation benchmark on MCP servers
legion
A server that helps people access and query data in databases using the Legion Query Runner with Model Context Protocol (MCP) in Python.
pdf-rag
PDF RAG server for Cursor.