NagaAgent
A simple yet powerful agent framework for personal assistants, designed to enable intelligent interaction, multi-agent collaboration, and seamless tool integration.
claude mcp add --transport stdio rtgs2017-nagaagent python -m NagaAgent
How to use
NagaAgent provides a super AI secretary experience with streaming tool calls, knowledge-graph memory, Live2D visuals, and voice interaction. The MCP integration exposes several modular capabilities (Agent tools) that the LLM can invoke through the MCP adapter. Core components include:
- a tool loop that parses in-text tool calls and routes them to the appropriate agents;
- a GRAG-based memory system for structured knowledge capture and retrieval;
- an MCP-enabled tool registry that lists built-in agents such as weather_time, open_launcher, game_guide, online_search, crawl4ai, playwright_master, vision, mqtt_tool, and office_doc.
The server coordinates streaming results, context compression, and asynchronous tool execution, enabling multi-step tool invocation within a single chat session. To use it, start the NagaAgent MCP server and interact with the UI-enabled features (conversation, mind view, skill workshop), or connect your own frontend to the same MCP endpoints for tool execution and memory retrieval.
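The tool loop described above can be sketched roughly as follows. Note that the `<<<TOOL>>>...<<<END>>>` marker format, the stub agent handlers, and the `run_tool_loop` helper are illustrative assumptions for this sketch, not NagaAgent's actual wire format or API:

```python
import json
import re

# Illustrative marker format; NagaAgent's real in-text tool-call
# syntax may differ -- this is an assumption for the sketch.
TOOL_CALL_RE = re.compile(r"<<<TOOL>>>(\{.*?\})<<<END>>>", re.DOTALL)

# Stub registry standing in for built-in agents like weather_time,
# online_search, etc.
AGENTS = {
    "weather_time": lambda args: f"weather in {args['city']}: sunny",
    "online_search": lambda args: f"results for {args['query']}",
}

def run_tool_loop(model_step, prompt, max_loop_stream=5):
    """Call the model, execute any in-text tool calls it emits,
    feed results back, and loop until the model stops calling
    tools or max_loop_stream iterations are reached."""
    context = prompt
    reply = ""
    for _ in range(max_loop_stream):
        reply = model_step(context)
        calls = TOOL_CALL_RE.findall(reply)
        if not calls:
            return reply  # final answer, no tool calls left
        for raw in calls:
            call = json.loads(raw)
            result = AGENTS[call["agent"]](call["args"])
            # Append the tool result so the next model step sees it.
            context += f"\n[tool:{call['agent']}] {result}"
    return reply
```

The `max_loop_stream=5` default mirrors the 5-iteration limit mentioned in the notes below; a custom frontend would replace `model_step` with a real streaming LLM call.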
How to install
Prerequisites
- Python 3.11 (>=3.11, <3.12)
- Optional: uv for faster dependency installation
- Optional: Neo4j for local knowledge graph memory
Installation steps
- Clone the repository
git clone https://github.com/Xxiii8322766509/NagaAgent.git
cd NagaAgent
- Install frontend dependencies (if you plan to use the included UI)
cd frontend
npm install
cd ..
- Install backend dependencies and set up the environment
Option A: Use the provided setup script (auto-detects the environment, creates a virtual environment, installs dependencies)
python setup.py
Option B: Use uv (recommended for development)
uv sync
Option C: Manual installation
python -m venv .venv
# Windows
.\.venv\Scripts\activate
# macOS/Linux
source .venv/bin/activate
pip install -r requirements.txt
- Prepare configuration
Copy the example config to config.json and insert your API credentials (e.g., DeepSeek/OpenAI-compatible API settings). Example:
{
  "api": {
    "api_key": "your-api-key",
    "base_url": "https://api.deepseek.com",
    "model": "deepseek-v3.2"
  }
}
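A quick sanity check on config.json before launch catches missing or placeholder credentials early. The validator below is a hypothetical helper (not part of NagaAgent), assuming only the three API keys shown in the example above:

```python
import json
from pathlib import Path

# Keys the example config above uses; NagaAgent may require more.
REQUIRED_API_KEYS = ("api_key", "base_url", "model")

def load_config(path="config.json"):
    """Load config.json and verify the api section is filled in."""
    cfg = json.loads(Path(path).read_text(encoding="utf-8"))
    api = cfg.get("api", {})
    missing = [k for k in REQUIRED_API_KEYS if not api.get(k)]
    if missing:
        raise ValueError(f"config.json missing api settings: {missing}")
    if api["api_key"] == "your-api-key":
        raise ValueError("replace the placeholder api_key with a real key")
    return cfg
```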
- Run the MCP server
# from project root, using the recommended command defined in mcp_config
python -m NagaAgent
Notes
- If you already have a frontend, run it separately (e.g., cd into frontend and run npm run dev).
- The setup supports multiple environments; choose the one that fits your dev or deployment workflow.
- Neo4j or other remote memory backends may require additional configuration (connection URL, credentials).
Additional notes
Tips and common issues:
- Ensure Python 3.11 is used; the project specifies compatibility with 3.11.x.
- If using uv, install and run via uv sync to accelerate dependency installation.
- For memory features, configure Neo4j locally or point at a cloud/remote graph instance, and ensure network access.
- config.json must be present at the project root (or wherever the launch script expects it); verify paths if the MCP runner reports a missing config.
- The tool loop supports up to 5 iterations per prompt for streaming tool calls; adjust max_loop_stream if you customize the flow.
- When integrating with other frontends, reuse the MCP endpoints for tool execution and memory access to ensure consistent tool routing.
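As a rough mental model for the GRAG memory mentioned in the tips, knowledge is captured as graph triples and retrieved by entity. The class below is a toy in-memory stand-in, not NagaAgent's implementation; the real system persists to Neo4j and would run Cypher queries for retrieval:

```python
from collections import defaultdict

class ToyGraphMemory:
    """In-memory triple store standing in for the Neo4j-backed
    GRAG memory; purely illustrative."""

    def __init__(self):
        self._by_subject = defaultdict(list)

    def add(self, subject, relation, obj):
        # Capture one structured fact as a (relation, object) edge.
        self._by_subject[subject].append((relation, obj))

    def query(self, subject):
        # Return every fact known about a subject.
        return list(self._by_subject.get(subject, []))

mem = ToyGraphMemory()
mem.add("NagaAgent", "written_in", "Python")
mem.add("NagaAgent", "stores_memory_in", "Neo4j")
```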
Related MCP Servers
nerve
The Simple Agent Development Kit.
a2a-x402
The A2A x402 Extension brings cryptocurrency payments to the Agent-to-Agent (A2A) protocol, enabling agents to monetize their services through on-chain payments. This extension revives the spirit of HTTP 402 "Payment Required" for the decentralized agent ecosystem.
MCP-Checklists
MCP server from MCP-Manager/MCP-Checklists
sample-agentic-ai-demos
Collection of examples of how to use Model Context Protocol with AWS.
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.
mcpx-py
Python client library for https://mcp.run - call portable & secure tools for your AI Agents and Apps