emotion_ai
The Aura Emotion AI system combines ChromaDB with a local embedding model, memvid QR-code-in-MP4 "infinite" memory, brainwave and neurochemical simulations, sociobiological reasoning, and autonomous subsystem processing offloaded to a Gemini Flash model so the main model is less taxed. It acts as both an MCP client with adaptive tool learning and an MCP server.
claude mcp add --transport stdio angrysky56-emotion_ai uvx emotion_ai \
  --env HOST="0.0.0.0" \
  --env PORT="8000" \
  --env DEBUG="false" \
  --env AURA_MODEL="gemini-2.5-flash-preview-05-20" \
  --env CORS_ORIGINS='["http://localhost:5173", "http://localhost:3000"]' \
  --env GOOGLE_API_KEY="Your Google API key" \
  --env MCP_SERVER_NAME="aura-companion" \
  --env AUTONOMIC_ENABLED="true" \
  --env MCP_SERVER_VERSION="1.0.0" \
  --env AURA_DATA_DIRECTORY="./aura_data" \
  --env ENABLE_FILE_EXPORTS="true" \
  --env AURA_AUTONOMIC_MODEL="gemini-2.0-flash-lite" \
  --env ENABLE_VECTOR_SEARCH="true" \
  --env CHROMA_PERSIST_DIRECTORY="./aura_chroma_db" \
  --env ENABLE_COGNITIVE_TRACKING="true" \
  --env ENABLE_EMOTIONAL_ANALYSIS="true" \
  --env AURA_AUTONOMIC_MAX_OUTPUT_TOKENS="100000"
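If you prefer to register the server by hand in an MCP client's configuration file (for example Claude Desktop's claude_desktop_config.json), the same stdio launch can be expressed roughly as below. This is a trimmed sketch showing only a few of the environment variables from the command above; add the rest as needed and adjust paths for your machine.

{
  "mcpServers": {
    "aura-companion": {
      "command": "uvx",
      "args": ["emotion_ai"],
      "env": {
        "GOOGLE_API_KEY": "Your Google API key",
        "AURA_MODEL": "gemini-2.5-flash-preview-05-20",
        "AURA_DATA_DIRECTORY": "./aura_data",
        "CHROMA_PERSIST_DIRECTORY": "./aura_chroma_db"
      }
    }
  }
}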
How to use
Emotion_AI is a Python-based, MCP-enabled backend designed to run as an MCP server with a FastAPI-powered interface and vector-search-backed memory. It participates in the Model Context Protocol flow to enable external tool calls, bidirectional data exchange, and advanced emotional and cognitive analysis. The server exposes capabilities for emotional state tracking, memory persistence via a ChromaDB vector store, and a Gemini-based model for response generation with transparent thinking extraction. To use it, configure the environment, start the server, and interact with its API endpoints; MCP client tooling can then discover the server's tools and integrate it with other MCP-enabled agents.
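Once the backend is running, a quick way to confirm the HTTP layer is up is to hit it directly from Python. The snippet below is a minimal sketch that assumes the default host and port from the configuration above; the /health path is only an illustrative guess, so check the interactive OpenAPI docs at /docs for the actual route names.

# Minimal smoke test against the running FastAPI backend.
# The /health route is assumed for illustration; see http://0.0.0.0:8000/docs
# for the endpoints the server actually exposes.
import requests

BASE_URL = "http://0.0.0.0:8000"

resp = requests.get(f"{BASE_URL}/health", timeout=10)
print(resp.status_code, resp.text)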
How to install
Prerequisites:
- Python 3.12+
- Git
- A Google API Key for Gemini/Aura integrations
- Sufficient RAM (4 GB+) and storage for vector embeddings
Installation steps:
- Clone the repository and navigate to the project:
  git clone https://github.com/angrysky56/emotion_ai.git
  cd emotion_ai
- Set up a Python virtual environment (Python 3.12+):
  python3.12 -m venv .venv
  source .venv/bin/activate
- Install dependencies by running the setup script, if one exists as described in the README:
  chmod +x setup.sh
  ./setup.sh
- Configure environment variables. Create an .env file or export variables as needed (a loading sketch follows these steps). Example:
  export GOOGLE_API_KEY=your-google-api-key
  export CHROMA_PERSIST_DIRECTORY=./aura_chroma_db
  export AURA_DATA_DIRECTORY=./aura_data
  export HOST=0.0.0.0
  export PORT=8000
  export MCP_SERVER_NAME=aura-companion
  export MCP_SERVER_VERSION=1.0.0
  export AURA_MODEL=gemini-2.5-flash-preview-05-20
  export AURA_AUTONOMIC_MODEL=gemini-2.0-flash-lite
- Run the MCP server (via the uv/uvx workflow described in the MCP tooling for Python):
  uvx emotion_ai
- Access the server and check the health endpoints as described in your MCP tooling, or via the FastAPI docs (typically http://0.0.0.0:8000/docs).
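As referenced in the environment-variable step above, the same settings can live in a .env file. The sketch below shows how such values are typically loaded at startup, assuming the python-dotenv package is installed; the project's actual configuration loading may differ.

# Illustrative .env loading, assuming python-dotenv; not necessarily how
# emotion_ai wires its configuration internally.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

HOST = os.getenv("HOST", "0.0.0.0")
PORT = int(os.getenv("PORT", "8000"))
CHROMA_DIR = os.getenv("CHROMA_PERSIST_DIRECTORY", "./aura_chroma_db")
print(f"Aura will bind to {HOST}:{PORT} and persist vectors in {CHROMA_DIR}")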
Notes:
- The exact run command may vary based on how you package the MCP server; the configuration above targets the uvx-based Python execution path.
Additional notes
Tips and common issues:
- Ensure your GOOGLE_API_KEY is valid and has access to Gemini or related APIs used by Aura.
- The vector store (ChromaDB) directory should be writable by the process; verify CHROMA_PERSIST_DIRECTORY exists or is creatable.
- If the MCP server fails to start, check that PORT and HOST do not conflict with other services and that Python 3.12+ is being used.
- Environment variable handling can differ between OS shells; prefer a .env file or export commands that match your environment.
- The AURA_MAX_OUTPUT_TOKENS and AURA_AUTONOMIC_MAX_OUTPUT_TOKENS variables set upper limits for response length; adjust them according to your resource constraints (see the sketch after this list).
- When integrating with external MCP clients, ensure MCP_SERVER_NAME and MCP_SERVER_VERSION align with your deployment configuration for discoverability.
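As noted in the token-limit tip above, these caps ultimately become part of the generation configuration passed to the Gemini SDK. The snippet below is a hedged sketch of that wiring using the google-generativeai package; Aura's internal implementation may differ.

# Sketch: applying the autonomic output-token cap to a Gemini model call.
# Variable names mirror the .env keys above; this is illustrative only.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

autonomic_limit = int(os.getenv("AURA_AUTONOMIC_MAX_OUTPUT_TOKENS", "100000"))

model = genai.GenerativeModel(
    model_name=os.getenv("AURA_AUTONOMIC_MODEL", "gemini-2.0-flash-lite"),
    generation_config=genai.GenerationConfig(max_output_tokens=autonomic_limit),
)
response = model.generate_content("Summarize the user's recent emotional state.")
print(response.text)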
Related MCP Servers
gemini-cli
An open-source AI agent that brings the power of Gemini directly into your terminal.
furi
CLI & API for MCP management
create-app
A CLI tool for quickly scaffolding Model Context Protocol (MCP) server applications with TypeScript support and modern development tooling
mcp-lite-dev
Companion project code for the group study of 《MCP极简开发》 ("Minimal MCP Development")
biznagafest
MCP Servers in Málaga, with flair
gemini-client
An MCP (Model Context Protocol) client that uses Google Gemini AI models for intelligent tool usage and conversation handling. Currently tested and working nicely with Claude Desktop as an MCP server. Based on untested AI-generated code written by a non-coder; use at your own risk.