brain-ecosystem
Brain Ecosystem — Self-learning MCP servers for Claude Code (monorepo)
claude mcp add --transport stdio timmeck-brain-ecosystem npx -y @timmeck/brain \
  --env BRAIN_HOST="localhost" \
  --env BRAIN_PORT="7777"
How to use
Brain Ecosystem is an autonomous AI research system composed of multiple specialized brains connected through a Hebbian synapse network. It exposes 72+ MCP tools and supports multi-provider LLMs, live market data, social feeds, and web research, all orchestrated through a central MCP-based daemon. Once started, you can interact with the system via the built-in Intelligence CLI and the Intelligence Dashboard, monitor tool usage and performance, and leverage inter-brain communication for consensus decisions and self-improvement. The MCP layer exposes HTTP endpoints with Server-Sent Events (SSE) transport, enabling real-time updates from individual brains and agents.
To get started, install the Brain package and initialize the MCP configuration with brain setup (as described in the repository). The system includes specialized brains such as trading-brain and marketing-brain, which can be installed and initialized to extend capabilities. You can query and orchestrate via the Intelligence CLI (e.g., brain intel, brain intel rag, brain intel knowledge) and use the provided endpoints to integrate with external tools, data sources, and plugins. The platform is designed for self-improvement: it observes its own behavior, runs experiments, and can modify its own source code when improvements are found, all while maintaining an auditable trail of changes.
If you run multiple brains, you can configure separate MCP endpoints and connect them through a unified configuration object, enabling multi-brain consensus decisions and cross-brain learning. Use the setup to enable persistent memory, RAG vector search, and the Knowledge Graph to keep track of insights, errors, and solutions across sessions.
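As a concrete illustration, a unified configuration for two brains in a Claude Code .mcp.json file might look like the sketch below. The mcpServers shape follows Claude Code's standard MCP configuration; the trading-brain entry and the 7778 port are assumptions for illustration, since only the 7777 default is documented here.

```json
{
  "mcpServers": {
    "timmeck-brain-ecosystem": {
      "command": "npx",
      "args": ["-y", "@timmeck/brain"],
      "env": { "BRAIN_HOST": "localhost", "BRAIN_PORT": "7777" }
    },
    "timmeck-trading-brain": {
      "command": "npx",
      "args": ["-y", "@timmeck/trading-brain"],
      "env": { "BRAIN_HOST": "localhost", "BRAIN_PORT": "7778" }
    }
  }
}
```

Giving each brain a distinct port keeps the SSE endpoints from colliding when both servers run on the same host.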
How to install
Prerequisites:
- Node.js and npm installed on the host system
- Internet access to fetch the brain MCP package
Step-by-step:
- Install Node.js if not already installed (visit https://nodejs.org/ and follow the installer for your OS).
- Install the Brain MCP package globally with npm, or run it on demand with npx:
    npm install -g @timmeck/brain
- Initialize the MCP configuration for Brain Ecosystem. Depending on your setup, you can start the MCP server with the recommended command (as shown in the mcp_config above):
    npx -y @timmeck/brain
- Start and verify the server. Ensure the port (default 7777) is accessible and that the SSE endpoint is reachable (e.g., http://localhost:7777/sse).
- (Optional) To run additional brains, install them (e.g., trading-brain, marketing-brain) and configure them in the same MCP environment:
    npm install -g @timmeck/trading-brain
    trading setup
    npm install -g @timmeck/marketing-brain
    marketing setup
- Connect to the Intelligence Dashboard or use the Intelligence CLI to manage and observe the system:
    brain intel
    brain intel knowledge
    brain intel rag <query>
- Consult the documentation for environment variables and plugin options to tailor the MCP setup to your needs.
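One way to verify the server is reachable is to build the SSE URL from the same environment variables used during setup and probe it. This is a minimal sketch assuming the documented defaults (localhost, port 7777); the curl invocation is a generic example, not a Brain-specific command.

```shell
# Derive the SSE endpoint from the configured host/port (defaults per this README).
BRAIN_HOST="${BRAIN_HOST:-localhost}"
BRAIN_PORT="${BRAIN_PORT:-7777}"
SSE_URL="http://${BRAIN_HOST}:${BRAIN_PORT}/sse"
echo "SSE endpoint: ${SSE_URL}"

# With the server running, stream a few events to confirm it responds
# (requires curl; uncomment once the server is up):
# curl -N -s "${SSE_URL}" | head -n 5
```

If the probe hangs or the connection is refused, check the firewall and port notes under Additional notes below.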
Additional notes
Tips and common issues:
- Ensure the host firewall allows the configured port (default 7777) for SSE connections.
- When running multiple brains, provide distinct ports or host mappings to avoid conflicts.
- If the MCP server fails to start, check Node.js version compatibility and ensure dependencies are installed correctly with npm.
- Environment variables such as BRAIN_PORT and BRAIN_HOST can be adjusted to fit containerized or cloud deployments. Consider setting BRAIN_LOG_LEVEL for verbose logs during troubleshooting.
- For production deployments, consider running the MCP server behind a reverse proxy with TLS and enabling authentication for the endpoints.
- If you rely on external data sources (markets, social feeds, web search), ensure API keys and access credentials are configured in the environment or via a secure config mechanism.
- The system supports hot-swapping brains; after installation, you can add or remove brains without major downtime, but always validate consistency in the Knowledge Graph and synapse network after changes.
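For containerized or cloud deployments, the variables above can be set in the process environment before launching the server. A minimal sketch, using only the variable names mentioned in this README (the 0.0.0.0 bind value is an illustrative container default, not a documented setting):

```shell
# Illustrative environment for a containerized deployment.
export BRAIN_HOST="0.0.0.0"    # bind address inside the container (example value)
export BRAIN_PORT="7777"       # default port from this README
export BRAIN_LOG_LEVEL="debug" # verbose logging while troubleshooting (optional)

echo "Brain MCP will listen on ${BRAIN_HOST}:${BRAIN_PORT}"

# Launch the server with this environment (uncomment to run):
# npx -y @timmeck/brain
```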
Related MCP Servers
task-orchestrator
A light touch MCP task orchestration server for AI agents. Persistent work tracking and context storage across sessions and agents. Defines planning floors through composable notes with optional gating transitions. Coordinates multi-agent execution without prescribing how agents do their work.
ollama
An MCP Server for Ollama
Pare
Dev tools, optimized for agents. Structured, token-efficient MCP servers for git, test runners, npm, Docker, and more.
github-to
Convert GitHub repositories to MCP servers automatically. Extract tools from OpenAPI, GraphQL & REST APIs for Claude Desktop, Cursor, Windsurf, Cline & VS Code. AI-powered code generation creates type-safe TypeScript/Python MCP servers. Zero config setup - just paste a repo URL. Built for AI assistants & LLM tool integration.
photon
Build MCP servers from single TypeScript files. One file becomes an MCP server, CLI tool, and web UI — automatically.
openapi-swagger
Solve AI context window limits for API docs | Convert any Swagger/OpenAPI to searchable MCP server | AI-powered endpoint discovery & code generation | Works with Cursor, Claude, VS Code