architect
A powerful, self-extending MCP server for dynamic AI tool orchestration. Features sandboxed JS execution, capability-based security, automated rate limiting, marketplace integration, and a built-in monitoring dashboard. Built for the Model Context Protocol (MCP).
```shell
claude mcp add --transport stdio ageborn-dev-architect-mcp-server node server.js \
  --env PORT="3001" \
  --env DB_PATH="sqlite://./data/architect.db" \
  --env LOG_LEVEL="info"
```
How to use
Architect MCP is an AI agent workshop that lets your agents dynamically create and deploy their own tools within a secure sandbox. The server provides a built-in workflow where agents can write JavaScript tools, request user-approved permissions, and execute those tools in isolation.

Key features include a real-time web dashboard (accessible on port 3001 by default) for managing active tools, execution logs, failures, and secrets, as well as a persistent SQLite storage layer for tools, run logs, and execution state. Agents can also schedule tools to run on cron-like intervals, and browse or publish tools to a global marketplace using a GitHub token.

This makes Architect a flexible, extensible platform where your agents can go from problem-solving to tool-building in a secure, auditable environment. To get started, run the server locally or via Docker, then open http://localhost:3001 to monitor and manage tools as your agents create and execute them.
How to install
Prerequisites:
- Node.js (recommended LTS) and npm installed on your system
- Docker and Docker Compose installed if you prefer the Docker setup
Install and run locally:
- Clone the repository
- Install dependencies: `npm install`
- Build (if applicable): `npm run build`
- Start the server: `npm start`
- Access the dashboard at http://localhost:3001
Docker setup (recommended for isolation):
- Ensure Docker and Docker Compose are installed
- From the project directory, start the stack: `docker compose up -d`
- The dashboard will be available at http://localhost:3001, and data/tools will persist in Docker volumes
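The compose file itself is not shown in this README; a minimal sketch of what a `docker-compose.yml` might look like, assuming the repository ships a Dockerfile (the service name, volume name, and paths are illustrative):

```yaml
services:
  architect:
    build: .
    ports:
      - "3001:3001"
    environment:
      PORT: "3001"
      DB_PATH: "sqlite://./data/architect.db"
      LOG_LEVEL: "info"
    volumes:
      - architect-data:/app/data

volumes:
  architect-data:
```

Mapping the SQLite data directory to a named volume is what keeps tools and run logs across container restarts.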
Prereq notes:
- The server expects a Node.js environment and will run tools inside a sandboxed context.
- If you modify environment-sensitive settings (ports, database paths, or secrets), update your environment variables accordingly.
Additional notes
Tips and common considerations:
- The built-in sandbox ensures tools run with strict permission checks. You must explicitly approve network or filesystem access for each tool before execution.
- The dashboard on port 3001 lets you review tool requests, grant granular permissions, and monitor real-time logs.
- Tools and runs are stored in a SQLite-backed storage layer; for production, consider a dedicated backing store or persistent Docker volumes.
- You can leverage the Global Marketplace to browse, install, or publish tools using a GitHub token.
- Cron scheduling allows agents to run tools on schedules or as part of continuous pipelines.
- If you encounter port conflicts, adjust the PORT environment variable or Docker Compose configuration.
- Environment variables you might configure include PORT, DB_PATH, and LOG_LEVEL for debugging.
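For example, the environment variables above can be overridden in the shell before starting the server (the values here are illustrative, not required settings):

```shell
# Override defaults before `npm start` or `node server.js`
export PORT=4001                             # avoid a conflict on 3001
export DB_PATH="sqlite://./data/architect.db"
export LOG_LEVEL=debug                       # more verbose logs while debugging
echo "Dashboard will be served on port $PORT"
```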
Related MCP Servers
mcp-gemini
This project provides a dedicated MCP (Model Context Protocol) server that wraps the @google/genai SDK. It exposes Google's Gemini model capabilities as standard MCP tools, allowing other LLMs (like Cline) or MCP-compatible systems to leverage Gemini's features as a backend workhorse.
vscode-context
MCP Server to Connect with VS Code IDE
gridctl
🧪 Local Stack for testing Agents
Email MCP server with full IMAP + SMTP support — read, search, send, manage, and organize email from any AI assistant via the Model Context Protocol
LinkedIn-Posts-Hunter
LinkedIn Posts Hunter MCP is a Model Context Protocol (MCP) server that provides tools for automating LinkedIn job post search and management through your AI assistant (Claude Desktop, Cursor, or other MCP-compatible clients).
RLM-Memory
A Model Context Protocol (MCP) server that provides AI agents with persistent memory and semantic file discovery.