
Smart MCP Server

The Smart MCP Server stands out from other tool-orchestration and AI workflow systems in a few meaningful ways. Context awareness: unlike many workflow engines or tool servers that require explicit user input to select and run tools, Smart MCP Server automatically analyzes user messages, historical activity, and project context, letting it choose and run the appropriate tools without explicit direction.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
```bash
claude mcp add --transport stdio reconsumeralization-smart-mcp-server node src/server.js \
  --env PORT="3000" \
  --env TLS_KEY_PATH="<path-to-key>" \
  --env TLS_CERT_PATH="<path-to-cert>" \
  --env GEMINI_API_KEY="<your-google-gemini-api-key>" \
  --env OPENAI_API_KEY="<your-openai-api-key>" \
  --env ANTHROPIC_API_KEY="<your-anthropic-api-key>"
```

How to use

Smart MCP Server is a feature-rich, context-aware MCP and A2A-compliant server designed to orchestrate AI workflows across multiple providers. It integrates Gemini, OpenAI, and Anthropic models under a unified interface, supports dynamic workflow loading from JSON definitions, and offers secure token management with encryption and automatic refresh. The server exposes a set of MCP and A2A endpoints for health checks, token management, workflow execution, and tool discovery, enabling agents to collaborate securely and efficiently. You can generate and manage MCP/A2A tokens via the built-in CLI, start the server to begin processing tasks, and monitor workflows in real time with logging and metrics.

How to install

Prerequisites:
- Node.js v18.0.0 or higher
- npm v7.0.0 or higher
- Google Gemini API Key (for AI features)

Steps:

1. Clone the repository and install dependencies:

```bash
git clone https://github.com/reconsumeralization/smart-mcp-server.git
cd smart-mcp-server
npm install
```

2. Generate MCP/A2A compliant tokens (optional but recommended for first run):

```bash
npm run token:generate
```

3. Configure the environment: create a .env file or export the variables directly:

```bash
export GEMINI_API_KEY=<your-gemini-api-key>
export OPENAI_API_KEY=<your-openai-api-key>
export ANTHROPIC_API_KEY=<your-anthropic-api-key>
export PORT=3000
```

4. Start the server:

```bash
npm start
```

The server will begin listening on the configured port (default 3000).
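If you prefer a .env file over exported variables, the equivalent configuration might look like the sketch below. The variable names match those in the install command above; all values are placeholders to substitute with your own keys and paths.

```bash
# .env — example only; substitute your own keys and paths
PORT=3000
GEMINI_API_KEY=<your-gemini-api-key>
OPENAI_API_KEY=<your-openai-api-key>
ANTHROPIC_API_KEY=<your-anthropic-api-key>
# Optional, only if serving over TLS:
TLS_KEY_PATH=<path-to-key>
TLS_CERT_PATH=<path-to-cert>
```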

Additional notes

Tips and considerations:
- Ensure your Gemini/OpenAI/Anthropic API keys are valid and have the required permissions for the tasks you intend to run.
- The server supports dynamic workflow definitions loaded from JSON files; place them under the configured workflows directory (per project conventions) for automatic discovery.
- Tokens are stored securely with AES-256-CBC encryption; keep your encryption keys secure and rotate tokens periodically.
- If you modify environment variables related to TLS, ensure your certificates and keys are correctly mounted and accessible by the server.
- For debugging, inspect the /health and /api/logs endpoints and enable verbose logging in the configuration.
- When running in production, consider containerization (Docker) or a process manager (PM2, systemd) to ensure uptime and automatic restarts.
