ssd-ai
MCP server
claude mcp add --transport stdio \
  --env PORT="8080" \
  --env MCP_LOG_LEVEL="info" \
  --env MEMORY_DB_PATH="path/to/memory.db" \
  ssdeanx-ssd-ai -- node server.js
How to use
SSD-AI is an MCP-compliant AI development assistant that exposes 36 specialized tools across memory management, semantic code analysis, code quality evaluation, project planning, sequential thinking, prompt engineering, browser automation, UI preview, time utilities, and long-running tasks. Natural-language keywords in English and Korean trigger tools such as save_memory, recall_memory, find_symbol, analyze_complexity, generate_prd, create_thinking_chain, enhance_prompt, monitor_console_logs, preview_ui_ascii, and get_current_time, letting you manage context, analyze and improve code, plan roadmaps, reason through problems, and automate debugging workflows. To use it, start the MCP server and interact with the tool catalog through natural-language prompts. The server routes each request to the appropriate tool based on the keywords or intents you specify and returns structured results that can be consumed by your development pipelines or chat interfaces.
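Under the hood, tool invocations travel as JSON-RPC 2.0 messages using the MCP `tools/call` method. A minimal sketch of building such a request in Node — the method and params shape come from the MCP specification, but the argument names (`key`, `value`) are illustrative, not SSD-AI's actual save_memory schema:

```javascript
// Minimal sketch of the JSON-RPC 2.0 message an MCP client sends to invoke a
// tool. "tools/call" and the params shape come from the MCP spec; the
// argument names ("key", "value") are assumptions, not SSD-AI's real schema.
function buildToolCall(id, toolName, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: toolName, arguments: args },
  };
}

const request = buildToolCall(1, "save_memory", {
  key: "project-context",
  value: "Refactoring the auth module this sprint",
});
console.log(JSON.stringify(request, null, 2));
```

In practice your MCP client library builds these messages for you; the sketch only shows what crosses the wire when a prompt keyword resolves to a tool.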
How to install
Prerequisites:
- Node.js (LTS) installed on the host machine
- npm (bundled with Node.js) or yarn
- Optional: Python (for Python-based tooling support)
Steps:
1. Install the MCP server package (example using npm): npm install -g @su-record/hi-ai
2. Prepare a local install directory and navigate there: mkdir -p ~/ssd-ai && cd ~/ssd-ai
3. Install dependencies (if you cloned or downloaded the source): npm install
4. Start the MCP server (assuming a server.js entry point at the project root): node server.js
5. Verify the server is running (default port 8080): curl http://localhost:8080/health
Notes:
- If you prefer running via npx, you can use: npx -y @su-record/hi-ai
- Environment variables can be used to tailor memory storage paths, log levels, and ports as needed.
Additional notes
Tips and common issues:
- Ensure Node.js version compatibility with the Hi-AI package to avoid peer dependency issues.
- For persistent memory across sessions, configure MEMORY_DB_PATH to a writable SQLite file or equivalent storage.
- If you enable long-running tasks, monitor task status via the MCP task endpoints (tasks/get, tasks/list, etc.).
- Use the language keywords in prompts to trigger the appropriate tools (e.g., find_symbol, analyze_complexity, generate_prd).
- When integrating with CI/CD, expose the MCP server port to your internal network and ensure health checks are configured.
- If you encounter performance bottlenecks, consider enabling the project caching and LRU strategies described in the feature set.
Environment variables (examples):
- PORT: Port number the server listens on
- MCP_LOG_LEVEL: debug, info, warn, error
- MEMORY_DB_PATH: Path to memory store (SQLite or similar)
- MAX_CONCURRENT_TASKS: Limit for concurrent tasks to prevent overload
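The variables above are typically read once at startup with safe fallbacks. A minimal sketch — the default values shown are illustrative assumptions, not SSD-AI's documented defaults:

```javascript
// Read server configuration from the environment with fallback defaults.
// The defaults here are illustrative; check the project docs for real values.
function loadConfig(env = process.env) {
  const logLevel = env.MCP_LOG_LEVEL ?? "info";
  const validLevels = ["debug", "info", "warn", "error"];
  if (!validLevels.includes(logLevel)) {
    throw new Error(`MCP_LOG_LEVEL must be one of: ${validLevels.join(", ")}`);
  }
  return {
    port: Number.parseInt(env.PORT ?? "8080", 10),
    logLevel,
    memoryDbPath: env.MEMORY_DB_PATH ?? "./memory.db",
    maxConcurrentTasks: Number.parseInt(env.MAX_CONCURRENT_TASKS ?? "4", 10),
  };
}
```

Validating MCP_LOG_LEVEL up front turns a typo into an immediate startup error instead of silently falling back to an unexpected verbosity.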
Related MCP Servers
- mcp: Official MCP Servers for AWS
- mcp-router: A Unified MCP Server Management App (MCP Manager).
- director: MCP Playbooks for AI agents
- MCP-Defender: Desktop app that automatically scans and blocks malicious MCP traffic in AI apps like Cursor, Claude, VS Code, and Windsurf.
- remote: Remote MCP Server that securely connects Enterprise context with your LLM, IDE, or agent platform of choice.
- cli: Fine-grained control over Model Context Protocol (MCP) clients, servers, and tools. Context is God.