zypher-agent
A minimal yet powerful framework for creating AI agents with full control over tools, providers, and execution flow.
claude mcp add --transport stdio corespeed-io-zypher-agent node dist/server.js \
  --env PORT="HTTP listen port for the MCP server" \
  --env LOG_LEVEL="logging level (e.g., info, debug)" \
  --env MODEL_API_KEY="API key for model provider (e.g., OpenAI, Claude)" \
  --env OAUTH_CLIENT_ID="OAuth client ID for MCP authentication" \
  --env OAUTH_TOKEN_URL="OAuth token URL" \
  --env MCP_REGISTRY_URL="Registry URL for MCP servers (if using a custom registry)" \
  --env OAUTH_CLIENT_SECRET="OAuth client secret"
How to use
Zypher Agent exposes an MCP-enabled runtime that can orchestrate AI tasks within your application. It ships with a flexible tool system for file operations, search, and terminal commands, and supports integrating external MCP servers via OAuth authentication. Use the agent to run task-oriented sessions in which it reasons autonomously and issues tool calls while you observe streaming events. The MCP integration lets the agent communicate with MCP servers to delegate reasoning steps, fetch model context, and persist checkpoints as tasks evolve. To get started, configure the MCP server in your environment, supply the necessary OAuth credentials, and reference the MCP server in the agent configuration so it can route model-context interactions accordingly. As you work, you can use the built-in tools and post-inference interceptors to customize behavior, log progress, and enforce safety checks during task execution.
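The mcp_config referenced in this README can be sketched as follows. The `mcpServers` layout mirrors common MCP client configuration files (e.g., Claude Desktop); the server name, entry point, and variable names come from the `claude mcp add` command above, but the values are placeholders and the exact schema depends on your client:

```json
{
  "mcpServers": {
    "corespeed-io-zypher-agent": {
      "command": "node",
      "args": ["dist/server.js"],
      "env": {
        "PORT": "3000",
        "LOG_LEVEL": "info",
        "MODEL_API_KEY": "<your model provider key>",
        "OAUTH_CLIENT_ID": "<oauth client id>",
        "OAUTH_CLIENT_SECRET": "<oauth client secret>",
        "OAUTH_TOKEN_URL": "https://auth.example.com/oauth/token",
        "MCP_REGISTRY_URL": "https://registry.example.com"
      }
    }
  }
}
```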
How to install
Prerequisites:
- Node.js (LTS) installed on your machine
- npm or yarn for package management
- Git for cloning repositories
- Credentials for OAuth and your chosen model provider

Installation steps:
- Install the MCP server package globally (example assumes npm package availability): `npm install -g @zypher/agent` (or `yarn global add @zypher/agent`)
- Alternatively, clone the repository and install dependencies to integrate it as a local module:
  git clone https://github.com/corespeed-io/zypher-agent.git
  cd zypher-agent
  npm install
- Build the project (if applicable): `npm run build`
- Run the MCP-enabled server (example): `node dist/server.js`
- Set up environment variables for OAuth and model providers as shown in the mcp_config example, and ensure the MCP server is accessible from your application.
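Before launching the server, the environment variables referenced in the mcp_config can be exported in your shell. The values below are placeholders for illustration, not defaults shipped with the project:

```shell
# Placeholder values -- substitute your own settings and credentials
export PORT=3000
export LOG_LEVEL=debug
export MODEL_API_KEY="your-model-api-key"
export OAUTH_CLIENT_ID="your-oauth-client-id"
export OAUTH_CLIENT_SECRET="your-oauth-client-secret"
export OAUTH_TOKEN_URL="https://auth.example.com/oauth/token"

# Fail fast if a required credential is missing (POSIX ${var:?} expansion)
: "${MODEL_API_KEY:?MODEL_API_KEY must be set}"
: "${OAUTH_CLIENT_ID:?OAUTH_CLIENT_ID must be set}"
: "${OAUTH_CLIENT_SECRET:?OAUTH_CLIENT_SECRET must be set}"
```

For production, keep these values in a secrets manager or an untracked .env file rather than in shell history or version control.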
Additional notes
Notes and tips:
- Ensure your OAuth credentials (OAUTH_CLIENT_ID and OAUTH_CLIENT_SECRET) are securely stored and not committed to version control.
- The MCP registry URL is only needed if you operate a custom MCP server registry; otherwise the public MCP registry applies by default.
- Adjust LOG_LEVEL to debug during development to capture detailed traces, and switch to info or warning in production.
- The agent relies on a tool system (file operations, search, terminal commands). Extend or replace tools to fit your environment.
- If you encounter port conflicts, change the PORT environment variable accordingly.
- For troubleshooting, verify that the dist/server.js path exists after your build step; if your entry point differs, update the mcp_config args accordingly.
- When using multiple MCP servers, ensure proper OAuth scopes and role-based access for each server to avoid authorization failures.
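The dist/server.js troubleshooting check above can be scripted. This is a small sketch assuming a POSIX shell; `check_entry` is a hypothetical helper name, not part of the project:

```shell
# check_entry: verify an MCP server entry point exists before launching it
check_entry() {
  if [ -f "$1" ]; then
    echo "entry point found: $1"
  else
    echo "missing: $1 -- run the build step or update the mcp_config args" >&2
    return 1
  fi
}

# Example: check_entry dist/server.js && node dist/server.js
```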
Related MCP Servers
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can serve as your openclaw alternative. ✨
SearChat
Search + Chat = SearChat (AI chat with search). An AI conversational search engine supporting DeepResearch, the OpenAI/Anthropic/VertexAI/Gemini APIs, the SearXNG metasearch engine, and one-click Docker deployment.
headroom
The Context Optimization Layer for LLM Applications
aser
Aser is a lightweight, self-assembling AI agent framework.
codemesh
The Self-Improving MCP Server - Agents write code to orchestrate multiple MCP servers with intelligent TypeScript execution and auto-augmentation
local-skills
Universal MCP server enabling any LLM or AI agent to utilize expert skills from your local filesystem. Reduces context consumption through lazy loading. Works with Claude, Cline, and any MCP-compatible client.