onit
OnIt is an AI agent framework for automation.
claude mcp add --transport sse sibyl-oracles-onit http://127.0.0.1:18201/sse
How to use
OnIt is an AI agent framework built on MCP (Model Context Protocol) that orchestrates task automation and assistance through modular MCP servers. It auto-discovers tools and templates from its bundled servers: PromptsMCPServer serves instruction templates that steer the model's behavior, and ToolsMCPServer exposes web search, bash command execution, file operations, and document tools. Multi-agent communication is handled through the A2A protocol. To get started, install OnIt and run the framework; the MCP servers start automatically and expose interfaces for instruction templates and tooling. You can then interact with OnIt in a terminal chat, through the Gradio web UI, or via A2A gateways and external MCP servers as needed. The system routes each task to the appropriate tooling over MCP, letting you combine research, file handling, and prompt generation in a single cohesive workflow.
How to install
Prerequisites:
- Python 3.8+ and pip
- Optional: a supported LLM host (OpenRouter, vLLM, or similar) for OnIt to connect to
Installation steps:
- Install OnIt from PyPI or from source
  - Install from PyPI: pip install onit==0.1.3c
  - Install from source:
    git clone https://github.com/sibyl-oracles/onit.git
    cd onit
    pip install -e ".[all]" --upgrade
- Configure environment (example):
  - Set the LLM host (private vLLM or OpenRouter): export ONIT_HOST=http://localhost:8000/v1
  - Optional API keys for built-in tools:
    export OLLAMA_API_KEY=your_key
    export OPENWEATHER_API_KEY=your_key
- Run OnIt: onit
- Optional interfaces:
- Web UI: onit --web
- A2A gateway: onit --gateway
- A2A server: onit --a2a --a2a-port 9001
Note: The MCP servers PromptsMCPServer and ToolsMCPServer are started automatically when OnIt runs and expose their endpoints via SSE at ports 18200 and 18201 respectively (adjust as needed in config).
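As a quick sanity check after starting OnIt, the sketch below probes the default SSE ports from the note above using bash's /dev/tcp redirection. This script is not part of OnIt; it only tests whether the ports are accepting connections:

```shell
# Probe the default MCP SSE ports (18200 = PromptsMCPServer, 18201 = ToolsMCPServer).
# Adjust the port list if you changed them in the config.
for port in 18200 18201; do
  if (exec 3<>"/dev/tcp/127.0.0.1/${port}") 2>/dev/null; then
    echo "port ${port}: open"
  else
    echo "port ${port}: closed"
  fi
done
```

If a port reports closed while OnIt is running, check the port settings in your config and any local firewall rules.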
Additional notes
Environment variables and configuration options can be overridden via CLI flags or YAML config. Common issues:
- Missing API keys for OpenRouter when using it as the LLM host.
- Network restrictions blocking access to the A2A or MCP SSE endpoints.
To customize prompts or tools, modify the mcp.servers section of the YAML config or point OnIt to a different external MCP server via --mcp-sse. The A2A feature allows remote task delegation between agents; ensure the a2a-port is accessible if you plan to use client mode. For the web UI, ensure port 9000 is open if you enable --web.
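To illustrate the mcp.servers customization mentioned above, a config override might look like the following. The key names here are assumptions for illustration only, not OnIt's documented schema; only the ports and the --mcp-sse behavior come from this README:

```yaml
# Hypothetical layout -- key names are illustrative, not OnIt's documented schema.
mcp:
  servers:
    PromptsMCPServer:
      port: 18200   # SSE endpoint for instruction templates
    ToolsMCPServer:
      port: 18201   # SSE endpoint for web search, bash, files, documents
    external:
      - url: http://remote-host:9100/sse   # same effect as passing --mcp-sse
```

Check your installed version's config reference for the actual field names before relying on this shape.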
Related MCP Servers
skillz
An MCP server for loading skills (shim for non-claude clients).
agentor
Fastest way to build and deploy reliable AI agents, MCP tools, and agent-to-agent workflows in a production-ready serverless environment.
awsome_kali_MCPServers
A set of MCP servers tailored for Kali Linux.
agenite
🤖 Build powerful AI agents with TypeScript. Agenite makes it easy to create, compose, and control AI agents with first-class support for tools, streaming, and multi-agent architectures. Switch seamlessly between providers like OpenAI, Anthropic, AWS Bedrock, and Ollama.
packt-netops-ai-workshop
🔧 Build Intelligent Networks with AI
mcp-document-converter
MCP Document Converter - A powerful MCP tool for converting documents between multiple formats, enabling AI agents to easily transform documents.