Autono
A ReAct-Based Highly Robust Autonomous Agent Framework.
claude mcp add --transport stdio vortezwohl-autono python -m autono \
  --env OPENAI_API_KEY="your-openai-api-key"
How to use
Autono is a highly robust autonomous agent framework built on the ReAct paradigm. It lets you instantiate an agent with customizable abilities, expose those abilities for execution, and drive decision-making through a thought-driven action loop. The MCP-compatible Autono server exposes tools to create and configure agents, set their personalities (such as PRUDENT or INQUISITIVE), and integrate additional external tools via modular abilities. Core concepts include the Agent class, the Personality enum, the get_openai_model helper for selecting a thought engine, and decorators like @ability and @agentic for declaring actions and capabilities. With Autono, you can orchestrate multi-step reasoning, memory sharing across agents, and dynamic tool usage during task execution, all under the MCP protocol's standardized interface.
Once running, you can connect to the MCP server to instantiate an Autono agent, assign abilities (e.g., calculators, file writers, or domain-specific tools), and control its behavior through personality tuning. Typical workflows involve defining lightweight Python functions as abilities, decorating them, and then creating an Agent that aggregates these abilities. The framework then plans a sequence of actions, calls the appropriate tools, and updates its strategy based on outcomes, supporting robust multi-step tasks and recovery from tool failures when needed.
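To make the plan-act-observe cycle concrete, here is a deliberately simplified, hypothetical sketch of the kind of loop the ReAct paradigm describes. This is not Autono's internal implementation; the `react_loop` function, the toy abilities, and the hard-coded plan are all illustrative stand-ins for what the framework's thought engine automates.

```python
# Hypothetical sketch of a ReAct-style plan-act-observe loop.
# This is NOT Autono's internal implementation -- only an illustration
# of the control flow the framework automates for you.

def react_loop(task, abilities, plan):
    """Run a plan of (ability_name, args) steps, collecting observations."""
    observations = []
    for ability_name, args in plan:
        action = abilities[ability_name]             # "act": pick the declared tool
        result = action(*args)                       # execute the ability
        observations.append((ability_name, result))  # "observe" the outcome
    return observations

# Two toy abilities standing in for decorated @ability functions.
abilities = {
    "double": lambda x: x * 2,
    "describe": lambda x: f"value is {x}",
}

# A fixed "plan"; Autono's thought engine would generate and revise
# such a plan dynamically based on observations.
plan = [("double", (21,)), ("describe", (42,))]
print(react_loop("demo task", abilities, plan))
```

In the real framework, the plan is not fixed: the thought engine re-plans after each observation, which is what enables recovery from tool failures.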
How to install
Prerequisites:
- Python 3.8+ installed on your system
- Access to a compatible OpenAI API key (or another supported model provider)
- Network access to call the OpenAI API
Step 1: Install the Autono package
- From PyPI (stable):
pip install -U autono
- From GitHub (latest unreleased features):
pip install git+https://github.com/vortezwohl/Autono.git
Step 2: Set up environment variables
- Configure your OpenAI API key in a .env file:
# .env
OPENAI_API_KEY=sk-...
- Or export it directly in the current session (example for UNIX-like shells):
export OPENAI_API_KEY=sk-...
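If you want your own launcher script to fail fast with a clear message when the key is missing, a small check like the following works. The `require_api_key` helper is hypothetical (not part of Autono); it only wraps a standard environment lookup.

```python
import os

# Hypothetical helper (not part of Autono) that fails fast with a clear
# message when the API key is missing from the environment.
def require_api_key(var="OPENAI_API_KEY"):
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(
            f"{var} is not set; export it or add it to your .env file "
            "before starting the MCP server."
        )
    return key
```

Running this check before `python -m autono` turns a confusing downstream authentication error into an immediate, actionable one.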
Step 3: Run the MCP server for Autono
- With the environment configured (or after registering the server via the mcp_config above), start the server with Python:
python -m autono
Step 4: Quickstart sanity checks
- You can verify the installation by importing and creating a simple Agent in a Python script, e.g.:

from autono import Agent, Personality, get_openai_model, ability

@ability
def dummy(x: int) -> int:
    return x * 2

model = get_openai_model()
agent = Agent(abilities=[dummy], brain=model, name='Autono', personality=Personality.INQUISITIVE)
Additional notes
Tips and common issues:
- Ensure OPENAI_API_KEY is available in the environment where the MCP server runs.
- If you encounter rate limits or authentication errors, verify your OpenAI account permissions and API usage quotas.
- The Autono framework supports external tool integration via modular abilities; you can expand action spaces by adding new decorated functions and attaching them to the Agent.
- For multi-agent collaboration or memory sharing, extend agents with shared-memory or inter-agent communication tools that fit within MCP protocol flows.
- If the server doesn’t respond, check network connectivity, Python environment, and that the correct module path is used when starting the server.
- When upgrading Autono, review API changes between versions, as decorators and utility functions may evolve.
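Since expanding an agent's action space comes down to adding decorated functions, the decorator-registry pattern behind that idea is worth seeing in isolation. The sketch below is hypothetical: Autono's real @ability decorator does more (e.g. exposing signatures and docstrings to the thought engine), while this version only shows the registration mechanics.

```python
# Hypothetical sketch of a decorator-based ability registry, illustrating
# the pattern behind @ability. Autono's real decorator does more (schema
# extraction for the thought engine, etc.); this shows registration only.

ABILITIES = {}

def ability(fn):
    """Register a plain function as a named, callable ability."""
    ABILITIES[fn.__name__] = fn
    return fn

@ability
def word_count(text: str) -> int:
    """Count whitespace-separated words."""
    return len(text.split())

@ability
def shout(text: str) -> str:
    """Upper-case the input text."""
    return text.upper()

# The agent's "action space" is simply the registry's contents.
print(sorted(ABILITIES))
```

Because registration happens at import time, dropping a new decorated function into your module is enough to make it available to the agent that aggregates the registry.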
Related MCP Servers
magic
Super Magic. The first open-source all-in-one AI productivity platform (Generalist AI Agent + Workflow Engine + IM + Online collaborative office system)
sre
The SmythOS Runtime Environment (SRE) is an open-source, cloud-native runtime for agentic AI. Secure, modular, and production-ready, it lets developers build, run, and manage intelligent agents across local, cloud, and edge environments.
better-chatbot
Just a Better Chatbot. Powered by Agent & MCP & Workflows.
headroom
The Context Optimization Layer for LLM Applications
agentql
Model Context Protocol server that integrates AgentQL's data extraction capabilities.
mcp
🤖 Taskade MCP · Official MCP server and OpenAPI to MCP codegen. Build AI agent tools from any OpenAPI API and connect to Claude, Cursor, and more.