servagent
Servagent is an MCP (Model Context Protocol) server that enables a remote AI to take full control of a Linux server: command execution, file management, service administration, and more.
claude mcp add --transport stdio servagent-servagent python -m servagent \
  --env SERVAGENT_API_KEY=<your-api-key>

(SERVAGENT_API_KEY is the API key used for Bearer token authentication; required in production.)
How to use
Servagent is an MCP server that grants a remote AI controlled access to a Linux host, enabling command execution, file management, service control, and more through a defined set of MCP tools. The server exposes tools like execute_command for running shell commands, read_file/write_file/edit_file for file operations, service_action for systemd service management, and tail_file for streaming logs or journal output. Clients connect over two transports (Streamable HTTP at /mcp and SSE at /sse) and can upload data via POST /upload. Together these enable remote automation and debugging workflows, with tool-specific annotations that guide AI behavior and enforce safety constraints. To operate securely in production, configure an API key, TLS, and optional OAuth settings as described in the configuration options.
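As a sketch of what a tool invocation looks like on the wire, the following builds an MCP tools/call request using the standard JSON-RPC 2.0 framing. The "command" argument name for execute_command is an assumption for illustration; consult servagent's tool schema for the real parameter names.

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical invocation of the execute_command tool
payload = make_tool_call(1, "execute_command", {"command": "uptime"})
print(payload)
```

A client would send this payload to the /mcp endpoint (Streamable HTTP) with an Authorization: Bearer header carrying the API key.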
How to install
Prerequisites:
- Linux (Ubuntu/Debian, RHEL/CentOS, etc.)
- Python 3.10 or newer
- Root or sudo privileges for installation as a service
Installation steps (recommended):
- Install from source via the GitHub repository:
  - Ensure git is installed
  - Clone the repository: git clone https://github.com/Servagent/servagent.git
  - Navigate into the repo: cd servagent
- Create and activate a Python virtual environment, then install in editable mode:
  python3 -m venv .venv
  source .venv/bin/activate
  pip install -e .
- Configure environment variables:
  - Copy the example env file and edit it as needed:
    cp .env.example .env
  - Edit .env to set SERVAGENT_API_KEY and other options as needed
  - Ensure SERVAGENT_API_KEY is set in the environment or in .env
- Run the server (development): servagent
- Production setup (recommended):
  - Install as a systemd service using the provided install script, or follow the project's deployment guide
  - If using TLS, obtain certificates and configure SERVAGENT_TLS_CERTFILE and SERVAGENT_TLS_KEYFILE appropriately
  - Start and enable the service: systemctl enable --now servagent
- Basic usage after start:
  - Check status: servagent status
  - View logs: sudo journalctl -u servagent -f
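If you install the systemd service by hand instead of using the provided install script, a unit file along these lines is typical. The paths below assume an install under /opt/servagent with the virtualenv created during installation; adjust them to match your actual layout.

```
# /etc/systemd/system/servagent.service (illustrative sketch)
[Unit]
Description=Servagent MCP server
After=network-online.target
Wants=network-online.target

[Service]
# Assumed install prefix and virtualenv location
WorkingDirectory=/opt/servagent
EnvironmentFile=/opt/servagent/.env
ExecStart=/opt/servagent/.venv/bin/servagent
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

After placing the file, run systemctl daemon-reload before systemctl enable --now servagent.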
Notes:
- The server exposes the following endpoints once running: /mcp (Streamable HTTP), /sse (SSE), and /upload (for file uploads).
- You can adjust which tools are exposed via the SERVAGENT_TOOLS environment variable or the .env file.
- If using OAuth or TLS, ensure domain DNS and firewall rules allow external access on port 443 (TLS) or the configured port.
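Pulling together the configuration options mentioned above, a minimal .env might look like the following. Variable names are taken from this document; all values are placeholders to adapt for your deployment.

```
# Required in production: API key for Bearer token authentication
SERVAGENT_API_KEY=<your-api-key>

# Optional: restrict which MCP tools are exposed
#SERVAGENT_TOOLS=execute_command,read_file,tail_file

# Optional TLS (see production setup)
#SERVAGENT_TLS_CERTFILE=/etc/ssl/certs/servagent.pem
#SERVAGENT_TLS_KEYFILE=/etc/ssl/private/servagent.key

# Optional resource limits (illustrative values)
#SERVAGENT_COMMAND_TIMEOUT=60
#SERVAGENT_MAX_OUTPUT_SIZE=1048576
```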
Additional notes
Tips and common considerations:
- Security: Always use TLS in production and protect API keys. The API key is masked in status outputs; treat it as a sensitive credential.
- Resource limits: Configure SERVAGENT_COMMAND_TIMEOUT and SERVAGENT_MAX_OUTPUT_SIZE to avoid runaway commands or excessive logs.
- OAuth setup: If you enable OAuth (SERVAGENT_OAUTH_ISSUER_URL and related vars), ensure the issuer URL is accessible and the client credentials are correctly configured.
- Backups: Regularly back up the /opt/servagent data and the OAuth database (SERVAGENT_OAUTH_DB_PATH).
- Updates: Use servagent update to fetch the latest changes and restart the service; monitor for breaking changes in new releases.
- Troubleshooting: If the service fails to start, check systemd status, review journalctl logs, and verify Python virtualenv is active in the running environment.
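To see why the resource-limit settings above matter, here is a hedged sketch of how a command runner might enforce a wall-clock timeout and cap captured output. This illustrates the concept behind SERVAGENT_COMMAND_TIMEOUT and SERVAGENT_MAX_OUTPUT_SIZE; it is not servagent's actual implementation.

```python
import subprocess

def run_limited(command: str, timeout: int = 60, max_output: int = 4096) -> str:
    """Run a shell command with a timeout and truncate oversized output.

    timeout plays the role of SERVAGENT_COMMAND_TIMEOUT (seconds);
    max_output plays the role of SERVAGENT_MAX_OUTPUT_SIZE (bytes of stdout kept).
    """
    # Raises subprocess.TimeoutExpired if the command exceeds the timeout
    result = subprocess.run(
        command, shell=True, capture_output=True, text=True, timeout=timeout
    )
    output = result.stdout
    if len(output) > max_output:
        # Cap runaway output instead of returning it all to the client
        output = output[:max_output] + "\n[output truncated]"
    return output

print(run_limited("echo hello"))
```

Without such limits, a long-running or chatty command would tie up the server and flood the MCP client with output.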