cronicorn
Hosted HTTP job scheduler — AI reads response bodies and adapts frequency based on natural language descriptions
claude mcp add --transport stdio weskerllc-cronicorn node dist/server.js \
  --env PORT="3000" \
  --env CRONICORN_DB_URL="postgres://user:pass@host:5432/dbname" \
  --env CRONICORN_API_KEY="your-api-key" \
  --env CRONICORN_BASE_URL="https://cronicorn.example.com"
How to use
Cronicorn is an HTTP-based MCP server that lets you add endpoints and describe what matters in plain English. AI monitors the response data, adapts polling frequency, coordinates related endpoints (siblings), and performs automatic recovery actions when failures occur. You define simple HTTP jobs and baseline schedules, and the system optimizes timing and resilience without heavy configuration.
Traditional cron or interval scheduling remains available as a fallback, and the server exposes a REST API and a web UI for managing jobs and viewing status. Claude, Cursor, or any other MCP-compatible client can manage and observe your jobs from your preferred toolchain. The core workflow is to create endpoints, describe success criteria and coordination logic in natural language, and let the AI adjust behavior within the safety constraints you set.
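As a sketch, creating a monitored endpoint through the REST API might look like the following. The /api/jobs route and the field names in the payload are assumptions for illustration, not confirmed details of Cronicorn's API.

```shell
# Hypothetical job definition; route and field names are assumptions,
# not documented Cronicorn API details.
PAYLOAD='{"url":"https://api.example.com/orders/health","description":"Check every 5 minutes; slow down overnight when traffic is low","baselineInterval":"5m"}'
echo "$PAYLOAD"

# Then submit it to the server (uncomment once the server is running):
# curl -X POST "$CRONICORN_BASE_URL/api/jobs" \
#   -H "Authorization: Bearer $CRONICORN_API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```

The natural-language description field is where you tell the AI what "matters" for this endpoint; the scheduler adapts polling around that intent.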
How to install
Prerequisites:
- Node.js (LTS) and npm installed on your machine
- Git (optional, for cloning the repository)
Installation steps:
- Clone the repository (or download the release):
  git clone https://github.com/weskerllc/cronicorn.git
  cd cronicorn
- Install dependencies:
  npm ci
- Build the project (if required by the repo):
  npm run build
- Configure environment variables (see the environment variable notes below) and start the server:
  export PORT=3000
  export CRONICORN_API_KEY=your-api-key
  export CRONICORN_DB_URL=postgres://user:pass@host:5432/dbname
  export CRONICORN_BASE_URL=https://cronicorn.example.com
  npm run start
- Verify the server is running by hitting the API reference or UI:
  - API: http://localhost:3000/api/docs
  - Web UI: http://localhost:3000/
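The verification step above can be wrapped in a small readiness probe; the default URL below assumes the PORT=3000 setting from the previous step.

```shell
# Probe a running cronicorn instance; prints "up" or "down".
# The /api/docs path mirrors the verification URL above.
check_cronicorn() {
  url="${1:-http://localhost:3000}/api/docs"
  if curl -fs -o /dev/null --max-time 5 "$url"; then
    echo "up"
  else
    echo "down"
  fi
}

check_cronicorn "http://localhost:3000"
```

A probe like this is also a convenient liveness check for a process manager or container orchestrator.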
Note: If you are deploying in production, consider using a process manager like pm2 or systemd and place your environment variables in a secure config file or secret management system.
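For the systemd route, a unit file might look like the sketch below; the paths, user, and EnvironmentFile location are assumptions to adapt to your host.

```ini
# /etc/systemd/system/cronicorn.service (paths are illustrative)
[Unit]
Description=Cronicorn MCP server
After=network-online.target

[Service]
WorkingDirectory=/opt/cronicorn
EnvironmentFile=/etc/cronicorn/env
ExecStart=/usr/bin/npm run start
Restart=on-failure
User=cronicorn

[Install]
WantedBy=multi-user.target
```

Keeping credentials in an EnvironmentFile readable only by the service user avoids baking secrets into the unit file itself.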
Additional notes
- Environment variables: keep API keys and database credentials secure. Use a secrets manager in production.
- If you enable AI-based adaptation, monitor usage quotas and latency, as AI services may incur costs and impose rate limits.
- Safety constraints: configure minimum/maximum intervals and TTL-based hints to avoid runaway polling or stale data.
- For multi-endpoint coordination, define sibling relationships clearly in your endpoint descriptions to ensure proper orchestration.
- If the self-hosted option is used, ensure the hosting environment has network access to any external AI services or APIs required by the MCP server.
- Common issues: incorrect API keys, invalid database connections, port conflicts, and insufficient permissions for the process to bind to the chosen PORT.
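For the safety constraints mentioned above, a per-endpoint configuration might look like this sketch; the field names are hypothetical, not Cronicorn's documented schema.

```json
{
  "minInterval": "30s",
  "maxInterval": "1h",
  "hintTtl": "10m"
}
```

Bounding the interval on both ends keeps AI-driven adaptation from polling too aggressively or letting data go stale, and a TTL on hints ensures outdated guidance expires rather than steering the schedule indefinitely.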
Related MCP Servers
koog
Koog is the official Kotlin framework for building predictable, fault-tolerant, and enterprise-ready AI agents across all platforms, from backend services to Android, iOS, the JVM, and even in-browser environments. Koog draws on our AI product expertise and provides proven solutions for complex LLM and AI problems.
lets-learn-python
MCP Python Tutorial
station
Station is our open-source runtime that lets teams deploy agents on their own infrastructure with full control.
apktool
A MCP Server for APK Tool (Part of Android Reverse Engineering MCP Suites)
AIDA
AI-Driven Security Assessment - Connect AI to 400+ pentesting tools via MCP
data-gov-il
Advanced MCP server for seamless access to Israeli Government Open Data