vibepush
MCP-native deployment platform. Any AI deploys to your servers.
```shell
claude mcp add --transport stdio vibepush-dev-vibepush node path/to/server.js \
  --env MCP_HOST="0.0.0.0" \
  --env MCP_PORT="7070" \
  --env LOG_LEVEL="info"
```
How to use
VibePush is an MCP-native deployment platform that lets any MCP-compatible AI agent deploy, monitor, and fix your applications on your own servers. The server sits in your infrastructure and communicates via the MCP protocol, enabling AI-driven deployment workflows, automatic troubleshooting, and cross-service orchestration. With VibePush, you don’t need YAML or Kubernetes expertise; your AI agent can understand your intent in natural language and translate it into actions such as provisioning services, building images, deploying workloads, and scaling resources. You can engage the AI to deploy new services from code, monitor logs and metrics, and automatically apply fixes when issues are detected.
To use it, first run the MCP server alongside your existing infrastructure. Connect your MCP-compatible AI (e.g., Claude or a compatible agent) and start giving natural-language commands like: “Deploy my-app with PostgreSQL” or “Scale my-app to 5 replicas.” The AI will respond with status updates and, if configured, will perform the necessary steps—building images, provisioning databases, configuring networking, and applying self-healing measures as needed.
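Under the hood, those natural-language commands are translated by the agent into MCP tool calls, which are JSON-RPC 2.0 messages sent to the server (one JSON message per line over the stdio transport). A minimal sketch of what such a request looks like — the tool name `deploy_service` and its arguments are hypothetical; check the server's tool list for the names VibePush actually exposes:

```typescript
// Shape of an MCP "tools/call" request per the MCP specification.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// Hypothetical tool name and arguments -- VibePush's real tools may differ.
const request = buildToolCall(1, "deploy_service", {
  app: "my-app",
  addons: ["postgresql"],
  replicas: 5,
});

// Over the stdio transport, each message is a single line of JSON.
console.log(JSON.stringify(request));
```

This is only the wire format; in practice the agent's MCP client library constructs and sends these messages for you.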
How to install
Prerequisites:
- A VPS or server with Node.js installed (or a compatible runtime for your chosen deployment method)
- Git installed to clone the repository
- Basic familiarity with MCP (Model Context Protocol)
Install instructions:
1. Clone the repository (note: the repository name is assumed to match the project name):

   ```shell
   git clone https://github.com/vibepush-dev/vibepush.git
   cd vibepush
   ```

2. Install dependencies (Node.js/TypeScript stack):

   ```shell
   npm install
   ```

3. Configure environment variables by creating a `.env` file, for example:

   ```
   MCP_HOST=0.0.0.0
   MCP_PORT=7070
   LOG_LEVEL=info
   ```

4. Build and start the MCP server:

   ```shell
   npm run build
   node path/to/server.js
   ```

5. Connect an MCP-compatible AI agent and begin issuing natural-language deployment instructions.
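The server presumably resolves the variables from step 3 at startup. A minimal sketch of that resolution with the defaults from the example above — VibePush's actual startup code may differ:

```typescript
// Config resolution sketch using the variable names from the .env example.
interface ServerConfig {
  host: string;
  port: number;
  logLevel: string;
}

function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  const port = Number(env.MCP_PORT ?? "7070");
  if (!Number.isInteger(port) || port < 1 || port > 65535) {
    throw new Error(`Invalid MCP_PORT: ${env.MCP_PORT}`);
  }
  return {
    host: env.MCP_HOST ?? "0.0.0.0",
    port,
    logLevel: env.LOG_LEVEL ?? "info",
  };
}

// In the server itself this would be loadConfig(process.env).
const config = loadConfig({ MCP_HOST: "0.0.0.0", MCP_PORT: "7070", LOG_LEVEL: "info" });
console.log(`MCP server will bind ${config.host}:${config.port}`);
```

Validating the port early gives a clear startup error instead of a confusing bind failure later.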
Note: If your deployment target uses Docker/Kubernetes, ensure appropriate permissions and network access. The project may provide additional scripts for running in containerized environments via npm or a docker-compose setup when available.
Additional notes
Tips and caveats:
- Ensure MCP_PORT is accessible from your AI agent host; configure firewalls accordingly.
- If you use self-hosted databases or services, securely manage credentials and follow best practices for secret management.
- The project is in early validation; expect API changes or configuration tweaks as features mature.
- Consider enabling verbose logging during initial experiments to better understand AI decisions and actions.
- If you encounter issues, check that the agent supports MCP and that the server is reachable at MCP_HOST:MCP_PORT.
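To verify reachability from the agent's machine, a simple TCP probe is enough. The sketch below spins up a throwaway local listener purely so the demo has something to hit; in practice, point `checkReachable` at your real `MCP_HOST:MCP_PORT`:

```typescript
import * as net from "node:net";

// Resolves true if a TCP connection to host:port succeeds within the timeout.
function checkReachable(host: string, port: number, timeoutMs = 2000): Promise<boolean> {
  return new Promise((resolve) => {
    const socket = net.connect({ host, port });
    const done = (ok: boolean) => {
      socket.destroy();
      resolve(ok);
    };
    socket.setTimeout(timeoutMs, () => done(false));
    socket.once("connect", () => done(true));
    socket.once("error", () => done(false));
  });
}

// Demo: start a throwaway listener on an ephemeral port, then probe it.
const server = net.createServer().listen(0, "127.0.0.1", async () => {
  const { port } = server.address() as net.AddressInfo;
  const ok = await checkReachable("127.0.0.1", port);
  console.log(ok ? "reachable" : "unreachable");
  server.close();
});
```

If the probe fails against your real host and port, check firewall rules and that the server process is actually listening before debugging the agent side.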