# Cockpit Skill

A browser-based dashboard that provides a visual interface for the productivity system.

Install with:

```shell
npx machina-cli add skill nsalvacao/nsalvacao-claude-code-plugins/cockpit --openclaw
```
## Components
| File | Purpose |
|---|---|
| `index.html` | Main dashboard UI |
| `bridge.py` | Local Python HTTP server (file I/O + AI chatbot proxy) |
| `cockpit.sh` | Launcher for Linux/WSL |
| `cockpit.bat` | Launcher for Windows |
| `.cockpit.json` | Per-project configuration (paths, pulse rules, AI settings) |
| `assets/css/styles.css` | Dashboard styles |
| `assets/js/app.js` | Dashboard logic |
## Features
- Task Board — Kanban view of TASKS.md with drag-and-drop, inline editing
- Memory Viewer — Browse and edit memory files (glossary, people, projects)
- Project Pulse — Health check for expected files and directories
- AI Chatbot — Ask questions about your project using any AI CLI or API
- Spotlight Search — Ctrl+K to search across tasks, memory, and inventory
- Focus Timer — Built-in Pomodoro timer
- Drift Watcher — Auto-detects external file changes
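The Drift Watcher boils down to comparing file modification times against the last-seen state. The sketch below is a hypothetical, minimal version of that idea (`detect_drift` is an illustrative name, not the actual implementation in `app.js` or `bridge.py`):

```python
import os

def detect_drift(mtimes):
    """Compare stored modification times against the filesystem.

    mtimes: dict mapping path -> last-seen mtime.
    Returns the paths that changed or disappeared since the last
    check, and updates the dict in place with the new mtimes.
    """
    changed = []
    for path, last_seen in list(mtimes.items()):
        try:
            current = os.path.getmtime(path)
        except FileNotFoundError:
            # File was deleted externally.
            changed.append(path)
            del mtimes[path]
            continue
        if current != last_seen:
            changed.append(path)
            mtimes[path] = current
    return changed
```

Calling this on an interval (or on each dashboard refresh) is enough to flag external edits to `TASKS.md` or memory files.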
## Launching

The cockpit is launched via `/productivity-cockpit:start`, which:

- Copies `.cockpit.json` to the project root if it doesn't exist
- Starts `bridge.py` in the background
- Opens http://localhost:8001 in the browser
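Those three steps can be sketched in Python as follows. This is a hypothetical helper (`ensure_config` and `launch_cockpit` are illustrative names), not the actual launcher code in `cockpit.sh`/`cockpit.bat`:

```python
import shutil
import subprocess
import webbrowser
from pathlib import Path

def ensure_config(project_root, template):
    """Step 1: copy .cockpit.json into the project root if missing."""
    dest = Path(project_root) / ".cockpit.json"
    if not dest.exists():
        shutil.copy(template, dest)
    return dest

def launch_cockpit(project_root, skill_dir, port=8001):
    """Steps 2-3: start bridge.py in the background, open the dashboard."""
    ensure_config(project_root, Path(skill_dir) / ".cockpit.json")
    proc = subprocess.Popen(
        ["python3", str(Path(skill_dir) / "bridge.py")],
        cwd=project_root,
    )
    webbrowser.open(f"http://localhost:{port}")
    return proc
```

Note the copy is conditional: an existing `.cockpit.json` is never overwritten, so per-project customizations survive restarts.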
## AI Chatbot Configuration

The chatbot is configured per-project in `.cockpit.json`:

```json
{
  "ai": {
    "mode": "cli",
    "cli": "claude",
    "args": []
  }
}
```
Supported modes:

- CLI mode — calls any AI CLI binary as a subprocess (`claude`, `gemini`, `copilot`, `codex`, `ollama`)
- API mode — direct HTTP calls to AI providers (set `"mode": "api"` with `"provider"` and an environment variable for the key)
## .cockpit.json Reference

```json
{
  "name": "My Project Cockpit",
  "version": "2.0.0",
  "paths": {
    "tasks": "TASKS.md",
    "memory": "memory",
    "output": "output"
  },
  "pulse_rules": {
    "essential_files": ["README.md"],
    "min_folders": []
  },
  "ai": {
    "mode": "cli",
    "cli": "claude",
    "args": []
  }
}
```
The `paths` section should match your project structure. The cockpit reads these paths relative to your project root.
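A sketch of that resolution step, assuming the config is loaded with the standard `json` module (`resolve_paths` is an illustrative helper, not part of `bridge.py`):

```python
import json
from pathlib import Path

def resolve_paths(project_root):
    """Load .cockpit.json and resolve each 'paths' entry
    relative to the project root."""
    root = Path(project_root)
    config = json.loads((root / ".cockpit.json").read_text())
    return {name: root / rel for name, rel in config["paths"].items()}
```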
## Notes
- Bridge server runs on port 8001 (auto-increments if busy)
- All file writes are sandboxed to the project root (path traversal is blocked)
- CORS headers are set for localhost development
- The chatbot sends project context (task state, inventory) with each query
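The sandboxing note can be illustrated with a typical resolve-and-check guard. This is a sketch of one common way to block path traversal, not the bridge's actual code:

```python
from pathlib import Path

def safe_path(project_root, requested):
    """Resolve a requested path and reject it if it escapes
    the project root (e.g. via '..' segments or symlinks)."""
    root = Path(project_root).resolve()
    target = (root / requested).resolve()
    if target != root and root not in target.parents:
        raise PermissionError(f"path escapes project root: {requested}")
    return target
```

Resolving both paths before comparing is the important part: a naive string-prefix check can be fooled by `..` segments or symlinks.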
## Source

The skill is defined in `plugins/productivity-cockpit/skills/cockpit/SKILL.md` in the repository:

```shell
git clone https://github.com/nsalvacao/nsalvacao-claude-code-plugins
```

## Overview
Cockpit is a browser-based dashboard that visualizes your productivity system. It combines a Kanban-style Task Board, a Memory Viewer, a Project Pulse health check, and an AI chatbot, all wired to a local Python bridge server for secure file I/O and AI queries. This setup keeps project work organized, observable, and chat-driven within your workspace.
## How This Skill Works
The UI (`index.html`) talks to a local `bridge.py` HTTP server that proxies file I/O and AI queries. A per-project `.cockpit.json` configures paths, pulse rules, and AI settings, and the cockpit is launched via `cockpit.sh` or `cockpit.bat`. When started, it copies `.cockpit.json` to the project root if missing, launches `bridge.py` in the background, and opens http://localhost:8001 for the dashboard.
## When to Use It
- Starting a new project to map tasks, memory, and files.
- Running regular Project Pulse checks to verify essential files and folders.
- Asking the AI chatbot questions with project context for faster answers.
- Using Spotlight Search to jump to tasks, memory entries, or inventory with Ctrl+K.
- Developing locally with a Python bridge for fast file I/O and AI queries.
## Quick Start
- Step 1: Launch the cockpit by running the launcher script for your OS (`cockpit.sh` or `cockpit.bat`) or the `/productivity-cockpit:start` command.
- Step 2: The launcher copies the per-project config, starts the bridge, and boots the UI.
- Step 3: Open http://localhost:8001 and begin using the Task Board, Memory Viewer, Project Pulse, and AI Chatbot.
## Best Practices
- Keep a precise per-project `.cockpit.json` with correct paths and pulse rules.
- Run Project Pulse regularly to catch drift in essential files.
- Leverage Spotlight Search to minimize context switching.
- Choose CLI mode for low latency AI together with the local bridge, or API mode for cloud AI as needed.
- Ensure the bridge is running and the dashboard is accessible at localhost:8001; all writes stay sandboxed in the project root.
## Example Use Cases
- A developer sets up cockpit for a new repo with TASKS.md and memory folders, then uses Project Pulse to confirm essential files.
- A team member updates the glossary in the Memory Viewer and uses the AI chatbot to query project notes.
- A project lead asks the AI chatbot for sprint status and receives context-enriched answers.
- A user uses Ctrl+K Spotlight to jump from a task to a related memory entry while editing TASKS.md.
- A local workflow uses the Python bridge to perform file I/O and AI queries without exposing keys to the browser.