skene-growth
A Product-Led Growth (PLG) analysis toolkit that detects tech stacks, plans growth loops, and builds those loops iteratively.
claude mcp add --transport stdio skenetechnologies-skene-growth --env SKENE_API_KEY="your-api-key" -- uvx --from "skene[mcp]" skene-mcp
How to use
Skene includes an MCP server that integrates Skene's growth analysis capabilities with AI assistants. The server exposes the Skene Growth toolkit to your assistant environment, letting you analyze codebases, generate growth-focused implementation plans, and push telemetry through consistent prompts and workflows. Once configured, you can query and drive Skene workflows from within your assistant, with request/response interactions for growth opportunities, feature discovery, and implementation prompts. The server runs via uvx, which acts as a bridge between your AI assistant and Skene's Python CLI.
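Under the stdio transport, the assistant drives skene-mcp by writing JSON-RPC messages to the server's stdin. As a sketch of what crosses that bridge, here is the shape of the standard MCP initialize request a client sends first (the clientInfo values are placeholders; here the message is only composed and validated, not piped into a live server):

```shell
# Compose a JSON-RPC "initialize" request of the kind an MCP client
# sends over stdio, then sanity-check that it is well-formed JSON.
cat > init-request.json <<'EOF'
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1" }
  }
}
EOF
python3 -m json.tool init-request.json > /dev/null && echo "request OK"
```

In normal use your assistant handles this handshake for you; the sketch only illustrates what "bridge between your AI assistant and Skene's Python CLI" means at the wire level.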
How to install
Prerequisites:
- Python 3.8+ and pip
- uv (Astral's Python package and project manager, which also provides the uvx runner), installed or installable via the script below
Installation steps:
- Install uv (if you don’t have it): curl -LsSf https://astral.sh/uv/install.sh | sh
- Install the Skene CLI (if you want to run the local CLI directly): pip install skene
- uvx ships with uv, so no separate install is needed; to run the CLI without installing it via pip, use: uvx skene
- Verify installation by running a Skene command, e.g.: uvx skene --help
- Configure the MCP server in your assistant environment using the following example (adjust as needed):
  {
    "mcpServers": {
      "skene": {
        "command": "uvx",
        "args": ["--from", "skene[mcp]", "skene-mcp"],
        "env": { "SKENE_API_KEY": "your-api-key" }
      }
    }
  }
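If your assistant's config file already lists other entries under mcpServers, the Skene entry can be merged in rather than overwriting the file. A sketch using python3; CONFIG is a placeholder path, so point it at your assistant's real settings file:

```shell
# Merge the "skene" server entry into an existing MCP config without
# clobbering other entries. CONFIG is a placeholder path.
CONFIG=mcp-config.json
[ -f "$CONFIG" ] || echo '{"mcpServers": {}}' > "$CONFIG"
python3 - "$CONFIG" <<'EOF'
import json, sys

path = sys.argv[1]
with open(path) as f:
    cfg = json.load(f)

# Add (or replace) the skene entry, leaving siblings intact.
cfg.setdefault("mcpServers", {})["skene"] = {
    "command": "uvx",
    "args": ["--from", "skene[mcp]", "skene-mcp"],
    "env": {"SKENE_API_KEY": "your-api-key"},
}

with open(path, "w") as f:
    json.dump(cfg, f, indent=2)
print("merged")
EOF
```

Using setdefault plus a keyed assignment keeps any previously configured servers untouched, which a plain file overwrite would not.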
Additional notes
Tips and considerations:
- The MCP server uses uvx to expose Skene's MCP entry points; ensure your environment variables (e.g., SKENE_API_KEY) are securely managed.
- If you run into authentication or API key issues, verify that SKENE_API_KEY is set in the environment where the MCP server is executed.
- Skene supports multiple capabilities such as codebase analysis, growth feature discovery, growth plan generation, and implementation prompts; use the provided commands (analyze, plan, build, status, push) to drive these features through the MCP workflow.
- When integrating with an AI assistant, you can add the MCP server under the mcpServers section of your assistant’s configuration to enable prompt-driven execution of Skene tasks.
- The server is Python-based (CLI under src/skene) with a Go-based TUI wrapper, so ensure both components can be executed in your deployment environment if you rely on the UI components.
- If you need to upgrade, follow the same installation steps and revalidate that the MCP server remains reachable from your assistant configuration.
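Two of the tips above (secure key management and early detection of authentication problems) can be combined in a small guard wrapper; a sketch, where run_skene is a hypothetical helper and the subcommand names are the ones listed above:

```shell
# Guard wrapper: fail fast when SKENE_API_KEY is missing instead of
# surfacing an opaque auth error from the server later.
run_skene() {
  if [ -z "${SKENE_API_KEY:-}" ]; then
    echo "SKENE_API_KEY is not set" >&2
    return 1
  fi
  uvx skene "$@"
}

# Example (requires a valid key and network access):
#   SKENE_API_KEY=your-api-key run_skene analyze .
```

The same check is worth adding to any deployment script that launches the MCP server, since the assistant configuration only passes the key through; it does not validate it.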
Related MCP Servers
mcp-mail
📧 MCP Mail Tool - AI-powered email management tool | MCP-based intelligent email management tool
fast-filesystem
A high-performance Model Context Protocol (MCP) server that provides secure filesystem access for Claude and other AI assistants.
mcpx
Token-efficient MCP client: TypeScript schemas instead of JSON, LLM-friendly syntax, batch calls, TOON output. Built for Claude/GPT automations.
mcp-jest
Automated testing for Model Context Protocol servers. Ship MCP Servers with confidence.
github-to
Convert GitHub repositories to MCP servers automatically. Extract tools from OpenAPI, GraphQL & REST APIs for Claude Desktop, Cursor, Windsurf, Cline & VS Code. AI-powered code generation creates type-safe TypeScript/Python MCP servers. Zero config setup - just paste a repo URL. Built for AI assistants & LLM tool integration.
PackageFlow
A visual DevOps hub for npm scripts, Git, workflows, and deploy — controllable by AI via MCP.