skill
LLM-managed skills platform using MCP - create, edit, and execute skills programmatically in Claude, Cursor, and any MCP-compatible client without manual file uploads.
claude mcp add --transport stdio fkesheh-skill-mcp uvx fkesheh-skill-mcp
How to use
This MCP server provides a centralized interface for managing and executing skills stored under ~/.skill-mcp/skills. It exposes CRUD operations for skill files and directories, and can run Python, Bash, and other executable scripts with environment variables automatically injected from each referenced skill. A key capability is unifying multiple skills into a single execution context: you can import utilities from several skills in one code block, and the server resolves dependencies and environment variables across all of them without duplicating anything.
To use the server, install it via uvx as documented below, connect from your MCP client, and invoke the CRUD tools to manage your skills. When you execute a multi-skill script, you can import modules from any skill, and the server automatically aggregates the dependencies and environment variables of every referenced skill. This makes it easy to build, test, and run multi-step workflows, such as data processing pipelines, simulations, or tooling dashboards that need utilities from more than one skill, all within a single execution context.
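As a rough sketch of the cross-skill import model (the skill names and modules below are hypothetical, not part of this server), the effect is as if every referenced skill's directory were placed on the Python path before your code runs:

```python
import os
import sys
import tempfile

# Build two throwaway "skills" standing in for ~/.skill-mcp/skills entries.
# The names (text_utils, math_utils) are made up for illustration.
root = tempfile.mkdtemp()
for name, body in {
    "text_utils": "def shout(s):\n    return s.upper()\n",
    "math_utils": "def double(x):\n    return x * 2\n",
}.items():
    skill_dir = os.path.join(root, name)
    os.makedirs(skill_dir)
    with open(os.path.join(skill_dir, f"{name}.py"), "w") as f:
        f.write(body)
    # What the server conceptually does: unify the import path across skills.
    sys.path.insert(0, skill_dir)

# Standard Python imports now reach modules from either skill.
import text_utils
import math_utils

print(text_utils.shout("hi"), math_utils.double(21))  # HI 42
```

This is only a mental model; the server performs the path unification (plus dependency and .env aggregation) for you at execution time.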
How to install
Prerequisites:
- A Unix-like environment or Windows with a compatible shell
- Python environment for uv (via uvx installation)
- Internet access to install packages from PyPI
Step 1: Install uv (which includes uvx)
curl -LsSf https://astral.sh/uv/install.sh | sh
Step 2: Run the Skill MCP server package from PyPI (uvx downloads and caches it on first use)
uvx fkesheh-skill-mcp
Step 3: Run or connect to the MCP server
- If your MCP client manages the server via uvx (as in the claude mcp add command above), the server starts automatically when the client connects.
- Otherwise, add the server to your MCP client's configuration manually, or rely on automatic discovery if your client supports it.
Step 4: Verify installation
# Example: list skills via your MCP client's tooling, or check the server status if your client exposes it
Notes:
- The server expects skills to live under ~/.skill-mcp/skills and will load dependencies and .env files from referenced skills as needed.
- The inline multi-skill execution capability lets you import modules from any skill and run them in a unified Python execution context.
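The .env loading described above can be sketched roughly as follows (the parser is deliberately minimal, and the file layout imitates the ~/.skill-mcp/skills structure with hypothetical skill names):

```python
import os
import tempfile

def load_env_file(path):
    """Minimal KEY=VALUE parser; skips blanks and # comments."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

# Stand-in for ~/.skill-mcp/skills with two hypothetical skills.
root = tempfile.mkdtemp()
envs = {"alpha": "API_KEY=abc123\n", "beta": "# region config\nREGION=us-east-1\n"}
merged = {}
for skill, contents in envs.items():
    skill_dir = os.path.join(root, skill)
    os.makedirs(skill_dir)
    env_path = os.path.join(skill_dir, ".env")
    with open(env_path, "w") as f:
        f.write(contents)
    # Aggregate variables from every referenced skill into one runtime env.
    merged.update(load_env_file(env_path))

print(merged)  # {'API_KEY': 'abc123', 'REGION': 'us-east-1'}
```

The server performs an equivalent merge automatically for every skill a script references; the exact precedence when two skills define the same variable is not documented here.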
Additional notes
Tips and considerations:
- Environment injection: When executing a multi-skill script, environment variables from all referenced skills are loaded automatically into the runtime environment.
- Dependency handling: Python dependencies declared inline (PEP 723) or in per-skill metadata are aggregated across all used skills for the unified execution.
- Timeouts: Script executions have a 30-second safety timeout to prevent long-running tasks from hanging the runner.
- Working directory: You can set a custom working directory for each run if needed to isolate file I/O.
- Cross-skill imports: Use standard Python import statements to reference modules from any skill; the server unifies the import path across skills.
- Path considerations: Skill directories and assets should follow the recommended ~/.skill-mcp/skills structure for automatic discovery.
- Troubleshooting: If a skill fails to load, verify SKILL.md presence, valid Python environment declarations, and that the skill’s .env file (if present) loads correctly.
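The 30-second safety timeout and custom working directory mentioned above can be approximated with the standard library's subprocess module (a sketch of the behavior, not the server's actual runner internals, which are not documented here):

```python
import subprocess
import sys

TIMEOUT_SECONDS = 30  # matches the documented safety timeout

def run_script(code, timeout=TIMEOUT_SECONDS, cwd=None):
    """Run a Python snippet with a hard timeout and an optional working dir."""
    try:
        result = subprocess.run(
            [sys.executable, "-c", code],
            capture_output=True, text=True, timeout=timeout, cwd=cwd,
        )
        return result.returncode, result.stdout
    except subprocess.TimeoutExpired:
        # The runner kills the child and reports the timeout instead of hanging.
        return None, "timed out"

rc, out = run_script("print('hello from skill script')")
print(rc, out.strip())  # 0 hello from skill script
```

Long-running tasks should therefore be broken into steps that each complete within the timeout window.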
Related MCP Servers
mcp-odoo
A Model Context Protocol (MCP) server that enables AI assistants to securely interact with Odoo ERP systems through standardized resources and tools for data retrieval and manipulation.
mcp-logic
A fully functional AI logic calculator using Prover9/Mace4 via a Python-based Model Context Protocol (MCP) server - a tool for the Windows Claude app and other clients
apple-books
Apple Books MCP Server
Unified-Tool-Graph
Instead of dumping 1000+ tools into a model’s prompt and expecting it to choose wisely, the Unified MCP Tool Graph equips your LLM with structure, clarity, and relevance. It fixes tool confusion, prevents infinite loops, and enables modular, intelligent agent workflows.
mcpx-py
Python client library for https://mcp.run - call portable & secure tools for your AI Agents and Apps
packt-netops-ai-workshop
🔧 Build Intelligent Networks with AI