# mcp
🤖 An MCP server that serves custom AGENTS.md files and bash scripts. 🖥️
```shell
claude mcp add --transport stdio nicholaswilde-mcp-server python -m app.server \
  --env HOST="0.0.0.0" \
  --env PORT="8080" \
  --env LOG_LEVEL="info"
```
## How to use
This MCP server hosts a FastAPI-based API that exposes a library of reusable agent instructions and utility scripts, enabling a generative AI model to perform complex, context-aware tasks. It acts as a centralized reference point for standardized AGENTS.md instructions and accompanying shell utilities, making it easier for an AI agent to execute multi-step actions with consistent behavior. Once running, the tools can be accessed via the HTTP endpoints defined in the FastAPI app, and clients (such as Gemini or other compatible AI models) can query the API to fetch agent instructions, scripts, and related utilities.
To use the server, start it with the provided task flow (for example, `task run` during development). The server listens on the configured host and port (default 0.0.0.0:8080). You can then point your AI integration (e.g., gemini-cli or other orchestration tooling) at the server's HTTP URL to retrieve available agents, fetch instructions, and run utility scripts as needed. The repository also maintains a central agents-library that houses the core instructions and scripts, enabling version-controlled, reusable capabilities for AI-driven tasks.
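As a minimal sketch, a client can query the running server with a plain HTTP GET. The `agents/AGENTS.md` route below is hypothetical and used only for illustration; consult the `/docs` page of your running instance for the actual routes it exposes:

```shell
# Base URL of the locally running server (default host/port from the install command).
BASE_URL="http://localhost:8080"

# The interactive API docs list every available route:
#   curl "$BASE_URL/docs"

# Hypothetical example route for fetching an AGENTS.md file; replace it
# with a real route taken from the /docs page:
AGENTS_URL="$BASE_URL/agents/AGENTS.md"
echo "Would fetch: $AGENTS_URL"
# curl -s "$AGENTS_URL"   # uncomment once the server is running
```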
## How to install
Prerequisites:
- Python 3.9+ installed on your system
- Git installed
- Task (taskfile.dev) available for bootstrap and run commands (or adapt to your environment)
Steps:
1. Clone the repository:

   ```shell
   git clone https://github.com/nicholaswilde/mcp-server.git
   cd mcp-server
   ```

2. Install Python dependencies (adjust if you use a virtual environment):

   ```shell
   python -m pip install --upgrade pip
   python -m pip install -r requirements.txt
   ```

3. Bootstrap the project:

   ```shell
   task bootstrap
   ```

4. Run the server locally:

   ```shell
   task run
   ```

5. Verify the server is up (the interactive API docs double as a quick liveness check):

   ```shell
   curl http://localhost:8080/docs
   ```
Notes:
- If you customize ports or hosts, ensure your environment variables in mcp_config match.
- The server exposes a FastAPI app; you may access interactive API docs at /docs when running.
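The environment variables accepted by the server can also be exported before launching instead of being passed on the command line. A minimal sketch, using the variable names and default values from the install command above:

```shell
# Configuration for a local development run (defaults shown).
export HOST="0.0.0.0"     # interface the FastAPI app binds to
export PORT="8080"        # port the server listens on
export LOG_LEVEL="info"   # raise to "debug" for more verbose output

# Start the server with this environment:
# task run
```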
## Additional notes
Tips and common considerations:
- Environment variables: configure PORT and HOST to match your deployment environment. LOG_LEVEL can help with debugging in development.
- If you see networking issues, ensure your firewall allows traffic on the specified port.
- The repository centers around an agents-library and AGENTS.md files. Leverage these as the stable source of AI-instruction templates for repeatable tasks.
- When upgrading, check for updates to the AGENTS library to keep capabilities consistent with the latest task patterns.
- The documentation site is built with MkDocs; consult it for deeper usage and integration details.
## Related MCP Servers
- **sample-agentic-ai-demos**: Collection of examples of how to use Model Context Protocol with AWS.
- **just**: Share the same project justfile tasks with your AI Coding Agent.
- **mcp-crew-ai**: MCP Crew AI Server is a lightweight Python-based server designed to run, manage and create CrewAI workflows.
- **mcpx-py**: Python client library for https://mcp.run - call portable & secure tools for your AI Agents and Apps.
- **the-mcp-company**: TheMCPCompany: Creating General-purpose Agents with Task-specific Tools.
- **bauplan**: Repository hosting the open source Bauplan MCP server and curated Agent Skills.