Awesome MCP Servers Directory
A comprehensive collection of Model Context Protocol (MCP) servers categorized by functionality. This repository helps developers and AI enthusiasts discover and utilize the wide array of available MCP servers for various applications.
```shell
claude mcp add --transport stdio habitoai-awesome-mcp-servers-directory \
  --env NOTE="This MCP catalog provides links to individual MCP server implementations. Use the specific server repos listed in the README for actual runtimes." \
  -- echo "This repository is a catalog of MCP servers and categories, not a single runnable MCP server."
```
How to use
This repository is a curated directory of MCP servers organized by category. It functions as a catalog rather than a single run-ready MCP server. To use MCP capabilities, browse the sections and follow the links to individual MCP server projects (for example, OpenAI, HuggingFace, or other vendor-specific MCP implementations). Each linked repository contains its own setup, installation, and usage instructions. When you identify an MCP server that fits your needs, clone or deploy that server according to its own README, then connect Claude or another MCP-compatible client to interact with the tools and endpoints the server exposes. The directory structure and category names (AI Services, Browser Automation, Cloud Platforms, etc.) help you discover servers that provide capabilities such as model queries, web automation, data retrieval, and tool integration.
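As a concrete sketch of that last step, the snippet below composes a registration command for the Claude Code CLI. The server name, run command, and arguments are placeholders, not entries from this catalog; substitute the values from the chosen server's own README.

```shell
# Placeholder values -- replace with details from the chosen
# server's own README (these are NOT real catalog entries).
SERVER_NAME="example-mcp-server"   # any unique name for your client config
SERVER_CMD="node"                  # command the server's README says to run
SERVER_ARGS="dist/index.js"        # arguments for that command

# Compose the registration command for the Claude Code CLI.
# Echoed here so you can review it before actually running it.
CMD="claude mcp add --transport stdio $SERVER_NAME -- $SERVER_CMD $SERVER_ARGS"
echo "$CMD"
```

Run the echoed command once the server itself is installed and working; the client will then launch the server over stdio whenever its tools are needed.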
How to install
Since this repository is a catalog and not a single MCP server, installation steps are focused on exploring and using individual server projects listed within. Prerequisites: git, a terminal, and internet access.
Steps:
- Clone the catalog (optional): `git clone https://github.com/habitoai-awesome-mcp-servers-directory.git`
- Navigate categories and select a specific MCP server project listed in the README.
- Open the chosen project’s repository and follow its own installation instructions. Each server will have its own prerequisites (e.g., Node.js, Python, Docker) and setup steps.
- After installing the chosen server, start it according to its guide and configure your MCP client to connect to the server’s endpoint.
If you prefer not to clone the catalog, you can directly open the individual MCP server repositories linked in the README and follow their dedicated installation guides.
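For the final configuration step, clients such as Claude Desktop read a JSON file with an `mcpServers` section. The fragment below is a minimal sketch; the server name, package, and environment variable are hypothetical placeholders, not entries from this catalog.

```json
{
  "mcpServers": {
    "example-mcp-server": {
      "command": "npx",
      "args": ["-y", "@example/mcp-server"],
      "env": {
        "EXAMPLE_API_KEY": "set-me"
      }
    }
  }
}
```

Each server's README specifies the exact `command`, `args`, and environment variables it expects; copy those values rather than the placeholders above.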
Additional notes
Tips:
- This repository acts as a directory rather than a runnable server. Treat it as a map to discover MCP capabilities across different tools.
- Use the category sections to quickly locate servers that match your needs (e.g., AI Services, Browser Automation, Cloud Platforms).
- When deploying a specific MCP server, pay attention to its required environment variables, authentication methods, and any service dependencies (databases, cloud credentials, API keys).
- Some MCP servers may offer Docker images or prebuilt binaries; prefer those for quicker setup, but verify compatibility with your environment.
- If you encounter broken or outdated links, check the respective repository for an updated location or contact maintainers.
- Maintain security by using least-privilege credentials for any external services and rotate API keys regularly.
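One way to follow the last tip is to keep API keys out of shell history and client configs by reading them from a permission-restricted file. This is a generic sketch with a placeholder path and dummy key, not a requirement of any particular server.

```shell
# Hypothetical pattern: store an API key in a file readable only
# by the owner, instead of hardcoding it in configs or history.
KEY_FILE="./example_api_key"              # placeholder path
echo "sk-example-not-a-real-key" > "$KEY_FILE"
chmod 600 "$KEY_FILE"                     # least privilege: owner-only access

# Export the key only for the process that needs it.
EXAMPLE_API_KEY="$(cat "$KEY_FILE")"
export EXAMPLE_API_KEY
```

Pair this with regular rotation: replace the file's contents when you rotate the key, and the next server launch picks up the new value.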
Related MCP Servers
mcp-nixos
MCP-NixOS - Model Context Protocol Server for NixOS resources
Awesome
Awesome ModelContextProtocol resources - A curated list of MCP resources
spec-kit
MCP server enabling AI assistants to use GitHub's spec-kit methodology
local-gateway
Aggregate multiple MCP servers into a single endpoint with web UI, OAuth 2.1, and profile-based tool management
openapi-swagger
Solve AI context window limits for API docs | Convert any Swagger/OpenAPI to searchable MCP server | AI-powered endpoint discovery & code generation | Works with Cursor, Claude, VS Code
openapi-to
Transform OpenAPI specifications into production-ready MCP servers with AI-powered evaluation and enhancement. Leverages LLMs to analyze, improve, and generate Model Context Protocol implementations from your existing API documentation.