inAI-wiki
🌍 The open-source Wikipedia of AI — 2M+ apps, agents, LLMs & datasets. Updated daily with tools, tutorials & news.
claude mcp add --transport stdio inai-sandy-inai-wiki -- docker run -i --rm -p 8080:8080 inai-sandy/inai-wiki:latest
How to use
The inAI-wiki MCP server exposes an index of the INAI project’s AI resources as part of the Model Context Protocol (MCP) ecosystem. It aggregates the INAI wiki’s entries and related resources (apps, agents, MCP servers, tools, and tutorials) so that MCP-capable clients can discover, query, and interact with this content through MCP endpoints. Use any MCP client or integration that can resolve MCP servers, issue requests, and process responses to fetch entries such as available MCP servers, tools, or knowledge articles referenced in the INAI wiki. Each entry carries metadata such as a title, description, category, and links to deeper resources, so chains of context can be built across AI tools and documentation. The server is designed to be lightweight and to respond with structured MCP payloads suitable for downstream tooling, dashboards, or agent workflows.
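As a rough illustration of working with the entry metadata described above (titles, descriptions, categories, links), here is a minimal Python sketch that groups entries by category. The payload shape is an assumption for illustration only, not the server’s actual schema, and the sample entries are invented:

```python
# Sketch: grouping wiki entries by category. The "entries" payload shape and
# the sample data below are assumptions, not the real inAI-wiki schema.
import json

SAMPLE_INDEX = json.loads("""
{
  "entries": [
    {"title": "mcp-agent",
     "description": "Build effective agents using Model Context Protocol",
     "category": "agents",
     "link": "https://example.invalid/mcp-agent"},
    {"title": "dbhub",
     "description": "Token-efficient database MCP server",
     "category": "tools",
     "link": "https://example.invalid/dbhub"}
  ]
}
""")

def index_by_category(payload):
    """Group entry titles by their category field."""
    grouped = {}
    for entry in payload.get("entries", []):
        grouped.setdefault(entry.get("category", "uncategorized"), []).append(entry["title"])
    return grouped

print(index_by_category(SAMPLE_INDEX))
# → {'agents': ['mcp-agent'], 'tools': ['dbhub']}
```

A real client would build this index from the server’s MCP responses rather than a hard-coded sample.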
How to install
Prerequisites:
- Docker installed on your machine (Docker Desktop for Windows/macOS, or Docker Engine on Linux)
- Basic familiarity with MCP clients and how to register MCP servers
Installation steps:
- Pull and run the MCP server image:
docker pull inai-sandy/inai-wiki:latest
- Start the server (already configured to expose MCP endpoints on port 8080):
docker run -d --name inai-wiki-mcp -p 8080:8080 inai-sandy/inai-wiki:latest
- Verify it's running by hitting the endpoint (adjust if your MCP client uses a different base path):
curl http://localhost:8080/.well-known/mcp-servers
- Register the MCP server in your MCP client or orchestrator and begin querying for resources such as MCP servers, apps, agents, and tutorials listed in the INAI wiki.
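For clients that register MCP servers through a configuration file, the entry might look like the following sketch. The file location and exact schema depend on your client (this follows the common Claude Desktop-style `mcpServers` shape), and the server name `inai-wiki` is just a label of your choosing:

```json
{
  "mcpServers": {
    "inai-wiki": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "inai-sandy/inai-wiki:latest"]
    }
  }
}
```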
Notes:
- If you need to customize ports or environment, you can pass -e VAR=value options to docker run as needed.
- Ensure network access allows your MCP clients to reach the host port (8080 by default).
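To script the verification step above, a small stdlib-only Python helper can poll the endpoint until the container answers. It assumes the default port mapping (8080) and the `/.well-known/mcp-servers` path shown in the steps:

```python
# Sketch: poll the index endpoint until the inai-wiki container is reachable.
# Assumes the default 8080 port mapping from the docker run command above.
import json
import time
import urllib.error
import urllib.request

def wait_for_server(base_url="http://localhost:8080", attempts=10, delay=1.0):
    """Return the parsed index once the server answers, or None on timeout."""
    url = f"{base_url}/.well-known/mcp-servers"
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                return json.load(resp)
        except (urllib.error.URLError, OSError):
            time.sleep(delay)
    return None

# Example (requires the container from the steps above to be running):
# index = wait_for_server()
```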
Additional notes
Tips and common issues:
- If the container fails to start, check Docker logs for missing dependencies or port conflicts:
docker logs inai-wiki-mcp
- Some MCP clients expect a static base URL; ensure the server URL is consistently reachable in your environment.
- The INAI wiki data is updated regularly; consider scheduling periodic reloads or cache invalidation in your client to reflect new entries.
- If you prefer to run locally without Docker, you would need a Node/Express or static-server setup that serves the wiki assets; this configuration uses Docker for portability.
- Environment variables can be introduced to adjust data sources or logging levels; consider adding LOG_LEVEL or DATA_SOURCE if you customize the container.
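The cache-invalidation tip above can be sketched as a minimal TTL cache, so a client re-fetches the wiki index once it goes stale. This is a generic pattern, not part of the inai-wiki server; `fetch` stands in for whatever call your MCP client makes:

```python
# Sketch: a minimal TTL cache for the wiki index. fetch is a stand-in for
# your MCP client's actual fetch call; TTL and clock are configurable.
import time

class TTLCache:
    def __init__(self, fetch, ttl_seconds=300.0, clock=time.monotonic):
        self._fetch = fetch
        self._ttl = ttl_seconds
        self._clock = clock
        self._value = None
        self._fetched_at = None

    def get(self):
        """Return the cached value, re-fetching once the TTL has elapsed."""
        now = self._clock()
        if self._fetched_at is None or now - self._fetched_at >= self._ttl:
            self._value = self._fetch()
            self._fetched_at = now
        return self._value
```

Wrapping the index fetch in `TTLCache(fetch_index, ttl_seconds=300)` keeps responses fresh without hammering the server on every query.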
Related MCP Servers
ai-engineering-hub
In-depth tutorials on LLMs, RAGs and real-world AI agent applications.
mcp-agent
Build effective agents using Model Context Protocol and simple workflow patterns
learn-ai-engineering
Learn AI and LLMs from scratch using free resources
dbhub
Zero-dependency, token-efficient database MCP server for Postgres, MySQL, SQL Server, MariaDB, SQLite.
sre
The SmythOS Runtime Environment (SRE) is an open-source, cloud-native runtime for agentic AI. Secure, modular, and production-ready, it lets developers build, run, and manage intelligent agents across local, cloud, and edge environments.
DeepMCPAgent
Model-agnostic plug-n-play LangChain/LangGraph agents powered entirely by MCP tools over HTTP/SSE.