
inAI-wiki

🌍 The open-source Wikipedia of AI — 2M+ apps, agents, LLMs & datasets. Updated daily with tools, tutorials & news.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio inai-sandy-inai-wiki -- docker run -i -p 8080:8080 inai-sandy/inai-wiki:latest

How to use

The inAI-wiki MCP server exposes an index of the INAI project's AI resources as part of the Model Context Protocol ecosystem. It aggregates the INAI wiki's entries and related resources (apps, agents, MCP servers, tools, and tutorials) so that MCP-capable clients can discover, query, and interact with open AI ecosystem content via MCP endpoints.

Use any MCP client or integration that can resolve MCP servers, issue requests, and process responses to fetch entries such as available MCP servers, tools, or knowledge articles referenced in the INAI wiki. Each entry carries metadata such as a title, description, category, and links to deeper resources, enabling chains of context to be built across AI tools and documentation. The server is designed to be lightweight and to respond with structured MCP payloads suitable for downstream tooling, dashboards, or agent workflows.
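As a concrete illustration of working with that metadata, here is a minimal Python sketch that filters entries by category. The payload shape (an `entries` list with `title`, `description`, `category`, and `links` fields) is an assumption for illustration only; the real server's schema may differ.

```python
# Hypothetical payload shape -- NOT the server's documented schema.
# Field names mirror the metadata described above (title, description,
# category, links).
sample_payload = {
    "entries": [
        {"title": "Example MCP Server", "description": "Aggregates docs",
         "category": "mcp-server", "links": ["https://example.org/a"]},
        {"title": "Example Agent", "description": "Planning agent",
         "category": "agent", "links": []},
    ]
}

def entries_by_category(payload, category):
    """Filter wiki entries by category, keeping only title and links."""
    return [
        {"title": e["title"], "links": e.get("links", [])}
        for e in payload.get("entries", [])
        if e.get("category") == category
    ]

print(entries_by_category(sample_payload, "mcp-server"))
```

A client could apply the same filtering to the structured MCP payloads the server returns, then follow each entry's links to build deeper context chains.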

How to install

Prerequisites:

  • Docker installed on your machine (Docker Desktop for Windows/macOS, or Docker Engine on Linux)
  • Basic familiarity with MCP clients and how to register MCP servers

Installation steps:

  1. Pull the MCP server image:

     docker pull inai-sandy/inai-wiki:latest

  2. Start the server (already configured to expose MCP endpoints on port 8080):

     docker run -d --name inai-wiki-mcp -p 8080:8080 inai-sandy/inai-wiki:latest

  3. Verify it's running by querying the discovery endpoint (adjust the path if your MCP client expects a different one):

     curl http://localhost:8080/.well-known/mcp-servers

  4. Register the MCP server in your MCP client or orchestrator and begin querying for resources such as MCP servers, apps, agents, and tutorials listed in the INAI wiki.
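Since the container takes a moment to start, step 3 can be automated by polling the endpoint until it responds. A minimal Python sketch, using only the standard library; the URL assumes the default port mapping from step 2, and the discovery path is the one used in step 3, which may differ per client:

```python
import time
import urllib.error
import urllib.request

def wait_for_server(url, timeout=30.0, interval=1.0):
    """Poll `url` until it responds (any non-5xx status) or `timeout`
    seconds elapse. Returns True if the server answered, else False."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=interval) as resp:
                if resp.status < 500:
                    return True
        except (urllib.error.URLError, OSError):
            # Connection refused / timed out -- server not up yet.
            time.sleep(interval)
    return False

# Example usage (after `docker run` from step 2):
#   wait_for_server("http://localhost:8080/.well-known/mcp-servers")
```

This avoids race conditions in scripts that register the server immediately after starting the container.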

Notes:

  • If you need to customize the environment, you can pass -e VAR=value options to docker run; to change the host port, adjust the -p mapping (e.g. -p 9090:8080).
  • Ensure network access allows your MCP clients to reach the host port (8080 by default).

Additional notes

Tips and common issues:

  • If the container fails to start, check Docker logs for missing dependencies or port conflicts: docker logs inai-wiki-mcp.
  • Some MCP clients expect a static base URL; ensure the server URL is consistently reachable in your environment.
  • The INAI wiki data is updated regularly; consider scheduling periodic reloads or cache invalidation in your client to reflect new entries.
  • If you prefer to run without Docker, you would need your own setup (e.g. Node/Express or a static file server) to serve the wiki assets; the Docker image is used here for portability.
  • Environment variables can be introduced to adjust data sources or logging levels; consider adding LOG_LEVEL or DATA_SOURCE if you customize the container.
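The cache-invalidation tip above can be sketched as a small TTL cache: the client re-fetches wiki entries only after a time-to-live expires, so regularly updated data eventually shows up without hammering the server. `fetch` is any callable you supply (e.g. one that queries the MCP endpoint); nothing here is part of the inAI-wiki server's actual API.

```python
import time

class TTLCache:
    """Caches one fetcher's result, refreshing it after `ttl` seconds.
    `clock` is injectable for testing; defaults to a monotonic clock."""

    def __init__(self, fetch, ttl=300.0, clock=time.monotonic):
        self.fetch = fetch
        self.ttl = ttl
        self.clock = clock
        self._value = None
        self._expires = float("-inf")  # force a fetch on first access

    def get(self):
        now = self.clock()
        if now >= self._expires:
            self._value = self.fetch()
            self._expires = now + self.ttl
        return self._value

# Example usage with a hypothetical fetcher:
#   cache = TTLCache(lambda: fetch_wiki_entries(), ttl=600.0)
#   entries = cache.get()   # fetched once, then served from cache for 10 min
```

A production client might add per-entry TTLs or honor server cache headers instead; this sketch only shows the basic refresh-on-expiry pattern.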
