sushimcp

SushiMCP is a dev tools MCP that serves context on a roll.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio maverickg59-sushimcp npx -y @chriswhiterocks/sushimcp@latest --llms-txt-source cool_project:https://coolproject.dev/llms-full.txt --openapi-spec-source local_api:http://localhost:8787/api/v1/openapi.json

How to use

SushiMCP is a Model Context Protocol (MCP) server designed to deliver contextual data to your AI IDEs, improving the performance and relevance of code generation and other tasks. The server is registered with a client via a preconfigured npx command that wires in an llms.txt source for model context and an OpenAPI spec source for the API surface SushiMCP should expose. Once registered, the client can use the provided --llms-txt-source and --openapi-spec-source to leverage context-aware capabilities during development, enabling smoother integration with IDEs and tooling that rely on contextual snippets and API schemas.

To use SushiMCP, install and run it with the registration command shown in the README, which launches the SushiMCP server package and connects the local or remote sources. The --llms-txt-source supplies the contextual data the MCP uses to prime or augment the model, while the --openapi-spec-source tells the system which endpoints and schemas are available. Advanced users can consult the SushiMCP Docs for configuration options to tailor context delivery to their project workflows.
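Both flags take a `name:URL` pair, as in `cool_project:https://coolproject.dev/llms-full.txt`. A minimal Python sketch of how such a pair can be split into a label and its URL (an illustration only, not SushiMCP's actual implementation):

```python
def parse_source(arg: str) -> tuple[str, str]:
    """Split a 'name:URL' source argument into (name, url).

    Only the first colon separates the label from the URL, so the
    scheme colon inside the URL (https://...) is preserved.
    """
    name, sep, url = arg.partition(":")
    if not sep or not name or not url:
        raise ValueError(f"expected 'name:URL', got {arg!r}")
    return name, url

# The two sources from the registration command above
llms = parse_source("cool_project:https://coolproject.dev/llms-full.txt")
api = parse_source("local_api:http://localhost:8787/api/v1/openapi.json")
print(llms)  # ('cool_project', 'https://coolproject.dev/llms-full.txt')
```

Splitting on only the first colon matters: a naive `split(":")` would break apart the URL's own scheme separator.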

How to install

Prerequisites:

  • Node.js 14+ and npm (or an environment capable of running npx)
  • Internet access to install the package

Installation and first run:

  1. Use the registration command from the README (this is the recommended way to run SushiMCP with the default configuration), or create an MCP client config file as shown below:
{
  "sushimcp": {
    "command": "npx",
    "args": [
      "-y",
      "@chriswhiterocks/sushimcp@latest",
      "--llms-txt-source",
      "cool_project:https://coolproject.dev/llms-full.txt",
      "--openapi-spec-source",
      "local_api:http://localhost:8787/api/v1/openapi.json"
    ]
  }
}
  2. Run the MCP client using your preferred environment (the exact command comes from the config above). For a quick start, you can execute the equivalent npx command directly:
npx -y @chriswhiterocks/sushimcp@latest --llms-txt-source cool_project:https://coolproject.dev/llms-full.txt --openapi-spec-source local_api:http://localhost:8787/api/v1/openapi.json
  3. Ensure the llms.txt source and OpenAPI spec source are reachable from the host running SushiMCP. Adjust the URLs as needed for your environment.
  4. Verify the MCP server is active and that your client can fetch the OpenAPI spec and contextual data from the configured sources.
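For steps 3 and 4, you can confirm both sources respond before starting the server. A small Python sketch using only the standard library (`check_source` is a hypothetical helper, not part of SushiMCP):

```python
from urllib.parse import urlparse
from urllib.request import urlopen


def is_well_formed(url: str) -> bool:
    """Cheap structural check before attempting a fetch."""
    parts = urlparse(url)
    return parts.scheme in ("http", "https") and bool(parts.netloc)


def check_source(url: str, timeout: float = 5.0) -> bool:
    """Return True if the source URL answers with HTTP 200."""
    if not is_well_formed(url):
        return False
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # DNS failure, refused connection, timeout, ...
        return False


for url in (
    "https://coolproject.dev/llms-full.txt",
    "http://localhost:8787/api/v1/openapi.json",
):
    print(url, "OK" if check_source(url) else "UNREACHABLE")
```

Run this on the same machine that will host SushiMCP: a source that resolves from your laptop may still be blocked from a container or CI runner.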

Additional notes

Tips and caveats:

  • Ensure the llms-txt-source URL and openapi-spec-source URL are accessible from the machine running SushiMCP. Network restrictions can block access and prevent proper context delivery.
  • If you update the LLMS or API sources, restart SushiMCP so it can pick up the new data sources.
  • The AGPL license applies to SushiMCP; ensure compliance when deploying in your environment.
  • Consider pinning the version (latest is used in the example) to avoid unexpected breaking changes; specify a specific version in the package argument if stability is important.
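To apply the version-pinning tip above, replace the `@latest` tag in the client config with an explicit version (the `1.2.3` below is a placeholder; check the package's npm releases for real version numbers):

```json
{
  "sushimcp": {
    "command": "npx",
    "args": [
      "-y",
      "@chriswhiterocks/sushimcp@1.2.3",
      "--llms-txt-source",
      "cool_project:https://coolproject.dev/llms-full.txt",
      "--openapi-spec-source",
      "local_api:http://localhost:8787/api/v1/openapi.json"
    ]
  }
}
```

With a pinned version, npx will keep running the same release until you change the config deliberately.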
