sushimcp
SushiMCP is a dev tools MCP that serves context on a roll.
Register it with Claude Code in one line:
claude mcp add --transport stdio maverickg59-sushimcp -- npx -y @chriswhiterocks/sushimcp@latest --llms-txt-source cool_project:https://coolproject.dev/llms-full.txt --openapi-spec-source local_api:http://localhost:8787/api/v1/openapi.json
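Here maverickg59-sushimcp is the name the server is registered under, --transport stdio tells the client to talk to it over stdin/stdout, and everything after -- is the command the client runs to start the server. Each source flag takes a name:url pair.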
How to use
SushiMCP is a Model Context Protocol (MCP) server designed to deliver contextual data to your AI IDE, improving the relevance and quality of generated code and other output. You register the server with an MCP client using a preconfigured npx command that wires in an llms.txt source for model context and an OpenAPI spec source for the API surface SushiMCP should expose. Once registered, the client draws on the configured llms-txt-source and openapi-spec-source entries during development, giving IDEs and tooling access to contextual snippets and API schemas.
To use SushiMCP, register it with your client using the command shown at the top of this README; this launches the package via npx and connects the local or remote sources. The llms-txt-source provides the contextual documents the MCP uses to prime or augment the model, while the openapi-spec-source tells the system which endpoints and schemas are available. The SushiMCP docs cover additional configuration options for tailoring context delivery to your project's workflow.
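If you want to run the server by hand or point it at your own project, the direct invocation looks like this (the source names and URLs below are placeholders; substitute your own documentation and spec endpoints):

npx -y @chriswhiterocks/sushimcp@latest \
  --llms-txt-source my_project:https://example.com/llms.txt \
  --openapi-spec-source my_api:http://localhost:3000/openapi.json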
How to install
Prerequisites:
- Node.js 14+ and npm (or an environment capable of running npx)
- Internet access to install the package
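You can confirm the prerequisites with a quick version check:

node --version
npx --version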
Installation and first run:
- Use the registration command shown at the top of this README; this is the recommended way to run SushiMCP with the default configuration.
- Alternatively, create or extend an MCP client config file:
{
  "sushimcp": {
    "command": "npx",
    "args": [
      "-y",
      "@chriswhiterocks/sushimcp@latest",
      "--llms-txt-source",
      "cool_project:https://coolproject.dev/llms-full.txt",
      "--openapi-spec-source",
      "local_api:http://localhost:8787/api/v1/openapi.json"
    ]
  }
}
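Note: some clients expect the server map to be nested under an mcpServers key; Claude Desktop's claude_desktop_config.json is one example. If yours does, wrap the object accordingly:

{
  "mcpServers": {
    "sushimcp": {
      "command": "npx",
      "args": [
        "-y",
        "@chriswhiterocks/sushimcp@latest",
        "--llms-txt-source",
        "cool_project:https://coolproject.dev/llms-full.txt",
        "--openapi-spec-source",
        "local_api:http://localhost:8787/api/v1/openapi.json"
      ]
    }
  }
}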
- Run SushiMCP through your MCP client (the exact command comes from the config above). For a quick start, you can execute the equivalent npx command directly:
npx -y @chriswhiterocks/sushimcp@latest \
  --llms-txt-source cool_project:https://coolproject.dev/llms-full.txt \
  --openapi-spec-source local_api:http://localhost:8787/api/v1/openapi.json
- Ensure the llms.txt source and the OpenAPI spec source are reachable from the host running SushiMCP; adjust the URLs as needed for your environment.
- Verify the server is active by confirming your client can fetch the OpenAPI spec and the contextual data from the configured sources; a quick reachability check is shown below.
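One way to confirm both sources respond before wiring them into a client (the URLs are the ones from the example configuration; substitute your own):

curl -fsSL https://coolproject.dev/llms-full.txt | head -n 5
curl -fsSL http://localhost:8787/api/v1/openapi.json | head -c 200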
Additional notes
Tips and caveats:
- Ensure the llms-txt-source URL and openapi-spec-source URL are accessible from the machine running SushiMCP. Network restrictions can block access and prevent proper context delivery.
- If you update the LLMS or API sources, restart SushiMCP so it can pick up the new data sources.
- The AGPL license applies to SushiMCP; ensure compliance when deploying in your environment.
- Consider pinning a specific version rather than latest to avoid unexpected breaking changes; see the example after this list.
- Glama.ai badges and other integrations can help you monitor the performance and quality of the MCP server in production.
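For example, to pin a release instead of tracking latest (the version number below is illustrative; check the npm registry for actual releases):

npx -y @chriswhiterocks/sushimcp@1.2.3 \
  --llms-txt-source cool_project:https://coolproject.dev/llms-full.txt \
  --openapi-spec-source local_api:http://localhost:8787/api/v1/openapi.json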
Related MCP Servers
context7
Context7 MCP Server -- Up-to-date code documentation for LLMs and AI code editors
ragrabbit
Open Source, Self-Hosted, AI Search and LLM.txt for your website
MiniMax-JS
Official MiniMax Model Context Protocol (MCP) JavaScript implementation that provides seamless integration with MiniMax's powerful AI capabilities including image generation, video generation, text-to-speech, and voice cloning APIs.
mcp-llms-txt-explorer
MCP to explore websites with llms.txt files
mcp-easy-installer
MCP easy installer is a robust mcp server with tools to search, install, configure, repair and uninstall MCP servers
mcp-bundler
Is the MCP configuration too complicated? You can easily share your own simplified setup!