llms-txt-generator
The ultimate AI-powered generator for llms.txt and llms-full.txt files.
claude mcp add --transport stdio aircodelabs-llms-txt-generator npx -y llms-txt-generator-mcp
How to use
llms-txt-generator is an MCP-enabled tool that automatically creates AI-optimized llms.txt and llms-full.txt documentation files for your project. Its CLI generates both a concise navigation file and a comprehensive full-text file that help AI models understand codebases more effectively. The MCP integration exposes a single tool named generate-llms, which MCP clients can invoke to produce both documents. You can configure the tool in Cursor or Claude Desktop so your AI assistant can request documentation generation directly over the MCP channel; the tool then reads your project and writes llms.txt and llms-full.txt to your configured paths, giving AI agents richer context for navigating and reasoning about your code.
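As a sketch of how a client drives this, an MCP client invokes the generate-llms tool with a standard JSON-RPC tools/call request. The argument names below (projectPath, outputDir) are illustrative assumptions, not the tool's confirmed schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate-llms",
    "arguments": {
      "projectPath": "./",
      "outputDir": "./"
    }
  }
}
```

In practice your MCP client (Cursor, Claude Desktop, etc.) builds this request for you when the assistant decides to call the tool.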
How to install
Prerequisites:
- Node.js v18+ installed on your system
- pnpm package manager
- Basic TypeScript knowledge (optional for contribution)
Install and run locally:
- Clone the repository:
git clone <repository-url>
cd llms-txt-generator
- Install dependencies:
pnpm install
- Build the project (produces distributables and CLI):
pnpm build
- Run tests (optional but recommended):
pnpm test
Usage examples:
- Run the CLI directly via npx (from npm registry):
npx llms-txt-generator init
npx llms-txt-generator build
npx llms-txt-generator help
- If installed locally, you can also run:
pnpm exec llms-txt-generator init
pnpm exec llms-txt-generator build
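For reference, the generated llms.txt follows the emerging llms.txt convention: an H1 project title, a blockquote summary, and H2 sections of annotated links. The sections and paths below are illustrative, not guaranteed output of this tool:

```markdown
# llms-txt-generator

> AI-powered generator for llms.txt and llms-full.txt documentation files.

## Docs

- [README](README.md): Project overview, installation, and CLI usage
- [MCP setup](docs/mcp.md): Configuring the generate-llms tool in MCP clients
```

llms-full.txt expands the same outline with the full text of each linked document, which is why it is the larger of the two files.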
MCP setup (example): To expose the MCP server via npx, configure your MCP client as follows (adjust the server name as needed):
{
  "mcpServers": {
    "llms-generator": {
      "command": "npx",
      "args": ["-y", "llms-txt-generator-mcp"]
    }
  }
}
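The same mcpServers entry works across clients. For Claude Desktop it goes in claude_desktop_config.json; for Cursor, a sketch of the project-level .cursor/mcp.json (the global ~/.cursor/mcp.json uses the same shape):

```json
{
  "mcpServers": {
    "llms-generator": {
      "command": "npx",
      "args": ["-y", "llms-txt-generator-mcp"]
    }
  }
}
```

After saving the config, restart the client so it picks up the new server.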
Additional notes
Tips and common issues:
- Ensure Node.js v18+ is used to avoid TypeScript and dependency compatibility issues.
- If you encounter a "module not found" error, install the package globally or run it via npx as shown in the usage examples above.
- MCP integration relies on the npm package llms-txt-generator-mcp; make sure it resolves in your environment.
- When configuring for Cursor or Claude Desktop, keep the MCP server name consistent (e.g., llms-generator) and ensure the command/arguments match your setup.
- If you customize output paths or formats, review the generated llms.txt and llms-full.txt to confirm they reflect your project structure as expected.