fuel

An MCP server that provides semantic search over Fuel documentation and integrates with coding IDEs such as Cursor.

Installation
Run this command in your terminal to add the MCP server to Claude Code. The environment variables are optional; the values shown are the defaults, except LOG_LEVEL, which can be set to debug for verbose output:

claude mcp add --transport stdio fuellabs-fuel-mcp-server \
  --env LOG_LEVEL=debug \
  --env CHUNK_SIZE=2000 \
  --env NUM_RESULTS=5 \
  --env EMBEDDING_MODEL=Xenova/all-MiniLM-L6-v2 \
  --env VECTRA_INDEX_PATH=./vectra_index \
  -- bun run /absolute/path/to/fuel-mcp-server/src/cli.ts --transport stdio

How to use

This MCP server provides semantic search over Fuel Network and Sway documentation. It indexes local Markdown docs into a file-based vector store using embeddings and exposes search capabilities via MCP tools. You can interact with the server through STDIO or HTTP transports, enabling IDE integrations (e.g., Cursor) to query for Fuel and Sway concepts, API references, and tutorials.

The server exposes tools such as searchFuelDocs for semantic queries and provideStdContext, which retrieves Sway standard library paths and types, making it easier to build contextual tooling in your editor. Start the server in STDIO mode for quick local use, or switch to the HTTP transport to expose an HTTP endpoint for integration in other environments.
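Once connected, a client invokes searchFuelDocs with a standard MCP tools/call request. The method and message shape come from the MCP specification; the exact argument names (query, numResults below) are assumptions and may differ in this server:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "searchFuelDocs",
    "arguments": {
      "query": "How do I declare a storage variable in Sway?",
      "numResults": 5
    }
  }
}
```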

How to install

Prerequisites:

  • Bun installed (https://bun.sh/)
  • Access to the repository with the Fuel MCP server code

Step-by-step installation:

  1. Clone the repository:
     git clone https://github.com/FuelLabs/fuel-mcp-server
     cd fuel-mcp-server

  2. Install dependencies:
     bun install

  3. Index the documentation (example uses ./docs as the default):
     bun run src/indexer.ts ./docs

  4. Start the MCP server (STDIO transport by default):
     bun run src/cli.ts

Optional: start with the HTTP transport if you plan to expose the server via HTTP:
     bun run src/cli.ts --transport http --port 3500
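For STDIO use from Cursor, the server can be registered in Cursor's MCP config file. A minimal ~/.cursor/mcp.json entry might look like the sketch below; the mcpServers schema is Cursor's, while the server name "fuel" and the absolute paths are placeholders you should adapt:

```json
{
  "mcpServers": {
    "fuel": {
      "command": "bun",
      "args": ["run", "/absolute/path/to/fuel-mcp-server/src/cli.ts"],
      "env": {
        "VECTRA_INDEX_PATH": "/absolute/path/to/fuel-mcp-server/vectra_index"
      }
    }
  }
}
```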

Notes:

  • Ensure your environment has access to the docs you want to index.
  • You can customize embedding models and chunk sizes via environment variables as described in the config.

Additional notes

Tips and common considerations:

  • The server uses a local file-based Vectra index; no external database is required beyond the vectra_index directory. If you move or rename the index, update VECTRA_INDEX_PATH accordingly.
  • Embedding model selection can impact indexing time and search quality. If you see slower indexing, try a smaller CHUNK_SIZE or a lighter model; for higher quality, use a larger model as long as resources permit.
  • NUM_RESULTS controls the default number of results returned by searchFuelDocs; you can override it per query as needed.
  • If you run into transport binding issues with HTTP, ensure the port is free and not blocked by a firewall.
  • For development, the provideStdContext tool returns Sway standard library paths and types, which helps in editor autocomplete and tooling.
  • Environment variables are documented in the config; adjust VECTRA_INDEX_PATH, EMBEDDING_MODEL, CHUNK_SIZE, NUM_RESULTS, and LOG_LEVEL as needed for your environment.
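To build intuition for how CHUNK_SIZE trades off indexing granularity, here is an illustrative sketch of a token-targeted chunker. This is not the code in src/indexer.ts, which may differ; the "4 characters ≈ 1 token" heuristic and the paragraph-based splitting are assumptions for illustration only:

```typescript
// Illustrative sketch: split Markdown into chunks of roughly
// `targetTokens` tokens, approximating 4 characters per token.
// Paragraphs larger than the target are kept whole, not split further.
function chunkByTokens(text: string, targetTokens = 2000): string[] {
  const approxCharsPerToken = 4;
  const maxChars = targetTokens * approxCharsPerToken;
  const paragraphs = text.split(/\n\n+/);
  const chunks: string[] = [];
  let current = "";
  for (const p of paragraphs) {
    // Start a new chunk when adding this paragraph would exceed the budget.
    if (current && current.length + p.length + 2 > maxChars) {
      chunks.push(current);
      current = p;
    } else {
      current = current ? current + "\n\n" + p : p;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}
```

Smaller targets produce more, finer-grained chunks, which speeds up embedding of each chunk but can fragment context; larger targets keep related content together at the cost of slower indexing.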
