
9octopus-cli-oss

9Octopus CLI is a powerful, open-source AI coding agent that lives in your terminal. It empowers developers to interact with Large Language Models (LLMs) directly from the command line, enabling seamless coding assistance, file manipulation, and system automation.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio 9octopusdev-9octopus-cli-oss npx -y @9octopus/9octopus-cli
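As a sketch, you can guard the registration so it degrades gracefully when the `claude` CLI is not yet installed, and then list the configured servers to confirm it was added (`claude mcp list` is Claude Code's server-listing command):

```shell
# Register the server, then list configured MCP servers to confirm.
# Prints a hint instead of failing if the claude CLI is not installed.
if command -v claude >/dev/null 2>&1; then
  claude mcp add --transport stdio 9octopusdev-9octopus-cli-oss npx -y @9octopus/9octopus-cli
  claude mcp list
else
  echo "claude CLI not found; install Claude Code first"
fi
```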

How to use

9Octopus CLI is a terminal-based AI coding agent that lets you chat with large language models directly from your terminal. It can operate in Direct Mode using your own API keys (OpenAI, Anthropic, etc.) or connect to a hosted backend if you prefer. The tool integrates with your development workflow to read and manipulate files, run shell commands, and help with coding tasks, acting as an in-terminal assistant powered by an LLM.

To use it, install the CLI and run the 9octopus-cli command to start an interactive session. Inside the session you can issue normal chat messages and leverage the built-in tools to query code, generate snippets, or automate repetitive tasks. You can customize the assistant’s behavior by placing a 9octopus.system.md file at your project root, which the agent will read as its system prompt.
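For example, a minimal 9octopus.system.md could be created at the project root like this (the file name comes from the docs above; the prompt text itself is purely illustrative):

```shell
# Write an illustrative system prompt to the project root.
# The contents below are an example, not an official template.
cat > 9octopus.system.md <<'EOF'
You are a coding assistant for this repository.
Prefer small, focused changes and explain each edit briefly.
EOF
echo "system prompt written"
```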

How to install

Prerequisites:

  • Node.js and npm installed on your system
  • Optional: access to an OpenAI or Anthropic API key if using Direct Mode
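The prerequisites above can be sanity-checked from the shell before installing:

```shell
# Confirm Node.js and npm are on PATH before running the install steps.
node --version
npm --version
```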

Installation steps:

  1. Install the CLI globally via npm (recommended):
npm install --global @9octopus/9octopus-cli
  2. Verify installation:
9octopus-cli --version
  3. Alternative using npx (if you don’t want a global install):
npx -y @9octopus/9octopus-cli
  4. Start an interactive session:
9octopus-cli
  5. If you prefer Direct Mode with your own API keys, set environment variables before starting the session (see the next section for details).

Additional notes

  • Direct Mode (recommended for developers): set your API keys in the environment to connect directly to LLM providers without a middleman.
  • Environment variables commonly used:
    • OPENAI_API_KEY for OpenAI
    • ANTHROPIC_API_KEY for Anthropic
    • GOOGLE_API_KEY if you’re using Google provider integrations
    • GROK_API_KEY for Grok or other providers as applicable
    • To disable the hosted service and use Direct Mode, set 9OCTOPUS_SERVICE=false
  • You can customize the agent behavior by adding a 9octopus.system.md at your project root. This file is read as the system prompt for the agent.
  • Commands within the CLI include model management and session controls (as described in the help output). If you need help, type /help inside the session.
  • If you encounter connectivity or API key errors, double-check that the keys are valid and that your network allows outbound access to the LLM provider endpoints.
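Pulling the notes above together, a Direct Mode setup might look like the sketch below. The variable names come from the list above and the key values are placeholders; note that because 9OCTOPUS_SERVICE begins with a digit, POSIX shells reject it in an `export` statement, so it has to be passed via `env` at launch instead:

```shell
# Direct Mode sketch: variable names are from the notes above;
# the values are placeholders, not real credentials.
export OPENAI_API_KEY="sk-your-key-here"
export ANTHROPIC_API_KEY="your-key-here"
# 9OCTOPUS_SERVICE starts with a digit, which `export` rejects in
# POSIX shells, so pass it when launching the session instead:
# env 9OCTOPUS_SERVICE=false 9octopus-cli
echo "keys configured for Direct Mode"
```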
