gemini-cli
An open-source AI agent that brings the power of Gemini directly into your terminal.
claude mcp add --transport stdio google-gemini-gemini-cli npx -y @google/gemini-cli \
  --env GEMINI_API_KEY="Your Gemini API key (if using API key authentication)" \
  --env GOOGLE_API_KEY="Google Cloud API key (if using Vertex/Google integration)" \
  --env GOOGLE_CLOUD_PROJECT="Your Google Cloud project ID" \
  --env GOOGLE_GENAI_USE_VERTEXAI="Set to 'true' to enable Vertex AI integration"
How to use
Gemini CLI is an open-source AI agent that brings Gemini capabilities directly into your terminal. It provides a lightweight, terminal-first interface to Gemini, exposing tools such as grounding with Google Search, file operations, shell commands, and web fetching. Through MCP (Model Context Protocol) support, Gemini CLI can be extended with custom integrations and workflows, enabling automated tasks and scripted interactions from the command line. Use it to query codebases, automate repetitive tasks, fetch real-time information, and orchestrate multi-step workflows from scripts or interactive sessions.
To use Gemini CLI, install or run it via npx, npm, or your preferred package manager, then invoke gemini with optional flags. You can run in interactive mode for conversational prompts or in non-interactive mode to obtain structured outputs suitable for scripting. Typical usage includes starting a session, selecting a model, and issuing prompts that Gemini will answer while grounding responses with real-time information when needed. The MCP support allows you to connect Gemini CLI with other services or custom capabilities, expanding its reach beyond the built-in features.
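As a sketch of typical invocations (verify the exact flag names against your installed version's help output):

```shell
# Start an interactive session for conversational prompts
gemini

# One-shot, non-interactive prompt, suitable for scripting
gemini -p "Summarize the README in this directory"

# Select a specific model for the request (model name is an example)
gemini -m gemini-2.5-pro -p "Explain this stack trace" < error.log
```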
How to install
Prerequisites:
- Node.js (version 20 or newer) and npm/yarn installed
- Basic familiarity with the command line
Installation steps:
- Install or verify Node.js and npm:
  - macOS/Linux: follow the nodejs.org installation instructions
  - Windows: install via the Node.js installer
- Run Gemini CLI with npx (no permanent install required):
  npx -y @google/gemini-cli
- (Optional) Install globally for easier access: npm install -g @google/gemini-cli
- Configure authentication (environment variables):
  - For API key: export GEMINI_API_KEY=YOUR_API_KEY
  - For Vertex/Google integration: export GOOGLE_API_KEY=YOUR_API_KEY and export GOOGLE_GENAI_USE_VERTEXAI=true
  - Set your Google Cloud project: export GOOGLE_CLOUD_PROJECT=YOUR_PROJECT_ID
- Run Gemini CLI: gemini
Note: If using npx, you can run commands directly without a permanent installation.
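Put together, a minimal API-key setup might look like this (YOUR_API_KEY is a placeholder; Vertex users would export the GOOGLE_* variables listed above instead):

```shell
# Authenticate with a Gemini API key (placeholder value)
export GEMINI_API_KEY=YOUR_API_KEY

# Run once via npx without a permanent install...
npx -y @google/gemini-cli -p "Hello"

# ...or, after a global install, invoke the binary directly
gemini -p "Hello"
```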
Additional notes
Tips and common issues:
- If you encounter authentication errors, verify that the correct environment variables are exported and accessible in your shell.
- For MCP integrations, ensure the target server name (gemini-cli) matches in your MCP config and that the command is visible in your runtime environment.
- When using Vertex AI, ensure your Google Cloud project has the appropriate permissions and API access enabled.
- The Gemini CLI supports a wide range of prompts; for scripting, use --output-format json or --output-format stream-json to get structured results suitable for parsing.
- If using npx in constrained environments, consider a local npm cache or a temporary installation to avoid network-related delays on every invocation.
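For scripting, the JSON output can be consumed with any JSON parser. A minimal Python sketch, assuming the CLI is on PATH and that the JSON payload carries the answer under a top-level "response" field (check the field names against your CLI version):

```python
import json
import subprocess

def ask_gemini(prompt: str) -> str:
    """Run gemini non-interactively and return the text of its answer.

    Assumes `gemini` is on PATH and that --output-format json emits an
    object with a top-level "response" field (verify for your version).
    """
    result = subprocess.run(
        ["gemini", "-p", prompt, "--output-format", "json"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)["response"]

def extract_response(raw: str) -> str:
    """Pull the answer text out of one JSON payload (pure parsing helper)."""
    return json.loads(raw)["response"]

# The parsing helper can be exercised on a canned payload, so this runs
# even without the CLI installed (the payload shape is an assumption):
sample = '{"response": "2 + 2 = 4", "stats": {"turns": 1}}'
print(extract_response(sample))  # → 2 + 2 = 4
```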
Related MCP Servers
pluggedin-app
The Crossroads for AI Data Exchanges. A unified, self-hostable web interface for discovering, configuring, and managing Model Context Protocol (MCP) servers—bringing together AI tools, workspaces, prompts, and logs from multiple MCP sources (Claude, Cursor, etc.) under one roof.
mcpcat-python-sdk
MCPcat is an analytics platform for MCP server owners 🐱.
mcp-js
MCP server that exposes YepCode processes as callable tools for AI platforms. Securely connect AI assistants to your YepCode workflows, APIs, and automations.
mcp-gemini
This project provides a dedicated MCP (Model Context Protocol) server that wraps the @google/genai SDK. It exposes Google's Gemini model capabilities as standard MCP tools, allowing other LLMs (like Cline) or MCP-compatible systems to leverage Gemini's features as a backend workhorse.
architect
A powerful, self-extending MCP server for dynamic AI tool orchestration. Features sandboxed JS execution, capability-based security, automated rate limiting, marketplace integration, and a built-in monitoring dashboard. Built for the Model Context Protocol (MCP).
google-scholar
An MCP server for Google Scholar written in TypeScript with Streamable HTTP