
DeepView MCP

DeepView MCP is a Model Context Protocol server that enables IDEs like Cursor and Windsurf to analyze large codebases using Gemini 2.5 Pro's extensive context window.

Installation
Run the following command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio ai-1st-deepview-mcp \
  --env GEMINI_API_KEY="your_gemini_api_key" \
  -- python -m deepview_mcp

How to use

DeepView MCP exposes a single tool, deepview, that lets IDEs like Cursor and Windsurf analyze an entire codebase using Gemini's large context window. The server can optionally load a codebase from a single text file at startup and then answer questions about it.

To use it from an IDE, add a new MCP server that points at the deepview_mcp entry point, with your Gemini API key set in the environment. The typical workflow is:

  • Start the server via your IDE's MCP framework.
  • Invoke the deepview tool by asking a question about the codebase.
  • Optionally supply a codebase_file so the server loads that file before answering.
  • Optionally select the Gemini model via the command-line option, or rely on the default.

The tool surface is intentionally minimal: a single deepview tool that accepts a required question and an optional codebase_file parameter to scope the query to a specific file or dataset.
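Under the hood, MCP tools are invoked with a JSON-RPC `tools/call` request. The sketch below builds such a request for the deepview tool, following the parameter names described above (a required question, an optional codebase_file); the exact wire details are handled by your IDE's MCP client, so treat this as an illustration rather than something you would normally write by hand.

```python
import json

def make_deepview_call(question, codebase_file=None, request_id=1):
    """Build an MCP tools/call request payload for the deepview tool."""
    arguments = {"question": question}
    if codebase_file is not None:
        # Only include the optional parameter when the caller supplies it.
        arguments["codebase_file"] = codebase_file
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": "deepview", "arguments": arguments},
    }

payload = make_deepview_call(
    "Where is authentication handled?", codebase_file="codebase.txt"
)
print(json.dumps(payload, indent=2))
```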

How to install

Prerequisites:

  • Python 3.13+ installed on your system
  • A Gemini API key from Google AI Studio (or equivalent) with access to the Gemini models you want to use
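A missing API key is the most common setup failure, so a quick preflight check can save a debugging round-trip. This is a minimal sketch; the variable name GEMINI_API_KEY matches the configuration shown in this guide.

```python
import os
import sys

def check_gemini_key(env=os.environ):
    """Return True if a non-empty GEMINI_API_KEY is present in env."""
    return bool(env.get("GEMINI_API_KEY", "").strip())

if not check_gemini_key():
    print("GEMINI_API_KEY is not set; the server cannot authenticate.",
          file=sys.stderr)
```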

Installation via pip (recommended):

pip install deepview-mcp

Alternative installation (via Smithery, for Claude Desktop):

npx -y @smithery/cli install @ai-1st/deepview-mcp --client claude

Usage after installation (basic):

deepview-mcp [path/to/codebase.txt]
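If you launch the server from a wrapper script, the command line can be assembled programmatically. The sketch below mirrors the usage line above plus the --model and --log-level options mentioned later in this guide; it only builds the argument list and makes no assumptions beyond those documented options.

```python
def build_command(codebase_file=None, model=None, log_level=None):
    """Assemble a deepview-mcp command line from optional settings."""
    cmd = ["deepview-mcp"]
    if codebase_file:
        cmd.append(codebase_file)          # positional codebase file
    if model:
        cmd += ["--model", model]          # pin a specific Gemini model
    if log_level:
        cmd += ["--log-level", log_level]  # e.g. for debugging
    return cmd

print(build_command("codebase.txt", model="gemini-2.5-pro-exp-03-25"))
```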

To configure via your IDE MCP settings, add a server with the following example (adjust paths and keys as needed):

{
  "mcpServers": {
    "deepview": {
      "command": "/path/to/deepview-mcp",
      "args": [],
      "env": {
        "GEMINI_API_KEY": "your_gemini_api_key"
      }
    }
  }
}

If you want to pin a specific Gemini model, pass it through args, for example: --model gemini-2.5-pro-exp-03-25.
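When the same settings need to be written out for several machines or projects, the JSON entry above can be generated from a short script. This is a sketch using the placeholder path and key from the example config, which you would replace with real values.

```python
import json

def deepview_config(command="/path/to/deepview-mcp",
                    api_key="your_gemini_api_key",
                    extra_args=None):
    """Build the mcpServers entry shown in the IDE settings example."""
    return {
        "mcpServers": {
            "deepview": {
                "command": command,
                "args": list(extra_args or []),
                "env": {"GEMINI_API_KEY": api_key},
            }
        }
    }

cfg = deepview_config(extra_args=["--model", "gemini-2.5-pro-exp-03-25"])
print(json.dumps(cfg, indent=2))
```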

Additional notes

Tips:

  • Ensure GEMINI_API_KEY is set in the environment so the server can authenticate with Gemini.
  • You can preload a codebase by providing a codebase file path in the MCP configuration, or supply one when querying.
  • If the codebase fails to load, verify the file path and the format expected by your codebase-preparation tool (e.g., repomix).
  • The server expects a single-file codebase input; for very large projects, pre-package the repository with repomix as described in the docs.
  • Use --model to select the Gemini model and --log-level for debugging, either on the CLI or via your IDE's MCP config where supported.
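The single-file packaging step above can be illustrated with a hand-rolled concatenation. The docs recommend repomix for this; the sketch below only shows the idea of flattening a project tree into one annotated text file, and the `=====` header format is an assumption of this example, not the format repomix produces.

```python
from pathlib import Path

def pack_codebase(root, out_file, suffixes=(".py", ".md")):
    """Concatenate matching files under root into one text file,
    prefixing each with a header naming its relative path."""
    root = Path(root)
    parts = []
    for path in sorted(root.rglob("*")):
        if path.is_file() and path.suffix in suffixes:
            rel = path.relative_to(root)
            body = path.read_text(encoding="utf-8", errors="replace")
            parts.append(f"===== {rel} =====\n{body}\n")
    text = "".join(parts)
    Path(out_file).write_text(text, encoding="utf-8")
    return text
```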
