metashape
AI-powered photogrammetry automation — MCP server for Agisoft Metashape Professional 2.3+. Control the full pipeline (alignment, dense cloud, mesh, texture, DEM, orthomosaic, export) using natural language through Claude or any MCP-compatible AI assistant.
claude mcp add --transport stdio jenkinsm13-metashape-mcp python -m metashape_mcp.server
How to use
This MCP server exposes Metashape's entire photogrammetry workflow as AI-callable tools through the MCP interface. Once running inside Metashape's embedded Python environment, the server listens over HTTP at http://127.0.0.1:8765/mcp, and you can drive the processing pipeline with natural-language prompts from Claude or any other MCP-compatible AI assistant. It provides 106 tools across 15 modules (e.g., project, photos, alignment, dense, mesh, texture, survey, export, import, markers, coordinate, network, viewport, scripting), along with real-time resources for inspecting project state and guided workflow prompts. Operation is non-blocking, so the Metashape GUI stays responsive while the AI executes long tasks in the background, and the server can also run in headless mode for remote servers or CI pipelines.
To use it, send MCP tool calls or prompts that describe the desired photogrammetry task (for example, create a new project, import photos, align cameras, generate a dense point cloud, build a DEM, create an orthomosaic, export to OBJ/PLY/FBX, or run a batch export). The server translates these requests into Metashape actions and streams progress updates back to you. You can also query resources such as project info, chunk statistics, or CRS details to monitor progress and verify results. The 6 guided workflow prompts — including aerial_survey_pipeline, close_range_pipeline, batch_export, diagnose_alignment, and optimize_quality_settings — help you run end-to-end pipelines with minimal manual configuration.
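As a rough sketch of what such a tool call looks like on the wire, the snippet below builds a standard MCP JSON-RPC 2.0 `tools/call` request body for the HTTP endpoint. The tool name `align_cameras` and its arguments are illustrative assumptions, not confirmed entries from this server's tool list — query `tools/list` for the real names.

```python
import json

MCP_ENDPOINT = "http://127.0.0.1:8765/mcp"  # default endpoint from the README

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 'tools/call' payload for an MCP server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool name and arguments -- check tools/list for the real schema.
payload = build_tool_call("align_cameras", {"accuracy": "high"})
print(payload)
```

Your MCP client normally handles this framing for you; the payload is shown only to make the request/response model concrete.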
A typical session consists of issuing a tool call or prompt through your MCP client and awaiting the results of the AI-driven Metashape processing. You can monitor progress in the Metashape Console or through the MCP response stream, and because the UI is non-blocking you can continue editing or inspecting results while the AI runs in the background.
How to install
Prerequisites
- Agisoft Metashape Professional 2.3+ with Python 3.12 scripting support
- Python 3.12+ available in the Metashape environment
- Git installed on your development workstation
Installation steps
- Clone the MCP server repository
git clone https://github.com/jenkinsm13/metashape-mcp.git
cd metashape-mcp
- Ensure Metashape can access the MCP Python package
- Metashape ships with its own embedded Python; this MCP server is designed to run inside that environment. The repository includes a startup workflow that installs the MCP SDK and FastMCP on first run.
- Start the MCP server inside Metashape
- There are multiple options described in the README. The recommended approach is to run the included startup script inside Metashape or via the Metashape Python Console. Option C (from Metashape's Python Console):
import sys
sys.path.insert(0, r"C:\path\to\metashape-mcp\src")
from metashape_mcp.server import start_background
start_background()
# Server running on http://127.0.0.1:8765/mcp
- Verify server startup
- Open a browser or MCP client and navigate to http://127.0.0.1:8765/mcp to confirm the MCP endpoint is available and ready to accept tool calls.
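If you prefer a programmatic check, here is a minimal standard-library sketch. It treats any HTTP response — even an error status, which some MCP endpoints return for a plain browser-style GET — as proof that the server is listening:

```python
import urllib.error
import urllib.request

def endpoint_reachable(url: str, timeout: float = 2.0) -> bool:
    """Return True if something answers HTTP at `url`, False otherwise."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded with an HTTP error status -- still reachable.
        return True
    except (urllib.error.URLError, OSError):
        return False

print(endpoint_reachable("http://127.0.0.1:8765/mcp"))
```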
- Optional: automatic dependency install
- On first run, the startup scripts detect missing packages (mcp, fastmcp) and install them into Metashape's embedded Python automatically.
Notes
- If you prefer to run directly from the system Python, use the module invocation as shown in the README (adjust paths to your environment).
- For headless deployments, ensure proper network access and that Metashape is configured to allow background processing.
Additional notes
Tips and common issues:
- Ensure Metashape Professional 2.3+ is installed with Python 3.12 scripting support; the MCP server relies on this environment.
- Because the MCP server runs inside Metashape, headless or remote execution requires launching Metashape itself in headless mode (for example, on CI runners or remote servers).
- The server exposes a real-time streaming interface for progress updates; monitor console output for detailed tool calls and status messages during long operations.
- If you encounter port conflicts, adjust the MCP server port in your client configuration or in the startup script where applicable.
- Use the guided workflow prompts to quickly bootstrap common pipelines (e.g., aerial_survey_pipeline or close_range_pipeline) and then tailor individual tool parameters as needed.
- When exporting, ensure the target formats are supported by your Metashape project version and licensing.
- Review the Resources endpoints (e.g., metashape://project/info, /chunks, /chunk/{label}/summary) to track progress and verify results throughout the workflow.
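The resource URIs listed above can be fetched the same way tools are called. Assuming the server follows the standard MCP `resources/read` method (a reasonable assumption for a FastMCP-based server, but unverified here), the request body might look like this:

```python
import json

def build_resource_read(uri: str, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 'resources/read' payload for an MCP resource URI."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "resources/read",
        "params": {"uri": uri},
    })

# Resource URI taken from the tips above.
print(build_resource_read("metashape://project/info"))
```

Polling a lightweight resource like `metashape://project/info` between long-running tool calls is a simple way to verify each pipeline stage completed before starting the next.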