DeepseekMCP
A simple MCP server written in Go that routes questions to DeepSeek models.
claude mcp add --transport stdio chew-z-deepseekmcp /Your/project/path/bin/mcp-deepseek \
  --env DEEPSEEK_MODEL="deepseek-chat" \
  --env DEEPSEEK_API_KEY="YOUR_API_KEY" \
  --env DEEPSEEK_SYSTEM_PROMPT="Optional custom code review instructions"
How to use
DeepSeek MCP Server integrates with DeepSeek's API to provide automated code review and file-context-aware querying. It exposes the following tools under the core API:
- deepseek_ask — analyze code and answer questions, with optional file_paths to provide context
- deepseek_models — list available models
- deepseek_balance — check your DeepSeek account balance
- deepseek_token_estimate — forecast token usage for a given text or file
The server supports JSON mode for structured outputs, making it convenient for CI/CD pipelines and automation tasks. To use it, deploy the DeepSeek MCP server, configure your environment with your API key and preferred model, and invoke the available tools with the appropriate JSON payloads; any local file paths included in a request are uploaded and used as context. Model, system prompt, and temperature can also be adjusted on the fly via command-line overrides.
Typical usage involves selecting a model (e.g., deepseek-chat or deepseek-coder), optionally adjusting the system prompt, and then making requests with file_paths when you need the model to analyze specific source files. For example, you can run deepseek_ask with a query like "Review this Go code for concurrency issues", include the relevant files (main.go, config.go), and set json_mode to true for a JSON-formatted response. The server enforces file-type and size restrictions and provides structured error logging in degraded mode if initialization issues occur.
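As a sketch, assuming the standard MCP tools/call JSON-RPC request shape, the example above might be sent as a payload like this (the argument field names follow the tool parameters described above):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "deepseek_ask",
    "arguments": {
      "query": "Review this Go code for concurrency issues",
      "file_paths": ["main.go", "config.go"],
      "json_mode": true
    }
  }
}
```

In practice an MCP client such as Claude Code constructs this payload for you; it is shown here only to illustrate how the tool arguments map onto a request.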
How to install
Prerequisites:
- Go 1.21+ installed on your system
- Git installed
- A DeepSeek API key
Step-by-step installation:
1. Clone the repository:
   git clone https://github.com/your-username/DeepseekMCP
   cd DeepseekMCP
2. Build the MCP server binary:
   go build -o bin/mcp-deepseek
3. Prepare environment variables (example):
   export DEEPSEEK_API_KEY=your_api_key
   export DEEPSEEK_MODEL=deepseek-chat
   export DEEPSEEK_SYSTEM_PROMPT="Your custom code review prompt here"  # optional
4. Run the server:
   ./bin/mcp-deepseek
5. Optional: run via MCP configuration by pointing a client to the server executable path defined in mcp_config. Ensure the path in command is correct for your deployment.
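As a sketch of the optional MCP-configuration step, a client-side entry might look like the following (the mcpServers shape follows common MCP client configs; the server name, path, and values here are placeholders, not part of this project):

```json
{
  "mcpServers": {
    "deepseek": {
      "command": "/Your/project/path/bin/mcp-deepseek",
      "env": {
        "DEEPSEEK_API_KEY": "your_api_key",
        "DEEPSEEK_MODEL": "deepseek-chat"
      }
    }
  }
}
```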
Notes:
- You can override runtime options via CLI, for example: ./bin/mcp-deepseek -deepseek-model=deepseek-coder
- Ensure network access to DeepSeek API and that your API key has the required permissions.
Additional notes
Tips and considerations:
- Environment variables control model, prompts, timeout, and temperature; adjust them to suit your project. For example, DEEPSEEK_TEMPERATURE controls the randomness of outputs.
- File handling uploads file content to DeepSeek; ensure DEEPSEEK_MAX_FILE_SIZE and DEEPSEEK_ALLOWED_FILE_TYPES reflect your security needs.
- JSON mode is ideal for automation; use deepseek_ask with json_mode=true to obtain parser-friendly responses.
- If initialization fails, the server will enter Degraded Mode and log errors with timestamps for troubleshooting.
- When using file_paths, provide accurate relative or absolute paths to allow correct file retrieval and context provisioning.
- For production, consider configuring exponential backoff variables (DEEPSEEK_INITIAL_BACKOFF, DEEPSEEK_MAX_BACKOFF) to handle transient API errors gracefully.