phone-a-friend
MCP server from abhishekbhakat/phone-a-friend-mcp-server
claude mcp add --transport stdio abhishekbhakat-phone-a-friend-mcp-server \
  --env OPENAI_API_KEY=<YOUR_API_KEY> \
  -- uvx phone-a-friend-mcp-server --provider openai --api-key <YOUR_API_KEY>
How to use
Phone-a-Friend is an MCP server that lets a primary AI outsource deep reasoning tasks to an external AI through a two-step consultation flow. The server gathers the relevant context, sends it to an external provider (OpenAI/OpenRouter by default) for critical thinking, and returns actionable insights tailored to the primary AI. This is especially useful for long-context reasoning, complex multi-step problems, and cross-domain analysis, where a second AI acting as a consultant improves problem solving.

Two tools are exposed: phone_a_friend makes the external API call to perform the reasoning, while fax_a_friend generates a master prompt file for manual review or human-in-the-loop workflows.

To use it in a compatible MCP client (e.g., Claude Desktop or similar clients), configure the server with the provided JSON block and supply your API key via an environment variable or CLI flag. Once configured, invoke phone_a_friend to obtain structured, decision-ready insights, or fax_a_friend to prepare a reusable master prompt for manual consultations.
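For reference, an mcp_config entry for this server typically looks like the sketch below. The server key name ("phone-a-friend") and the exact argument layout are assumptions derived from the install command above; match them to your client's configuration file:

```json
{
  "mcpServers": {
    "phone-a-friend": {
      "command": "uvx",
      "args": ["phone-a-friend-mcp-server", "--provider", "openai"],
      "env": {
        "OPENAI_API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}
```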
How to install
Prerequisites:
- Python 3.8+ (recommended 3.10+)
- Access to an MCP client that supports the mcp_config format described here
- The uv tool (which provides the uvx runner) installed on your system, since the configuration above launches the server via uvx
Installation steps:
- Install Python tooling (if not already installed):
- macOS/Linux: install Python 3.x from https://www.python.org or via your package manager
- Windows: install Python 3.x from https://www.python.org
- Install uv (which provides the uvx runner) so you can run the MCP server package. You can install it via:
- pipx: pipx install uv
- or pip: pip install uv
- Alternatively, follow your environment’s recommended installation method for uv (see https://docs.astral.sh/uv for the standalone installer)
- Verify installation:
- Run: uvx --version
- Prepare the MCP config snippet (provided above) and place it in your MCP client configuration file.
- Obtain an API key for your selected provider (e.g., OpenAI). Set the key in your environment or pass via CLI as shown in the configuration example.
- Run the MCP server via the configured launcher (the uvx command installed above) using the provided arguments. For example:
- uvx phone-a-friend-mcp-server --provider openai --api-key "YOUR_API_KEY"
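The last two steps can be wrapped in a small launch helper; a minimal sketch, assuming a POSIX shell, that uvx is on your PATH, and the flag names shown above:

```shell
#!/bin/sh
# Minimal launch helper: refuse to start without a key, then hand off
# to uvx. The exec line is commented out so the sketch is safe to source.
start_server() {
  if [ -z "${OPENAI_API_KEY:-}" ]; then
    echo "error: set OPENAI_API_KEY before launching" >&2
    return 1
  fi
  echo "starting: uvx phone-a-friend-mcp-server --provider openai"
  # exec uvx phone-a-friend-mcp-server --provider openai --api-key "$OPENAI_API_KEY"
}
```

Uncomment the exec line and call start_server to actually launch; normally your MCP client spawns this process for you over stdio.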
Additional notes
Tips and common issues:
- Ensure you provide a valid API key for the chosen provider (OpenAI, OpenRouter, etc.). The config supports switching providers via the --provider flag.
- If you encounter environment variable issues, verify you have OPENAI_API_KEY (and other provider keys if you switch providers) set in your shell or in the MCP client’s env block.
- The fax_a_friend tool helps you generate a master prompt file (fax_a_friend.md) for manual consultations—useful for validating prompts or sharing with human experts.
- The default model can be overridden with CLI flags or environment variables per the Advanced Configuration section in the README. Refer to those options if you need to tweak model selection or temperature.
- When using long-context scenarios, confirm your provider’s token limits and adjust the all_related_context and file_list inputs to avoid exceeding those limits.
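A quick way to act on the environment-variable tip above: a small check, assuming a POSIX shell with coreutils, that reports whether a provider key is exported (variable names other than OPENAI_API_KEY depend on your provider):

```shell
#!/bin/sh
# Report whether a provider key is visible to child processes
# (i.e., exported), which is what an MCP client launcher will see.
check_env() {
  if printenv "$1" >/dev/null 2>&1; then
    echo "$1 is set"
  else
    echo "$1 is missing"
  fi
}

check_env OPENAI_API_KEY
```

Run the same check for OPENROUTER_API_KEY or other provider keys after switching providers.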
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP.