MCP-wolfram-alpha
Connect your chat REPL to Wolfram Alpha computational intelligence
claude mcp add --transport stdio secretiveshell-mcp-wolfram-alpha \
  --env WOLFRAM_API_KEY="your-app-id" \
  -- uv --directory C:\Users\root\Documents\MCP-wolfram-alpha run MCP-wolfram-alpha
How to use
This MCP server integrates Wolfram Alpha's computational intelligence into your MCP client. It exposes two pieces: a prompt, analogous to the wa helper, that guides the model to answer user questions with Wolfram Alpha, and a tool, query_wolfram_alpha(query), that sends the query to the Wolfram Alpha API and returns the response as text. To use it, set WOLFRAM_API_KEY in the environment and run the server; then issue a query through your MCP client and read back the result returned by the Wolfram Alpha API. The server is designed for the full results API, though it may work with more limited endpoints depending on your API plan.
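Under the hood, queries go to Wolfram Alpha's full results API over HTTP. A minimal sketch of how a request URL can be assembled (the endpoint and parameters follow the public full results API documentation; the helper name is illustrative, not the server's actual code):

```python
from urllib.parse import urlencode

# Full results API endpoint, per Wolfram's public documentation.
WOLFRAM_ENDPOINT = "https://api.wolframalpha.com/v2/query"

def build_query_url(query: str, app_id: str) -> str:
    """Build a full-results API request URL for a natural-language query."""
    params = urlencode({
        "appid": app_id,        # your WOLFRAM_API_KEY / App ID
        "input": query,         # the natural-language question
        "output": "json",       # request JSON instead of the default XML
        "format": "plaintext",  # ask for plain-text subpods
    })
    return f"{WOLFRAM_ENDPOINT}?{params}"

url = build_query_url("integrate x^2", "your-app-id")
```

The server handles this request/response cycle for you; the sketch only shows the shape of the call your API key authorizes.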
How to install
Prerequisites:
- Python installed on your machine (the server runs with uv, the Python package and project manager).
- A Wolfram Alpha API key (WOLFRAM_API_KEY).
- Access to a compatible MCP inspector/CLI if you want to debug interactively.
Step 1: Clone or download the MCP-wolfram-alpha repository to your working directory.
Step 2: Install any required Python dependencies. If a requirements.txt is provided, install with:
pip install -r requirements.txt
Step 3: Set up your Wolfram API key in the environment. You can export it in your shell or set it in your MCP configuration:
export WOLFRAM_API_KEY=your-app-id
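The server reads this variable from its environment at startup. A small fail-fast check of the kind you might add to your own tooling (a hypothetical helper, not part of the server's code):

```python
import os

def require_api_key() -> str:
    """Return WOLFRAM_API_KEY, failing fast with a clear message if unset."""
    key = os.environ.get("WOLFRAM_API_KEY", "")
    if not key:
        raise RuntimeError(
            "WOLFRAM_API_KEY is not set; export it before starting the server"
        )
    return key
```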
Step 4: Run the MCP server using uv (as shown in the README example). Ensure the directory path points to your MCP project root:
uv --directory /path/to/MCP-wolfram-alpha run MCP-wolfram-alpha
Step 5: (Optional) If you want to debug with the MCP inspector, install and run a tool like mcp-cli-inspector as described in the project notes, and point it at your config.
Step 6: Verify the server starts and responds to queries by sending a test prompt via your MCP client.
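To sanity-check the round trip, it helps to know what the raw response looks like. The full results API returns pods of subpods with plaintext fields; a hedged sketch of flattening that JSON into the text the tool hands back (function name and sample data are illustrative):

```python
def pods_to_text(result: dict) -> str:
    """Flatten a full-results JSON response into readable text, one line per subpod."""
    lines = []
    for pod in result.get("queryresult", {}).get("pods", []):
        title = pod.get("title", "")
        for sub in pod.get("subpods", []):
            text = sub.get("plaintext") or ""
            if text:
                lines.append(f"{title}: {text}")
    return "\n".join(lines)

# Minimal example shaped like a real queryresult payload.
sample = {"queryresult": {"pods": [
    {"title": "Result", "subpods": [{"plaintext": "x^3/3 + constant"}]},
]}}
```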
Additional notes
- Environment variable: WOLFRAM_API_KEY must be provided for the Wolfram Alpha integration to function.
- If you encounter API limit errors, check your Wolfram Alpha plan and consider implementing queuing or rate limiting in your client.
- When debugging, ensure the --directory path exactly matches your local project location and that the MCP server name in your config matches the one used in your run command.
- If the server fails to start, check file permissions on the working directory and confirm that Python is accessible in your PATH.
- This server expects the full results API; if you only have a partial API key, you may still receive partial responses depending on your plan.
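One way to implement the client-side rate limiting mentioned above is a minimum-interval throttle. A minimal sketch (the class is an illustration for your client code, not something the server provides):

```python
import time

class SimpleRateLimiter:
    """Allow at most one request per `min_interval` seconds."""

    def __init__(self, min_interval: float):
        self.min_interval = min_interval
        self._last = 0.0  # monotonic timestamp of the previous request

    def wait(self) -> float:
        """Sleep until the next request is allowed; return seconds waited."""
        now = time.monotonic()
        delay = max(0.0, self._last + self.min_interval - now)
        if delay:
            time.sleep(delay)
        self._last = time.monotonic()
        return delay
```

Call `limiter.wait()` before each query_wolfram_alpha call to stay under your plan's request quota.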
Related MCP Servers
web-eval-agent
An MCP server that autonomously evaluates web applications.
mcp-neo4j
Neo4j Labs Model Context Protocol servers
Gitingest
MCP server for gitingest
zotero
Model Context Protocol (MCP) server for the Zotero API, in Python
fhir
FHIR MCP Server – helping you expose any FHIR Server or API as a MCP Server.
unitree-go2
The Unitree Go2 MCP Server is a server built on the Model Context Protocol (MCP) that enables users to control the Unitree Go2 robot using natural language commands interpreted by an LLM.