
mcp-wolframalpha

A Python-powered Model Context Protocol (MCP) server and client that use the Wolfram|Alpha API.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio akalaric-mcp-wolframalpha \
  --env WOLFRAM_API_KEY="your_wolframalpha_appid" \
  --env GeminiAPI="your_google_gemini_api_key" \
  -- python /path/to/src/core/server.py

The GeminiAPI variable is optional and only needed if you plan to use the Gemini-based client.

How to use

This MCP server integrates Wolfram|Alpha's computational knowledge into your chat applications. It exposes an MCP server that wraps the Wolfram|Alpha API, so you can run computational queries and retrieve structured knowledge from within a conversation.

The repository also includes an example MCP client that connects a Gemini-based LLM (via LangChain) to the server, enabling real-time Wolfram|Alpha queries from within large language model workflows. A Gradio-based web UI is provided as a user-friendly interface for interacting with both Google Gemini and Wolfram|Alpha through the MCP server.

To use it, configure the MCP server in your host application (e.g., the VSCode MCP Server or Claude Desktop) and run the provided client tooling to issue Wolfram|Alpha queries over the MCP protocol. The client examples show how to start the Gradio UI or run the CLI to send queries through the MCP layer.
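For context on the "structured knowledge" the server retrieves: Wolfram|Alpha's Full Results API (v2) returns XML organized into pods and subpods. The sketch below shows one way such a response can be flattened into title → plaintext pairs. The sample XML is illustrative, not a real API response, and this helper is not part of the repository:

```python
# Sketch: flatten a Wolfram|Alpha Full Results API (v2) XML response into
# a {pod title: plaintext} mapping. SAMPLE is a hand-written illustration
# of the pod/subpod structure, not captured API output.
import xml.etree.ElementTree as ET


def extract_pods(xml_text):
    """Map each pod's title to its first subpod's plaintext."""
    root = ET.fromstring(xml_text)
    pods = {}
    for pod in root.findall("pod"):
        plaintext = pod.findtext("subpod/plaintext")
        if plaintext:
            pods[pod.get("title")] = plaintext.strip()
    return pods


SAMPLE = """\
<queryresult success="true">
  <pod title="Input interpretation">
    <subpod><plaintext>2 + 2</plaintext></subpod>
  </pod>
  <pod title="Result">
    <subpod><plaintext>4</plaintext></subpod>
  </pod>
</queryresult>"""

print(extract_pods(SAMPLE))
# → {'Input interpretation': '2 + 2', 'Result': '4'}
```

The MCP server handles this translation for you; the point is that each pod (result section) arrives with a title and a plaintext rendering that can be passed back to the LLM.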

How to install

Prerequisites:

  • Python 3.8+ installed on your system
  • Git
  • Access to Wolfram Alpha API (WOLFRAM_API_KEY)
  • Optional: Google Gemini API key (GeminiAPI) if you plan to use the MCP client with Gemini
  1. Clone the repository:
git clone https://github.com/ricocf/mcp-wolframalpha.git
cd mcp-wolframalpha
  2. Set the required environment variables (in a .env file or your shell):
# .env or export variables in shell
export WOLFRAM_API_KEY=your_wolframalpha_appid
export GeminiAPI=your_google_gemini_api_key   # optional; only for the Gemini client
  3. Install Python dependencies:
pip install -r requirements.txt
  4. (Optional but recommended) If you plan to use uv-based tooling, install uv and sync dependencies:
uv sync
  5. Run the MCP server:
python /path/to/src/core/server.py
  6. To launch the server from VSCode or Claude Desktop instead, create or edit mcp.json accordingly (see the README's WolframAlphaServer example). For the VSCode MCP Server, place the file at .vscode/mcp.json and point it to the Python server script.
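A minimal sketch of what the .vscode/mcp.json entry in step 6 might look like, assuming the WolframAlphaServer name from the README; the path and key values are placeholders to adapt to your checkout:

```json
{
  "servers": {
    "WolframAlphaServer": {
      "type": "stdio",
      "command": "python",
      "args": ["/path/to/src/core/server.py"],
      "env": {
        "WOLFRAM_API_KEY": "your_wolframalpha_appid",
        "GeminiAPI": "your_google_gemini_api_key"
      }
    }
  }
}
```

Claude Desktop uses a similar shape in its own configuration file (a top-level mcpServers object); consult the README for the exact layout it expects.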

Additional notes

  • Ensure your Wolfram API key is valid and has permission for the queries you intend to perform.
  • If you plan to use the MCP Client with Gemini, provide a valid GeminiAPI key; the client UI will rely on Gradio and Gemini for the frontend/backend interaction.
  • The web UI uses Gradio; run it via the provided client entry point (e.g., python main.py --ui) to interact with Wolfram Alpha through the MCP server.
  • For containerized deployments, Docker images are demonstrated for both the UI and the client; adapt image names and Dockerfiles as needed for your environment.
  • If you encounter connection issues, verify that the MCP server is reachable from the client, and ensure the correct port and host are configured in your client setup.
  • The README mentions optional VSCode MCP Server and Claude Desktop configurations; those workflows expect you to point to the Python server script path in the mcp.json configuration.
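For the containerized deployments mentioned above, a minimal server image might look like the sketch below; the base image, paths, and entry point are assumptions, not the repository's actual Dockerfiles:

```dockerfile
# Minimal sketch of a server image; adapt paths to the real repo layout.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# API keys are injected at run time, e.g.:
#   docker run -e WOLFRAM_API_KEY=... -e GeminiAPI=... <image>
CMD ["python", "src/core/server.py"]
```

Passing keys via -e at run time (rather than baking them into the image) keeps credentials out of image layers.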
