openecon-data
Query economic data from FRED, World Bank, IMF, and 10+ sources using plain English. MCP server + web app + API.
claude mcp add --transport stdio hanlulong-openecon-data \
  --env OPENROUTER_API_KEY="Your OpenRouter API key (required for LLM parsing)" \
  -- python scripts/restart_dev.py
How to use
OpenEcon Data provides an MCP-native endpoint that lets an AI agent query a unified economic data interface. The system routes natural language questions to the appropriate data providers (FRED, World Bank, IMF, Eurostat, UN Comtrade, BIS, and more), normalizes results, and returns interactive charts plus exports in CSV/JSON/DTA/Python formats. Use the hosted or self-hosted MCP endpoint to ask questions like “US GDP growth since 2015” or “inflation in the EU and UK since 2010” and receive a coherent visualization and data export. The MCP router parses intent with an LLM, selects the best provider based on requested indicators and geography, fetches and aligns data across sources, and renders a chart with source attribution.
To use it, connect your agent (Claude, Codex, or any MCP-compatible AI) to the endpoint and issue natural-language queries. You can also use the self-hosted instance locally to explore providers and export formats. The system supports real-time progress via Server-Sent Events and can return multiple export options for downstream analysis.
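To give a feel for the SSE progress stream mentioned above, here is a minimal sketch in Python. The route `/api/query/stream` on the local backend is a hypothetical example, not a documented endpoint; check your deployment for the actual path.

```python
# Sketch: consuming Server-Sent Events (SSE) progress updates from a
# self-hosted backend. The URL path below is an assumption; adapt it
# to the routes your deployment actually exposes.
import urllib.error
import urllib.request


def parse_sse_event(raw_block: str) -> dict:
    """Parse one SSE event block ("event: ...\ndata: ...") into a dict."""
    event = {"event": "message", "data": ""}
    for line in raw_block.splitlines():
        if line.startswith("event:"):
            event["event"] = line[len("event:"):].strip()
        elif line.startswith("data:"):
            event["data"] += line[len("data:"):].strip()
    return event


def stream_progress(url: str):
    """Yield parsed SSE events from the server (blocking network call)."""
    with urllib.request.urlopen(url) as resp:
        block = ""
        for raw in resp:
            line = raw.decode("utf-8").rstrip("\r\n")
            if line == "":  # a blank line terminates one SSE event
                if block:
                    yield parse_sse_event(block)
                block = ""
            else:
                block += line + "\n"


if __name__ == "__main__":
    try:
        for ev in stream_progress("http://localhost:3001/api/query/stream"):
            print(ev["event"], ev["data"])
    except OSError:
        print("backend not running; start it first")
```

The parser follows the standard SSE wire format (events separated by blank lines, `event:` and `data:` fields), so it should carry over even if the endpoint path differs.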
How to install
Prerequisites:
- Python 3.10+ and Node.js 18+
- Git
- An OpenRouter API key for LLM parsing (required for local parsing), or a local model if you configure an alternative
Installation steps:
- Clone the repository:
  git clone https://github.com/hanlulong/openecon-data.git
  cd openecon-data
- Install backend dependencies:
  python -m pip install --upgrade pip
  pip install -r requirements.txt
- Install frontend dependencies (optional, needed only for local access to the UI):
  npm install
- Copy the environment file and add keys:
  cp .env.example .env
  Edit .env to add OPENROUTER_API_KEY and any provider API keys as needed.
- Start the development server (MCP backend + UI) using the provided dev-restart script:
  python3 scripts/restart_dev.py
- Verify locally:
  Backend: http://localhost:3001
  Frontend: http://localhost:5173
Note: If you prefer to run the components separately, start the MCP backend server and the frontend UI as separate processes, and make sure both point to the same MCP endpoint.
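The verification step can be scripted. Below is a minimal smoke test in Python; it probes the root path (`/`) of each port, which is an assumption, since a deployment may expose a dedicated health route instead.

```python
# Quick local smoke test: confirm the backend (3001) and frontend (5173)
# ports respond. Probing "/" is an assumption; swap in a health-check
# route if your deployment provides one.
import urllib.error
import urllib.request


def is_up(url: str, timeout: float = 3.0) -> bool:
    """Return True if the URL answers with any non-5xx response."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 500
    except OSError:  # covers URLError, ConnectionRefusedError, timeouts
        return False


if __name__ == "__main__":
    for name, url in [("backend", "http://localhost:3001/"),
                      ("frontend", "http://localhost:5173/")]:
        print(f"{name}: {'up' if is_up(url) else 'down'}")
```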
Additional notes
Environment variables and configuration tips:
- OPENROUTER_API_KEY is required for LLM-based parsing during query interpretation. Obtain a key from OpenRouter and set it in your environment or .env file.
- Some data providers require their own API keys (e.g., FRED, Comtrade). Add these keys to your environment or .env as needed.
- When self-hosting, you can customize provider mappings in the MCP router to fine-tune which sources are preferred for particular indicators or geographies.
- The MCP endpoint supports exporting results in CSV, JSON, DTA (Stata), or Python code. Use the export options to integrate results into your data workflows.
- If you encounter port conflicts, adjust the backend/frontend ports in the startup script or environment configuration.
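To show how a CSV export might feed into downstream analysis, here is a short sketch using only the standard library. The column names (`date`, `value`, `source`) are assumptions about the export schema; inspect your downloaded file to confirm the actual headers.

```python
# Sketch: reading a CSV export and computing a simple summary statistic.
# The sample data and column names are illustrative, not the documented
# export schema.
import csv
import io

sample = (
    "date,value,source\n"
    "2015-01-01,2.9,FRED\n"
    "2016-01-01,1.6,FRED\n"
)

rows = list(csv.DictReader(io.StringIO(sample)))
values = [float(r["value"]) for r in rows]
avg = sum(values) / len(values)
print(f"average value over {len(values)} observations: {avg:.2f}")
```

For heavier workflows, the same file loads directly into pandas via `read_csv`, or into Stata via the DTA export.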
Related MCP Servers
fastapi_mcp
Expose your FastAPI endpoints as Model Context Protocol (MCP) tools, with Auth!
kodit
👩💻 MCP server to index external repositories
ollama-bridge
Extend the Ollama API with dynamic AI tool integration from multiple MCP (Model Context Protocol) servers. Fully compatible, transparent, and developer-friendly; ideal for building powerful local LLM applications, AI agents, and custom chatbots.
mcp-stata
A lightweight Model Context Protocol (MCP) server for Stata. Execute commands, inspect data, retrieve stored results (r()/e()), and view graphs in your chat interface. Built for economists who want to integrate LLM assistance into their Stata workflow.
prospectio-api
MCP/API server that helps you connect to different lead generation apps
github-second-brain
Provides tools for AI to explore GitHub codebases like an agent