# fmp_mcp_server
A lightweight, production-ready **Model Context Protocol (MCP)** server that brings **real financial data** directly to AI Agents or LLM Apps — fundamentals, ratios, price data, transcripts, macro indicators, and more.
Quick start (registering the server with Claude Code over STDIO):

```shell
claude mcp add --transport stdio damianboh-fmp_mcp_server python fmp_mcp_server.py \
  --env FMP_API_KEY="your_fmp_api_key_here"
```
## How to use
This MCP server exposes a single endpoint that bridges AI tools with Financial Modeling Prep (FMP) data. It offers access to company fundamentals, ratios, price data, earnings transcripts, macro indicators, news, and insider trades through a structured MCP interface. In the default STDIO mode, you can interact directly from a CLI or ChatGPT's MCP integration; you can also run it in streamable HTTP mode to expose a remote endpoint for cloud-based clients.

Available tools:

- `company_profile`, `income_statement`, `balance_sheet`, `cash_flow`, `financial_ratios`
- `historical_price_eod_full`, `earnings_call_transcript`
- `economic_indicators`, `economic_calendar`
- `stock_news_latest`, `stock_news_search`, `insider_trading_latest`
- `ping`, `when_should_i_use_fmp`

Use the `streamable-http` option to serve an HTTP API at `/mcp/` and connect via a tunnel or reverse proxy for external access.
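MCP clients call tools via JSON-RPC 2.0 `tools/call` requests. As an illustrative sketch, the helper below builds such a request envelope for one of the tools listed above; the `"symbol"` argument name is an assumption, so check the server's actual tool schema (via `tools/list`) before relying on it:

```python
import json

def build_tool_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 request envelope for the MCP tools/call method."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# "symbol" is a guessed argument name for illustration only.
request = build_tool_call("company_profile", {"symbol": "AAPL"})
print(json.dumps(request, indent=2))
```

In STDIO mode, an MCP client writes envelopes like this to the server process's stdin and reads responses from stdout; in streamable-http mode, the same payloads travel over HTTP to the `/mcp/` path.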
## How to install
Prerequisites:
- Python 3.9+ (recommended 3.10+)
- Access to install Python packages
- A valid FMP API key (set FMP_API_KEY)
Install steps:

1. Clone the repository:

   ```shell
   git clone https://github.com/damianboh/fmp_mcp_server.git
   cd fmp_mcp_server
   ```

2. Install dependencies:

   ```shell
   pip install -r requirements.txt
   ```

3. Set your FMP API key:

   ```shell
   export FMP_API_KEY=your_fmp_api_key_here   # macOS/Linux
   setx FMP_API_KEY your_fmp_api_key_here     # Windows
   ```

4. Run the server (default STDIO mode):

   ```shell
   python fmp_mcp_server.py
   ```

   Optionally, run in HTTP mode to expose it remotely:

   ```shell
   python fmp_mcp_server.py --transport streamable-http --host 0.0.0.0 --port 8000
   ```

5. Test the health endpoint (HTTP mode only):

   ```shell
   curl http://127.0.0.1:8000/health
   ```
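The transport selection used in the run commands above can be sketched with `argparse`. The flag names mirror the documented command line (`--transport`, `--host`, `--port`, `--path`); the defaults chosen here are assumptions, not necessarily the server's real defaults:

```python
import argparse

def parse_server_args(argv=None):
    """Parse launcher flags matching the documented command line (illustrative)."""
    parser = argparse.ArgumentParser(description="fmp_mcp_server launcher sketch")
    parser.add_argument("--transport", choices=["stdio", "streamable-http"], default="stdio")
    parser.add_argument("--host", default="127.0.0.1")
    parser.add_argument("--port", type=int, default=8000)
    parser.add_argument("--path", default="/mcp/")
    return parser.parse_args(argv)

args = parse_server_args(["--transport", "streamable-http", "--host", "0.0.0.0", "--port", "8000"])
print(args.transport, args.host, args.port, args.path)
```

Restricting `--transport` with `choices` makes unsupported transport names fail fast with a clear usage error instead of being passed through silently.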
## Additional notes
Environment and configuration tips:
- Ensure FMP_API_KEY is set; without it, some endpoints may be rate-limited or unavailable.
- In HTTP mode, the default path prefix is /mcp/; you can customize via --path.
- If you encounter rate limits with the free FMP key, consider upgrading to a paid API key.
- For local development with AI tools, STDIO mode is simplest; for cloud deployment, use streamable-http behind an external tunnel (such as ngrok or Cloudflare Tunnel).
- If you plan to expose the server publicly, secure the endpoint and review rate limits and access controls provided by your hosting or tunneling solution.
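Because a missing `FMP_API_KEY` only shows up as rate-limited or failing endpoints, it can help to fail fast at startup. A minimal sketch (the function name is hypothetical, not part of the server's API):

```python
import os

def require_fmp_api_key():
    """Return FMP_API_KEY from the environment, failing fast if it is missing."""
    key = os.environ.get("FMP_API_KEY")
    if not key:
        raise RuntimeError(
            "FMP_API_KEY is not set; export it (see the install steps) "
            "before starting the server."
        )
    return key
```

Raising at startup surfaces the misconfiguration immediately, rather than as confusing errors on the first tool call.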
## Related MCP Servers
- **ai-engineering-hub**: In-depth tutorials on LLMs, RAG, and real-world AI agent applications.
- **MAI-UI**: Real-world-centric foundation GUI agents ranging from 2B to 235B parameters.
- **5-Day-AI-Agents-Intensive-Course-with-Google**: 5-Day Gen AI Intensive Course with Google.
- **sparql-llm**: 🦜✨ Chat system, MCP server, and reusable components to improve LLM capabilities when generating SPARQL queries.
- **agent-project**: Hands-on agents: intelligent routing, task decomposition, and chain engineering.
- **pydata-london-2025**: Multi-Agent AI Orchestration Workshop.