statfin
Production-grade Model Context Protocol (MCP) server for Statistics Finland's StatFin database. Enables AI assistants like Claude to browse, search, and query Finnish statistical data.
Quick start (Claude Code): claude mcp add --transport stdio markusl-statfin-mcp node /absolute/path/to/statfin-mcp/dist/server.js
How to use
Tilastokeskus StatFin MCP Server exposes 7 MCP tools that AI assistants can invoke to browse, explore, and query Finland's StatFin PxWeb data. The server is Node.js-based and provides discovery (search_statistics), structural exploration (list_subject_areas, list_tables, get_table_metadata), data retrieval (get_variable_values, query_table), and health checks (get_api_status). To use the server, run it locally or connect to a hosted instance via HTTP. When integrated with an AI assistant, you can start by asking for relevant tables with search_statistics, then drill into a specific table with get_table_metadata, fetch possible values with get_variable_values, and finally construct and execute a query with query_table. The tools are designed to guide the LLM through a safe, efficient workflow, including query size estimation and rate-limit awareness.
In practice, you can:
- Discover: use search_statistics to locate tables related to your question (e.g., population, unemployment, housing).
- Explore: inspect table structure and available variables using get_table_metadata and list_subject_areas to understand codes and hierarchies.
- Query: build a query with query_table by selecting variables and values, leveraging selection filter types and top-N results for pagination.
The server also handles rate limiting and caching to minimize unnecessary API calls, making interactions fast and reliable for long-running analysis or conversational data retrieval.
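A query_table call ultimately translates into a PxWeb-style query body sent to the StatFin API. The TypeScript sketch below shows the general shape of such a body; the dimension codes Alue and Vuosi are illustrative examples, not taken from a specific table, so always confirm real codes with get_table_metadata and get_variable_values first.

```typescript
// Sketch of a PxWeb-style query body that a query_table call might send.
// Dimension codes below are illustrative, not read from an actual table.
interface Selection {
  filter: "item" | "all" | "top";
  values: string[];
}

interface QueryDimension {
  code: string;        // variable code, e.g. a region or year dimension
  selection: Selection;
}

interface PxWebQuery {
  query: QueryDimension[];
  response: { format: "json-stat2" | "csv" };
}

const body: PxWebQuery = {
  query: [
    // "item" filter: pick explicit values (here, the whole-country code).
    { code: "Alue", selection: { filter: "item", values: ["SSS"] } },
    // "top" filter: take the latest N values of the dimension.
    { code: "Vuosi", selection: { filter: "top", values: ["5"] } },
  ],
  response: { format: "json-stat2" },
};

console.log(JSON.stringify(body.query.map((d) => d.code)));
```

The "top" filter is what enables pagination-style queries over time dimensions without enumerating every year by hand.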
How to install
Prerequisites:
- Node.js (LTS) installed on your machine
- npm or pnpm for package management
- Git for cloning the repository
Installation steps:
- Clone the repository: git clone https://github.com/your-org/statfin-mcp.git && cd statfin-mcp
- Install dependencies: npm install
- Build the server (if applicable): npm run build
- Run the server (development / stdio transport): npm run dev
- Or run the production HTTP server: npm start
- Optional: run with a custom port: PORT=8080 npm start
- Docker (if docker-compose is provided): docker-compose up --build
Configuration examples:
- Local Node.js server path (for MCP config): { "command": "node", "args": ["/absolute/path/to/statfin-mcp/dist/server.js"] }
- Remote/HTTP usage: see the repository's remote deployment instructions for hosting and HTTP transport.
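As a concrete example, the local-path configuration above would appear as follows in a Claude Desktop claude_desktop_config.json; the server name "statfin" is an arbitrary label, and the path must point at your own checkout:

```json
{
  "mcpServers": {
    "statfin": {
      "command": "node",
      "args": ["/absolute/path/to/statfin-mcp/dist/server.js"]
    }
  }
}
```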
Additional notes
- The MCP server implements rate limiting (8 requests per minute per instance) to align with PxWeb API constraints; design queries to respect limits and avoid large, expensive calls.
- Caching: historical data is cached with timestamp validation to reduce redundant requests.
- Variables and codes: when querying, you may need to use exact PxWeb variable codes (e.g., region codes, year codes). Use get_table_metadata and get_variable_values to retrieve valid codes.
- Multi-language support is available; ensure your UI or LLM prompts are aligned with the language you intend (Finnish, English, Swedish).
- If deploying remotely, ensure the server’s URL is accessible by the AI assistant and that CORS or transport settings are configured for your integration (http transport is supported).
- When using Claude, Codex, or Cursor integrations, follow the documented steps to add the MCP server under the appropriate platform, using either the local path to dist/server.js or the hosted URL.
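The 8-requests-per-minute behavior mentioned in the notes can be pictured as a sliding-window limiter. The following is a minimal illustrative sketch, not the server's actual implementation:

```typescript
// Minimal sliding-window rate limiter matching the documented 8 requests/minute.
// Illustrative sketch only; the real server's implementation may differ.
class RateLimiter {
  private timestamps: number[] = [];

  constructor(private limit = 8, private windowMs = 60_000) {}

  tryAcquire(now = Date.now()): boolean {
    // Discard request timestamps that have fallen outside the window.
    this.timestamps = this.timestamps.filter((t) => now - t < this.windowMs);
    if (this.timestamps.length >= this.limit) return false; // over budget
    this.timestamps.push(now);
    return true;
  }
}

const limiter = new RateLimiter();
const results = Array.from({ length: 10 }, () => limiter.tryAcquire());
console.log(results.filter(Boolean).length); // 8 of 10 immediate calls admitted
```

When tryAcquire returns false, a client should back off rather than retry immediately, since the window only clears as old timestamps age out.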
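The timestamp-validated caching mentioned in the notes can be sketched as a TTL map. The TimestampCache class below is a hypothetical helper for illustration, not the server's actual code:

```typescript
// Illustrative TTL cache with timestamp validation; not the server's real code.
class TimestampCache<V> {
  private store = new Map<string, { value: V; fetchedAt: number }>();

  constructor(private ttlMs: number) {}

  get(key: string, now = Date.now()): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (now - entry.fetchedAt > this.ttlMs) {
      this.store.delete(key); // stale entry: evict and force a refetch
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V, now = Date.now()): void {
    this.store.set(key, { value, fetchedAt: now });
  }
}

const cache = new TimestampCache<string>(60_000);
cache.set("tables:population", "metadata", 0);
console.log(cache.get("tables:population", 30_000)); // fresh: "metadata"
console.log(cache.get("tables:population", 90_001)); // stale: undefined
```

Historical StatFin data rarely changes, so even a short TTL eliminates most repeated metadata fetches within a conversation.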