ta-lib
A Model Context Protocol server providing tools to access TA-Lib technical analysis indicators.
Quick start (Claude Code): claude mcp add --transport stdio hanai-ta-lib-mcp-server docker run -i --rm ghcr.io/hanai/ta-lib-mcp-server:main
How to use
The TA-Lib MCP Server exposes TA-Lib's technical analysis indicators through the Model Context Protocol (MCP). This lets AI assistants and MCP clients request indicator calculations (such as moving averages, RSI, MACD, or Bollinger Bands) on financial data they provide. To use it, add the server configuration to your MCP-enabled IDE or orchestrator; the configuration points the MCP client at the Docker image that runs the TA-Lib MCP server. Once the server is running, you send it requests containing the financial data (e.g., a price series) and the desired TA-Lib indicator, and receive computed results ready for use in your AI workflow.
The server uses TA-Lib under the hood and exposes a range of indicators via MCP tools. You pass in your time series data (opens, highs, lows, closes, volumes) and request specific indicators with parameters (time period, type, etc.). This embeds technical analysis capabilities directly into chat or automation pipelines, supporting richer financial data reasoning within your MCP-enabled environment.
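As an illustration of the kind of computation an indicator tool performs, here is a minimal simple-moving-average sketch in plain Python. This is a hypothetical stand-in for illustration only: the actual tool names, parameters, and output shape are defined by the server, and TA-Lib itself returns NaN for the warm-up region where this sketch returns None.

```python
def sma(closes, period):
    """Simple moving average over `closes`.

    Emits None until `period` points are available, mirroring the
    leading "lookback" region that TA-Lib fills with NaN.
    """
    out = []
    for i in range(len(closes)):
        if i + 1 < period:
            out.append(None)  # not enough history yet
        else:
            window = closes[i + 1 - period : i + 1]
            out.append(sum(window) / period)
    return out

closes = [10.0, 11.0, 12.0, 13.0, 14.0]
print(sma(closes, 3))  # → [None, None, 11.0, 12.0, 13.0]
```

An MCP tool call for the same calculation would carry the `closes` array and the time period as arguments and return the computed series in the tool result.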
How to install
Prerequisites:
- Docker installed and running on your host
- Internet access to pull the image
Installation steps:
- Pull the TA-Lib MCP Server Docker image: docker pull ghcr.io/hanai/ta-lib-mcp-server:main
- Run the MCP server in the background or attached mode: docker run -i --rm ghcr.io/hanai/ta-lib-mcp-server:main
- Confirm the server is responding (optional): use your MCP client to send a test request to the server endpoint provided by your environment. If you are using the provided IDE snippet, ensure your MCP client is configured to point at the running container.
- Integrate with your MCP client: configure your MCP client to reference the ta-lib server as shown in the README JSON snippet.
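For clients configured via a JSON file, the server entry typically follows the common `mcpServers` shape shown below. The exact top-level key and file location depend on your MCP client, so treat this as a sketch rather than the canonical snippet:

```json
{
  "mcpServers": {
    "ta-lib": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "ghcr.io/hanai/ta-lib-mcp-server:main"]
    }
  }
}
```

With this entry, the client launches the container on demand and communicates with it over stdio, which is why the `-i` flag (keep stdin open) is required.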
Notes:
- If your environment uses a different networking setup, you may need to expose ports or use a host networking mode depending on how your MCP client discovers services.
Additional notes
Tips:
- Ensure Docker has enough memory allocated, as TA-Lib and NumPy operations can be memory-intensive for large time series.
- The example uses the `main` image tag; pin a specific version tag for production stability.
- If you encounter library compatibility issues, verify that the host's system libraries meet TA-Lib requirements and that the container can access those dependencies.
- This MCP server relies on TA-Lib; for large datasets, batch requests or streaming approaches can improve performance.
- Environment variables are not defined in the README; if the image supports configuration via env vars, refer to the official image documentation for any optional settings.
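The batching tip above can be sketched as follows: split a long series into chunks that each carry extra leading points equal to the indicator's lookback, so that after discarding each chunk's warm-up region the results match a single large request. The chunking is generic Python; the per-chunk indicator request to the server is left out, since its exact tool interface is defined by the server.

```python
def overlapping_chunks(series, chunk_size, lookback):
    """Yield (start_index, chunk) pairs covering `series`.

    Each chunk after the first starts `lookback` points early so the
    indicator's warm-up region falls inside the overlap instead of
    losing values at chunk boundaries.
    """
    i = 0
    while i < len(series):
        start = max(0, i - lookback)
        yield start, series[start : i + chunk_size]
        i += chunk_size

prices = list(range(100))
chunks = list(overlapping_chunks(prices, chunk_size=40, lookback=14))
# Three chunks; the second and third each begin 14 points early.
```

Each chunk would then be sent as its own indicator request, and the first `lookback` output values of every chunk except the first discarded before stitching results together.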
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) server for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding coding agents through an end-to-end requirements-to-implementation-plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers, written using FastMCP.