OpenSearch 3.0 with MCP Server
claude mcp add --transport stdio capelabs-opensearch-mcp-server \
  --env OPENAI_API_KEY="your_openai_api_key_here" \
  -- docker compose up -d
How to use
This MCP server acts as a bridge between LangChain agents and an OpenSearch cluster, enabling natural language interactions with your OpenSearch data. It exposes three core tools that a LangChain agent can call: ListIndexTool to enumerate available indices, IndexMappingTool to fetch the mapping for a given index, and SearchIndexTool to run search queries against OpenSearch. After starting the services with Docker Compose, run the agent from the agents directory and query OpenSearch with natural language prompts; the MCP server and the LangChain agent translate those prompts into OpenSearch queries.
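As an illustration of how a prompt ends up at one of the three tools, the toy routing heuristic below uses simple keyword rules. This is purely an assumption for demonstration; in the real agent the LLM itself decides which tool to invoke:

```python
# Toy sketch: mapping a natural-language prompt to one of the three MCP tools.
# The actual agent lets the LLM choose; these keyword rules are illustrative only.
def route_prompt(prompt: str) -> str:
    p = prompt.lower()
    if "indices" in p:
        return "ListIndexTool"
    if "mapping" in p:
        return "IndexMappingTool"
    # Anything else is treated as a search request.
    return "SearchIndexTool"
```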
To use the tools, start the services, ensure OpenAI API access is configured, and run the LangChain-based Python agent. Then, ask questions like “Show me all indices,” “What is the mapping for the user index?” or “Find logs from the last 24 hours.” The MCP server will translate these natural language requests into OpenSearch queries, returning results that can be consumed by your agent or fed into downstream workflows.
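For instance, a prompt like "Find logs from the last 24 hours" could be translated into an OpenSearch range query along these lines. The timestamp field name and result size here are assumptions; adjust them to your index mapping:

```python
# Sketch of the OpenSearch DSL a "last 24 hours" prompt might translate into.
# "@timestamp" is an assumed field name; use whatever your logs index defines.
def last_24h_logs_query(timestamp_field: str = "@timestamp") -> dict:
    return {
        "query": {"range": {timestamp_field: {"gte": "now-24h", "lte": "now"}}},
        "sort": [{timestamp_field: {"order": "desc"}}],  # newest first
        "size": 100,
    }
```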
How to install
Prerequisites:
- Docker and Docker Compose installed
- Python 3.10+ (for running the Python-based agent, if desired)
- An OpenAI API key for GPT model usage
- Clone the repository:
git clone https://github.com/capelabs/opensearch-mcp-server.git
cd opensearch-mcp-server
- Start the stack with Docker Compose (which includes OpenSearch, OpenSearch Dashboards, and the MCP server):
docker-compose up -d
- Optional: Set up the Python agent environment (in the agents directory) for local development:
cd agents
python -m venv venv
# Linux/macOS
source venv/bin/activate
# Windows
venv\Scripts\activate
pip install -r requirements.txt
- Configure environment variables for the Python agent by populating agents/.env with your keys, for example:
OPENAI_API_KEY=your_openai_api_key_here
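If you want to check what the agent will read from agents/.env, a minimal loader sketch looks like this. The real agent may use python-dotenv instead; this parser only handles simple KEY=value lines:

```python
# Minimal .env parser sketch; skips comments and blank lines.
# python-dotenv (which also handles quoting and interpolation) is the
# more robust choice in practice.
def load_env_file(path: str) -> dict:
    values = {}
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values
```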
- Run the LangChain agent (if using the Python agent):
cd agents
python agent.py
- Verify services:
# Check running containers
docker ps
# Check OpenSearch cluster health
curl -X GET "localhost:9200/_cluster/health?pretty"
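To automate the readiness check, you can poll the cluster health endpoint until it reports yellow or green. The sketch below takes the fetch function as a parameter so it can be stubbed in tests; the endpoint URL and timeout values are assumptions for a default local setup:

```python
# Poll OpenSearch cluster health until it reports "yellow" or "green".
# fetch_health is injectable so the loop can be exercised without a live cluster.
import json
import time
import urllib.request

def http_fetch_health(url: str = "http://localhost:9200/_cluster/health") -> dict:
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp)

def wait_for_opensearch(fetch_health=http_fetch_health,
                        timeout: float = 60, interval: float = 2) -> dict:
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            health = fetch_health()
            if health.get("status") in ("yellow", "green"):
                return health
        except OSError:
            pass  # cluster not up yet; retry
        time.sleep(interval)
    raise TimeoutError("OpenSearch did not become ready in time")
```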
Additional notes
Tips and caveats:
- Ensure Docker Desktop is running and Docker Compose is accessible in your PATH.
- The OpenSearch and OpenSearch Dashboards services expose ports 9200 and 5601 respectively; ensure there are no port conflicts on the host.
- The OPENAI_API_KEY environment variable must be set for the GPT-enabled features of the agent.
- If MCP activation or OpenSearch connection fails, check container logs with docker-compose logs and verify OpenSearch readiness before starting the MCP server.
- For production deployments, consider securing OpenSearch with authentication and enabling SSL/TLS, and adjust the JVM heap size in config/jvm.options (or via the OPENSEARCH_JAVA_OPTS environment variable) as needed.
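As a starting point for hardening, the Compose service definition could pin the heap and keep the security plugin enabled, along these lines. The service name, heap size, and password variable are assumptions; tune them to your host:

```yaml
# Sketch of production-leaning settings for the OpenSearch service.
# Service name, heap size, and password variable are assumptions.
services:
  opensearch:
    environment:
      - OPENSEARCH_JAVA_OPTS=-Xms1g -Xmx1g   # fixed heap; size to ~50% of host RAM
      - OPENSEARCH_INITIAL_ADMIN_PASSWORD=${OPENSEARCH_ADMIN_PASSWORD}
    # Do NOT set DISABLE_SECURITY_PLUGIN=true in production; keep TLS on port 9200.
```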