
OpenSearch 3.0 with MCP Server

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio --env OPENAI_API_KEY="your_openai_api_key_here" \
  capelabs-opensearch-mcp-server -- docker compose up -d

How to use

This MCP server acts as a bridge between LangChain agents and an OpenSearch cluster, enabling natural-language interaction with your OpenSearch data. It exposes three core tools to the LangChain agent: ListIndexTool to enumerate available indices, IndexMappingTool to fetch the mapping for a given index, and SearchIndexTool to run search queries against OpenSearch. Once the services are up via Docker Compose, you can run the agent from the agents directory and query OpenSearch with natural-language prompts, which the MCP server and the LangChain agent together convert into OpenSearch queries.

To use the tools, start the services, ensure OpenAI API access is configured, and run the LangChain-based Python agent. Then, ask questions like “Show me all indices,” “What is the mapping for the user index?” or “Find logs from the last 24 hours.” The MCP server will translate these natural language requests into OpenSearch queries, returning results that can be consumed by your agent or fed into downstream workflows.
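As an illustration, a prompt like "Find logs from the last 24 hours" would typically be translated into an OpenSearch query-DSL body along these lines. The index name (`logs`) and timestamp field (`@timestamp`) are assumptions for the sketch, not documented behavior of this server:

```python
# Sketch of the OpenSearch DSL a "last 24 hours" prompt might translate to.
# Index name ("logs") and field ("@timestamp") are illustrative assumptions.
query = {
    "query": {
        "range": {
            "@timestamp": {
                "gte": "now-24h",  # OpenSearch date math: 24 hours ago
                "lte": "now",
            }
        }
    },
    "sort": [{"@timestamp": {"order": "desc"}}],  # newest entries first
    "size": 100,
}

# The agent would send this to the cluster, e.g. with opensearch-py:
#   client.search(index="logs", body=query)
```

The `now-24h` / `now` bounds use OpenSearch date math, so the window is computed server-side at query time rather than baked in by the agent.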

How to install

Prerequisites:

  • Docker and Docker Compose installed
  • Python 3.10+ (for running the Python-based agent, if desired)
  • An OpenAI API key for GPT model usage
  1. Clone the repository:
git clone https://github.com/capelabs/opensearch-mcp-server.git
cd opensearch-mcp-server
  2. Start the stack with Docker Compose (which includes OpenSearch, OpenSearch Dashboards, and the MCP server):
docker-compose up -d
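For orientation, the repository's docker-compose.yml wires up a stack roughly like the sketch below. Service names, image tags, and the admin-password variable are illustrative assumptions; the file in the repo is authoritative:

```yaml
# Illustrative sketch only -- defer to the repository's docker-compose.yml.
services:
  opensearch:
    image: opensearchproject/opensearch:3.0.0   # assumed tag
    environment:
      - discovery.type=single-node
      - OPENSEARCH_INITIAL_ADMIN_PASSWORD=${OPENSEARCH_INITIAL_ADMIN_PASSWORD}
    ports:
      - "9200:9200"
  dashboards:
    image: opensearchproject/opensearch-dashboards:3.0.0   # assumed tag
    environment:
      - OPENSEARCH_HOSTS=["https://opensearch:9200"]
    ports:
      - "5601:5601"
  # ...plus the MCP server service defined in the repository.
```

The ports shown match the ones called out in the notes below (9200 for OpenSearch, 5601 for Dashboards).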
  3. Optional: Set up the Python agent environment (in the agents directory) for local development:
cd agents
python -m venv venv
# Linux/macOS
source venv/bin/activate
# Windows
venv\Scripts\activate

pip install -r requirements.txt
  4. Configure environment variables for the Python agent by populating agents/.env with your keys, for example:
OPENAI_API_KEY=your_openai_api_key_here
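The agent presumably loads these variables at startup (projects like this commonly use python-dotenv). As a minimal stand-in, a loader for simple KEY=VALUE files can be sketched like this:

```python
import os

def load_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines (as in agents/.env) into a dict,
    skipping blank lines and # comments. A minimal stand-in for
    python-dotenv, which the agent likely uses in practice."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"').strip("'")
    return env

sample = "# agents/.env\nOPENAI_API_KEY=your_openai_api_key_here\n"
os.environ.update(load_env(sample))  # make the key visible to the agent process
```

Values are exported into `os.environ` so any OpenAI client created later in the same process picks them up.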
  5. Run the LangChain agent (if using the Python agent):
cd agents
python agent.py
  6. Verify services:
# Check running containers
docker ps
# Check OpenSearch cluster status
curl -X GET "localhost:9200/_cluster/health?pretty"
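For a scripted readiness check, you can gate on the status reported by the `_cluster/health` endpoint. A small sketch (the live HTTP call is left as a comment so the parsing logic stands alone):

```python
import json

def cluster_ready(health_json: str) -> bool:
    """Return True when cluster status is green or yellow
    (yellow is normal for a single-node dev cluster)."""
    status = json.loads(health_json).get("status")
    return status in ("green", "yellow")

# Against a live cluster you would fetch the JSON first, e.g.:
#   import urllib.request
#   body = urllib.request.urlopen("http://localhost:9200/_cluster/health").read()

sample = '{"cluster_name": "opensearch", "status": "yellow", "number_of_nodes": 1}'
print(cluster_ready(sample))  # → True
```

A check like this is useful before starting the MCP server, per the troubleshooting note below about verifying OpenSearch readiness first.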

Additional notes

Tips and caveats:

  • Ensure Docker Desktop is running and Docker Compose is accessible in your PATH.
  • The OpenSearch and OpenSearch Dashboards services expose ports 9200 and 5601 respectively; ensure there are no port conflicts on the host.
  • The OPENAI_API_KEY environment variable must be set for the GPT-enabled features of the agent.
  • If MCP activation or OpenSearch connection fails, check container logs with docker-compose logs and verify OpenSearch readiness before starting the MCP server.
  • For production deployments, consider securing OpenSearch with authentication and enabling SSL/TLS, and adjust JVM heap size in opensearch.yml as needed.
