mcp-opensearch
MCP server from ibrooksSDX/mcp-server-opensearch
claude mcp add --transport stdio ibrookssdx-mcp-server-opensearch \
  uvx mcp-server-opensearch \
  --opensearch-url http://localhost:9200 \
  --opensearch-api-key your_api_key \
  --index-name your_index_name \
  --env INDEX_NAME="Name of the OpenSearch index to use" \
  --env OPENSEARCH_HOST="URL of the OpenSearch server, e.g. http://localhost" \
  --env OPENSEARCH_HOSTPORT="Port of the OpenSearch host, e.g. 9200"
How to use
This MCP server provides a semantic memory layer on top of an OpenSearch cluster, using OpenSearch as the backing store for storing and retrieving memories via MCP tools. The primary tool exposed is search-openSearch, which accepts a structured query payload and returns a confirmation or results that can be fed into LLM workflows. For Python-based setups you can run the server via uvx, then connect your MCP-enabled application or Claude Desktop to perform memory storage and lookup operations. The examples in the repository show how to configure Claude Desktop to launch the server and how to test the OpenSearch connection locally. Once the server is running, you can issue memory storage operations and semantic queries that leverage OpenSearch's search capabilities through the MCP protocol.
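As a rough illustration of the kind of structured query payload involved, the sketch below builds a standard OpenSearch match query. The field names (`index`, `body`, `content`) are assumptions for illustration, not the server's documented tool schema:

```python
# Hypothetical payload builder; the "content" field and overall shape
# are illustrative assumptions, not the search-openSearch tool's
# documented input schema.
def build_search_payload(index_name: str, text: str, size: int = 5) -> dict:
    """Wrap a standard OpenSearch match query with its target index."""
    return {
        "index": index_name,
        "body": {
            "size": size,
            "query": {"match": {"content": text}},
        },
    }

payload = build_search_payload("your_index_name", "notes about quarterly planning")
```

The inner `body` here follows OpenSearch's ordinary query DSL, so the same structure could be passed to any OpenSearch client for testing outside the MCP layer.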
How to install
Prerequisites:
- Python and uv (or uvx) installed on your system
- Access to an OpenSearch cluster (locally at http://localhost:9200 by default) or a reachable OpenSearch endpoint
Installation steps:
- Install Smithery (optional for automated installation):
# If you want to install via Smithery (optional)
npx -y @smithery/cli install @ibrooksSDX/mcp-server-opensearch --client claude
- Install and run the MCP server using uv (recommended):
# Start the MCP server with OpenSearch configuration
uv run mcp-server-opensearch \
--opensearch-url "http://localhost:9200" \
--index-name "your_index_name"
- If you prefer Claude Desktop integration, add the opensearch server to claude_desktop_config.json as shown in the repository README, or use the FastMCP installer UI to load the server.
- Optional: Verify connectivity with a local test:
uv run python src/mcp-server-opensearch/test_opensearch.py
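If you want a dependency-free connectivity check instead of the repository's test script, a minimal sketch using only the Python standard library might look like this (it simply probes the root endpoint, which a healthy OpenSearch node answers with a JSON banner containing a version field):

```python
import json
import urllib.error
import urllib.request

def check_opensearch(url: str = "http://localhost:9200", timeout: float = 2.0) -> bool:
    """Return True if an OpenSearch node answers at `url` with its JSON banner."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            info = json.load(resp)
            # A reachable node reports its version in the root response.
            return "version" in info
    except (urllib.error.URLError, OSError, ValueError):
        return False

if __name__ == "__main__":
    print("reachable" if check_opensearch() else "unreachable")
```

This only verifies reachability, not authentication or index existence; secured clusters will still need the API key or credentials mentioned below.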
Additional notes
Tips and caveats:
- Ensure your OpenSearch instance is reachable at the configured URL and that the index exists or can be auto-created by the client.
- The OpenSearch async client may require extra dependencies; if installation fails, try the standard synchronous client or verify environment compatibility.
- Environment variable mappings (OPENSEARCH_HOST, OPENSEARCH_HOSTPORT, INDEX_NAME) can be used to parameterize deployments in containerized or cloud environments.
- If you encounter authentication errors, provide the appropriate API key or credentials required by your OpenSearch deployment.
- The mcpServers configuration shown in Claude Desktop examples is a reference; adjust paths and flags to match your local setup and workflow.
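For containerized deployments, the environment variables listed above can be combined into a cluster URL at startup. A minimal sketch (the helper name and defaults are illustrative, not part of the server's code):

```python
import os

def opensearch_url_from_env() -> str:
    """Compose the cluster URL from the documented environment variables,
    falling back to the local defaults used throughout this page."""
    host = os.environ.get("OPENSEARCH_HOST", "http://localhost")
    port = os.environ.get("OPENSEARCH_HOSTPORT", "9200")
    return f"{host}:{port}"

os.environ["OPENSEARCH_HOST"] = "http://opensearch.internal"
os.environ["OPENSEARCH_HOSTPORT"] = "9200"
url = opensearch_url_from_env()  # "http://opensearch.internal:9200"
```

Keeping the host and port in separate variables, as the server's `--env` flags suggest, makes it easy to swap endpoints per environment without editing the MCP configuration itself.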
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP