mcp-pinecone
Model Context Protocol server for reading from and writing to a Pinecone index. Rudimentary RAG.
claude mcp add --transport stdio sirmews-mcp-pinecone uvx --index-name {your-index-name} --api-key {your-secret-api-key} mcp-pinecone

How to use
This MCP server provides a bridge between Claude Desktop and a Pinecone index. It exposes tools to search, read, and manage documents, and to process and upsert documents in chunks with embeddings generated via Pinecone's inference API:
- semantic-search — query records by embedding similarity
- read-document — fetch a specific document
- list-documents — enumerate all documents in the index
- pinecone-stats — retrieve index statistics
- process-document — chunk, embed, and upsert a document
Together these cover the end-to-end workflow from document ingestion to semantic search, letting Claude Desktop retrieve relevant context from your Pinecone index during interactions.
To use the server, configure the MCP client (Claude Desktop) to point at the Pinecone MCP server, providing your Pinecone index name and API key (as shown in the configuration example). Once the server is running, invoke semantic-search to find relevant records by embedding similarity, read-document or list-documents to inspect content, and process-document to ingest new material by chunking, embedding, and upserting it into Pinecone. Claude Desktop then surfaces context from your Pinecone index during conversations and prompts.
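Under the hood, the MCP client and server exchange JSON-RPC messages over stdio, as defined by the Model Context Protocol. The sketch below constructs the kind of messages involved in calling a tool — the tool name comes from the list above, while the query text and client name are illustrative assumptions, not values the server requires:

```python
import json

# Illustrative MCP messages only -- Claude Desktop builds and sends these for
# you; the query text and clientInfo values here are assumptions.
initialize = {
    "jsonrpc": "2.0", "id": 1, "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

# A tools/call request invoking the semantic-search tool listed above.
search_call = {
    "jsonrpc": "2.0", "id": 2, "method": "tools/call",
    "params": {
        "name": "semantic-search",
        "arguments": {"query": "what does the design doc say about caching?"},
    },
}

print(json.dumps(search_call, indent=2))
```

In practice you never write these by hand: Claude Desktop performs the initialize handshake and issues tools/call requests whenever it decides a Pinecone lookup would help answer a prompt.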
How to install
Prerequisites
- Python 3.8+ and pip
- Access to Pinecone and a Pinecone API key
- Optional: uv/uvx, for running the MCP server without a manual pip install
Step 1: Install the MCP server package
- Via pip (recommended):
pip install mcp-pinecone
- Or install via your preferred Python environment manager
Step 2: Install uv (if not already installed) to run the MCP server locally
# follow the official uv installation guide, or use your system package manager
# example (macOS/Linux):
curl -LsSf https://astral.sh/uv/install.sh | sh
Step 3: Run the MCP server
- With uvx, start the server using your index name and API key (example placeholders):
uvx --index-name <your-index-name> --api-key <your-secret-api-key> mcp-pinecone
- uvx fetches mcp-pinecone automatically on first run; a prior pip install is not required for this path.
Step 4: Configure Claude Desktop to connect to the MCP server
- In Claude Desktop's configuration, point to the MCP server using the path shown in your environment, for example (depending on your setup):
"mcpServers": {
  "mcp-pinecone": {
    "command": "uvx",
    "args": [
      "--index-name",
      "{your-index-name}",
      "--api-key",
      "{your-secret-api-key}",
      "mcp-pinecone"
    ]
  }
}
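If you prefer to script this configuration step, the sketch below merges the server entry into a Claude Desktop config file. The file path here is an assumption — Claude Desktop stores claude_desktop_config.json in an OS-specific location, so adjust the path for your setup, and substitute your real index name and API key for the placeholders:

```python
import json
import pathlib

# Assumed path -- Claude Desktop keeps claude_desktop_config.json in an
# OS-specific directory; point this at yours.
config_path = pathlib.Path("claude_desktop_config.json")

# Load the existing config if present, then merge in the mcp-pinecone entry
# without disturbing any other configured servers.
config = json.loads(config_path.read_text()) if config_path.exists() else {}
config.setdefault("mcpServers", {})["mcp-pinecone"] = {
    "command": "uvx",
    "args": [
        "--index-name", "{your-index-name}",
        "--api-key", "{your-secret-api-key}",
        "mcp-pinecone",
    ],
}

config_path.write_text(json.dumps(config, indent=2))
```

Restart Claude Desktop after editing the file so it picks up the new server entry.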
Step 5: Validate installation
- Run the MCP Inspector if needed to debug:
npx @modelcontextprotocol/inspector uv --directory {project_dir} run mcp-pinecone
Additional notes
Environment variables and configuration tips:
- Ensure you supply the correct Pinecone index name with --index-name and a valid API key with --api-key.
- If you are developing locally with uv, you can run the server directly via uv run mcp-pinecone from your project directory.
- The MCP Inspector can help debug the data flow between Claude Desktop and Pinecone during development.
- For Smithery-based installs, you can install automatically via: npx -y @smithery/cli install mcp-pinecone --client claude
- When updating embeddings or documents, typical steps involve using process-document to chunk and embed content before upserting into Pinecone.
- If you encounter connectivity or authentication issues, verify network access to Pinecone and ensure your API key has the appropriate permissions for the target index.
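To illustrate what a process-document style pipeline does before upserting, here is a minimal chunking sketch. It is not the server's actual implementation — the chunk size, overlap, and ID scheme are assumptions for illustration:

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping character chunks, roughly the first stage
    of a process-document style pipeline. Sizes are illustrative defaults."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap  # step forward, keeping some overlap
    return chunks

# Hypothetical record layout: one ID per chunk, derived from the document ID.
document = "some long document text ... " * 20
records = [
    {"id": f"doc1#chunk{i}", "text": chunk}
    for i, chunk in enumerate(chunk_text(document))
]
```

Each record would then be embedded (e.g. via Pinecone's inference API) and upserted under its chunk ID, which is what process-document handles for you end to end.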
Related MCP Servers
MCP-Bridge
A middleware providing an OpenAI-compatible endpoint that can call MCP tools
asterisk
Asterisk Model Context Protocol (MCP) server.
rlm-claude
Recursive Language Models for Claude Code - Infinite memory solution inspired by MIT CSAIL paper
web-research-assistant
MCP server for SearXNG with 13 production-ready tools for web search, package info, GitHub integration, error translation, API docs, and more
llm-bridge
A model-agnostic Model Context Protocol (MCP) server that enables seamless integration with various Large Language Models (LLMs) like GPT, DeepSeek, Claude, and more.
cortivium
Create persistent AI tools through conversation. Ghost Skills turn plain-language instructions into real MCP tool registrations that trigger reliably — every time.