mcp-hubspot
A Model Context Protocol (MCP) server that enables AI assistants to interact with HubSpot CRM data. Built-in vector storage and caching mechanisms help work around HubSpot API limitations while improving response times.
claude mcp add --transport stdio peakmojo-mcp-hubspot -- docker run -i --rm -e HUBSPOT_ACCESS_TOKEN=your_token -v /path/to/storage:/storage buryhuang/mcp-hubspot:latest
How to use
The HubSpot MCP Server exposes HubSpot CRM capabilities to AI assistants via the MCP interface. It provides tools to create and query HubSpot data, with built-in vector storage (FAISS) and caching to support semantic search and fast retrieval across prior conversations. You can use it to create contacts and companies with duplicate prevention, fetch activity and recent engagement data, and perform semantic searches across retrieved HubSpot data for context-aware interactions. The server is designed for simplicity with a Docker deployment, and it supports persistent storage to retain embeddings, conversations, and index data between sessions.
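The semantic-search-over-cached-data idea can be illustrated with a minimal sketch: embed previously retrieved records, then rank them against a query embedding by cosine similarity. This toy version uses hand-made vectors in place of real SentenceTransformer embeddings and a plain list in place of a FAISS index; all names and data here are illustrative, not part of the server's code.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "vector store": (embedding, cached HubSpot record) pairs.
store = [
    ([0.9, 0.1, 0.0], {"type": "contact", "name": "Ada Lovelace"}),
    ([0.1, 0.9, 0.0], {"type": "company", "name": "Acme Corp"}),
    ([0.0, 0.2, 0.9], {"type": "thread", "subject": "Renewal pricing"}),
]

def search(query_vec, k=2):
    # Rank cached records by similarity to the query embedding.
    ranked = sorted(store, key=lambda e: cosine(query_vec, e[0]), reverse=True)
    return [record for _, record in ranked[:k]]

print(search([1.0, 0.0, 0.0], k=1))
```

The real server replaces the linear scan with a FAISS index, which keeps lookups fast as the cache of retrieved CRM data grows.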
Available tools include:
- hubspot_create_contact: Create contacts while preventing duplicates
- hubspot_create_company: Create companies while preventing duplicates
- hubspot_get_company_activity: Retrieve activity for specific companies
- hubspot_get_active_companies: Retrieve most recently active companies
- hubspot_get_active_contacts: Retrieve most recently active contacts
- hubspot_get_recent_conversations: Retrieve recent conversation threads with messages
- hubspot_search_data: Semantic search across previously retrieved HubSpot data
To use these tools, run the Docker container with your HubSpot access token and storage path, then point your MCP client (or Claude/other AI assistant) to the configured mcpServer as shown in the Docker configuration. The tools will be invoked by name from your AI prompts or via the MCP orchestration layer, enabling context-rich CRM actions within your workflows.
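For Claude Desktop specifically, an MCP server entry along these lines in claude_desktop_config.json points the client at the Docker container. Treat the exact shape as a sketch and substitute your real token and storage path:

```json
{
  "mcpServers": {
    "hubspot": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "HUBSPOT_ACCESS_TOKEN=your_token",
        "-v", "/path/to/storage:/storage",
        "buryhuang/mcp-hubspot:latest"
      ]
    }
  }
}
```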
How to install
Prerequisites:
- Docker installed on the host
- A HubSpot access token with the required scopes (crm.objects.contacts, crm.objects.companies, sales-email-read)
- Optional: a persistent storage path on the host for embeddings and data
Step-by-step installation:
- Pull and run the HubSpot MCP server via Docker (provide token and storage path):
# Quick start (replace token and paths as needed)
docker run -i --rm \
-e HUBSPOT_ACCESS_TOKEN=your_token \
-v /path/to/storage:/storage \
buryhuang/mcp-hubspot:latest
- Alternatively, build and run locally (if you want to customize the image):
# Clone repository and build the image
git clone https://github.com/buryhuang/mcp-hubspot.git
cd mcp-hubspot
docker build -t mcp-hubspot .
docker run -i --rm \
-e HUBSPOT_ACCESS_TOKEN=your_token \
-v /path/to/storage:/storage \
mcp-hubspot:latest
- If you prefer npx or other CLI tooling over Docker, follow the installation instructions in the repository; the README’s Quick Start primarily demonstrates Docker usage, which is the simplest path.
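If you run the server alongside other services, a docker-compose file is a convenient equivalent of the docker run command above. This is a sketch (the Quick Start only shows docker run), so adapt the service name and paths to your setup:

```yaml
services:
  mcp-hubspot:
    image: buryhuang/mcp-hubspot:latest
    stdin_open: true        # equivalent of docker run -i (stdio transport)
    environment:
      - HUBSPOT_ACCESS_TOKEN=your_token
    volumes:
      - /path/to/storage:/storage
```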
Additional notes
Tips and notes:
- Ensure HUBSPOT_ACCESS_TOKEN has the necessary scopes: crm.objects.contacts (read/write), crm.objects.companies (read/write), and sales-email-read.
- The -v /path/to/storage:/storage option enables persistent storage for embeddings, caches, and thread indexes; if the host path does not exist, Docker creates it (on Linux typically owned by root, so you may prefer to create it yourself first).
- If you need to rotate tokens, update the environment variable HUBSPOT_ACCESS_TOKEN in the container configuration or restart the container with the new token.
- The server uses FAISS for vector storage and SentenceTransformer for embeddings; if you experience performance issues, consider faster disk I/O for the storage path or allocating more memory to the container.
- For multi-platform deployment, use the official image buryhuang/mcp-hubspot:latest and ensure the host has the required architecture support.
- Review HubSpot rate limits and error handling in the server logs to adjust retry behavior if needed.
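HubSpot returns HTTP 429 when rate limits are exceeded. An exponential-backoff wrapper like the following (an illustrative sketch, not code from the server) shows the kind of retry behavior the last tip refers to:

```python
import time

class RateLimited(Exception):
    """Stand-in for an HTTP 429 response from the HubSpot API."""

def with_backoff(call, retries=4, base_delay=1.0, sleep=time.sleep):
    # Retry `call` on rate-limit errors, doubling the delay each attempt.
    for attempt in range(retries):
        try:
            return call()
        except RateLimited:
            if attempt == retries - 1:
                raise
            sleep(base_delay * (2 ** attempt))

# Example: a fake API call that is rate-limited twice, then succeeds.
attempts = {"n": 0}
def fake_api():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RateLimited()
    return {"results": []}

print(with_backoff(fake_api, sleep=lambda s: None))  # {'results': []}
```

In practice you would also honor the Retry-After header when the API provides one, rather than relying on a fixed backoff schedule.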