mcp-gohilvl

A set of tools to connect GoHighLevel data to your LLMs.
    claude mcp add --transport stdio delorenj-mcp-server-gohilvl node server.js \
      --env LOG_LEVEL="info" \
      --env OPENAI_API_KEY="your-openai-api-key" \
      --env GOHIGHLEVEL_API_KEY="your-go-highlevel-api-key"
How to use
This MCP server bridges GoHighLevel data with large language models (LLMs) so you can build intelligent assistants, chatbots, and analytical tools that understand your GoHighLevel data. The server exposes a set of tools and endpoints that let you query contacts, deals, appointments, notes, and activities from GoHighLevel and pass that context to an LLM for tasks like summarization, question answering, and automated outreach suggestions.

To get started, configure your API keys for GoHighLevel and your chosen LLM provider, then run the MCP server. Once running, you can call the server's endpoints to fetch data from GoHighLevel and generate LLM-driven responses or actions based on that data.
Available capabilities typically include:
- Data retrieval: fetch records such as contacts, deals, tasks, and activities from GoHighLevel.
- Context augmentation: enrich prompts with relevant data (e.g., contact history, deal stage, upcoming tasks).
- LLM-driven actions: generate outreach templates, follow-up messages, or summary notes powered by your preferred LLM.
- Canned tools: pre-configured utilities to format data, run simple analyses, or assemble structured responses for downstream apps.
To use a tool, send a request to the MCP server with your GoHighLevel data context and specify the desired tool. The server will route the request to the appropriate tool, which will interface with the LLM and return a structured result that you can display in your app or chat interface.
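As a concrete sketch of what such a request looks like on the wire: MCP clients call tools with JSON-RPC 2.0 `tools/call` messages. The helper below builds one. The tool name `get_contacts` and its arguments are hypothetical examples; discover this server's actual tool catalog via a `tools/list` request.

```javascript
// Build the JSON-RPC 2.0 message an MCP client sends to invoke a tool.
// "get_contacts" and its arguments are illustrative, not this server's
// guaranteed tool names -- check the tools/list response for the real ones.
function buildToolCallRequest(id, toolName, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: toolName, arguments: args },
  };
}

const request = buildToolCallRequest(1, "get_contacts", { query: "Jane" });
console.log(JSON.stringify(request));
```

The structured result comes back as the matching JSON-RPC response, which your app or chat interface can render directly.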
How to install
Prerequisites:
- Node.js (>= 16) and npm
- Access tokens for GoHighLevel API and OpenAI (or your chosen LLM provider)
Installation steps:
- Clone the repository or download the MCP server package:

      # Example clone (adjust URL to your repo)
      git clone https://github.com/your-org/delorenj-mcp-server-gohilvl.git
      cd delorenj-mcp-server-gohilvl

- Install dependencies:

      npm install
- Configure environment variables:
  - GoHighLevel API key: GOHIGHLEVEL_API_KEY
  - OpenAI API key: OPENAI_API_KEY
  - Optional: LOG_LEVEL, etc. Create a .env file or export the variables:

        export GOHIGHLEVEL_API_KEY="your-go-highlevel-api-key"
        export OPENAI_API_KEY="your-openai-api-key"
        export LOG_LEVEL="info"
- Run the server:

      npm start
- Optional: run in production using a process manager or container:
- Docker: build and run a container with the environment variables set
- PM2 or similar: start the Node server with your process manager
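The environment variables configured above can be checked once at startup so a missing key fails fast instead of surfacing later as a confusing authentication error. This is a minimal sketch; the function name, config shape, and default log level are illustrative, not part of the server's API.

```javascript
// Validate required environment variables at startup (sketch).
// Throws immediately if a key is missing, naming the offending variables.
function loadConfig(env = process.env) {
  const required = ["GOHIGHLEVEL_API_KEY", "OPENAI_API_KEY"];
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
  return {
    gohighlevelApiKey: env.GOHIGHLEVEL_API_KEY,
    openaiApiKey: env.OPENAI_API_KEY,
    logLevel: env.LOG_LEVEL || "info", // optional, with a sensible default
  };
}
```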
Notes:
- Ensure your GoHighLevel API key has the necessary permissions for the data you intend to access.
- Monitor logs for any authentication or rate-limit issues and adjust your environment variables accordingly.
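For the Docker option mentioned above, a minimal image might look like the sketch below. The Node version and the `node server.js` entry point are assumptions taken from the setup command earlier; adjust both to match your repo.

```dockerfile
# Sketch of a container image for the MCP server (adjust to your repo).
FROM node:20-slim
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
# API keys are injected at run time rather than baked into the image, e.g.:
#   docker run -e GOHIGHLEVEL_API_KEY=... -e OPENAI_API_KEY=... <image>
CMD ["node", "server.js"]
```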
Additional notes
Tips:
- Keep your API keys secure; use environment variables or secret management if deploying to production.
- If you encounter authentication errors, double-check the GoHighLevel API permissions and the API key scope.
- Define clear prompts and data context when using LLM tools to avoid boilerplate or irrelevant results.
- Consider adding rate-limit handling in your client to avoid hitting MCP server limits.
- If you rename the server (e.g., in the slug), update the mcp_config server key accordingly to match your deployment.
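The rate-limit tip above can be sketched as a small client-side retry helper with exponential backoff. The attempt count and delays are illustrative defaults, and `callFn` stands in for whatever async function issues your request to the MCP server.

```javascript
// Retry an async call with exponential backoff (sketch).
// callFn is any async function that throws on failure (e.g. a 429).
async function withRetry(callFn, { attempts = 3, baseDelayMs = 500 } = {}) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await callFn();
    } catch (err) {
      if (i === attempts - 1) throw err; // out of retries: surface the error
      const delay = baseDelayMs * 2 ** i; // 500 ms, 1 s, 2 s, ...
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```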
Configuration options to consider:
- GOHIGHLEVEL_API_KEY: GoHighLevel API access token.
- OPENAI_API_KEY: Key for the LLM provider (OpenAI or compatible).
- LOG_LEVEL: Logging detail level (e.g., debug, info, warn, error).