perplexity
A Perplexity API MCP server that unlocks Perplexity's search-augmented AI capabilities for LLM agents. Features robust error handling, secure input validation, and transparent reasoning with the showThinking parameter.
claude mcp add --transport stdio cyanheads-perplexity-mcp-server node dist/index.js \
  --env MCP_AUTH_MODE="jwt or oauth" \
  --env MCP_HTTP_HOST="Host for HTTP server (if using http transport)" \
  --env MCP_HTTP_PORT="Port number for HTTP (if using http transport)" \
  --env MCP_LOG_LEVEL="debug | info | warn | error" \
  --env MCP_TRANSPORT_TYPE="stdio or http" \
  --env PERPLEXITY_API_KEY="Your Perplexity API key" \
  --env MCP_AUTH_SECRET_KEY="JWT secret key (min 32 chars)"
How to use
Perplexity MCP Server exposes Perplexity AI capabilities through the MCP interface, enabling LLMs and agents to perform search-augmented queries and deep research via a standardized protocol. The server wraps Perplexity's search tools and a deep-research workflow, making them accessible as MCP operations (perplexity_search and perplexity_deep_research). You can integrate this server with your MCP clients to run quick lookups, filter results by recency or domain, and generate detailed research reports. The built-in cost tracking and robust error handling help manage usage and reliability in complex agent workflows. Tools are designed to be invoked like any MCP operation, returning structured results that can be consumed by your LLMs or agents.
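As an illustrative sketch, a client invokes the search tool with a standard MCP `tools/call` request. The request envelope below follows the MCP specification; the argument names are assumptions for illustration (the README documents showThinking, but the server's actual input schema may differ):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "perplexity_search",
    "arguments": {
      "query": "latest TypeScript release notes",
      "showThinking": true
    }
  }
}
```

The server responds with a structured result payload that your LLM or agent can consume directly.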
How to install
Prerequisites:
- Node.js (>= 18.x)
- npm (comes with Node.js)
- A Perplexity API key
Installation steps:
- Clone the repository:
  git clone https://github.com/cyanheads/perplexity-mcp-server.git
  cd perplexity-mcp-server
- Install dependencies:
  npm install
- Build the project (TypeScript):
  npm run build
- Create a .env file with the required variables (example shown):
  cp .env.example .env
  Then edit .env to set:
  PERPLEXITY_API_KEY=<your-key>
  MCP_TRANSPORT_TYPE=stdio|http
  MCP_HTTP_PORT=<port-if-http>
  MCP_HTTP_HOST=<host-if-http>
  MCP_LOG_LEVEL=info|debug|warn|error
  MCP_AUTH_MODE=jwt|oauth
  MCP_AUTH_SECRET_KEY=<your-secret-key>
- Run the MCP server:
  npm start
Note: If you prefer running directly from the built artifact, ensure dist/index.js (or the configured entry) exists after the build and start with node dist/index.js.
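Before starting the server, a quick pre-flight check can catch misconfiguration early. This is a minimal sketch, not part of the project itself; the variable names mirror the .env example above, and the placeholder values are not real credentials:

```shell
#!/bin/sh
# Sketch: pre-flight check for the environment variables this server reads.
check_env() {
  fail=0
  # The Perplexity API key is always required.
  [ -n "$PERPLEXITY_API_KEY" ] || { echo "PERPLEXITY_API_KEY is not set"; fail=1; }
  # Host and port only matter for the http transport.
  if [ "$MCP_TRANSPORT_TYPE" = "http" ] && { [ -z "$MCP_HTTP_HOST" ] || [ -z "$MCP_HTTP_PORT" ]; }; then
    echo "MCP_HTTP_HOST and MCP_HTTP_PORT must be set for http transport"; fail=1
  fi
  # The README recommends a JWT secret of at least 32 characters.
  if [ "${#MCP_AUTH_SECRET_KEY}" -lt 32 ]; then
    echo "MCP_AUTH_SECRET_KEY should be at least 32 characters"; fail=1
  fi
  [ "$fail" -eq 0 ] && echo "env check passed"
}

# Example run with placeholder values:
export PERPLEXITY_API_KEY="pplx-your-key-here"
export MCP_TRANSPORT_TYPE="stdio"
export MCP_AUTH_SECRET_KEY="change-me-to-a-random-32-plus-char-secret"
check_env   # prints: env check passed
```

Running this before npm start (or before node dist/index.js) gives a clearer failure message than a startup crash.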
Additional notes
Environment variables control both connectivity and security:
- If using HTTP transport, ensure MCP_HTTP_PORT is open and MCP_HTTP_HOST is set correctly.
- For JWT authentication, keep MCP_AUTH_SECRET_KEY secure and sufficiently long (recommended >= 32 characters).
- The Perplexity API key must be kept secret and loaded into PERPLEXITY_API_KEY.
- Use MCP_LOG_LEVEL to control output verbosity in development and production.
- If you encounter timeouts during deep research, consider extending client-side timeouts (for example, to 180 seconds) as recommended in the README.
- The server is built on the mcp-ts-template, so most standard MCP tooling and patterns apply (validation via zod, AsyncLocalStorage for request correlation, etc.).
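One convenient way to produce a sufficiently long MCP_AUTH_SECRET_KEY, assuming openssl is available on your system, is:

```shell
# Generate a random secret for MCP_AUTH_SECRET_KEY.
# -hex 32 yields 32 random bytes encoded as 64 hex characters,
# comfortably above the recommended 32-character minimum.
MCP_AUTH_SECRET_KEY="$(openssl rand -hex 32)"
echo "${#MCP_AUTH_SECRET_KEY}"   # prints: 64
```

Paste the generated value into your .env file rather than committing it to version control.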
Related MCP Servers
obsidian-tools
Add Obsidian integrations like semantic search and custom Templater prompts to Claude or any MCP client.
mcp-omnisearch
🔍 A Model Context Protocol (MCP) server providing unified access to multiple search engines (Tavily, Brave, Kagi), AI tools (Perplexity, FastGPT), and content processing services (Jina AI, Kagi). Combines search, AI responses, content processing, and enhancement features through a single interface.
Matryoshka
MCP server for token-efficient analysis of large documents via REPL state
pluggedin-app
The Crossroads for AI Data Exchanges. A unified, self-hostable web interface for discovering, configuring, and managing Model Context Protocol (MCP) servers—bringing together AI tools, workspaces, prompts, and logs from multiple MCP sources (Claude, Cursor, etc.) under one roof.
Agentic-Skill
Agentic-MCP, a progressive MCP client with three-layer lazy loading. Validates the AgentSkills.io pattern for efficient token usage: use MCP without pre-installing servers or wasting tokens on full loading.
ntfy
An MCP (Model Context Protocol) server designed to interact with the ntfy push notification service. It enables LLMs and AI agents to send notifications to your devices with extensive customization options.