listenhub
ListenHub's official MCP Server | The AI Voice for Every Creator
claude mcp add --transport stdio --env LISTENHUB_API_KEY="your_api_key_here" marswaveai-listenhub-mcp-server -- npx -y @marswave/listenhub-mcp-server@latest
How to use
The ListenHub MCP Server lets MCP clients interact with ListenHub's AI podcast generation features, FlowSpeech creation, and more. Once the server is running, connected clients and editors can create podcasts with single or dual speakers, generate FlowSpeech prompts, and manage ListenHub resources via the MCP protocol. The server works with common MCP client integrations such as Claude Desktop, Cursor, Windsurf, VS Code (Cline), Zed, Claude CLI, Codex CLI, and other MCP-capable tools. To authenticate, you must provide a ListenHub API key via the LISTENHUB_API_KEY environment variable; the key authorizes access to your ListenHub account and to the features available in your plan (ListenHub Pro and above). The typical workflow is to run the MCP server (via npx -y @marswave/listenhub-mcp-server@latest) and then point your MCP client at it, supplying the API key through environment variables or client configuration where applicable.
How to install
Prerequisites:
- Node.js v18 or higher
- npm
- A ListenHub API key (required to access API features)
- Install Node.js (if not already installed)
- macOS: use the official installer or Homebrew
- Windows: use the official installer or a package manager like winget or Chocolatey
- Linux: install via your distro’s package manager or NodeSource repository
- Obtain API key
- Get your ListenHub API key from the ListenHub API Keys Settings page: https://listenhub.ai/en/settings/api-keys
- Run the MCP server
- The easiest way is to run via npx (no local install required):
npx -y @marswave/listenhub-mcp-server@latest
- (Optional) Provide API key via environment variable
- In your shell, export the key before starting:
export LISTENHUB_API_KEY=your_api_key_here
npx -y @marswave/listenhub-mcp-server@latest
- Alternative configuration methods
- If you’re wiring this into an editor or CLI that expects an MCP config, use the following mcp.json snippet:
{
"mcpServers": {
"listenhub": {
"command": "npx",
"args": ["-y", "@marswave/listenhub-mcp-server@latest"],
"env": {
"LISTENHUB_API_KEY": "your_api_key_here"
}
}
}
}
- Verify the server is reachable
- Start the server and connect your MCP client. With the default stdio transport, the client launches and communicates with the server process directly; with HTTP transport, point the client at the endpoint you configured (for example, the port passed via --port).
Additional notes
- Ensure LISTENHUB_API_KEY is set in the environment for the server to access ListenHub features.
- The server supports HTTP transport as an optional mode via --transport http with a specified port, if your editor requires it. In that case, start it with:
export LISTENHUB_API_KEY=your_api_key_here
npx @marswave/listenhub-mcp-server --transport http --port 3000
Then configure your MCP client to connect to http://localhost:3000/mcp
- This MCP server enables AI podcast generation (single or dual-speaker), FlowSpeech creation, and additional ListenHub capabilities compatible with MCP clients.
- If you upgrade your ListenHub plan, you’ll typically gain access to more features via the same API key-based authentication.
- Common issues often relate to API key misconfiguration or network restrictions; ensure the environment variable is loaded in the environment from which the MCP server runs.
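When running in HTTP mode, many MCP clients are configured with a URL instead of a command. A minimal sketch of such a config, assuming a client whose mcp.json accepts a "url" field (field names vary by client; check your client's documentation):

```json
{
  "mcpServers": {
    "listenhub": {
      "url": "http://localhost:3000/mcp"
    }
  }
}
```

With this form, the client does not launch the server itself, so the HTTP server (and its LISTENHUB_API_KEY environment variable) must already be running in a separate process.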
Related MCP Servers
CanvasMCPClient
Canvas MCP Client is an open-source, self-hostable dashboard application built around an infinite, zoomable, and pannable canvas. It provides a unified interface for interacting with multiple MCP (Model Context Protocol) servers through a flexible, widget-based system.
docmole
Dig through any documentation with AI - MCP server for Claude, Cursor, and other AI assistants
obsidian
MCP server for Obsidian vault management - enables Claude and other AI assistants to read, write, search, and organize your notes
GameMaker
GameMaker MCP server for Cursor - Build GM projects with AI
xgmem
Global Memory MCP server that manages data across all projects.
mcp-turso
MCP server for interacting with Turso-hosted LibSQL databases