voice-agent
A Model Context Protocol (MCP) server that integrates Twilio Voice, Deepgram AI, and OpenAI to create intelligent voice-based HR automation tools.
claude mcp add --transport stdio prakharbhardwaj-voice-agent-mcp-server node /path/to/your/voice-agent-mcp-server/mcp-server.js \
  --env NODE_ENV="production" \
  --env SERVER_URL="your_ngrok_url_or_server_url" \
  --env DEEPGRAM_API_KEY="your_deepgram_api_key" \
  --env TWILIO_AUTH_TOKEN="your_twilio_auth_token" \
  --env TWILIO_ACCOUNT_SID="your_twilio_account_sid" \
  --env TWILIO_PHONE_NUMBER="+your_twilio_phone_number"
How to use
Voice Agent MCP Server integrates Twilio Voice, Deepgram, and OpenAI to enable AI-driven voice workflows for HR tasks such as conducting candidate interviews, delivering results, and outreach. The server exposes MCP tools that Claude or other MCP clients can call to initiate calls, manage interviews, and send notifications. Real-time audio is streamed via Twilio Media Streams and processed through Deepgram, with an OpenAI-backed prompt system generating natural, context-aware conversations.

Available tools cover conducting interviews, notifying candidates of results, discussing job openings, checking call status, and verifying the Twilio configuration.

To use it, configure the MCP server in Claude Desktop (or your MCP client) with the provided mcp-config.json entry, ensure the environment variables are set, and start the web server. Then invoke the available tools in your conversations to trigger voice actions such as starting an interview or sending a notification.
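The Twilio side of this flow can be illustrated with a small helper that builds the TwiML a voice webhook returns to bridge call audio into a WebSocket via Twilio's `<Connect><Stream>` verb. The helper name and the `/media-stream` path are illustrative assumptions, not this repository's actual route.

```javascript
// Sketch: build the TwiML that tells Twilio to stream call audio to a
// WebSocket endpoint. Twilio Media Streams use the <Connect><Stream> verb.
// The "/media-stream" path is an assumption, not this repo's actual route.
function buildStreamTwiml(serverUrl) {
  // http -> ws, https -> wss; drop any trailing slash before appending the path
  const wsUrl = serverUrl.replace(/^http/, "ws").replace(/\/$/, "") + "/media-stream";
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    "<Response>",
    `  <Connect><Stream url="${wsUrl}" /></Connect>`,
    "</Response>",
  ].join("\n");
}

console.log(buildStreamTwiml("https://example.ngrok.app"));
```

Twilio then opens a WebSocket to that URL and streams base64-encoded audio frames, which the server forwards to Deepgram for transcription.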
How to install
Prerequisites:
- Node.js 22+ installed
- npm installed
- A Twilio account with Account SID, Auth Token, and a configured outbound number
- A Deepgram account with an API key
- A public URL (ngrok or production URL) for webhooks
Installation steps:
- Clone the repository and install dependencies
git clone https://github.com/prakharbhardwaj/voice-agent-mcp-server.git
cd voice-agent-mcp-server
npm install
- Create and configure environment variables
# Create a .env or set variables in your environment
PORT=3000
SERVER_URL=https://your.public-url
TWILIO_ACCOUNT_SID=your_twilio_account_sid
TWILIO_AUTH_TOKEN=your_twilio_auth_token
TWILIO_PHONE_NUMBER=+your_twilio_phone_number
DEEPGRAM_API_KEY=your_deepgram_api_key
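As a quick sanity check before starting the server, a few lines of Node can confirm the variables above are present. This helper is illustrative and not part of the repository:

```javascript
// Sketch: report which required environment variables are missing.
// The variable list mirrors the .env example above; the helper itself
// is a hypothetical convenience, not part of this repository.
const REQUIRED_VARS = [
  "SERVER_URL",
  "TWILIO_ACCOUNT_SID",
  "TWILIO_AUTH_TOKEN",
  "TWILIO_PHONE_NUMBER",
  "DEEPGRAM_API_KEY",
];

function missingEnvVars(env = process.env) {
  return REQUIRED_VARS.filter((name) => !env[name] || env[name].trim() === "");
}

// Example: a partially configured environment reports what is absent
console.log(missingEnvVars({ SERVER_URL: "https://example.ngrok.app" }));
```

Running a check like this at startup surfaces configuration problems immediately instead of as opaque Twilio or Deepgram auth failures mid-call.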
- Configure MCP server entry
- Edit mcp-config.json (or the equivalent MCP manifest) to include your server:
{
  "mcpServers": {
    "voice-agent-mcp-server": {
      "type": "stdio",
      "command": "node",
      "args": ["/path/to/your/voice-agent-mcp-server/mcp-server.js"],
      "env": {
        "NODE_ENV": "production",
        "SERVER_URL": "your_ngrok_url_or_server_url",
        "TWILIO_ACCOUNT_SID": "your_twilio_account_sid",
        "TWILIO_AUTH_TOKEN": "your_twilio_auth_token",
        "TWILIO_PHONE_NUMBER": "+your_twilio_phone_number",
        "DEEPGRAM_API_KEY": "your_deepgram_api_key"
      }
    }
  }
}
- Start the server
npm run dev
- Ensure your MCP client (e.g., Claude Desktop) is configured to connect to the MCP server using the configuration above, and that the public URL in SERVER_URL is reachable.
Additional notes
Tips and common issues:
- Ensure SERVER_URL is publicly accessible so that Twilio webhooks and media-stream connections can reach it.
- Keep all credentials in environment variables; avoid committing .env files.
- If Claude Desktop does not load the MCP server, verify the absolute paths in mcp-config.json and restart Claude Desktop after changes.
- Review Deepgram and Twilio logs for connection or authorization errors.
- Use the supplied tools list to extend functionality by adding new tools in src/mcp/tools.js and wiring prompts in src/mcp/prompts.js.
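As a starting point for a new tool, an MCP tool definition pairs a name and description with a JSON Schema describing its inputs, matching the shape clients receive from a tools/list response. The tool name and fields below are hypothetical, not copied from src/mcp/tools.js:

```javascript
// Sketch: a hypothetical tool definition in the shape MCP clients expect
// (name, description, JSON Schema inputs). Not taken from src/mcp/tools.js.
const scheduleFollowUpTool = {
  name: "schedule_follow_up_call",
  description: "Place a follow-up call to a candidate at a later time.",
  inputSchema: {
    type: "object",
    properties: {
      phoneNumber: {
        type: "string",
        description: "Candidate phone number in E.164 format",
      },
      callAt: {
        type: "string",
        description: "ISO 8601 timestamp for when to place the call",
      },
    },
    required: ["phoneNumber", "callAt"],
  },
};

console.log(JSON.stringify(scheduleFollowUpTool.inputSchema.required));
```

A definition like this would be registered in src/mcp/tools.js, with the matching conversation prompt wired up in src/mcp/prompts.js.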
Related MCP Servers
mcp-nodejs-debugger
🐞 MCP Node.js debugger
frontmcp
TypeScript-first framework for the Model Context Protocol (MCP). You write clean, typed code; FrontMCP handles the protocol, transport, DI, session/auth, and execution flow.
shinzo-ts
TypeScript SDK for MCP server observability, built on OpenTelemetry. Gain insight into agent usage patterns, contextualize tool calls, and analyze server performance across platforms. Integrate with any OpenTelemetry ingest service including the Shinzo platform.
openai-agent-dotnet
Sample to create an AI Agent using OpenAI models with any MCP server running on Azure Container Apps
mcp-templates
A flexible platform that provides Docker & Kubernetes backends, a lightweight CLI (mcpt), and client utilities for seamless MCP integration. Spin up servers from templates, route requests through a single endpoint with load balancing, and support both deployed (HTTP) and local (stdio) transports — all with sensible defaults and YAML-based configs.
movie-context-provider
An OpenAI App demo built with the OpenAI Apps SDK, ready to deploy on Render.