
joinly

Make your meetings accessible to AI Agents

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio joinly-ai-joinly docker run --env-file .env ghcr.io/joinly-ai/joinly:latest

How to use

joinly is an MCP server, run as a Docker container, that makes live meetings accessible to AI agents. It joins video calls and exposes meeting tools over the MCP protocol. You connect an external MCP client (or the included joinly-client) to the server to use those tools, control the agent’s behavior, and direct tasks within a live meeting. By default joinly runs as a server and accepts external connections; a client mode is also available. To enable AI capabilities, you must supply LLM provider configuration in the environment (for example, OpenAI). The server can be configured with multiple providers and models, and you can add external MCP servers or tool configurations to extend its capabilities.

How to install

Prerequisites:

  • Docker installed on your machine
  • Internet access to pull the joinly Docker image
  • Optional: an .env file with API keys and provider settings
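
The prerequisites above can be sanity-checked from a shell before you start. This is a minimal sketch using only standard Docker CLI commands; the `.env` filename matches the one used later in this guide.

```shell
# Sanity-check the joinly prerequisites.
if command -v docker >/dev/null 2>&1; then
  echo "docker CLI found"
  # Confirm the daemon is reachable (fails if Docker is installed but not running).
  if docker info >/dev/null 2>&1; then
    echo "docker daemon is running"
  else
    echo "docker daemon is NOT running"
  fi
else
  echo "docker CLI not found -- install Docker first"
fi

# Optional: check for a .env file with API keys and provider settings.
if [ -f .env ]; then
  echo ".env file present"
else
  echo "no .env file yet (created in a later step)"
fi

PREREQ_CHECK_DONE=1
```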

Step-by-step installation:

  1. Create a working directory and prepare the environment
  • mkdir joinly && cd joinly
  • Create a .env file with your LLM and provider configuration (example below)
  2. Prepare environment variables (example .env content)

.env

For an OpenAI LLM:

JOINLY_LLM_MODEL=gpt-4o
JOINLY_LLM_PROVIDER=openai
OPENAI_API_KEY=your-openai-api-key

You can also configure other providers and models as needed
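
The example above can be written to a .env file directly from the shell. This heredoc reproduces the OpenAI settings from this guide; replace the placeholder key with your own before starting the server.

```shell
# Write the example OpenAI configuration to .env (placeholder key -- replace it).
cat > .env <<'EOF'
JOINLY_LLM_MODEL=gpt-4o
JOINLY_LLM_PROVIDER=openai
OPENAI_API_KEY=your-openai-api-key
EOF

grep -c '=' .env   # quick check: should report 3 settings
```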

  3. Pull and run the Docker image
  • Pull the latest image: docker pull ghcr.io/joinly-ai/joinly:latest

  • Start the server with the environment file (it runs as an MCP server by default): docker run --env-file .env -d ghcr.io/joinly-ai/joinly:latest

  4. Connect an MCP client
  • Use an MCP client (e.g., joinly-client) to connect to the server at the container’s exposed port (default port 8000 as described in the docs) and begin using tools in a meeting.
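
The pull and run steps above can be combined into one guarded script. The image name and the default port 8000 come from this guide; publishing the port with `-p 8000:8000` and the container name `joinly` are illustrative assumptions, not documented requirements.

```shell
# Pull and start the joinly server, publishing the default port 8000
# so an external MCP client can reach it. The container name "joinly"
# is an illustrative choice.
IMAGE=ghcr.io/joinly-ai/joinly:latest
if command -v docker >/dev/null 2>&1; then
  docker pull "$IMAGE"
  docker run --env-file .env -d -p 8000:8000 --name joinly "$IMAGE"
  docker logs joinly 2>&1 | tail -n 20   # inspect startup output
else
  echo "docker not available -- skipping pull/run"
fi
RUN_STEP_DONE=1
```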

Notes:

  • To run joinly in client mode inside the container, use the --client option as described in the official Quickstart.
  • You can customize provider configurations and switch between providers by editing the .env file and restarting the container.
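
Switching providers amounts to editing .env and restarting the container, as noted above. A guarded sketch, assuming the container was started with the illustrative name `joinly`:

```shell
# Restart joinly after editing .env so the new provider settings take effect.
# Assumes the container was started with --name joinly (illustrative name).
if command -v docker >/dev/null 2>&1; then
  docker stop joinly && docker rm joinly
  docker run --env-file .env -d ghcr.io/joinly-ai/joinly:latest
else
  echo "docker not available -- nothing to restart"
fi
RESTART_DONE=1
```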

Additional notes

Tips and common issues:

  • Ensure Docker is installed and the daemon is running before pulling the image.
  • The .env file must contain valid API keys for the LLM providers you intend to use. Replace placeholders with your actual keys.
  • If you encounter networking or port mapping issues, verify that the container port (default 8000) is reachable from your client and that you haven’t accidentally overridden it.
  • You can add multiple MCP servers/tools by supplying an mcp-config to your joinly-client, enabling the agent to expose several tools in the meeting. See the example in the Quickstart for configuring multiple servers.
  • Review provider compatibility and model availability for the chosen LLM provider to ensure optimal performance during meetings.
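
The mcp-config mentioned above is typically a JSON file listing the MCP servers a client should launch. This guide does not show the exact schema joinly-client expects, so the `mcpServers` layout below (a convention used by many MCP clients) is an assumption; the two example servers are real published MCP server packages. Verify the shape against the joinly Quickstart before relying on it.

```shell
# Hypothetical mcp-config for joinly-client listing two extra MCP servers.
# The "mcpServers" layout is a common MCP client convention, assumed here.
cat > mcp-config.json <<'EOF'
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
EOF
```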
