joinly
Make your meetings accessible to AI Agents
claude mcp add --transport stdio joinly-ai-joinly -- docker run -i --env-file .env ghcr.io/joinly-ai/joinly:latest
How to use
joinly is an MCP server, distributed as a Docker container, that provides an AI-assisted meeting experience. It can join video calls and exposes meeting tools to external clients over the MCP protocol. Connect an external MCP client (or the included joinly-client) to the server to use those tools, control the agent's behavior, and direct tasks within a live meeting. By default joinly runs as a server and accepts external connections, though a client mode is also available. To enable AI capabilities, you must supply LLM provider configuration in the environment (for example, for OpenAI). Multiple providers and models can be configured, and you can add external MCP servers or tool configurations to extend its capabilities.
How to install
Prerequisites:
- Docker installed on your machine
- Internet access to pull the joinly Docker image
- A .env file with API keys and provider settings (required to enable AI capabilities)
Step-by-step installation:
- Create a working directory and prepare the environment:
  - mkdir joinly && cd joinly
  - Create a .env file with your LLM and provider configuration. Example .env for the OpenAI provider:
    JOINLY_LLM_MODEL=gpt-4o
    JOINLY_LLM_PROVIDER=openai
    OPENAI_API_KEY=your-openai-api-key
  - You can also configure other providers and models as needed.
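The directory and .env setup above can be scripted in one step; this sketch uses the OpenAI example values from the docs, with the API key left as a placeholder you must replace:

```shell
# Create the working directory and an example .env for the OpenAI provider.
# Replace your-openai-api-key with a real key before starting the server.
mkdir -p joinly && cd joinly
cat > .env <<'EOF'
JOINLY_LLM_MODEL=gpt-4o
JOINLY_LLM_PROVIDER=openai
OPENAI_API_KEY=your-openai-api-key
EOF
```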
- Pull and run the Docker image:
  - Pull the latest image: docker pull ghcr.io/joinly-ai/joinly:latest
  - Start the server (MCP server mode by default) with the environment file, publishing the default port so clients can reach it: docker run -d -p 8000:8000 --env-file .env ghcr.io/joinly-ai/joinly:latest
- Connect a client:
  - Use an MCP client (e.g., the included joinly-client) to connect to the container's exposed port (default 8000, as described in the docs) and begin using the tools in a meeting.
Notes:
- To run joinly in client mode inside the container, use the --client option as described in the official Quickstart.
- You can customize provider configurations and switch between providers by editing the .env file and restarting the container.
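Switching providers or models amounts to editing .env and re-creating the container. A minimal sketch (the model name gpt-4o-mini is just an illustrative value, and the commented docker commands use a placeholder container ID):

```shell
# Demonstration: update the model line in .env, then re-create the container.
# Note: docker restart does NOT reload --env-file; stop and run a new container.
printf 'JOINLY_LLM_MODEL=gpt-4o\nJOINLY_LLM_PROVIDER=openai\n' > .env
sed -i.bak 's/^JOINLY_LLM_MODEL=.*/JOINLY_LLM_MODEL=gpt-4o-mini/' .env
# docker stop <container-id>
# docker run -d -p 8000:8000 --env-file .env ghcr.io/joinly-ai/joinly:latest
```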
Additional notes
Tips and common issues:
- Ensure Docker is installed and the daemon is running before pulling the image.
- The .env file must contain valid API keys for the LLM providers you intend to use. Replace placeholders with your actual keys.
- If you encounter networking or port mapping issues, verify that the container port (default 8000) is published and reachable from your client, and that you haven't overridden it unintentionally.
- You can add multiple MCP servers/tools by supplying an mcp-config to your joinly-client, enabling the agent to expose several tools in the meeting. See the example in the Quickstart for configuring multiple servers.
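A minimal mcp-config sketch, assuming joinly-client accepts the common `mcpServers` JSON shape used by most MCP clients; the server name, image, and command below are placeholders, not real joinly defaults:

```shell
# Hypothetical mcp-config.json following the common MCP client convention.
# The entry name and docker image are illustrative placeholders.
cat > mcp-config.json <<'EOF'
{
  "mcpServers": {
    "my-extra-tool": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "ghcr.io/example/other-mcp:latest"]
    }
  }
}
EOF
```

Pass this file to joinly-client as its mcp-config to expose the additional tools alongside joinly's own in the meeting.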
- Review provider compatibility and model availability for the chosen LLM provider to ensure optimal performance during meetings.