atmosphere
Stream real-time and LLM response data over WebSocket, SSE, and the MCP protocol
claude mcp add --transport stdio atmosphere-atmosphere java -jar atmosphere-mcp.jar \
  --env MCP_TLS="false" \
  --env MCP_HOST="localhost" \
  --env MCP_PORT="8080" \
  --env MCP_LOG_LEVEL="info"
How to use
Atmosphere includes an MCP server module that exposes a Model Context Protocol endpoint over WebSocket, allowing clients to connect and exchange commands and streaming data via MCP sessions. The MCP server is built on top of Atmosphere's transport-agnostic framework, so clients can connect using WebSocket or other transports (SSE, long-polling, gRPC) without changing the server logic. You can define MCP tools on the server side using annotated methods (e.g., @McpTool and @McpParam) and stream results back to connected MCP sessions. This enables real-time, streaming tool calls and responses to AI assistants or other agents that rely on MCP for interoperability.
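The annotation-driven tool style described above can be sketched as follows. The @McpTool and @McpParam names come from the description above, but their package and exact attributes are assumptions, so this sketch declares placeholder annotations to stay self-contained; in a real project you would import Atmosphere's own annotations from the atmosphere-mcp module instead.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Placeholder declarations standing in for Atmosphere's real annotations.
@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.METHOD)
@interface McpTool { String name(); String description() default ""; }

@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.PARAMETER)
@interface McpParam { String name(); }

// A server-side tool: the MCP runtime would discover this method via the
// annotations and stream its result back to the calling MCP session.
public class EchoTool {
    @McpTool(name = "echo", description = "Repeat a message n times")
    public String echo(@McpParam(name = "message") String message,
                       @McpParam(name = "times") int times) {
        return message.repeat(Math.max(0, times));
    }
}
```

Long-running tools would follow the same shape, pushing partial results to the session or a broadcast topic instead of returning a single value.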
How to install
Prerequisites:
- Java 17+ and a JDK installation
- Maven or Gradle for building from source (optional if using pre-built jars)
- Git to clone the repository
Install and run from source:
- Clone the repository:
  git clone https://github.com/Atmosphere/atmosphere.git
  cd atmosphere
- Build the MCP module (atmosphere-mcp):
  mvn -pl atmosphere-mcp -am clean package
- Run the MCP server (example):
  java -jar atmosphere-mcp/target/atmosphere-mcp.jar
- Verify the server is running by connecting a client to ws://localhost:8080 or the configured host/port.
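One way to exercise the verification step is a minimal WebSocket probe using the JDK's built-in java.net.http client (Java 11+). The ws://localhost:8080 endpoint is taken from the steps above; whether your deployment requires a sub-path or a specific handshake payload is an assumption you should check against your configuration.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.WebSocket;
import java.util.concurrent.CompletionStage;

// Connects to the MCP server's WebSocket endpoint and prints whatever it
// sends back; buildAsync fails fast if nothing is listening on the port.
public class McpProbe implements WebSocket.Listener {
    @Override
    public CompletionStage<?> onText(WebSocket ws, CharSequence data, boolean last) {
        System.out.println("received: " + data);
        ws.request(1); // ask for the next frame
        return null;
    }

    public static void main(String[] args) throws Exception {
        WebSocket ws = HttpClient.newHttpClient()
                .newWebSocketBuilder()
                .buildAsync(URI.create("ws://localhost:8080"), new McpProbe())
                .join();
        Thread.sleep(2_000);               // give the server time to respond
        ws.sendClose(WebSocket.NORMAL_CLOSURE, "done").join();
    }
}
```

If the connection is refused, check that the server started cleanly and that MCP_HOST/MCP_PORT match the address you are probing.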
Notes:
- If you prefer Docker, build a container image that runs the java -jar command and exposes the MCP port.
- You can adjust host/port with environment variables MCP_HOST and MCP_PORT as shown in the mcp_config example.
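The MCP_HOST/MCP_PORT lookup can be done with a small helper like the one below. The helper name and fallback values are illustrative (they mirror the defaults used in the examples above); Atmosphere's own configuration code may differ.

```java
// Reads MCP_HOST / MCP_PORT from the environment, falling back to the
// defaults used in the examples above when a variable is unset or blank.
public class McpEnv {
    static String env(String key, String fallback) {
        String value = System.getenv(key);
        return (value == null || value.isBlank()) ? fallback : value;
    }

    public static void main(String[] args) {
        String host = env("MCP_HOST", "localhost");
        int port = Integer.parseInt(env("MCP_PORT", "8080"));
        System.out.println("MCP endpoint: ws://" + host + ":" + port);
    }
}
```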
Additional notes
Tips and common considerations:
- MCP sessions are transport-agnostic; ensure your client selects a compatible transport (WebSocket is the default).
- Use MCP tool annotations (@McpTool, @McpParam) to expose server-side capabilities to AI agents or clients. Streaming tokens from long-running tools can be broadcast to topics or sessions in real time.
- Configure logging (MCP_LOG_LEVEL) to troubleshoot connection issues or tool execution.
- If running in a containerized environment, map the MCP port and ensure network policies allow WebSocket traffic.
- For production, consider securing MCP with TLS and proper authentication/authorization on your MCP endpoints.
Related MCP Servers
ai-code-helper
A 2025 hands-on AI coding-assistant project (by 程序员鱼皮, "Programmer Yupi"), built with Spring Boot 3.5 + Java 21 + LangChain4j + AI as an intelligent programming-study and job-search coaching bot. It covers enterprise-grade AI application development techniques including large-model integration, core LangChain4j features, streaming chat, prompt engineering, RAG retrieval augmentation, vector databases, tool calling, the MCP Model Context Protocol, web crawling, security hardening, Vue.js front-end development, and SSE server push. It aims to help developers master essential AI-era skills, get familiar with the LangChain framework, and improve their learning efficiency and job-market competitiveness as AI full-stack developers.
sonarqube
SonarQube MCP Server
quarkus-workshop-langchain4j
Quarkus Langchain4J Workshop
WigAI
Bitwig Controller Extension that provides an MCP Server for AI Agent control
obsidian
MCP server for Obsidian vault management - enables Claude and other AI assistants to read, write, search, and organize your notes
SchemaCrawler-AI
Free database schema discovery and comprehension tool