signallama
A lightweight chat interface designed to work seamlessly with your own MCP servers.
```shell
claude mcp add --transport stdio fbarresi-signallama npx -y --verbose @modelcontextprotocol/server-everything \
  --env USE_STDIO="true" \
  --env MCP_ENDPOINT="https://localhost:7170"
```
How to use
Signallama is a lightweight chat UI with a real-time SignalR frontend that connects to your own MCP (Model Context Protocol) server. It does not enforce a single backend: any MCP-compliant backend can be wired in through the configured MCP endpoints, using either SSE or a stdio-based integration. The README demonstrates both patterns: connecting over SSE to a running MCP server, and launching a stdio-based external process (for example via NPX) that hosts an MCP-compatible backend.

To start using it, configure your MCP connection in the backend's appsettings.json under McpSettings.Sse by providing the endpoint, whether to use streamable HTTP, a server name, and a connection timeout. If you need to integrate with an external project, add a Stdio entry that launches the external MCP process and exposes a compatible interface for Signallama to communicate with.
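Based on the setting names mentioned above (Endpoint, UseStreamableHttp, Name, ConnectionTimeout), a minimal appsettings.json sketch might look like the following. The exact schema is an assumption, so check the repository's README for the authoritative structure; the endpoint value matches the example command at the top of this page:

```json
{
  "McpSettings": {
    "Sse": {
      "Name": "my-mcp-server",
      "Endpoint": "https://localhost:7170",
      "UseStreamableHttp": true,
      "ConnectionTimeout": 30
    }
  }
}
```

.NET's configuration loader tolerates comments and trailing commas in appsettings.json, but the plain JSON above is safe for any parser.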
Once configured, start the MCP backend (or use the provided signallama.mcp if available) and then start the Signallama frontend (signallama.web). The app will render a responsive chat UI powered by SignalR and will exchange messages with your MCP backend in real time. This setup enables you to leverage your own MCP logic while enjoying a modern, lightweight chat interface that can be embedded into larger systems.
How to install
Prerequisites:
- Ollama (optional, for model-backed features or local MCP testing)
- .NET Core SDK
- Access to a compatible MCP server or the ability to run Signallama's backend
Step-by-step installation:
- Install prerequisites
  - Install the .NET Core SDK from the official site (dotnet --version should show a valid SDK).
  - Ensure Ollama is installed if you plan to use local models.
- Prepare the MCP backend (one of the following options):
  - Use the provided Signallama backend (dotnet): ensure the backend DLL is available at path/to/Signallama.Server.dll
  - Or run an external MCP process via stdio (e.g., using NPX): npx -y --verbose @modelcontextprotocol/server-everything
- Configure the Signallama appsettings.json
  - In McpSettings, add an Sse configuration with Endpoint, UseStreamableHttp, Name, and ConnectionTimeout (see the README for the exact JSON structure).
- Start the MCP backend
  - For the dotnet backend: dotnet path/to/Signallama.Server.dll
  - For the stdio backend: use the stdio command defined in your mcp_config (e.g., npx -y --verbose @modelcontextprotocol/server-everything)
- Start the Signallama frontend
  - Run the signallama.web project (or the equivalent startup command for your repo).
- Open the UI
  - Navigate to the Signallama frontend URL and connect to your MCP backend via the configured SSE or stdio channel.
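The startup steps above can be sketched as a shell session. The backend path is the placeholder used in this README, and the frontend command is an assumption based on the signallama.web project name; adjust both to your checkout:

```shell
# Start the Signallama backend (or point this at your own MCP server DLL)
dotnet path/to/Signallama.Server.dll

# Alternatively, launch a stdio-based MCP backend via NPX
npx -y --verbose @modelcontextprotocol/server-everything

# In a second terminal, start the frontend (project name assumed)
dotnet run --project signallama.web
```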
Additional notes
Tips and common considerations:
- Ensure your MCP backend endpoint is accessible from the Signallama frontend (check firewall and TLS settings).
- When using SSE, confirm the endpoint supports the required features (streamable HTTP, proper CORS configuration if the frontend runs separately).
- For stdio integration, ensure the external process exposes a compatible interface that Signallama can communicate with (proper stdout/stderr handling and lifecycle management).
- In appsettings.json, the McpSettings.McpName (or equivalent Name fields) should be unique if you run multiple MCP backends.
- If you encounter connection timeouts, increase ConnectionTimeout in the Sse configuration to account for startup latency of your MCP server.
- Logging: enable verbose logs during setup to diagnose misconfigurations between the frontend and MCP backend.
- If you modify environment variables, document them in the mcp_config under the specific server entry so you know what to configure in different environments.
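To illustrate documenting environment variables under a specific server entry, a stdio entry in mcp_config might look like the sketch below. The file layout is an assumption modeled on the common MCP client configuration format (a mcpServers map of command/args/env), and the variable names come from the example command at the top of this README:

```json
{
  "mcpServers": {
    "fbarresi-signallama": {
      "command": "npx",
      "args": ["-y", "--verbose", "@modelcontextprotocol/server-everything"],
      "env": {
        "USE_STDIO": "true",
        "MCP_ENDPOINT": "https://localhost:7170"
      }
    }
  }
}
```

Keeping the env block next to the command makes it clear which variables each environment must supply.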
Related MCP Servers
Archive-Agent
Find your files with natural language and ask questions.
mem0-selfhosted
Self-hosted mem0 MCP server for Claude Code. Run a complete memory server against self-hosted Qdrant + Neo4j + Ollama while using Claude as the main LLM.
mcpbi
PowerBI MCP server to give LLM clients (Claude, GH Copilot, etc.) context from locally running PowerBI Desktop instances.
memory
An MCP (Model Context Protocol) server providing long-term memory for LLMs.
unity-template
Simple template project for controlling Unity via MCP
vertx
A Vert.x MCP Server built on top of MCP Java SDK