
bagel

Chat with your robotics, drone, and IoT data — ChatGPT for the physical world.

Installation

Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio extelligence-ai-bagel -- docker compose run --service-ports ros2-kilted

How to use

Bagel is a Python-based MCP server that runs in Docker and provides a structured way to query and analyze data sources via LLMs. It inspects the metadata and topics in your data, writes the relevant topic messages to an Apache Arrow file, and uses DuckDB to execute deterministic SQL queries against that file. Because the queries are explicit, the results are auditable, and you can guide Bagel to correct any errors.

To use Bagel, start the Docker-based server and connect it to an LLM of your choice (Claude Code, Gemini CLI, Codex, Cursor, Copilot, etc.). Bagel ships with runbooks for several tested LLMs, and you can customize or add more via POML-based capabilities. Once the server is up, prompt Bagel with natural-language questions, such as asking for correlations in a given topic stream or latency statistics for a data source, and it will return structured, explainable results along with the underlying deterministic queries.

Once connected, you send prompts to Bagel to analyze a data source; it generates the corresponding DuckDB queries and returns the results. You can also teach Bagel new tricks with POML files that extend its capabilities, such as computing latency statistics or other domain-specific analyses. The quickstart focuses on running Bagel in a Docker environment and then connecting it to an LLM so you can start prompting and retrieving results.
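The query-over-snapshot pattern described above can be sketched with Python's standard library, using sqlite3 as a stand-in for DuckDB and an in-memory table as a stand-in for the Arrow snapshot (Bagel itself uses DuckDB over Apache Arrow; the topic names, columns, and values here are hypothetical):

```python
import sqlite3

# Hypothetical topic messages that a Bagel-like tool might snapshot
# from a robotics data source.
messages = [
    ("/imu/data", 0.010, 0.02),
    ("/imu/data", 0.020, 0.03),
    ("/camera/image", 0.050, 0.10),
    ("/camera/image", 0.100, 0.12),
]

# Load the snapshot into a queryable table (Bagel writes an Apache Arrow
# file and queries it with DuckDB; sqlite3 stands in for both here).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE topic_messages (topic TEXT, stamp REAL, latency REAL)")
con.executemany("INSERT INTO topic_messages VALUES (?, ?, ?)", messages)

# A deterministic SQL query of the kind Bagel returns alongside its answer:
# per-topic message counts and mean latency.
query = """
    SELECT topic, COUNT(*) AS n, AVG(latency) AS mean_latency
    FROM topic_messages
    GROUP BY topic
    ORDER BY topic
"""
for topic, n, mean_latency in con.execute(query):
    print(f"{topic}: n={n}, mean_latency={mean_latency:.3f}s")
```

Because the SQL is explicit, the result is auditable: you can re-run or adjust the query yourself rather than trusting an opaque LLM answer.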

How to install

Prerequisites:

  • Docker Desktop installed on your machine (Docker Compose support required).
  • Git installed to clone the repository.

Install and run Bagel locally:

# 1) Clone the repository and change into it
git clone https://github.com/Extelligence-ai/bagel.git
cd bagel

# 2) Make sure Docker Desktop is running

# 3) Start Bagel via Docker Compose. To run the recommended ros2-kilted
#    service from compose.yaml with its ports exposed:
docker compose run --service-ports ros2-kilted

# Or, to reproduce the quickstart exactly, build and start in the background:
docker compose up -d --build

Note: The project provides a Docker Compose configuration (compose.yaml) with multiple services. The example shown in the README uses docker compose run --service-ports ros2-kilted to start a specific service. You can edit compose.yaml to pick the service that matches your data source and environment, and then bring it up with docker compose up or docker compose run as demonstrated in the Quickstart.
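As a sketch, a service entry in compose.yaml might be adjusted like this to mount a local data directory into the container. The service name ros2-kilted comes from the quickstart; the build setting, port, and mount paths are illustrative assumptions, not the repository's actual values — check the shipped compose.yaml before editing:

```yaml
services:
  ros2-kilted:
    build: .                # illustrative; keep whatever build/image compose.yaml defines
    ports:
      - "8000:8000"         # assumed server port; match your runbook's endpoint
    volumes:
      - ./data:/data:ro     # mount local recordings read-only for Bagel to analyze
```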

A direct local install (e.g., via PyPI) is possible in principle but not recommended: in this repo, Bagel is designed to run in Docker, and Docker Compose as described above is the supported path.

Additional notes

Tips and gotchas:

  • Bagel is Dockerized to minimize local dependencies; ensure Docker Desktop is running and that you have permission to run containers.
  • If you need to access local files from the container, edit compose.yaml and adjust the volumes section for the chosen service before starting.
  • The server is designed to work with multiple LLMs via runbooks; you can follow the links in the Quickstart to configure Claude Code, Gemini CLI, Codex, Cursor, Copilot, etc.
  • Bagel outputs deterministic DuckDB SQL queries for auditing; review these queries to verify the results and understand how the answer was produced.
  • To teach Bagel new capabilities, create POML files describing the task and required outputs, then prompt Bagel to execute them against your data sources.
  • If you encounter port or connectivity issues, verify that your LLM integration runbooks match the endpoint exposed by the running container (the default bind is usually http://0.0.0.0:8000, reachable from the host as http://localhost:8000).
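To sanity-check connectivity of the kind the last bullet describes, a small standard-library probe can help; the URL you pass should be the endpoint from your runbook (the port and path below are assumptions). The demo spins up a throwaway local HTTP server purely so the probe has something to hit:

```python
import http.server
import threading
import urllib.error
import urllib.request

def endpoint_reachable(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at `url` (any status code)."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # server responded, just with an error status
    except (urllib.error.URLError, OSError):
        return False  # connection refused, timeout, DNS failure, ...

# Demo: start a throwaway local server on an ephemeral port and probe it.
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

reachable_up = endpoint_reachable(f"http://127.0.0.1:{port}/")  # server is listening
reachable_down = endpoint_reachable("http://127.0.0.1:1/")      # nothing listening
print(reachable_up, reachable_down)

server.shutdown()
```

In practice you would call endpoint_reachable against the container's endpoint (e.g., http://localhost:8000) before debugging the LLM-side configuration.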
