flutter-ai-labs
A curated collection of LLM-powered Flutter apps built using RAG, AI Agents, Multi-Agent Systems, MCP, and Voice Agents.
To register this server with the Claude CLI over stdio (run from the repository root):

```shell
claude mcp add --transport stdio theshivamlko-flutter-ai-labs dart run bin/server.dart
```
How to use
This MCP server is implemented in Dart as part of the Flutter AI Labs project. It runs a configurable Model Context Protocol (MCP) server that can orchestrate LLMs, agents, and multi-agent workflows for Flutter-based AI apps, and it is designed to integrate with multiple LLM providers and runtimes, enabling RAG pipelines, AI agents, and voice-enabled interactions within Flutter experiments.

Once the server is running, connect your MCP clients or tooling to it to query models, manage agent plans, and execute multi-step conversations across different model backends. Typical usage is to launch the server, configure its MCP endpoints, and then use your MCP client tooling to register the tools, intents, and memories that agents draw on during reasoning and task execution.
With this server, you can explore capabilities such as:
- Loading and routing prompts to multiple LLM providers (OpenAI, Anthropic, Google, xAI, and open-source options like Qwen and LLaMA).
- Composing multi-step agent strategies and coordinating between agents for complex Flutter app scenarios.
- Performing document grounding and retrieval-augmented generation (RAG).
- Experimenting with voice-enabled prompts and agent-driven workflows to build responsive Flutter apps powered by LLMs and embedded memory contexts.
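As one concrete way to connect an MCP client, clients that launch stdio servers from a JSON config (the Claude Desktop convention) could register this server as sketched below; the server name and `dart run bin/server.dart` entrypoint are taken from the quick-add command, while the `env` entry is an assumption to adapt to whichever providers you enable:

```json
{
  "mcpServers": {
    "flutter-ai-labs": {
      "command": "dart",
      "args": ["run", "bin/server.dart"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```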
How to install
Prerequisites:
- Dart SDK installed on your machine (preferably the latest stable version).
- Git to clone the repository.
Step-by-step installation:
- Clone the repository: git clone https://github.com/theshivamlko/flutter-ai-labs.git
- Navigate to the MCP server example directory (adjust path if needed): cd flutter-ai-labs
- Install dependencies: dart pub get
- Run the MCP server: dart run bin/server.dart
- Verify the server starts cleanly and is listening on the configured port (commonly 8080, or as defined in your server code), and consult the README for any required environment variables.
If you plan to containerize, refer to your Dart project’s standard Docker setup and ensure the container exposes the MCP port and includes the Dart SDK in the image.
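A minimal Dockerfile sketch, following the official `dart` image's AOT-compile pattern; the `bin/server.dart` entrypoint matches the run command above, while the exposed port is an assumption to adjust to your server's configuration:

```dockerfile
# Build stage: resolve dependencies and AOT-compile the server.
FROM dart:stable AS build
WORKDIR /app
COPY pubspec.* ./
RUN dart pub get
COPY . .
RUN dart compile exe bin/server.dart -o bin/server

# Runtime stage: the Dart runtime libraries plus the compiled binary.
FROM scratch
COPY --from=build /runtime/ /
COPY --from=build /app/bin/server /app/bin/
EXPOSE 8080
CMD ["/app/bin/server"]
```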
Additional notes
Notes and tips:
- This MCP server is implemented in Dart as part of the Flutter AI Labs ecosystem. Ensure your Dart environment is up to date to avoid compatibility issues.
- If you plan to connect multiple LLM providers, prepare your API keys and environment variables as required by each provider (e.g., OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.).
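For example, keys could be kept in a local `.env` file and exported before launching the server; the variable names below are the conventional ones for these providers, but confirm the exact names this server reads in the repository README:

```shell
# Write placeholder provider keys to a local .env file (values are dummies;
# replace them with your real keys and keep .env out of version control).
cat > .env <<'EOF'
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
EOF

# Export everything in .env into the current shell before starting the server.
set -a
. ./.env
set +a
```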
- Check the server logs for any binding or dependency errors when starting up, and ensure network ports are not blocked by firewalls.
- The MCP configuration can be extended to include additional tools, adapters, and memory stores as your Flutter AI experiments grow. Document any new endpoint or tool in your MCP client configuration.
- For local testing, keep a small set of mock tools defined to validate the MCP message flow before wiring up real model providers.
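As a sketch of that message flow: under the MCP specification's 2024-11-05 protocol revision, a stdio client first sends an `initialize` request, then an `initialized` notification, and can then call `tools/list`. Newline-delimited JSON-RPC messages like these can be piped into the server by hand to check that your mock tools appear in the response:

```json
{"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {"protocolVersion": "2024-11-05", "capabilities": {}, "clientInfo": {"name": "smoke-test", "version": "0.1.0"}}}
{"jsonrpc": "2.0", "method": "notifications/initialized"}
{"jsonrpc": "2.0", "id": 2, "method": "tools/list"}
```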
Related MCP Servers
paperdebugger
A Plugin-Based Multi-Agent System for In-Editor Academic Writing, Review, and Editing
multimodal-agents-course
An MCP Multimodal AI Agent with eyes and ears!
mcp-dotnet-samples
A comprehensive set of samples of creating and using MCP servers and clients with .NET
mcp-probe
A Model Context Protocol (MCP) client library and debugging toolkit in Rust. This foundation provides both a production-ready SDK for building MCP integrations and the core architecture for an interactive debugger.
mcp-fuzzer
A generic mcp server fuzzer
mcp s
Production-ready Model Context Protocol (MCP) servers in Python, Go, and Rust for VS Code integration. Enables AI systems to interact with tools via standardized interfaces.