concierge-os
Open Source AI platform to build Agentic services, ChatGPT Apps and MCP Servers
```
claude mcp add --transport stdio concierge-hq-concierge-os \
  python -m openmcp run concierge-os \
  --env OPENMCP_CONFIG="placeholder for runtime configuration (if needed)"
```
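For MCP clients configured through a JSON file rather than the `claude` CLI, the same registration can be sketched roughly as below (field names follow the common `mcpServers` config shape used by MCP clients; verify against your client's documentation):

```json
{
  "mcpServers": {
    "concierge-hq-concierge-os": {
      "command": "python",
      "args": ["-m", "openmcp", "run", "concierge-os"],
      "env": {
        "OPENMCP_CONFIG": "placeholder for runtime configuration (if needed)"
      }
    }
  }
}
```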
How to use
Concierge OS is a Python-based MCP server built on the Concierge framework. It exposes application tools and workflows to agents through the MCP/OpenMCP ecosystem, enabling declarative, multi-step workflows and agent-based navigation through complex service hierarchies. With Concierge you can publish tools (tasks) and workflows, render interactive widgets, and integrate with OpenMCP for seamless agent interactions. The server is designed to work with the openmcp-sdk and uses the concepts of tasks, stages, and workflows to guide agents through business processes.
To use it, install the OpenMCP SDK, initialize your project, and deploy your MCP service. The server will expose your defined tools and workflows to agents, enabling them to call tasks, move between stages, and manage state as they complete goals. Concierge also supports OpenMCP features like widget support, inspector debugging, and compatibility with ChatGPT Apps, making it easier to build web-exposed services that agents can orchestrate. Refer to the documentation at getconcierge.app/docs for detailed usage and examples.
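The task/stage/workflow model described above can be illustrated with a small, self-contained sketch. This is plain Python, not the openmcp-sdk or Concierge API (whose actual signatures are not shown in this document): a workflow holds named stages, each stage exposes tasks, and completing a task moves the agent to the next stage.

```python
from dataclasses import dataclass, field


@dataclass
class Workflow:
    """Conceptual model only: stage name -> {task name: next stage name}."""
    stages: dict = field(default_factory=dict)
    current: str = ""

    def add_stage(self, name: str, tasks: dict) -> None:
        # The first stage registered becomes the entry point.
        if not self.stages:
            self.current = name
        self.stages[name] = tasks

    def run_task(self, task: str) -> str:
        # Completing a task transitions the agent to the task's target stage.
        next_stage = self.stages[self.current][task]  # KeyError if task unknown
        self.current = next_stage
        return self.current


wf = Workflow()
wf.add_stage("browse", {"select_item": "checkout"})
wf.add_stage("checkout", {"pay": "done"})
print(wf.run_task("select_item"))  # -> checkout
print(wf.run_task("pay"))          # -> done
```

In the real SDK, tasks would be backed by tool implementations and state would be managed by the server; the sketch only shows the stage-transition idea.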
How to install
Prerequisites:
- Python 3.10+ installed on your system
- PIP (usually comes with Python)
- Internet access to install packages
Step-by-step install:
1. Create and activate a virtual environment (recommended):

   ```
   python -m venv venv
   # On Windows
   venv\Scripts\activate
   # On macOS/Linux
   source venv/bin/activate
   ```

2. Install the OpenMCP SDK (Concierge/MCP core):

   ```
   pip install openmcp-sdk
   ```

3. Initialize the Concierge MCP project (if not already initialized):

   ```
   openmcp init
   ```

4. Deploy the Concierge MCP server:

   ```
   openmcp deploy
   ```

5. Run the server directly (if needed) using the managed runner:

   ```
   python -m openmcp run concierge-os
   ```
Notes:
- Ensure your environment has network access to fetch dependencies.
- If you maintain separate environments for different MCP servers, use dedicated virtual environments per server.
- Review any module-specific configuration in your project’s config files or environment variables as required by your deployment setup.
Additional notes
Environment variables and configuration tips:
- OPENMCP_CONFIG can be used to pass runtime configuration in some setups; replace the placeholder shown in the registration command with the values required by your deployment environment.
- Keep Python dependencies isolated via virtual environments to avoid version conflicts.
- If you plan to expose additional tools or widgets, ensure they are declared in your MCP project configuration and that you have proper access controls for production deployments.
- Regularly update openmcp-sdk to benefit from new features and security patches.
- For debugging, leverage Concierge inspector tooling and the OpenMCP integration to validate tool execution, state management, and workflow transitions before deploying to production.
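How a server consumes OPENMCP_CONFIG is deployment-specific; as a hypothetical illustration (assuming the variable carries a JSON object, which this document does not specify), a defensive loader might look like:

```python
import json
import os


def load_openmcp_config(env_var: str = "OPENMCP_CONFIG") -> dict:
    """Read runtime configuration from an environment variable.

    Assumption (not from the openmcp-sdk docs): the variable holds a JSON
    object. Returns an empty dict when it is unset, malformed, or not an
    object, so startup never fails on bad configuration.
    """
    raw = os.environ.get(env_var, "")
    try:
        parsed = json.loads(raw) if raw else {}
    except json.JSONDecodeError:
        return {}
    return parsed if isinstance(parsed, dict) else {}


os.environ["OPENMCP_CONFIG"] = '{"log_level": "debug"}'
print(load_openmcp_config())  # -> {'log_level': 'debug'}
```

Falling back to an empty dict keeps the example tolerant of placeholder values like the one shown in the registration command.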
Related MCP Servers
dify
Production-ready platform for agentic workflow development.
bytebot
Bytebot is a self-hosted AI desktop agent that automates computer tasks through natural language commands, operating within a containerized Linux desktop environment.
lamda
The most powerful Android RPA agent framework, next generation of mobile automation robots.
solace-agent-mesh
An event-driven framework designed to build and orchestrate multi-agent AI systems. It enables seamless integration of AI agents with real-world data sources and systems, facilitating complex, multi-step workflows.
concierge
🚀 Universal SDK for building next-gen MCP servers
argo
ARGO is an open-source AI Agent platform that brings Local Manus to your desktop. With one-click model downloads, seamless closed-LLM integration, and offline-first RAG knowledge bases, ARGO becomes a DeepResearch powerhouse for autonomous thinking and task planning, with 100% of your data staying local. Supports Windows/macOS/Docker.