universal-agent-context
Universal context system for AI agents: discover, translate, and manage agent skills across formats. Includes MCP server, CLI, and Python library.
claude mcp add --transport stdio kylebrodeur-universal-agent-context uvx universal-agent-context serve
How to use
Universal Agent Context System (UACS) exposes its core capabilities through an MCP server so that other agents and orchestration tools can use its semantic conversation tracking, knowledge extraction, and memory/context management. The server is provided by the Python package universal-agent-context and can be launched with uvx for quick use, or run as a deployed MCP server in more formal environments. Through the MCP server, clients can initialize UACS contexts, persist memories, run semantic searches across conversations, and retrieve knowledge-extraction outputs such as decisions and conventions, all backed by UACS's deduplication, crash-resistant storage, and integrated CLI/API surface. Beyond MCP, the project offers a Python API for programmatic access, a CLI for scripting, and an optional bundled web UI for exploring stored context. The MCP server acts as a bridge, letting Claude, Cursor, Windsurf, Cline, and other clients tap into UACS's structured storage, semantic embeddings, and search capabilities without embedding the entire workflow in each client.
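For clients that read a JSON MCP configuration (such as Cursor or Windsurf) rather than the claude mcp add command shown above, an equivalent entry looks like the following sketch; the server-name key is arbitrary and the file location varies by client:

```json
{
  "mcpServers": {
    "universal-agent-context": {
      "command": "uvx",
      "args": ["universal-agent-context", "serve"]
    }
  }
}
```

This mirrors the stdio invocation from the quick-start command: the client spawns uvx as a subprocess and speaks MCP over its stdin/stdout.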
How to install
Prerequisites:
- Python 3.11+ (recommended)
- Internet access to fetch packages
- Optional: uvx installed for quick startup, or use a containerized approach
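The prerequisites above can be checked from a shell before installing; note that uvx ships as part of the uv tool:

```shell
# Confirm a suitable Python is available (3.11 or newer recommended)
python3 --version

# Confirm uvx is on PATH; if it is missing, install uv (pip install uv)
command -v uvx >/dev/null 2>&1 && echo "uvx found" || echo "uvx missing: pip install uv"
```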
Install via uvx (recommended for MCP usage):
# Ensure uvx is installed (it ships with uv: pip install uv, or see the uv install docs)
# Then run the universal-agent-context MCP server directly from PyPI
uvx universal-agent-context serve
Install via pip (direct Python workflow):
# Create and activate a virtual environment (optional but recommended)
python3.11 -m venv venv
source venv/bin/activate
# Install the package
pip install universal-agent-context
# Run the MCP server using the entry point installed by pip
universal-agent-context serve
If you prefer a Docker-based deployment, follow the project’s MCP_SERVER_DOCKER guidance (not shown here) to run the container with the appropriate port bindings and environment configuration.
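As a hedged illustration only (the image name, port, and volume path below are assumptions, not values from the project docs), a deployment along the lines of that guidance might be wired up with a compose file such as:

```yaml
# Hypothetical sketch: replace the image, port, and paths with the values
# given in the project's MCP_SERVER_DOCKER guidance.
services:
  uacs:
    image: universal-agent-context:latest   # assumed image tag
    command: serve
    ports:
      - "8080:8080"                         # assumed port binding
    volumes:
      - uacs-data:/data                     # persist stored memories across restarts
volumes:
  uacs-data:
```

A named volume (or bind mount) matters here because UACS persists memories to disk; without it, stored context is lost when the container is removed.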
Additional notes
Tips and caveats:
- The MCP server exposes UACS features to clients via an MCP-compatible interface. Ensure clients know how to authenticate and authorize access as required by your deployment.
- If using memory/search features, consider enabling and configuring embeddings/models or hardware acceleration as appropriate for your environment.
- For local development, the Python CLI and uvx-based startup provide a straightforward workflow; for production, consider containerization and persistent storage for memories.
- When upgrading from previous versions, review the Migration Guide referenced in the project docs for new semantic API features and any breaking changes.
- If you encounter network or binding issues, verify that the host/port configurations do not conflict with other services and that the MCP endpoint is reachable by your clients.
Related MCP Servers
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can be your openclaw alternative. ✨
mcp-agent
Build effective agents using Model Context Protocol and simple workflow patterns
dexto
A coding agent and general agent harness for building and orchestrating agentic applications.
sdk-typescript
A model-driven approach to building AI agents in just a few lines of code.
AutoDocs
We handle what engineers and IDEs won't: generating and maintaining technical documentation for your codebase, while also providing search with dependency-aware context to help your AI tools understand your codebase and its conventions.
neurolink
Universal AI Development Platform with MCP server integration, multi-provider support, and a professional CLI. Build, test, and deploy AI applications with multiple AI providers.