osaurus
AI edge infrastructure for macOS. Run local or cloud models, share tools across apps via MCP, and power AI workflows with a native, always-on runtime.
Add the Osaurus MCP server to Claude Code: claude mcp add --transport stdio osaurus-ai-osaurus osaurus mcp
How to use
Osaurus exposes an MCP server that lets AI agents reach local and remote model providers through a unified set of tools. Once the server is running, MCP clients can discover and invoke tools exposed by Osaurus itself and by any configured remote MCP providers, enabling tool calling, model inference, and workflow automation inside agent runtimes. A typical setup: start the server, connect an MCP client (e.g., Cursor or Claude Desktop) using the example configuration, and optionally add remote MCP providers to aggregate tools from external MCP servers. For inference, Osaurus ships a local MLX-based runtime for on-device models and can route requests to compatible cloud APIs (OpenAI, Anthropic, Ollama, etc.) as needed, while presenting a shared tool surface for agents to call tasks implemented by Osaurus tools and plugins.
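As a sketch of the inference path, the snippet below builds a chat request and shows how it could be sent to the local server. The /v1/chat/completions path follows the OpenAI convention and is an assumption here, as is the model name; the port 1337 default comes from the configuration notes below.

```shell
# Build a chat request payload; the model name here is illustrative.
cat > /tmp/osaurus-request.json <<'EOF'
{
  "model": "llama-3.2-3b-instruct",
  "messages": [{"role": "user", "content": "Hello from Osaurus"}]
}
EOF

# Sanity-check the JSON locally
python3 -m json.tool /tmp/osaurus-request.json

# With `osaurus serve` running, send it to the assumed OpenAI-compatible endpoint:
# curl -s http://localhost:1337/v1/chat/completions \
#   -H 'Content-Type: application/json' \
#   -d @/tmp/osaurus-request.json
```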
How to install
Prerequisites:
- macOS on Apple Silicon (required for the local MLX-based model runtime)
- Homebrew installed
- Access to the internet to download dependencies
Step 1: Install Osaurus
- Brew-based installation (macOS): brew install --cask osaurus
Step 2: Confirm the MCP server capability
- Ensure Osaurus is installed and accessible from your PATH. You should be able to run the UI or CLI commands such as osaurus serve.
Step 3: Start the MCP server
- Start Osaurus in MCP mode to expose the MCP endpoints: osaurus serve
- By default, Osaurus serves on port 1337. You can customize the port via Osaurus configuration if needed.
Step 4: Configure an MCP client to connect
- Use the following MCP client configuration to connect to Osaurus:

  {
    "mcpServers": {
      "osaurus": {
        "command": "osaurus",
        "args": ["mcp"]
      }
    }
  }
Optional steps:
- In Osaurus, connect to additional remote MCP providers to aggregate tools from external MCP servers.
- Access the Management UI to configure providers and tools as needed.
Additional notes
Tips and notes:
- The MCP server is part of Osaurus, an edge runtime for macOS. It exposes tools to AI agents via MCP and can integrate remote MCP providers for broader tool access.
- Default server port for local API access is 1337; you can adjust networking settings in Osaurus configuration if required.
- If you encounter connectivity issues, verify that the OS firewall or network policies permit localhost:1337 and any configured ports.
- When using remote providers, ensure API keys or credentials are configured in Osaurus Management UI or environment as appropriate for each provider.
- The MCP configuration example uses the server name "osaurus"; you can rename the entry to fit your client configuration, but keep the command as the Osaurus binary and the argument as shown ("mcp").
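The connectivity tip above can be checked with a quick probe of the default port. This is a minimal sketch; it only reports whether anything answers on localhost:1337 and assumes curl is available.

```shell
# Probe the default Osaurus port; prints one line either way.
if curl -s --max-time 2 -o /dev/null http://localhost:1337/; then
  status="reachable"
else
  status="not reachable"
fi
echo "localhost:1337 is $status" | tee /tmp/osaurus-port-check.txt
```

If the port is not reachable, confirm that `osaurus serve` is running and that the firewall permits local connections on the configured port.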
Related MCP Servers
SearChat
Search + Chat = SearChat (AI Chat with Search). Supports OpenAI/Anthropic/VertexAI/Gemini APIs, DeepResearch, the SearXNG metasearch engine, and one-command Docker deployment.
mcp-llm
An MCP server that provides LLMs access to other LLMs
ToolRAG
Unlimited LLM tools, zero context penalties — ToolRAG serves exactly the LLM tools your user-query demands.
mcp-templates
A flexible platform that provides Docker & Kubernetes backends, a lightweight CLI (mcpt), and client utilities for seamless MCP integration. Spin up servers from templates, route requests through a single endpoint with load balancing, and support both deployed (HTTP) and local (stdio) transports — all with sensible defaults and YAML-based configs.
mcp-chat-widget
Configure, host and embed MCP-enabled chat widgets for your website or product. Lightweight and extensible Chatbase clone to remotely configure and embed your agents anywhere.
SwiftMCP
Model Context Protocol (MCP) Swift