observee
Observee (YC S25) lets you build AI agents with 1000+ integrations using MCPs with managed OAuth, Security and Observability
```bash
claude mcp add --transport stdio observee-ai-observee node path/to/server.js \
  --env LOG_LEVEL="info" \
  --env OBSERVEE_API_KEY="your_observee_api_key" \
  --env OBSERVEE_ORGANIZATION="your_org_id"
```
How to use
Observee provides an all-in-one SDK for building MCP-enabled agents with authentication and observability tooling. This MCP server integration centers on the Observee SDK, which exposes tools for AI agents, OAuth authentication flows, and structured logging/monitoring. With Observee, you can let agents call a suite of pre-built MCP tools (email access, messaging, task management, and other integrations) and manage authentication flows across multiple services. Use the agents package to connect an LLM provider (OpenAI, Claude, Gemini, etc.) to your MCP workflow, and use the auth module to simplify OAuth-based sign-in and token management. The logger component provides structured JSON logging with multiple transport options so you can monitor usage, latency, and errors across your MCP routes.
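The structured JSON logging pattern described above can be approximated in plain Node.js for illustration. This is a minimal stdlib-only sketch of level-filtered, one-object-per-line JSON logging; it is not the `@observee/logger` API, whose actual function names and signatures may differ:

```javascript
// Minimal structured JSON logger sketch (stdlib only).
// Illustrative of the pattern; NOT the @observee/logger API.
const LEVELS = { debug: 10, info: 20, warn: 30, error: 40 };

function createLogger(minLevel = process.env.LOG_LEVEL || "info") {
  const threshold = LEVELS[minLevel] ?? LEVELS.info;
  const log = (level, message, fields = {}) => {
    if (LEVELS[level] < threshold) return;
    // One JSON object per line: easy to ship to any log transport.
    console.log(JSON.stringify({
      ts: new Date().toISOString(),
      level,
      message,
      ...fields,
    }));
  };
  return {
    debug: (m, f) => log("debug", m, f),
    info: (m, f) => log("info", m, f),
    warn: (m, f) => log("warn", m, f),
    error: (m, f) => log("error", m, f),
  };
}

const logger = createLogger();
// Extra fields (route, latencyMs, etc.) ride along in the JSON record.
logger.info("mcp.route.called", { route: "email.search", latencyMs: 42 });
logger.debug("filtered out when LOG_LEVEL is info (the default)");
```

Emitting one JSON object per line keeps the output compatible with most log shippers and makes per-route latency and error fields trivially queryable.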
How to install
Prerequisites:
- Node.js 14+ (for TypeScript/JavaScript usage) or Python 3.8+ (for Python usage)
- Basic npm or pip tooling installed on your system.

Option A: Node.js (TypeScript/JavaScript) installation

1) Install the core SDK meta-package (recommended):

```bash
npm install @observee/sdk
```

2) Or install individual packages as needed:

```bash
npm install @observee/agents @observee/auth @observee/logger
```

3) Initialize and run the MCP server using your preferred server file (e.g., server.js) as indicated in your deployment. Ensure environment variables (see below) are configured.

Option B: Python installation

1) Install individual packages (no meta-package yet):

```bash
pip install mcp-agents agent-oauth mcp-logger
```

2) Or install the all-in-one package if available:

```bash
pip install observee
```

3) Run your MCP server script using Python (e.g., python -m observee_server).

General steps for both runtimes:
- Create a server entry point that wires Observee SDK components (Agents, Auth, Logger) into your MCP routing.
- Provide any required environment variables and configuration (see env section below).
- Run the server with your deployment tooling (node server.js or python -m observee_server).
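A pre-flight check for the environment variables the add command passes can be sketched in plain Node.js. The variable names come from the command above; the helper itself is hypothetical and uses only the standard library:

```javascript
// Hypothetical pre-flight env check for an Observee MCP server entry point.
// Variable names match the `claude mcp add` command; stdlib only.
const REQUIRED = ["OBSERVEE_API_KEY", "OBSERVEE_ORGANIZATION"];

function checkEnv(env = process.env) {
  const missing = REQUIRED.filter((name) => !env[name]);
  if (missing.length > 0) {
    // Fail fast with a clear message instead of a cryptic auth error later.
    throw new Error(`Missing required environment variables: ${missing.join(", ")}`);
  }
  return {
    apiKey: env.OBSERVEE_API_KEY,
    organization: env.OBSERVEE_ORGANIZATION,
    logLevel: env.LOG_LEVEL || "info", // optional, defaults to "info"
  };
}
```

Calling `checkEnv()` at the top of your server entry point surfaces misconfigured credentials at startup rather than on the first tool call.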
Additional notes
Environment variables and configuration can include API keys and OAuth credentials. Common issues include misconfigured provider credentials, network egress blocks, or missing permissions for the required MCP tools. Suggested variables to prepare: OBSERVEE_API_KEY (your Observee API key), OBSERVEE_ORGANIZATION (organization/tenant ID), LOG_LEVEL (debug|info|warn|error). When upgrading Observee SDK components, review breaking changes in the release notes, particularly around tool integrations and authentication flows. If you plan to run in Docker, package your server with the appropriate entrypoint and ensure environment variables are passed through. For local development, enable verbose logging to diagnose integration with specific MCP tools and providers.
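The Docker note above can be sketched as a run command that passes the suggested variables through. This is a config fragment, not a tested deployment; the image name `observee-mcp` and the entrypoint are assumptions:

```bash
# Build the image, then pass the env vars through at run time
# (image name and entrypoint are assumptions for illustration):
docker build -t observee-mcp .
docker run --rm \
  -e OBSERVEE_API_KEY="your_observee_api_key" \
  -e OBSERVEE_ORGANIZATION="your_org_id" \
  -e LOG_LEVEL="debug" \
  observee-mcp
```

Setting LOG_LEVEL=debug here matches the local-development advice above: verbose logs make it easier to diagnose a misbehaving tool integration before dialing back to info in production.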
Related MCP Servers
everything-claude-code
The agent harness performance optimization system. Skills, instincts, memory, security, and research-first development for Claude Code, Codex, Cowork, and beyond.
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can be your openclaw alternative. ✨
klavis
Klavis AI (YC X25): MCP integration platforms that let AI agents use tools reliably at any scale
aci
ACI.dev is the open source tool-calling platform that hooks up 600+ tools into any agentic IDE or custom AI agent through direct function calling or a unified MCP server. The birthplace of VibeOps.
SearChat
Search + Chat = SearChat (AI chat with search). Supports OpenAI/Anthropic/VertexAI/Gemini APIs, DeepResearch, the SearXNG metasearch engine, and one-click Docker deployment.
llm-functions
Easily create LLM tools and agents using plain Bash/JavaScript/Python functions.