LimaCharlie MCP Server for AI Agents
```shell
claude mcp add --transport stdio refractionpoint-lc-mcp-server ./lc-mcp-server \
  --env LC_OID="your-organization-id" \
  --env MCP_MODE="stdio" \
  --env LOG_LEVEL="info" \
  --env LC_API_KEY="your-api-key" \
  --env MCP_PROFILE="all"
```
How to use
This MCP server exposes LimaCharlie functionality through the Model Context Protocol (MCP), enabling AI assistants to query telemetry, inspect endpoints, respond to threats, manage detections and platform configuration, and generate security content. It translates natural-language requests into LimaCharlie API actions (for example, retrieving sensor inventories, listing processes, or isolating hosts) through its MCP tool profiles. Claude Code can connect over STDIO for local development, while production deployments can use either STDIO or HTTP transports, with OAuth 2.1 available for cloud access.
To use the server, configure credentials (organization ID and API key), select an appropriate profile, and start the binary. The server supports multiple tool profiles, such as core, live_investigation, threat_response, and platform_admin, each grouping a related set of MCP tools. In Claude Code, register lc-mcp-server with the MCP client and set the environment variables (MCP_MODE=stdio and MCP_PROFILE=all, along with LC_OID and LC_API_KEY). Claude can then handle natural-language requests such as querying recent detections, investigating a process, or isolating an endpoint; the MCP layer routes these to the corresponding LimaCharlie APIs and returns structured results.
Key capabilities include querying telemetry with natural language (translated to LCQL), real-time endpoint investigation (processes, network, files), threat response actions (isolate hosts, tag sensors, task endpoints), detection management (D&R and YARA rules), platform administration (outputs, integrations, configurations), and AI-assisted content generation (rules, queries, playbooks). The server is designed for multi-tenant use, supports multiple profiles, and emphasizes performance and security with context-based auth and cache keys.
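To illustrate the natural-language-to-LCQL translation, a request like "show Windows processes that launched PowerShell in the last 24 hours" would map to a query roughly like the following. This is an illustrative sketch following the general LCQL shape (timeframe | sensor selector | event type | filter); the exact query the server generates may differ:

```
-24h | plat == windows | NEW_PROCESS | event/FILE_PATH contains "powershell"
```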
How to install
Prerequisites:
- Go 1.20+ toolchain installed on your system
- Git, or a downloaded release containing the lc-mcp-server binary
- Access to LimaCharlie API (LC_OID and LC_API_KEY)
- Build the server (from source):

```shell
# Clone the repository (URL assumes the refractionPOINT GitHub organization)
git clone https://github.com/refractionPOINT/lc-mcp-server.git
cd lc-mcp-server

# Build the MCP server binary
go build -o lc-mcp-server ./cmd/server
```
- Configure credentials by setting environment variables in your shell or deployment:

```shell
export LC_OID="your-organization-id"
export LC_API_KEY="your-api-key"
export MCP_MODE="stdio"      # or your preferred transport
export MCP_PROFILE="all"     # choose a profile or combination
export LOG_LEVEL="info"      # one of: debug, info, warn, error
```
- Run the server:

```shell
./lc-mcp-server
```
If you prefer to run a pre-built binary, download the release artifacts and execute the lc-mcp-server binary with the same environment variables as above.
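Once running in stdio mode, you can sanity-check the transport by sending an MCP initialize request on stdin; a healthy server should reply with a JSON-RPC result on stdout. This is a sketch: the protocolVersion below is one published MCP revision, and the credential values are placeholders.

```shell
printf '%s\n' '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}' \
  | LC_OID="your-organization-id" LC_API_KEY="your-api-key" MCP_MODE=stdio ./lc-mcp-server
```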
Additional notes
Tips and common considerations:
- Ensure LC_OID and LC_API_KEY are valid for the target LimaCharlie organization.
- Use MCP_MODE=stdio for local development with Claude Code; switch to an HTTP/OAuth setup for cloud deployments.
- The server supports multiple tool profiles; you can define distinct MCP_PROFILE values per instance (e.g., live_investigation, detection_engineering) to tailor tool availability.
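For example, the claude mcp add command from earlier can register two separately named instances, each exposing a different profile (the instance names lc-investigation and lc-detection are illustrative):

```shell
# Investigation-focused instance
claude mcp add --transport stdio lc-investigation ./lc-mcp-server \
  --env LC_OID="your-organization-id" --env LC_API_KEY="your-api-key" \
  --env MCP_MODE="stdio" --env MCP_PROFILE="live_investigation"

# Detection-engineering-focused instance
claude mcp add --transport stdio lc-detection ./lc-mcp-server \
  --env LC_OID="your-organization-id" --env LC_API_KEY="your-api-key" \
  --env MCP_MODE="stdio" --env MCP_PROFILE="detection_engineering"
```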
- Enable verbose logging during debugging by setting LOG_LEVEL=debug, then revert to info or warn in production.
- If you run into transport or authentication issues, verify that the server can reach LimaCharlie endpoints and that API keys are not expired.
- For multi-tenant setups, consider UID-based or OAuth configurations as described in the README to isolate access per organization.