aica
aica (AI Code Analyzer) reviews your code using AI. It supports both a CLI and GitHub Actions.
To register aica as an MCP server with Claude Code:

claude mcp add --transport stdio dotneet-aica node ./dist/server.js \
  --env GITHUB_TOKEN="your_github_token (or ensure gh auth token is available)" \
  --env AICA_LANGUAGE="preferred_output_language" \
  --env GOOGLE_API_KEY="your_google_api_key" \
  --env OPENAI_API_KEY="your_openai_api_key" \
  --env AICA_LLM_PROVIDER="optional: openai|anthropic|google depending on your setup" \
  --env ANTHROPIC_API_KEY="your_anthropic_api_key"
How to use
aica provides an AI-powered code analysis and review workflow exposed as an MCP server. It supports both stdio and SSE transports for integration with tools that follow the MCP protocol, enabling AI-assisted code reviews, change summaries, and AI-generated commit messages within your development environment. Start the MCP server using the included binary or script, then connect your tooling to the aica MCP endpoint to request reviews, agent-driven actions, and automated PR creation. You can customize the behavior through aica.toml and prompt configurations to tailor the AI agent, languages, and providers.
Once running, the server can perform AI-driven code reviews across diffs, generate AI-assisted commit messages, create pull requests with AI-generated titles and bodies, and drive an agent that carries out tasks based on your prompts. The MCP setup supports stdio connections for local tooling and SSE endpoints for streaming updates, enabling responsive, interactive AI workflows in your IDE or CI pipelines.
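As a rough illustration of the customization mentioned above, an aica.toml might select a provider and output language like this (the key names below are assumptions for illustration, not taken from this README — check the project's documentation for the actual schema):

```toml
# Hypothetical aica.toml sketch -- key names are illustrative assumptions.
[llm]
provider = "openai"   # or "anthropic" / "google", matching AICA_LLM_PROVIDER
model = "o3-mini"

[review]
language = "en"       # preferred output language for review comments
```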
How to install
Prerequisites:
- Git installed
- Node.js and Bun installed (as per project guidelines)
- Access to a GitHub token or gh authentication for token retrieval
Installation steps:
- Clone the repository:
  git clone https://github.com/dotneet/aica.git
  cd aica
- Install dependencies and build the binary:
  bun install
  bun run build
- Copy the built binary to a directory on your PATH (example integration):
  cp ./dist/aica /path/to/your/bin/directory
- Set the required environment variables (example):
  export GITHUB_TOKEN=your_github_token
  export AICA_LLM_PROVIDER=openai
  export OPENAI_API_KEY=your_openai_api_key
  export OPENAI_MODEL=o3-mini
- Start the MCP server (as configured in mcp_config or via your runtime script):
  node dist/server.js
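Once built, an MCP client can register the server through a stdio entry. A hypothetical mcp.json fragment in the common `mcpServers` shape used by MCP clients might look like the following (paths and env values are placeholders; adjust them to your setup):

```json
{
  "mcpServers": {
    "aica": {
      "command": "node",
      "args": ["dist/server.js"],
      "env": {
        "GITHUB_TOKEN": "your_github_token",
        "AICA_LLM_PROVIDER": "openai",
        "OPENAI_API_KEY": "your_openai_api_key"
      }
    }
  }
}
```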
Additional notes
- The MCP server supports both stdio and SSE transports. If you use SSE, ensure your consumer can establish a streaming connection to the provided SSE URL.
- Use aica.toml to configure the MCP setup (for example, the mcp.json entry) and adjust language, prompts, and providers as needed.
- When using agent prompts, be mindful of potential filesystem effects; test prompts in a safe environment.
- For GitHub integration, ensure your token has appropriate permissions for repository access and PR creation.
- If you encounter token retrieval issues, the server can fall back to gh auth token retrieval if gh is installed and authenticated.
- You can customize prompts and rules through the aica.toml configuration and the .cursor/.clinerules context files used by MCP-based interactions.
Related MCP Servers
gemini-cli
An open-source AI agent that brings the power of Gemini directly into your terminal.
mcp-graphql
Model Context Protocol server for GraphQL
mcp-gemini
This project provides a dedicated MCP (Model Context Protocol) server that wraps the @google/genai SDK. It exposes Google's Gemini model capabilities as standard MCP tools, allowing other LLMs (like Cline) or MCP-compatible systems to leverage Gemini's features as a backend workhorse.
filesystem
A Model Context Protocol (MCP) server for platform-agnostic file capabilities, including advanced search/replace and directory tree traversal
mcp
A fetch API based TypeScript SDK for MCP
xgmem
A global memory MCP server that manages data across all projects.