repo-stargazer
Talk to your starred GitHub repositories
claude mcp add --transport stdio ksachdeva-repo-stargazer \
  uvx --from repo-stargazer rsg run-adk-server --config rsg-config.toml \
  --env RSG_DATA_HOME="path to data directory, e.g., $HOME/.local/share/rsg"
How to use
Repo Stargazer provides local semantic search and AI-assisted exploration of your GitHub starred repositories. Run via the UV toolchain (uvx), it fetches your starred repos, downloads their READMEs, chunks the content, embeds it, and stores it in a local vector store. The system then exposes an agent-driven interface for searching, summarizing, and reasoning about your starred projects, using whichever language model and embedding providers you configure. Launching the ADK server gives you a UI and agent backend for interacting with your data through natural-language prompts and structured searches.
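The ingestion pipeline described above (fetch READMEs, chunk, embed, store) can be sketched roughly as follows. The function names, the fixed-size overlapping chunker, and the toy embedder are illustrative assumptions, not Repo Stargazer's actual implementation:

```python
# Illustrative sketch of the ingestion pipeline: chunk a README into
# overlapping windows, "embed" each chunk, and keep the results in a
# simple in-memory store. Sizes and the toy embedder are assumptions;
# a real setup would call the configured embedder (ollama, openai, ...).

def chunk_text(text: str, size: int = 400, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows."""
    chunks = []
    step = size - overlap
    for start in range(0, max(len(text), 1), step):
        chunk = text[start:start + size]
        if chunk:
            chunks.append(chunk)
    return chunks

def toy_embed(chunk: str) -> list[float]:
    """Stand-in embedder producing a tiny deterministic vector."""
    return [float(len(chunk)), float(sum(map(ord, chunk)) % 1000)]

def build_store(readmes: dict[str, str]) -> list[dict]:
    """Chunk and embed every README, tagging each chunk with its repo."""
    store = []
    for repo, readme in readmes.items():
        for chunk in chunk_text(readme):
            store.append({"repo": repo, "text": chunk, "vector": toy_embed(chunk)})
    return store

store = build_store({"example/repo": "A README " * 100})
print(len(store))  # 900 chars -> 3 overlapping chunks
```

At query time, the same embedder would be applied to the user's question and the nearest chunk vectors retrieved from the store.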
To get started, install UV, clone or download Repo Stargazer, create your configuration file, and start the ADK server. The main commands are build, which populates the local vector store, and run-adk-server, which serves the UI and agent backend. The configuration file (rsg-config.toml) controls how data is fetched (GitHub PAT), how embeddings are produced (embedder type), and how the agent is configured (LiteLLM parameters).
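A minimal rsg-config.toml might look roughly like the sketch below. The exact section and key names should be taken from rsg-config.example.toml; the ones here are placeholders based on the sections mentioned in this document:

```toml
# Illustrative config sketch -- consult rsg-config.example.toml for the
# authoritative section and key names.
[auth]
github_pat = "ghp_your_token_here"   # GitHub Personal Access Token

[embedder]
type = "ollama"                      # or: openai, azure_openai, ...
model = "your-embedding-model"

[agent.litellm_params]
model = "your-llm-model"             # LiteLLM model identifier
```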
How to install
Prerequisites:
- Python installed on your system
- The UV tool (uv/uvx) installed as described in the project's docs
- A GitHub Personal Access Token with access to your starred repositories
Installation steps:
- Install UV and set up the environment as per the UV documentation (https://docs.astral.sh/uv/getting-started/installation/).
- Clone the repository or download Repo Stargazer to a local directory.
- Create a configuration file named rsg-config.toml at the repository root (you can start from rsg-config.example.toml and adjust values).
- Obtain a GitHub PAT and add it to the rsg-config.toml under the [auth] section as required by the config.
- Build the database (this downloads your starred repos, fetches readmes, chunks content, and stores embeddings):
uvx --from repo-stargazer rsg build --config rsg-config.toml
- Run the ADK server to access the UI and agent interface:
uvx --from repo-stargazer rsg run-adk-server --config rsg-config.toml
Note: The data directory defaults to $HOME/.local/share/rsg on Linux/macOS. You can override it with the environment variable RSG_DATA_HOME.
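The override and its fallback can be expressed with standard shell parameter expansion; this snippet only demonstrates the documented default logic and is not part of the tool itself:

```shell
# Use RSG_DATA_HOME if it is set, otherwise fall back to the default.
RSG_DATA_HOME="${RSG_DATA_HOME:-$HOME/.local/share/rsg}"
echo "data dir: $RSG_DATA_HOME"

# To persist data elsewhere, export the override before running rsg:
# export RSG_DATA_HOME="$HOME/projects/rsg-data"
# uvx --from repo-stargazer rsg build --config rsg-config.toml
```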
Additional notes
Tips and notes:
- RSG_DATA_HOME controls where the vector store and downloaded data reside. Set it to a persistent location if you want to preserve data across runs.
- You must provide a valid GitHub PAT so the tool can list your starred repositories and fetch their READMEs; unauthenticated GitHub API requests are heavily rate-limited.
- The embedder you configure (ollama, openai, azure_openai, etc.) determines how embeddings are generated and which provider is used.
- The agent settings under [agent.litellm_params] in rsg-config.toml control how the LLM backend is invoked and how it interacts with the retrieved data.
- Some repositories may not have a README file; the build step handles missing content gracefully.
- The data stored locally is for your own use and is not uploaded anywhere unless you configure the tooling to do so.