
antrikshGPT

AI-powered space exploration webapp with real-time satellite tracking, astronomical data, and intelligent conversational interfaces for cosmic discovery

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio tarun7r-antrikshgpt python webapp/backend/main.py \
  --env SECRET_KEY="A secret key for JWT (generate with your preferred method)" \
  --env GOOGLE_API_KEY="Your Google AI API key for the LLM"

How to use

antrikshGPT is an open-source AI space exploration server that combines a language model with real-time space data from multiple APIs. The server supports both serverless and local deployments: a FastAPI backend handles API requests, and the frontend can connect over WebSockets for real-time updates. You can ask about ISS tracking, SpaceX missions, space weather, astronaut status, planetary data, and more through natural-language prompts. Under the hood, the backend uses an LLM (via LangChain) together with a collection of space data providers, with caching, rate limiting, and error handling to keep responses fast. To run locally, start the Python backend and connect through the frontend or the API endpoints it exposes.
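As a rough illustration of talking to the backend programmatically, the sketch below builds an HTTP query request with the standard library. The route path (/api/query) and the payload shape are assumptions, not the documented interface; check webapp/backend/main.py for the real endpoints.

```python
# Hypothetical client sketch: the endpoint path and JSON payload shape
# below are assumptions -- inspect webapp/backend/main.py for the real ones.
import json
import urllib.request

BASE_URL = "http://localhost:8000"

def build_query(question):
    """Build a POST request carrying a natural-language space query."""
    payload = json.dumps({"query": question}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/api/query",  # assumed route
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it requires the backend running locally, e.g.:
# with urllib.request.urlopen(build_query("Where is the ISS right now?")) as resp:
#     print(json.loads(resp.read()))
```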

Usage highlights:

  • Endpoints and logic live under webapp/backend (main.py) and api (serverless entry). You can send queries that the system will answer using the integrated space APIs and the LLM.
  • Real-time updates are available through WebSockets for live data streams (e.g., ISS location, mission updates).
  • The project is designed for Vercel serverless deployment but also runs locally with a standard Python environment.
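For the WebSocket streams mentioned above, a client would typically parse incoming JSON frames as they arrive. The ws:// path and the message schema in this sketch are assumptions for illustration only; the parsing helper is hypothetical and the commented connection snippet uses the third-party websockets package.

```python
# Sketch of consuming a live ISS-position stream over WebSockets.
# The frame schema assumed here ({"latitude": ..., "longitude": ...})
# is an illustration, not the documented protocol.
import json

def parse_iss_update(raw):
    """Extract (latitude, longitude) from an assumed JSON frame."""
    msg = json.loads(raw)
    return float(msg["latitude"]), float(msg["longitude"])

# With the third-party `websockets` package and the server running locally:
# import asyncio, websockets
# async def follow():
#     async with websockets.connect("ws://localhost:8000/ws") as ws:  # assumed path
#         async for frame in ws:
#             print(parse_iss_update(frame))
# asyncio.run(follow())
```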

How to install

Prerequisites:

  • Python 3.8 or newer
  • pip (Python package manager)
  • Git
  • Optional: Vercel CLI for deployment (if you plan to deploy serverless)

Installation steps:

  1. Clone the repository:

     git clone https://github.com/tarun7r/antrikshGPT.git
     cd antrikshGPT

  2. Create and activate a virtual environment (recommended):

     python3 -m venv venv
     source venv/bin/activate    # macOS/Linux
     .\venv\Scripts\activate     # Windows

  3. Install dependencies: pip install -r requirements.txt

  4. Set up environment configuration:

    • Copy the sample env file and populate keys: cp env.sample .env

      Edit .env to add GOOGLE_API_KEY and SECRET_KEY.

  5. Run the backend server locally: python webapp/backend/main.py

    The server will start and typically listen on http://localhost:8000

  6. Optional: If you plan to deploy to Vercel, install the Vercel CLI and follow their deployment steps:

     npm i -g vercel
     vercel login
     vercel

  7. Verify by visiting the frontend or API endpoints and trying sample queries such as:

    • Where is the ISS right now?
    • What is the next SpaceX launch?
    • What is the weather like on Mars?
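Step 4 above asks for a SECRET_KEY for JWT signing; any sufficiently random string works. A minimal way to generate one with Python's standard library:

```python
# Generate a random hex string suitable for use as a JWT signing key.
import secrets

secret_key = secrets.token_hex(32)  # 32 random bytes -> 64 hex characters
print(secret_key)
```

Paste the printed value into .env as the SECRET_KEY entry.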

Additional notes


  • Ensure you provide a valid GOOGLE_API_KEY for the LLM integrations and a SECRET_KEY for JWT-based authentication.
  • The backend is designed to work with a modular set of space data providers; if a provider is temporarily unavailable, the system will degrade gracefully and still return useful results.
  • For local development, you can edit webapp/backend/main.py and adjacent modules under api/ and shared/ to customize behavior.
  • If you encounter port or CORS issues when integrating the frontend, ensure the backend is accessible at the expected host/port and that WebSocket connections are allowed by your browser.
  • On first launch, the logs and console output include default admin credentials for user management stored in users.json; change or secure these before any production use.
