Common_Chronicle
Common Chronicle turns messy context into structured, sourced timelines.
claude mcp add --transport stdio intelligent-internet-common_chronicle python -m common_chronicle \
  --env GOOGLE_API_KEY="your-google-api-key (optional)" \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env SOURCE_LANGUAGES="comma-separated list of languages to search (optional)"
How to use
Common Chronicle is an AI-assisted timeline builder that aggregates events from diverse sources, normalizes dates, and links each item back to its original source. It focuses on an opinion-driven workflow: you start with a question or hypothesis, the AI retrieves and ranks relevant events, and it automatically assembles a structured, research-ready timeline with citations. Use the built timeline to explore historical narratives, compare perspectives across sources, and extract evidence for arguments or storytelling. The system supports multi-language data gathering, relevance-focused filtering, and export options so you can share or further analyze results.
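The project's actual output schema is not documented here, but a sourced timeline entry can be pictured as a small record pairing a normalized date with its citation. The field names below are illustrative assumptions, not Common Chronicle's real data model:

```python
# Illustrative sketch only: these field names are assumptions,
# not Common Chronicle's actual schema.
from dataclasses import dataclass

@dataclass
class TimelineEvent:
    date: str          # normalized date, e.g. ISO-8601 "1969-07-20"
    description: str   # what happened
    source_url: str    # link back to the original source
    relevance: float   # score used for relevance-focused filtering

event = TimelineEvent(
    date="1969-07-20",
    description="Apollo 11 lands on the Moon.",
    source_url="https://en.wikipedia.org/wiki/Apollo_11",
    relevance=0.97,
)
```

A timeline is then simply a date-sorted list of such records, each traceable to its source.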
How to install
Prerequisites:
- Python 3.12 or newer
- Git
- Optional: a virtual environment tool (venv, conda, etc.)
- Clone the repository:
git clone https://github.com/Intelligent-Internet/Common_Chronicle.git
cd Common_Chronicle
- Create and activate a virtual environment:
python -m venv venv
# On Windows
venv\Scripts\activate
# On macOS/Linux
source venv/bin/activate
- Install Python dependencies:
pip install -r requirements.txt
- Install or prepare any optional AI-provider tools you plan to use (OpenAI, Google Gemini, etc.). Set environment variables as described under Additional notes below.
- Run the MCP server:
python -m common_chronicle
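Putting the environment variables and the launch command together, a typical local run looks like the following (the key values and language codes are placeholders, not real credentials):

```shell
# Placeholders only; substitute your real keys before running.
export OPENAI_API_KEY="your-openai-api-key"    # required for OpenAI-compatible providers
export GOOGLE_API_KEEY="your-google-api-key"   # optional, Google Gemini
export SOURCE_LANGUAGES="en,zh"                # optional, comma-separated
python -m common_chronicle
```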
If you prefer a Docker setup, see the environment notes and adapt accordingly to run the image with your API keys and configuration.
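Since the repository does not pin down an image name here, the following Docker invocation is only a sketch: the `common-chronicle` tag is an assumption, and it presumes a Dockerfile exists at the repository root.

```shell
# Sketch only: the "common-chronicle" image name is an assumption.
docker build -t common-chronicle .
docker run --rm -i \
  -e OPENAI_API_KEY="your-openai-api-key" \
  -e SOURCE_LANGUAGES="en" \
  common-chronicle
```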
Additional notes
Environment variables:
- OPENAI_API_KEY: required if using OpenAI-compatible providers
- GOOGLE_API_KEY: optional for Google Gemini integration
- SOURCE_LANGUAGES: comma-separated list to broaden source coverage
Configuration tips:
- Ensure your API keys are kept secure and not committed to version control
- If using multiple providers, confirm rate limits and cost settings align with your usage
- For large datasets, consider enabling incremental indexing and chunked exports to manage memory usage
Common issues:
- Import or module not found errors: ensure you activated the virtual environment and installed dependencies
- API key authentication failures: verify keys are valid and have necessary permissions
- Date normalization inconsistencies: adjust locale or date parsing rules via config if available