# lex

UK legal API for AI agents and researchers.
To register Lex as an MCP server with Claude Code:

```shell
claude mcp add --transport stdio i-dot-ai-lex python -m lex.backend \
  --env DATABASE_URL="Postgres/DB connection string (if applicable)" \
  --env AZURE_OPENAI_KEY="Azure OpenAI API key" \
  --env AZURE_OPENAI_ENDPOINT="Azure OpenAI endpoint URL"
```
## How to use
Lex exposes a UK legal dataset API with semantic search and MCP integration, enabling AI assistants to ground their responses in authoritative UK legislation. The server provides endpoints for querying legislation, statutory instruments, amendments, and explanatory notes, with support for semantic search, filtering, and model grounding.

Through MCP, clients such as Claude, Cursor, Copilot, and VS Code integrations can connect their AI agents to Lex to fetch relevant legal documents and keep model outputs aligned with official UK sources. The live MCP documentation and examples show how to register Lex as a data source and how to query the API for legislation and related metadata.

To get started locally, run the API server, then use the documentation endpoints to explore the schema and available search capabilities. Semantic search filters, dataset-specific payloads, and indexing utilities let you tailor results to statutory acts, SI instruments, and explanatory notes.
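As a minimal sketch of querying the API from Python: the `/search` endpoint name and its `query`/`limit` parameters below are illustrative assumptions, not the documented schema — check http://localhost:8000/docs for the real endpoints and payloads.

```python
# Illustrative sketch only: composes a request against a hypothetical /search
# endpoint on a locally running Lex server. Endpoint name and parameters are
# assumptions -- consult http://localhost:8000/docs for the actual schema.
import urllib.parse
import urllib.request


def build_search_url(base_url: str, query: str, limit: int = 5) -> str:
    """Compose a search URL from a base URL, a query string, and a result limit."""
    params = urllib.parse.urlencode({"query": query, "limit": limit})
    return f"{base_url}/search?{params}"


url = build_search_url("http://localhost:8000", "data protection act", limit=3)
print(url)
# With the server running, fetch results with e.g.:
#   body = urllib.request.urlopen(url).read()
```

The helper only builds the URL, so you can adapt the path and parameter names once you have inspected the real schema.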
## How to install
Prerequisites:
- Python 3.12+
- uv (used to run the server; verify with `uv --version`)
- Docker and Docker Compose (optional; used for data loading and deployment)
- Azure OpenAI credentials (for the Azure OpenAI integration)
Installation steps:

1. Clone the repository and install dependencies:

   ```shell
   git clone https://github.com/i-dot-ai/lex.git && cd lex
   python -m venv venv
   source venv/bin/activate
   pip install -r requirements.txt
   ```

2. Set up environment variables:

   ```shell
   cp .env.example .env
   # then fill in keys such as AZURE_OPENAI_KEY and AZURE_OPENAI_ENDPOINT
   ```

3. Run the API locally (without Docker):

   ```shell
   uv run python -m lex.backend
   # or use the provided make run target if available
   ```

4. Load sample data (optional, for quick tests):

   ```shell
   docker compose up -d   # if you want to use Docker-based services
   make ingest-all-sample
   ```

5. Access the API documentation: visit http://localhost:8000/docs.
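Once the server is up, you can confirm the install programmatically. A sketch, assuming the default port 8000: since Lex serves interactive docs at `/docs`, the machine-readable OpenAPI schema is typically available at `/openapi.json` (the FastAPI convention), and listing its paths shows the available endpoints.

```python
# Sketch: list a running Lex server's endpoints from its OpenAPI schema.
# Assumes the default bind on port 8000; /openapi.json is the usual
# machine-readable companion to the /docs page.
import json
import urllib.request

BASE_URL = "http://localhost:8000"


def list_endpoints(schema: dict) -> list[str]:
    """Return the endpoint paths declared in an OpenAPI schema dict."""
    return sorted(schema.get("paths", {}))


try:
    with urllib.request.urlopen(f"{BASE_URL}/openapi.json", timeout=5) as resp:
        for path in list_endpoints(json.load(resp)):
            print(path)
except OSError:
    # Connection refused / timeout: the server is not running yet.
    print("Server not reachable -- start it first (see the steps above).")
```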
Notes:
- The project provides several make targets for data loading and development (e.g., `make ingest-legislation-sample`, `make ingest-all-sample`, `make install`, `make test`, `make run`).
- If you prefer Docker, use the docker-compose workflow described in the repository to start services and ingest sample data.
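For reference, a minimal `.env` might look like the following; all values are placeholders, and `DATABASE_URL` applies only if your setup uses a database. Copy `.env.example` rather than writing this by hand.

```
# .env -- placeholder values; copy from .env.example and fill in real keys
AZURE_OPENAI_KEY=<your Azure OpenAI API key>
AZURE_OPENAI_ENDPOINT=https://<your-resource>.openai.azure.com/
# Only if your setup requires a database:
DATABASE_URL=postgresql://user:password@localhost:5432/lex
```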
## Additional notes
Tips and common issues:
- Ensure Python 3.12+ and uv are installed; verify with `python --version` and `uv --version`.
- For MCP integration, consult the live documentation linked in the README to configure client connections for Claude, Cursor, Copilot, and VS Code.
- Before loading full datasets, run the quick sample ingestion commands to validate your setup.
- If the API fails to bind, check that port 8000 is free and not in use by another process.
- Environment variables: `AZURE_OPENAI_KEY` and `AZURE_OPENAI_ENDPOINT` must be set for Azure OpenAI usage; `DATABASE_URL` (if required) should point to your data store.
- Lex is experimental and not intended for production use; plan accordingly for development and testing scenarios.
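The environment-variable checks above can be run as a small preflight script before launching the server. This is a sketch: the required/optional split follows the notes above, so adjust the lists to your own deployment.

```python
# Preflight sketch: report required environment variables that are unset.
# Required/optional split follows the project notes: the Azure OpenAI keys
# are needed for Azure OpenAI usage; DATABASE_URL only if a database is used.
import os

REQUIRED = ["AZURE_OPENAI_KEY", "AZURE_OPENAI_ENDPOINT"]
OPTIONAL = ["DATABASE_URL"]


def missing_vars(env: dict, required=REQUIRED) -> list[str]:
    """Return the required variable names that are unset or empty."""
    return [name for name in required if not env.get(name)]


if __name__ == "__main__":
    gaps = missing_vars(dict(os.environ))
    if gaps:
        print("Missing environment variables:", ", ".join(gaps))
    else:
        print("Environment looks OK.")
```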