
lex

UK legal API for AI agents and researchers.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio i-dot-ai-lex python -m lex.backend \
  --env DATABASE_URL="Postgres/DB connection string (if applicable)" \
  --env AZURE_OPENAI_KEY="Azure OpenAI API key" \
  --env AZURE_OPENAI_ENDPOINT="Azure OpenAI endpoint URL"

How to use

Lex exposes a UK legal dataset API with semantic search and MCP integration, enabling AI assistants to ground their responses in authoritative UK legislation. The server provides endpoints for querying legislation, statutory instruments, amendments, and explanatory notes, with support for semantic search, filtering, and model grounding.

With MCP integration, clients can connect AI agents (e.g., Claude, Cursor, Copilot, or VS Code integrations) to Lex to fetch relevant legal documents and keep model outputs aligned with official UK sources. The live MCP documentation and examples show how to register Lex as a data source and how to query the API for legislation and related metadata.

To get started locally, run the API server, then use the documentation endpoints to explore the schema and available search capabilities. Semantic search filters, dataset-specific payloads, and indexing utilities let you tailor results to statutory acts, SI instruments, and explanatory notes.

How to install

Prerequisites:

  • Python 3.12+ installed
  • Docker and Docker Compose (optional for data loading and deployment)
  • Azure OpenAI credentials (for OpenAI integration)

Installation steps:

  1. Clone the repository and install dependencies:

     git clone https://github.com/i-dot-ai/lex.git && cd lex
     python -m venv venv
     source venv/bin/activate
     pip install -r requirements.txt

  2. Set up environment variables:

     cp .env.example .env
     # then fill in keys such as AZURE_OPENAI_KEY and AZURE_OPENAI_ENDPOINT

  3. Run the API locally (without Docker):

     uv run python -m lex.backend
     # or use the provided make run target if available

  4. Load sample data (optional, for quick tests):

     docker compose up -d   # if you want to use Docker-based services
     make ingest-all-sample

  5. Access the API documentation: visit http://localhost:8000/docs
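Before starting the server, it can help to confirm the required environment variables from step 2 are actually set. A minimal sketch using only the standard library (the variable names come from the steps above; whether DATABASE_URL is also needed depends on your setup):

```python
import os

# Variables the steps above ask you to set in .env.
REQUIRED = ["AZURE_OPENAI_KEY", "AZURE_OPENAI_ENDPOINT"]

def missing_vars(env: dict) -> list:
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]

missing = missing_vars(dict(os.environ))
if missing:
    print("Set these in .env before running Lex:", ", ".join(missing))
else:
    print("All required environment variables are set.")
```

Running this before `uv run python -m lex.backend` turns a vague startup failure into an explicit list of what is missing.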

Notes:

  • The project provides multiple make targets for data loading and development (e.g., make ingest-legislation-sample, make ingest-all-sample, make install, make test, make run).
  • If you prefer Docker, use the docker-compose workflow described in the repository to start services and ingest sample data.

Additional notes

Tips and common issues:

  • Ensure Python 3.12+ and uv are installed; verify with python --version and uv --version.
  • For MCP integration, consult the live documentation linked in the README to configure client connections for Claude, Cursor, Copilot, and VS Code integrations.
  • When loading data, you can run quick sample ingestion commands to validate the setup before loading full datasets.
  • If you encounter API binding issues, verify that port 8000 is available and not used by another process.
  • Environment variables: AZURE_OPENAI_KEY and AZURE_OPENAI_ENDPOINT must be set for Azure OpenAI usage; DATABASE_URL (if required) should point to your data store.
  • The project emphasizes that Lex is experimental and not intended for production use; plan accordingly for development and testing scenarios.
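The port-binding tip above can be checked programmatically before launching the server. A small standard-library sketch; the host and port are the defaults assumed throughout this page:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # connect_ex returns 0 when the connection succeeds,
        # i.e. when another process is already bound to the port.
        return s.connect_ex((host, port)) == 0

if port_in_use(8000):
    print("Port 8000 is taken; stop the other process or run Lex on another port.")
else:
    print("Port 8000 is free.")
```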
