sparql-llm
🦜✨ Chat system, MCP server, and reusable components to improve LLM capabilities when generating SPARQL queries
claude mcp add --transport stdio sib-swiss-sparql-llm --env SETTINGS_FILEPATH="<path-to-your-sparql-llm-settings.json>" -- uvx sparql-llm
How to use
This MCP server exposes tools that help LLMs generate and execute SPARQL queries against SIB (Swiss Institute of Bioinformatics) life science endpoints. It provides three main capabilities:
- Retrieve relevant documents: returns documents and schema details extracted from endpoint metadata to aid in crafting queries.
- Retrieve relevant classes schema: helps identify the classes and concepts present in the SPARQL endpoints.
- Execute SPARQL query: runs a SPARQL query directly against a specified endpoint.
When connected via stdio (uvx), you interact with these tools by sending structured JSON requests to the server and receiving responses containing metadata, schemas, or query results. The server layers retrieval-augmented generation with validation, and is suitable for VSCode MCP workflows or any other client integration that supports the MCP stdio transport.
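For illustration, the "execute a SPARQL query" capability boils down to an HTTP request against the endpoint's SPARQL interface. A minimal sketch in plain Python (this is an assumption about the mechanics, not the server's actual implementation):

```python
import json
import urllib.parse
import urllib.request

def build_sparql_request(endpoint_url: str, query: str) -> urllib.request.Request:
    """Build a GET request asking a SPARQL endpoint for JSON results."""
    return urllib.request.Request(
        endpoint_url + "?" + urllib.parse.urlencode({"query": query}),
        headers={"Accept": "application/sparql-results+json"},
    )

def execute_sparql(endpoint_url: str, query: str) -> dict:
    """Send the query and parse the SPARQL 1.1 Query Results JSON format."""
    req = build_sparql_request(endpoint_url, query)
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

# Example usage (requires network access):
# res = execute_sparql(
#     "https://sparql.uniprot.org/sparql/",
#     "SELECT ?class WHERE { ?s a ?class } LIMIT 5",
# )
# print([b["class"]["value"] for b in res["results"]["bindings"]])
```

The `Accept: application/sparql-results+json` header asks the endpoint for the standard SPARQL JSON results format, which is what a client would typically parse.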
How to install
Prerequisites:
- Python >= 3.10
- pip (Python package manager) or uv (uvx) tooling
Install the SPARQL-LLM package and its dependencies:
Option A: Install and use via pip (Python environment)
pip install sparql-llm
Option B: Install and use via uv (MCP stdio transport)
uv add sparql-llm
(or run it without installing, via uvx sparql-llm, as in the command shown above)
If you plan to customize the MCP server, prepare a settings file (JSON) that defines your organization, topics, and endpoints, for example:
{
  "app_org": "Your organization",
  "app_topics": "genes, proteins, lipids, chemical reactions, and metabolomics data",
  "endpoints": [
    {
      "label": "UniProt",
      "endpoint_url": "https://sparql.uniprot.org/sparql/",
      "description": "UniProt is a comprehensive resource for protein sequence and annotation data."
    },
    {
      "label": "Bgee",
      "description": "Bgee is a database for retrieval and comparison of gene expression patterns across multiple animal species.",
      "endpoint_url": "https://www.bgee.org/sparql/",
      "homepage_url": "https://www.bgee.org/"
    }
  ]
}
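Before wiring the file into an MCP client, it can help to check that it parses and that each endpoint entry carries the fields used above. The helper below is hypothetical (not part of the sparql-llm package); the field names are taken from the example settings file:

```python
import json

# Fields every endpoint entry should carry (per the example settings above).
REQUIRED_ENDPOINT_FIELDS = {"label", "endpoint_url"}

def validate_settings(settings: dict) -> list[str]:
    """Return a list of problems found in a settings dict (empty if it looks OK)."""
    problems = []
    for key in ("app_org", "app_topics", "endpoints"):
        if key not in settings:
            problems.append(f"missing top-level key: {key}")
    for i, ep in enumerate(settings.get("endpoints", [])):
        missing = REQUIRED_ENDPOINT_FIELDS - ep.keys()
        if missing:
            problems.append(f"endpoint #{i} missing fields: {sorted(missing)}")
    return problems

# Example usage:
# with open("sparql-mcp.json") as f:
#     print(validate_settings(json.load(f)) or "settings look OK")
```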
Then point the MCP client to this settings file by setting the SETTINGS_FILEPATH environment variable to the path of your file in the mcp.json configuration, for example:
{
  "servers": {
    "expasy-mcp": {
      "type": "stdio",
      "command": "uvx",
      "env": {
        "SETTINGS_FILEPATH": "/Users/you/sparql-mcp.json"
      },
      "args": [
        "sparql-llm"
      ]
    }
  }
}
Additional notes
- The MCP server relies on endpoint metadata (VoID, SPARQL examples) to strengthen query generation and validation. Ensure your endpoints advertise rich metadata for best results.
- If you customize the settings, you can decouple endpoint selection from the deployed environment, enabling easy testing with different endpoint sets.
- When using the stdio transport, client and server exchange JSON-RPC 2.0 messages as defined by the Model Context Protocol. Refer to the VSCode MCP docs for exact request/response formats.
- If you encounter connectivity or metadata issues, verify that the endpoint URLs are reachable from the host and that any required authentication is properly provided in the endpoint configuration.
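As a reference point for the stdio interaction, an MCP tool invocation is a JSON-RPC 2.0 `tools/call` request. The sketch below shows the general shape of such a message; the tool name and argument keys are illustrative assumptions, so check the server's own tool listing (`tools/list`) for the exact names:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "execute_sparql_query",
    "arguments": {
      "endpoint_url": "https://sparql.uniprot.org/sparql/",
      "query": "SELECT ?class WHERE { ?s a ?class } LIMIT 5"
    }
  }
}
```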