huggingface
MCP server from shreyaskarnik/huggingface-mcp-server
claude mcp add --transport stdio --env HF_TOKEN="your_token_here" shreyaskarnik-huggingface-mcp-server -- uv --directory /absolute/path/to/huggingface-mcp-server run huggingface_mcp_server.py
How to use
The Hugging Face MCP Server provides read-only access to Hugging Face Hub APIs through the MCP protocol. It exposes resources such as models, datasets, spaces, papers, and collections via the hf:// URI scheme, enabling LLMs to discover and reference Hugging Face content in a structured, language-agnostic way. It also includes prompt templates such as compare-models and summarize-paper to help compose structured queries and summarize research content. Tools are organized into five categories (Model Tools, Dataset Tools, Space Tools, Paper Tools, and Collection Tools), letting you search, inspect, and retrieve detailed information about the corresponding Hugging Face content when communicating with an MCP-enabled agent or workflow.
You can interact with the server using MCP-compatible clients or inspector tooling. For example, Model Tools let you search models with filters (query, author, tags, limit) and fetch detailed model info; Dataset Tools provide dataset search and detailed dataset info; Space Tools enable searching and retrieving Space details; Paper Tools expose paper info and a list of daily papers; and Collection Tools let you search and inspect curated collections. The server can also render human-friendly names and JSON content for each resource, making it easier to incorporate Hugging Face data into multi-step reasoning or tooling workflows.
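For clients that are configured through a JSON file (for example Claude Desktop's claude_desktop_config.json), the server can be registered with an entry along these lines; the server name, directory path, and token value are placeholders you should adapt:

```json
{
  "mcpServers": {
    "huggingface": {
      "command": "uv",
      "args": [
        "--directory",
        "/absolute/path/to/huggingface-mcp-server",
        "run",
        "huggingface_mcp_server.py"
      ],
      "env": {
        "HF_TOKEN": "your_token_here"
      }
    }
  }
}
```

The client launches the `command` with the given `args` over stdio and passes `env` to the server process, so HF_TOKEN does not need to be exported globally.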
How to install
Prerequisites:
- Python 3.8+ and pip
- A working MCP environment or MCP-compatible runner
- Optional: HF_TOKEN if you need authenticated access to private resources or higher rate limits
Step 1: Install the uv tooling (if you do not already have it)
# uv manages the project's Python environment and resolves its dependencies
pip install uv  # or use the standalone installer from https://astral.sh/uv
Step 2: Prepare the server directory
- Clone or download the huggingface-mcp-server repository to a directory, e.g. /absolute/path/to/huggingface-mcp-server
- Ensure huggingface_mcp_server.py is present in that directory
Step 3: Run the server with UV
uv --directory /absolute/path/to/huggingface-mcp-server sync
uv --directory /absolute/path/to/huggingface-mcp-server run huggingface_mcp_server.py
Step 4: Configure environment (optional)
- HF_TOKEN: set your Hugging Face API token to increase rate limits or access private repos
Step 5: Verify operation
- Use MCP Inspector or your MCP client to query resources such as hf://model/{model_id}, hf://dataset/{dataset_id}, or hf://space/{space_id} to confirm responses.
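At the protocol level, reading one of these resources is a single JSON-RPC `resources/read` request. Below is a minimal sketch of the message an MCP client sends; the model ID is an arbitrary public example, not one taken from this server's documentation:

```python
import json

def read_resource_request(uri: str, request_id: int = 1) -> dict:
    """Build an MCP resources/read JSON-RPC 2.0 request for the given URI."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "resources/read",
        "params": {"uri": uri},
    }

# Request a model resource via the hf:// URI scheme
msg = read_resource_request("hf://model/bert-base-uncased")
print(json.dumps(msg, indent=2))
```

The server answers with the resource contents (human-friendly name plus JSON payload), which the client surfaces to the LLM.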
Additional notes
- The server is read-only by design, focusing on discovery and retrieval of public Hugging Face content; authenticated access via HF_TOKEN enables higher rate limits and access to private items if you are authorized.
- If you encounter authentication errors, ensure HF_TOKEN is correctly exported in your environment where the MCP server runs.
- The server uses the hf:// URI scheme for resource addressing; test queries against common IDs to validate integration.
- If you deploy locally, consider network latency to Hugging Face APIs and adjust timeouts in your MCP client accordingly.
- Development and debugging can be aided by MCP Inspector to visualize request/response flows and protocol messages.
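Under the hood, search tools like the Model Tools map onto the public Hugging Face Hub REST API (`GET /api/models`), which accepts `search`, `author`, and `limit` query parameters. The sketch below shows how such a request URL can be assembled; the exact parameter mapping used internally by this server is an assumption based on the tool descriptions above:

```python
from urllib.parse import urlencode

HF_API_BASE = "https://huggingface.co/api"

def model_search_url(query=None, author=None, limit=20):
    """Build a Hub model-search URL from the filter values a Model Tool accepts."""
    params = {"search": query, "author": author, "limit": limit}
    # Drop unset filters so they are not sent as empty query parameters
    params = {k: v for k, v in params.items() if v is not None}
    return f"{HF_API_BASE}/models?{urlencode(params)}"

print(model_search_url(query="bert", author="google", limit=5))
# → https://huggingface.co/api/models?search=bert&author=google&limit=5
```

When HF_TOKEN is set, requests to this API are typically sent with an `Authorization: Bearer <token>` header, which is what raises rate limits and unlocks private repositories you are authorized to read.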
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP