Mirror of https://github.com/shreyaskarnik/huggingface-mcp-server
🤗 Hugging Face MCP Server 🤗
A Model Context Protocol (MCP) server that provides read-only access to the Hugging Face Hub APIs. This server allows LLMs like Claude to interact with Hugging Face's models, datasets, spaces, papers, and collections.
Components
Resources
The server exposes popular Hugging Face resources (see the sketch after this list):
- Custom `hf://` URI scheme for accessing resources
- Models with `hf://model/{model_id}` URIs
- Datasets with `hf://dataset/{dataset_id}` URIs
- Spaces with `hf://space/{space_id}` URIs
- All resources have descriptive names and a JSON content type
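The server's internal resolution of these URIs isn't shown in this README, but each one corresponds to a standard Hub metadata lookup. The following minimal sketch (an illustration, not the server's actual code) uses the public `huggingface_hub` client to show the kind of data each URI surfaces; the repository IDs are just examples.
```python
# Illustrative only: the MCP server resolves hf:// URIs internally; this sketch
# shows the equivalent public Hub lookups with the huggingface_hub package.
from huggingface_hub import HfApi

api = HfApi()  # anonymous access is fine for public repositories

# hf://model/{model_id} -> model metadata (tags, downloads, likes, ...)
model = api.model_info("bert-base-uncased")
print(model.id, model.downloads, model.tags[:5])

# hf://dataset/{dataset_id} -> dataset metadata
dataset = api.dataset_info("imdb")
print(dataset.id, dataset.downloads)

# hf://space/{space_id} -> Space metadata (SDK, likes, ...)
space = api.space_info("stabilityai/stable-diffusion")
print(space.id, space.sdk)
```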
Prompts
The server provides two prompt templates (see the sketch after this list):
- `compare-models`: Generates a comparison between multiple Hugging Face models
  - Requires a `model_ids` argument (comma-separated model IDs)
  - Retrieves model details and formats them for comparison
- `summarize-paper`: Summarizes a research paper from Hugging Face
  - Requires an `arxiv_id` argument identifying the paper
  - Optional `detail_level` argument (brief/detailed) to control summary depth
  - Combines paper metadata with implementation details
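Below is a minimal, hypothetical sketch of requesting these prompts programmatically, assuming the official MCP Python SDK (the `mcp` package) is installed and the repository is checked out at the absolute path shown; the model IDs and arXiv ID are placeholders taken from the example prompts later in this README.
```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server over stdio, the same way Claude Desktop would.
    server = StdioServerParameters(
        command="uv",
        args=[
            "--directory", "/absolute/path/to/huggingface-mcp-server",
            "run", "huggingface_mcp_server.py",
        ],
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # compare-models expects comma-separated model IDs (placeholder IDs here).
            comparison = await session.get_prompt(
                "compare-models",
                {"model_ids": "meta-llama/Meta-Llama-3-8B,mistralai/Mistral-7B-v0.1"},
            )
            for message in comparison.messages:
                print(message.content)

            # summarize-paper takes an arXiv ID plus an optional detail level.
            summary = await session.get_prompt(
                "summarize-paper",
                {"arxiv_id": "2307.09288", "detail_level": "brief"},
            )
            for message in summary.messages:
                print(message.content)


asyncio.run(main())
```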
Tools
The server implements several tool categories (a call sketch follows the list):
- Model Tools
  - `search-models`: Search models with filters for query, author, tags, and limit
  - `get-model-info`: Get detailed information about a specific model
- Dataset Tools
  - `search-datasets`: Search datasets with filters
  - `get-dataset-info`: Get detailed information about a specific dataset
- Space Tools
  - `search-spaces`: Search Spaces with filters including SDK type
  - `get-space-info`: Get detailed information about a specific Space
- Paper Tools
  - `get-paper-info`: Get information about a paper and its implementations
  - `get-daily-papers`: Get the list of curated daily papers
- Collection Tools
  - `search-collections`: Search collections with various filters
  - `get-collection-info`: Get detailed information about a specific collection
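As with the prompts, the tools can be exercised with any MCP client. The sketch below again assumes the MCP Python SDK and the local checkout path from the configuration example; the argument keys passed to `call_tool` are assumptions based on the filter descriptions above, so list the tools first and rely on the schemas the server itself advertises.
```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    server = StdioServerParameters(
        command="uv",
        args=[
            "--directory", "/absolute/path/to/huggingface-mcp-server",
            "run", "huggingface_mcp_server.py",
        ],
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # The server is the source of truth for tool names and input schemas.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Argument keys here are assumptions based on the filters listed above;
            # check the advertised schema before relying on them.
            result = await session.call_tool(
                "search-models",
                {"query": "bert", "author": "google", "limit": 5},
            )
            for item in result.content:
                print(item)


asyncio.run(main())
```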
Configuration
The server does not require configuration, but it supports optional Hugging Face authentication (see the sketch after this list):
- Set the `HF_TOKEN` environment variable to your Hugging Face API token for:
  - Higher API rate limits
  - Access to private repositories (if authorized)
  - Improved reliability for high-volume requests
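The repository's code isn't reproduced here, but a server like this typically picks the token up from the environment and hands it to the Hub client; the sketch below is a hypothetical illustration of that pattern using `huggingface_hub`, not the server's actual implementation.
```python
import os

from huggingface_hub import HfApi

# Hypothetical sketch of the usual pattern: read HF_TOKEN from the environment
# and pass it to the Hub client; with no token, requests are anonymous.
token = os.environ.get("HF_TOKEN")
api = HfApi(token=token)

# Authenticated sessions get higher rate limits and can see private
# repositories the token is authorized for.
print(api.whoami()["name"] if token else "anonymous access")
```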
Quickstart
Install
Installing via Smithery
To install huggingface-mcp-server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @shreyaskarnik/huggingface-mcp-server --client claude
Claude Desktop
On MacOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
Development/Unpublished Servers Configuration
```
"mcpServers": {
  "huggingface": {
    "command": "uv",
    "args": [
      "--directory",
      "/absolute/path/to/huggingface-mcp-server",
      "run",
      "huggingface_mcp_server.py"
    ],
    "env": {
      "HF_TOKEN": "your_token_here" // Optional
    }
  }
}
```
Development
Building and Publishing
To prepare the package for distribution:
- Sync dependencies and update the lockfile: `uv sync`
- Build package distributions: `uv build`
  This will create source and wheel distributions in the `dist/` directory.
- Publish to PyPI: `uv publish`
Note: You'll need to set PyPI credentials via environment variables or command flags:
- Token: `--token` or `UV_PUBLISH_TOKEN`
- Or username/password: `--username`/`UV_PUBLISH_USERNAME` and `--password`/`UV_PUBLISH_PASSWORD`
Debugging
Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.
You can launch the MCP Inspector via npm with this command:
npx @modelcontextprotocol/inspector uv --directory /path/to/huggingface-mcp-server run huggingface_mcp_server.py
Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.
Example Prompts for Claude
When using this server with Claude, try these example prompts:
- "Search for BERT models on Hugging Face with less than 100 million parameters"
- "Find the most popular datasets for text classification on Hugging Face"
- "What are today's featured AI research papers on Hugging Face?"
- "Summarize the paper with arXiv ID 2307.09288 using the Hugging Face MCP server"
- "Compare the Llama-3-8B and Mistral-7B models from Hugging Face"
- "Show me the most popular Gradio spaces for image generation"
- "Find collections created by TheBloke that include Mixtral models"
Troubleshooting
If you encounter issues with the server:
- Check the server logs in Claude Desktop:
  - macOS: `~/Library/Logs/Claude/mcp-server-huggingface.log`
  - Windows: `%APPDATA%\Claude\logs\mcp-server-huggingface.log`
- For API rate-limiting errors, consider adding a Hugging Face API token
- Make sure your machine has internet connectivity to reach the Hugging Face API (a quick check is sketched below)
- If a particular tool is failing, try accessing the same data through the Hugging Face website to verify it exists
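If connectivity is the suspect, a quick check like the one below (using the public `huggingface_hub` client, independent of this server) confirms whether the Hub API is reachable from your machine:
```python
# Quick connectivity check against the public Hub API (not part of this server).
from huggingface_hub import HfApi

try:
    # Fetch a single public model; success means the Hub API is reachable.
    next(iter(HfApi().list_models(limit=1)))
    print("Hugging Face Hub API is reachable.")
except Exception as exc:  # DNS failures, proxies, offline mode, etc.
    print(f"Could not reach the Hugging Face Hub API: {exc}")
```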