https://github.com/chatmcp/mcp-server-collector

mcp-server-collector MCP server

An MCP server used to collect MCP servers from across the internet.

Components

Resources

No resources yet.

Prompts

No prompts yet.

Tools

The server implements 3 tools:

  • extract-mcp-servers-from-url: Extracts MCP servers from a given URL.
    • Takes "url" as a required string argument
  • extract-mcp-servers-from-content: Extracts MCP servers from given content.
    • Takes "content" as a required string argument
  • submit-mcp-server: Submits an MCP server to an MCP server directory such as mcp.so.
    • Takes "url" as a required string argument and "avatar_url" as an optional string argument
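
Once the server is running, an MCP client invokes these tools by name with JSON arguments. The snippet below is a minimal sketch, assuming the official MCP Python SDK ("mcp" package) and the published mcp-server-collector package; the target URL and the "sk-xxx" key are placeholders, not values from this repository.

```
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the published server over stdio via uvx, passing the same
# environment variables described in the Configuration section below.
server_params = StdioServerParameters(
    command="uvx",
    args=["mcp-server-collector"],
    env={
        "OPENAI_API_KEY": "sk-xxx",
        "OPENAI_BASE_URL": "https://api.openai.com/v1",
        "OPENAI_MODEL": "gpt-4o-mini",
        "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project",
    },
)

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Ask the server to extract MCP servers from a page that lists them.
            result = await session.call_tool(
                "extract-mcp-servers-from-url",
                arguments={"url": "https://example.com/awesome-mcp-servers"},
            )
            print(result.content)

asyncio.run(main())
```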

Configuration

A .env file is required and must define the following variables:

OPENAI_API_KEY="sk-xxx"
OPENAI_BASE_URL="https://api.openai.com/v1"
OPENAI_MODEL="gpt-4o-mini"

MCP_SERVER_SUBMIT_URL="https://mcp.so/api/submit-project"
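
The server presumably reads these values from the environment at startup. The snippet below is a minimal sketch assuming the python-dotenv package; it is not the repository's actual loading code.

```
import os

from dotenv import load_dotenv  # provided by the python-dotenv package

load_dotenv()  # copy variables from the local .env file into the environment

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
OPENAI_BASE_URL = os.getenv("OPENAI_BASE_URL", "https://api.openai.com/v1")
OPENAI_MODEL = os.getenv("OPENAI_MODEL", "gpt-4o-mini")
MCP_SERVER_SUBMIT_URL = os.getenv("MCP_SERVER_SUBMIT_URL")

if not OPENAI_API_KEY:
    raise RuntimeError("OPENAI_API_KEY is not set; check your .env file")
```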

Quickstart

Install

Claude Desktop

On MacOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Development/Unpublished Servers Configuration

```
"mcpServers": {
  "fetch": {
    "command": "uvx",
    "args": ["mcp-server-fetch"]
  },
  "mcp-server-collector": {
    "command": "uv",
    "args": [
      "--directory",
      "path-to/mcp-server-collector",
      "run",
      "mcp-server-collector"
    ],
    "env": {
      "OPENAI_API_KEY": "sk-xxx",
      "OPENAI_BASE_URL": "https://api.openai.com/v1",
      "OPENAI_MODEL": "gpt-4o-mini",
      "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
    }
  }
}
```
Published Servers Configuration

```
"mcpServers": {
  "fetch": {
    "command": "uvx",
    "args": ["mcp-server-fetch"]
  },
  "mcp-server-collector": {
    "command": "uvx",
    "args": [
      "mcp-server-collector"
    ],
    "env": {
      "OPENAI_API_KEY": "sk-xxx",
      "OPENAI_BASE_URL": "https://api.openai.com/v1",
      "OPENAI_MODEL": "gpt-4o-mini",
      "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
    }
  }
}
```
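
You can add this entry to claude_desktop_config.json by hand, or merge it with a small script. The sketch below is an optional convenience (not part of this repository) that patches the macOS config file using Python's standard library; adjust the path for Windows.

```
import json
from pathlib import Path

# macOS location; on Windows use %APPDATA%/Claude/claude_desktop_config.json instead.
config_path = Path.home() / "Library" / "Application Support" / "Claude" / "claude_desktop_config.json"

# Load the existing config (or start from an empty one) and merge the server entry.
config = json.loads(config_path.read_text()) if config_path.exists() else {}
config.setdefault("mcpServers", {})["mcp-server-collector"] = {
    "command": "uvx",
    "args": ["mcp-server-collector"],
    "env": {
        "OPENAI_API_KEY": "sk-xxx",
        "OPENAI_BASE_URL": "https://api.openai.com/v1",
        "OPENAI_MODEL": "gpt-4o-mini",
        "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project",
    },
}
config_path.write_text(json.dumps(config, indent=2))
```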

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:
uv sync
  2. Build package distributions:
uv build

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:
uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory path-to/mcp-server-collector run mcp-server-collector

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.

Community


Related recommendations

  • https://suefel.com
  • Latest advice and best practices for custom GPT development.

  • NiKole Maxwell
  • I craft unique cereal names, stories, and ridiculously cute Cereal Baby images.

  • Yusuf Emre Yeşilyurt
  • I find academic articles and books for research and literature reviews.

  • https://maiplestudio.com
  • Find Exhibitors, Speakers and more

  • Carlos Ferrin
  • Encuentra películas y series en plataformas de streaming.

  • Bora Yalcin
  • Evaluator for marketplace product descriptions, checks for relevancy and keyword stuffing.

  • Andris Teikmanis
  • Latvian GPT assistant for developing GPT applications

  • Joshua Armstrong
  • Confidential guide on numerology and astrology, based on GG33 public information

  • https://cantaspinar.com
  • Summarizes videos and answers related questions.

  • Elijah Ng Shi Yi
  • Advanced software engineer GPT that excels through nailing the basics.

  • Contraband Interactive
  • Emulating Dr. Jordan B. Peterson's style in providing life advice and insights.

  • rustassistant.com
  • Your go-to expert in the Rust ecosystem, specializing in precise code interpretation, up-to-date crate version checking, and in-depth source code analysis. I offer accurate, context-aware insights for all your Rust programming questions.

  • apappascs
  • Discover the most comprehensive and up-to-date collection of MCP servers on the market. This repository serves as a centralized hub, offering an extensive catalog of open-source and proprietary MCP servers, complete with features, documentation links, and contributors.

  • Mintplex-Labs
  • All-in-one desktop and Docker AI application with built-in RAG, AI agents, a no-code agent builder, MCP compatibility, and more.

  • modelcontextprotocol
  • Model Context Protocol servers

  • n8n-io
  • Fair-code workflow automation platform with native AI capabilities. Combines visual building with custom code, self-hosted or cloud, 400+ integrations.

  • ShrimpingIt
  • MicroPython I2C-based operation of the MCP series GPIO expanders, derived from ADAFRUIT_MCP230XX

  • WangRongsheng
  • 🧑‍🚀 A summary of LLM resources (data processing, model training, model deployment, o1 models, MCP, small language models, vision-language models) | A summary of the world's best LLM resources.

  • OffchainLabs
  • Go implementation of Ethereum

  • metorial
  • Containerized versions of hundreds of MCP servers 📡
