
mcp-server-collector MCP server

An MCP server used to collect MCP servers over the internet.

Components

Resources

No resources yet.

Prompts

No prompts yet.

Tools

The server implements three tools (an example invocation follows the list):

  • extract-mcp-servers-from-url: Extracts MCP servers from a given URL.
    • Takes "url" as a required string argument
  • extract-mcp-servers-from-content: Extracts MCP servers from the given content.
    • Takes "content" as a required string argument
  • submit-mcp-server: Submits an MCP server to an MCP server directory such as mcp.so.
    • Takes "url" as a required string argument and "avatar_url" as an optional string argument

Configuration

A .env file is required, containing the following variables:

OPENAI_API_KEY="sk-xxx"
OPENAI_BASE_URL="https://api.openai.com/v1"
OPENAI_MODEL="gpt-4o-mini"

MCP_SERVER_SUBMIT_URL="https://mcp.so/api/submit-project"
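
To sanity-check that the file is in place before wiring the server into a client, a small script like the one below can help. This is only an illustrative check that assumes the python-dotenv package; it is not necessarily how mcp-server-collector itself loads its configuration.

```
# Illustrative .env check; assumes `pip install python-dotenv`.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

required = [
    "OPENAI_API_KEY",
    "OPENAI_BASE_URL",
    "OPENAI_MODEL",
    "MCP_SERVER_SUBMIT_URL",
]
missing = [name for name in required if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing variables in .env: {', '.join(missing)}")
print("All required variables are set.")
```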

Quickstart

Install

Claude Desktop

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Development/Unpublished Servers Configuration:

```
"mcpServers": {
  "fetch": {
    "command": "uvx",
    "args": ["mcp-server-fetch"]
  },
  "mcp-server-collector": {
    "command": "uv",
    "args": [
      "--directory",
      "path-to/mcp-server-collector",
      "run",
      "mcp-server-collector"
    ],
    "env": {
      "OPENAI_API_KEY": "sk-xxx",
      "OPENAI_BASE_URL": "https://api.openai.com/v1",
      "OPENAI_MODEL": "gpt-4o-mini",
      "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
    }
  }
}
```

Published Servers Configuration:

```
"mcpServers": {
  "fetch": {
    "command": "uvx",
    "args": ["mcp-server-fetch"]
  },
  "mcp-server-collector": {
    "command": "uvx",
    "args": [
      "mcp-server-collector"
    ],
    "env": {
      "OPENAI_API_KEY": "sk-xxx",
      "OPENAI_BASE_URL": "https://api.openai.com/v1",
      "OPENAI_MODEL": "gpt-4o-mini",
      "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
    }
  }
}
```

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:
uv sync
  2. Build package distributions:
uv build

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:
uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory path-to/mcp-server-collector run mcp-server-collector

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.


Reviews

user_GISaQvaS (2025-04-17)

As a dedicated mcp user, I am thoroughly impressed with mcp-server-collector by chatmcp. Its efficient data collection capabilities and seamless integration into existing systems are game-changers. The GitHub repository provides comprehensive documentation, making setup a breeze. Highly recommend this tool for anyone looking to enhance their server data management! Check it out at https://github.com/chatmcp/mcp-server-collector.