
Discover local MCP Servers. Specification.

Version: 1.0.1

Goals

This specification defines a standard way to list and configure MCP Servers on a local machine.

Many tools already use MCP Servers to augment LLMs, yet there is currently no standard approach to quickly register an MCP Server.

Here are the most popular MCP Clients:

  • IntelliJ IDEA
  • Anthropic Claude
  • OpenAI ChatGPT
  • Cursor
  • Windsurf
  • Warp
  • PR to add yours

Supported in the following tools

  • PR if you support the spec

Specification

Create one file for each MCP server in the ~/.mcp folder (under %USER_HOME% on Windows).

The file is a Markdown file that explains to an LLM (such as Claude or ChatGPT) the details of your MCP Server. The client uses an LLM to transform your explanation into an actionable MCP Server configuration.

The client is required to regularly refresh the information from disk to discover new MCP Servers.

This protocol does not address the security implications of running MCP servers; that remains the responsibility of MCP Clients.

For example:

~/.mcp/my-mcp-server-tool-id.md

Create the following text inside:

---
version: 1.0.1
---

# MCP Server: Production

Here, I describe what my MCP Server does and why an LLM would decide to include my server when handling a specific request.

## Basic Information
- **Name**: Production Jonnyzzz MCP Server
- **ID**: prod-mcp-01
- **Version**: 3.2.1
- **URL**: https://mcp-prod.jonnyzzz.com:8443
- **API Version**: v2

## Authentication
- **Type**: oauth2
- **Client ID**: client_123
- **Token Endpoint**: https://auth.example.com/token

## Capabilities
- compute
- storage
- networking

## Regions
### us-east
- us-east-1a
- us-east-1b

### eu-west
- eu-west-1a
- eu-west-1b

## Health Check
- **Endpoint**: /health
- **Interval**: 60 seconds

## Metadata
- **Environment**: production
- **Owner**: platform-team
- **Priority**: high
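
A tool can register its server simply by writing such a file. Below is a minimal sketch in Python (not part of the specification; the file name and the shortened description are taken from the example above, and the helper name is hypothetical):

```python
from pathlib import Path

SERVER_ID = "my-mcp-server-tool-id"  # file name used in the example above

# A shortened version of the Markdown description shown above.
DESCRIPTION = """\
---
version: 1.0.1
---

# MCP Server: Production

Production Jonnyzzz MCP Server (ID: prod-mcp-01), reachable at
https://mcp-prod.jonnyzzz.com:8443, offering compute, storage,
and networking capabilities.
"""


def register_server() -> Path:
    """Write the Markdown description into ~/.mcp so that MCP clients can discover it."""
    mcp_dir = Path.home() / ".mcp"
    mcp_dir.mkdir(exist_ok=True)
    target = mcp_dir / f"{SERVER_ID}.md"
    target.write_text(DESCRIPTION, encoding="utf-8")
    return target


if __name__ == "__main__":
    print(f"Registered MCP server description at {register_server()}")
```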

How to use the Spec?

An MCP client uses an LLM (e.g., Claude, ChatGPT) to extract the necessary information from each MCP server's Markdown file discovered under the ~/.mcp folder. It is up to the LLM and the client to decide whether to use a specific MCP server, ask for credentials, and so on. The client should refresh the files from disk regularly.
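
A minimal sketch of this client-side loop, in Python, is shown below. It assumes a 60-second refresh interval (the spec only requires "regular" refreshes) and leaves the LLM call itself as a callback supplied by the client, since the spec does not prescribe a particular model or prompt:

```python
import time
from pathlib import Path

MCP_DIR = Path.home() / ".mcp"
REFRESH_INTERVAL_SECONDS = 60  # assumed value; the spec only asks for regular refreshes


def load_server_descriptions() -> dict[str, str]:
    """Read every *.md file under ~/.mcp and return {file name: Markdown text}."""
    if not MCP_DIR.is_dir():
        return {}
    return {
        path.name: path.read_text(encoding="utf-8")
        for path in sorted(MCP_DIR.glob("*.md"))
    }


def discovery_loop(handle_descriptions) -> None:
    """Periodically re-read ~/.mcp and hand changed descriptions to the client's
    own logic (e.g., an LLM call that turns each Markdown file into an
    actionable MCP server configuration)."""
    known: dict[str, str] = {}
    while True:
        current = load_server_descriptions()
        if current != known:  # a server file was added, changed, or removed
            known = current
            handle_descriptions(current)
        time.sleep(REFRESH_INTERVAL_SECONDS)
```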
