
# mcp-local-spec

MCP Servers discovery spec
Discover local MCP Servers. Specification.
Version: 1.0.1
## Goals
This specification defines a standard way to list and configure MCP Servers on a local machine.
Many tools already use MCP Servers to augment LLMs, yet there is currently no standard approach to quickly registering an MCP Server.
Here are the most popular MCP Clients:
- IntelliJ IDEA
- Anthropic Claude
- OpenAI ChatGPT
- Cursor
- Windsurf
- Warp
- PR to add yours
## Supported in the following tools
- PR if you support the spec
## Specification
Create a file per MCP server in the `~/.mcp` folder (under `%USER_HOME%` on Windows).
The file is a Markdown file that explains to an LLM (such as Claude or ChatGPT) the details of your MCP Server. The client uses the LLM to transform your explanation into an actionable MCP Server configuration.
The client is required to regularly refresh the information from disk to discover new MCP Servers.
This protocol does not address the security implications of MCP servers; that remains the responsibility of MCP Clients.
For example, create the file `~/.mcp/my-mcp-server-tool-id.md` with the following text inside:
```markdown
---
version: 1.0.1
---

# MCP Server: Production

Here I describe what my MCP Server does and why the LLM would decide to include my server for a specific request.

## Basic Information

- **Name**: Production Jonnyzzz MCP Server
- **ID**: prod-mcp-01
- **Version**: 3.2.1
- **URL**: https://mcp-prod.jonnyzzz.com:8443
- **API Version**: v2

## Authentication

- **Type**: oauth2
- **Client ID**: client_123
- **Token Endpoint**: https://auth.example.com/token

## Capabilities

- compute
- storage
- networking

## Regions

### us-east

- us-east-1a
- us-east-1b

### eu-west

- eu-west-1a
- eu-west-1b

## Health Check

- **Endpoint**: /health
- **Interval**: 60 seconds

## Metadata

- **Environment**: production
- **Owner**: platform-team
- **Priority**: high
```
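For illustration only (the spec ends at the file layout above), a minimal Python sketch of the discovery step might look like the following; `discover_server_specs` is a hypothetical helper name, and it simply collects the raw Markdown so the client can hand it to an LLM later.

```python
from pathlib import Path

# ~/.mcp on Linux/macOS; the spec places the folder under %USER_HOME% on Windows.
MCP_DIR = Path.home() / ".mcp"


def discover_server_specs() -> dict[str, str]:
    """Return a mapping of server file id -> raw Markdown description."""
    specs: dict[str, str] = {}
    if not MCP_DIR.is_dir():
        return specs
    for spec_file in sorted(MCP_DIR.glob("*.md")):
        # The file name without the extension doubles as a stable id,
        # e.g. "my-mcp-server-tool-id" from the example above.
        specs[spec_file.stem] = spec_file.read_text(encoding="utf-8")
    return specs


if __name__ == "__main__":
    for server_id, markdown in discover_server_specs().items():
        print(f"discovered {server_id}: {len(markdown)} characters of Markdown")
```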
## How to use the Spec?
An MCP client uses an LLM (e.g., Claude, ChatGPT) to extract the necessary information from each available MCP server's Markdown file discovered under the `~/.mcp` folder.
It is up to the LLM and the client to decide whether to use a specific MCP server, ask for credentials, and so on.
The client should refresh the files from the disk regularly.
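As a hedged sketch of that extraction step (again, not mandated by the spec), the client could prompt its model to return JSON; `call_llm` below is a placeholder for whatever LLM invocation the client already has, and the suggested field names simply mirror the example file above.

```python
import json

# Hypothetical prompt; the spec only requires that the Markdown explains the
# server well enough for an LLM to act on it.
EXTRACTION_PROMPT = """\
You are configuring MCP Servers for an MCP client.
Below is a Markdown description of one MCP Server written by its author.
Return a single JSON object with the fields you can recover, for example:
name, id, version, url, authentication, capabilities.

Markdown description:
{markdown}
"""


def extract_server_config(markdown: str, call_llm) -> dict:
    """call_llm takes a prompt string and returns the model's text reply."""
    reply = call_llm(EXTRACTION_PROMPT.format(markdown=markdown))
    return json.loads(reply)  # validate before acting on the result
```

Whatever shape the extraction takes, the client should validate URLs and authentication details before acting on them, since the spec leaves security entirely to MCP Clients, and it should re-run discovery regularly, as stated above.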