MCP Everything
Note: This project was extracted from https://github.com/modelcontextprotocol/servers/tree/main/src/everything to create a standalone implementation.
This MCP server attempts to exercise all the features of the MCP protocol. It is not intended to be a useful server, but rather a test server for builders of MCP clients. It implements prompts, tools, resources, sampling, and more to showcase MCP capabilities.
Installation
Local Installation
# Clone the repository
git clone https://github.com/modelcontextprotocol/mcp-everything.git
cd mcp-everything
# Install dependencies
npm install
# Build the project
npm run build
# Start the server
npm start
Global Installation
# Install globally from npm
npm install -g mcp-everything
# Run the server
mcp-everything
Docker
# Build the Docker image
docker build -t mcp-everything .
# Run the container
docker run -it mcp-everything
Usage with Claude Desktop
Add to your claude_desktop_config.json:
{
  "mcpServers": {
    "everything": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-everything"
      ]
    }
  }
}
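For client builders who want to exercise the server programmatically rather than through Claude Desktop, a minimal connection sketch with the TypeScript MCP SDK might look like the following. The @modelcontextprotocol/sdk imports and the npx launch command mirror the configuration above; treat this as an illustrative starting point, not project-provided code.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio, mirroring the Claude Desktop configuration above
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "mcp-everything"],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);

await client.connect(transport);

// Enumerate everything the server exposes
console.log(await client.listTools());
console.log(await client.listResources());
console.log(await client.listPrompts());

await client.close();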
Components
Tools
- echo
  - Simple tool to echo back input messages
  - Input:
    - message (string): Message to echo back
  - Returns: Text content with echoed message
- add
  - Adds two numbers together
  - Inputs:
    - a (number): First number
    - b (number): Second number
  - Returns: Text result of the addition
- longRunningOperation
  - Demonstrates progress notifications for long operations
  - Inputs:
    - duration (number, default: 10): Duration in seconds
    - steps (number, default: 5): Number of progress steps
  - Returns: Completion message with duration and steps
  - Sends progress notifications during execution
- sampleLLM
  - Demonstrates LLM sampling capability using the MCP sampling feature
  - Inputs:
    - prompt (string): The prompt to send to the LLM
    - maxTokens (number, default: 100): Maximum tokens to generate
  - Returns: Generated LLM response
- getTinyImage
  - Returns a small test image
  - No inputs required
  - Returns: Base64 encoded PNG image data
- printEnv
  - Prints all environment variables
  - Useful for debugging MCP server configuration
  - No inputs required
  - Returns: JSON string of all environment variables
- annotatedMessage
  - Demonstrates how annotations can be used to provide metadata about content
  - Inputs:
    - messageType (enum: "error" | "success" | "debug"): Type of message to demonstrate different annotation patterns
    - includeImage (boolean, default: false): Whether to include an example image
  - Returns: Content with varying annotations
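Calling these tools from the TypeScript SDK client (continuing the connection sketch above; tool names and argument shapes are taken from the list, the values are only illustrative) could look roughly like this:
// Echo a message back
const echoResult = await client.callTool({
  name: "echo",
  arguments: { message: "Hello, MCP!" },
});
console.log(echoResult.content);

// Add two numbers
const sum = await client.callTool({
  name: "add",
  arguments: { a: 2, b: 3 },
});
console.log(sum.content);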
Resources
The server provides 100 test resources in two formats:
- Even numbered resources:
  - Plaintext format
  - URI pattern: test://static/resource/{even_number}
  - Content: Simple text description
- Odd numbered resources:
  - Binary blob format
  - URI pattern: test://static/resource/{odd_number}
  - Content: Base64 encoded binary data
Resource features:
- Supports pagination (10 items per page)
- Allows subscribing to resource updates
- Demonstrates resource templates
- Auto-updates subscribed resources every 5 seconds
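Reading and subscribing to one of the test resources with the same SDK client could be sketched as follows; the URI is one instance of the pattern above and assumes resource 2 exists:
import { ResourceUpdatedNotificationSchema } from "@modelcontextprotocol/sdk/types.js";

// Read a plaintext resource (even numbered resources are plaintext)
const resource = await client.readResource({
  uri: "test://static/resource/2",
});
console.log(resource.contents);

// React to server-pushed updates, which arrive every 5 seconds for subscribed resources
client.setNotificationHandler(ResourceUpdatedNotificationSchema, (notification) => {
  console.log("Resource updated:", notification.params.uri);
});
await client.subscribeResource({ uri: "test://static/resource/2" });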
Prompts
- simple_prompt
  - Basic prompt without arguments
  - Returns: Single message exchange
- complex_prompt
  - Advanced prompt demonstrating argument handling
  - Required arguments:
    - temperature (number): Temperature setting
  - Optional arguments:
    - style (string): Output style preference
  - Returns: Multi-turn conversation with images
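Fetching these prompts through the SDK client might look like the sketch below; prompt arguments are passed as strings in MCP, and the values shown are only examples:
// Basic prompt with no arguments
const simple = await client.getPrompt({ name: "simple_prompt" });
console.log(simple.messages);

// Prompt with required and optional arguments
const complex = await client.getPrompt({
  name: "complex_prompt",
  arguments: { temperature: "0.7", style: "concise" },
});
console.log(complex.messages);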
Logging
The server sends random-leveled log messages every 15 seconds to demonstrate the logging capabilities of MCP.
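A client can set the minimum log level and observe these messages; one possible sketch, again building on the connection example above, is:
import { LoggingMessageNotificationSchema } from "@modelcontextprotocol/sdk/types.js";

// Print each log notification as it arrives (the server emits one roughly every 15 seconds)
client.setNotificationHandler(LoggingMessageNotificationSchema, (notification) => {
  console.log(`[${notification.params.level}]`, notification.params.data);
});

// Request logs at "debug" level and above
await client.setLoggingLevel("debug");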