
# MCP Everything

An MCP server that exercises all the features of the MCP protocol.
> **Note:** This project was extracted from https://github.com/modelcontextprotocol/servers/tree/main/src/everything to create a standalone implementation.

This MCP server attempts to exercise all the features of the MCP protocol. It is not intended to be a useful server, but rather a test server for builders of MCP clients. It implements prompts, tools, resources, sampling, and more to showcase MCP capabilities.
## Installation

### Local Installation

```bash
# Clone the repository
git clone https://github.com/modelcontextprotocol/mcp-everything.git
cd mcp-everything

# Install dependencies
npm install

# Build the project
npm run build

# Start the server
npm start
```

### Global Installation

```bash
# Install globally from npm
npm install -g mcp-everything

# Run the server
mcp-everything
```

### Docker

```bash
# Build the Docker image
docker build -t mcp-everything .

# Run the container
docker run -it mcp-everything
```
## Usage with Claude Desktop

Add to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "everything": {
      "command": "npx",
      "args": ["-y", "mcp-everything"]
    }
  }
}
```
## Components

### Tools

- **`echo`**
  - Simple tool to echo back input messages
  - Input:
    - `message` (string): Message to echo back
  - Returns: Text content with echoed message

- **`add`**
  - Adds two numbers together
  - Inputs:
    - `a` (number): First number
    - `b` (number): Second number
  - Returns: Text result of the addition

- **`longRunningOperation`**
  - Demonstrates progress notifications for long operations
  - Inputs:
    - `duration` (number, default: 10): Duration in seconds
    - `steps` (number, default: 5): Number of progress steps
  - Returns: Completion message with duration and steps
  - Sends progress notifications during execution

- **`sampleLLM`**
  - Demonstrates LLM sampling capability using the MCP sampling feature
  - Inputs:
    - `prompt` (string): The prompt to send to the LLM
    - `maxTokens` (number, default: 100): Maximum tokens to generate
  - Returns: Generated LLM response

- **`getTinyImage`**
  - Returns a small test image
  - No inputs required
  - Returns: Base64-encoded PNG image data

- **`printEnv`**
  - Prints all environment variables
  - Useful for debugging MCP server configuration
  - No inputs required
  - Returns: JSON string of all environment variables

- **`annotatedMessage`**
  - Demonstrates how annotations can be used to provide metadata about content
  - Inputs:
    - `messageType` (enum: `"error" | "success" | "debug"`): Type of message, to demonstrate different annotation patterns
    - `includeImage` (boolean, default: false): Whether to include an example image
  - Returns: Content with varying annotations
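To make the tool contract above concrete, here is a minimal sketch of what a handler for the `add` tool might look like. It assumes the common MCP text-content result shape (`{ content: [{ type: "text", text: ... }] }`) and is an illustration only, not the server's actual implementation; the exact wording of the result text is hypothetical.

```typescript
// Hypothetical sketch of the `add` tool's handler logic (illustration only,
// not the server's code). It takes the two number inputs described above and
// returns an MCP-style text content result.
type TextContent = { type: "text"; text: string };

function addToolHandler(args: { a: number; b: number }): { content: TextContent[] } {
  const sum = args.a + args.b;
  return {
    content: [{ type: "text", text: `The sum of ${args.a} and ${args.b} is ${sum}.` }],
  };
}
```

A client invoking the tool with `{ "a": 2, "b": 3 }` would receive a single text content item reporting the sum.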
### Resources

The server provides 100 test resources in two formats:

- **Even-numbered resources:**
  - Plaintext format
  - URI pattern: `test://static/resource/{even_number}`
  - Content: Simple text description

- **Odd-numbered resources:**
  - Binary blob format
  - URI pattern: `test://static/resource/{odd_number}`
  - Content: Base64-encoded binary data

Resource features:

- Supports pagination (10 items per page)
- Allows subscribing to resource updates
- Demonstrates resource templates
- Auto-updates subscribed resources every 5 seconds
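The even/odd URI scheme and the 10-per-page listing can be sketched as follows. These helpers are illustrative only (not the server's code), and the format labels are assumptions based on the description above.

```typescript
// Illustrative sketch of the resource scheme described above: even-numbered
// resources are plaintext, odd-numbered are base64 binary blobs, and listings
// are paged 10 items at a time. Not the server's actual implementation.
function describeResource(n: number): { uri: string; format: string } {
  return {
    uri: `test://static/resource/${n}`,
    format: n % 2 === 0 ? "plaintext" : "binary (base64)",
  };
}

// Resource URIs for a given zero-based page, 10 items per page.
function resourcePage(page: number, pageSize = 10): string[] {
  return Array.from(
    { length: pageSize },
    (_, i) => `test://static/resource/${page * pageSize + i + 1}`,
  );
}
```

For example, page 0 covers resources 1 through 10, page 1 covers 11 through 20, and so on up to resource 100.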
### Prompts

- **`simple_prompt`**
  - Basic prompt without arguments
  - Returns: Single message exchange

- **`complex_prompt`**
  - Advanced prompt demonstrating argument handling
  - Required arguments:
    - `temperature` (number): Temperature setting
  - Optional arguments:
    - `style` (string): Output style preference
  - Returns: Multi-turn conversation with images
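The required/optional argument contract of `complex_prompt` can be expressed as a small client-side check. This validator is a hypothetical helper for illustration, not part of the server, and the argument types follow the descriptions above.

```typescript
// Hypothetical client-side validation of complex_prompt's argument contract
// (illustration only): `temperature` is required, `style` is optional.
function validateComplexPromptArgs(args: Record<string, unknown>): string[] {
  const errors: string[] = [];
  if (typeof args.temperature !== "number") {
    errors.push("temperature (number) is required");
  }
  if (args.style !== undefined && typeof args.style !== "string") {
    errors.push("style must be a string when provided");
  }
  return errors;
}
```

Calling the prompt with `{ temperature: 0.7 }` satisfies the contract; omitting `temperature` does not.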
### Logging

The server sends log messages at random levels every 15 seconds to demonstrate the logging capabilities of MCP.
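The random-level behaviour can be sketched as below. This is an illustration, not the server's code; the level names follow the syslog-style set used by MCP logging, but treat the exact list as an assumption.

```typescript
// Sketch of the random-level logging described above (illustration only).
// Level names follow the syslog-style set used by MCP logging; the exact
// list is an assumption here.
const LOG_LEVELS = [
  "debug", "info", "notice", "warning",
  "error", "critical", "alert", "emergency",
] as const;

type LogLevel = (typeof LOG_LEVELS)[number];

function randomLogLevel(): LogLevel {
  return LOG_LEVELS[Math.floor(Math.random() * LOG_LEVELS.length)];
}

// A server loop would emit a log notification at randomLogLevel() every
// 15 seconds; the timer itself is elided from this sketch.
```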