
MCP Server

A Model Context Protocol (MCP) server implementation using Node.js and TypeScript.

What is MCP?

The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs (Large Language Models). It allows for seamless communication between AI applications and various data sources and tools.
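
For orientation, a minimal MCP server in TypeScript might look like the sketch below. It uses the official @modelcontextprotocol/sdk package; the server name/version, the example "ping" tool, and the top-level-await entrypoint are illustrative assumptions, not code from this repository.

// Minimal sketch (not this repository's code): register one tool and serve it over stdio.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Server name and version are placeholders.
const server = new McpServer({ name: "exp-mcp-server", version: "1.0.0" });

// A trivial illustrative tool; clients discover it via tools/list and invoke it via tools/call.
server.tool("ping", { message: z.string() }, async ({ message }) => ({
  content: [{ type: "text", text: `pong: ${message}` }],
}));

// Connect over stdio so an MCP client can spawn this process and talk to it directly.
await server.connect(new StdioServerTransport());

An MCP client (for example an IDE or chat application) launches the process or connects over HTTP/SSE, and can then list and call the tools, resources, and prompts described in the sections below.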

Features

This server implementation provides:

  • A calculator tool with basic arithmetic operations (add, subtract, multiply, divide)
  • Server information resource with system metrics
  • Echo resource for message reflection
  • Greeting prompt template with formal/casual options
  • Code review prompt template
  • Support for both stdio and HTTP/SSE transport methods
  • Comprehensive logging system

Getting Started

Prerequisites

  • Node.js 18+ and npm

Installation

  1. Clone the repository
  2. Install dependencies:
     npm install
  3. Build the project:
     npm run build

Running the Server

Using stdio (for integration with MCP clients)

npm start

Using HTTP/SSE (for web-based usage)

npm start -- --http

The server will start on port 3000 by default. You can change this by setting the PORT environment variable:

PORT=8080 npm start -- --http
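
A rough sketch of how the entrypoint could select a transport from the --http flag and the PORT variable is shown below. The Express routes (/sse and /messages) and the ./server.js import are assumptions for illustration; the repository's actual wiring may differ.

// Hypothetical entrypoint: choose stdio or HTTP/SSE based on the --http flag.
import express from "express";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { SSEServerTransport } from "@modelcontextprotocol/sdk/server/sse.js";
import { server } from "./server.js"; // hypothetical module exporting the McpServer instance

if (!process.argv.includes("--http")) {
  // Default: stdio transport, for MCP clients that spawn the server as a child process.
  await server.connect(new StdioServerTransport());
} else {
  const app = express();
  let transport: SSEServerTransport | undefined;

  // The SSE stream carries server-to-client messages.
  app.get("/sse", async (_req, res) => {
    transport = new SSEServerTransport("/messages", res);
    await server.connect(transport);
  });

  // Client-to-server messages arrive as HTTP POSTs.
  app.post("/messages", async (req, res) => {
    await transport?.handlePostMessage(req, res);
  });

  app.listen(Number(process.env.PORT) || 3000);
}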

API

Tools

  • calculate: Perform arithmetic operations (a registration sketch follows this list)
    • Parameters:
      • operation (add/subtract/multiply/divide)
      • a (number)
      • b (number)
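
The calculate tool could be registered roughly as follows. This is a sketch that reuses the server instance and zod import from the earlier example; the divide-by-zero handling shown here is an assumption.

// Sketch of the calculate tool (imports and `server` as in the earlier example).
server.tool(
  "calculate",
  {
    operation: z.enum(["add", "subtract", "multiply", "divide"]),
    a: z.number(),
    b: z.number(),
  },
  async ({ operation, a, b }) => {
    if (operation === "divide" && b === 0) {
      // How the real tool reports this case is not documented here; this is an assumption.
      return { content: [{ type: "text", text: "Error: division by zero" }], isError: true };
    }
    const result =
      operation === "add" ? a + b :
      operation === "subtract" ? a - b :
      operation === "multiply" ? a * b :
      a / b;
    return { content: [{ type: "text", text: String(result) }] };
  }
);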

Resources

  • server://info: Get server information including:
    • Server name and version
    • Uptime
    • Node.js version
    • Platform
    • Memory usage
  • echo://{message}: Echo back the provided message (both resources are sketched below)
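
Both resources could be registered along these lines. Again a sketch reusing the earlier server instance; the fields mirror the list above, but the exact JSON shape returned by server://info is an assumption.

import { ResourceTemplate } from "@modelcontextprotocol/sdk/server/mcp.js";

// Static resource: server information as JSON (field names are illustrative).
server.resource("server-info", "server://info", async (uri) => ({
  contents: [{
    uri: uri.href,
    mimeType: "application/json",
    text: JSON.stringify({
      name: "exp-mcp-server",
      version: "1.0.0",
      uptime: process.uptime(),
      nodeVersion: process.version,
      platform: process.platform,
      memoryUsage: process.memoryUsage(),
    }, null, 2),
  }],
}));

// Templated resource: echoes whatever appears in the {message} slot of the URI.
server.resource(
  "echo",
  new ResourceTemplate("echo://{message}", { list: undefined }),
  async (uri, { message }) => ({
    contents: [{ uri: uri.href, text: `Echo: ${message}` }],
  })
);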

Prompts

  • greeting: Generate personalized greetings (see the sketch after this list)
    • Parameters:
      • name (string): Name to greet
      • formal (optional string): Whether to use formal language
  • code-review: Review code for improvements
    • Parameters:
      • code (string): Code to review
      • language (string): Programming language
      • focus (optional string): Review focus area
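
The greeting prompt could be defined along these lines (a sketch; the generated wording is illustrative). The code-review prompt would follow the same pattern with code, language, and an optional focus argument.

// Sketch of the greeting prompt (imports and `server` as in the earlier examples).
server.prompt(
  "greeting",
  { name: z.string(), formal: z.string().optional() },
  ({ name, formal }) => ({
    messages: [{
      role: "user",
      content: {
        type: "text",
        text: formal === "true"
          ? `Please write a formal greeting addressed to ${name}.`
          : `Write a casual, friendly greeting for ${name}.`,
      },
    }],
  })
);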

Development

To run the server in development mode with auto-reloading:

npm run dev

For HTTP/SSE transport in development mode:

npm run dev -- --http

License

ISC

Related Recommendations

  • https://suefel.com
  • Latest advice and best practices for custom GPT development.

  • Yusuf Emre Yeşilyurt
  • I find academic articles and books for research and literature reviews.

  • https://maiplestudio.com
  • Find Exhibitors, Speakers and more

  • Carlos Ferrin
  • Encuentra películas y series en plataformas de streaming.

  • Joshua Armstrong
  • Confidential guide on numerology and astrology, based on GG33 public information

  • Contraband Interactive
  • Emulating Dr. Jordan B. Peterson's style in providing life advice and insights.

  • rustassistant.com
  • Your go-to expert in the Rust ecosystem, specializing in precise code interpretation, up-to-date crate version checking, and in-depth source code analysis. I offer accurate, context-aware insights for all your Rust programming questions.

  • Elijah Ng Shi Yi
  • Advanced software engineer GPT that excels through nailing the basics.

  • Emmet Halm
  • Converts Figma frames into front-end code for various mobile frameworks.

  • Alexandru Strujac
  • Efficient thumbnail creator for YouTube videos

  • lumpenspace
  • Take an adjectivised noun, and create images making it progressively more adjective!

  • apappascs
  • Discover the most comprehensive and up-to-date collection of MCP servers on the market. This repository serves as a centralized hub, offering an extensive directory of open-source and proprietary MCP servers, complete with features, documentation links, and contributors.

  • modelcontextprotocol
  • Model Context Protocol servers

  • Mintplex-Labs
  • All-in-one desktop and Docker AI application with built-in RAG, AI agents, a no-code agent builder, MCP compatibility, and more.

  • ShrimpingIt
  • MicroPython I2C-based operation of the MCP-series GPIO expanders, derived from Adafruit_MCP230xx

  • n8n-io
  • Fair-code workflow automation platform with native AI capabilities. Combines visual building with custom code, self-hosted or cloud, 400+ integrations.

  • open-webui
  • User-friendly AI interface (supports Ollama, OpenAI API, ...)

  • WangRongsheng
  • 🧑‍🚀 A summary of LLM resources (data processing, model training, model deployment, o1 models, MCP, small language models, vision-language models) | Summarizing the world's best LLM resources.

  • metorial
  • Containerized versions of hundreds of MCP servers 📡

Reviews

Rating: 4 (1 review)

user_E4CZSNSx
2025-04-17

The exp-mcp-server by semo94 is a must-have for any MCP application fan. The server is reliable, efficient, and seamlessly integrates with my existing setup. The clear documentation and welcoming information on the product page make it easy to get started. Highly recommended!