
erickwendel-contributions-mcp


A Model Context Protocol (MCP) server that provides tools to query Erick Wendel's contributions across different platforms. Query talks, blog posts, and videos using natural language through Claude, Cursor, or similar MCP clients. This project was built using Cursor IDE with the default agent (trial version).

This MCP server is also available on Smithery for direct integration.

Available Tools

This MCP server provides the following tools to interact with the API:

  • get-talks: Retrieves a paginated list of talks with optional filtering

    • Supports filtering by ID, title, language, city, country, and year
    • Can return counts grouped by language, country, or city
  • get-posts: Fetches posts with optional filtering and pagination

    • Supports filtering by ID, title, language, and portal
  • get-videos: Retrieves videos with optional filtering and pagination

    • Supports filtering by ID, title, and language
  • check-status: Verifies if the API is alive and responding
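
For example, any generic MCP client can invoke these tools programmatically. The following is a minimal sketch, not part of this repository: it assumes the official @modelcontextprotocol/sdk client package and a local checkout of the project, and the filter values are only illustrative.

// example-client.ts — a minimal sketch, not part of this repository.
// Assumes the official @modelcontextprotocol/sdk client package is installed
// and ABSOLUTE_PATH_TO_PROJECT points at a local checkout of this server.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server as a child process and talk to it over standard I/O
const transport = new StdioClientTransport({
  command: "node",
  args: ["ABSOLUTE_PATH_TO_PROJECT/src/index.ts"],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Call get-talks with a couple of the optional filters listed above
// (the filter values here are only illustrative)
const talks = await client.callTool({
  name: "get-talks",
  arguments: { language: "spanish", year: 2023 },
});

console.log(JSON.stringify(talks, null, 2));
await client.close();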

Inspect MCP Server Capabilities

You can inspect this MCP server's capabilities using Smithery:

npx -y @smithery/cli@latest inspect @ErickWendel/erickwendel-contributions-mcp

This will show you all available tools, their parameters, and how to use them.

Setup

  1. Make sure you're using Node.js v23+:
node -v
# v23.9.0
  2. Clone this repository:
git clone https://github.com/erickwendel/erickwendel-contributions-mcp.git
cd erickwendel-contributions-mcp
  3. Install dependencies:
npm ci
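  4. (Optional) Check that the server starts by running the entry point directly. It communicates over standard I/O, so it will simply wait for an MCP client to connect; stop it with Ctrl+C:
node src/index.ts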

Integration with AI Tools

Cursor Setup

  1. Open Cursor Settings

  2. Navigate to MCP section

  3. Click "Add new MCP server"

  4. Configure the server:

    Name = erickwendel-contributions
    Type = command
    Command = node ABSOLUTE_PATH_TO_PROJECT/src/index.ts
    

    or, if you prefer to run it via Smithery:

    Name = erickwendel-contributions
    Type = command
    Command = npm exec -- @smithery/cli@latest run @ErickWendel/erickwendel-contributions-mcp
    

or configure it directly in Cursor's global MCP file, located at ~/.cursor/mcp.json, by adding the following:

{
  "mcpServers": {
    "erickwendel-contributions": {
      "command": "node",
      "args": ["ABSOLUTE_PATH_TO_PROJECT/src/index.ts"]
    }
  }
}

or, if you prefer to run it via Smithery:

{
  "mcpServers": {
    "erickwendel-contributions": {
      "command": "npm",
      "args": [
        "exec",
        "--",
        "@smithery/cli@latest",
        "run",
        "@ErickWendel/erickwendel-contributions-mcp"
      ]
    }
  }
}
  5. Make sure Cursor chat is in Agent mode by selecting "Agent" in the dropdown in the lower left

  6. Go to the chat and ask "how many videos were published about JavaScript in 2024"

Claude Desktop Setup

Installing via Smithery

To install Erick Wendel Contributions for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @ErickWendel/erickwendel-contributions-mcp --client claude

Note: The Smithery CLI installation for Claude is currently experiencing issues. Please use the manual installation method below until this is resolved.

Manual Setup

  1. Go to Claude settings
  2. Click on the Developer tab
  3. Click on Edit Config
  4. Open the config in a code editor
  5. Add the following configuration to your Claude Desktop config:
{
  "mcpServers": {
    "erickwendel-contributions": {
      "command": "node",
      "args": ["ABSOLUTE_PATH_TO_PROJECT/src/index.ts"]
    }
  }
}

or, if you prefer to run it via Smithery:

{
  "mcpServers": {
    "erickwendel-contributions": {
      "command": "npm",
      "args": [
        "exec",
        "--",
        "@smithery/cli@latest",
        "run",
        "@ErickWendel/erickwendel-contributions-mcp"
      ]
    }
  }
}
  6. Save the file and restart Claude Desktop
  7. Open the Developer tab again and check that the server is in the "running" state

  8. Go to the chat and ask "Are there videos about RAG?"

Free Alternative Using MCPHost

If you don't have access to either Claude Desktop or Cursor, you can use MCPHost with Ollama as a free alternative. MCPHost is a CLI tool that enables Large Language Models to interact with MCP servers.

  1. Install MCPHost:
go install github.com/mark3labs/mcphost@latest
  2. Create a config file (e.g. ./mcp.jsonc):
{
  "mcpServers": {
    "erickwendel-contributions": {
      "command": "node",
      "args": ["ABSOLUTE_PATH_TO_PROJECT/src/index.ts"]
    }
  }
}

or, if you prefer to run it via Smithery:

{
  "mcpServers": {
    "erickwendel-contributions": {
      "command": "npm",
      "args": [
        "exec",
        "--",
        "@smithery/cli@latest",
        "run",
        "@ErickWendel/erickwendel-contributions-mcp"
      ]
    }
  }
}
  3. Run MCPHost with your preferred Ollama model:
ollama pull MODEL_NAME
mcphost --config ./mcp.jsonc -m ollama:MODEL_NAME
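
For example, assuming you want to use the llama3.1 model (any other Ollama model tag works the same way):

ollama pull llama3.1
mcphost --config ./mcp.jsonc -m ollama:llama3.1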

Example Queries

Here are some examples of queries you can ask Claude, Cursor or any MCP Client:

  1. "How many talks were given in 2023?"

  1. "Show me talks in Spanish"

  1. "Find posts about WebXR"

Development

Features

  • Built with Model Context Protocol (MCP)
  • Type-safe with TypeScript and Zod schema validation
  • Native TypeScript support in Node.js without transpilation
  • Generated SDK using GenQL
  • Modular architecture with separation of concerns
  • Standard I/O transport for easy integration
  • Structured error handling
  • Compatible with Claude Desktop, Cursor, and MCPHost (free alternative)

Note: This project requires Node.js v23+ because it relies on Node's recently added native TypeScript support, so no transpilation step is needed.
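
As a rough illustration of how these pieces fit together, here is a minimal sketch assuming the official @modelcontextprotocol/sdk server API and Zod. It is not the project's actual source (which lives under src/), and the tool handler body is a hypothetical placeholder.

// sketch-server.ts — a minimal sketch, not the project's actual source (see src/).
// Assumes the official @modelcontextprotocol/sdk server API and Zod.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "erickwendel-contributions", version: "1.0.0" });

// Register a tool whose input is validated by a Zod schema
server.tool(
  "get-talks",
  {
    language: z.string().optional().describe("Filter talks by language"),
    year: z.number().optional().describe("Filter talks by year"),
  },
  async ({ language, year }) => {
    // A real handler would query the contributions API through the GenQL-generated client;
    // this placeholder just echoes the parsed, validated arguments.
    const text = `talks filtered by language=${language ?? "any"}, year=${year ?? "any"}`;
    return { content: [{ type: "text", text }] };
  }
);

// Expose the server over standard I/O so MCP clients (Claude Desktop, Cursor, MCPHost) can spawn it
await server.connect(new StdioServerTransport());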

Architecture

The codebase follows a modular structure:

src/
  ├── config/      # Configuration settings
  ├── types/       # TypeScript interfaces and types
  ├── tools/       # MCP tool implementations
  ├── utils/       # Utility functions
  ├── services/    # API service layer
  └── index.ts     # Main entry point

Testing

To run the test suite:

npm test

For development mode with watch:

npm run test:dev

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Author

Erick Wendel

License

This project is licensed under the MIT License - see the LICENSE file for details.
