erickwendel-contributions-mcp


A Model Context Protocol (MCP) server that provides tools to query Erick Wendel's contributions across different platforms. Query talks, blog posts, and videos using natural language through Claude, Cursor, or similar tools. This project was built using Cursor IDE with the default agent (trial version).

This MCP server is also available on Smithery for direct integration.

Available Tools

This MCP server provides the following tools to interact with the API (a hedged sketch of the get-talks filters follows the list):

  • get-talks: Retrieves a paginated list of talks with optional filtering

    • Supports filtering by ID, title, language, city, country, and year
    • Can return counts grouped by language, country, or city
  • get-posts: Fetches posts with optional filtering and pagination

    • Supports filtering by ID, title, language, and portal
  • get-videos: Retrieves videos with optional filtering and pagination

    • Supports filtering by ID, title, and language
  • check-status: Verifies if the API is alive and responding
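
The exact input schemas live in the tool implementations under src/tools/. As a rough illustration only, the get-talks filters described above might be declared with Zod along these lines; field names such as groupBy and page are assumptions for this sketch, not the project's actual schema:

import { z } from "zod";

// Hypothetical shape of the get-talks filters; see src/tools/ for the real definitions.
const getTalksInput = z.object({
  id: z.string().optional(),        // filter by a specific talk ID
  title: z.string().optional(),     // partial title match
  language: z.string().optional(),  // e.g. "spanish"
  city: z.string().optional(),
  country: z.string().optional(),
  year: z.number().int().optional(),
  // when set, return counts grouped by the given field instead of a list of talks
  groupBy: z.enum(["language", "country", "city"]).optional(),
  page: z.number().int().optional(), // pagination
});

type GetTalksInput = z.infer<typeof getTalksInput>;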

Inspect MCP Server Capabilities

You can inspect this MCP server's capabilities using Smithery:

npx -y @smithery/cli@latest inspect @ErickWendel/erickwendel-contributions-mcp

This will show you all available tools, their parameters, and how to use them.

Setup

  1. Make sure you're using Node.js v23+
node -v
#v23.9.0
  2. Clone this repository:
git clone https://github.com/erickwendel/erickwendel-contributions-mcp.git
cd erickwendel-contributions-mcp
  3. Restore dependencies:
npm ci

Integration with AI Tools

Cursor Setup

  1. Open Cursor Settings

  2. Navigate to MCP section

  3. Click "Add new MCP server"

  4. Configure the server:

    Name = erickwendel-contributions
    Type = command
    Command = node ABSOLUTE_PATH_TO_PROJECT/src/index.ts
    

    or, if you prefer running it via Smithery:

    Name = erickwendel-contributions
    Type = command
    Command = npm exec -- @smithery/cli@latest run @ErickWendel/erickwendel-contributions-mcp
    

or configure it directly in Cursor's global MCP file located at ~/.cursor/mcp.json by adding the following:

{
  "mcpServers": {
    "erickwendel-contributions": {
      "command": "node",
      "args": ["ABSOLUTE_PATH_TO_PROJECT/src/index.ts"]
    }
  }
}

or, if you prefer running it via Smithery:

{
  "mcpServers": {
    "erickwendel-contributions": {
      "command": "npm",
      "args": [
        "exec",
        "--",
        "@smithery/cli@latest",
        "run",
        "@ErickWendel/erickwendel-contributions-mcp"
      ]
    }
  }
}
  5. Make sure Cursor chat is in Agent mode by selecting "Agent" in the dropdown at the lower left

  6. Go to the chat and ask: "How many videos were published about JavaScript in 2024?"

Claude Desktop Setup

Installing via Smithery

To install Erick Wendel Contributions for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @ErickWendel/erickwendel-contributions-mcp --client claude

Note: The Smithery CLI installation for Claude is currently experiencing issues. Please use the manual installation method below until this is resolved.

Manual Setup

  1. Go to Claude settings
  2. Click on the Developer tab
  3. Click on Edit Config
  4. Open the config file in a code editor
  5. Add the following configuration to your Claude Desktop config:
{
  "mcpServers": {
    "erickwendel-contributions": {
      "command": "node",
      "args": ["ABSOLUTE_PATH_TO_PROJECT/src/index.ts"]
    }
  }
}

or, if you prefer running it via Smithery:

{
  "mcpServers": {
    "erickwendel-contributions": {
      "command": "npm",
      "args": [
        "exec",
        "--",
        "@smithery/cli@latest",
        "run",
        "@ErickWendel/erickwendel-contributions-mcp"
      ]
    }
  }
}
  6. Save the file and restart Claude Desktop
  7. Open the Developer tab again and check that the server is in the "running" state

  8. Go to the chat and ask: "Are there videos about RAG?"

Free Alternative Using MCPHost

If you don't have access to either Claude Desktop or Cursor, you can use MCPHost with Ollama as a free alternative. MCPHost is a CLI tool that enables Large Language Models to interact with MCP servers.

  1. Install MCPHost:
go install github.com/mark3labs/mcphost@latest
  2. Create a config file (e.g. ./mcp.jsonc):
{
  "mcpServers": {
    "erickwendel-contributions": {
      "command": "node",
      "args": ["ABSOLUTE_PATH_TO_PROJECT/src/index.ts"]
    }
  }
}

or, if you prefer running it via Smithery:

{
  "mcpServers": {
    "erickwendel-contributions": {
      "command": "npm",
      "args": [
        "exec",
        "--",
        "@smithery/cli@latest",
        "run",
        "@ErickWendel/erickwendel-contributions-mcp"
      ]
    }
  }
}
  3. Run MCPHost with your preferred Ollama model:
ollama pull MODEL_NAME
mcphost --config ./mcp.jsonc -m ollama:MODEL_NAME

Example Queries

Here are some examples of queries you can ask Claude, Cursor, or any MCP client (a sketch of the underlying tool call follows the list):

  1. "How many talks were given in 2023?"

  2. "Show me talks in Spanish"

  3. "Find posts about WebXR"

Development

Features

  • Built with Model Context Protocol (MCP)
  • Type-safe with TypeScript and Zod schema validation
  • Native TypeScript support in Node.js without transpilation
  • Generated SDK using GenQL
  • Modular architecture with separation of concerns
  • Standard I/O transport for easy integration
  • Structured error handling
  • Compatible with Claude Desktop, Cursor, and MCPHost (free alternative)

Note: This project requires Node.js v23+ as it relies on Node's recently added native TypeScript support; a minimal sketch of the server wiring is shown below.
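
For orientation, here is a hedged sketch of the stdio wiring those features describe, built with the MCP TypeScript SDK. The tool body and the health-check URL are placeholders, not the project's actual code (that lives in src/index.ts and src/tools/):

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

const server = new McpServer({ name: "erickwendel-contributions", version: "1.0.0" });

// A check-status-style tool with no input parameters.
server.tool("check-status", async () => {
  const res = await fetch("https://example.com/api/health"); // placeholder endpoint
  return {
    content: [{ type: "text", text: res.ok ? "API is alive" : "API is not responding" }],
  };
});

// Standard I/O transport: the client (Claude, Cursor, MCPHost) spawns this
// process and exchanges JSON-RPC messages over stdin/stdout.
await server.connect(new StdioServerTransport());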

Architecture

The codebase follows a modular structure:

src/
  ├── config/      # Configuration settings
  ├── types/       # TypeScript interfaces and types
  ├── tools/       # MCP tool implementations
  ├── utils/       # Utility functions
  ├── services/    # API service layer
  └── index.ts     # Main entry point

Testing

To run the test suite:

npm test

For development mode with watch:

npm run test:dev
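
If you want to add a test of your own, one possible shape (an assumption about the suite's style, not a copy of it) is to connect a client and a throwaway server through the SDK's in-memory transport and call a tool directly, using Node's built-in test runner:

import { test } from "node:test";
import assert from "node:assert/strict";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { InMemoryTransport } from "@modelcontextprotocol/sdk/inMemory.js";

test("check-status tool answers", async () => {
  // Throwaway server for this sketch; a real test would exercise the server from src/.
  const server = new McpServer({ name: "test-server", version: "0.0.0" });
  server.tool("check-status", async () => ({
    content: [{ type: "text", text: "API is alive" }],
  }));

  // Link client and server through paired in-memory transports.
  const [clientTransport, serverTransport] = InMemoryTransport.createLinkedPair();
  await server.connect(serverTransport);

  const client = new Client({ name: "test-client", version: "0.0.0" });
  await client.connect(clientTransport);

  const result = await client.callTool({ name: "check-status", arguments: {} });
  assert.match(JSON.stringify(result.content), /alive/);
});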

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Author

Erick Wendel

License

This project is licensed under the MIT License - see the LICENSE file for details.
