
GitHub Chat MCP

A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API. Official Site: https://github-chat.com

Installation

# Install with pip
pip install github-chat-mcp

# Or install with the newer uv package manager
uv pip install github-chat-mcp

Then start using it with Claude!

Example prompts:

  • "Use github-chat-mcp to analyze the React repository"
  • "Index the TypeScript repository with github-chat-mcp and ask about its architecture"

GitHub Chat MCP server


Setup Instructions

Before anything else, make sure you have a GitHub Chat API key; it is required to use the service.

Install uv first.

macOS/Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

Windows:

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
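Once uv is installed, you can optionally sanity-check the server from a terminal before wiring it into a client. A minimal sketch (macOS/Linux), assuming the server reads the GITHUB_API_KEY variable used in the dev setup below; the process starts and then waits for an MCP client on stdin, so exit with Ctrl+C:

GITHUB_API_KEY=API_KEY_HERE uvx github-chat-mcp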

Setup with Cursor (Recommended)

In mcp.json:

{
  "mcpServers": {
    "github-chat": {
      "command": "uvx",
      "args": [
        "github-chat-mcp"
      ]
    }
  }
}

With the configuration above, no environment variables are required, since this is a freemium release.
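If you do have a GitHub Chat API key, you can pass it to the server through an env block. This is only a sketch and assumes the server reads GITHUB_API_KEY, as in the manual dev setup below:

{
  "mcpServers": {
    "github-chat": {
      "command": "uvx",
      "args": ["github-chat-mcp"],
      "env": {
        "GITHUB_API_KEY": "API_KEY_HERE"
      }
    }
  }
}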

Setup with Claude Desktop

# claude_desktop_config.json
# Can find location through:
# Hamburger Menu -> File -> Settings -> Developer -> Edit Config
# Requires uv to be installed (e.g. brew install uv on macOS)
{
  "mcpServers": {
    "github-chat": {
      "command": "uvx",
      "args": ["github-chat-mcp"],
      "env": {
      }
    }
  }
}

Installing via Smithery

You can install GitHub Chat for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install github-chat-mcp --client claude

Using GitHub Chat with Claude

  1. Index a GitHub repository first: "Index the GitHub repository at https://github.com/username/repo"

  2. Then ask questions about the repository: "What is the core tech stack used in this repository?"

Debugging

Run:

npx @modelcontextprotocol/inspector uvx github-chat-mcp

Local/Dev Setup Instructions

Clone repo

git clone https://github.com/yourusername/github-chat-mcp.git

Install dependencies

Install uv first.

macOS/Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

Windows:

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

Then install MCP server dependencies:

cd github-chat-mcp

# Create virtual environment and activate it
uv venv

source .venv/bin/activate # macOS/Linux
# OR
.venv/Scripts/activate # Windows

# Install dependencies
uv sync

Setup with Claude Desktop

Using MCP CLI SDK

# `pip install mcp[cli]` if you haven't
mcp install /ABSOLUTE/PATH/TO/PARENT/FOLDER/github-chat-mcp/src/github_chat_mcp/server.py -v "GITHUB_API_KEY=API_KEY_HERE"

Manually

# claude_desktop_config.json
# Can find location through:
# Hamburger Menu -> File -> Settings -> Developer -> Edit Config
{
  "mcpServers": {
    "github-chat": {
      "command": "uv",
      "args": [
        "--directory",
        "/ABSOLUTE/PATH/TO/PARENT/FOLDER/github-chat-mcp",
        "run",
        "github-chat-mcp"
      ],
      "env": {
      }
    }
  }
}

Using GitHub Chat with Claude

  1. Index a GitHub repository first: "Index the GitHub repository at https://github.com/username/repo"

  2. Then ask questions about the repository: "What is the core tech stack used in this repository?"

Debugging

Run:

# If mcp cli installed (`pip install mcp[cli]`)
mcp dev /ABSOLUTE/PATH/TO/PARENT/FOLDER/github-chat-mcp/src/github_chat_mcp/server.py

# If not
npx @modelcontextprotocol/inspector \
      uv \
      --directory /ABSOLUTE/PATH/TO/PARENT/FOLDER/github-chat-mcp \
      run \
      github-chat-mcp

Then access MCP Inspector at http://localhost:5173. You may need to add your GitHub API key in the environment variables in the inspector under GITHUB_API_KEY.

Notes

  • The logging level can be adjusted through the FASTMCP_LOG_LEVEL environment variable (e.g. FASTMCP_LOG_LEVEL="ERROR")
  • This MCP server provides two main tools (a minimal client sketch for discovering them follows this list):
    1. Repository Indexing - Index and analyze a GitHub repository
    2. Repository Querying - Ask questions about the indexed repository
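For programmatic use, the tools can also be discovered and called through the official MCP Python SDK instead of Claude. The sketch below lists whatever tools the server exposes rather than hard-coding their names; it assumes the mcp package is installed (pip install "mcp[cli]") and that the server honors GITHUB_API_KEY, so treat it as a starting point rather than a reference implementation.

# discover_tools.py - minimal client sketch using the official MCP Python SDK
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server the same way the Cursor / Claude Desktop configs above do.
    params = StdioServerParameters(
        command="uvx",
        args=["github-chat-mcp"],
        # Inherit the current environment and add the key
        # (assumption: the server reads GITHUB_API_KEY, as in the dev setup above).
        env={**os.environ, "GITHUB_API_KEY": os.environ.get("GITHUB_API_KEY", "API_KEY_HERE")},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


if __name__ == "__main__":
    asyncio.run(main())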

