
User Prompt MCP

A Model Context Protocol (MCP) server for Cursor that enables requesting user input during generation. This is mostly AI-generated code.

Overview

This project implements an MCP server that allows Cursor (or any MCP-compatible client) to request additional input from users during model generation without ending the generation process. It serves as a bridge between the AI model and the user, creating a more interactive experience.
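
In rough terms, the flow is: Cursor launches the server as a subprocess, the model calls a tool the server exposes, the server blocks while it asks the user, and the user's answer comes back as the tool result so generation can continue. The sketch below illustrates only that idea; it is not the project's actual source. It handles only tools/call, the "prompt" argument name is an assumption, and the GUI dialog is stubbed out.

// Heavily simplified sketch of the stdio bridge idea (illustration only, not the real server).
package main

import (
	"bufio"
	"encoding/json"
	"os"
)

type rpcRequest struct {
	JSONRPC string          `json:"jsonrpc"`
	ID      any             `json:"id"`
	Method  string          `json:"method"`
	Params  json.RawMessage `json:"params"`
}

// askUser stands in for the GUI dialog (zenity/osascript, see Prerequisites below).
func askUser(question string) (string, error) {
	return "stubbed answer to: " + question, nil
}

func main() {
	in := bufio.NewScanner(os.Stdin)
	out := json.NewEncoder(os.Stdout)

	for in.Scan() { // stdio transport: one JSON-RPC message per line
		var req rpcRequest
		if json.Unmarshal(in.Bytes(), &req) != nil {
			continue
		}
		// A real server also answers initialize, tools/list, notifications, etc.
		if req.Method != "tools/call" {
			continue
		}
		var call struct {
			Name      string            `json:"name"`
			Arguments map[string]string `json:"arguments"`
		}
		_ = json.Unmarshal(req.Params, &call)

		// Block here until the user replies, so the model's generation can continue afterwards.
		answer, err := askUser(call.Arguments["prompt"]) // the "prompt" argument name is an assumption
		result := map[string]any{
			"content": []map[string]any{{"type": "text", "text": answer}},
			"isError": err != nil,
		}
		_ = out.Encode(map[string]any{"jsonrpc": "2.0", "id": req.ID, "result": result})
	}
}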

Features

  • User Input Prompting: Allows the AI to ask for more information during generation
  • Simple GUI: Presents input prompts in a dialog box with text wrapping
  • Cross-Platform: Should work on both Linux (tested) and macOS
  • Stdio Transport: Integration with Cursor via stdio

Installation

Prerequisites

  • For GUI functionality:
    • Linux: zenity
    • macOS: osascript (built-in)
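
To give a rough idea of how these dialog tools are driven, here is a minimal, self-contained Go sketch that shells out to zenity on Linux and osascript on macOS. The helper name, dialog titles, and output parsing are illustrative assumptions, not the project's actual code.

// Hypothetical sketch of a cross-platform input dialog using zenity/osascript.
package main

import (
	"fmt"
	"os/exec"
	"runtime"
	"strings"
)

// askUser opens a native input dialog and returns the text the user typed.
func askUser(question string) (string, error) {
	var cmd *exec.Cmd
	if runtime.GOOS == "darwin" {
		// macOS: osascript ships with the system.
		script := fmt.Sprintf(`display dialog %q default answer "" with title "Cursor is asking"`, question)
		cmd = exec.Command("osascript", "-e", script)
	} else {
		// Linux: zenity must be installed (see prerequisites above).
		cmd = exec.Command("zenity", "--entry", "--title=Cursor is asking", "--text="+question)
	}

	out, err := cmd.Output()
	if err != nil {
		return "", fmt.Errorf("dialog failed or was cancelled: %w", err)
	}
	answer := strings.TrimSpace(string(out))
	// osascript prints e.g. "button returned:OK, text returned:hello"; keep only the text part.
	if i := strings.Index(answer, "text returned:"); i >= 0 {
		answer = answer[i+len("text returned:"):]
	}
	return answer, nil
}

func main() {
	reply, err := askUser("Which file should I refactor next?")
	if err != nil {
		fmt.Println("no answer:", err)
		return
	}
	fmt.Println("user replied:", reply)
}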

Quick Install (Recommended)

The easiest way to install user-prompt-mcp is using our installation script, which automatically downloads the appropriate binary for your system:

# Install the latest version
curl -sSL https://raw.githubusercontent.com/nazar256/user-prompt-mcp/main/install.sh | bash

# Or install a specific version
curl -sSL https://raw.githubusercontent.com/nazar256/user-prompt-mcp/main/install.sh | bash -s -- -v v1.0.0

The installation script will:

  • Detect your OS and architecture (Linux or macOS, amd64 or arm64)
  • Download the appropriate pre-compiled binary
  • Install it to a suitable directory in your PATH
  • Make it executable

Install from source

If you prefer to build from source (requires Go 1.18+):

go install github.com/nazar256/user-prompt-mcp/cmd/user-prompt-mcp@latest

Manual Installation

You can also download pre-compiled binaries directly from the Releases page.

Usage with Cursor

  1. Open Cursor and go to Settings
  2. Navigate to the MCP Servers section and add a new server by editing JSON:
{
  "mcpServers": {
    "prompt": {
      "command": "user-prompt-mcp"
    }
  }
}
  3. Save and refresh

  4. Add something like this to your global rules: "Important: do not finish responding until explicitly told to do so. Instead, always use the MCP input_user_prompt and wait for a response in all cases; whether you feel stuck, have a question, or have finished work on a prompt, always communicate with the user through this MCP."

Configuration

The server can be configured using command-line flags or environment variables:

Timeout Configuration

By default, the server will wait 20 minutes for user input before timing out. You can customize this timeout using:

  • Command line flag: --timeout <seconds>
    user-prompt-mcp --timeout 600  # Set timeout to 10 minutes
    
  • Environment variable: USER_PROMPT_TIMEOUT=<seconds>
    export USER_PROMPT_TIMEOUT=1800  # Set timeout to 30 minutes
    user-prompt-mcp
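
The sketch below shows one way this kind of flag/environment-variable handling can be resolved in Go. The exact precedence (flag over environment variable) and parsing details are assumptions, not the project's actual implementation.

// timeout_config_sketch.go: a rough illustration of resolving the input timeout.
package main

import (
	"flag"
	"fmt"
	"os"
	"strconv"
	"time"
)

func resolveTimeout() time.Duration {
	const defaultSeconds = 20 * 60 // documented default of 20 minutes

	seconds := defaultSeconds
	// The environment variable provides a base value...
	if v := os.Getenv("USER_PROMPT_TIMEOUT"); v != "" {
		if n, err := strconv.Atoi(v); err == nil && n > 0 {
			seconds = n
		}
	}
	// ...and an explicit --timeout flag overrides it (precedence is an assumption).
	flagSeconds := flag.Int("timeout", seconds, "seconds to wait for user input")
	flag.Parse()
	return time.Duration(*flagSeconds) * time.Second
}

func main() {
	fmt.Println("waiting up to", resolveTimeout(), "for user input")
}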
    

Now when using Cursor, the AI can request additional input from you without ending its generation.

License

MIT
