User Prompt MCP
A Model Context Protocol (MCP) server for Cursor that enables requesting user input during generation. This is mostly AI-generated code.
Overview
This project implements an MCP server that allows Cursor (or any MCP-compatible client) to request additional input from users during model generation without ending the generation process. It serves as a bridge between the AI model and the user, creating a more interactive experience.
Features
- User Input Prompting: Allows the AI to ask for more information during generation
- Simple GUI: Presents input prompts in a dialog box with text wrapping
- Cross-Platform: Should work on both Linux (tested) and macOS
- Stdio Transport: Integration with Cursor via stdio
Installation
Prerequisites
- For GUI functionality:
  - Linux: zenity
  - macOS: osascript (built-in)
Quick Install (Recommended)
The easiest way to install user-prompt-mcp is using our installation script, which automatically downloads the appropriate binary for your system:
# Install the latest version
curl -sSL https://raw.githubusercontent.com/nazar256/user-prompt-mcp/main/install.sh | bash
# Or install a specific version
curl -sSL https://raw.githubusercontent.com/nazar256/user-prompt-mcp/main/install.sh | bash -s -- -v v1.0.0
The installation script will:
- Detect your OS and architecture (Linux or macOS, amd64 or arm64)
- Download the appropriate pre-compiled binary
- Install it to a suitable directory in your PATH
- Make it executable
Install from source
If you prefer to build from source (requires Go 1.18+):
go install github.com/nazar256/user-prompt-mcp/cmd/user-prompt-mcp@latest
Manual Installation
You can also download pre-compiled binaries directly from the Releases page.
Usage with Cursor
- Open Cursor and go to Settings
- Navigate to the MCP Servers section and add a new server by editing JSON:
{
  "mcpServers": {
    "prompt": {
      "command": "user-prompt-mcp"
    }
  }
}
- Save and refresh
- Add to global rules something like: "Important - do not finish responding until explicitly told to do so; instead, always use the MCP tool input_user_prompt and wait for a response. In all cases - whether you feel stuck, have a question, or have finished work on a prompt - always communicate with the user through this MCP."
Configuration
The server can be configured using command-line flags or environment variables:
Timeout Configuration
By default, the server will wait 20 minutes for user input before timing out. You can customize this timeout using:
- Command line flag: --timeout <seconds>
  user-prompt-mcp --timeout 600    # set timeout to 10 minutes
- Environment variable: USER_PROMPT_TIMEOUT=<seconds>
  export USER_PROMPT_TIMEOUT=1800  # set timeout to 30 minutes
  user-prompt-mcp
Now when using Cursor, the AI can request additional input from you without ending its generation.
License
MIT