# OpenAI Complete MCP Server

An MCP (Model Context Protocol) server that provides a clean interface for LLMs to use text completion capabilities through the MCP protocol. The server acts as a bridge between an LLM client and any OpenAI-compatible API. The primary use case is base models, since the server does not support chat completions.
## Features
- Provides a single tool named "complete" for generating text completions
- Properly handles asynchronous processing to avoid blocking
- Implements timeout handling with graceful fallbacks
- Supports cancellation of ongoing requests
## Installation

```bash
# Clone the repository
git clone <repository-url>
cd mcp-openai-complete

# Install dependencies
pnpm install

# Build the project
pnpm run build
```
## Configuration

The following environment variables are required:

```bash
OPENAI_API_KEY=your-hyperbolic-api-key
OPENAI_API_BASE=https://api.hyperbolic.xyz/v1
OPENAI_MODEL=meta-llama/Meta-Llama-3.1-405B
```
## Usage

Start the server:

```bash
pnpm start
```

This starts the server on stdio, making it available for MCP clients to communicate with.
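Since the server speaks MCP over stdio, a desktop MCP client can typically launch it directly. A sketch of such a client configuration entry follows; the server name, build output path, and `node` invocation are illustrative assumptions, not taken from this README:

```json
{
  "mcpServers": {
    "openai-complete": {
      "command": "node",
      "args": ["/path/to/mcp-openai-complete/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-api-key",
        "OPENAI_API_BASE": "https://api.hyperbolic.xyz/v1",
        "OPENAI_MODEL": "meta-llama/Meta-Llama-3.1-405B"
      }
    }
  }
}
```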
## Docker Usage

### Building the Docker Image

```bash
docker build -t mcp-openai-complete .
```

### Running the Container

```bash
# Run with environment variables
docker run -it --rm \
  -e OPENAI_API_KEY="your-api-key" \
  -e OPENAI_MODEL="gpt-3.5-turbo-instruct" \
  mcp-openai-complete
```

You can also use a `.env` file:

```bash
# Run with .env file
docker run -it --rm \
  --env-file .env \
  mcp-openai-complete
```
## Parameters for the "complete" tool

- `prompt` (string, required): The text prompt to complete
- `max_tokens` (integer, optional): Maximum tokens to generate; default: 150
- `temperature` (number, optional): Controls randomness (0-1); default: 0.7
- `top_p` (number, optional): Controls diversity via nucleus sampling; default: 1.0
- `frequency_penalty` (number, optional): Decreases repetition of token sequences; default: 0.0
- `presence_penalty` (number, optional): Increases the likelihood of introducing new topics; default: 0.0
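Over the wire, an MCP client invokes the tool with a standard `tools/call` JSON-RPC request. A sketch of such a request is shown below; the `id` and argument values are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "complete",
    "arguments": {
      "prompt": "Once upon a time",
      "max_tokens": 150,
      "temperature": 0.7
    }
  }
}
```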
## Development

For development with auto-reloading:

```bash
npm run dev
```
## License

MIT