
LLM-Model-Providers-MCP
An MCP server for retrieving the available models from each LLM provider
GitHub: 1 watcher · 1 fork · 0 stars
llm-model-providers MCP Server
Get available models from each LLM provider
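For orientation, the following is a minimal sketch of how a server like this can expose a model-listing tool over stdio, assuming the official @modelcontextprotocol/sdk for TypeScript. The tool name list_models, the skip-providers-without-a-key behavior, and the exact provider endpoints are illustrative assumptions, not this repository's actual implementation:

// Minimal sketch (not the repository source) of an MCP server that lists
// the models visible to each configured LLM provider.
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

const server = new Server(
  { name: "llm-model-providers", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

// Advertise a single tool; the name and empty input schema are assumptions.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "list_models",
      description: "Get available models from each configured LLM provider",
      inputSchema: { type: "object", properties: {} },
    },
  ],
}));

// Resolve the tool call by querying each provider whose API key is present in
// the environment (the same OPENAI_API_KEY / ANTHROPIC_API_KEY variables that
// appear in the Claude Desktop config below).
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name !== "list_models") {
    throw new Error(`Unknown tool: ${request.params.name}`);
  }

  const results: Record<string, unknown> = {};

  if (process.env.OPENAI_API_KEY) {
    const res = await fetch("https://api.openai.com/v1/models", {
      headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
    });
    results.openai = await res.json();
  }

  if (process.env.ANTHROPIC_API_KEY) {
    const res = await fetch("https://api.anthropic.com/v1/models", {
      headers: {
        "x-api-key": process.env.ANTHROPIC_API_KEY,
        "anthropic-version": "2023-06-01",
      },
    });
    results.anthropic = await res.json();
  }

  return {
    content: [{ type: "text", text: JSON.stringify(results, null, 2) }],
  };
});

// MCP servers launched by Claude Desktop communicate over stdio.
const transport = new StdioServerTransport();
await server.connect(transport);

In this sketch, a provider is only queried when its key is set, which mirrors how the env block in the configuration below is meant to be filled in.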
Development
Install dependencies:
pnpm install
Build the server:
pnpm run build
For development with auto-rebuild:
pnpm run watch
Installation
To use with Claude Desktop, add the server config:
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "llm-model-providers": {
      "command": "/path/to/llm-model-providers/build/index.js",
      "env": {
        "OPENAI_API_KEY": "",
        "ANTHROPIC_API_KEY": ""
      }
    }
  }
}
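Note that "command" points directly at the built build/index.js, which assumes the file carries a Node shebang and is executable; if it is not in your setup, a common alternative is to set "command" to "node" and pass the script path through an "args" array. The env block is where the provider API keys go: fill in OPENAI_API_KEY and ANTHROPIC_API_KEY for the providers you want the server to query.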
Debugging
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
pnpm run inspector
The Inspector will provide a URL to access debugging tools in your browser.
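If the inspector script is not present in your checkout, launching the standard MCP Inspector directly with npx @modelcontextprotocol/inspector node build/index.js should work as well (assuming the usual @modelcontextprotocol/inspector package).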