llm-mcp-server-template
A template project for LLM-MCP server development
- server: a number-comparison capability (a tool that reports which of two numbers is larger)
- client: the various models and the various ways to call the server
MCP
- https://modelcontextprotocol.io/introduction
- In simple terms:
- The model (client) is the brain: it plans and executes text tasks, e.g. step 1 write the code, step 2 run it, step 3 analyze the output.
- MCP (server) is the toolbox: it provides the capabilities the model lacks, such as math calculation, file I/O, and network requests.
MCP-Server
- Mainly implements MCP capabilities, such as math calculation, file I/O, and network requests
- https://modelcontextprotocol.io/quickstart/server
- During development, the MCP Inspector tool (mcp dev) can help with debugging
- Several types of server:
- 1. Local Stdio: the client launches and calls a local server file directly (see the sketch after this list)
- 2. Remote aggregation platforms over SSE, such as Glama.ai and MCP.so
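A minimal sketch of what a local Stdio server in this template could look like, using the Python MCP SDK's FastMCP helper. The tool name and the number-comparison logic are illustrative stand-ins for the repo's actual server/math.py:

# server/math.py (illustrative) — a local Stdio MCP server built with FastMCP
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("math")

@mcp.tool()
def compare(a: float, b: float) -> str:
    """Report which of two numbers is larger."""
    if a > b:
        return f"{a} is larger than {b}"
    if a < b:
        return f"{b} is larger than {a}"
    return f"{a} and {b} are equal"

if __name__ == "__main__":
    # Stdio transport: the client launches this file and talks to it over stdin/stdout
    mcp.run(transport="stdio")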
MCP-Client
- Mainly coordinates the model and the mcp-server, so that in the end "the model has called an MCP capability"
- Several types of client:
- 1. Use an off-the-shelf client application, such as Cursor or MCP Inspector.
- 2. Agents on cloud platforms, such as Alibaba Cloud Bailian application management.
- 3. Write your own script:
- 3.1. The model's SDK already supports MCP, so a script can call it directly:
- e.g. Claude: https://modelcontextprotocol.io/quickstart/client
- e.g. OpenAI: https://openai.github.io/openai-agents-python/mcp/
- 3.2. The model's SDK does not support MCP, so you implement the wiring yourself:
- 3.2.1. Use a framework, such as langchain-mcp-adapters (https://github.com/langchain-ai/langchain-mcp-adapters)
- 3.2.2. Fully native, as a multi-round request loop (see the sketch after this list):
- a. Prompt the model with the task and a description of the MCP servers; the model replies with the server tool to call and its arguments
- b. Using the model's reply, call the tools and collect the results
- c. Send the tool results back to the model and get its next reply
- d. Repeat b and c until the model returns a final answer
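A rough sketch of the fully native loop in 3.2.2, wiring the Python MCP SDK to the OpenAI chat completions tools API. The model name, server path, and example question are assumptions, and OPENAI_API_KEY must be set in the environment:

# client/native_loop.py (illustrative) — hand-rolled multi-round tool-calling loop
# Assumes OPENAI_API_KEY is set and server/math.py is the Stdio server sketched above.
import asyncio
import json

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from openai import OpenAI

openai_client = OpenAI()

async def main() -> None:
    server = StdioServerParameters(command="python", args=["../server/math.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # a. describe the MCP tools to the model as OpenAI "function" tools
            tools = [{
                "type": "function",
                "function": {
                    "name": t.name,
                    "description": t.description or "",
                    "parameters": t.inputSchema,
                },
            } for t in (await session.list_tools()).tools]
            messages = [{"role": "user", "content": "Which is larger, 3.11 or 3.9?"}]
            while True:
                reply = openai_client.chat.completions.create(
                    model="gpt-4o-mini", messages=messages, tools=tools,
                ).choices[0].message
                if not reply.tool_calls:  # d. no more tool calls: final answer
                    print(reply.content)
                    break
                messages.append(reply)
                for call in reply.tool_calls:  # b. run each requested tool on the MCP server
                    result = await session.call_tool(
                        call.function.name, json.loads(call.function.arguments))
                    messages.append({  # c. feed the tool result back to the model
                        "role": "tool",
                        "tool_call_id": call.id,
                        "content": result.content[0].text,
                    })

asyncio.run(main())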
About the project
Environment setup
# Install the uv package manager; skip if already installed
curl -LsSf https://astral.sh/uv/install.sh | sh
source ~/.local/bin/env
uv --version
# Install Python 3.10; skip if already installed
uv python list
uv python install 3.10
cd server
# In the server directory, create a virtual environment; skip if it already exists
uv venv --python 3.10
source .venv/bin/activate
uv add "mcp[cli]" httpx
# In the server directory, debug with the MCP Inspector (mcp dev) during development
mcp dev math.py
cd client
# In the client directory, create a virtual environment; skip if it already exists
uv venv --python 3.10
source .venv/bin/activate
uv add openai-agents socksio
touch .env # set OPENAI_API_KEY and other variables in the .env file
# In the client directory, run the client
source .venv/bin/activate
python openai_client.py
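For reference, a minimal sketch of what client/openai_client.py might contain, using the openai-agents SDK's built-in MCP support. The agent name, instructions, server path, and prompt are illustrative assumptions:

# client/openai_client.py (illustrative) — calling the MCP server via the openai-agents SDK
# Assumes OPENAI_API_KEY is set in .env / the environment.
import asyncio

from agents import Agent, Runner
from agents.mcp import MCPServerStdio

async def main() -> None:
    # Launch the template's math server over Stdio (path is illustrative)
    async with MCPServerStdio(params={"command": "python", "args": ["../server/math.py"]}) as server:
        agent = Agent(
            name="math-assistant",
            instructions="Use the MCP tools to answer the question.",
            mcp_servers=[server],
        )
        result = await Runner.run(agent, "Which is larger, 7 or 12?")
        print(result.final_output)

asyncio.run(main())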
Project results
MCP Server, MCP Inspector debug page:
MCP Client, ...
TODO
1. mcp-server: local development and debugging
2. mcp-server: publish to a remote hosting platform
3. mcp-client: call the server with the openai-agents SDK
4. mcp-client: call the local server with the anthropic SDK
5. mcp-client: call the server with langchain-mcp-adapters (sketched below)
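For TODO item 5, a sketch based on the langchain-mcp-adapters README; the exact API may differ between adapter versions, and the model name and server path are assumptions:

# client/langchain_client.py (illustrative) — calling the MCP server via langchain-mcp-adapters
# Assumes: uv add langchain-mcp-adapters langgraph "langchain[openai]", and OPENAI_API_KEY is set.
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

async def main() -> None:
    # Register the template's math server as a Stdio MCP server (path is illustrative)
    client = MultiServerMCPClient({
        "math": {
            "command": "python",
            "args": ["../server/math.py"],
            "transport": "stdio",
        }
    })
    tools = await client.get_tools()  # MCP tools wrapped as LangChain tools
    agent = create_react_agent("openai:gpt-4o-mini", tools)
    result = await agent.ainvoke({"messages": "Which is larger, 3.11 or 3.9?"})
    print(result["messages"][-1].content)

asyncio.run(main())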