MCP Workers AI
MCP servers SDK for Cloudflare Workers
Usage
Install:
yarn add mcp-workers-ai
# or
npm install -S mcp-workers-ai
Load the MCP server tools:
import { loadTools } from "mcp-workers-ai"

const tools = await loadTools([
  import("@modelcontextprotocol/server-gitlab"),
  import("@modelcontextprotocol/server-slack"),
  ...
]);

// Pass `tools` to the LLM inference request.
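For example, the loaded tool definitions can be passed straight into a Workers AI inference call. This is a minimal sketch that mirrors the demo worker below; the `env.AI` binding, the `prompt` string, and the model name are taken from that demo and are otherwise assumptions:

// Sketch only: pass the tool definitions from loadTools() to Workers AI.
// `env.AI` and `prompt` are assumed to exist (see the demo worker below).
const response = await env.AI.run(
  "@hf/nousresearch/hermes-2-pro-mistral-7b",
  {
    messages: [{ role: "user", content: prompt }],
    tools, // tool definitions loaded above
  },
);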
Call a tool:
import { callTool } from "mcp-workers-ai"

// Typically the LLM selects a tool to use.
const selected_tool = {
  arguments: {
    project_id: 'svensauleau/test',
    branch: 'main',
    files: [ ... ],
    commit_message: 'added unit tests'
  },
  name: 'push_files'
};

const res = await callTool(selected_tool)

// Pass `res` back into a LLM inference request.
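The tool result is then handed back to the model as a "tool" message so it can produce the final answer. This is a minimal sketch following the demo worker below; `env.AI`, `prompt`, and `tools` are assumed to be in scope:

// Sketch only: return the tool output to the model for a final answer.
const followUp = await env.AI.run(
  "@hf/nousresearch/hermes-2-pro-mistral-7b",
  {
    messages: [
      { role: "user", content: prompt },
      { role: "assistant", content: "", tool_call: selected_tool.name },
      { role: "tool", name: selected_tool.name, content: res.content[0].text },
    ],
    tools,
  },
);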
Demo
wrangler configuration:
name = "test"
main = "src/index.ts"

[ai]
binding = "AI"

[vars]
GITLAB_PERSONAL_ACCESS_TOKEN = "glpat-aaaaaaaaaaaaaaaaaaaa"

[alias]
"@modelcontextprotocol/sdk/server/index.js" = "mcp-workers-ai/sdk/server/index.js"
"@modelcontextprotocol/sdk/server/stdio.js" = "mcp-workers-ai/sdk/server/stdio.js"
Worker:
import { loadTools, callTool } from "mcp-workers-ai"

export default {
  async fetch(request: Request, env: any): Promise<Response> {
    // Make sure to set the token before importing the tools.
    process.env.GITLAB_PERSONAL_ACCESS_TOKEN = env.GITLAB_PERSONAL_ACCESS_TOKEN;

    const tools = await loadTools([
      import("@modelcontextprotocol/server-gitlab/dist/"),
    ]);

    // First inference: let the model decide whether to call a tool.
    const prompt = await request.text();
    const response = await env.AI.run(
      "@hf/nousresearch/hermes-2-pro-mistral-7b",
      {
        messages: [{ role: "user", content: prompt }],
        tools,
      },
    );

    if (response.tool_calls && response.tool_calls.length > 0) {
      const selected_tool = response.tool_calls[0];
      const res = await callTool(selected_tool)
      if (res.content.length > 1) {
        throw new Error("too many responses")
      }

      // Second inference: hand the tool output back to the model.
      const finalResponse = await env.AI.run(
        "@hf/nousresearch/hermes-2-pro-mistral-7b",
        {
          messages: [
            {
              role: "user",
              content: prompt,
            },
            {
              role: "assistant",
              content: "",
              tool_call: selected_tool.name,
            },
            {
              role: "tool",
              name: selected_tool.name,
              content: res.content[0].text,
            },
          ],
          tools,
        },
      );
      return new Response(finalResponse.response);
    } else {
      // No tool was selected; return the model's answer directly.
      return new Response(response.response);
    }
  }
};
Calling the AI:
$ curl http://example.com \
  -d "create a file called 'joke.txt' in my svensauleau/test project with your favorite joke on the main branch. Use the commit message 'added unit tests'"

I have successfully added a file called 'joke.txt' with a joke to your project 'svensauleau/test' on the main branch. The commit message used was 'added unit tests'. You can view the commit and the file in your project's repository.
Result: