# MCP Workers AI

An SDK for running MCP servers on Cloudflare Workers.
## Usage

Install:

```sh
yarn add mcp-workers-ai
# or
npm install -S mcp-workers-ai
```
Load the MCP server tools:

```ts
import { loadTools } from "mcp-workers-ai";

const tools = await loadTools([
  import("@modelcontextprotocol/server-gitlab"),
  import("@modelcontextprotocol/server-slack"),
  // ...
]);

// Pass `tools` to the LLM inference request.
```
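For context, the `tools` array is a list of function definitions with JSON-Schema parameters, which is the shape Workers AI expects for tool calling. A hedged sketch of what one entry might look like for the GitLab server's `push_files` tool (the exact fields produced by `loadTools` may differ):

```typescript
// Assumed shape of one entry in the `tools` array: a function
// definition whose arguments are described by a JSON Schema.
type ToolDefinition = {
  name: string;
  description: string;
  parameters: {
    type: "object";
    properties: Record<string, { type: string; description?: string }>;
    required?: string[];
  };
};

// Illustrative example only; the real schema comes from the MCP server.
const pushFilesTool: ToolDefinition = {
  name: "push_files",
  description: "Push multiple files to a GitLab repository in one commit",
  parameters: {
    type: "object",
    properties: {
      project_id: { type: "string" },
      branch: { type: "string" },
      commit_message: { type: "string" },
    },
    required: ["project_id", "branch", "commit_message"],
  },
};
```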
Call a tool:

```ts
import { callTool } from "mcp-workers-ai";

// Typically the LLM selects which tool to use.
const selected_tool = {
  arguments: {
    project_id: "svensauleau/test",
    branch: "main",
    files: [ ... ],
    commit_message: "added unit tests",
  },
  name: "push_files",
};

const res = await callTool(selected_tool);
// Pass `res` back into an LLM inference request.
```
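MCP tool results come back as an array of content blocks, which is why the demo reads `res.content[0].text`. A minimal helper (not part of this SDK, shown for illustration) that extracts the text of a single-block result:

```typescript
// Assumed shape of an MCP tool result: a list of content blocks.
type ToolResult = { content: Array<{ type: string; text?: string }> };

// Hypothetical helper: return the text of a single-block result,
// throwing if the tool returned anything else.
function singleText(res: ToolResult): string {
  if (res.content.length !== 1) {
    throw new Error("expected exactly one content block");
  }
  const block = res.content[0];
  if (block.type !== "text" || block.text === undefined) {
    throw new Error("expected a text block");
  }
  return block.text;
}
```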
## Demo

Wrangler configuration (`wrangler.toml`):

```toml
name = "test"
main = "src/index.ts"

[ai]
binding = "AI"

[vars]
GITLAB_PERSONAL_ACCESS_TOKEN = "glpat-aaaaaaaaaaaaaaaaaaaa"

[alias]
"@modelcontextprotocol/sdk/server/index.js" = "mcp-workers-ai/sdk/server/index.js"
"@modelcontextprotocol/sdk/server/stdio.js" = "mcp-workers-ai/sdk/server/stdio.js"
```
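A plaintext token in `[vars]` is fine for a throwaway demo, but for anything real the token should be stored as an encrypted Wrangler secret instead. The binding name stays the same, so the Worker code below is unchanged:

```shell
# Store the token as an encrypted secret instead of a plaintext var.
# It is exposed to the Worker as env.GITLAB_PERSONAL_ACCESS_TOKEN.
wrangler secret put GITLAB_PERSONAL_ACCESS_TOKEN
```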
Worker (`src/index.ts`):

```ts
import { loadTools, callTool } from "mcp-workers-ai";

export default {
  async fetch(request: Request, env: any): Promise<Response> {
    // Make sure to set the token before importing the tools.
    process.env.GITLAB_PERSONAL_ACCESS_TOKEN = env.GITLAB_PERSONAL_ACCESS_TOKEN;

    const tools = await loadTools([
      import("@modelcontextprotocol/server-gitlab/dist/"),
    ]);

    const prompt = await request.text();
    const response = await env.AI.run(
      "@hf/nousresearch/hermes-2-pro-mistral-7b",
      {
        messages: [{ role: "user", content: prompt }],
        tools,
      },
    );

    if (response.tool_calls && response.tool_calls.length > 0) {
      const selected_tool = response.tool_calls[0];
      const res = await callTool(selected_tool);
      if (res.content.length > 1) {
        throw new Error("too many responses");
      }

      const finalResponse = await env.AI.run(
        "@hf/nousresearch/hermes-2-pro-mistral-7b",
        {
          messages: [
            {
              role: "user",
              content: prompt,
            },
            {
              role: "assistant",
              content: "",
              tool_call: selected_tool.name,
            },
            {
              role: "tool",
              name: selected_tool.name,
              content: res.content[0].text,
            },
          ],
          tools,
        },
      );
      return new Response(finalResponse.response);
    } else {
      return new Response(response.response);
    }
  },
};
```
Calling the AI:

```sh
$ curl http://example.com \
  -d "create a file called 'joke.txt' in my svensauleau/test project with your favorite joke on the main branch. Use the commit message 'added unit tests'"
I have successfully added a file called 'joke.txt' with a joke to your project 'svensauleau/test' on the main branch. The commit message used was 'added unit tests'. You can view the commit and the file in your project's repository.
```