MCP Workers AI

MCP server SDK for Cloudflare Workers

2024-12-05

Usage

Install:

yarn add mcp-workers-ai
# or
npm install -S mcp-workers-ai

Load the MCP server tools:

import { loadTools } from "mcp-workers-ai"

const tools = await loadTools([
  import("@modelcontextprotocol/server-gitlab"),
  import("@modelcontextprotocol/server-slack"),
  ...
]);

// Pass `tools` to the LLM inference request.
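The tool definitions returned by `loadTools` are passed straight into the model request, so they are expected to follow the function-calling schema that Workers AI models accept. A minimal sketch of that shape (the interface name and fields below are illustrative assumptions, not the SDK's actual types):

```typescript
// Assumed shape of one entry in `tools` (JSON-Schema-style
// function-calling format; the real SDK output may differ).
interface ToolDefinition {
  name: string;
  description: string;
  parameters: {
    type: "object";
    properties: Record<string, { type: string; description?: string }>;
    required?: string[];
  };
}

// Hypothetical example modeled on the GitLab server's push_files tool.
const exampleTool: ToolDefinition = {
  name: "push_files",
  description: "Push multiple files to a GitLab project in one commit",
  parameters: {
    type: "object",
    properties: {
      project_id: { type: "string" },
      branch: { type: "string" },
      commit_message: { type: "string" },
    },
    required: ["project_id", "branch"],
  },
};
```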

Call a tool:

import { callTool } from "mcp-workers-ai"

// Typically the LLM selects a tool to use.
const selected_tool = {
  arguments: {
    project_id: 'svensauleau/test',
    branch: 'main',
    files: [ ... ],
    commit_message: 'added unit tests'
  },
  name: 'push_files'
};

const res = await callTool(selected_tool)

// Pass `res` back into an LLM inference request.
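Because the tool selection comes from model output, it is untrusted; a small guard before dispatching to `callTool` is a reasonable precaution. The helper below is a hypothetical addition, not part of the SDK:

```typescript
// Hypothetical guard: check that the model picked a tool we actually
// loaded before dispatching it.
interface ToolCall {
  name: string;
  arguments: Record<string, unknown>;
}

function isKnownTool(call: ToolCall, loadedToolNames: string[]): boolean {
  return loadedToolNames.includes(call.name);
}

const call: ToolCall = { name: "push_files", arguments: { branch: "main" } };
isKnownTool(call, ["push_files", "create_issue"]); // true
isKnownTool(call, ["create_issue"]); // false
```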

Demo

wrangler configuration (the [alias] entries remap the MCP SDK's Node-specific server modules to Worker-compatible implementations shipped with mcp-workers-ai):

name = "test"
main = "src/index.ts"

[ai]
binding = "AI"

[vars]
GITLAB_PERSONAL_ACCESS_TOKEN = "glpat-aaaaaaaaaaaaaaaaaaaa"

[alias]
"@modelcontextprotocol/sdk/server/index.js" = "mcp-workers-ai/sdk/server/index.js"
"@modelcontextprotocol/sdk/server/stdio.js" = "mcp-workers-ai/sdk/server/stdio.js"

Worker:

import { loadTools, callTool } from "mcp-workers-ai"

export default {
  async fetch(request: Request, env: any): Promise<Response> {
    // Make sure to set the token before importing the tools
    process.env.GITLAB_PERSONAL_ACCESS_TOKEN = env.GITLAB_PERSONAL_ACCESS_TOKEN;

    const tools = await loadTools([
      import("@modelcontextprotocol/server-gitlab/dist/"),
    ]);

    const prompt = await request.text();

    const response = await env.AI.run(
      "@hf/nousresearch/hermes-2-pro-mistral-7b",
      {
        messages: [{ role: "user", content: prompt }],
        tools,
      },
    );

    if (response.tool_calls && response.tool_calls.length > 0) {
      const selected_tool = response.tool_calls[0];
      const res = await callTool(selected_tool)

      if (res.content.length > 1) {
        throw new Error("too many responses")
      }

      const finalResponse = await env.AI.run(
        "@hf/nousresearch/hermes-2-pro-mistral-7b",
        {
          messages: [
            {
              role: "user",
              content: prompt,
            },
            {
              role: "assistant",
              content: "",
              tool_call: selected_tool.name,
            },
            {
              role: "tool",
              name: selected_tool.name,
              content: res.content[0].text,
            },
          ],
          tools,
        },
      );
      return new Response(finalResponse.response);

    } else {
      return new Response(response.response);
    }
  }
};

Calling the AI:

$ curl http://example.com \
  -d "create a file called 'joke.txt' in my svensauleau/test project with your favorite joke on the main branch. Use the commit message 'added unit tests'"

I have successfully added a file called 'joke.txt' with a joke to your project 'svensauleau/test' on the main branch. The commit message used was 'added unit tests'. You can view the commit and the file in your project's repository.

Result: demo
