Higress OPS MCP Server
A Model Context Protocol (MCP) server implementation that enables comprehensive configuration and management of Higress. This repository also provides an MCP client built on top of LangGraph and LangChain MCP Adapters, facilitating interaction with the Higress MCP Server through a well-designed agent flow architecture.
Demo
https://github.com/user-attachments/assets/bae66b77-a158-452e-9196-98060bac0df7
Configure Environment Variables
Copy the .env.example file to .env and fill in the corresponding values.
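For illustration only, the values can then be loaded at startup with python-dotenv; the variable names below are placeholders, not the actual keys defined in .env.example.

import os

from dotenv import load_dotenv

# Read the local .env file into the process environment.
load_dotenv()

# Placeholder names for illustration; use the keys listed in .env.example.
higress_console_url = os.getenv("HIGRESS_CONSOLE_URL")
higress_username = os.getenv("HIGRESS_USERNAME")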
Start MCP Client and MCP Server
In stdio mode, the MCP server process is started by the MCP client program. Run the following command to start the MCP client and MCP server:
uv run client.py
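For context, a stdio MCP client spawns the server as a subprocess and exchanges messages over stdin/stdout. The sketch below shows that handshake using the reference mcp Python SDK; the repository's client.py instead builds on LangGraph and the LangChain MCP Adapters, and the server entry point name here is an assumption.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the MCP server as a child process (entry point assumed to be server.py).
    params = StdioServerParameters(command="uv", args=["run", "server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())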
Add a new tool
Step 1: Create a new tool class or extend an existing one
- Create a new file in the tools directory if adding a completely new tool category
- Or add your tool to an existing class if it fits an existing category
from typing import Dict, List, Any

from fastmcp import FastMCP


class YourTools:
    def register_tools(self, mcp: FastMCP):
        @mcp.tool()
        async def your_tool_function(arg1: str, arg2: int) -> List[Dict]:
            """
            Your tool description.

            Args:
                arg1: Description of arg1
                arg2: Description of arg2

            Returns:
                Description of the return value

            Raises:
                ValueError: If the request fails
            """
            # Implementation using self.higress_client to make API calls
            return self.higress_client.your_api_method(arg1, arg2)
Step 2: Add a new method to HigressClient if your tool needs to interact with the Higress Console API
- Add methods to utils/higress_client.py that encapsulate API calls
- Use the existing HTTP methods (get, put, post) for actual API communication; a sketch of such a wrapper follows the example below
def your_api_method(self, arg1: str, arg2: int) -> List[Dict]:
    """
    Description of what this API method does.

    Args:
        arg1: Description of arg1
        arg2: Description of arg2

    Returns:
        Response data

    Raises:
        ValueError: If the request fails
    """
    path = "/v1/your/api/endpoint"
    data = {"arg1": arg1, "arg2": arg2}
    return self.put(path, data)  # or self.get(path) or self.post(path, data)
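For orientation, the get, put, and post helpers wrap plain HTTP calls against the Higress Console API and raise ValueError on failure, matching the docstrings above. The sketch below is a hypothetical shape of that wrapper, not the repository's actual utils/higress_client.py.

import requests


class HigressClient:
    # Hypothetical sketch; the real client's constructor, auth handling, and
    # error reporting may differ.
    def __init__(self, base_url: str, session: requests.Session):
        self.base_url = base_url
        self.session = session

    def put(self, path: str, data: dict):
        response = self.session.put(f"{self.base_url}{path}", json=data)
        if not response.ok:
            # Matches the ValueError contract documented by the tool methods.
            raise ValueError(f"PUT {path} failed: {response.status_code} {response.text}")
        return response.json()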
Step 3: Register your tool class in the server
- Add your tool class to the tool_classes list in server.py
- This list is used by ToolsRegister to instantiate and register all tools
- The ToolsRegister will automatically set logger and higress_client attributes (a sketch of this registration loop follows the list below)
tool_classes = [
    CommonTools,
    RequestBlockTools,
    RouteTools,
    ServiceSourceTools,
    YourTools,  # Add your tool class here
]
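The registration loop itself behaves roughly as sketched below: it instantiates each class, injects the shared logger and higress_client, and calls register_tools. This is a hypothetical reconstruction of the behavior described above, not the repository's actual ToolsRegister code.

class ToolsRegister:
    # Hypothetical sketch; the real implementation may differ in detail.
    def __init__(self, logger, higress_client, mcp):
        self.logger = logger
        self.higress_client = higress_client
        self.mcp = mcp

    def register_all_tools(self, tool_classes):
        for tool_class in tool_classes:
            instance = tool_class()
            # Inject shared dependencies so tools can reach the Higress Console API.
            instance.logger = self.logger
            instance.higress_client = self.higress_client
            instance.register_tools(self.mcp)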
Step 4: Add your tool to SENSITIVE_TOOLS if it requires human confirmation
- Tools in this list will require human confirmation before execution; a sketch of such a gate follows the list below
# Define write operations that require human confirmation
SENSITIVE_TOOLS = [
    "add_route",
    "add_service_source",
    "update_route",
    "update_request_block_plugin",
    "update_service_source",
    "your_tool_function",  # Add your tool name here if it requires confirmation
]
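The client holds any tool call whose name appears in SENSITIVE_TOOLS until a human approves it. The gate below is a simplified, hypothetical sketch of that check; the actual client wires the confirmation step into its LangGraph agent flow.

def needs_confirmation(tool_name: str) -> bool:
    # Write operations listed in SENSITIVE_TOOLS must be approved by a human.
    return tool_name in SENSITIVE_TOOLS


def confirm_tool_call(tool_name: str, tool_args: dict) -> bool:
    if not needs_confirmation(tool_name):
        return True
    answer = input(f"Execute sensitive tool '{tool_name}' with {tool_args}? [y/N] ")
    return answer.strip().lower() == "y"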