Higress OPS MCP Server
A Model Context Protocol (MCP) server implementation that enables comprehensive configuration and management of Higress. This repository also provides an MCP client built on top of LangGraph and LangChain MCP Adapters, facilitating interaction with the Higress MCP Server through a well-designed agent flow architecture.
Demo
https://github.com/user-attachments/assets/bae66b77-a158-452e-9196-98060bac0df7
Configure Environment Variables
Copy the .env.example file to .env and fill in the corresponding values.
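On a Unix-like shell this is simply:

cp .env.example .env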
Start MCP Client and MCP Server
In stdio mode, the MCP server process is started by the MCP client program. Run the following command to start the MCP client and MCP server:
uv run client.py
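In stdio transport the client spawns the server as a child process and exchanges MCP messages over its stdin/stdout. The repository's client.py builds on LangGraph and LangChain MCP Adapters; the following is only a minimal sketch of the same idea using the plain MCP Python SDK, where the server.py entry point and the uv invocation are assumptions for illustration:

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # The client launches the MCP server as a subprocess and talks to it over stdio.
    server_params = StdioServerParameters(command="uv", args=["run", "server.py"])
    async with stdio_client(server_params) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

asyncio.run(main())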
Add a new tool
Step 1: Create a new tool class or extend an existing one
- Create a new file in the tools directory if adding a completely new tool category
- Or add your tool to an existing class if it fits an existing category
from typing import Dict, List, Any
from fastmcp import FastMCP

class YourTools:
    def register_tools(self, mcp: FastMCP):
        @mcp.tool()
        async def your_tool_function(arg1: str, arg2: int) -> List[Dict]:
            """
            Your tool description.

            Args:
                arg1: Description of arg1
                arg2: Description of arg2

            Returns:
                Description of the return value

            Raises:
                ValueError: If the request fails
            """
            # Implementation using self.higress_client to make API calls
            return self.higress_client.your_api_method(arg1, arg2)
Step 2: Add a new method to HigressClient if your tool needs to interact with the Higress Console API
- Add methods to utils/higress_client.py that encapsulate API calls
- Use the existing HTTP methods (get, put, post) for actual API communication
def your_api_method(self, arg1: str, arg2: int) -> List[Dict]:
    """
    Description of what this API method does.

    Args:
        arg1: Description of arg1
        arg2: Description of arg2

    Returns:
        Response data

    Raises:
        ValueError: If the request fails
    """
    path = "/v1/your/api/endpoint"
    data = {"arg1": arg1, "arg2": arg2}
    return self.put(path, data)  # or self.get(path) or self.post(path, data)
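For orientation only, a client exposing get/put/post in this style could look roughly like the sketch below; the constructor arguments, basic-auth scheme, and error handling are assumptions for illustration, not the repository's actual utils/higress_client.py:

import requests
from typing import Any, Dict, Optional

class HigressClient:
    """Hypothetical sketch of a thin wrapper around the Higress Console API."""

    def __init__(self, base_url: str, username: str, password: str):
        self.base_url = base_url.rstrip("/")
        self.session = requests.Session()
        self.session.auth = (username, password)  # assumption: basic auth

    def _request(self, method: str, path: str, data: Optional[Dict[str, Any]] = None) -> Any:
        resp = self.session.request(method, f"{self.base_url}{path}", json=data)
        if resp.status_code >= 400:
            # Matches the docstring convention above: failures surface as ValueError.
            raise ValueError(f"{method} {path} failed: {resp.status_code} {resp.text}")
        return resp.json()

    def get(self, path: str) -> Any:
        return self._request("GET", path)

    def post(self, path: str, data: Dict[str, Any]) -> Any:
        return self._request("POST", path, data)

    def put(self, path: str, data: Dict[str, Any]) -> Any:
        return self._request("PUT", path, data)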
Step 3: Register your tool class in the server
- Add your tool class to the tool_classes list in server.py
- This list is used by ToolsRegister to instantiate and register all tools
- The ToolsRegister will automatically set logger and higress_client attributes
tool_classes = [
    CommonTools,
    RequestBlockTools,
    RouteTools,
    ServiceSourceTools,
    YourTools  # Add your tool class here
]
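To make the wiring concrete, the registration flow described above might be pictured like this (a hypothetical sketch; the repository's actual ToolsRegister may differ in names and details):

import logging
from fastmcp import FastMCP

class ToolsRegister:
    """Hypothetical sketch of how tool classes could be instantiated and wired up."""

    def __init__(self, logger: logging.Logger, higress_client, mcp: FastMCP):
        self.logger = logger
        self.higress_client = higress_client
        self.mcp = mcp

    def register_all(self, tool_classes):
        for cls in tool_classes:
            instance = cls()
            # Inject shared dependencies so tools can call self.higress_client.
            instance.logger = self.logger
            instance.higress_client = self.higress_client
            instance.register_tools(self.mcp)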
Step 4: Add your tool to SENSITIVE_TOOLS if it requires human confirmation
- Tools in this list will require human confirmation before execution
# Define write operations that require human confirmation
SENSITIVE_TOOLS = [
    "add_route",
    "add_service_source",
    "update_route",
    "update_request_block_plugin",
    "update_service_source",
    "your_tool_function"  # Add your tool name here if it requires confirmation
]
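Enforcement of the confirmation happens in the client's agent flow (built on LangGraph), which is not reproduced here. Purely as an illustration of the gating idea, assuming an MCP ClientSession named session, a check could look like:

async def maybe_confirm_and_call(session, tool_name: str, arguments: dict):
    """Ask a human before executing tools listed in SENSITIVE_TOOLS (illustrative only)."""
    if tool_name in SENSITIVE_TOOLS:
        answer = input(f"Execute sensitive tool '{tool_name}' with {arguments}? [y/N] ")
        if answer.strip().lower() != "y":
            return None  # skip the call if the human declines
    return await session.call_tool(tool_name, arguments)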