
gomcptest
A proof of concept demonstrating a custom host that implements an OpenAI-compatible API with Google Vertex AI, function calling, and interaction with MCP servers.
gomcptest: Proof of Concept for MCP with Custom Host
This project is a proof of concept (POC) demonstrating how to implement a Model Context Protocol (MCP) with a custom-built host to play with agentic systems. The code is primarily written from scratch to provide a clear understanding of the underlying mechanisms.
Goal
The primary goal of this project is to enable easy testing of agentic systems through the Model Context Protocol. For example:
- The `dispatch_agent` could be specialized to scan codebases for security vulnerabilities
- Create code review agents that can analyze pull requests for potential issues
- Build data analysis agents that process and visualize complex datasets
- Develop automated documentation agents that can generate comprehensive docs from code
These specialized agents can be easily tested and iterated upon using the tools provided in this repository.
Prerequisites
- Go >= 1.21
- Access to the Vertex AI API on Google Cloud Platform
- `github.com/mark3labs/mcp-go`

The tools use the default GCP login credentials configured by `gcloud auth login`.
Project Structure
- `host/openaiserver`: Implements a custom host that mimics the OpenAI API, using Google Gemini and function calling. This is the core of the POC.
- `tools`: Contains various MCP-compatible tools that can be used with the host:
  - Bash: Execute bash commands
  - Edit: Edit file contents
  - GlobTool: Find files matching glob patterns
  - GrepTool: Search file contents with regular expressions
  - LS: List directory contents
  - Replace: Replace entire file contents
  - View: View file contents
Components
Key Features
- OpenAI Compatibility: The API is designed to be compatible with the OpenAI v1 chat completion format.
- Google Gemini Integration: It utilizes the VertexAI API to interact with Google Gemini models.
- Streaming Support: The server supports streaming responses.
- Function Calling: Allows Gemini to call external functions and incorporate their results into chat responses.
- MCP Server Interaction: Demonstrates interaction with MCP (Model Context Protocol) servers for tool execution.
- Single Chat Session: The application uses a single chat session; starting a new conversation does not create a new session.
Building the Tools
You can build all the tools using the included Makefile:
```shell
# Build all tools
make all

# Or build individual tools
make Bash
make Edit
make GlobTool
make GrepTool
make LS
make Replace
make View
```
Configuration
Read the `.envrc` file in the `bin` directory to set up the required environment variables:
```shell
export GCP_PROJECT=your-project-id
export GCP_REGION=your-region
export GEMINI_MODELS=gemini-2.0-flash
export IMAGEN_MODELS=imagen-3.0-generate-002
export IMAGE_DIR=/tmp/images
```
Testing the CLI
You can test the CLI (a tool similar to Claude Code) from the `bin` directory with:

```shell
./cliGCP -mcpservers "./GlobTool;./GrepTool;./LS;./View;./dispatch_agent -glob-path ./GlobTool -grep-path ./GrepTool -ls-path ./LS -view-path ./View;./Bash;./Replace"
```
Caution
⚠️ WARNING: These tools have the ability to execute commands and modify files on your system. They should preferably be used in a chroot or container environment to prevent potential damage to your system.
Quickstart
This guide will help you quickly run the `openaiserver` located in the `host/openaiserver` directory.
Prerequisites
- Go installed and configured.
- Environment variables properly set.
Running the Server
- Navigate to the `host/openaiserver` directory:

  ```shell
  cd host/openaiserver
  ```

- Set the required environment variables. Refer to the Configuration section for details on the environment variables. A minimal example:

  ```shell
  export GCP_PROJECT=your-gcp-project-id
  export IMAGE_DIR=/tmp/images  # Directory must exist
  ```

- Run the server:

  ```shell
  go run .
  ```

  or

  ```shell
  go run main.go
  ```
The server will start and listen on the configured port (default: 8080).
Configuration
The `openaiserver` application is configured using environment variables. The following variables are supported:
Global Configuration
| Variable | Description | Default | Required |
|---|---|---|---|
| PORT | The port the server listens on | 8080 | No |
| LOG_LEVEL | Log level (DEBUG, INFO, WARN, ERROR) | INFO | No |
| IMAGE_DIR | Directory to store images | | Yes |
GCP Configuration
| Variable | Description | Default | Required |
|---|---|---|---|
| GCP_PROJECT | Google Cloud Project ID | | Yes |
| GEMINI_MODELS | Comma-separated list of Gemini models | gemini-1.5-pro,gemini-2.0-flash | No |
| GCP_REGION | Google Cloud Region | us-central1 | No |
| IMAGEN_MODELS | Comma-separated list of Imagen models | | No |
| IMAGE_DIR | Directory to store images | | Yes |
| PORT | The port the server listens on | 8080 | No |
Notes
- This is a POC and has limitations.
- The code is provided as is for educational purposes to understand how to implement MCP with a custom host.