
An AI agent that retrieves weather data from the MCP server to provide automated forecasts. Ideal for integration into weather-related applications.


Gemini API with MCP Tool Integration

This project demonstrates how to integrate the Google Gemini API with custom tools managed through MCP (Model Context Protocol). It uses the Gemini API to process natural-language queries and leverages MCP tools to execute specific actions based on the query's intent.

Prerequisites

Before running this project, ensure you have the following:

  • Python 3.10 or higher (required by the mcp package)

  • A Google Cloud project with the Gemini API enabled and an API key.

  • An MCP environment set up with the necessary tools.

  • .env file with the following environment variables:

    GEMINI_API_KEY=<your_gemini_api_key>
    GEMINI_MODEL=<your_gemini_model_name>
    MCP_RUNNER=<path_to_mcp_runner>
    MCP_SCRIPT=<path_to_mcp_script>
    
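For reference, a minimal sketch of how these variables can be loaded and checked with python-dotenv (the exact validation logic in main.py may differ):

    import os

    from dotenv import load_dotenv

    # Load the .env file and fail fast if any required variable is missing.
    load_dotenv()
    REQUIRED = ("GEMINI_API_KEY", "GEMINI_MODEL", "MCP_RUNNER", "MCP_SCRIPT")
    missing = [name for name in REQUIRED if not os.getenv(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")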

Installation

  1. Clone the repository:

    git clone <repository_url>
    cd <repository_directory>
    
  2. Create a virtual environment (recommended):

    python3 -m venv venv
    source venv/bin/activate   # On macOS/Linux
    venv\Scripts\activate      # On Windows
    
    
  3. Install the required dependencies using uv:

    uv pip install python-dotenv google-generativeai "mcp[cli]" httpx
    
  4. Create a .env file in the project root and add your environment variables.

    GEMINI_API_KEY=your_api_key_here
    GEMINI_MODEL=gemini-pro
    MCP_RUNNER=path_to_mcp_runner
    MCP_SCRIPT=path_to_mcp_script

Usage

To run the application, execute the following command:

    python main.py

How It Works

  1. The application loads environment variables and validates their presence
  2. Establishes a connection with the MCP client
  3. Retrieves available tools from the MCP session
  4. Sends the prompt to Gemini's API along with tool definitions
  5. Processes any tool calls made by the model
  6. Returns the final response that includes results from tool calls
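
As a rough illustration of this flow (not the project's actual main.py; the prompt text and overall structure here are assumptions), the steps above map onto the mcp and google-generativeai client APIs roughly as follows:

    import asyncio
    import os

    import google.generativeai as genai
    from dotenv import load_dotenv
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client


    async def run(prompt: str) -> str:
        # 1. Load environment variables (validation omitted for brevity).
        load_dotenv()
        genai.configure(api_key=os.environ["GEMINI_API_KEY"])

        # 2. Establish a connection with the MCP server over stdio.
        server = StdioServerParameters(
            command=os.environ["MCP_RUNNER"], args=[os.environ["MCP_SCRIPT"]]
        )
        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()

                # 3. Retrieve the available tools and convert their JSON Schema
                #    input definitions into Gemini function declarations
                #    (complex schema keywords may need to be stripped first).
                mcp_tools = (await session.list_tools()).tools
                declarations = [
                    {
                        "name": t.name,
                        "description": t.description or "",
                        "parameters": {
                            "type": "object",
                            "properties": t.inputSchema.get("properties", {}),
                            "required": t.inputSchema.get("required", []),
                        },
                    }
                    for t in mcp_tools
                ]

                # 4. Send the prompt to Gemini along with the tool definitions.
                model = genai.GenerativeModel(
                    os.environ["GEMINI_MODEL"],
                    tools=[{"function_declarations": declarations}],
                )
                chat = model.start_chat()
                response = chat.send_message(prompt)

                # 5. Execute any tool calls requested by the model and feed the
                #    results back so it can compose its final answer.
                for part in response.candidates[0].content.parts:
                    call = getattr(part, "function_call", None)
                    if not call or not call.name:
                        continue
                    result = await session.call_tool(call.name, dict(call.args))
                    tool_text = "".join(getattr(c, "text", "") for c in result.content)
                    response = chat.send_message(
                        genai.protos.Content(parts=[genai.protos.Part(
                            function_response=genai.protos.FunctionResponse(
                                name=call.name, response={"result": tool_text}
                            )
                        )])
                    )

                # 6. The final response text incorporates the tool results.
                return response.text


    if __name__ == "__main__":
        print(asyncio.run(run("What's the weather forecast for London tomorrow?")))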

Customization

To customize the prompt or behavior:

  1. Modify the prompt variable with your desired text
  2. Adjust the get_contents() function to change how prompts are formatted
  3. Extend process_response() to handle different response types
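
As an illustration only (the real signatures of get_contents() and process_response() live in main.py and may differ), an extended process_response() might separate plain text parts from pending tool calls like this:

    def process_response(response):
        """Split a Gemini response into text parts and pending tool calls."""
        texts, calls = [], []
        for part in response.candidates[0].content.parts:
            call = getattr(part, "function_call", None)
            if call and call.name:
                calls.append(call)       # to be executed via session.call_tool()
            elif getattr(part, "text", ""):
                texts.append(part.text)  # extend here for other part types
        return "".join(texts), calls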

License

MIT License

Related Recommendations

  • Joshua Armstrong
  • Confidential guide on numerology and astrology, based on GG33 public information

  • https://suefel.com
  • Latest advice and best practices for custom GPT development.

  • Emmet Halm
  • Converts Figma frames into front-end code for various mobile frameworks.

  • Elijah Ng Shi Yi
  • Advanced software engineer GPT that excels through nailing the basics.

  • lumpenspace
  • Take an adjectivised noun, and create images making it progressively more adjective!

  • https://maiplestudio.com
  • Find Exhibitors, Speakers and more

  • tomoyoshi hirata
  • Sony α7III manual assistant

  • Carlos Ferrin
  • Find movies and series on streaming platforms.

  • Yusuf Emre Yeşilyurt
  • I find academic articles and books for research and literature reviews.

  • apappascs
  • Discover the most complete and up-to-date collection of MCP servers on the market. This repository serves as a centralized hub, offering an extensive catalog of open-source and proprietary MCP servers, complete with features, documentation links, and contributors.

  • ShrimpingIt
  • MicroPython-based I2C manipulation of the MCP-series GPIO expander, derived from AdaFruit_MCP230xx

  • jae-jae
  • MCP server for fetching web page content with the Playwright headless browser.

  • HiveNexus
  • An AI chat bot for small and medium-sized teams, supporting models such as DeepSeek, OpenAI, Claude, and Gemini.

  • ravitemer
  • A powerful Neovim plugin for managing MCP (Model Context Protocol) servers

  • patruff
  • A bridge between Ollama and MCP servers, enabling local LLMs to use Model Context Protocol tools

  • pontusab
  • The Cursor and Windsurf community; find rules and MCPs

  • JackKuo666
  • 🔍 Enable AI assistants to search and access PyPI package information through a simple MCP interface.

  • av
  • Effortlessly run LLM backends, APIs, frontends, and services with a single command.

Reviews

3 (1)

user_dlccgN7j
2025-04-18

The Weather-AI-Agent by hitechdk is a phenomenal tool for anyone interested in accurate and up-to-date weather information. Its innovative approach leverages AI to deliver precise forecasts and insights. The user interface is intuitive, and the support from the developer is commendable. As someone who relies heavily on weather updates, this agent has greatly enhanced my planning and decision-making processes. Highly recommend checking it out on GitHub!