2025-01-06

A Model Context Protocol server that provides access to BigQuery

GitHub: 1 watcher · 11 forks · 58 stars

BigQuery MCP server


A Model Context Protocol server that provides access to BigQuery. This server enables LLMs to inspect database schemas and execute queries.

Components

Tools

The server implements the following tools:

  • execute-query: Executes a SQL query using BigQuery dialect
  • list-tables: Lists all tables in the BigQuery database
  • describe-table: Describes the schema of a specific table

Configuration

The server can be configured with the following arguments:

  • --project (required): The GCP project ID.
  • --location (required): The GCP location (e.g. europe-west9).
  • --dataset (optional): Only take specific BigQuery datasets into consideration. Several datasets can be specified by repeating the argument (e.g. --dataset my_dataset_1 --dataset my_dataset_2). If not provided, all datasets in the project will be considered.
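
For example, a minimal invocation of the published server could look like the sketch below; the project and dataset names are placeholders, not values from this repository.

# Run the published server via uvx; replace the placeholder values with your own.
uvx mcp-server-bigquery \
  --project my-gcp-project \
  --location europe-west9 \
  --dataset my_dataset_1 \
  --dataset my_dataset_2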

Quickstart

Install

Installing via Smithery

To install BigQuery Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install mcp-server-bigquery --client claude

Claude Desktop

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Development/Unpublished Servers Configuration
"mcpServers": {
  "bigquery": {
    "command": "uv",
    "args": [
      "--directory",
      "{{PATH_TO_REPO}}",
      "run",
      "mcp-server-bigquery",
      "--project",
      "{{GCP_PROJECT_ID}}",
      "--location",
      "{{GCP_LOCATION}}"
    ]
  }
}
Published Servers Configuration
"mcpServers": {
  "bigquery": {
    "command": "uvx",
    "args": [
      "mcp-server-bigquery",
      "--project",
      "{{GCP_PROJECT_ID}}",
      "--location",
      "{{GCP_LOCATION}}"
    ]
  }
}

Replace {{PATH_TO_REPO}}, {{GCP_PROJECT_ID}}, and {{GCP_LOCATION}} with the appropriate values.

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:
uv sync
  2. Build package distributions:
uv build

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:
uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD
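
As a sketch, a token-based publish might look like this (the token value is a placeholder):

# Export the token so uv publish can pick it up; alternatively pass it with --token.
export UV_PUBLISH_TOKEN=pypi-XXXXXXXXXXXX
uv publish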

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory {{PATH_TO_REPO}} run mcp-server-bigquery

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.
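
If you are debugging the published package rather than a local checkout, a similar invocation should work (the project and location values below are placeholders):

npx @modelcontextprotocol/inspector uvx mcp-server-bigquery --project my-gcp-project --location europe-west9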

Related Recommendations

  • https://suefel.com
  • Latest advice and best practices for custom GPT development.

  • Yusuf Emre Yeşilyurt
  • I find academic articles and books for research and literature reviews.

  • https://maiplestudio.com
  • Find Exhibitors, Speakers and more

  • Carlos Ferrin
  • Find movies and series on streaming platforms.

  • Joshua Armstrong
  • Confidential guide on numerology and astrology, based on GG33 public information

  • Emmet Halm
  • Converts Figma frames into front-end code for various mobile frameworks.

  • Elijah Ng Shi Yi
  • Advanced software engineer GPT that excels through nailing the basics.

  • lumpenspace
  • Take an adjectivised noun, and create images making it progressively more adjective!

  • https://appia.in
  • Siri Shortcut Finder – your go-to place for discovering amazing Siri Shortcuts with ease

  • apappascs
  • Discover the most complete and up-to-date collection of MCP servers on the market. This repository serves as a centralized hub, offering an extensive catalog of open-source and proprietary MCP servers, complete with features, documentation links, and contributors.

  • ShrimpingIt
  • MicroPython-based I2C manipulation of the MCP-series GPIO expander, derived from AdaFruit_MCP230xx

  • jae-jae
  • MCP server for fetching web page content using the Playwright headless browser.

  • Mintplex-Labs
  • The all-in-one desktop AI application and Docker image with built-in RAG, AI agents, a no-code agent builder, MCP compatibility, and more.

  • ravitemer
  • A powerful Neovim plugin for managing MCP (Model Context Protocol) servers

  • patruff
  • Bridge between Ollama and MCP servers, enabling local LLMs to use Model Context Protocol tools

  • pontusab
  • The Cursor and Windsurf community: find rules and MCPs

  • WangRongsheng
  • 🧑‍🚀 A summary of the world's best LLM resources (data processing, model training, model deployment, O1 models, MCP, small language models, vision-language models)

  • n8n-io
  • Fair-code workflow automation platform with native AI capabilities. Combine visual building with custom code, self-host or use the cloud, 400+ integrations.

  • av
  • Effortlessly run LLM backends, APIs, frontends, and services with a single command.

Reviews

3 (1)
user_BXKZ88Og
2025-04-17

The mcp-server-bigquery by LucasHild is a fantastic tool for integrating with BigQuery effortlessly. Its seamless connection and efficient query handling have greatly improved our data workflows. The straightforward setup and robust performance make it a must-have for any data engineer. Highly recommended!