
Mirror of https://github.com/lucashild/mcp-server-bigquery


BigQuery MCP server

A Model Context Protocol server that provides access to BigQuery. This server enables LLMs to inspect database schemas and execute queries.

Components

Tools

The server implements three tools; an example request follows the list:

  • execute-query: Executes a SQL query using BigQuery dialect
  • list-tables: Lists all tables in the BigQuery database
  • describe-table: Describes the schema of a specific table
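
As an illustration, an MCP client would invoke execute-query with a tools/call request along these lines. The query argument name and the table referenced are assumptions for this sketch, so check the tool's input schema via tools/list first:

```
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "execute-query",
    "arguments": {
      "query": "SELECT station_id, COUNT(*) AS rides FROM my_dataset.trips GROUP BY station_id LIMIT 10"
    }
  }
}
```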

Configuration

The server can be configured with the following arguments:

  • --project (required): The GCP project ID.
  • --location (required): The GCP location (e.g. europe-west9).
  • --dataset (optional): Only take specific BigQuery datasets into consideration. Several datasets can be specified by repeating the argument (e.g. --dataset my_dataset_1 --dataset my_dataset_2). If not provided, all tables in the project will be considered.
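
For example, a server restricted to two datasets could be started like this (the project ID, location, and dataset names are placeholder values); in practice this command usually goes into an MCP client configuration, as shown in the Quickstart below:

```
uvx mcp-server-bigquery \
  --project my-gcp-project \
  --location europe-west9 \
  --dataset my_dataset_1 \
  --dataset my_dataset_2
```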

Quickstart

Install

Claude Desktop

On MacOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Development/Unpublished Servers Configuration

```
"mcpServers": {
  "bigquery": {
    "command": "uv",
    "args": [
      "--directory",
      "{{PATH_TO_REPO}}",
      "run",
      "mcp-server-bigquery",
      "--project",
      "{{GCP_PROJECT_ID}}",
      "--location",
      "{{GCP_LOCATION}}"
    ]
  }
}
```

Published Servers Configuration

```
"mcpServers": {
  "bigquery": {
    "command": "uvx",
    "args": [
      "mcp-server-bigquery",
      "--project",
      "{{GCP_PROJECT_ID}}",
      "--location",
      "{{GCP_LOCATION}}"
    ]
  }
}
```

Replace {{PATH_TO_REPO}}, {{GCP_PROJECT_ID}}, and {{GCP_LOCATION}} with the appropriate values.
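
For instance, a complete claude_desktop_config.json for the published server might look like the following; the project ID, location, and dataset name are placeholder values:

```
{
  "mcpServers": {
    "bigquery": {
      "command": "uvx",
      "args": [
        "mcp-server-bigquery",
        "--project", "my-gcp-project",
        "--location", "europe-west9",
        "--dataset", "my_dataset_1"
      ]
    }
  }
}
```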

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:
     uv sync
  2. Build package distributions:
     uv build
     This will create source and wheel distributions in the dist/ directory.
  3. Publish to PyPI:
     uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD
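
For example, token-based publishing could be wired up like this (the token below is a placeholder, not a real credential):

```
export UV_PUBLISH_TOKEN="pypi-XXXXXXXX"  # placeholder; substitute your real PyPI token
uv publish
```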

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory {{PATH_TO_REPO}} run mcp-server-bigquery

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.
