
BigQuery MCP server
A Model Context Protocol server that provides access to BigQuery. This server enables LLMs to inspect database schemas and execute queries.
Components
Tools
The server implements the following tools:
- execute-query: Executes a SQL query using the BigQuery dialect
- list-tables: Lists all tables in the BigQuery database
- describe-table: Describes the schema of a specific table
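As a quick illustration, the sketch below calls these tools from the official MCP Python SDK, launching the published server over stdio. It assumes a hypothetical project my-gcp-project in europe-west9, and the tool argument names (such as query) are illustrative guesses; the authoritative parameter names come from the input schemas returned by list_tools().
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the published server over stdio (see Configuration below).
    server = StdioServerParameters(
        command="uvx",
        args=[
            "mcp-server-bigquery",
            "--project", "my-gcp-project",   # placeholder project ID
            "--location", "europe-west9",
        ],
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools and their input schemas.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Argument names below are assumptions; check the schemas printed above.
            tables = await session.call_tool("list-tables", arguments={})
            result = await session.call_tool(
                "execute-query", arguments={"query": "SELECT 1 AS one"}
            )
            print(tables.content, result.content)

asyncio.run(main())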
Configuration
The server can be configured with the following arguments:
- --project (required): The GCP project ID.
- --location (required): The GCP location (e.g. europe-west9).
- --dataset (optional): Only take specific BigQuery datasets into consideration. Several datasets can be specified by repeating the argument (e.g. --dataset my_dataset_1 --dataset my_dataset_2). If not provided, all datasets in the project will be considered.
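For example, a published install could be launched from the command line like this (the project ID and dataset names are placeholders):
uvx mcp-server-bigquery --project my-gcp-project --location europe-west9 --dataset my_dataset_1 --dataset my_dataset_2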
Quickstart
Install
Installing via Smithery
To install BigQuery Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install mcp-server-bigquery --client claude
Claude Desktop
On MacOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
Development/Unpublished Servers Configuration
"mcpServers": {
"bigquery": {
"command": "uv",
"args": [
"--directory",
"{{PATH_TO_REPO}}",
"run",
"mcp-server-bigquery",
"--project",
"{{GCP_PROJECT_ID}}",
"--location",
"{{GCP_LOCATION}}"
]
}
}
Published Servers Configuration
"mcpServers": {
"bigquery": {
"command": "uvx",
"args": [
"mcp-server-bigquery",
"--project",
"{{GCP_PROJECT_ID}}",
"--location",
"{{GCP_LOCATION}}"
]
}
}
Replace {{PATH_TO_REPO}}, {{GCP_PROJECT_ID}}, and {{GCP_LOCATION}} with the appropriate values.
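For example, a published-server entry with the placeholders filled in for a hypothetical project my-gcp-project in europe-west9 would look like this:
"mcpServers": {
  "bigquery": {
    "command": "uvx",
    "args": [
      "mcp-server-bigquery",
      "--project",
      "my-gcp-project",
      "--location",
      "europe-west9"
    ]
  }
}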
Development
Building and Publishing
To prepare the package for distribution:
- Sync dependencies and update lockfile: uv sync
- Build package distributions: uv build. This will create source and wheel distributions in the dist/ directory.
- Publish to PyPI: uv publish
Note: You'll need to set PyPI credentials via environment variables or command flags:
- Token: --token or UV_PUBLISH_TOKEN
- Or username/password: --username / UV_PUBLISH_USERNAME and --password / UV_PUBLISH_PASSWORD
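For example, publishing with a PyPI API token might look like this (the token value is a placeholder):
export UV_PUBLISH_TOKEN=pypi-XXXXXXXXXXXX
uv publish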
Debugging
Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.
You can launch the MCP Inspector via npm with this command:
npx @modelcontextprotocol/inspector uv --directory {{PATH_TO_REPO}} run mcp-server-bigquery
Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.
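If you are debugging a published install rather than a local checkout, the same Inspector pattern should work against uvx (the project and location values are placeholders):
npx @modelcontextprotocol/inspector uvx mcp-server-bigquery --project my-gcp-project --location europe-west9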