
BigQuery MCP Server
A Model Context Protocol (MCP) server for accessing Google BigQuery. This server enables Large Language Models (LLMs) to understand BigQuery dataset structures and execute SQL queries.
Features
Authentication and Connection Management
- Supports Application Default Credentials (ADC) or service account key files
- Configurable project ID and location settings
- Authentication verification on startup
Tools
- query
  - Execute read-only (SELECT) BigQuery SQL queries
  - Configurable maximum results and bytes billed
  - Security checks to prevent non-SELECT queries
- list_all_datasets
  - List all datasets in the project
  - Returns an array of dataset IDs
- list_all_tables_with_dataset
  - List all tables in a specific dataset with their schemas
  - Requires a datasetId parameter
  - Returns table IDs, schemas, time partitioning information, and descriptions
- get_table_information
  - Get table schema and sample data (up to 20 rows)
  - Supports partitioned tables with partition filters
  - Warns when a partitioned table is queried without a partition filter
- dry_run_query
  - Check query validity and estimate cost without executing the query
  - Returns processing size and estimated cost
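To illustrate how get_table_information might form its partition-filtered sample query, here is a sketch. The function name and the exact SQL are assumptions (the real query text lives in src/tools/table-info.ts), and a `_PARTITIONDATE` filter applies only to ingestion-time-partitioned tables:

```typescript
// Sketch: build the 20-row sample query for an ingestion-time-partitioned
// table, given a partition in YYYYMMDD form (e.g. "20250101").
// Illustrative only; the server's actual SQL may differ.
function buildSampleQuery(datasetId: string, tableId: string, partition?: string): string {
  const base = `SELECT * FROM \`${datasetId}.${tableId}\``;
  if (!partition) return `${base} LIMIT 20`;
  // Convert "20250101" -> "2025-01-01" for a _PARTITIONDATE filter.
  const date = `${partition.slice(0, 4)}-${partition.slice(4, 6)}-${partition.slice(6, 8)}`;
  return `${base} WHERE _PARTITIONDATE = "${date}" LIMIT 20`;
}
```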
Security Features
- Only SELECT queries are allowed (read-only access)
- Default limit of 500 GB processed per query to prevent excessive costs
- Partition filter recommendations for partitioned tables
- Secure handling of authentication credentials
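The read-only check can be approximated with a keyword scan. A minimal sketch, assuming names of my own choosing (the actual implementation lives in src/utils/query-utils.ts and may be stricter or more lenient):

```typescript
// Sketch of a SELECT-only guard: reject anything that does not start with
// SELECT or WITH, or that contains a DML/DDL keyword anywhere.
// Deliberately over-strict: a keyword inside a string literal is also rejected.
const FORBIDDEN = /\b(INSERT|UPDATE|DELETE|MERGE|CREATE|ALTER|DROP|TRUNCATE|GRANT|REVOKE)\b/i;

function isReadOnlySelect(sql: string): boolean {
  // Strip line and block comments so keywords hidden in them are not missed.
  const stripped = sql
    .replace(/--.*$/gm, "")
    .replace(/\/\*[\s\S]*?\*\//g, "")
    .trim();
  const startsReadOnly = /^(SELECT|WITH)\b/i.test(stripped);
  return startsReadOnly && !FORBIDDEN.test(stripped);
}
```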
Installation
Local Installation
# Clone the repository
git clone https://github.com/yourusername/bigquery-mcp-server.git
cd bigquery-mcp-server
# Install dependencies
bun install
# Build the server
bun run build
# Copy the binary to a directory on your PATH
cp dist/bigquery-mcp-server /path/to/your_place
Docker Installation
You can also run the server in a Docker container:
# Build the Docker image
docker build -t bigquery-mcp-server .
# Run the container
docker run -it --rm \
bigquery-mcp-server \
--project-id=your-project-id
Or using Docker Compose:
# Edit docker-compose.yml to set your project ID and other options
# Then run:
docker-compose up
MCP Configuration
To use this server with an MCP-enabled LLM, add it to your MCP configuration:
{
  "mcpServers": {
    "BigQuery": {
      "command": "/path/to/dist/bigquery-mcp-server",
      "args": [
        "--project-id",
        "your-project-id",
        "--location",
        "asia-northeast1",
        "--max-results",
        "1000",
        "--max-bytes-billed",
        "500000000000"
      ],
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/service-account-key.json"
      }
    }
  }
}
You can also use Application Default Credentials instead of a service account key file:
{
  "mcpServers": {
    "BigQuery": {
      "command": "/path/to/dist/bigquery-mcp-server",
      "args": [
        "--project-id",
        "your-project-id",
        "--location",
        "asia-northeast1",
        "--max-results",
        "1000",
        "--max-bytes-billed",
        "500000000000"
      ]
    }
  }
}
Setting up Application Default Credentials
To authenticate using Application Default Credentials:
1. Install the Google Cloud SDK if you haven't already:
   # For macOS
   brew install --cask google-cloud-sdk
   # For other platforms, see: https://cloud.google.com/sdk/docs/install
2. Run the authentication command:
   gcloud auth application-default login
3. Follow the prompts to log in with your Google account that has access to the BigQuery project.
4. The credentials will be saved to your local machine and automatically used by the BigQuery MCP server.
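Inside the server, choosing between the two authentication modes reduces to one branch: pass the key file to the client when --key-file is given, otherwise let the client library fall back to ADC. A minimal sketch, where the option names follow the @google-cloud/bigquery constructor but the types and function are my own (the real wiring may differ):

```typescript
// Sketch: derive BigQuery client options from the server's CLI settings.
// With keyFile set, the client authenticates with that service account key;
// without it, the library falls back to Application Default Credentials.
interface AuthSettings {
  projectId: string;
  keyFile?: string;
}

function toClientOptions(settings: AuthSettings): { projectId: string; keyFilename?: string } {
  return settings.keyFile
    ? { projectId: settings.projectId, keyFilename: settings.keyFile }
    : { projectId: settings.projectId };
}
```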
Testing
You can use the MCP Inspector for testing and debugging:
npx @modelcontextprotocol/inspector dist/bigquery-mcp-server --project-id={{your_own_project}}
Usage
Using the Helper Script
The included run-server.sh script makes it easy to start the server with common configurations:
# Make the script executable
chmod +x run-server.sh
# Run with Application Default Credentials
./run-server.sh --project-id=your-project-id
# Run with a service account key file
./run-server.sh \
--project-id=your-project-id \
--location=asia-northeast1 \
--key-file=/path/to/service-account-key.json \
--max-results=1000 \
--max-bytes-billed=500000000000
Manual Execution
You can also run the compiled binary directly:
# Run with Application Default Credentials
./dist/bigquery-mcp-server --project-id=your-project-id
# Run with a service account key file
./dist/bigquery-mcp-server \
--project-id=your-project-id \
--location=asia-northeast1 \
--key-file=/path/to/service-account-key.json \
--max-results=1000 \
--max-bytes-billed=500000000000
Example Client
An example Node.js client is included in the examples directory:
# Make the example executable
chmod +x examples/sample-query.js
# Edit the example to set your project ID
# Then run it
cd examples
./sample-query.js
Command Line Options
- --project-id: Google Cloud project ID (required)
- --location: BigQuery location (default: asia-northeast1)
- --key-file: Path to service account key file (optional)
- --max-results: Maximum rows to return (default: 1000)
- --max-bytes-billed: Maximum bytes to process (default: 500000000000, 500 GB)
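The options above suggest the shape of the flag parsing in src/utils/args-parser.ts. A hypothetical sketch, assuming both "--flag value" and "--flag=value" forms are accepted (the real parser may differ):

```typescript
// Sketch: parse server flags into a typed options object, applying the
// defaults listed above. "--flag=value" and "--flag value" are both accepted.
interface ServerOptions {
  projectId: string;
  location: string;
  keyFile?: string;
  maxResults: number;
  maxBytesBilled: number;
}

function parseArgs(argv: string[]): ServerOptions {
  const raw: Record<string, string> = {};
  for (let i = 0; i < argv.length; i++) {
    const arg = argv[i];
    if (!arg.startsWith("--")) continue;
    const eq = arg.indexOf("=");
    if (eq !== -1) {
      raw[arg.slice(2, eq)] = arg.slice(eq + 1); // --flag=value
    } else {
      raw[arg.slice(2)] = argv[++i]; // --flag value
    }
  }
  if (!raw["project-id"]) throw new Error("--project-id is required");
  return {
    projectId: raw["project-id"],
    location: raw["location"] ?? "asia-northeast1",
    keyFile: raw["key-file"],
    maxResults: Number(raw["max-results"] ?? 1000),
    maxBytesBilled: Number(raw["max-bytes-billed"] ?? 500000000000),
  };
}
```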
Required Permissions
The service account or user credentials should have one of the following:
- roles/bigquery.user (recommended)

Or both of these:
- roles/bigquery.dataViewer (for reading table data)
- roles/bigquery.jobUser (for executing queries)
Example Usage
Query Tool
{
  "query": "SELECT * FROM `project.dataset.table` LIMIT 10",
  "maxResults": 100
}
List All Datasets Tool
// No parameters required
List All Tables With Dataset Tool
{
  "datasetId": "your_dataset"
}
Get Table Information Tool
{
  "datasetId": "your_dataset",
  "tableId": "your_table",
  "partition": "20250101"
}
Dry Run Query Tool
{
  "query": "SELECT * FROM `project.dataset.table` WHERE date = '2025-01-01'"
}
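The cost estimate that dry_run_query returns can be derived from the bytes the dry run reports. A sketch, assuming the common on-demand rate; pricing varies by region and changes over time, so treat USD_PER_TIB as a placeholder rather than the server's actual figure:

```typescript
// Sketch: turn the totalBytesProcessed figure from a BigQuery dry run into
// a human-readable size and a rough cost estimate.
// USD_PER_TIB is an assumption; check current on-demand pricing for your region.
const USD_PER_TIB = 6.25;
const TIB = 1024 ** 4;

function estimateDryRunCost(totalBytesProcessed: number): { gigabytes: string; usd: string } {
  return {
    gigabytes: (totalBytesProcessed / 1e9).toFixed(2), // decimal GB
    usd: ((totalBytesProcessed / TIB) * USD_PER_TIB).toFixed(4),
  };
}
```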
Error Handling
The server provides detailed error messages for:
- Authentication failures
- Permission issues
- Invalid queries
- Missing partition filters
- Excessive data processing requests
Code Structure
The server is organized into the following structure:
src/
├── index.ts              # Entry point
├── server.ts             # BigQueryMcpServer class
├── types.ts              # Type definitions
├── tools/                # Tool implementations
│   ├── query.ts          # query tool
│   ├── list-datasets.ts  # list_all_datasets tool
│   ├── list-tables.ts    # list_all_tables_with_dataset tool
│   ├── table-info.ts     # get_table_information tool
│   └── dry-run.ts        # dry_run_query tool
└── utils/                # Utility functions
    ├── args-parser.ts    # Command line argument parser
    └── query-utils.ts    # Query validation and response formatting
License
MIT