MCP Toolbox for Databases

[!NOTE] MCP Toolbox for Databases is currently in beta, and may see breaking changes until the first stable release (v1.0).

MCP Toolbox for Databases is an open source MCP server for databases. It was designed with enterprise-grade quality and production readiness in mind. It lets you develop tools more easily, quickly, and securely by handling complexities such as connection pooling, authentication, and more.

This README provides a brief overview. For comprehensive details, see the full documentation.

[!NOTE] This product was originally named “Gen AI Toolbox for Databases” as its initial development predated MCP, but was renamed to align with recently added MCP compatibility.


Why Toolbox?

Toolbox helps you build Gen AI tools that let your agents access data in your database. Toolbox provides:

  • Simplified development: Integrate tools into your agent in less than 10 lines of code, reuse tools between multiple agents or frameworks, and deploy new versions of tools more easily.
  • Better performance: Built-in best practices such as connection pooling, authentication, and more.
  • Enhanced security: Integrated auth for more secure access to your data.
  • End-to-end observability: Out of the box metrics and tracing with built-in support for OpenTelemetry.

General Architecture

Toolbox sits between your application's orchestration framework and your database, providing a control plane that is used to modify, distribute, or invoke tools. It simplifies the management of your tools by providing you with a centralized location to store and update tools, allowing you to share tools between agents and applications and update those tools without necessarily redeploying your application.

architecture

Getting Started

Installing the server

For the latest version, check the releases page and use the following instructions for your OS and CPU architecture.

Binary

To install Toolbox as a binary:

# see releases page for other versions
export VERSION=0.3.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
Container image

You can also install Toolbox as a container:
# see releases page for other versions
export VERSION=0.3.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
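Once the image is pulled, you can run it directly. The following is a minimal sketch, not taken from the official docs: it assumes the image's entrypoint is the toolbox binary and that the server listens on port 5000 (the port used in the SDK examples below).

# hypothetical invocation; verify the entrypoint, paths, and port for your release
docker run -p 5000:5000 \
  -v $(pwd)/tools.yaml:/config/tools.yaml \
  us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION \
  --tools_file /config/tools.yaml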
Compile from source

To install from source, ensure you have the latest version of Go installed, and then run the following command:

go install github.com/googleapis/genai-toolbox@v0.3.0

Running the server

Configure a tools.yaml to define your tools, and then execute toolbox to start the server:

./toolbox --tools_file "tools.yaml"

You can use toolbox help for a full list of flags! To stop the server, send a terminate signal (ctrl+c on most platforms).

For more detailed documentation on deploying to different environments, check out the resources in the How-to section.

Integrating your application

Once your server is up and running, you can load the tools into your application. See the list of client SDKs below for the supported frameworks:

Core
  1. Install Toolbox Core SDK:
    pip install toolbox-core
    
  2. Load tools:
    from toolbox_core import ToolboxClient
    
    # update the url to point to your server
    client = ToolboxClient("http://127.0.0.1:5000")
    
    # these tools can be passed to your application! 
    tools = await client.load_toolset("toolset_name")
    

For more detailed instructions on using the Toolbox Core SDK, see the project's README.
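Note that load_toolset is a coroutine, so in a standalone script you need to run it inside an event loop. Below is a minimal, self-contained sketch; the toolset name and the commented-out tool invocation are assumptions to verify against the SDK's README.

import asyncio

from toolbox_core import ToolboxClient

async def main():
    # assumes a Toolbox server is running locally on port 5000
    client = ToolboxClient("http://127.0.0.1:5000")
    tools = await client.load_toolset("toolset_name")
    # pass `tools` to your agent framework of choice, or invoke one directly
    # (hypothetical call signature; check the SDK README):
    # result = await tools[0](name="Hilton")

asyncio.run(main())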

LangChain / LangGraph
  1. Install Toolbox LangChain SDK:
    pip install toolbox-langchain
    
  2. Load tools:
    from toolbox_langchain import ToolboxClient
    
    # update the url to point to your server
    client = ToolboxClient("http://127.0.0.1:5000")
    
    # these tools can be passed to your application! 
    tools = client.load_toolset()
    

For more detailed instructions on using the Toolbox LangChain SDK, see the project's README.
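To show where these tools plug in, here is a hedged sketch that hands them to a LangGraph ReAct agent. The chat model choice, the LangGraph wiring, and the prompt are illustrative assumptions; any LangChain-compatible chat model should work.

from langgraph.prebuilt import create_react_agent
from langchain_google_vertexai import ChatVertexAI
from toolbox_langchain import ToolboxClient

client = ToolboxClient("http://127.0.0.1:5000")
tools = client.load_toolset()

# ChatVertexAI is just one example; substitute any LangChain chat model
model = ChatVertexAI(model_name="gemini-1.5-pro")
agent = create_react_agent(model, tools)
response = agent.invoke(
    {"messages": [("user", "Find hotels whose name contains 'Hilton'")]}
)
print(response["messages"][-1].content)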

LlamaIndex
  1. Install Toolbox Llamaindex SDK:
    pip install toolbox-llamaindex
    
  2. Load tools:
    from toolbox_llamaindex import ToolboxClient
    
    # update the url to point to your server
    client = ToolboxClient("http://127.0.0.1:5000")
    
    # these tools can be passed to your application! 
    tools = client.load_toolset()
    

For more detailed instructions on using the Toolbox Llamaindex SDK, see the project's README.
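Analogously, here is a hedged sketch of handing the loaded tools to a LlamaIndex ReAct agent; the LLM choice and the ReActAgent.from_tools wiring are illustrative assumptions to check against the SDK's README.

from llama_index.core.agent import ReActAgent
from llama_index.llms.openai import OpenAI
from toolbox_llamaindex import ToolboxClient

client = ToolboxClient("http://127.0.0.1:5000")
tools = client.load_toolset()

# OpenAI is just one example; substitute any LlamaIndex-supported LLM
llm = OpenAI(model="gpt-4o-mini")
agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)
response = agent.chat("Find hotels whose name contains 'Hilton'")
print(response)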

Configuration

The primary way to configure Toolbox is through the tools.yaml file. If you have multiple configuration files, you can tell Toolbox which one to load with the --tools_file tools.yaml flag.

You can find more detailed reference documentation for all resource types in the Resources section.

Sources

The sources section of your tools.yaml defines what data sources your Toolbox should have access to. Most tools will have at least one source to execute against.

sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: toolbox_user
    password: my-password

For more details on configuring different types of sources, see the Sources section.

Tools

The tools section of a tools.yaml defines the actions an agent can take: what kind of tool it is, which source(s) it affects, what parameters it uses, etc.

tools:
  search-hotels-by-name:
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels based on name.
    parameters:
      - name: name
        type: string
        description: The name of the hotel.
    statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';

For more details on configuring different types of tools, see the Tools section.
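To tie this definition back to application code: once the server is running, the tool can be loaded and called by name. The sketch below uses the Core SDK; load_tool and the keyword-argument call are assumptions to verify against the SDK's README.

import asyncio

from toolbox_core import ToolboxClient

async def main():
    client = ToolboxClient("http://127.0.0.1:5000")
    # load a single tool by the name it was given in tools.yaml
    search_hotels = await client.load_tool("search-hotels-by-name")
    # hypothetical call: parameters defined in the YAML map to keyword arguments
    results = await search_hotels(name="Hilton")
    print(results)

asyncio.run(main())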

Toolsets

The toolsets section of your tools.yaml allows you to define groups of tools that you want to be able to load together. This can be useful for defining different groups based on agent or application.

toolsets:
    my_first_toolset:
        - my_first_tool
        - my_second_tool
    my_second_toolset:
        - my_second_tool
        - my_third_tool

You can load toolsets by name:

# This will load all tools
all_tools = client.load_toolset()

# This will only load the tools listed in 'my_second_toolset'
my_second_toolset = client.load_toolset("my_second_toolset")

Versioning

This project uses semantic versioning, including a MAJOR.MINOR.PATCH version number that increments with:

  • MAJOR version when we make incompatible API changes
  • MINOR version when we add functionality in a backward compatible manner
  • PATCH version when we make backward compatible bug fixes

The public API that this applies to is the CLI associated with Toolbox, the interactions with official SDKs, and the definitions in the tools.yaml file.

Contributing

Contributions are welcome. Please see the CONTRIBUTING guide to get started.

Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms. See Contributor Code of Conduct for more information.
