mcp-toolbox-sdk-python
Python SDK for interacting with the MCP Toolbox for Databases.
claude mcp add --transport stdio googleapis-mcp-toolbox-sdk-python uvx toolbox-core \
  --env TOOLBOX_AUTH_TOKEN="your-auth-token-or-placeholder" \
  --env TOOLBOX_SERVICE_URL="https://your-toolbox-service-url"
How to use
This MCP server wraps the Toolbox Python SDKs to expose Toolbox tools for GenAI applications. It lets you fetch tool definitions from a running Toolbox instance, obtain Python objects or callables that represent those tools, and invoke them from your Python applications or orchestrators. Integration with common frameworks such as LangChain or LlamaIndex is supported through the corresponding toolbox packages, while toolbox-core remains usable in a framework-agnostic manner. By using this MCP server, you can centralize tool definitions and authentication handling, then use the returned Python interfaces to invoke Toolbox-managed tools (database queries, API connectors, and more) from your code or automation flows.
To use it, install the toolbox-core package (or one of the framework-specific toolbox packages) via the MCP server configuration, then point your application at the Toolbox service URL. You can load tool definitions, instantiate tool objects or wrappers, and invoke them with the appropriate parameters. If you are building with LangChain, LangGraph, or LlamaIndex, prefer the corresponding toolbox-* package for compatibility with those ecosystems. Setup typically involves creating a client instance, loading tools from Toolbox, and then calling the tools as standard Python functions or objects within your orchestration logic.
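The client-load-call flow described above can be sketched with toolbox-core's async client. This is a minimal sketch, not a definitive implementation: the service URL, the tool name search-hotels-by-name, and its name parameter are illustrative assumptions, not tools defined by this repository.

```python
import asyncio
import os


async def call_toolbox_tool():
    # Deferred import so the sketch is readable without toolbox-core installed.
    from toolbox_core import ToolboxClient

    url = os.environ.get("TOOLBOX_SERVICE_URL", "http://127.0.0.1:5000")
    async with ToolboxClient(url) as client:
        # Fetch one tool definition from the Toolbox service ...
        tool = await client.load_tool("search-hotels-by-name")  # hypothetical tool
        # ... and invoke it like an ordinary async callable.
        return await tool(name="Hilton")


# asyncio.run(call_toolbox_tool())  # run against a live Toolbox instance
```

The same pattern works with a toolset instead of a single tool: load_toolset returns a list of callables you can hand to your orchestration layer.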
How to install
Prerequisites:
- Python 3.8+ installed on your system
- Network access to the Toolbox service endpoint
- A running MCP Toolbox service instance
- uvx utility installed (as used by this MCP server configuration) or a compatible Python runtime
Install and run the MCP server client:
- Install uvx (or ensure it is available in your environment), following the standard installation instructions for your system.
- Install the toolbox-core package (the core, framework-agnostic SDK) via the MCP server run command described in mcp_config:
- Ensure your Toolbox service URL and authentication token are configured (see env vars in mcp_config).
- Start or run your MCP server configuration as defined in the mcp_config section of this repository. The MCP server will fetch definitions from Toolbox and expose usable Python tool interfaces to your application.
Example commands (illustrative; adapt to your environment):
# Ensure Python is installed
python3 --version
# Example: install uvx if applicable to your setup (follow your system docs)
# (This step varies by environment; provide your standard uvx installation command here)
# The MCP server configuration will be used as defined in the repository's mcp_config
# Typically you would ensure the environment variables are set, then run the MCP runner
export TOOLBOX_SERVICE_URL=https://your-toolbox-service-url
export TOOLBOX_AUTH_TOKEN=your-auth-token-or-placeholder
# Run the MCP server (via uvx as configured)
# The exact invocation depends on your environment; the config in this repo uses: uvx toolbox-core
uvx toolbox-core
- Verify access by invoking a loaded Toolbox tool through Python, ensuring the tool definitions are loaded and the tool calls succeed.
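One way to perform this verification is sketched below, assuming toolbox-core's synchronous ToolboxSyncClient and a default toolset on your server (both assumptions; adapt the client and toolset names to your deployment):

```python
import os


def verify_toolbox_connection():
    # Deferred import; requires the toolbox-core package.
    from toolbox_core import ToolboxSyncClient

    url = os.environ.get("TOOLBOX_SERVICE_URL", "http://127.0.0.1:5000")
    client = ToolboxSyncClient(url)
    try:
        # Successfully loading the toolset confirms connectivity, auth,
        # and that tool definitions are available on the server.
        tools = client.load_toolset()
        for tool in tools:
            print(getattr(tool, "__name__", repr(tool)))
        return tools
    finally:
        client.close()


# verify_toolbox_connection()  # expects a reachable Toolbox instance
```

If this call fails, check the TOOLBOX_SERVICE_URL value and the token before digging into the Toolbox service logs.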
Additional notes
Notes and tips:
- Ensure the Toolbox service URL (TOOLBOX_SERVICE_URL) and authentication token (TOOLBOX_AUTH_TOKEN) are correctly configured in the environment where the MCP server runs.
- If you are using LangChain, LangGraph, or LlamaIndex in your application, prefer the corresponding toolbox-<integration> package for best compatibility.
- The exact package versions should be aligned with your Toolbox service version to avoid compatibility issues.
- If you encounter connectivity or authentication issues, check network access, service URL correctness, and token validity. Review the Toolbox service logs for tool-loading errors.
- For local development, you can run a minimal Toolbox service instance and point the MCP server to it to test tool loading and invocation end-to-end.
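For such a local end-to-end test, a minimal tools.yaml for the Toolbox service might look like the following sketch. The source, credentials, tool name, and SQL statement are all placeholders; consult the Toolbox documentation for the authoritative configuration schema.

```yaml
sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: toolbox_user
    password: my-password
tools:
  search-hotels-by-name:
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels by name.
    parameters:
      - name: name
        type: string
        description: Name of the hotel.
    statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
```

Point a local Toolbox service at this file, set TOOLBOX_SERVICE_URL to its local address, and run the MCP server as shown above to exercise tool loading and invocation end-to-end.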
Related MCP Servers
mindsdb
Query Engine for AI Analytics: Build self-reasoning agents across all your live data
NagaAgent
A simple yet powerful agent framework for personal assistants, designed to enable intelligent interaction, multi-agent collaboration, and seamless tool integration.
openinference
OpenTelemetry Instrumentation for AI Observability
AgentChat
AgentChat is an LLM-based agent communication platform with a built-in default Agent and support for user-defined Agents. Through multi-turn dialogue and task collaboration, Agents can understand and help complete complex tasks. The project integrates LangChain, Function Call, the MCP protocol, RAG, Memory, Milvus, and ElasticSearch for efficient knowledge retrieval and tool invocation, and uses FastAPI to build a high-performance backend service.
sample-agentic-ai-demos
Collection of examples of how to use Model Context Protocol with AWS.
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.