agent-project
Agents in Practice: Intelligent Routing, Task Decomposition, and Chain Engineering
claude mcp add --transport stdio luochang212-agent-project python gradio_app.py
How to use
This MCP server project provides a set of Gradio-based interfaces for interacting with a PostgreSQL or MySQL database through LLM-powered agents and workflows. Three separate Gradio apps can be launched: a basic chat app, an agent-driven app for direct SQL-style queries against Postgres, and a workflow-enabled app for more structured query tasks. Each app connects to your local or configured database and relies on an LLM service to interpret natural-language queries and translate them into database actions. After starting any of the apps, open http://localhost:7860/ in your browser to begin interacting with the interface.
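The query flow described above (natural-language question, LLM-generated SQL, database execution) can be sketched as below. This is a hedged illustration, not the project's actual API: the function names, the prompt, and the stubbed LLM and SQL executor are all hypothetical, chosen so the sketch runs without a live LLM service or database.

```python
# Hypothetical sketch of the agent apps' query flow: question -> SQL -> rows.
# All names here are illustrative, not taken from the project's code.

def nl_to_sql(question: str, llm) -> str:
    """Ask the LLM to translate a natural-language question into SQL."""
    prompt = f"Translate this question into a single SQL query:\n{question}"
    return llm(prompt).strip()

def answer(question: str, llm, run_sql) -> list:
    """Full round trip: question -> SQL -> result rows."""
    sql = nl_to_sql(question, llm)
    return run_sql(sql)

# Stubs so the sketch is runnable without external services.
fake_llm = lambda prompt: "SELECT COUNT(*) FROM orders;"
fake_run_sql = lambda sql: [(42,)] if "COUNT" in sql else []

print(answer("How many orders are there?", fake_llm, fake_run_sql))  # → [(42,)]
```

In the real apps the stubs would be replaced by a call to your configured LLM service and a database cursor, respectively.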
How to install
Prerequisites:
- Python 3.8+ installed on your machine
- PostgreSQL and/or MySQL server running locally or remotely
- Optional: a local LLM server or API key configured for the agent to use
Installation steps:
- Clone or download the repository to your local machine.
- Create a .env file in the project root with database connection details, for example:
  DB_HOST=localhost
  DB_PORT=5432
  DB_NAME=ecommerce_orders
  DB_USER=admin
  DB_PASSWORD=admin-password
- Install Python dependencies (adjust if you use a virtual environment): python -m pip install -r requirements.txt
- Start the Gradio interfaces (choose one or more to run):
  python gradio_app.py
  python gradio_postgres_agent.py
  python gradio_postgres_workflow.py
- Open http://localhost:7860/ in your browser to use the UI.
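As a sketch of how the .env settings from the steps above turn into a database connection, the snippet below parses simple KEY=VALUE lines and assembles a PostgreSQL DSN. The loading helper is an assumption for illustration; the project itself may rely on python-dotenv or another mechanism.

```python
# Illustrative only: parse a .env file and build a Postgres connection string.
# Variable names match the example .env in the install steps.
import os

def load_env(path: str = ".env") -> dict:
    """Parse simple KEY=VALUE lines, ignoring blanks and comments."""
    env = {}
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    env[key.strip()] = value.strip()
    except FileNotFoundError:
        pass  # fall back to the process environment
    return env

def build_dsn(env: dict) -> str:
    """Assemble a postgresql:// DSN, falling back to os.environ, then defaults."""
    get = lambda k, d: env.get(k, os.environ.get(k, d))
    return (
        f"postgresql://{get('DB_USER', 'admin')}:{get('DB_PASSWORD', '')}"
        f"@{get('DB_HOST', 'localhost')}:{get('DB_PORT', '5432')}"
        f"/{get('DB_NAME', 'ecommerce_orders')}"
    )

print(build_dsn(load_env()))
# e.g. postgresql://admin:admin-password@localhost:5432/ecommerce_orders
```

A DSN in this shape can be handed to psycopg2 or SQLAlchemy; the exact driver the project uses is not specified here.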
Notes:
- Ensure the .env file is created as described; it is not included in the repository for security reasons.
- If your LLM service requires an API key or a local hosting setup, configure it accordingly in the environment or code.
Additional notes
Tips and common issues:
- Make sure the database service is reachable using the credentials in .env. Connection errors are the most common issue when first launching.
- If the Gradio app fails to start, check that port 7860 is not in use by another process.
- The agent interfaces assume basic schemas and example data; replace with your own schema and data as needed.
- For best results, ensure the LLM service you configure is responsive, as latency directly affects user experience in the Gradio apps.
- If you want to run all three apps simultaneously, consider launching them in separate terminal sessions or using a process manager.
- Review the notebooks and modules under intelligent_routing and test_ folders for extended examples and configurations.
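Following the tip about port 7860, a quick way to check whether the port is already taken before launching an app is sketched below. This is an illustrative helper, not part of the project; note that Gradio can also be pointed at a different port via its `server_port` launch argument.

```python
# Illustrative check: is something already listening on the Gradio default port?
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if a connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

if port_in_use(7860):
    print("Port 7860 is busy; stop the other process or choose another port.")
else:
    print("Port 7860 is free.")
```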
Related MCP Servers
VectorCode
A code repository indexing tool to supercharge your LLM experience.
mcp-pinecone
Model Context Protocol server for reading from and writing to Pinecone, with rudimentary RAG support.
mcp-victoriametrics
An implementation of a Model Context Protocol (MCP) server for VictoriaMetrics.
langgraph-ai
LangGraph AI Repository
pluggedin-app
The Crossroads for AI Data Exchanges. A unified, self-hostable web interface for discovering, configuring, and managing Model Context Protocol (MCP) servers—bringing together AI tools, workspaces, prompts, and logs from multiple MCP sources (Claude, Cursor, etc.) under one roof.
spring-ai
From Java Dev to AI Engineer: Spring AI Fast Track