learn-low-code-agentic-ai
Low-Code Full-Stack Agentic AI Development using LLMs, n8n, Lovable, UXPilot, Supabase and MCP. Class Videos: https://www.youtube.com/playlist?list=PL0vKVrkG4hWq5T6yqCtUL7ol9rDuEyzBH
claude mcp add --transport stdio panaversity-learn-low-code-agentic-ai node path/to/server.js
How to use
This MCP server serves as the integration layer for the Learn Low-Code Agentic AI stack described in the repository. It coordinates model execution, tool invocation, data storage references, and memory/context handling across the low-code components (UXPilot, Lovable.dev, n8n, Supabase) through the MCP protocol.
You can use this server to route prompts to LLMs, orchestrate tool calls (HTTP, databases, file operations, vector stores), and manage state across steps in an agent-driven workflow. Typical usage involves deploying the MCP server alongside your frontend and backend services, then configuring agents and workflows in n8n to leverage MCP for standardized model-to-tool communication, secure data exchange, and reusable context propagation.
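As a concrete starting point, an MCP client can launch the server over stdio from a JSON configuration. The snippet below is a minimal sketch in the format used by clients such as Claude Desktop and Claude Code; the server name mirrors the claude mcp add command above, and path/to/server.js remains a placeholder for the actual entry point:

```json
{
  "mcpServers": {
    "panaversity-learn-low-code-agentic-ai": {
      "command": "node",
      "args": ["path/to/server.js"]
    }
  }
}
```

Once registered this way, the client starts the server process itself and speaks the MCP protocol to it over stdin/stdout, so no port needs to be exposed for stdio transport.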
How to install
Prerequisites:
- Node.js (LTS) installed on your system
- Access to a Git clone of the repository
- Basic familiarity with running Node.js apps and environment variable configuration
Installation steps:
- Clone the repository: git clone <repository-url>
- Navigate into the project directory: cd learn-low-code-agentic-ai
- Install dependencies (adjust package manager as needed): npm install
- Configure environment variables if required by the MCP server or adjacent services (see Additional notes for common variables).
- Start the MCP server (example): node path/to/server.js
- Verify the server is running by checking logs or hitting a health endpoint if exposed.
If a Docker image is provided for this project, you can alternatively run: docker run -i image-name
Note: Replace path/to/server.js with the actual entry point and image-name with the published container image if available.
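Taken together, the steps above can be sketched as a single command sequence. This is illustrative only: <repository-url> and path/to/server.js are placeholders from this guide, and MCP_PORT is one example of the variables described under Additional notes:

```shell
# Illustrative install-and-run sequence; substitute real values.
git clone <repository-url>
cd learn-low-code-agentic-ai
npm install
export MCP_PORT=3000        # example; see Additional notes for common variables
node path/to/server.js      # replace with the actual entry point
```

If the server starts cleanly, its logs should show it listening (or, for stdio transport, waiting on standard input) before you wire it into n8n or an MCP client.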
Additional notes
Environment variables and configuration options may vary depending on your deployment:
- MCP_PORT: Port for the MCP server to listen on
- MCP_HOST: Hostname or IP for MCP binding
- DATABASE_URL or SUPABASE_URL: Connections to data stores
- API_KEYS: Any external service keys required by n8n, OpenAI, or embedding/vector services
If you encounter connectivity issues, ensure CORS, network ACLs, and firewall rules allow traffic between the MCP server, frontend, and data services. When debugging, enable verbose logs and verify that the MCP protocol handshakes correctly with agents and tool nodes. If using a local development setup, consider using a docker-compose file to orchestrate MCP, n8n, and Supabase/services for repeatable local runs.
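For the docker-compose setup mentioned above, a rough sketch might look like the following. The service layout, the images, and the plain Postgres container standing in for Supabase are assumptions for local experimentation, not files shipped with this repository:

```yaml
# Hypothetical docker-compose.yml for local development only.
services:
  mcp:
    build: .                      # assumes a Dockerfile for the MCP server
    environment:
      MCP_PORT: "3000"
      MCP_HOST: "0.0.0.0"
      DATABASE_URL: postgres://postgres:postgres@db:5432/postgres
    ports:
      - "3000:3000"
    depends_on:
      - db
  n8n:
    image: n8nio/n8n              # official n8n image
    ports:
      - "5678:5678"
  db:
    image: postgres:16            # local stand-in for a managed Supabase instance
    environment:
      POSTGRES_PASSWORD: postgres
```

Running docker compose up with a file like this gives each service a stable hostname on the compose network (mcp, n8n, db), which keeps connection strings repeatable across local runs.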
Related MCP Servers
mindsdb
Query Engine for AI Analytics: Build self-reasoning agents across all your live data
mcp-agent
Build effective agents using Model Context Protocol and simple workflow patterns
mcp-framework
A framework for writing MCP (Model Context Protocol) servers in TypeScript
agentcontrolplane
ACP is the Agent Control Plane - a distributed agent scheduler optimized for simplicity, clarity, and control. It is designed for outer-loop agents that run without supervision, and make asynchronous tool calls like requesting human feedback on key operations. Full MCP support.
dat
Query your data in natural language through pre-modeling (data models and semantic models).
mcp-toolbox-sdk-python
Python SDK for interacting with the MCP Toolbox for Databases.