
fullstack-langgraph-nextjs-agent

Production-ready Next.js template for building AI agents with LangGraph.js. Features MCP integration for dynamic tool loading, human-in-the-loop tool approval, persistent conversation memory with PostgreSQL, and real-time streaming responses. Built with TypeScript, React, Prisma, and Tailwind CSS.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio agentailor-fullstack-langgraph-nextjs-agent \
  --env NOTE="Adjust path to your local filesystem root if needed" \
  -- npx @modelcontextprotocol/server-filesystem /Users/yourname/Documents

How to use

This LangGraph.js AI agent template pairs a production-ready front-end UI with a backend agent service that integrates the Model Context Protocol (MCP). It supports dynamic tool loading via MCP, human-in-the-loop tool approval, persistent conversation memory in PostgreSQL, and real-time streaming responses over Server-Sent Events. You can register multiple MCP servers (stdio or HTTP) to expose tools, and the UI lets you approve, modify, or deny each tool call before it executes. The bundled filesystem MCP server demonstrates how tools are discovered and loaded on demand, so you can add or swap tool servers without changing client code, while LangGraph.js coordinates tool usage, memory, and model interactions.
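As a sketch, MCP servers are commonly declared in a JSON configuration of roughly the following shape (this mirrors the widely used `mcpServers` convention; the exact file name, location, and keys in this template may differ, and the `filesystem` and `remote-tools` entries are illustrative):

```json
{
  "mcpServers": {
    "filesystem": {
      "transport": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/yourname/Documents"]
    },
    "remote-tools": {
      "transport": "http",
      "url": "https://example.com/mcp"
    }
  }
}
```

A stdio entry spawns a local process and speaks MCP over its stdin/stdout, while an HTTP entry points at a remote endpoint; the agent treats tools from both identically.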

How to install

Prerequisites:

  • Node.js 18+ and pnpm
  • Docker (for PostgreSQL and MinIO during local development)
  • OpenAI API key and/or Google API key if you plan to use those models
  1. Clone the repository:
git clone https://github.com/IBJunior/fullstack-langgraph-nextjs-agent.git
cd fullstack-langgraph-nextjs-agent
  2. Install dependencies:
pnpm install
  3. Configure environment:
cp .env.example .env.local

Edit .env.local with your configuration, for example:

# Database
DATABASE_URL="postgresql://user:password@localhost:5434/agent_db"

# AI Models
OPENAI_API_KEY="sk-..."
GOOGLE_API_KEY="..."
  4. Start required services (PostgreSQL and MinIO):
docker compose up -d
  5. Prepare the database (generate Prisma client and migrate):
pnpm prisma:generate
pnpm prisma:migrate
  6. Run the development server:
pnpm dev

Visit http://localhost:3000 to interact with the AI agent.
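If model calls or persistence fail on first run, a common cause is a missing environment variable. A minimal, dependency-free sketch of a pre-flight check (the variable names follow the `.env.local` example above; `missingEnv` is a hypothetical helper, not part of the template — run it with e.g. `npx tsx check-env.ts`):

```typescript
// check-env.ts -- hypothetical pre-flight check, not shipped with the template.
// GOOGLE_API_KEY is intentionally not listed: the template treats the two
// model providers as "and/or", so only OPENAI_API_KEY is required here.
const REQUIRED = ["DATABASE_URL", "OPENAI_API_KEY"] as const;

// Returns the names of required variables that are missing or empty.
function missingEnv(env: Record<string, string | undefined>): string[] {
  return REQUIRED.filter((name) => !env[name] || env[name]!.trim() === "");
}

// Usage against the current process environment:
const missing = missingEnv(process.env);
console.log(
  missing.length === 0
    ? "Environment looks OK"
    : `Missing required env vars: ${missing.join(", ")}`
);
```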

Additional notes

  • The MCP system supports both stdio and HTTP transports. Use stdio for local tooling (e.g., tool binaries) and HTTP for remote tools behind APIs.
  • Tools are loaded dynamically from MCP servers; you can add new servers without changing client code.
  • Ensure your environment variables (DATABASE_URL, OPENAI_API_KEY, GOOGLE_API_KEY) are correctly set in .env.local; without these, model calls or persistence may fail.
  • For production deployments, consider securing MCP HTTP endpoints and using proper authentication/authorization for tool access.
  • The template uses Server-Sent Events for streaming responses; ensure your frontend can gracefully handle reconnects and partial data.
  • If you encounter port or database issues, verify Docker networking and that the database schema matches Prisma migrations.
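To make the streaming note above concrete, here is a dependency-free sketch of buffering partial SSE chunks on the client. The framing follows the SSE specification (events are separated by a blank line; `data:` lines carry the payload); `createSSEParser` is an illustrative name, not an API from this template:

```typescript
// Minimal SSE chunk parser: feed it raw text chunks as they arrive from the
// network; it returns complete `data` payloads and keeps partial events buffered.
function createSSEParser() {
  let buffer = "";
  return function feed(chunk: string): string[] {
    buffer += chunk;
    const events: string[] = [];
    let sep: number;
    // A blank line ("\n\n") terminates an SSE event.
    while ((sep = buffer.indexOf("\n\n")) !== -1) {
      const raw = buffer.slice(0, sep);
      buffer = buffer.slice(sep + 2);
      // Join multi-line `data:` fields; ignore comments and other fields.
      const data = raw
        .split("\n")
        .filter((line) => line.startsWith("data:"))
        .map((line) => line.slice(5).trimStart())
        .join("\n");
      if (data) events.push(data);
    }
    return events;
  };
}

// Example: one event split across two network chunks.
const feed = createSSEParser();
feed("data: par");             // no complete event yet -> []
const done = feed("tial\n\n"); // -> ["partial"]
```

Because the parser only consumes up to the last blank line, a reconnect can simply create a fresh parser and resume; anything mid-event at disconnect time is discarded rather than emitted as corrupt data.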
