
nestjs-langchainjs-demo

This repository demonstrates a NestJS implementation of the Model Context Protocol (MCP) with a microservice architecture.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio liusdev-nestjs-mcp-server-langchainjs-demo npx -y liusdev-nestjs-mcp-server-langchainjs-demo \
  --env PORT="3000" \
  --env OPENAI_API_KEY="your_openai_api_key_here" \
  --env OPENAI_API_URL="https://api.openai.com/v1"

How to use

This MCP server implements the Model Context Protocol (MCP) using a NestJS-based architecture. It exposes tools via the MCP interface, including a function to get the current time context for LLMs. The accompanying MCP backend connects to this server (via LangChain.js) and retrieves available tools so that a language model can call them as needed. You can connect the MCP backend to this server (and potentially other MCP servers) to enable cross-server tool discovery and usage.

To use it, first start the MCP server and the MCP backend. The server provides a time-context function that can be invoked by the backend or by clients that understand MCP endpoints. The backend uses LangChain.js with MCP adapters to fetch the tools from the server and route user queries through the appropriate tools. When you send a request to the MCP backend, the backend resolves the available tools from the MCP server(s) and uses them to process your query, returning the result produced by the underlying tools (e.g., current time for a given region).
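The time-context tool described above boils down to formatting the current time for a requested IANA time zone. A minimal sketch of such a helper (the name getTimeContext is illustrative; the repository's actual tool implementation may differ) could look like this:

```typescript
// Hypothetical helper that produces a human-readable "current time" string
// for an IANA time zone, suitable for returning as an MCP tool result.
export function getTimeContext(timeZone: string): string {
  // Intl.DateTimeFormat validates the zone name and throws a RangeError
  // for unknown zones, so callers get a clear error for bad input.
  return new Intl.DateTimeFormat("en-US", {
    timeZone,
    dateStyle: "full",
    timeStyle: "long",
  }).format(new Date());
}
```

An MCP tool wrapping this would simply take the region as a parameter and return the formatted string as its text content.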

If you want to connect to multiple MCP servers, configure each server in the MCP client module (as shown in the repository’s example). The backend will fetch tools from all configured servers and present them to the LLM, with optional prefixes to avoid name collisions.
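A multi-server configuration sketch, assuming the MultiServerMCPClient API from @langchain/mcp-adapters (the server names, ports, and package names below are illustrative, not the repository's actual values):

```typescript
import { MultiServerMCPClient } from "@langchain/mcp-adapters";

const client = new MultiServerMCPClient({
  // Prefix tool names with the server name (e.g. "time-server__getTime")
  // to avoid collisions when several servers expose similarly named tools.
  prefixToolNameWithServerName: true,
  mcpServers: {
    // Hypothetical entry for this repo's server, reached over HTTP/SSE.
    "time-server": {
      transport: "sse",
      url: "http://localhost:3000/sse",
    },
    // Hypothetical second server launched over stdio.
    "other-server": {
      transport: "stdio",
      command: "npx",
      args: ["-y", "some-other-mcp-server"],
    },
  },
});

// Fetch the combined tool list from all configured servers and
// hand it to the LangChain.js pipeline.
const tools = await client.getTools();
```

This is a configuration sketch rather than the repository's exact wiring; consult the repo's MCP client module for the authoritative setup.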

How to install

Prerequisites:

  • Node.js v20 or higher
  • npm

Installation steps:

  1. Clone the repository.

  2. Install dependencies:

npm install

  3. Copy the environment template and add your API keys:

cp .env.example .env

Then edit the .env file to include your OpenAI API key and any required configuration, for example:

OPENAI_API_KEY=your_openai_api_key_here
OPENAI_API_URL=https://api.openai.com/v1
PORT=3000

  4. Start the MCP server and MCP backend (in separate terminals):

# Start the MCP server
npm run start:dev mcp-server

# Start the MCP backend
npm run start:dev mcp-backend

  5. Verify operation by sending requests to the MCP backend (http://localhost:3001 by default), as demonstrated in the README.

Additional notes

Tips and notes:

  • Ensure your OpenAI API key is valid and your API quota is sufficient for testing.
  • When connecting to multiple MCP servers, consider prefixing tool names with the server name to avoid conflicts (as shown in the repository example).
  • Each MCP server should be reachable from the MCP backend; ensure network permissions and CORS (if applicable) permit connections.
  • If you encounter port conflicts, adjust the PORT environment variable for the server or backend to use distinct ports.
  • The MCP backend can dynamically fetch tools from all configured MCP servers and expose them to the LangChain.js pipeline via MCP adapters.
