
mcp-like

A custom implementation that provides functionality similar to MCP (Model Context Protocol), built strictly for learning. NOT FOR PRODUCTION.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio freakynit-mcp-like-server node src/server.js \
  --env QDRANT_HOST="http://localhost:6333" \
  --env QDRANT_PORT="6333" \
  --env FIRECRAWL_KEY="<your-firecrawl-key-if-needed>" \
  --env CHAT_LOG_LEVEL="info" \
  --env QDRANT_STORAGE_PATH="./qdrant_storage"

How to use

This MCP-like server is a custom implementation that demonstrates how an LLM can orchestrate external tools (functions) to fulfill user queries. The server exposes endpoints that let the LLM discover available functions and then execute a chosen function with the provided arguments. The two core endpoints are /api/functions/search (to discover tools) and /api/functions/execute (to run a chosen tool with arguments).

In practice, you run the server, start a sample chat loop, and interact with it to see how tool calls are made, how results are returned, and how the final answer is presented to the user. The repository's sample chats illustrate using a search function to fetch news-like results, obtaining server CPU counts, and extracting content from a page.

You can customize or add new tools under the functions directory, and the LLM will be prompted to decide when to call them. To try it, start the server with the provided script, then run the sample_chat helper to see tool calls and final answers in action.
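A minimal sketch of how a client might drive those two endpoints. The port, query-parameter name, and request/response shapes below are assumptions for illustration, not the repository's actual schema; check the routes in src/server.js for the real interface.

```javascript
// Assumed base URL; adjust to wherever the server actually listens.
const BASE_URL = "http://localhost:3000";

// Build the request for /api/functions/execute from a tool the LLM
// chose and the arguments it supplied (payload shape is an assumption).
function buildExecuteRequest(toolName, args) {
  return {
    url: `${BASE_URL}/api/functions/execute`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ name: toolName, arguments: args }),
    },
  };
}

// Discover matching tools, then execute the top match.
// Requires a running server; uses Node 18+'s global fetch.
async function searchAndExecute(query) {
  const res = await fetch(
    `${BASE_URL}/api/functions/search?query=${encodeURIComponent(query)}`
  );
  const tools = await res.json();
  const { url, options } = buildExecuteRequest(tools[0].name, { query });
  const result = await fetch(url, options);
  return result.json();
}
```

In a real chat loop, the LLM (not the client code) decides which tool name and arguments to pass; the sketch just shows the HTTP plumbing around that decision.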

How to install

Prerequisites:

  • Node.js and npm installed
  • A running Qdrant instance (or adjust vector store setup as needed)
  • Optional: a Firecrawl API key if page scraping is required

Installation steps:

  1. Install dependencies

    • npm install
  2. Prepare environment

    • Copy .env.example to .env and fill required values (e.g., FIRECRAWL_KEY if you plan to use the scraping tool).
    • Ensure Qdrant is up and reachable (default http://localhost:6333).
  3. Start the MCP-like server

    • npm run server
  4. Run the sample chat (optional but recommended)

    • In another terminal: npm run example_chat
  5. Verify functionality

    • Try prompts like "how are you doing" (a native response, no tool call) or "search for latest US politics news" (a tool-assisted response) to see tool calls and final results.
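The verification step can also be scripted. A hedged smoke test, assuming the server listens on http://localhost:3000 (adjust to your actual port) and that the search route accepts a query parameter named query — both assumptions:

```javascript
// Build the tool-discovery URL for a given base address and query.
function searchUrl(base, query) {
  return `${base}/api/functions/search?query=${encodeURIComponent(query)}`;
}

// Hit the discovery endpoint and report whether it is reachable.
// Requires Node 18+ for the global fetch.
async function smokeTest(base = "http://localhost:3000") {
  const res = await fetch(searchUrl(base, "latest US politics news"));
  console.log(res.ok ? "server reachable" : `unexpected status: ${res.status}`);
}
```

If the request fails outright, recheck that `npm run server` is still running and that no other process holds the port.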

Additional notes

Tips and common issues:

  • Ensure the .env file (or environment variables) provides the necessary keys (FIRECRAWL_KEY is only required if you use the scraping tool).
  • The server relies on a Qdrant vector store; make sure the Qdrant daemon is running with default ports or adjust the configuration accordingly.
  • When testing, you can enable verbose debug logs to see tool-call details; otherwise, set CHAT_LOG_LEVEL to info to see final answers only.
  • Tools are defined in the functions directory; adding a new tool typically requires exposing an endpoint and updating the tool registry so the LLM can decide to call it.
  • If you see connection errors to endpoints, verify that the server is up and that the /api/functions/search and /api/functions/execute routes are reachable.
  • The sample chat demonstrates both native responses (no tool call) and tool-assisted responses; use those flows to understand how the LLM decides to call tools.
  • If you modify server paths or ports, update the mcp_config accordingly to reflect the correct command and arguments.
