mcp-like
A custom implementation providing functionality similar to MCP (Model Context Protocol), built strictly for learning. NOT FOR PRODUCTION.
```shell
claude mcp add --transport stdio freakynit-mcp-like-server node src/server.js \
  --env QDRANT_HOST="http://localhost:6333" \
  --env QDRANT_PORT="6333" \
  --env FIRECRAWL_KEY="<your-firecrawl-key-if-needed>" \
  --env CHAT_LOG_LEVEL="info" \
  --env QDRANT_STORAGE_PATH="./qdrant_storage"
```
How to use
This MCP-like server is a custom implementation that demonstrates how an LLM can orchestrate external tools (functions) to fulfill user queries. The server exposes endpoints that let the LLM discover available functions and then execute a chosen function with the provided arguments. The two core endpoints are /api/functions/search (to discover tools) and /api/functions/execute (to run a chosen tool with arguments).

In practice, you run the server, start the sample chat loop, and interact with it to see how tool calls are made, how results are returned, and how the final answer is presented to the user. The repository's sample chats illustrate using a search function to fetch news-like results, obtaining the server's CPU count, or extracting content from a page.

You can customize or add new tools under the functions directory, and the LLM will be prompted to decide when to call them. To use it, start the server with the provided script, then run the sample chat helper to see tool calls and final answers in action.
How to install
Prerequisites:
- Node.js and npm installed
- A running Qdrant instance (or adjust vector store setup as needed)
- Optional: a Firecrawl API key if page scraping is required
Installation steps:

1. Install dependencies
   - npm install
2. Prepare environment
   - Copy .env.example to .env and fill in the required values (e.g., FIRECRAWL_KEY if you plan to use the scraping tool).
   - Ensure Qdrant is up and reachable (default: http://localhost:6333).
3. Start the MCP-like server
   - npm run server
4. Run the sample chat (optional but recommended)
   - In another terminal: npm run example_chat
5. Verify functionality
   - Try sample prompts such as "how are you doing" or "search for latest US politics news" to see tool calls and final results.
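The variables passed via --env in the claude mcp add command at the top of this README double as a starting point for the .env file; the values below are the examples from that command, not verified defaults.

```shell
QDRANT_HOST="http://localhost:6333"
QDRANT_PORT="6333"
FIRECRAWL_KEY="<your-firecrawl-key-if-needed>"
CHAT_LOG_LEVEL="info"
QDRANT_STORAGE_PATH="./qdrant_storage"
```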
Additional notes
Tips and common issues:
- Ensure the .env file (or environment variables) provides the necessary keys (FIRECRAWL_KEY is needed only if you use the scraping tool).
- The server relies on a Qdrant vector store; make sure the Qdrant daemon is running with default ports or adjust the configuration accordingly.
- When testing, you can enable verbose debug logs to see tool-call details; otherwise, set CHAT_LOG_LEVEL to info to see final answers only.
- Tools are defined in the functions directory; adding a new tool typically requires exposing an endpoint and updating the tool registry so the LLM can decide to call it.
- If you see connection errors to endpoints, verify that the server is up and that the /api/functions/search and /api/functions/execute routes are reachable.
- The sample chat demonstrates both native responses (no tool call) and tool-assisted responses; use those flows to understand how the LLM decides to call tools.
- If you modify server paths or ports, update the mcp_config accordingly to reflect the correct command and arguments.
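For the point above about adding tools, a new tool module might look like the following. The file path, export shape, and registration details are assumptions made for illustration; check the existing modules under the functions directory for the repository's real convention.

```javascript
// functions/getServerTime.js (hypothetical path and export shape).
// The metadata is what the discovery endpoint would describe to the LLM;
// the handler is what /api/functions/execute would invoke with parsed args.
const tool = {
  name: "get_server_time",
  description: "Return the server's current time as an ISO-8601 string",
  parameters: { type: "object", properties: {} },
  handler: async (_args) => new Date().toISOString(),
};

module.exports = tool;
```

After defining the module, the tool registry would also need an entry for it so the search endpoint can surface it to the LLM.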