tolk
MCP server from robustfengbin/tolk-mcp-server
claude mcp add --transport stdio robustfengbin-tolk-mcp-server npx tolk-mcp-server \
  --env MCP_HOST="0.0.0.0" \
  --env MCP_PORT="3000"
How to use
This MCP server exposes Tolk compiler functionality to AI assistants via the MCP interface. It enables real-time compilation of Tolk source code, verification of code correctness before deployment, and retrieval of compiled output such as the BOC (bytecode) and code hash. The server supports multi-file projects and provides debugging information to help identify and resolve compilation errors.

Two tools are exposed: getTolkCompilerVersion, which returns the current Tolk compiler version, and runTolkCompiler, which accepts an entrypoint file plus a map of filenames to source text and returns the compilation result along with optional diagnostic information.

The server can be consumed over standard MCP transports, depending on how you launch it: stdio for Claude Desktop, or HTTP/SSE for ChatGPT. For Claude Desktop, you typically configure the client to invoke the server via npx tolk-mcp-server. For ChatGPT, run the HTTP version (npx tolk-mcp-server-http) and connect through an HTTP endpoint.
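As a sketch of the tool inputs, a single-file runTolkCompiler call might be assembled as below. Only the entrypointFileName and sources parameters are documented above; the surrounding tool-call shape follows the generic MCP convention, and the Tolk source itself is illustrative only.

```javascript
// Hypothetical sketch: building arguments for the runTolkCompiler MCP tool.
// The Tolk snippet is a placeholder, not a verified program.
const tolkSource = `
fun main() {
    // minimal entrypoint for illustration
}
`;

const toolCall = {
  name: "runTolkCompiler",
  arguments: {
    entrypointFileName: "main.tolk",
    sources: {
      "main.tolk": tolkSource, // filename keys map to source text
    },
  },
};

console.log(Object.keys(toolCall.arguments.sources)); // → [ 'main.tolk' ]
```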
How to install
Prerequisites:
- Node.js >= 18.0.0
- npm (comes with Node.js)
Installation options:
- Install globally (for quick local access):
npm install -g tolk-mcp-server
- Install as a project dependency:
mkdir my-tolk-project
cd my-tolk-project
npm init -y
npm install tolk-mcp-server
- Run directly with npx (no installation required on the system):
npx tolk-mcp-server
- Run the HTTP version (for ChatGPT integration):
npx tolk-mcp-server-http
Configure environment variables if needed (see additional notes): MCP_PORT (default 3000), MCP_HOST (default 0.0.0.0).
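For Claude Desktop's stdio mode, the server is typically registered in the client's MCP configuration file (claude_desktop_config.json). A minimal entry might look like the following; the server name "tolk" and the env values are illustrative choices, not required names.

```json
{
  "mcpServers": {
    "tolk": {
      "command": "npx",
      "args": ["tolk-mcp-server"],
      "env": {
        "MCP_HOST": "0.0.0.0",
        "MCP_PORT": "3000"
      }
    }
  }
}
```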
Additional notes
Notes and tips:
- The server exposes two main delivery modes: stdio (for Claude Desktop) and HTTP/SSE (for ChatGPT). Use the appropriate startup script or npx command to choose the transport.
- Environment variables allow you to customize the listening port and host. Example: MCP_PORT=3001 MCP_HOST=0.0.0.0
- If you are deploying with Docker, you can map port 3000 (or your chosen port) to the host and run the container as usual. Ensure the MCP_PORT matches the port exposed by your deployment configuration.
- The API includes getTolkCompilerVersion and runTolkCompiler; when using runTolkCompiler, you must provide entrypointFileName and a sources map. The response includes fiftCode, codeBoc64, codeHashHex, and stderr for debugging.
- For multi-file projects, ensure all files are included in the sources map with their respective filenames as keys.
- Check the README for specific usage examples and script names (start, start:http, build, test, etc.).
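Building on the notes above, a multi-file sources map and the documented response fields (fiftCode, codeBoc64, codeHashHex, stderr) can be sketched as follows. The Tolk import syntax and the sample response values are placeholders, not output from a real compilation.

```javascript
// Hypothetical sketch: a two-file project passed to runTolkCompiler.
// Every file, including the entrypoint, appears in the sources map
// keyed by its filename.
const sources = {
  "main.tolk": 'import "utils.tolk"\nfun main() {}', // illustrative Tolk code
  "utils.tolk": "fun helper() {}",
};

const request = {
  entrypointFileName: "main.tolk",
  sources,
};

// The compilation result exposes these fields per the notes above;
// the values below are placeholders standing in for a real response.
function summarize(result) {
  return {
    ok: result.stderr === "",                     // stderr carries diagnostics
    hash: result.codeHashHex,                     // hex hash of compiled code
    hasBoc: typeof result.codeBoc64 === "string", // base64-encoded BOC present
  };
}

const fake = { fiftCode: "...", codeBoc64: "te6cc...", codeHashHex: "ab12", stderr: "" };
console.log(summarize(fake)); // → { ok: true, hash: 'ab12', hasBoc: true }
```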
Related MCP Servers
zen
Selfhosted notes app. Single golang binary, notes stored as markdown within SQLite, full-text search, very low resource usage
MCP-Deepseek_R1
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)
mcp-fhir
A Model Context Protocol implementation for FHIR
mcp
Inkdrop Model Context Protocol Server
mcp-appium-gestures
A Model Context Protocol (MCP) server providing resources and tools for Appium mobile gestures using the Actions API.
dubco-npm
The (Unofficial) dubco-mcp-server enables AI assistants to manage Dub.co short links via the Model Context Protocol. It provides three MCP tools: create_link for generating new short URLs, update_link for modifying existing links, and delete_link for removing short links.