2025 REX: premiers usages et retours terrain (first uses and field feedback)
REX on MCPs by @LaurentVeyssier
claude mcp add --transport stdio \
  --env MCP_HOST=localhost \
  --env MCP_PORT=8080 \
  --env MCP_LOG_LEVEL=info \
  datacraft-paris-2025-rex-mcp_premiers_usages_et_retours_terrain \
  -- npx -y datacraft-mcp-premiers-usages@latest
How to use
This MCP server presents the initial MCP usage scenarios and field feedback discussed during the 2025 session. Once started, it exposes a set of tools and data access capabilities that an LLM can discover and invoke through a standardized MCP client. You can connect your MCP client to the server to see the available tools, understand their input/output schemas, and test tool calls in real time. The server is designed to illustrate how tools can be discovered, bound to a context, and invoked dynamically by an AI agent to perform tasks such as data retrieval, reasoning over tool outputs, and orchestrating multiple actions in sequence.
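The discover-then-invoke flow described above maps onto a small set of JSON-RPC 2.0 messages defined by the MCP specification. The sketch below builds the three core client requests; the tool name "example_tool", its arguments, and the client metadata are placeholders for illustration, not tools actually exposed by this server.

```python
import json

def jsonrpc(method, params, msg_id):
    """Build a JSON-RPC 2.0 request, the envelope used on the MCP wire."""
    return {"jsonrpc": "2.0", "id": msg_id, "method": method, "params": params}

# 1. The client opens the session and negotiates capabilities.
init = jsonrpc("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "demo-client", "version": "0.1.0"},
}, 1)

# 2. The client asks the server which tools it exposes (discovery).
list_tools = jsonrpc("tools/list", {}, 2)

# 3. The client invokes one discovered tool by name.
#    "example_tool" is a placeholder, not a real tool on this server.
call = jsonrpc("tools/call", {"name": "example_tool", "arguments": {}}, 3)

for msg in (init, list_tools, call):
    print(json.dumps(msg))
```

An agent loops over steps 2 and 3, feeding each tool result back into its reasoning before deciding on the next call.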
How to install
Prerequisites:
- Node.js 18+ and npm (for the npx/npm installation path) or Python with uv/uvx if you prefer the Python transport. Optional Docker usage is supported if you want containerized execution.
Install via npx (Node.js):
- Ensure Node.js and npm are installed
- Run:
npx -y datacraft-mcp-premiers-usages@latest
Install via uvx (Python/uv) if you have the Python MCP client workflow:
- Ensure Python 3.8+ is installed
- Install uv (the uvx command ships with uv):
pipx install uv
- Start the MCP server (example, depending on your setup):
uvx datacraft-mcp-premiers-usages@latest
Docker (optional):
- Build or pull the server image and run it:
docker run -i --rm datacraft/mcp-premiers-usages:latest
Notes:
- Replace the package name with the exact published MCP server package if the name differs in your environment.
- The server will typically start on localhost:8080 unless overridden by environment variables.
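The endpoint resolution the notes describe can be sketched as a small helper; the variable names match the `claude mcp add` example at the top of this page, and the localhost:8080 defaults are the ones stated above.

```python
import os

def server_endpoint(env=None):
    """Resolve the MCP server endpoint: environment variables override
    the localhost:8080 default described in the notes above."""
    env = os.environ if env is None else env
    host = env.get("MCP_HOST", "localhost")
    port = int(env.get("MCP_PORT", "8080"))
    level = env.get("MCP_LOG_LEVEL", "info")
    return host, port, level

print(server_endpoint({}))                      # falls back to the defaults
print(server_endpoint({"MCP_PORT": "9000"}))    # a single override wins
```

Passing a dict instead of reading os.environ directly keeps the resolution logic easy to test.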
Additional notes
Tips and common considerations:
- Verify network access between the MCP client and the server. If you run locally, ensure MCP_HOST and MCP_PORT match your setup.
- If using npm/npx, you may want to pin the version (instead of latest) to maintain a stable schema.
- The server exposes tools whose schemas (inputs/outputs) may be documented in its API reference or via the MCP Inspector tooling.
- Environment variables like MCP_LOG_LEVEL can help with debugging; set to debug if you hit issues.
- If you encounter tool discovery issues, ensure the server has started correctly and that your MCP client is configured to connect to the same host/port.
- For production, consider running the server in a container and proxying through a secure connection; review the MCP transport used in your environment (e.g., stdio for local processes, streamable HTTP/SSE for remote access).
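For the network-access check in the first tip, a minimal TCP probe can rule out connectivity problems before you debug at the MCP layer. This is a generic sketch, not part of the server, and it only applies when the server listens on a TCP port (with stdio transport there is no port to probe).

```python
import socket

def can_reach(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds.
    A quick sanity check before investigating tool-discovery failures."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False for the host/port your client is configured with, fix connectivity (or the MCP_HOST/MCP_PORT values) before looking at tool discovery.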
Related MCP Servers
Wax
Sub-Millisecond RAG on Apple Silicon. No Server. No API. One File. Pure Swift
wanaku
Wanaku MCP Router
mcp
Octopus Deploy Official MCP Server
mem0
✨ mem0 MCP Server: A memory system using mem0 for AI applications with Model Context Protocol (MCP) integration. Enables long-term memory for AI agents as a drop-in MCP server.
furi
CLI & API for MCP management
Pare
Dev tools, optimized for agents. Structured, token-efficient MCP servers for git, test runners, npm, Docker, and more.