mcp
MCP server from nathanjclark/mcp-server
claude mcp add --transport stdio nathanjclark-mcp-server -- docker run -i nathanjclark/mcp-server
How to use
This MCP server implements the Model Context Protocol with OAuth 2.1 authentication, a PostgreSQL database, and built-in AI tooling. It exposes a central MCP endpoint at /mcp and protects tools, resources, and prompts behind OAuth 2.1 authorization flows. You can test the server quickly with the MCP Inspector or with curl, exercising initialize, tools/list, and resources/read once authenticated. A registry-based architecture coordinates tools, resources, and prompts, and AI integration via the rig crate provides extensible capabilities for natural language processing and data operations.
To use the server, first run it (via Shuttle deployment in production or locally with Docker). You will authenticate through Auth0 and obtain a bearer token. Once authenticated, you can list available tools with tools/list, call a tool with tools/call, list resources with resources/list, read a resource with resources/read, and inspect prompts with prompts/list or prompts/get. The MCP endpoint at POST /mcp accepts JSON-RPC 2.0 requests and returns standard MCP responses, with protected methods requiring a valid access token. The MCP Inspector tool is recommended for exploring capabilities and generating client configurations such as mcp.json.
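The JSON-RPC 2.0 envelopes described above can be sketched by building them by hand. A minimal Python sketch, assuming the method names and 2024-11-05 protocol version shown in this README's examples:

```python
import json

# Sketch of the JSON-RPC 2.0 envelopes the POST /mcp endpoint accepts.
# The exact capabilities and tool names depend on the server's registry.
def make_request(method: str, params: dict, req_id: int) -> str:
    """Serialize a JSON-RPC 2.0 request body for POST /mcp."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": method,
        "params": params,
        "id": req_id,
    })

# Public handshake (no auth required).
init = make_request("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "test-client", "version": "1.0.0"},
}, 1)

# Protected call: list tools (requires the bearer token obtained via Auth0).
list_tools = make_request("tools/list", {}, 2)

print(json.loads(init)["method"])        # initialize
print(json.loads(list_tools)["method"])  # tools/list
```

The same helper covers tools/call, resources/list, resources/read, prompts/list, and prompts/get by swapping the method name and params.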
How to install
Prerequisites:
- Rust toolchain (rustup), needed if building from source or contributing
- Docker or Shuttle CLI for running the server in containerized form
- Access to an Auth0 tenant for OAuth 2.1 authentication
- PostgreSQL (or use Shuttle-managed DB in deployment)
- Optional: OpenAI API key for AI features
Installation steps (local / development):
- Clone the repository:

  git clone <this-repo-url>
  cd mcp-server
- Install and configure prerequisites:
  - Install Rust: https://rustup.rs/
  - Install Docker: https://docs.docker.com/get-docker/
  - Install Shuttle CLI (for deployment): curl -sSfL https://www.shuttle.dev/install | bash
- Set up authentication and secrets:
  - Create a Secrets.toml in the project root and populate it with your Auth0 and OpenAI keys, for example:

    cat > Secrets.toml << EOF
    AUTH0_DOMAIN = 'your-tenant.auth0.com'
    AUTH0_CLIENT_ID = 'your-client-id'
    AUTH0_CLIENT_SECRET = 'your-client-secret'
    AUTH0_CALLBACK_URL = 'http://localhost:8000/auth/callback'
    SESSION_JWT_SECRET = 'your-very-long-random-secret-key-at-least-32-chars'
    OPENAI_API_KEY = 'sk-your-openai-api-key' # Optional
    EOF
- Run locally (Docker-based or via Shuttle):
  - Using Docker (build/run):

    docker build -t mcp-server .
    docker run -p 8000:8000 --env-file Secrets.toml mcp-server

  - Using Shuttle (recommended for production): shuttle run
- Test the server:
  - Public initialize call (no auth required):

    curl -X POST http://localhost:8000/mcp \
      -H "Content-Type: application/json" \
      -d '{"jsonrpc": "2.0", "method": "initialize", "params": {"protocolVersion": "2024-11-05", "capabilities": {}, "clientInfo": {"name": "test-client", "version": "1.0.0"}}, "id": 1}'
- Obtain and use an access token with MCP Inspector to test protected methods (tools/list, resources/read, prompts/get, etc.).
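Once you have an access token, protected methods are called with the token in the Authorization header. A minimal Python sketch, assuming the localhost URL from the steps above; the token is a placeholder, and the request is built but not sent:

```python
import json
import urllib.request

def mcp_call(url: str, token: str, method: str, params: dict,
             req_id: int = 1) -> urllib.request.Request:
    """Build (but do not send) an authenticated POST /mcp request."""
    body = json.dumps({
        "jsonrpc": "2.0",
        "method": method,
        "params": params,
        "id": req_id,
    }).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            # Protected MCP methods require the bearer token from Auth0.
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

req = mcp_call("http://localhost:8000/mcp", "YOUR_ACCESS_TOKEN", "tools/list", {})
print(req.get_header("Authorization"))  # Bearer YOUR_ACCESS_TOKEN
```

Sending the request with urllib.request.urlopen(req) returns the standard MCP JSON-RPC response; a 401 here usually means the token is missing, expired, or lacks the required scopes.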
Additional notes
- Ensure your Auth0 domain and client credentials are correctly configured in Secrets.toml and that your OAuth callback URL matches the deployed host.
- The OPENAI_API_KEY is optional; enable it if you plan to use AI-powered tools via the rig crate.
- If you encounter authentication errors, verify token scopes and that the Bearer token is sent in the Authorization header for protected MCP methods.
- The server uses a registry-based approach to expose tools, resources, and prompts; update the registry to add new capabilities.
- In production, configure proper TLS termination (e.g., via Shuttle or a reverse proxy) and rotate secrets regularly.
- When testing locally, MCP Inspector supports both OAuth-driven and token-based testing flows; use the session token for bearer-authenticated calls.
- If you modify the code, run cargo build --release to generate an optimized binary for production use.
Related MCP Servers
goose
an open source, extensible AI agent that goes beyond code suggestions - install, execute, edit, and test with any LLM
cunzhi
Say goodbye to the frustration of AI stopping early; helps AI stay persistent. (translated from Chinese)
probe
AI-friendly semantic code search engine for large codebases. Combines ripgrep speed with tree-sitter AST parsing. Powers AI coding assistants with precise, context-aware code understanding.
mcp-center
A centralized platform for managing and connecting MCP servers. MCP Center provides a high-performance proxy service that enables seamless communication between MCP clients and multiple MCP servers.
backlog-rust
MCP server for Backlog, project management service.
perplexity-web-api
🔍 Perplexity AI MCP without API key