trustwise
Trustwise MCP server.

```shell
claude mcp add --transport stdio trustwiseai-trustwise-mcp-server \
  --env TW_API_KEY="<YOUR_TRUSTWISE_API_KEY>" \
  --env TW_BASE_URL="<OPTIONAL_TRUSTWISE_INSTANCE_URL>" \
  -- docker run -i --rm -e TW_API_KEY -e TW_BASE_URL \
  ghcr.io/trustwiseai/trustwise-mcp-server:latest
```
How to use
Trustwise MCP Server exposes a suite of evaluation tools (metrics) that you can call as part of an MCP workflow. The server is backed by the Trustwise metrics suite, including faithfulness, relevancy (answer and context), PII detection, prompt injection risk, summarization quality, clarity, formality, usefulness, sensitivity, toxicity, refusal detection, completion, adherence, stability, and environmental estimates such as carbon footprint and cost.

You can invoke these tools to assess a model response against a given prompt or context, integrate the evaluations into your agent or orchestration pipeline, and compare outputs across runs. The server is containerized and exposed via an MCP interface, so you can connect to it from your MCP client or orchestrator just like any other MCP server.

To get started, configure the Docker-based run with your Trustwise API key and (optionally) a base URL for your Trustwise instance, then call the tools with the appropriate metric names and inputs in your MCP requests.
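Under the hood, an MCP client ultimately sends a JSON-RPC `tools/call` message to the server. The sketch below shows the general shape of such a request; the tool name `faithfulness` and the argument keys are illustrative assumptions, not the documented Trustwise schema, so consult the Trustwise SDK/docs for the exact names and input formats.

```python
import json

# Sketch of the JSON-RPC "tools/call" message an MCP client sends over
# stdio. The tool name "faithfulness" and the argument keys ("query",
# "response", "context") are illustrative assumptions.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "faithfulness",
        "arguments": {
            "query": "What is the capital of France?",
            "response": "The capital of France is Paris.",
            "context": "France is a country in Europe; its capital is Paris.",
        },
    },
}
print(json.dumps(request, indent=2))
```

Your MCP client normally builds this message for you; the point is that each metric is just a named tool taking the prompt, response, and optional context as arguments.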
How to install
Prerequisites:
- Docker installed on your machine or host (see Docker docs for installation steps)
- A Trustwise API Key (you can obtain one from https://trustwise.ai)
- Basic knowledge of MCP workflows and how to send requests to an MCP server
Installation steps:
- Ensure Docker is running on your machine.
- Pull and run the Trustwise MCP Server container, passing your Trustwise API key via the environment (the examples in this section use placeholders for the key and base URL).
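The standalone docker-run approach can be sketched as follows; replace the placeholders with real values, and omit `TW_BASE_URL` if you are targeting the default Trustwise instance:

```shell
# -i keeps stdin open, which the stdio MCP transport requires;
# --rm removes the container when the session ends.
docker run -i --rm \
  -e TW_API_KEY="<YOUR_TRUSTWISE_API_KEY>" \
  -e TW_BASE_URL="<OPTIONAL_TRUSTWISE_INSTANCE_URL>" \
  ghcr.io/trustwiseai/trustwise-mcp-server:latest
```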
- Save the configuration to your MCP client (the example shows how to reference the trustwise server in an MCP settings file):
```json
{
  "mcpServers": {
    "trustwise": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "TW_API_KEY",
        "-e", "TW_BASE_URL",
        "ghcr.io/trustwiseai/trustwise-mcp-server:latest"
      ],
      "env": {
        "TW_API_KEY": "<YOUR_TRUSTWISE_API_KEY>",
        "TW_BASE_URL": "<OPTIONAL_TRUSTWISE_INSTANCE_URL>"
      }
    }
  }
}
```
- If you plan to point to a specific Trustwise instance, set TW_BASE_URL in the environment or in your MCP client config as shown above.
- Step-by-step usage depends on your MCP client, but in general you send a request to the trustwise server with the prompt/context and specify which metric(s) you want to evaluate.
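As a quick sanity check on the settings entry, you can parse it and confirm the pieces the stdio transport relies on are wired up. This is a minimal sketch, assuming the JSON configuration shown above:

```python
import json

# Parse the MCP settings entry and verify the docker invocation: an
# interactive run (stdio needs stdin kept open) plus the two Trustwise
# environment variables.
settings = json.loads("""
{
  "mcpServers": {
    "trustwise": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "TW_API_KEY", "-e", "TW_BASE_URL",
               "ghcr.io/trustwiseai/trustwise-mcp-server:latest"],
      "env": {
        "TW_API_KEY": "<YOUR_TRUSTWISE_API_KEY>",
        "TW_BASE_URL": "<OPTIONAL_TRUSTWISE_INSTANCE_URL>"
      }
    }
  }
}
""")
server = settings["mcpServers"]["trustwise"]
assert server["command"] == "docker"
assert "-i" in server["args"]  # stdio transport requires interactive mode
assert {"TW_API_KEY", "TW_BASE_URL"} <= server["env"].keys()
print("settings entry looks well-formed")
```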
Optional notes:
- You can replace the image tag with a specific version if you want reproducible builds (for example: ghcr.io/trustwiseai/trustwise-mcp-server:1.2.3).
- Ensure your TW_API_KEY is kept secure and not committed to any public repository.
Additional notes
Tips and common considerations:
- The TW_BASE_URL environment variable is optional and only needed if you want to target a non-default Trustwise instance.
- For best results, provide a well-structured context and prompt to the MCP client so the reliability and relevance metrics can be meaningfully evaluated.
- Monitor container resource usage (CPU/memory) when running multiple metrics in parallel.
- If you encounter authentication issues, verify that your API key is valid and has the necessary permissions for the Trustwise API.
- The server exposes a wide range of metrics; consult Trustwise SDK/docs for available argument formats and examples for each metric.
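When evaluating several metrics over the same response, fanning the calls out concurrently keeps total latency close to that of the slowest single metric. The sketch below is illustrative: `evaluate_metric` is a stand-in for whatever call your MCP client makes to the Trustwise server, and the metric names are assumed, not taken from the Trustwise docs.

```python
import asyncio

# Stand-in for the real MCP round trip to the Trustwise server; replace
# the body with your client's tool-call invocation.
async def evaluate_metric(name: str, response: str) -> dict:
    await asyncio.sleep(0.01)  # simulates network/evaluation latency
    return {"metric": name, "score": 0.9}  # dummy score for the sketch

async def evaluate_all(response: str) -> list:
    metrics = ["faithfulness", "clarity", "toxicity"]  # illustrative subset
    # gather() runs the evaluations concurrently and preserves order.
    return await asyncio.gather(
        *(evaluate_metric(m, response) for m in metrics)
    )

results = asyncio.run(evaluate_all("The capital of France is Paris."))
for r in results:
    print(f'{r["metric"]}: {r["score"]}')
```

Keep the resource-usage tip above in mind: each in-flight evaluation adds load on the container, so cap concurrency (for example with `asyncio.Semaphore`) if you batch many responses.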
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers, written using FastMCP.