# 🦉 Trustwise MCP Server
The Trustwise MCP Server is a Model Context Protocol (MCP) server that provides a suite of advanced evaluation tools for AI safety, alignment, and performance. It enables developers and AI tools to programmatically assess the quality, safety, and cost of LLM outputs using Trustwise's industry-leading metrics.
## 💡 Use Cases
- Evaluating the safety and reliability of LLM responses.
- Measuring alignment, clarity, and helpfulness of AI-generated content.
- Estimating the carbon footprint and cost of model inference.
- Integrating robust evaluation into AI pipelines, agents, or orchestration frameworks.
## 🛠️ Prerequisites

- A Trustwise API Key (get one here)
- Docker (follow the installation instructions)
## 📦 Installation & Running

### Claude Desktop
To connect the Trustwise MCP Server to Claude Desktop, add the following configuration to your Claude Desktop settings:
```json
{
  "mcpServers": {
    "trustwise": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "TW_API_KEY",
        "ghcr.io/trustwiseai/trustwise-mcp-server:latest"
      ],
      "env": {
        "TW_API_KEY": "<YOUR_TRUSTWISE_API_KEY>"
      }
    }
  }
}
```
To point to a specific Trustwise instance, also set the optional `TW_BASE_URL` environment variable under `env`:

`"TW_BASE_URL": "<YOUR_TRUSTWISE_INSTANCE_URL>"`

e.g. `"TW_BASE_URL": "https://api.yourdomain.ai"`
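Putting both variables together, the `env` block would look like the following sketch (the instance URL shown is a placeholder, not a real endpoint):

```json
{
  "env": {
    "TW_API_KEY": "<YOUR_TRUSTWISE_API_KEY>",
    "TW_BASE_URL": "https://api.yourdomain.ai"
  }
}
```

If `TW_BASE_URL` is omitted, the server uses the default Trustwise API endpoint.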
### Cursor

To connect the Trustwise MCP Server to Cursor, add the following configuration to your Cursor settings:
```json
{
  "mcpServers": {
    "trustwise": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "TW_API_KEY",
        "-e",
        "TW_BASE_URL",
        "ghcr.io/trustwiseai/trustwise-mcp-server:latest"
      ],
      "env": {
        "TW_API_KEY": "<YOUR_TRUSTWISE_API_KEY>"
      }
    }
  }
}
```
Replace `<YOUR_TRUSTWISE_API_KEY>` with your actual Trustwise API key.
## 🧰 Tools

The Trustwise MCP Server exposes the following tools (metrics). Each tool is called with its required arguments to evaluate a model response.
### 🛡️ Trustwise Metrics
| Tool Name | Description |
|---|---|
| `faithfulness_metric` | Evaluate the faithfulness of a response to its context |
| `answer_relevancy_metric` | Evaluate relevancy of a response to the query |
| `context_relevancy_metric` | Evaluate relevancy of context to the query |
| `pii_metric` | Detect PII in a response |
| `prompt_injection_metric` | Detect prompt injection risk |
| `summarization_metric` | Evaluate summarization quality |
| `clarity_metric` | Evaluate clarity of a response |
| `formality_metric` | Evaluate formality of a response |
| `helpfulness_metric` | Evaluate helpfulness of a response |
| `sensitivity_metric` | Evaluate sensitivity of a response |
| `simplicity_metric` | Evaluate simplicity of a response |
| `tone_metric` | Evaluate tone of a response |
| `toxicity_metric` | Evaluate toxicity of a response |
| `refusal_metric` | Detect refusal to answer or comply with the query |
| `completion_metric` | Evaluate completion of the query's instruction |
| `adherence_metric` | Evaluate adherence to a given policy or instruction |
| `stability_metric` | Evaluate stability (consistency) of multiple responses |
| `carbon_metric` | Estimate carbon footprint of a response |
| `cost_metric` | Estimate cost of a response |
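Once connected, an MCP client invokes a tool with the protocol's standard `tools/call` request. The sketch below shows what such a request might look like for `faithfulness_metric`; the argument names (`query`, `response`, `context`) are illustrative assumptions — consult the server's tool schema or the Trustwise SDK for the exact parameters:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "faithfulness_metric",
    "arguments": {
      "query": "What is the capital of France?",
      "response": "The capital of France is Paris.",
      "context": "Paris is the capital and largest city of France."
    }
  }
}
```

MCP clients such as Claude Desktop and Cursor construct these requests automatically; you only need this shape when driving the server from your own client or pipeline.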
For more examples and advanced usage, see the official Trustwise SDK.
## 📄 License
This project is licensed under the terms of the MIT open source license. See LICENSE for details.
## 🔒 Security
- Do not commit secrets or API keys.
- This repository is public; review all code and documentation for sensitive information before pushing.