# Artificial Analysis MCP Server

An unofficial MCP (Model Context Protocol) server that provides LLM model pricing, speed metrics, and benchmark scores from [Artificial Analysis](https://artificialanalysis.ai).
## Features
- Get real-time pricing for 300+ LLM models (input/output/blended rates)
- Compare speed metrics (tokens/sec, time to first token)
- Access benchmark scores (Intelligence Index, Coding Index, MMLU-Pro, GPQA, and more)
- Filter by provider (OpenAI, Anthropic, Google, etc.)
- Sort by any metric
## Installation

### Claude Code

```bash
claude mcp add artificial-analysis -e AA_API_KEY=your-key -- npx -y artificial-analysis-mcp
```

Or install from GitHub:

```bash
claude /mcp add https://github.com/davidhariri/artificial-analysis-mcp
```
### Manual Configuration

Add to your Claude settings (`~/.claude/settings.json`):

```json
{
  "mcpServers": {
    "artificial-analysis": {
      "command": "npx",
      "args": ["-y", "artificial-analysis-mcp"],
      "env": {
        "AA_API_KEY": "your-api-key"
      }
    }
  }
}
```
## Configuration

| Environment Variable | Required | Description |
|---|---|---|
| `AA_API_KEY` | Yes | Your Artificial Analysis API key |

Get your API key at [artificialanalysis.ai](https://artificialanalysis.ai).
## Tools

### list_models

List all available LLM models with optional filtering and sorting.
Parameters:
| Name | Type | Required | Description |
|---|---|---|---|
| `creator` | string | No | Filter by model creator (e.g., "OpenAI", "Anthropic") |
| `sort_by` | string | No | Sort field (see below) |
| `sort_order` | string | No | `"asc"` or `"desc"` (default: `"desc"`) |
| `limit` | number | No | Maximum results to return |

Sort fields: `price_input`, `price_output`, `price_blended`, `speed`, `ttft`, `intelligence_index`, `coding_index`, `math_index`, `mmlu_pro`, `gpqa`, `release_date`
Example usage:
- "List the top 5 fastest models"
- "Show me Anthropic models sorted by price"
- "What are the cheapest models with high intelligence scores?"
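A request like "show me Anthropic models sorted by price" maps onto a `list_models` tool call with arguments along these lines (a sketch of the argument payload only; the surrounding JSON-RPC envelope is handled by the MCP client, and the choice of `price_blended` as the sort field is illustrative):

```json
{
  "name": "list_models",
  "arguments": {
    "creator": "Anthropic",
    "sort_by": "price_blended",
    "sort_order": "asc",
    "limit": 5
  }
}
```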
### get_model

Get detailed information about a specific model.
Parameters:
| Name | Type | Required | Description |
|---|---|---|---|
| `model` | string | Yes | Model name or slug (e.g., "gpt-4o", "claude-4-5-sonnet") |
Returns: Complete model details including pricing, speed metrics, and all benchmark scores.
Example usage:
- "Get pricing for GPT-4o"
- "What are Claude 4.5 Sonnet's benchmark scores?"
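A request like "get pricing for GPT-4o" reduces to a `get_model` call with a single argument (a sketch of the argument payload; the slug is taken from the example in the parameters table above):

```json
{
  "name": "get_model",
  "arguments": {
    "model": "gpt-4o"
  }
}
```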
## Model Data
Each model includes:
- Pricing: Input/output/blended rates per 1M tokens (USD)
- Speed: Output tokens per second, time to first token
- Benchmarks: Intelligence Index, Coding Index, Math Index, MMLU-Pro, GPQA, LiveCodeBench, and more
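Concretely, a returned record might be shaped roughly like the following. This is purely illustrative: the field names and all values are hypothetical and do not reflect the server's actual response schema.

```json
{
  "name": "gpt-4o",
  "creator": "OpenAI",
  "pricing": {
    "price_input": 2.50,
    "price_output": 10.00,
    "price_blended": 4.40
  },
  "speed": {
    "tokens_per_second": 100,
    "ttft_seconds": 0.5
  },
  "benchmarks": {
    "intelligence_index": 50,
    "coding_index": 40,
    "mmlu_pro": 0.74,
    "gpqa": 0.51
  }
}
```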
## Development

```bash
# Install dependencies
npm install

# Build
npm run build

# Run locally
AA_API_KEY=your-key node dist/index.js
```
## License
MIT