ATOM MCP Server
AI Inference Pricing Intelligence — delivered as a native tool for AI agents.
1,600+ SKUs · 40+ vendors · 6 modalities · 4 channels · 14 AIPI indexes · Updated weekly
Website · ATOM MCP Pro · Smithery
What Is This?
ATOM MCP Server lets any MCP-compatible AI agent (Claude, GPT, Cursor, Windsurf, VS Code Copilot) query real-time AI inference pricing data programmatically. Think of it as the Bloomberg Terminal for AI pricing, accessible via the Model Context Protocol.
Ask your AI assistant a question like "What's the cheapest way to run GPT-4o?" and it calls ATOM's tools behind the scenes, returning a data-backed answer from 1,600+ pricing SKUs across 40+ vendors globally.
Built by ATOM (A7OM) — the world's first methodological inference pricing index.
AIPI Indexes
14 benchmark indexes across four categories:
| Category | Indexes | What It Answers |
|---|---|---|
| Modality | TXT, MML, IMG, AUD, VID, VOC | What does this type of inference cost? |
| Channel | DEV, CLD, PLT, NCL | Where should you buy — direct, marketplace, platform, or neocloud? |
| Tier | FTR, BDG, RSN | What's the premium for flagship vs budget vs reasoning? |
| Special | OSS | How much cheaper is open-source inference across all channels? |
All indexes are global (GLB), calculated weekly using chained matched-model methodology to eliminate composition bias.
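To make the methodology concrete, here is a minimal sketch of a chained matched-model index, assuming an unweighted geometric mean of price relatives over matched SKUs. Type and function names are illustrative; ATOM's production weighting and matching rules may differ.

```typescript
// Each week maps SKU id -> price. Only SKUs present in BOTH adjacent weeks
// enter that week's chain link, so products entering or leaving the sample
// cannot shift the index (the composition bias the methodology eliminates).
type WeekPrices = Record<string, number>;

// One chain link: geometric mean of price relatives over matched SKUs.
function chainLink(prev: WeekPrices, curr: WeekPrices): number {
  const matched = Object.keys(prev).filter((sku) => sku in curr);
  if (matched.length === 0) return 1; // no overlap: carry the index forward
  const logSum = matched.reduce(
    (sum, sku) => sum + Math.log(curr[sku] / prev[sku]),
    0,
  );
  return Math.exp(logSum / matched.length);
}

// Chain weekly links onto a base value (e.g. 100).
function chainedIndex(weeks: WeekPrices[], base = 100): number[] {
  const index = [base];
  for (let i = 1; i < weeks.length; i++) {
    index.push(index[i - 1] * chainLink(weeks[i - 1], weeks[i]));
  }
  return index;
}
```

A SKU that first appears this week contributes nothing to this week's link; it only starts moving the index once it has two consecutive weeks of prices.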
Tools
| Tool | Tier | Description |
|---|---|---|
| `list_vendors` | Free | All tracked vendors with country, region, channel type, and pricing page URLs |
| `get_kpis` | Free | 6 market KPIs: output premium, caching savings, open-source advantage, context cost curve, caching availability, size spread |
| `get_index_benchmarks` | Free | AIPI price benchmarks across 14 indexes — modality, channel, tier, and licensing |
| `get_market_stats` | Tiered | Aggregate market intelligence: medians, quartiles, distributions, modality breakdown |
| `search_models` | Tiered | Multi-filter search: modality, vendor, creator, open-source, price range, context window, parameters |
| `get_model_detail` | Tiered | Full specs + pricing across all vendors for a single model |
| `compare_prices` | Tiered | Cross-vendor price comparison for a model or model family |
| `get_vendor_catalog` | Tiered | Complete catalog for a specific vendor: all models, modalities, and pricing |
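As an example of what a KPI like those returned by `get_kpis` measures, the "output premium" can be read as how much more output tokens cost than input tokens across the market. The sketch below takes it as the median output/input price ratio per SKU; this is an assumed definition for illustration, not ATOM's confirmed formula.

```typescript
// One SKU's pricing, in $ per 1M tokens. Field names are illustrative.
interface Sku {
  inputPrice: number;
  outputPrice: number;
}

// Assumed KPI definition: median ratio of output to input token price.
function outputPremium(skus: Sku[]): number {
  const ratios = skus
    .filter((s) => s.inputPrice > 0) // skip free/zero-priced input SKUs
    .map((s) => s.outputPrice / s.inputPrice)
    .sort((a, b) => a - b);
  const mid = Math.floor(ratios.length / 2);
  return ratios.length % 2
    ? ratios[mid]
    : (ratios[mid - 1] + ratios[mid]) / 2;
}
```

A median is used rather than a mean so that a handful of SKUs with extreme output markups cannot dominate the figure.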
Pricing Tiers
| Feature | ATOM MCP (Free) | ATOM MCP Pro ($49/mo) |
|---|---|---|
| Vendors, KPIs, AIPI indexes | ✅ Full data | ✅ Full data |
| Market stats | Aggregates only | + Vendor-level breakdown |
| Model search & comparison | Counts + price ranges | Full granular SKU data |
| Model detail | Specs only | + Per-vendor pricing |
| Vendor catalog | Summary only | Full SKU listing |
Free tier (no API key): Enough to understand the market — counts, ranges, distributions, benchmarks.
ATOM MCP Pro ($49/mo): Full granular data — every vendor, model, price, and spec. → a7om.com/mcp
Quick Start
Option 1: Remote URL — Claude.ai / Claude Desktop (recommended)
No install required. Connect directly to ATOM's hosted server:
Claude.ai (web): Settings → Connectors → Add custom connector
Name: ATOM Pricing Intelligence
URL: https://atom-mcp-server-production.up.railway.app/mcp
Claude Desktop: Settings → Developer → Edit Config
```json
{
  "mcpServers": {
    "atom-pricing": {
      "url": "https://atom-mcp-server-production.up.railway.app/mcp"
    }
  }
}
```
Note: Remote URL support requires a recent Claude Desktop version. If it doesn't work, use the npx method below.
Claude Desktop (via npx proxy):
```json
{
  "mcpServers": {
    "atom-pricing": {
      "command": "npx",
      "args": ["mcp-remote", "https://atom-mcp-server-production.up.railway.app/mcp"]
    }
  }
}
```
Option 2: Local (stdio) — for Cursor, Windsurf, etc.
```bash
git clone https://github.com/A7OM-AI/atom-mcp-server.git
cd atom-mcp-server
npm install && npm run build
```
Add to your MCP client config:
```json
{
  "mcpServers": {
    "atom-pricing": {
      "command": "node",
      "args": ["/path/to/atom-mcp-server/dist/index.js"],
      "env": {
        "SUPABASE_URL": "https://jonncmzxvxzwyaznokba.supabase.co",
        "SUPABASE_ANON_KEY": "your-anon-key"
      }
    }
  }
}
```
Option 3: Deploy your own (Railway)
Set environment variables in Railway dashboard:
- `SUPABASE_URL`
- `SUPABASE_ANON_KEY`
- `ATOM_API_KEYS` (comma-separated, for paid-tier validation)
- `TRANSPORT=http`
Example Queries
Once connected, just ask your AI assistant naturally:
- "What's the cheapest way to run GPT-4o?"
- "Compare Claude Sonnet 4.5 pricing across all vendors"
- "Find open-source text models under $0.50 per million tokens"
- "Show me Google's full model catalog"
- "What are the AIPI benchmark prices for text inference?"
- "How do neocloud prices compare to cloud marketplaces?"
- "How much cheaper is open-source inference?"
- "Give me a market overview of AI inference pricing"
- "What are the key market KPIs for AI inference?"
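Behind each natural-language question, the agent issues an MCP `tools/call` request. A sketch of what such a request might look like for the open-source search above (the argument names are illustrative assumptions, not the tool's confirmed schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_models",
    "arguments": {
      "modality": "text",
      "open_source": true,
      "max_price_per_mtok": 0.5
    }
  }
}
```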
Environment Variables
| Variable | Required | Description |
|---|---|---|
| `SUPABASE_URL` | Yes | Supabase project URL |
| `SUPABASE_ANON_KEY` | Yes | Supabase anonymous/public key |
| `ATOM_API_KEYS` | No | Comma-separated valid API keys for paid tier |
| `TRANSPORT` | No | `stdio` (default) or `http` |
| `PORT` | No | HTTP port (default: 3000) |
Tech Stack
- TypeScript / Node.js
- MCP SDK (`@modelcontextprotocol/sdk`)
- Supabase (PostgreSQL) via REST API
- Express (HTTP transport)
- Zod (schema validation)
About ATOM
ATOM tracks 1,600+ AI inference pricing SKUs from 40+ vendors globally through the AIPI (ATOM Inference Price Index) system — the first methodological price benchmark for AI inference. 14 indexes span modality, channel, tier, and licensing dimensions, updated weekly using chained matched-model methodology to eliminate composition bias.
Vendors are classified across four distribution channels: Model Developers (direct API), Cloud Marketplaces (AWS Bedrock, Google Vertex, Azure), Inference Platforms (DeepInfra, Fireworks, Together AI), and Neoclouds (Groq, Cerebras).
Products: ATOM MCP · ATOM Terminal · ATOM Feed
License
MIT
ATOM — The Global Price Benchmark for AI Inference.