ATOM MCP Server
AI Inference Pricing Intelligence — delivered as a native tool for AI agents.
1,600+ SKUs · 40+ vendors · 6 modalities · 4 channels · 14 AIPI indexes · Updated weekly
Website · ATOM MCP Pro · Smithery
What Is This?
ATOM MCP Server lets any MCP-compatible AI agent (Claude, GPT, Cursor, Windsurf, VS Code Copilot) query real-time AI inference pricing data programmatically. Think of it as the Bloomberg Terminal for AI pricing, accessible via the Model Context Protocol.
Ask your AI assistant a question like "What's the cheapest way to run GPT-4o?" and it calls ATOM's tools behind the scenes, returning a data-backed answer from 1,600+ pricing SKUs across 40+ vendors globally.
Built by ATOM (A7OM) — the world's first methodological inference pricing index.
AIPI Indexes
14 benchmark indexes across four categories:
| Category | Indexes | What It Answers |
|---|---|---|
| Modality | TXT, MML, IMG, AUD, VID, VOC | What does this type of inference cost? |
| Channel | DEV, CLD, PLT, NCL | Where should you buy — direct, marketplace, platform, or neocloud? |
| Tier | FTR, BDG, RSN | What's the premium for flagship vs budget vs reasoning? |
| Special | OSS | How much cheaper is open-source inference across all channels? |
All indexes are global (GLB), calculated weekly using chained matched-model methodology to eliminate composition bias.
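To illustrate the idea behind chained matched-model indexing (this is a generic textbook sketch, not ATOM's exact formula): each weekly link is a geometric mean of price relatives over only the models priced in both weeks, and links are multiplied together so that models entering or leaving the market never cause an artificial jump.

```typescript
type PriceTable = Record<string, number>; // model SKU -> price per million tokens

// One chain link: geometric mean of price relatives over the
// "matched" set — models priced in both consecutive periods.
function linkRelative(prev: PriceTable, curr: PriceTable): number {
  const matched = Object.keys(curr).filter((m) => m in prev);
  if (matched.length === 0) return 1;
  const logSum = matched.reduce((s, m) => s + Math.log(curr[m] / prev[m]), 0);
  return Math.exp(logSum / matched.length);
}

// Chain the links. New or retired models only affect the index once
// they appear in two consecutive periods, which is how composition
// bias is avoided.
function chainedIndex(periods: PriceTable[], base = 100): number[] {
  const index = [base];
  for (let t = 1; t < periods.length; t++) {
    index.push(index[t - 1] * linkRelative(periods[t - 1], periods[t]));
  }
  return index;
}
```

For example, if model `a` halves in price, `b` holds steady, and `c` launches mid-series, the index moves by the geometric mean of the matched relatives (√0.5 ≈ 0.707), and `c`'s arrival alone changes nothing.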
Tools
| Tool | Tier | Description |
|---|---|---|
| `list_vendors` | Free | All tracked vendors with country, region, channel type, and pricing page URLs |
| `get_kpis` | Free | 6 market KPIs: output premium, caching savings, open-source advantage, context cost curve, caching availability, size spread |
| `get_index_benchmarks` | Free | AIPI price benchmarks across 14 indexes — modality, channel, tier, and licensing |
| `get_market_stats` | Tiered | Aggregate market intelligence: medians, quartiles, distributions, modality breakdown |
| `search_models` | Tiered | Multi-filter search: modality, vendor, creator, open-source, price range, context window, parameters |
| `get_model_detail` | Tiered | Full specs + pricing across all vendors for a single model |
| `compare_prices` | Tiered | Cross-vendor price comparison for a model or model family |
| `get_vendor_catalog` | Tiered | Complete catalog for a specific vendor: all models, modalities, and pricing |
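Under the hood, an MCP client invokes any of these tools with a JSON-RPC 2.0 `tools/call` request. A sketch of the request envelope for `search_models` — note the argument names here are illustrative, not ATOM's actual parameter schema:

```typescript
// JSON-RPC 2.0 envelope for an MCP tool call. The tool name comes
// from the table above; the filter argument names are hypothetical.
const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "search_models",
    arguments: {
      modality: "text",          // illustrative filter
      open_source: true,         // illustrative filter
      max_price_per_mtok: 0.5,   // illustrative filter
    },
  },
};

console.log(JSON.stringify(request, null, 2));
```

Your MCP client builds this envelope for you — you only ever phrase the question in natural language.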
Pricing Tiers
| Capability | ATOM MCP (Free) | ATOM MCP Pro ($49/mo) |
|---|---|---|
| Vendors, KPIs, AIPI indexes | ✅ Full data | ✅ Full data |
| Market stats | Aggregates only | + Vendor-level breakdown |
| Model search & comparison | Counts + price ranges | Full granular SKU data |
| Model detail | Specs only | + Per-vendor pricing |
| Vendor catalog | Summary only | Full SKU listing |
Free tier (no API key): Enough to understand the market — counts, ranges, distributions, benchmarks.
ATOM MCP Pro ($49/mo): Full granular data — every vendor, model, price, and spec. → a7om.com/mcp
Quick Start
Option 1: Remote URL — Claude.ai / Claude Desktop (recommended)
No install required. Connect directly to ATOM's hosted server:
Claude.ai (web): Settings → Connectors → Add custom connector
Name: ATOM Pricing Intelligence
URL: https://atom-mcp-server-production.up.railway.app/mcp
Claude Desktop: Settings → Developer → Edit Config
```json
{
  "mcpServers": {
    "atom-pricing": {
      "url": "https://atom-mcp-server-production.up.railway.app/mcp"
    }
  }
}
```
Note: Remote URL support requires a recent Claude Desktop version. If it doesn't work, use the npx method below.
Claude Desktop (via npx proxy):
```json
{
  "mcpServers": {
    "atom-pricing": {
      "command": "npx",
      "args": ["mcp-remote", "https://atom-mcp-server-production.up.railway.app/mcp"]
    }
  }
}
```
Option 2: Local (stdio) — for Cursor, Windsurf, etc.
```bash
git clone https://github.com/A7OM-AI/atom-mcp-server.git
cd atom-mcp-server
npm install && npm run build
```
Add to your MCP client config:
```json
{
  "mcpServers": {
    "atom-pricing": {
      "command": "node",
      "args": ["/path/to/atom-mcp-server/dist/index.js"],
      "env": {
        "SUPABASE_URL": "https://jonncmzxvxzwyaznokba.supabase.co",
        "SUPABASE_ANON_KEY": "your-anon-key"
      }
    }
  }
}
```
Option 3: Deploy your own (Railway)
Set the following environment variables in the Railway dashboard:
- `SUPABASE_URL`
- `SUPABASE_ANON_KEY`
- `ATOM_API_KEYS` (comma-separated, for paid-tier validation)
- `TRANSPORT=http`
Example Queries
Once connected, just ask your AI assistant naturally:
- "What's the cheapest way to run GPT-4o?"
- "Compare Claude Sonnet 4.5 pricing across all vendors"
- "Find open-source text models under $0.50 per million tokens"
- "Show me Google's full model catalog"
- "What are the AIPI benchmark prices for text inference?"
- "How do neocloud prices compare to cloud marketplaces?"
- "How much cheaper is open-source inference?"
- "Give me a market overview of AI inference pricing"
- "What are the key market KPIs for AI inference?"
Environment Variables
| Variable | Required | Description |
|---|---|---|
| `SUPABASE_URL` | Yes | Supabase project URL |
| `SUPABASE_ANON_KEY` | Yes | Supabase anonymous/public key |
| `ATOM_API_KEYS` | No | Comma-separated valid API keys for the paid tier |
| `TRANSPORT` | No | `stdio` (default) or `http` |
| `PORT` | No | HTTP port (default `3000`) |
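As a rough sketch of how these variables combine — this is hypothetical illustration code, not the server's actual implementation — the required/optional split and defaults from the table resolve like so:

```typescript
// Hypothetical config loader mirroring the table above:
// two required variables, three optional ones with documented defaults.
interface ServerConfig {
  supabaseUrl: string;
  supabaseAnonKey: string;
  atomApiKeys: string[];
  transport: "stdio" | "http";
  port: number;
}

function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  const required = (name: string): string => {
    const v = env[name];
    if (!v) throw new Error(`Missing required env var: ${name}`);
    return v;
  };
  return {
    supabaseUrl: required("SUPABASE_URL"),
    supabaseAnonKey: required("SUPABASE_ANON_KEY"),
    // Comma-separated list; absent means free tier only.
    atomApiKeys: (env.ATOM_API_KEYS ?? "").split(",").filter(Boolean),
    transport: env.TRANSPORT === "http" ? "http" : "stdio", // stdio is the default
    port: Number(env.PORT ?? 3000),
  };
}
```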
Tech Stack
- TypeScript / Node.js
- MCP SDK (`@modelcontextprotocol/sdk`)
- Supabase (PostgreSQL) via REST API
- Express (HTTP transport)
- Zod (schema validation)
About ATOM
ATOM tracks 1,600+ AI inference pricing SKUs from 40+ vendors globally through the AIPI (ATOM Inference Price Index) system — the first methodological price benchmark for AI inference. 14 indexes span modality, channel, tier, and licensing dimensions, updated weekly using chained matched-model methodology to eliminate composition bias.
Vendors are classified across four distribution channels: Model Developers (direct API), Cloud Marketplaces (AWS Bedrock, Google Vertex, Azure), Inference Platforms (DeepInfra, Fireworks, Together AI), and Neoclouds (Groq, Cerebras).
Products: ATOM MCP · ATOM Terminal · ATOM Feed
License
MIT
ATOM — The Global Price Benchmark for AI Inference.