TokenCost MCP Server
An MCP (Model Context Protocol) server that provides real-time LLM token pricing data for 60+ AI models across 15 providers.
Query, compare, and estimate costs for models from OpenAI, Anthropic, Google, Meta, xAI, Mistral, DeepSeek, and more — directly from your AI assistant.
Built by TokenCost — the free LLM token cost calculator.
Tools
| Tool | Description |
|---|---|
| `tokencost_get_model_pricing` | Get pricing for a specific model |
| `tokencost_compare_models` | Side-by-side pricing comparison |
| `tokencost_estimate_cost` | Calculate the cost for given token counts |
| `tokencost_find_cheapest` | Find the cheapest models, with filters |
| `tokencost_list_models` | List all available models |
| `tokencost_list_providers` | List all providers with their pricing ranges |
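MCP clients invoke these tools through the protocol's standard `tools/call` JSON-RPC method. A sketch of such a request, using the argument names shown in the examples below (values illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "tokencost_estimate_cost",
    "arguments": {
      "model": "gpt-5",
      "input_tokens": 1000000,
      "output_tokens": 0
    }
  }
}
```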
Quick Start
Claude Desktop / Cursor / Windsurf
Add to your MCP config:
```json
{
  "mcpServers": {
    "tokencost": {
      "command": "npx",
      "args": ["-y", "tokencost-mcp-server"]
    }
  }
}
```
From Source
```bash
git clone https://github.com/ankit-aglawe/tokencost-mcp-server
cd tokencost-mcp-server
npm install
npm run build
npm start
```
Example Usage
"How much would it cost to process 1M input tokens with GPT-5?"
→ Uses tokencost_estimate_cost with model="gpt-5", input_tokens=1000000, output_tokens=0
"Compare Claude Sonnet 4.6 vs GPT-5 vs Gemini 3 Pro pricing"
→ Uses tokencost_compare_models with ["claude-sonnet-4.6", "gpt-5", "gemini-3-pro"]
"What's the cheapest model with at least 200K context?"
→ Uses tokencost_find_cheapest with min_context=200000
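Cost estimation itself is simple arithmetic over per-million-token rates. A minimal sketch of the calculation (the function name and the rates in the example are illustrative, not TokenCost's actual data):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_price_per_m: float, output_price_per_m: float) -> float:
    """Return the cost in USD, given prices in USD per million tokens."""
    return (input_tokens / 1_000_000) * input_price_per_m \
         + (output_tokens / 1_000_000) * output_price_per_m

# 1M input tokens at an illustrative $1.25/M input rate:
print(estimate_cost(1_000_000, 0, 1.25, 10.00))  # 1.25
```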
Supported Providers
OpenAI, Anthropic, Google, xAI, Meta, Mistral, DeepSeek, Alibaba (Qwen), Amazon (Nova), NVIDIA, Cohere, Perplexity, Moonshot (Kimi), Zhipu (GLM), MiniMax
Pricing Data
Pricing is kept accurate and up to date by the TokenCost team. We track official provider announcements and update pricing as soon as changes are published — new models, price cuts, and deprecations are reflected within days.
If you notice outdated pricing or a missing model, open an issue and we'll get it updated.
License
MIT