# UsageWall

Live LLM pricing as an MCP server: six read-only tools to list, compare, estimate cost, and find the cheapest model across OpenAI, Anthropic, Google, DeepSeek, Mistral, xAI, and Meta. Ask Claude (or any MCP client) "how much does this prompt cost?" and get real numbers from a hand-checked pricing table. Free, read-only, no auth.
## TL;DR

- MCP endpoint: https://usagewall.vercel.app/api/mcp
- Install docs: usagewall.vercel.app/mcp
- Pricing JSON API: usagewall.vercel.app/api/pricing
- Live status: usagewall.vercel.app/status
- What's coming (waitlist): a hosted hard-cap proxy so a runaway loop can't drain your AI budget overnight.
Quick test:
```bash
curl -s https://usagewall.vercel.app/api/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
```
Install in Claude.ai: Settings → Connectors → Add custom connector → paste `https://usagewall.vercel.app/api/mcp`.

Install in Claude Desktop / Cursor / Windsurf / Zed / Cline: see /mcp. One common stdio-bridge config is sketched below.
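For clients that launch MCP servers as local stdio processes, a common pattern (an assumption here, not quoted from this project's /mcp page) is to bridge to the HTTP endpoint with the `mcp-remote` npm package; the `usagewall` key is just a label you choose:

```json
{
  "mcpServers": {
    "usagewall": {
      "command": "npx",
      "args": ["mcp-remote", "https://usagewall.vercel.app/api/mcp"]
    }
  }
}
```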
## The MCP server

Six read-only tools, two raw-JSON resources, JSON-RPC 2.0 over Streamable HTTP. Zero dependencies; ~280 LOC; auditable in one sitting.

- Tools — `list_models`, `get_model`, `compare_models`, `estimate_cost`, `cheapest_for`, `list_providers`.
- Resources — `usagewall://pricing/full`, `usagewall://pricing/<provider>`.
- Source — api/mcp.js. Tests — tests/mcp.test.js (18 cases). Smithery descriptor — smithery.yaml.
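Tool invocation is plain JSON-RPC `tools/call`. A minimal Node 18+ sketch (run it as an ES module); the `estimate_cost` argument names below are assumptions, so read the `inputSchema` returned by `tools/list` for the real ones:

```js
// call.mjs -- Node 18+ ships a global fetch, so no dependencies are needed.
const res = await fetch("https://usagewall.vercel.app/api/mcp", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    jsonrpc: "2.0",
    id: 2,
    method: "tools/call",
    params: {
      name: "estimate_cost",
      // Argument names are illustrative; check the tool's inputSchema.
      arguments: { model: "gpt-4o", input_tokens: 1200, output_tokens: 400 },
    },
  }),
});
console.log(JSON.stringify(await res.json(), null, 2));
```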
## What else is in this repo
Production-grade waitlist landing for usagewall.dev, plus the local proxy prototype. Static HTML + 5 Vercel serverless functions. No build step. No framework.
## Stack (100% free tier)
| Layer | Service | Free tier |
|---|---|---|
| Frontend hosting | Vercel Hobby | 100GB/mo bandwidth |
| Database | Supabase | 500MB DB + 50k MAU |
| Email confirmation | Resend | 3k emails/mo, 100/day |
| Anti-bot (optional) | Cloudflare Turnstile | 1M challenges/mo |
| CI | GitHub Actions | 2k min/mo (private) / unlimited (public) |
| Domain | Porkbun / Cloudflare | $10–15 / year (only paid item) |
## Layout
```
usagewall/
├── public/
│ ├── index.html # Landing — hero, problem, how, why, pricing, waitlist, footer
│ ├── privacy.html # Plain-English privacy page
│ ├── styles.css # Brand: Fraunces + Switzer + JetBrains Mono, dark warm
│ ├── app.js # Vanilla form-binding, fetch to /api/waitlist
│ ├── favicon.svg
│ ├── og.svg / og.png # 1200x630 social preview
├── api/
│ ├── waitlist.js # POST signup endpoint
│ └── unsubscribe.js # GET unsubscribe with HMAC token
├── lib/
│ ├── email.js # RFC-aware email validation + throwaway block
│ └── security.js # rate limit, IP hash, HMAC tokens, safe JSON parse
├── tests/
│ ├── email.test.js # 12 unit tests
│ ├── security.test.js # 23 unit tests
│ └── api.e2e.test.js # 13 end-to-end tests with mocked req/res
├── scripts/
│ ├── validate-html.js # CSP-friendly HTML linter (no inline handlers, alt, lang, etc.)
│ └── smoke-api.js # Live HTTP smoke against waitlist handler
├── supabase/
│ └── schema.sql # waitlist table + RLS deny-all + stats view
├── launch/
│ └── COPY.md # Twitter thread, Show HN, Reddit, IndieHackers, dev.to
├── .github/workflows/
│ └── test.yml # CI: tests + HTML validation + secret-leak grep
├── vercel.json # Security headers, CSP, cache rules
├── package.json # `npm test`, `npm run smoke`, `npm run ci`
├── .env.example
└── README.md
```
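The tree comments above double as a feature list. As one concrete illustration, the throwaway-domain block that lib/email.js advertises can be as small as this sketch (the domain list and function name are illustrative, not the repo's actual API):

```js
// Hypothetical sketch of a throwaway-email block; lib/email.js's real list
// and helper names may differ.
const THROWAWAY_DOMAINS = new Set([
  "mailinator.com",
  "guerrillamail.com",
  "10minutemail.com",
]);

function isThrowaway(email) {
  const at = email.lastIndexOf("@");
  if (at === -1) return false; // no domain part; fails validation elsewhere
  return THROWAWAY_DOMAINS.has(email.slice(at + 1).toLowerCase());
}
```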
## First-time setup
### 1. Supabase

- Create a project at supabase.com (free tier).
- SQL editor → paste `supabase/schema.sql` → run.
- Settings → API → copy the `URL` and `service_role` key.
### 2. Generate two secrets (required)

```bash
openssl rand -hex 32   # for IP_HASH_SALT
openssl rand -hex 32   # for UNSUBSCRIBE_SECRET
```

Both must be ≥32 chars. The handler refuses to sign tokens with shorter secrets; the sketch below shows the shape of that guard.
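lib/security.js isn't reproduced in this README, so this is a hypothetical sketch of the guard-then-sign pattern described above (function names and token format are assumptions, not the repo's actual API):

```js
import { createHmac, timingSafeEqual } from "node:crypto";

// Hypothetical helper: refuses weak secrets, then returns an HMAC token.
function signUnsubscribeToken(email, secret) {
  if (typeof secret !== "string" || secret.length < 32) {
    throw new Error("secret must be at least 32 characters"); // documented guard
  }
  return createHmac("sha256", secret).update(email).digest("hex");
}

// Verification recomputes and compares in constant time -- no DB lookup.
function verifyUnsubscribeToken(email, token, secret) {
  const expected = Buffer.from(signUnsubscribeToken(email, secret));
  const given = Buffer.from(String(token));
  return expected.length === given.length && timingSafeEqual(expected, given);
}
```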
### 3. Resend (optional but recommended)

- Sign up at resend.com.
- Add `usagewall.dev` as a domain (DNS records via your registrar).
- API keys → create one. Copy it.
### 4. Vercel deploy

```bash
cd usagewall
vercel link                                # creates .vercel/, do not commit
vercel env add SUPABASE_URL
vercel env add SUPABASE_SERVICE_ROLE_KEY
vercel env add IP_HASH_SALT
vercel env add UNSUBSCRIBE_SECRET
vercel env add RESEND_API_KEY              # optional
vercel env add RESEND_FROM                 # optional
vercel env add PUBLIC_URL                  # https://usagewall.dev
vercel env add ALLOWED_ORIGIN              # https://usagewall.dev
vercel deploy --prod
```
### 5. DNS for usagewall.dev

Buy at Porkbun (~$15/yr for .dev). Point it at Vercel:

- A record `@` → `76.76.21.21`
- CNAME `www` → `cname.vercel-dns.com`
## Local dev

```bash
cp .env.example .env.local
# Fill in at least: SUPABASE_URL, SUPABASE_SERVICE_ROLE_KEY,
# IP_HASH_SALT, UNSUBSCRIBE_SECRET
npx vercel dev
# → http://localhost:3000
```
To test without the Vercel CLI, use the smoke server:

```bash
npm run smoke
# Hits the handler with stubbed Supabase, prints pass/fail per case.
```
## Tests

```bash
npm test                # unit + E2E (48 + 13 = 61 cases, no network)
npm run validate:html   # HTML structure + CSP + a11y checks
npm run smoke           # live HTTP smoke vs handler with stubbed Supabase
npm run ci              # all of the above
```
CI runs on every push that touches `usagewall/**` (see .github/workflows/test.yml).
## Security & privacy

- No raw IPs stored. Only the first 64 bits of `HMAC-SHA256(ip, IP_HASH_SALT)` are kept (see the sketch after this list).
- No PII in production logs. Email addresses are redacted to `email_domain` only.
- CSP via meta tag + headers. `default-src 'self'`, scripts only from self.
- HSTS preload. All TLS, no opt-out.
- HMAC-signed unsubscribe tokens. No DB lookup needed to verify.
- Honeypot field on every form — invisible to humans, irresistible to bots.
- Rate limit of 5 req/min per IP hash (in-memory; Turnstile handles the rest).
- Body cap of 8KB on the API. `safeJsonParse` never throws.
- GitHub Actions pins actions to commit SHAs and greps for leaked secrets.
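A minimal sketch of that IP-hashing scheme (hypothetical helper name; the repo's lib/security.js may differ in detail):

```js
import { createHmac } from "node:crypto";

// Keyed hash, truncated to the first 64 bits (16 hex chars), so the stored
// value can rate-limit repeat visitors without being reversible to an IP.
function ipHash(ip, salt) {
  return createHmac("sha256", salt).update(ip).digest("hex").slice(0, 16);
}
```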
Read public/privacy.html for the user-facing policy.
## Validation goals (5 days)
| Day | Action | Channel |
|---|---|---|
| 0 | Buy domain + deploy | — |
| 1 | Twitter thread (see launch/COPY.md) | X / Twitter |
| 2 | Show HN + Reddit (×3 subs) | HN, r/SaaS, r/SideProject, r/IndieHackers |
| 3 | IndieHackers post | IndieHackers |
| 4 | dev.to long-form article | dev.to |
| 5 | Measure | — |
GO if ≥100 emails and ≥1 reply asking "when can I pay?" NO-GO if <30 emails after the launch thread.
## Brand notes
This is not the Baku brand — it's UsageWall, a product by Baku. Inherits the design DNA (dark warm, editorial serifs, no flashy gradients, generous spacing) but lives independently with its own identity. Footer credits Baku.
Palette:

- bg `#0A0A0B` (warm near-black)
- text `#F5F4F0` (cream off-white)
- accent `#E8DFC7` (warm cream — highlights only)
- code bg `#0E0E10`
Type:
- Display: Fraunces 400/500, often italic
- Body: Switzer 400/500
- Mono: JetBrains Mono 400/500/600
## Files NOT to commit
The .gitignore covers it, but be paranoid about:
- `.env`, `.env.local`, `.env.*.local`
- `.vercel/` (project link)
- `node_modules/`

The CI grep checks for accidental commits of `sk_live_…`, `re_…`, `sbp_…`.