Pulse MCP
MCP server for Pulse — the open compute-pricing reference. Daily GPU and inference-token medians, free under CC-BY 4.0.
The server is hosted at https://pulsebenchmarks.com/mcp. This repository is a public mirror of the production handler, kept here so MCP registries and end users have a stable source URL.
What you can do with it
Install the server in Claude Desktop, Cursor, Cline, or any MCP-aware client. You get five tools that wrap the public Pulse API and return JSON with pre-formatted citations:
- `list_indices` — every published index with current value, freshness, and methodology version
- `get_series` — full payload for one index: history, per-provider observations, contributing-provider list
- `latest_value` — quick lookup for one index, with a `cite_as` string ready to paste
- `get_status` — pipeline freshness and per-provider collection health
- `compare_indices` — current values across multiple indices side-by-side
The data covers four GPU series (H100 SXM and A100 80GB, hyperscaler vs neocloud), four additional neocloud series (B200, H200 141GB, H100 PCIe, A100 80GB spot), and an inference-token basket anchored on Llama 3.3 70B FP8.
Install
The server is hosted, so the install is the same on every client: point mcp-remote at the public endpoint.
Claude Desktop
Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or the equivalent on Windows/Linux:
{
  "mcpServers": {
    "pulse": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://pulsebenchmarks.com/mcp"]
    }
  }
}
Restart Claude Desktop. You should see pulse in the tools menu.
Cursor
Add to ~/.cursor/mcp.json (or use the in-app MCP settings). Same JSON as above; full file in examples/cursor_mcp.json.
Cline / other clients
Any MCP client that can launch mcp-remote (or speak HTTP directly) works the same way. The endpoint is https://pulsebenchmarks.com/mcp; transport is JSON-RPC 2.0 over POST; no auth.
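For clients that talk to the endpoint directly, the protocol can be sketched in plain JavaScript. The URL and transport (JSON-RPC 2.0 over POST, no auth) are from this README; the envelope helper and the `tools/list` method name follow standard JSON-RPC/MCP conventions and are assumptions, not a documented client:

```javascript
// Minimal sketch of talking to the hosted Pulse MCP endpoint without mcp-remote.
const PULSE_MCP_URL = "https://pulsebenchmarks.com/mcp";

// Build a JSON-RPC 2.0 request envelope.
function rpcEnvelope(id, method, params = {}) {
  return { jsonrpc: "2.0", id, method, params };
}

// POST an envelope and return the parsed result (throws on a JSON-RPC error).
async function callPulse(method, params) {
  const res = await fetch(PULSE_MCP_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(rpcEnvelope(1, method, params)),
  });
  const body = await res.json();
  if (body.error) throw new Error(body.error.message);
  return body.result;
}

// Usage: callPulse("tools/list").then((r) => console.log(r.tools));
```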
Tools
| Tool | Description |
|---|---|
| `list_indices` | List every published Pulse compute-pricing index with its current value, freshness, methodology version, and status. |
| `get_series` | Fetch the full payload for one Pulse index: metadata, gated daily headline series, per-provider observations, contributing-provider list. Optional range filter (30d / 90d / 1y / all). |
| `latest_value` | Quick lookup: the latest published value for one index, with a minimal payload and a ready-to-paste `cite_as` string. |
| `get_status` | Pipeline freshness and per-provider collection health. |
| `compare_indices` | Compare current values across multiple Pulse indices (e.g. hyperscaler vs neocloud for the same GPU). |
Full input schemas and tool implementations are in functions/mcp.js.
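Assuming standard MCP `tools/call` semantics, a request envelope for `latest_value` might look like the sketch below. The argument key (`index`) and the slug value are placeholders, not the real schema — check functions/mcp.js for the actual inputs:

```javascript
// Hypothetical tools/call envelope builder.
// Argument names here are illustrative only; see functions/mcp.js for real schemas.
function toolCall(id, name, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Example with a placeholder slug:
// toolCall(2, "latest_value", { index: "some-index-slug" })
```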
Data and methodology
The MCP server is a thin wrapper over Pulse's public HTTP endpoints. The same data is available without an MCP client:
- Methodology: https://pulsebenchmarks.com/methodology/
- Public API: `GET https://pulsebenchmarks.com/api/indices`, `GET https://pulsebenchmarks.com/api/indices/{slug}`, `GET https://pulsebenchmarks.com/api/status`
- Bulk export (CC-BY 4.0): https://pulsebenchmarks.com/data/data_export.json
- Reproducibility script: https://pulsebenchmarks.com/methodology/#reproducibility
- Agent guide: https://pulsebenchmarks.com/llms.txt
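The public API can be queried without any MCP client at all. A small sketch composing the endpoint URLs listed above; it makes no assumption about the response shape beyond it being JSON:

```javascript
const API_BASE = "https://pulsebenchmarks.com/api";

// Compose the endpoint URL for one index slug, or the collection when no slug is given.
function indexUrl(slug) {
  return slug
    ? `${API_BASE}/indices/${encodeURIComponent(slug)}`
    : `${API_BASE}/indices`;
}

// Fetch and parse one index (or all indices).
async function getIndex(slug) {
  const res = await fetch(indexUrl(slug));
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}
```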
Methodology v1.0 covers four headline GPU series (H100 SXM and A100 80GB, hyperscaler vs neocloud) plus a Llama 3.3 70B FP8 inference-token basket. Series additions and methodology changes are versioned and documented in the methodology changelog.
Self-host
The handler is a single Cloudflare Pages Function: functions/mcp.js. It has no dependencies, no build step, and no environment variables. To deploy your own:
- Drop `functions/mcp.js` into a Cloudflare Pages project (or adapt to Workers — `onRequest` maps to `fetch` with minor changes).
- The handler issues same-origin sub-requests to `/api/indices`, `/api/indices/{slug}`, and `/api/status` — point those at the public Pulse endpoints, or proxy them through your own deployment.
- Hit `GET /mcp` to confirm the discovery payload returns; hit `POST /mcp` with an `initialize` JSON-RPC envelope to confirm the protocol layer.
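The two checks above can be scripted. In the sketch below, the deployment origin is a placeholder, and the `protocolVersion` value is an assumption about what the handler accepts — adjust both for your deployment:

```javascript
// Smoke test for a self-hosted deployment: GET discovery, then POST initialize.
const BASE = "https://your-pages-project.example.com"; // placeholder origin

// A minimal JSON-RPC initialize envelope; the protocolVersion date is an assumption.
function initializeEnvelope(id = 1) {
  return {
    jsonrpc: "2.0",
    id,
    method: "initialize",
    params: {
      protocolVersion: "2024-11-05",
      capabilities: {},
      clientInfo: { name: "smoke-test", version: "0.0.0" },
    },
  };
}

async function smokeTest() {
  const discovery = await fetch(`${BASE}/mcp`); // GET: discovery payload
  const init = await fetch(`${BASE}/mcp`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(initializeEnvelope()),
  });
  return discovery.ok && init.ok;
}
```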
For most users we recommend the hosted server. Self-hosting is mainly useful if you want to add caching, vendor it into a private network, or extend the tool set against the same public endpoints.
License
- Server code (this repository): Apache License 2.0
- Pricing data returned by the server: Creative Commons Attribution 4.0 International (CC-BY 4.0)
When citing a value, prefer the cite_as string returned by latest_value. It already includes the index name, methodology version, year, and canonical URL.
Contact and contributions
The production handler lives in the main Pulse codebase; changes there flow into this repository on a release cadence. PRs are reviewed as time allows; for fast turnaround on data or tool-shape changes, open an issue first so we can align before you write code.