# hn-mcp-server

Hacker News feeds and search

`@cyanheads/hn-mcp-server`

MCP server for Hacker News. Feeds, threaded discussions, user profiles, and full-text search via the HN Firebase and Algolia APIs. Runs over stdio or HTTP.

**Public Hosted Server:** https://hn.caseyjhand.com/mcp
## Tools

Four read-only tools for accessing Hacker News data:

| Tool Name | Description |
|---|---|
| `hn_get_stories` | Fetch stories from an HN feed (top, new, best, ask, show, jobs) with pagination. |
| `hn_get_thread` | Get an item and its comment tree as a threaded discussion with depth/count controls. |
| `hn_get_user` | Fetch a user profile with karma, about text, and optionally their recent submissions. |
| `hn_search_content` | Search stories and comments via Algolia with type, author, date, and score filters. |
### hn_get_stories

Fetch stories from any HN feed with pagination support.

- Six feed types: `top`, `new`, `best`, `ask`, `show`, `jobs`
- Configurable count (1–100, default 30) and offset for pagination
- Returns enriched story objects with title, URL, score, author, comment count, and body text
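Under the hood, feed pagination against the public Firebase API reduces to three steps: fetch the feed's ID list once, slice it by offset/count, and resolve each ID to a full item. A minimal Python sketch for illustration only (the server itself is TypeScript; `get_stories` and `fetch_json` are hypothetical helper names, not the tool's implementation):

```python
import json
import urllib.request

HN_API = "https://hacker-news.firebaseio.com/v0"

# Firebase feed endpoints for the six feed types.
FEEDS = {"top": "topstories", "new": "newstories", "best": "beststories",
         "ask": "askstories", "show": "showstories", "jobs": "jobstories"}

def fetch_json(url: str):
    """GET an HN Firebase endpoint and decode the JSON body."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def get_stories(feed="top", count=30, offset=0, fetch=fetch_json):
    """Fetch one page of a feed: slice the ID list, then resolve each item."""
    ids = fetch(f"{HN_API}/{FEEDS[feed]}.json")
    page = ids[offset : offset + count]   # pagination is just list slicing
    return [fetch(f"{HN_API}/item/{i}.json") for i in page]
```

The `fetch` parameter exists only so the network call can be stubbed out; resolving each item individually is why the server batches those requests (see Features below).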
### hn_get_thread

Retrieve an item and its full comment tree via ranked breadth-first traversal.

- Depth control (0–10, default 3) — depth 0 doubles as a single-item lookup
- Comment limit (1–200, default 50) caps total comments across all levels
- Breadth-first traversal preserves HN's ranking order
- Flat comment list with `depth`/`parentId` for tree reconstruction
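Given that flat list, a client can rebuild the nested thread from `parentId` alone. A sketch of the reconstruction, assuming each comment carries `id` and `parentId` fields as described above (`build_tree` is a hypothetical helper, not part of the server):

```python
def build_tree(comments):
    """Nest a flat comment list (each with 'id' and 'parentId') into a tree."""
    by_id = {c["id"]: {**c, "children": []} for c in comments}
    roots = []
    for c in comments:
        node = by_id[c["id"]]
        parent = by_id.get(c["parentId"])
        if parent is not None:
            parent["children"].append(node)  # child of another returned comment
        else:
            roots.append(node)               # parent is the story itself
    return roots
```

Because the flat list is emitted in ranked breadth-first order, appending children in input order preserves HN's ranking within each level.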
### hn_get_user

Fetch a user profile with optional recent submission resolution.

- Profile includes karma, creation date, and about text (HTML stripped)
- Optionally resolves up to the 50 most recent submissions into full items
- Submission resolution filters out dead/deleted items
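For illustration, the HTML stripping applied to the `about` field can be approximated in a few lines. This is a simplified sketch, not the server's implementation: the real one (TypeScript) also preserves links and code blocks, which this version drops along with all other tags.

```python
import html
import re

def strip_about(raw: str) -> str:
    """Strip HTML tags from an HN 'about' field and decode entities."""
    text = re.sub(r"<[^>]+>", " ", raw)       # drop all tags
    text = html.unescape(text)                # &amp; -> &, &#x2F; -> /, ...
    return re.sub(r"\s+", " ", text).strip()  # collapse whitespace
```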
### hn_search_content

Full-text search via the Algolia HN Search API.

- Filter by content type: `story`, `comment`, `ask_hn`, `show_hn`, `front_page`
- Filter by author, date range (ISO 8601), and minimum points
- Sort by relevance or date
- Pagination with page/count controls
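These filters map onto the Algolia HN Search API's public query parameters: `tags` for type and author, `numericFilters` for dates and points, and the `search` vs. `search_by_date` endpoints for the two sort orders. A Python sketch of that mapping, for illustration (`build_search_params` is a hypothetical name; the tool's own parameter names may differ):

```python
from datetime import datetime, timezone

def build_search_params(query, content_type=None, author=None, since=None,
                        min_points=None, sort="relevance", page=0, count=20):
    """Assemble endpoint + query parameters for the Algolia HN Search API."""
    tags = []
    if content_type:
        tags.append(content_type)        # e.g. "story", "comment", "front_page"
    if author:
        tags.append(f"author_{author}")  # Algolia's per-author tag
    numeric = []
    if since:                            # ISO 8601 date -> unix timestamp
        ts = int(datetime.fromisoformat(since)
                 .replace(tzinfo=timezone.utc).timestamp())
        numeric.append(f"created_at_i>={ts}")
    if min_points is not None:
        numeric.append(f"points>={min_points}")
    params = {"query": query, "page": page, "hitsPerPage": count}
    if tags:
        params["tags"] = ",".join(tags)
    if numeric:
        params["numericFilters"] = ",".join(numeric)
    endpoint = "search" if sort == "relevance" else "search_by_date"
    return endpoint, params
```

The resulting pair would be sent as a GET to `https://hn.algolia.com/api/v1/<endpoint>` with `params` as the query string.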
## Features

Built on `@cyanheads/mcp-ts-core`:

- Declarative tool definitions — single file per tool, framework handles registration and validation
- Unified error handling across all tools
- Structured logging with request correlation
- Runs locally (stdio/HTTP) from the same codebase

HN-specific:

- Concurrent batch fetching with configurable parallelism for item resolution
- HTML entity decoding and tag stripping with code block and link preservation
- No API keys required — HN APIs are public
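The batch-fetching pattern can be sketched with a semaphore that caps in-flight requests, the same idea the `HN_CONCURRENCY_LIMIT` setting controls (Python and `asyncio` here for illustration only; the server is TypeScript):

```python
import asyncio

async def fetch_items(ids, fetch_one, limit=10):
    """Resolve many item IDs concurrently, capped at `limit` in-flight calls."""
    sem = asyncio.Semaphore(limit)

    async def bounded(item_id):
        async with sem:            # at most `limit` fetches run at once
            return await fetch_one(item_id)

    # gather() preserves input order regardless of completion order
    return await asyncio.gather(*(bounded(i) for i in ids))
```

A semaphore keeps the fan-out polite toward the public HN endpoints while still resolving a feed page much faster than sequential requests.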
## Getting Started

### Public Hosted Instance

A public instance is available at https://hn.caseyjhand.com/mcp — no installation required. Point any MCP client at it via Streamable HTTP:
```json
{
  "mcpServers": {
    "hn": {
      "type": "streamable-http",
      "url": "https://hn.caseyjhand.com/mcp"
    }
  }
}
```
### Self-Hosted / Local

Add to your MCP client config (e.g., `claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "hn-mcp-server": {
      "type": "stdio",
      "command": "bunx",
      "args": ["@cyanheads/hn-mcp-server@latest"]
    }
  }
}
```
Or with npx:
```json
{
  "mcpServers": {
    "hn-mcp-server": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@cyanheads/hn-mcp-server"]
    }
  }
}
```
### Prerequisites

- Bun v1.2.0 or higher (or Node.js >= 22)
### Installation

```bash
git clone https://github.com/cyanheads/hn-mcp-server.git
cd hn-mcp-server
bun install
```
## Configuration

All configuration is via environment variables. No API keys required — HN APIs are public.

| Variable | Description | Default |
|---|---|---|
| `HN_CONCURRENCY_LIMIT` | Max concurrent HTTP requests for batch item fetches (1–50). | `10` |
| `MCP_TRANSPORT_TYPE` | Transport: `stdio` or `http`. | `stdio` |
| `MCP_HTTP_PORT` | HTTP server port. | `3010` |
| `MCP_HTTP_HOST` | HTTP server host. | `localhost` |
| `MCP_LOG_LEVEL` | Log level: `debug`, `info`, `notice`, `warning`, `error`. | `info` |
| `LOGS_DIR` | Directory for log files (Node.js only). | `<project-root>/logs` |
## Running the Server

### Local Development

```bash
bun run dev:stdio   # Dev mode (stdio, auto-reload)
bun run dev:http    # Dev mode (HTTP, auto-reload)
bun run test        # Run test suite
bun run devcheck    # Lint + format + typecheck + audit
```
### Production

```bash
bun run build
bun run start:stdio   # or start:http
```
### Docker

```bash
docker build -t hn-mcp-server .
docker run -p 3010:3010 hn-mcp-server
```
## Project Structure

| Directory | Purpose |
|---|---|
| `src/index.ts` | `createApp()` entry point. |
| `src/config/` | Server-specific env var parsing with Zod. |
| `src/services/hn/` | HN Firebase + Algolia API client and domain types. |
| `src/mcp-server/tools/definitions/` | Tool definitions (`*.tool.ts`). |
## Development Guide

See CLAUDE.md for development guidelines and architectural rules. The short version:

- Handlers throw, framework catches — no `try/catch` in tool logic
- Use `ctx.log` for request-scoped logging
- All tools are read-only — no auth scopes required
## Contributing

Issues and pull requests are welcome. Run checks before submitting:

```bash
bun run devcheck
bun run test
```
## License

Apache-2.0 — see LICENSE for details.
## Related Servers
- **OSHA Compliance Assistant**: Check workplace safety compliance against OSHA General Industry standards (29 CFR 1910) with cited regulation sections and corrective actions.
- **wordpress-mcp**: Lightweight WordPress MCP server with 42 tools. Token-optimized responses reduce REST API verbosity by 95%+. Posts, pages, users, plugins, themes, media, taxonomies, comments.
- **Home Assistant MCP**: An MCP integration for controlling Home Assistant devices with AI assistants.
- **Xmind Generator**: A server for generating Xmind mind maps from various data sources.
- **TRIGGERcmd**: Runs commands on your computers remotely.
- **Coda**: Interact with the Coda API to manage documents and pages, including creating, reading, updating, and deleting.
- **MCP Sharepoint (aisuru)**: A Model Context Protocol (MCP) server that exposes core Microsoft SharePoint / Microsoft Graph API functionalities as tools usable by LLM agents (e.g. Claude Desktop).
- **AppleScript MCP**: Execute AppleScript to gain full control of your Mac.
- **QiQ Social**: AI-powered multi-platform content publishing. 25 tools to create posts, run automations, manage RSS feeds, generate hashtags, search images, and publish to Instagram, Facebook, LinkedIn, X, Threads, Telegram, Discord, WordPress, YouTube, TikTok, and more.
- **Marketing Automation MCP Server**: Automates marketing operations with AI-powered optimization, real-time analytics, and multi-platform integration.