HyperStore MCP
Plug 6,500+ AI apps into any LLM via the Model Context Protocol.
HyperStore is a curated directory of 6,500+ AI applications, developed by HyperGPT. This MCP server exposes the HyperStore catalog to any LLM client — Claude, ChatGPT, Cursor, Windsurf, Cline, Zed, Gemini, and anything else that speaks MCP.
Ask your LLM:
- "Find me a free AI tool that summarises PDFs."
- "Compare ChatGPT, Claude, and Gemini side-by-side."
- "Show me the top 5 image-generation apps with an API."
The LLM calls HyperStore MCP behind the scenes and answers with up-to-date, curated results.
What you get
8 tools:
| Tool | Purpose |
|---|---|
| `search_apps` | Full-text keyword search |
| `ai_search` | Embedding-based semantic search |
| `get_app` | Full app detail (features, screenshots, pricing) |
| `list_apps` | Paginated apps with filters (category, pricing) |
| `list_categories` | Browse all 30+ categories |
| `category_apps` | Apps within a category |
| `browse_apps` | A-Z directory listing |
| `get_homepage` | Trending + top categories overview |
3 resources:
- `hyperstore://app/{slug}` — markdown rendering of any app
- `hyperstore://category/{slug}` — top apps in a category
- `hyperstore://catalog` — full category index
3 prompts:
- `find_tool_for_task` — guided discovery for a task
- `compare_apps` — side-by-side app comparison
- `discover_category` — explore a topic
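The two parameterised resource templates above expand by substituting an app or category slug into the URI. A minimal sketch of that expansion (the helper names are illustrative, not part of the package):

```python
# Illustrative helpers showing how the resource URI templates expand.
# The URI schemes come from the README; the function names are assumptions.

def app_uri(slug: str) -> str:
    """Markdown rendering of a single app."""
    return f"hyperstore://app/{slug}"

def category_uri(slug: str) -> str:
    """Top apps in a category."""
    return f"hyperstore://category/{slug}"

# The catalog resource takes no parameters.
CATALOG_URI = "hyperstore://catalog"

print(app_uri("chatgpt"))   # hyperstore://app/chatgpt
print(category_uri("coding"))
```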
Install
Option A — uvx (zero install, recommended)
Requires uv. One command and you're done:
uvx hyperstore-mcp
Option B — pipx
pipx install hyperstore-mcp
hyperstore-mcp
Option C — Docker (for remote hosting)
docker run --rm -p 8080:8080 ghcr.io/deficlow/hyperstore-mcp
# Now MCP Streamable HTTP at http://localhost:8080/mcp
Option D — Hosted endpoint (no install)
Use our managed Streamable HTTP server:
https://mcp.store.hypergpt.ai/mcp
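The Streamable HTTP transport speaks JSON-RPC 2.0 over HTTP POST. As a rough sketch of what a client sends to the hosted endpoint, here is a hand-built `initialize` request; the `protocolVersion` value and `clientInfo` fields are assumptions (your MCP client library pins its own), and no network call is made here:

```python
import json

ENDPOINT = "https://mcp.store.hypergpt.ai/mcp"  # hosted endpoint from above

# Standard MCP initialize request (JSON-RPC 2.0). The protocolVersion and
# clientInfo values are illustrative assumptions.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

body = json.dumps(initialize)
# POST `body` to ENDPOINT with Content-Type: application/json; the
# Streamable HTTP spec also expects Accept: application/json, text/event-stream.
print(body)
```

In practice you would let an MCP client library handle this handshake rather than issuing raw JSON-RPC.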
Connect from your LLM client
Claude Desktop
Edit ~/Library/Application Support/Claude/claude_desktop_config.json
(macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):
{
"mcpServers": {
"hyperstore": {
"command": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
Restart Claude → tools appear in the 🛠 menu.
Claude Code
claude mcp add hyperstore -- uvx hyperstore-mcp
Cursor
.cursor/mcp.json (project) or ~/.cursor/mcp.json (global):
{
"mcpServers": {
"hyperstore": {
"command": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
Windsurf
~/.codeium/windsurf/mcp_config.json:
{
"mcpServers": {
"hyperstore": {
"command": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
Cline (VS Code)
settings.json:
{
"cline.mcpServers": {
"hyperstore": {
"command": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
Zed
~/.config/zed/settings.json:
{
"context_servers": {
"hyperstore": {
"command": {
"path": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
}
Gemini CLI
~/.gemini/settings.json:
{
"mcpServers": {
"hyperstore": {
"command": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
ChatGPT (Pro / Team / Enterprise)
Settings → Connectors → Add custom connector:
- Name: HyperStore
- MCP Server URL: https://mcp.store.hypergpt.ai/mcp
- Authentication: None
OpenAI Responses API
from openai import OpenAI

client = OpenAI()
response = client.responses.create(
    model="gpt-4.1",
    tools=[{
        "type": "mcp",
        "server_label": "hyperstore",
        "server_url": "https://mcp.store.hypergpt.ai/mcp",
        "require_approval": "never",
    }],
    input="Find me 3 free AI tools for writing unit tests.",
)
print(response.output_text)
Anthropic Messages API
from anthropic import Anthropic

client = Anthropic()
response = client.beta.messages.create(
    model="claude-opus-4-1",
    max_tokens=1024,
    betas=["mcp-client-2025-04-04"],
    mcp_servers=[{
        "type": "url",
        "url": "https://mcp.store.hypergpt.ai/mcp",
        "name": "hyperstore",
    }],
    messages=[{"role": "user", "content": "Top 5 AI image generators?"}],
)
See examples/ for ready-to-paste configs for every supported client.
Self-hosting
For self-hosting, use the Docker image.
For direct invocation without Docker, the CLI accepts --transport http|sse
(see hyperstore-mcp --help).
Configuration
When self-hosting, these environment variables can be set
(see .env.example for the full list):
| Variable | Default | Purpose |
|---|---|---|
| `MCP_HOST` | `0.0.0.0` | Bind host (http/sse transports) |
| `MCP_PORT` | `8080` | Bind port (http/sse transports) |
| `LOG_LEVEL` | `INFO` | Logging level (DEBUG, INFO, WARNING, ERROR) |
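Resolving these variables is a few lines in any language; a minimal Python sketch with the documented defaults (the `server_config` helper is illustrative, not part of the package):

```python
import os

def server_config(env=os.environ) -> dict:
    """Resolve bind address and log level from the environment,
    falling back to the defaults documented in the table above."""
    return {
        "host": env.get("MCP_HOST", "0.0.0.0"),
        "port": int(env.get("MCP_PORT", "8080")),
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }

print(server_config({}))  # documented defaults when nothing is set
```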
Development
git clone https://github.com/deficlow/HyperStore-MCP
cd HyperStore-MCP
uv sync --all-extras
uv run pytest
uv run hyperstore-mcp # stdio mode for local testing
Inspect the running server with the official MCP Inspector:
npx @modelcontextprotocol/inspector uvx hyperstore-mcp
How it works
HyperStore MCP is a thin async wrapper around the HyperStore public REST API. It is read-only — no credentials, no writes, no PII. The same data that powers the website powers the MCP server. Updates land in your LLM the moment they land on the site.
LLM client ──MCP──▶ hyperstore-mcp ──HTTPS──▶ store.hypergpt.ai/api
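The right-hand hop in the diagram is essentially a mapping from MCP tool calls to REST GET requests. A sketch of that shape, where the route table and paths are hypothetical placeholders, not the actual store.hypergpt.ai API routes:

```python
from urllib.parse import urlencode

BASE = "https://store.hypergpt.ai/api"

# Hypothetical tool-name -> REST-path mapping; the real routes live in the
# server source. This only illustrates the "thin wrapper" idea.
ROUTES = {
    "search_apps": "/apps/search",
    "get_app": "/apps/{slug}",
    "list_categories": "/categories",
}

def rest_url(tool: str, **params: str) -> str:
    """Build the upstream URL for a tool call: path parameters are
    substituted into the route, the rest become the query string."""
    path = ROUTES[tool].format(**params)
    query = {k: v for k, v in params.items()
             if "{" + k + "}" not in ROUTES[tool]}
    return BASE + path + ("?" + urlencode(query) if query else "")

print(rest_url("search_apps", query="pdf summarizer"))
print(rest_url("get_app", slug="chatgpt"))
```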
License
MIT © HyperGPT