SuperMCP
Reddit, Twitter, Google Trends, LinkedIn, Medium, Dev.to & News MCP server that uses your Chrome login session. 26 tools, fully local, pip install.
A · Install
· two commands, via uvx (or pip).

$ uvx --from supermcp supermcp setup
$ claude mcp add supermcp -- uvx --from supermcp supermcp
Sources
reddit · linkedin · twitter · blackhatworld · medium · dev.to · google trends
Auth
Chrome cookies (logged-in) · public RSS · public APIs
Runtime
local Python · stdio MCP · zero data sent off-machine
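For MCP clients that read a JSON config file instead of offering a `claude mcp add` command, the same stdio server can be registered by hand. A sketch, assuming the standard `mcpServers` config shape used by Claude Desktop and Cursor:

```json
{
  "mcpServers": {
    "supermcp": {
      "command": "uvx",
      "args": ["--from", "supermcp", "supermcp"]
    }
  }
}
```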
B · Platforms
· one server, every source. Read the deeper write-up per platform.
Reddit MCP 5 tools
Idea validation: "Find recent r/SaaS posts about people frustrated with calendar tools"
Read the full write-up
LinkedIn MCP 3 tools
Pain-point research: "Search LinkedIn for posts where founders mention struggling with hiring"
Twitter MCP 3 tools
Trend monitoring: "Search Twitter for tweets about the Stripe outage in the last 6 hours"
BlackHatWorld MCP 3 tools
Product research: "Search BHW for threads where SEOs complain about Ahrefs in 2026"
Medium MCP 3 tools
Tech writing trends: "What's being written about MCP servers on Medium this month?"
Dev.to MCP 4 tools
Tech-niche pulse: "What's trending under #rust on Dev.to this week?"
C · All 26 tools
· flat list. See the platform write-up for context.
Reddit
full write-up →
reddit_search
Search all of Reddit for posts matching a query.
reddit_search_subreddit
Search within a specific subreddit only.
reddit_get_post
Get a full post with all top-level comments and metadata.
reddit_get_subreddit_posts
List hot, new, top, or rising posts in any subreddit.
reddit_get_user_activity
Browse a user's recent posts and comments across Reddit.
LinkedIn
full write-up →
linkedin_search
Search LinkedIn posts by keyword across all public posts.
linkedin_feed
Get posts from your LinkedIn home feed (algorithmic, what LinkedIn is showing you).
linkedin_post_comments
Get the comments on a specific LinkedIn post.
Twitter
full write-up →
twitter_search
Search Twitter / X for tweets matching a query.
twitter_get_user_tweets
Pull a user's recent tweets and replies.
twitter_get_tweet
Get a single tweet with its replies and metadata.
BlackHatWorld
full write-up →
bhw_search
Search BlackHatWorld posts and threads (Cloudflare-bypassing browser fetch).
bhw_get_thread
Get a thread with the OP + replies + structured metadata (views, likes, replies, section).
bhw_latest
Latest posts site-wide via official RSS feed (no browser, instant).
Medium
full write-up →
medium_search
Search Medium articles by keyword (login recommended for full results).
medium_tag_feed
Latest articles published under a specific tag (RSS, no login needed).
medium_user_feed
A writer's latest articles (RSS, no login needed).
Dev.to
full write-up →
devto_search
Full-text search across Dev.to articles (uses Dev.to's Algolia index).
devto_trending
Trending articles overall or filtered by tag.
devto_article
Get a full article with nested comments.
devto_user_articles
List a user's published articles.
D · Get an API key
· sign in to generate a key and start using SuperMCP.
Sign in or create an account to generate API keys for Claude, Cursor, or any MCP client.
E · Pricing
· free for daily use, $9 one-time for unlimited.
Free
$0 forever
- 100 requests per day
- All 26 tools
- All 7 data sources
- Local processing
Recommended
Unlimited
$9 one-time
- Unlimited requests
- All 26 tools
- All 7 data sources
- Lifetime access
- Priority support
F · macOS first-run
· why you'll see a Keychain prompt.
macOS users. On first run, macOS asks for your login keychain password to read Chrome's cookies. This is normal. Click Always Allow so it doesn't ask again. SuperMCP never stores or transmits your password.
G · Common questions
· quick answers.
Will Reddit, LinkedIn, or Twitter ban my account for using this?
Risk is low if you use it the way it was designed. SuperMCP runs from your machine, with your real cookies, your normal IP, and a real Chromium fingerprint. Reddit/LinkedIn/Twitter see traffic that looks like you using your browser, because that's what it is. The accounts that get banned (the ones you read about on r/LinkedInTips and r/openclaw) are usually being run by a paid third-party service that holds your cookies on their server and fires automated actions from a datacenter IP. That's a different threat model. We don't post, follow, like, DM, or perform any write actions; read-only research only.
I tried to register a Reddit API app and got hit with "You cannot create any more applications." How does this avoid that?
That wall (Reddit's Responsible Builder Policy) is why this exists. The official API now gates app creation, OAuth, and rate caps; many personal-use developers have been waiting weeks with no response. SuperMCP doesn't touch the developer API at all. It uses your existing logged-in browser session the same way you'd use Reddit yourself. No client_id, no client_secret, no app approval, no API tier.
How is this different from PhantomBuster, Apify, Linkfinder AI, or other paid scraping services?
Two differences. First, those services run from their datacenters with their cookies (or yours, uploaded to their servers). SuperMCP runs on your machine; your cookies never leave your disk. Second, they're recurring SaaS at $40–$300/mo. SuperMCP is local Python: free with a 100/day limit, $9 one-time for unlimited. If your use case is server-side bulk scraping for a commercial product, those tools probably fit better. For research, idea-mining, and AI-agent workflows, local + your-own-session is the cleaner trust model.
Will Claude's tool-use limit kick in mid-task and break my workflow?
If you're on Claude Desktop, yes, but not because of SuperMCP. Anthropic added a per-turn tool-call cap that affects every MCP server (people on r/ClaudeAI have noticed Desktop Commander, GitHub MCP, and others all hit it). The mitigation: keep individual prompts focused ("search r/SaaS for posts about X, return top 5") instead of asking for sprawling agent-style flows in one turn. SuperMCP tools are designed to return enough on a single call so you don't need to chain. `reddit_get_post` returns the post + full comment tree in one shot, `reddit_get_subreddit_posts` returns 25 by default.
Does this work with Claude Code, Cursor, Windsurf, Cline, or just Claude Desktop?
Any MCP-compatible client. SuperMCP is a standard stdio MCP server, so the same install + `claude mcp add` flow works for Claude Desktop, Claude Code, Cursor, Windsurf, Cline, and the GitHub Copilot Agent. VS Code MCP setup has known gotchas (PATH issues, reload-vs-restart); the install scripts in the snippet above handle those.
How does the cookie reading actually work, technically?
Same way password managers work. Chrome stores its cookies in a SQLite database on disk, encrypted with a key in the OS keychain (Keychain on macOS, libsecret on Linux, DPAPI on Windows). SuperMCP opens that database read-only and uses the cookies to make HTTP requests as you. Cookies are read fresh on every request: never copied to disk, never cached, never sent to our servers.
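The read-only SQLite access described above can be sketched in a few lines. This is illustrative only, not SuperMCP's actual code: the macOS profile path and the `cookies` table columns (`host_key`, `name`, `encrypted_value`) are assumptions based on Chrome's on-disk format, and the OS-keychain decryption step is deliberately omitted.

```python
import sqlite3
from pathlib import Path

# Typical macOS location of Chrome's cookie store (an assumption; varies by OS and profile).
COOKIE_DB = Path.home() / "Library/Application Support/Google/Chrome/Default/Cookies"

def read_cookies(db_path, host_suffix):
    """Open Chrome's cookie database read-only and pull cookies for one site.

    mode=ro guarantees we never write to (or lock) Chrome's database,
    which is why Chrome doesn't need to be closed while this runs.
    """
    uri = f"file:{db_path}?mode=ro"
    with sqlite3.connect(uri, uri=True) as conn:
        rows = conn.execute(
            "SELECT name, encrypted_value FROM cookies WHERE host_key LIKE ?",
            (f"%{host_suffix}",),
        ).fetchall()
    # Real code would now decrypt each value with the key from the OS keychain
    # (Keychain / libsecret / DPAPI) before attaching it to an HTTP request.
    return dict(rows)
```

Because the function reopens the database on every call, a fresh login in Chrome is picked up on the very next request, with no cached copy to go stale.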
Do I need Chrome open while it runs?
No. Chrome doesn't have to be running. The cookie database is read straight off disk. The only requirement is that you've logged into the relevant site (Reddit, LinkedIn, Twitter) in Chrome at some point and stayed logged in, which most people do anyway.
What happens when my login session expires?
You get a normal "unauthorized" response on the affected source, you log into the site in Chrome again (the way you would anyway), and the next SuperMCP request works. Sessions on Reddit/LinkedIn/Twitter typically last weeks to months. SuperMCP reads cookies fresh on every request, so the moment you log back in, it picks up the new session.
What about Google Trends and News? Those don't have a login.
Trends and News use public RSS feeds. No browser, no cookies, no session, just instant results. Same for Dev.to and BlackHatWorld (BHW uses its public site).
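The RSS path is plain XML over HTTPS, parseable with the Python standard library alone. A minimal sketch; the Google Trends feed URL in the comment is an assumption, so the example parses an inline sample rather than hitting the network:

```python
import xml.etree.ElementTree as ET

# A live fetch would target a public feed, e.g. (URL is an assumption):
#   https://trends.google.com/trending/rss?geo=US
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Daily Search Trends</title>
  <item><title>stripe outage</title><link>https://example.com/1</link></item>
  <item><title>mcp servers</title><link>https://example.com/2</link></item>
</channel></rss>"""

def parse_feed(xml_text):
    """Return (title, link) pairs for every <item> in an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    return [
        (item.findtext("title"), item.findtext("link"))
        for item in root.iter("item")
    ]
```

No cookies or browser automation are involved on this path, which is why these sources respond instantly even when you've never logged into them.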
Is anything ever sent to your servers?
Only your API key, for validation. Nothing else: no cookies, no queries, no results, no telemetry. Inspect the source if you want; it's open. The validation call is one HTTPS request to webmatrices.com to check that your key is active and within its daily limit.
Can I run this on a server / in CI / for a production product?
It's designed for developer + research workflows on your own machine. You can run it on a server in principle (it's just Python), but you'd need to copy the cookie file from a machine where you've logged in, and you'd lose the "reads-fresh-on-every-request" benefit. For server-side bulk extraction, a paid scraping service is a better fit. SuperMCP's sweet spot is one developer, one machine, AI-agent-augmented research.
What's the free tier really limited to?
100 requests per day across all tools combined. That covers casual research: a few keyword searches, reading a handful of threads, checking trends. Past that, $9 one-time unlocks unlimited; no subscription, no recurring charge.
Why is MCP suddenly everywhere?
Anthropic published the protocol in late 2024 and it caught on fast. MotherDuck, Sportradar, Refine, GitHub, TradingView, and hundreds of indie devs have shipped MCP servers in 2026. The pitch is simple: MCP is the USB-C layer for AI tools. Instead of writing glue code per integration, you plug into a standard interface and any MCP-compatible client (Claude, Cursor, etc.) can use it. SuperMCP applies that to the social/web sources that didn't already have a clean API path.