🔴 Redlib MCP Server
A Model Context Protocol (MCP) server that enables AI agents to interact with Reddit through your private Redlib instance. No Reddit API keys required - just a running Redlib instance!
📑 Table of Contents
- Features
- Prerequisites
- Quick Start
- Configuration
- Available Tools
- Integration with AI Clients
- Docker Compose
- Security: Default vs Hardened
- Development
- Example Usage with AI
- Contributing
- License
- Acknowledgments
- Links
✨ Features
- 🔒 Privacy-First - Uses your self-hosted Redlib, no tracking or API keys
- 🛠️ 3 Powerful Tools - Search posts, get hot posts, fetch full post details with comments
- 🐳 Docker Ready - Both simple and hardened Docker images available
- 🧩 Easy Setup - Works with Claude Desktop, Cursor, VS Code, Codex, ForgeCode, KiloCode, and any other MCP-compatible client
- 📊 Structured Output - Returns clean JSON instead of raw HTML
📋 Prerequisites
Before using this MCP server, you need:
- Redlib Instance - A running Redlib instance (default: http://localhost:8080)
- MCP Client - One of:
  - Claude Desktop
  - Cursor
  - VS Code with GitHub Copilot
  - OpenAI Codex
  - ForgeCode
  - KiloCode
🚀 Quick Start
Option 1: Docker (Recommended)
# Pull and run the default version
docker run -i --rm \
--network host \
-e REDLIB_URL=http://localhost:8080 \
alfafadock/mcp-redlib:latest
Option 2: Hardened Docker (Security-Focused)
# Uses non-root user and minimal privileges
docker run -i --rm \
--network host \
--cap-drop=ALL \
--security-opt no-new-privileges:true \
-e REDLIB_URL=http://localhost:8080 \
alfafadock/mcp-redlib:hardened
Option 3: Local Development
# Clone and setup
git clone https://github.com/Devthatdoes/redlib-mcp-server.git
cd redlib-mcp-server
# Install dependencies
npm install
# Build
npm run build
# Run
npm start
🐳 Docker Compose
Create docker-compose.yml:
services:
redlib-mcp:
image: alfafadock/mcp-redlib:latest
container_name: redlib-mcp
network_mode: "host" # Uses host network for MCP client communication
environment:
- REDLIB_URL=http://localhost:8080 # Change if Redlib is on a different port
restart: unless-stopped
# For hardened image, change image to: alfafadock/mcp-redlib:hardened
🔧 Configuration
Environment Variables
| Variable | Default | Description |
|---|---|---|
| REDLIB_URL | http://localhost:8080 | URL of your Redlib instance |
Custom Redlib Port Example
If your Redlib runs on a different port (e.g., 8085):
docker run -i --rm \
--network host \
-e REDLIB_URL=http://localhost:8085 \
alfafadock/mcp-redlib:latest
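For illustration, here is a sketch (in the server's own TypeScript) of how a request URL could be derived from REDLIB_URL. The /search and /r/&lt;subreddit&gt;/search routes and the restrict_sr parameter are assumptions based on Redlib mirroring Reddit's URL scheme; check src/index.ts for the paths the server actually uses:

```typescript
// Build a Redlib search URL from the configured base URL.
// Route shapes are assumptions, not taken from src/index.ts.
function searchUrl(base: string, query: string, subreddit?: string): string {
  const params = new URLSearchParams({ q: query });
  if (subreddit) params.set("restrict_sr", "on"); // limit results to that subreddit
  const path = subreddit ? `/r/${subreddit}/search` : "/search";
  return `${base}${path}?${params.toString()}`;
}

// With REDLIB_URL=http://localhost:8085:
console.log(searchUrl("http://localhost:8085", "rust programming", "rust"));
// → http://localhost:8085/r/rust/search?q=rust+programming&restrict_sr=on
```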
Example .env File
Copy .env.example to .env and modify as needed:
cp .env.example .env
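The file only needs the one documented variable; an .env along these lines should work (contents inferred from the table above, not copied from .env.example):

```ini
# URL of your Redlib instance
REDLIB_URL=http://localhost:8080
```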
🛠️ Available Tools
1. search_reddit
Search Reddit posts using your private Redlib instance.
Parameters:
- query (required) - Search query string
- subreddit (optional) - Limit search to a specific subreddit
Example:
{
"query": "rust programming",
"subreddit": "rust"
}
Returns: JSON with post IDs, titles, authors, scores, and comment counts.
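The README doesn't pin down the exact response schema; based on the fields listed above, a result entry presumably resembles the following (field names and values are illustrative):

```json
[
  {
    "id": "abc123",
    "title": "Announcing Rust 1.80",
    "author": "u/example_user",
    "score": 1542,
    "comments": 210,
    "subreddit": "rust"
  }
]
```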
2. get_subreddit_hot
Get hot posts from a specific subreddit.
Parameters:
- subreddit (required) - Subreddit name (without r/)
- limit (optional) - Number of posts (default: 25)
Example:
{
"subreddit": "rust",
"limit": 10
}
3. get_post
Get a specific post with its comments.
Parameters:
- subreddit (required) - Subreddit name
- postId (required) - Reddit post ID (from search results)
Example:
{
"subreddit": "rust",
"postId": "abc123"
}
Returns: Full post body, score, and up to 10 top comments.
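As with search, the exact schema isn't documented here; a hypothetical response shape, with illustrative field names and values, could be:

```json
{
  "id": "abc123",
  "title": "How I learned Rust in 30 days",
  "author": "u/example_user",
  "score": 987,
  "body": "Full self-text of the post...",
  "comments": [
    { "author": "u/commenter", "score": 120, "text": "Great write-up!" }
  ]
}
```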
🔌 Integration with AI Clients
Claude Desktop
Edit ~/.config/Claude/claude_desktop_config.json (Linux), ~/Library/Application Support/Claude/claude_desktop_config.json (macOS), or %APPDATA%\Claude\claude_desktop_config.json (Windows):
{
"mcpServers": {
"redlib": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"--network", "host",
"-e", "REDLIB_URL=http://localhost:8080",
"alfafadock/mcp-redlib:latest"
]
}
}
}
For custom Redlib port (e.g., 8085):
{
"mcpServers": {
"redlib": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"--network", "host",
"-e", "REDLIB_URL=http://localhost:8085",
"alfafadock/mcp-redlib:latest"
]
}
}
}
After updating: Restart Claude Desktop. You should see a hammer icon (🔨) indicating MCP tools are available.
Reference: Claude Desktop MCP Docs
Cursor
Add to ~/.cursor/mcp.json (global) or .cursor/mcp.json (project level):
{
"mcpServers": {
"redlib": {
"command": "docker",
"args": ["run", "-i", "--rm", "--network", "host", "-e", "REDLIB_URL=http://localhost:8080", "alfafadock/mcp-redlib:latest"]
}
}
}
Project-level setup: Create .cursor/mcp.json in your project root.
After updating: Cursor will automatically detect the changes. Use the Command Palette (Cmd/Ctrl+Shift+P) and search for "MCP" to manage servers.
Reference: Cursor MCP Documentation
VS Code / GitHub Copilot
Option A: Workspace Configuration
Create .vscode/mcp.json in your project:
{
"servers": {
"redlib": {
"command": "docker",
"args": ["run", "-i", "--rm", "--network", "host", "-e", "REDLIB_URL=http://localhost:8080", "alfafadock/mcp-redlib:latest"]
}
}
}
Option B: User Configuration (Global)
Use Command Palette (Cmd/Ctrl+Shift+P) → "MCP: Open User Configuration"
{
"mcp": {
"servers": {
"redlib": {
"command": "docker",
"args": ["run", "-i", "--rm", "--network", "host", "-e", "REDLIB_URL=http://localhost:8080", "alfafadock/mcp-redlib:latest"]
}
}
}
}
After updating: Reload VS Code. The tools will appear in GitHub Copilot's Agent Mode.
Reference: VS Code MCP Documentation
OpenAI Codex
Edit ~/.codex/config.toml (global) or .codex/config.toml (project):
[mcp_servers.redlib]
command = "docker"
args = ["run", "-i", "--rm", "--network", "host", "-e", "REDLIB_URL=http://localhost:8080", "alfafadock/mcp-redlib:latest"]
Project-level setup: Create .codex/config.toml in your project root.
After updating: Restart Codex. Use codex mcp list to verify the server is loaded.
Reference: Codex MCP Documentation
ForgeCode
ForgeCode supports MCP servers via the forge mcp command for easy import.
Option A: Quick Import (Recommended)
Use the built-in MCP import functionality:
# Import the Redlib MCP server
forge mcp import alfafadock/mcp-redlib:latest
# List imported servers
forge mcp list
# Reload to apply changes
forge mcp reload
Option B: Project Configuration
Create .mcp.json in your project root:
{
"mcpServers": {
"redlib": {
"command": "docker",
"args": ["run", "-i", "--rm", "--network", "host", "-e", "REDLIB_URL=http://localhost:8080", "alfafadock/mcp-redlib:latest"]
}
}
}
Option C: Global Configuration
Edit ForgeCode's config directory (check extension settings for the exact path).
After updating: Reload the ForgeCode extension using forge mcp reload. The MCP tools should appear in the AI assistant interface.
Reference: ForgeCode Documentation
KiloCode
Option A: Global Configuration
Edit ~/.config/kilo/kilo.jsonc or use Settings → MCP in KiloCode:
{
"mcpServers": {
"redlib": {
"command": "docker",
"args": ["run", "-i", "--rm", "--network", "host", "-e", "REDLIB_URL=http://localhost:8080", "alfafadock/mcp-redlib:latest"]
}
}
}
Option B: Project Configuration
Create .kilocode/mcp.json or kilo.jsonc in your project root:
{
"mcpServers": {
"redlib": {
"command": "docker",
"args": ["run", "-i", "--rm", "--network", "host", "-e", "REDLIB_URL=http://localhost:8080", "alfafadock/mcp-redlib:latest"]
}
}
}
After updating: Open KiloCode Settings → MCP → Add Server, or edit the config file directly.
Reference: KiloCode MCP Documentation
🔒 Security: Default vs Hardened
| Feature | Default (latest) | Hardened (hardened) |
|---|---|---|
| User | root | Non-root (mcpuser) |
| File Ownership | root | mcpuser |
| Build Stages | Single | Multi-stage (smaller) |
| Runtime Caps | Default | Requires --cap-drop=ALL |
| Use Case | Development, testing | Production |
Using Hardened Image
docker run -i --rm \
--cap-drop=ALL \
--security-opt no-new-privileges:true \
--network host \
-e REDLIB_URL=http://localhost:8080 \
alfafadock/mcp-redlib:hardened
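The docker run flags above map directly onto Compose keys; a hardened variant of the docker-compose.yml from earlier might look like this (untested sketch):

```yaml
services:
  redlib-mcp:
    image: alfafadock/mcp-redlib:hardened
    network_mode: "host"
    cap_drop:
      - ALL                      # drop all Linux capabilities
    security_opt:
      - no-new-privileges:true   # block privilege escalation
    environment:
      - REDLIB_URL=http://localhost:8080
    restart: unless-stopped
```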
💻 Development
Project Structure
redlib-mcp-server/
├── src/
│ └── index.ts # Main server code
├── dist/ # Compiled JavaScript (gitignored)
├── Dockerfile # Default Docker image
├── Dockerfile.hardened # Hardened Docker image
├── docker-compose.yml # Production compose
├── package.json
├── tsconfig.json
└── README.md
Build from Source
# Install dependencies
npm install
# Build TypeScript
npm run build
# Run locally
npm start
Build Custom Docker Images
# Default version
docker build -t redlib-mcp-server .
# Hardened version
docker build -f Dockerfile.hardened -t redlib-mcp-server:hardened .
📝 Example Usage with AI
Once connected to your AI client (e.g., Claude), you can:
User: "Search Reddit for 'home lab setup' and summarize the top results"
AI uses search_reddit tool →
Returns structured JSON with posts →
AI summarizes the findings for you
User: "Get the full post and comments for that Rust tutorial I searched earlier"
AI uses get_post with postId →
Returns post body + comments →
AI provides detailed analysis
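Behind the scenes, each of these requests is a standard MCP tools/call message sent to the server over stdio; for the search example it looks roughly like this (the id and argument values are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_reddit",
    "arguments": { "query": "home lab setup" }
  }
}
```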
🤝 Contributing
Contributions welcome! Please:
- Fork the repository
- Create a feature branch
- Submit a pull request
📄 License
MIT License - feel free to use this project however you want!
🙏 Acknowledgments
- Redlib - The private Reddit front-end this server interfaces with
- Model Context Protocol - The protocol that makes this integration possible
- Cheerio - HTML parsing library used to extract structured data
🔗 Links
- Docker Hub: alfafadock/mcp-redlib
- Issues: GitHub Issues
- Redlib: github.com/redlib-org/redlib
Made with ❤️ for the privacy-conscious Reddit community