# CrawlForge MCP Server

Professional web scraping and content extraction server implementing the Model Context Protocol (MCP). CrawlForge gives Claude, Cursor, and any MCP-compatible client 20 tools to fetch URLs, extract structured data with CSS/XPath selectors, run deep multi-step research, bypass anti-bot detection with TLS fingerprint randomization, process documents, monitor page changes, and more. Get started with 1,000 free credits per month - no credit card required!
## Features
- 20 Professional Tools: Web scraping, deep research, stealth browsing, content analysis
- Free Tier: 1,000 credits to get started instantly
- MCP Compatible: Works with Claude, Cursor, and other MCP-enabled AI tools
- Enterprise Ready: Scale up with paid plans for production use
- Credit-Based: Pay only for what you use
## Quick Start (2 Minutes)

### 1. Install from NPM

```bash
npm install -g crawlforge-mcp-server
```
### 2. Set Up Your API Key

```bash
npx crawlforge-setup
```
This will:
- Guide you through getting your free API key
- Configure your credentials securely
- Auto-configure Claude Code and Cursor (if installed)
- Verify your setup is working
Don't have an API key? Get one free at https://www.crawlforge.dev/signup
### 3. Configure Your IDE (if not auto-configured)

#### For Claude Desktop

Add to `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "crawlforge": {
      "command": "npx",
      "args": ["crawlforge-mcp-server"]
    }
  }
}
```
Location:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%/Claude/claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
Restart Claude Desktop to activate.
#### For Claude Code CLI (Auto-configured)

The setup wizard automatically configures Claude Code by adding to `~/.claude.json`:

```json
{
  "mcpServers": {
    "crawlforge": {
      "type": "stdio",
      "command": "crawlforge"
    }
  }
}
```
After setup, restart Claude Code to activate.
#### For Cursor IDE (Auto-configured)

The setup wizard automatically configures Cursor by adding an entry to `~/.cursor/mcp.json`.
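The Cursor entry isn't reproduced here; presumably it mirrors the Claude Code one. A sketch, assuming the same stdio command name installed by the setup wizard:

```json
{
  "mcpServers": {
    "crawlforge": {
      "command": "crawlforge"
    }
  }
}
```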
Restart Cursor to activate.
## Available Tools

### Basic Tools (1 credit each)

- `fetch_url` - Fetch content from any URL
- `extract_text` - Extract clean text from web pages
- `extract_links` - Get all links from a page
- `extract_metadata` - Extract page metadata
### Advanced Tools (2-3 credits)

- `scrape_structured` - Extract structured data with CSS selectors
- `search_web` - Search the web using Google Search API
- `summarize_content` - Generate intelligent summaries
- `analyze_content` - Comprehensive content analysis
- `extract_structured` - LLM-powered schema-driven extraction
- `track_changes` - Monitor content changes over time
### Premium Tools (5-10 credits)

- `crawl_deep` - Deep crawl entire websites
- `map_site` - Discover and map website structure
- `batch_scrape` - Process multiple URLs simultaneously
- `deep_research` - Multi-stage research with source verification
- `stealth_mode` - Anti-detection browser management
### Heavy Processing (3-10 credits)

- `process_document` - Multi-format document processing
- `extract_content` - Enhanced content extraction
- `scrape_with_actions` - Browser automation chains
- `generate_llms_txt` - Generate AI interaction guidelines
- `localization` - Multi-language and geo-location management
## Pricing
| Plan | Credits/Month | Best For |
|---|---|---|
| Free | 1,000 | Testing & personal projects |
| Starter | 5,000 | Small projects & development |
| Professional | 50,000 | Professional use & production |
| Enterprise | 250,000 | Large scale operations |
All plans include:
- Access to all 20 tools
- Credits never expire and roll over month-to-month
- API access and webhook notifications
View full pricing
## Advanced Configuration

### Environment Variables

```bash
# Optional: set the API key via environment
export CRAWLFORGE_API_KEY="cf_live_your_api_key_here"

# Optional: custom API endpoint (for enterprise)
export CRAWLFORGE_API_URL="https://api.crawlforge.dev"
```
### Manual Configuration

Your configuration is stored at `~/.crawlforge/config.json`:

```json
{
  "apiKey": "cf_live_...",
  "userId": "user_...",
  "email": "[email protected]"
}
```
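How the server combines these two credential sources isn't documented here; a minimal sketch of one plausible resolution order (environment variable first, falling back to the config file - an assumption for illustration, not confirmed CrawlForge behavior):

```python
import json
import os
from pathlib import Path

CONFIG_PATH = Path.home() / ".crawlforge" / "config.json"

def resolve_api_key(env=os.environ, config_path=CONFIG_PATH):
    """Return the API key, preferring CRAWLFORGE_API_KEY over config.json.

    The precedence order shown is an assumption; consult the CrawlForge
    docs for the authoritative behavior.
    """
    key = env.get("CRAWLFORGE_API_KEY")
    if key:
        return key
    if config_path.exists():
        return json.loads(config_path.read_text()).get("apiKey")
    return None

# An explicit env mapping takes precedence over any config file on disk
print(resolve_api_key(env={"CRAWLFORGE_API_KEY": "cf_live_example"}))
```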
## Usage Examples
Once configured, use these tools in your AI assistant:
- "Search for the latest AI news"
- "Extract all links from example.com"
- "Crawl the documentation site and summarize it"
- "Monitor this page for changes"
- "Extract product prices from this e-commerce site"
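Under the hood, an MCP client turns prompts like these into JSON-RPC 2.0 `tools/call` requests sent to the server over stdio. A sketch of the message shape for `fetch_url` (the argument name `url` is an illustrative assumption, not taken from CrawlForge's published tool schema):

```python
import json

# Build an MCP tools/call request (JSON-RPC 2.0), as an MCP client would
# send it to the crawlforge stdio server.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "fetch_url",
        # "url" is an assumed argument name for illustration only
        "arguments": {"url": "https://example.com"},
    },
}

print(json.dumps(request, indent=2))
```

The server replies with a JSON-RPC response whose result carries the tool output; the connected assistant then works with that content directly.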
## Security & Privacy
- Secure Authentication: API keys required for all operations (no bypass methods)
- Local Storage: API keys stored securely at `~/.crawlforge/config.json`
- HTTPS Only: All connections use encrypted HTTPS
- No Data Retention: We don't store scraped data, only usage logs
- Rate Limiting: Built-in protection against abuse
- Compliance: Respects robots.txt and GDPR requirements
### Security Updates
v3.0.3 (2025-10-01): Removed authentication bypass vulnerability. All users must authenticate with valid API keys.
## Support
- Documentation: https://www.crawlforge.dev/docs
- Issues: GitHub Issues
- Email: [email protected]
- Discord: Join our community
## License
MIT License - see LICENSE file for details.
## Contributing
Contributions are welcome! Please read our Contributing Guide first.
Built with ❤️ by the CrawlForge team
Website | Documentation | API Reference