CrawlForge MCP

CrawlForge MCP is a production-ready MCP server with 18 web scraping tools for AI agents. It gives Claude, Cursor, and any MCP-compatible client the ability to fetch URLs, extract structured data with CSS/XPath selectors, run deep multi-step research, bypass anti-bot detection with TLS fingerprint randomization, process documents, monitor page changes, and more. Credit-based pricing with a free tier (1,000 credits/month, no credit card required).


🎯 Features

  • 18 Professional Tools: Web scraping, deep research, stealth browsing, content analysis
  • Free Tier: 1,000 credits to get started instantly
  • MCP Compatible: Works with Claude, Cursor, and other MCP-enabled AI tools
  • Enterprise Ready: Scale up with paid plans for production use
  • Credit-Based: Pay only for what you use

πŸš€ Quick Start (2 Minutes)

1. Install from NPM

npm install -g crawlforge-mcp-server

2. Setup Your API Key

npx crawlforge-setup

This will:

  • Guide you through getting your free API key
  • Configure your credentials securely
  • Auto-configure Claude Code and Cursor (if installed)
  • Verify your setup is working

Don't have an API key? Get one free at https://www.crawlforge.dev/signup

3. Configure Your IDE (if not auto-configured)

πŸ€– For Claude Desktop

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "crawlforge": {
      "command": "npx",
      "args": ["crawlforge-mcp-server"]
    }
  }
}

Location:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%/Claude/claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

Restart Claude Desktop to activate.
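If you script the configuration step, the per-platform locations above can be resolved programmatically. A minimal sketch (the paths mirror the list above; the function name is illustrative, not part of CrawlForge):

```python
import os
import platform

def claude_config_path():
    """Return the Claude Desktop config location for the current OS."""
    home = os.path.expanduser("~")
    system = platform.system()
    if system == "Darwin":  # macOS
        return os.path.join(home, "Library", "Application Support",
                            "Claude", "claude_desktop_config.json")
    if system == "Windows":
        return os.path.join(os.environ.get("APPDATA", ""),
                            "Claude", "claude_desktop_config.json")
    # Linux and other Unix-like systems
    return os.path.join(home, ".config", "Claude", "claude_desktop_config.json")
```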

πŸ–₯️ For Claude Code CLI (Auto-configured)

The setup wizard automatically configures Claude Code by adding to ~/.claude.json:

{
  "mcpServers": {
    "crawlforge": {
      "type": "stdio",
      "command": "crawlforge"
    }
  }
}

After setup, restart Claude Code to activate.

πŸ’» For Cursor IDE (Auto-configured)

The setup wizard automatically configures Cursor by adding to ~/.cursor/mcp.json:

{
  "mcpServers": {
    "crawlforge": {
      "type": "stdio",
      "command": "crawlforge"
    }
  }
}

Restart Cursor to activate.

πŸ“Š Available Tools

Basic Tools (1 credit each)

  • fetch_url - Fetch content from any URL
  • extract_text - Extract clean text from web pages
  • extract_links - Get all links from a page
  • extract_metadata - Extract page metadata

Advanced Tools (2-3 credits)

  • scrape_structured - Extract structured data with CSS selectors
  • search_web - Search the web using Google Search API
  • summarize_content - Generate intelligent summaries
  • analyze_content - Comprehensive content analysis

Premium Tools (5-10 credits)

  • crawl_deep - Deep crawl entire websites
  • map_site - Discover and map website structure
  • batch_scrape - Process multiple URLs simultaneously
  • deep_research - Multi-stage research with source verification
  • stealth_mode - Anti-detection browser management

Heavy Processing (3-10 credits)

  • process_document - Multi-format document processing
  • extract_content - Enhanced content extraction
  • scrape_with_actions - Browser automation chains
  • generate_llms_txt - Generate AI interaction guidelines
  • localization - Multi-language and geo-location management
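Under the hood, an MCP client invokes any of these tools with a JSON-RPC 2.0 tools/call request over stdio. A minimal sketch of the message framing (the fetch_url argument schema here is an assumption; query the server's tools/list response for the authoritative schema):

```python
import json

def build_tool_call(request_id, tool_name, arguments):
    """Build an MCP tools/call request (JSON-RPC 2.0) as a JSON string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical example: ask the server to fetch a page.
message = build_tool_call(1, "fetch_url", {"url": "https://example.com"})
```

Your MCP client (Claude Desktop, Cursor, etc.) handles this framing for you; the sketch is only to show what travels over the wire.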

πŸ’³ Pricing

Plan           Credits/Month   Best For
Free           1,000           Testing & personal projects
Starter        5,000           Small projects & development
Professional   50,000          Professional use & production
Enterprise     250,000         Large-scale operations

All plans include:

  • Access to all 18 tools
  • Credits never expire and roll over month-to-month
  • API access and webhook notifications

View full pricing
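To pick a plan, you can estimate monthly usage against the per-call costs listed above. A rough sketch (where the README gives a range such as 2-3 credits, the upper bound is assumed; actual billing may differ):

```python
# Per-call credit costs from the tool list above (upper bound where a range is given).
CREDIT_COST = {
    "fetch_url": 1, "extract_text": 1, "extract_links": 1, "extract_metadata": 1,
    "scrape_structured": 3, "search_web": 3, "summarize_content": 3, "analyze_content": 3,
    "crawl_deep": 10, "map_site": 10, "batch_scrape": 10,
    "deep_research": 10, "stealth_mode": 10,
}

def estimate_credits(usage):
    """usage: mapping of tool name -> expected calls per month."""
    return sum(CREDIT_COST[tool] * calls for tool, calls in usage.items())

# e.g. 500 page fetches plus 50 deep crawls per month:
monthly = estimate_credits({"fetch_url": 500, "crawl_deep": 50})
# 500*1 + 50*10 = 1,000 credits, which fits the Free tier
```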

πŸ”§ Advanced Configuration

Environment Variables

# Optional: set your API key via environment variable
export CRAWLFORGE_API_KEY="cf_live_your_api_key_here"

# Optional: custom API endpoint (for enterprise deployments)
export CRAWLFORGE_API_URL="https://api.crawlforge.dev"

Manual Configuration

Your configuration is stored at ~/.crawlforge/config.json:

{
  "apiKey": "cf_live_...",
  "userId": "user_...",
  "email": "[email protected]"
}
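If you read this file from your own tooling, a basic sanity check on the key prefix can catch a misconfigured setup early. A minimal sketch (the "cf_" prefix check follows the examples in this README; the function name is illustrative):

```python
import json
import os

def load_config(path=os.path.expanduser("~/.crawlforge/config.json")):
    """Load the local CrawlForge credentials and sanity-check the API key."""
    with open(path) as f:
        config = json.load(f)
    if not config.get("apiKey", "").startswith("cf_"):
        raise ValueError(f"apiKey missing or malformed in {path}")
    return config
```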

πŸ“– Usage Examples

Once configured, use these tools in your AI assistant:

"Search for the latest AI news"
"Extract all links from example.com"
"Crawl the documentation site and summarize it"
"Monitor this page for changes"
"Extract product prices from this e-commerce site"

πŸ”’ Security & Privacy

  • Secure Authentication: API keys required for all operations (no bypass methods)
  • Local Storage: API keys stored securely at ~/.crawlforge/config.json
  • HTTPS Only: All connections use encrypted HTTPS
  • No Data Retention: We don't store scraped data, only usage logs
  • Rate Limiting: Built-in protection against abuse
  • Compliance: Respects robots.txt and GDPR requirements

Security Updates

v3.0.3 (2025-10-01): Removed authentication bypass vulnerability. All users must authenticate with valid API keys.

πŸ†˜ Support

πŸ“„ License

MIT License - see LICENSE file for details.

🀝 Contributing

Contributions are welcome! Please read our Contributing Guide first.


Built with ❀️ by the CrawlForge team

Website | Documentation | API Reference
