Fast, efficient screenshot capture for web pages, optimized for Claude Vision API. Automatically tiles full pages into 1072x1072 chunks (1.15 megapixels per tile) with configurable viewports and wait strategies for dynamic content.
Built specifically for AI vision workflows, this tool captures high-quality screenshots with automatic resolution limiting and tiling for optimal processing by Claude Vision API and other AI models. It ensures screenshots are perfectly sized at 1072x1072 pixels (1.15 megapixels) for maximum compatibility.
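To picture the tiling: the sketch below is a simplified illustration (not the package's actual code; the computeTiles helper is hypothetical) of how a full-page capture can be split into at-most-1072x1072 tiles, row by row.
// Hypothetical sketch: how a full-page bitmap maps onto 1072x1072 tiles.
interface Tile { x: number; y: number; width: number; height: number; }

const TILE_SIZE = 1072; // ~1.15 megapixels per full tile

function computeTiles(pageWidth: number, pageHeight: number): Tile[] {
  const tiles: Tile[] = [];
  for (let top = 0; top < pageHeight; top += TILE_SIZE) {
    for (let left = 0; left < pageWidth; left += TILE_SIZE) {
      tiles.push({
        x: left,
        y: top,
        width: Math.min(TILE_SIZE, pageWidth - left),  // edge tiles may be smaller
        height: Math.min(TILE_SIZE, pageHeight - top),
      });
    }
  }
  return tiles;
}

// Example: a 1072px-wide page that is 3000px tall yields three tiles:
// 1072x1072, 1072x1072, and 1072x856.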
Claude Code:
claude mcp add screenshot-website-fast -s user -- npx -y @just-every/mcp-screenshot-website-fast
VS Code:
code --add-mcp '{"name":"screenshot-website-fast","command":"npx","args":["-y","@just-every/mcp-screenshot-website-fast"]}'
Cursor (deeplink):
cursor://anysphere.cursor-deeplink/mcp/install?name=screenshot-website-fast&config=eyJzY3JlZW5zaG90LXdlYnNpdGUtZmFzdCI6eyJjb21tYW5kIjoibnB4IiwiYXJncyI6WyIteSIsIkBqdXN0LWV2ZXJ5L21jcC1zY3JlZW5zaG90LXdlYnNpdGUtZmFzdCJdfX0=
JetBrains IDEs: Settings → Tools → AI Assistant → Model Context Protocol (MCP) → Add
Choose "As JSON" and paste:
{"command":"npx","args":["-y","@just-every/mcp-screenshot-website-fast"]}
{
  "mcpServers": {
    "screenshot-website-fast": {
      "command": "npx",
      "args": ["-y", "@just-every/mcp-screenshot-website-fast"]
    }
  }
}
Drop this into your client's mcp.json (e.g. .vscode/mcp.json, ~/.cursor/mcp.json, or .mcp.json for Claude).
Once installed in your IDE, the following tools are available:
screenshot_website_fast
- Captures a high-quality screenshot of a webpage
Parameters:
- url (required): The HTTP/HTTPS URL to capture
- width (optional): Viewport width in pixels (max 1072, default: 1072)
- height (optional): Viewport height in pixels (max 1072, default: 1072)
- fullPage (optional): Capture full page screenshot (default: true)
- waitUntil (optional): Wait until event: load, domcontentloaded, networkidle0, networkidle2 (default: networkidle2)
- waitFor (optional): Additional wait time in milliseconds
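For example, a client call to screenshot_website_fast might pass arguments like these (values are illustrative):
{
  "url": "https://example.com",
  "fullPage": true,
  "waitUntil": "networkidle2",
  "waitFor": 2000
}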
To build and run from source:
npm install
npm run build
# Full page with automatic tiling (default)
npm run dev capture https://example.com -o screenshot.png
# Viewport-only screenshot
npm run dev capture https://example.com --no-full-page -o screenshot.png
# Wait for specific conditions
npm run dev capture https://example.com --wait-until networkidle0 --wait-for 2000 -o screenshot.png
Options:
-w, --width <pixels> - Viewport width (max 1072, default: 1072)
-h, --height <pixels> - Viewport height (max 1072, default: 1072)
--no-full-page - Disable full page capture and tiling
--wait-until <event> - Wait until event: load, domcontentloaded, networkidle0, networkidle2
--wait-for <ms> - Additional wait time in milliseconds
-o, --output <path> - Output file path (required for tiled output)
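These options can be combined; for instance, a fixed 800x600 viewport-only capture might look like this (values chosen arbitrarily):
npm run dev capture https://example.com -w 800 -h 600 --no-full-page -o viewport.png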
Project structure:
mcp-screenshot-website-fast/
├── src/
│   ├── internal/    # Core screenshot capture logic
│   ├── utils/       # Logger and utilities
│   ├── index.ts     # CLI entry point
│   └── serve.ts     # MCP server entry point
# Run in development mode
npm run dev capture https://example.com -o screenshot.png
# Build for production
npm run build
# Run tests
npm test
# Type checking
npm run typecheck
# Linting
npm run lint
Built specifically for AI vision workflows.
Contributions are welcome!
Troubleshooting tips:
- To skip the bundled Chromium download, set PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true and provide a custom executable
- For content that loads late, add extra delay with the --wait-for flag
- For pages that never settle, try different --wait-until strategies
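For example, pointing Puppeteer at a system Chrome install might look like this (PUPPETEER_EXECUTABLE_PATH is a standard Puppeteer variable; the path shown is illustrative):
export PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
export PUPPETEER_EXECUTABLE_PATH=/usr/bin/google-chrome
npx -y @just-every/mcp-screenshot-website-fast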
Licensed under MIT.