# Unchained Sky
Browser automation MCP server that connects AI agents to your real Chrome browser with structured page understanding in ~500 tokens
## Unchained Infra
Open infrastructure and control plane for Unchained, a browser automation system built on raw Chrome DevTools Protocol.
Most browser agents break at authentication. Unchained avoids that by driving the user's own Chrome session, with their existing cookies, extensions, IP, and 2FA state, instead of replaying brittle logins in a sandbox.
This repository contains the public-facing pieces of that system: relay, web UI, agent packaging, browser bridge, deployment assets, and server orchestration. The proprietary extraction engine lives behind a documented runtime boundary in a separate private repository.
## What is in this repo

- unchained/web.py: chat UI, auth flows, scheduler UI, SSE chat transport
- unchained/relay.py: WebSocket relay for browser-agent tunnels and CDP clients
- unchained/chrome_bridge.py: local or headless bridge from Chrome CDP to relay
- unchained/chat_agent_cli.py: local agent lanes for Claude CLI, Codex CLI, and related model backends
- unchained/agent_package.py: downloadable agent bundle generator
- docker-compose.yml: production deployment topology
- deploy.sh and deploy_headless.sh: EC2 deployment entrypoints
## What stays private

The DDM, page-intelligence, and core CDP execution logic are not stored in this repository. Public code talks to that layer through private_core_client.py.
That split is intentional:
- the relay, UI, auth, packaging, and deployment code are open for inspection
- the high-leverage browser extraction heuristics stay in the private core repo
- CI enforces the boundary with import guards and artifact checks
See docs/open-core-split-plan.md for the current open-core model.
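The import-guard idea can be sketched in a few lines. The private package name below (unchained_core) is hypothetical, and this is not the real tools/oss_guard/check_private_imports.py, just an illustration of the pattern:

```python
import ast

# Hypothetical private package prefix; the real guard reads its list from config.
FORBIDDEN_PREFIXES = ("unchained_core",)

def find_private_imports(source: str) -> list[str]:
    """Return every imported module name that crosses the private boundary."""
    hits: list[str] = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            names = [alias.name for alias in node.names]
        elif isinstance(node, ast.ImportFrom):
            names = [node.module or ""]
        else:
            continue
        hits += [n for n in names if n.startswith(FORBIDDEN_PREFIXES)]
    return hits

# A public module importing the private core should be flagged:
violations = find_private_imports("from unchained_core.ddm import extract\n")
```

Running a check like this in CI over every public module keeps the boundary enforceable by machine rather than by convention.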
## Why developers can evaluate this repo
- The browser tunnel is here. You can inspect how agent auth, relay routing, and CDP proxying actually work.
- The deployment path is here. Docker Compose, Caddy routing, EC2 deploy scripts, and headless worker definitions are part of the public repo.
- The trust boundary is explicit. Public services do not directly import private browser-intelligence modules.
- The local dev path is real. You can run the relay and web app locally with ./dev.sh.
## System shape

```
Phone / browser
   |
   | HTTPS + SSE
   v
Caddy -> web -> private_core_client -> private core service
          |          |
          |          +-> chat agent websocket
          |
          +-> relay -> chrome_bridge -> user's Chrome DevTools endpoint
```
More detail: docs/architecture.md
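At its core, the relay pairs each agent connection with a browser bridge keyed by session id. A toy, synchronous sketch of that routing; the real unchained/relay.py does this over WebSockets, and all names here are illustrative:

```python
class Relay:
    """Route frames from agent connections to browser bridges by session id."""

    def __init__(self) -> None:
        # session id -> frames queued for that session's Chrome bridge
        self._bridges: dict[str, list[str]] = {}

    def register_bridge(self, session: str) -> None:
        """A chrome_bridge announces itself for a session."""
        self._bridges[session] = []

    def forward(self, session: str, frame: str) -> bool:
        """Queue a CDP frame for the bridge; False if no bridge is connected."""
        queue = self._bridges.get(session)
        if queue is None:
            return False
        queue.append(frame)
        return True

relay = Relay()
relay.register_bridge("s1")
delivered = relay.forward("s1", '{"id": 1, "method": "Page.enable"}')
```

The important property is that the relay only moves opaque frames between tunnel endpoints; it never interprets page content, which is what keeps it on the public side of the boundary.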
## Quick start

### Local development

```sh
cd unchained-infra/unchained
uv sync
cd ..
./dev.sh
```
Then open http://localhost:8080.
If Google OAuth is not configured, the app falls back to dev auth:
```sh
curl -X POST http://localhost:8080/auth/dev \
  -H 'Content-Type: application/json' \
  -d '{"email":"dev@localhost"}'
```
### Production deploy

```sh
cd ~/Projects/unchainedsky_com
./unchained-infra/tools/install_private_core.sh \
  unchained-core-private/unchained \
  unchained-infra/unchained
cd unchained-infra
KEY_PATH=~/.ssh/unchained-key.pem \
EC2_HOST=<host> \
EC2_USER=ubuntu \
./deploy.sh
```
## Verification
These are the quickest checks for the public repo boundary and local stack:
```sh
cd unchained-infra/unchained
uv run python test_open_core_boundary.py
cd ..
python3 tools/oss_guard/check_private_imports.py
python3 tools/oss_guard/check_agent_artifact_leaks.py
```
## Repository layout

```
unchained-infra/
├── docs/                    # Architecture, setup, roadmap, and design notes
├── deploy/                  # Deployment helpers
├── tools/                   # Private-core overlay + OSS boundary guards
├── unchained/               # Python application code
├── docker-compose.yml       # Production stack
├── docker-compose.headless.yml
└── deploy.sh
```
## Documentation
- docs/README.md: docs index
- docs/architecture.md: service layout and request flow
- docs/cloud-tools-execution-map.md: where browser actions execute across the public/private boundary
- docs/debugging-map.md: trace events and incident triage
- docs/mcp-local-browser-guide.md: run production MCP against your local Chrome bridge
- docs/mcp-frontend-route-plan.md: plan for a public /mcp onboarding route and positioning copy
- docs/you-navigate-demo.md: local setup and reward-critic framing for the "Unchained drives. You navigate." demo
- docs/split-repo-setup.md: CI and private-core overlay
- unchained/benchmark/README.md: local benchmark flow
- unchained/README.md: package-level developer notes
## License