Scrape websites with Oxylabs Web API, supporting dynamic rendering and parsing for structured data extraction.
The Oxylabs MCP server provides a bridge between AI models and the web. It enables them to scrape any URL, render JavaScript-heavy pages, extract and format content for AI use, bypass anti-scraping measures, and access geo-restricted web data from 195+ countries.
This implementation leverages the Model Context Protocol (MCP) to create a secure, standardized way for AI assistants to interact with web content.
When you've set up the MCP server with Claude, you can make requests like:
- Scrape `https://www.google.com/search?q=ai`
- Scrape the page `https://www.amazon.de/-/en/Smartphone-Contract-Function-Manufacturer-Exclusive/dp/B0CNKD651V`
- Scrape `https://www.amazon.de/-/en/gp/bestsellers/beauty/ref=zg_bs_nav_beauty_0` with parse enabled
- Scrape `https://www.bestbuy.com/site/top-deals/all-electronics-on-sale/pcmcat1674241939957.c` with parse and render enabled
Before you begin, make sure you have:
- Oxylabs credentials (username and password) for the environment variables used below
- Via Smithery CLI: the `npx` command-line tool
- Via uv: the `uv` package manager (install it using this guide)

The Oxylabs MCP server supports these parameters:
| Parameter | Description | Values |
| --- | --- | --- |
| `url` | The URL to scrape | Any valid URL |
| `parse` | Enable structured data extraction | `True` or `False` |
| `render` | Use headless browser rendering | `html` or `None` |
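As an illustration, a client invoking the scraper with these parameters would pass an arguments object along these lines (a sketch; the exact tool name and argument casing depend on the client):

```json
{
  "url": "https://www.bestbuy.com/site/top-deals/all-electronics-on-sale/pcmcat1674241939957.c",
  "parse": true,
  "render": "html"
}
```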
Automatically install Oxylabs MCP server via Smithery:
```bash
npx -y @smithery/cli install @oxylabs/oxylabs-mcp --client <client>
```
List of clients supported by Oxylabs at the moment:
Config with `uvx`. Installs the CLI client and the Oxylabs MCP server, which calls the Oxylabs API directly. Recommended and currently the most stable option.
```json
{
  "mcpServers": {
    "oxylabs_scraper_uvx": {
      "command": "uvx",
      "args": [
        "oxylabs-mcp"
      ],
      "env": {
        "OXYLABS_USERNAME": "OXYLABS_USERNAME",
        "OXYLABS_PASSWORD": "OXYLABS_PASSWORD"
      }
    }
  }
}
```
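Before restarting your client, it can help to confirm the config is valid JSON with the expected shape. A minimal Python sketch (the JSON mirrors the `uvx` example above; in practice you would read it from your client's config file, whose path varies by client):

```python
import json

# Sanity-check an MCP server config before restarting the client.
# This inlines the uvx example; normally you'd load the file instead.
config_text = """
{
  "mcpServers": {
    "oxylabs_scraper_uvx": {
      "command": "uvx",
      "args": ["oxylabs-mcp"],
      "env": {
        "OXYLABS_USERNAME": "OXYLABS_USERNAME",
        "OXYLABS_PASSWORD": "OXYLABS_PASSWORD"
      }
    }
  }
}
"""

config = json.loads(config_text)  # raises json.JSONDecodeError on a typo
server = config["mcpServers"]["oxylabs_scraper_uvx"]
assert server["command"] == "uvx"
print("config looks valid")
```

A check like this catches the most common failure mode (a stray comma or unbalanced brace) before the client silently refuses to load the server.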
Config with `npx`. Installs the Smithery CLI client, which calls the Oxylabs MCP server hosted on Smithery.
```json
{
  "mcpServers": {
    "oxylabs-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@smithery/cli@latest",
        "run",
        "@oxylabs/oxylabs-mcp",
        "--config",
        "\"{\\\"oxylabsUsername\\\":\\\"OXYLABS_USERNAME\\\",\\\"oxylabsPassword\\\":\\\"OXYLABS_PASSWORD\\\"}\""
      ]
    }
  }
}
```
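The backslash-heavy `--config` value is easy to get wrong: it is a JSON string whose content is itself escaped JSON. A Python sketch of how the value unwraps after the client parses the config file (the actual unwrapping happens inside the CLI; this just shows why both levels of escaping are needed):

```python
import json

# After the client parses the surrounding config file, the --config
# argument arrives as a quoted JSON string containing escaped JSON.
arg = '"{\\"oxylabsUsername\\":\\"OXYLABS_USERNAME\\",\\"oxylabsPassword\\":\\"OXYLABS_PASSWORD\\"}"'

inner = json.loads(arg)    # strips the outer quotes, yielding JSON text
creds = json.loads(inner)  # parses that text into a dict

assert creds == {
    "oxylabsUsername": "OXYLABS_USERNAME",
    "oxylabsPassword": "OXYLABS_PASSWORD",
}
```

If the server fails to start with a config parse error, a missing backslash in this value is a likely culprit.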
Config with `uv`. Installs the CLI client and an Oxylabs MCP server that runs from the local source code. Intended for local development.
```json
{
  "mcpServers": {
    "oxylabs_scraper": {
      "command": "uv",
      "args": [
        "--directory",
        "/<Absolute-path-to-folder>/oxylabs-mcp",
        "run",
        "oxylabs-mcp"
      ],
      "env": {
        "OXYLABS_USERNAME": "OXYLABS_USERNAME",
        "OXYLABS_PASSWORD": "OXYLABS_PASSWORD"
      }
    }
  }
}
```
> [!NOTE]
> If you don't have the `uvx` utility, you need to install it first with `brew install uv`.
> [!TIP]
> If you run into errors with `uvx`, try using the full path to `uvx` in the `command` field, for example `/Users/my-user/.local/bin/uvx`. If you are using Windows and experiencing issues with Cursor, refer to the guidelines described here.
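To find the full path for the `command` field, you can look it up from a terminal. A small sketch (the fallback path is an assumption based on uv's default install location; adjust for your system):

```shell
# Print the absolute path to uvx for use in the "command" field.
# Falls back to uv's default install location if uvx is not on PATH.
UVX_PATH="$(command -v uvx || echo "$HOME/.local/bin/uvx")"
echo "$UVX_PATH"
```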
Navigate to Claude → Settings → Developer → Edit Config and add one of the configurations above to the `claude_desktop_config.json` file.
Navigate to Cursor → Settings → Cursor Settings → MCP. Click Add new global MCP server and add one of the configurations above.
Clone the repository:

```bash
git clone <git:url>
```
Install MCP server dependencies:
```bash
cd mcp-server-oxylabs

# Create virtual environment and activate it
uv venv
source .venv/bin/activate  # macOS/Linux
# OR
.venv/Scripts/activate  # Windows

# Install dependencies
uv sync
```
```bash
make run
```
Then access MCP Inspector at `http://localhost:5173`. You may need to add your username and password as environment variables in the inspector under `OXYLABS_USERNAME` and `OXYLABS_PASSWORD`.
This server provides two main tools:
Web Scraper API supports JavaScript rendering, parsed structured data, and cleaned HTML in Markdown format. Web Unblocker offers JavaScript rendering and cleaned HTML, but doesn't return parsed data.
This project is licensed under the MIT License.
Established in 2015, Oxylabs is a market-leading web intelligence collection platform, driven by the highest business, ethics, and compliance standards, enabling companies worldwide to unlock data-driven insights.