# Web-Scout: AI-Powered Search with LLM Summarization
A FastAPI application that performs web searches using DuckDuckGo and generates AI-powered summaries using Google's Gemini AI.
## Features
- Web search using DuckDuckGo
- AI summarization using Gemini 2.5 Flash
- Two output modes: Summary and Detailed analysis
- Docker & Docker Compose ready
- Secure API key management via environment variables
- MCP (Model Context Protocol) support over HTTP
## Prerequisites
- Docker and Docker Compose installed
- Google Gemini API key
## Setup

- Clone the repository (or navigate to your project directory)

- Add your Gemini API key to the `.env` file:

  ```
  GEMINI_API_KEY=your_actual_gemini_api_key_here
  ```

- Build and run with Docker Compose:

  ```bash
  docker-compose up --build
  ```
## API Usage

The application will be available at http://localhost:8000

### Health Check

```bash
curl http://localhost:8000/health
```
### Search Endpoint

#### Summary Mode (Default)

```bash
curl "http://localhost:8000/search?query=artificial+intelligence"
# or explicitly specify mode=summary
curl "http://localhost:8000/search?query=artificial+intelligence&mode=summary"
```

#### Detailed Mode

```bash
curl "http://localhost:8000/search?query=artificial+intelligence&mode=detailed"
```
### Response Format

```json
{
  "query": "your search query",
  "mode": "summary",
  "summary": "AI-generated analysis...",
  "sources_used": 10
}
```
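For programmatic access, the same endpoint can be called from Python. The snippet below is a minimal client sketch, assuming the third-party `requests` package and a server running on localhost; the response fields match the format shown above.

```python
import requests

BASE_URL = "http://localhost:8000"  # adjust if the server runs elsewhere


def web_scout_search(query: str, mode: str = "summary") -> dict:
    """Call the /search endpoint and return the parsed JSON response."""
    response = requests.get(
        f"{BASE_URL}/search",
        params={"query": query, "mode": mode},  # mode: "summary" or "detailed"
        timeout=60,  # search plus LLM summarization can take a while
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    result = web_scout_search("artificial intelligence")
    print(f"Mode: {result['mode']}, sources used: {result['sources_used']}")
    print(result["summary"])
```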
## MCP Server Integration
Web-Scout can also function as an MCP (Model Context Protocol) server, allowing AI assistants to perform web searches directly through tools.
### MCP Features
- Web Search Tool: Perform web searches with AI summarization
- Dual Mode Support: Both summary and detailed analysis modes
- HTTP Transport: MCP over HTTP protocol for client integration
- JSON Responses: Structured output for easy integration
### MCP Tools Available

#### Web Search Tool

- Name: `web_search`
- Description: Perform a web search using DuckDuckGo and generate AI-powered summaries
- Parameters:
  - `query` (string, required): The search query to perform
  - `mode` (string, optional): Response mode, either "summary" or "detailed" (default: "summary")

An example call against the `/mcp` endpoint is sketched below.
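The sketch shows the shape of the JSON-RPC `tools/call` message an MCP client would send to invoke `web_search` over HTTP. It is illustrative only: the headers, the omitted initialization handshake, and the session handling are simplifying assumptions, and a real integration would normally go through an MCP client SDK rather than raw `requests`.

```python
import requests

MCP_URL = "http://localhost:8000/mcp"

# JSON-RPC 2.0 request an MCP client sends to invoke the web_search tool.
tools_call = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "web_search",
        "arguments": {
            "query": "latest developments in artificial intelligence",
            "mode": "summary",  # or "detailed"
        },
    },
}

# Illustrative POST only: a conforming client first sends an "initialize"
# request and maintains whatever session the HTTP transport requires.
response = requests.post(
    MCP_URL,
    json=tools_call,
    headers={"Accept": "application/json, text/event-stream"},
    timeout=60,
)
print(response.status_code)
print(response.text)
```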
### MCP Server Setup

Web-Scout provides MCP functionality over HTTP, accessible at the `/mcp` endpoint.
#### Method 1: Direct FastAPI Server

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Set your Gemini API key:

  ```bash
  export GEMINI_API_KEY=your_api_key_here
  ```

- Run the HTTP server with the MCP endpoint:

  ```bash
  python main.py
  ```

The MCP endpoint will be available at http://localhost:8000/mcp
#### Method 2: Docker Container

```bash
# Run the HTTP server with Docker (MCP available at /mcp)
docker-compose up --build

# Or run a standalone container
docker run -p 8000:8000 -e GEMINI_API_KEY=your_api_key_here web-scout
```
### Integrating with AI Tools

To use Web-Scout as an MCP server with AI tools like Claude Desktop or Roo:

- Create an MCP configuration:

  ```json
  {
    "mcpServers": {
      "web-scout": {
        "command": "python",
        "args": ["-m", "uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"],
        "env": {
          "GEMINI_API_KEY": "your_gemini_api_key_here"
        }
      }
    }
  }
  ```

- Configure your AI tool to use the MCP configuration:
  - For Claude Desktop: add it to `~/Library/Application Support/Claude/claude_desktop_config.json`
  - For Roo: add it to the appropriate configuration file

- Usage example:

  ```
  Can you search for the latest news about artificial intelligence?
  ```

  The AI tool will use the Web-Scout MCP server (via the `/mcp` endpoint) to perform the search and provide summarized results.
## Development

### Local Development (without Docker)

```bash
# Install dependencies
pip install -r requirements.txt

# Set your API key
export GEMINI_API_KEY=your_api_key_here

# Run the application
uvicorn main:app --reload
```
### Using Docker Compose

```bash
# Build and run
docker-compose up --build

# Run in background
docker-compose up -d --build

# Stop the application
docker-compose down

# View logs
docker-compose logs -f web-scout
```
## Configuration

### Environment Variables

- `GEMINI_API_KEY`: Your Google Gemini API key (required)
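A required variable like this is typically validated once at startup so that a missing key fails fast instead of surfacing on the first request. The snippet below is only a sketch of that pattern, not the project's actual `main.py`.

```python
import os

# Minimal startup check (illustrative): fail early if the key is missing.
GEMINI_API_KEY = os.environ.get("GEMINI_API_KEY")
if not GEMINI_API_KEY:
    raise RuntimeError(
        "GEMINI_API_KEY is not set. Add it to your .env file or export it "
        "before starting the server."
    )
```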
### Docker Configuration
- Port: 8000
- Container name: web-scout
- Health check: Automatic health monitoring
## Security Notes

- The `.env` file is ignored by Git and should never be committed
- API keys are mounted securely via Docker Compose volumes
- The application uses health checks for monitoring
## Error Handling
- Returns proper HTTP status codes
- Includes detailed error messages
- Handles missing API keys and invalid parameters gracefully
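On the client side, these errors arrive as non-2xx responses. The snippet below is a hedged handling sketch: it assumes a FastAPI-style JSON error body with a `detail` field, which is a common convention rather than this server's documented contract.

```python
import requests


def safe_search(query: str, mode: str = "summary") -> str:
    """Return the AI summary, raising a descriptive error on failure."""
    response = requests.get(
        "http://localhost:8000/search",
        params={"query": query, "mode": mode},
        timeout=60,
    )
    if response.ok:
        return response.json()["summary"]

    # Assumed error shape: {"detail": "..."}; fall back to the raw body.
    try:
        detail = response.json().get("detail", response.text)
    except ValueError:
        detail = response.text
    raise RuntimeError(f"Search failed ({response.status_code}): {detail}")
```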