CarDeals-MCP

Search used car listings from Cars.com, Autotrader, and KBB with AI assistants.

An MCP (Model Context Protocol) server that aggregates and searches car listings from multiple sources. It scrapes listings in parallel, extracts price, mileage, and dealer info, and applies optional CARFAX-style filters (1-owner, no accidents, personal use).

License: MIT


πŸš€ Quick Start

Prerequisites

  • Node.js (v16 or higher)
  • Chrome/Chromium browser installed (required by Puppeteer)
    • If Chrome is not in the default location, set PUPPETEER_EXECUTABLE_PATH environment variable to point to your Chrome/Chromium binary

Installation

# Clone the repository
git clone https://github.com/SiddarthaKoppaka/car_deals_search_mcp.git
cd car_deals_search_mcp

# Install dependencies (includes Puppeteer)
npm install

Using with MCP Clients

Configure your MCP client (Claude Desktop, VS Code, GitHub Copilot, etc.) to use this server:

For Claude Desktop (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "car-deals": {
      "command": "node",
      "args": ["/absolute/path/to/car_deals_search_mcp/src/server.js"]
    }
  }
}

For other MCP clients, refer to their documentation and use:

  • Command: node
  • Args: ["<absolute-path-to-repo>/src/server.js"]
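
For example, VS Code (listed above) can load MCP servers from a .vscode/mcp.json file in the workspace. The snippet below is a minimal sketch assuming that schema; check your VS Code version's documentation, as the exact format may differ:

{
  "servers": {
    "car-deals": {
      "type": "stdio",
      "command": "node",
      "args": ["/absolute/path/to/car_deals_search_mcp/src/server.js"]
    }
  }
}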

Testing Standalone

# Run the test command
npm test

# Or test manually with a specific search
node -e "
const { scrapeCarscom } = require('./src/scraper.js');
scrapeCarscom({
  make: 'Toyota',
  model: 'Camry',
  oneOwner: true,
  noAccidents: true,
  personalUse: true
}, 5).then(listings => listings.forEach(l => console.log(l.format())));
"

✨ Features

  • Multi-source aggregation: Search Cars.com, Autotrader, and KBB simultaneously
  • Smart filtering: CARFAX-style filters (1-Owner, No Accidents, Personal Use)
  • Deal ratings: Heuristic-based deal quality assessment (see the sketch after this list)
  • Parallel scraping: Fast concurrent queries across sources
  • Stealth mode: Puppeteer with anti-bot detection techniques
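
The deal rating is heuristic and this README does not document the exact formula, so the snippet below is only an illustrative sketch of how a listing's price could be compared against a rough market average. rateDeal, marketAverage, and the thresholds are hypothetical, not the project's actual code:

// Hypothetical price-based deal rating heuristic (not the implementation in src/scraper.js)
function rateDeal(listing, marketAverage) {
  const ratio = listing.price / marketAverage; // how the asking price compares to similar cars
  if (ratio <= 0.90) return 'Great Deal';      // well below the average price
  if (ratio <= 0.97) return 'Good Deal';       // slightly below average
  if (ratio <= 1.05) return 'Fair Deal';       // roughly at market
  return 'Above Market';                       // priced above comparable listings
}

console.log(rateDeal({ price: 23491 }, 25500)); // -> 'Good Deal'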

πŸ“Š Supported Sources

| Source     | Price | Mileage | Deal Rating | Dealer Info | CARFAX Filters |
|------------|-------|---------|-------------|-------------|----------------|
| Cars.com   | βœ…    | βœ…      | βœ…          | βœ…          | βœ…             |
| Autotrader | βœ…    | βœ…      | ⚠️ Limited  | βœ…          | ⚠️ Limited     |
| KBB        | βœ…    | βœ…      | βœ…          | ⚠️ Limited  | ⚠️ Limited     |

πŸ”§ MCP Tool: search_car_deals

Parameters

| Parameter   | Type    | Required | Description                                                        |
|-------------|---------|----------|--------------------------------------------------------------------|
| make        | string  | βœ…       | Car manufacturer (e.g., "Toyota", "Honda")                         |
| model       | string  | βœ…       | Car model (e.g., "Camry", "Accord")                                |
| zip         | string  | ❌       | ZIP code for local search (default: "90210")                       |
| yearMin     | integer | ❌       | Minimum model year                                                 |
| yearMax     | integer | ❌       | Maximum model year                                                 |
| priceMax    | integer | ❌       | Maximum price in USD                                               |
| mileageMax  | integer | ❌       | Maximum mileage                                                    |
| maxResults  | integer | ❌       | Max results per source (default: 10)                               |
| sources     | array   | ❌       | Sources to query: ["cars.com", "autotrader", "kbb"] (default: all) |
| oneOwner    | boolean | ❌       | Filter for CARFAX 1-owner vehicles only                            |
| noAccidents | boolean | ❌       | Filter for no accidents reported                                   |
| personalUse | boolean | ❌       | Filter for personal use only (not rental/fleet)                    |
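
Example Request

The tool arguments are a flat JSON object; your MCP client handles the surrounding protocol envelope. The values below are illustrative:

{
  "make": "Toyota",
  "model": "Camry",
  "zip": "90210",
  "yearMin": 2020,
  "priceMax": 25000,
  "maxResults": 5,
  "oneOwner": true,
  "noAccidents": true,
  "personalUse": true
}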

Example Response

πŸš— 2021 Toyota Camry XSE
   πŸ’° Price: $23,491
   πŸ“ Mileage: 52,649 mi
   ⭐ Deal Rating: Good Deal
   πŸ† CARFAX: 1-Owner | No Accidents | Personal Use
   πŸͺ Dealer: Valencia BMW
   🌐 Source: Cars.com
   πŸ”— https://www.cars.com/vehicledetail/...

πŸ› οΈ Technical Details

  • Scraping: Puppeteer (headless Chromium) with a stealth plugin to bypass bot detection
  • Concurrency: Parallel scraper workers for simultaneous multi-source queries
  • Protocol: Implements MCP (Model Context Protocol) for AI assistant integration
  • Data extraction: Source-specific parsers normalize listings into a common schema
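
The common schema is not spelled out in this README, so the object below is an assumption based on the fields the tool reports (price, mileage, deal rating, CARFAX flags, dealer, source, URL); treat it as a sketch rather than the exact structure produced in src/scraper.js:

// Hypothetical normalized listing shape; field names are illustrative
const exampleListing = {
  title: '2021 Toyota Camry XSE',
  price: 23491,             // USD
  mileage: 52649,           // miles
  dealRating: 'Good Deal',  // heuristic; may be unavailable for some sources
  carfax: { oneOwner: true, noAccidents: true, personalUse: true },
  dealer: 'Valencia BMW',
  source: 'Cars.com',
  url: 'https://www.cars.com/vehicledetail/...'
};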

Chrome/Chromium Requirement

This project uses Puppeteer, which requires Chrome or Chromium to be installed:

  • macOS: Chrome is typically at /Applications/Google Chrome.app/Contents/MacOS/Google Chrome
  • Linux: Usually auto-detected by Puppeteer or at /usr/bin/chromium-browser
  • Windows: Typically at C:\Program Files\Google\Chrome\Application\chrome.exe

If Puppeteer cannot find your browser, set the environment variable:

export PUPPETEER_EXECUTABLE_PATH="/path/to/chrome"

πŸ§ͺ Development & Testing

# Run tests
npm test

# Test individual scrapers
node src/scraper.js

# View code structure
ls -la src/

🀝 Contributing

Contributions are welcome! Please follow this workflow:

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Add tests for new functionality
  4. Commit your changes (git commit -m 'Add amazing feature')
  5. Push to the branch (git push origin feature/amazing-feature)
  6. Open a Pull Request

Please include test coverage for scraping/parsing changes to avoid regressions when source sites update.


πŸ“„ License

MIT License - see LICENSE file for details

