SearXNG MCP Server
An MCP server that integrates the SearXNG API, giving AI assistants web search capabilities.
Quick Start
Add to your MCP client configuration (e.g. claude_desktop_config.json):
{
"mcpServers": {
"searxng": {
"command": "npx",
"args": ["-y", "mcp-searxng"],
"env": {
"SEARXNG_URL": "YOUR_SEARXNG_INSTANCE_URL"
}
}
}
}
Replace YOUR_SEARXNG_INSTANCE_URL with the URL of your SearXNG instance (e.g. https://search.example.com).
Features
- Web Search: General queries, news, articles, with pagination.
- URL Content Reading: Advanced content extraction with pagination, section filtering, and heading extraction.
- Intelligent Caching: URL content is cached with TTL (Time-To-Live) to improve performance and reduce redundant requests.
- Pagination: Control which page of results to retrieve.
- Time Filtering: Filter results by time range (day, month, year).
- Language Selection: Filter results by preferred language.
- Safe Search: Control content filtering level for search results.
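The intelligent caching described above can be sketched as a simple map whose entries expire after a fixed time-to-live. This is an illustrative sketch, not the project's actual implementation; the names (`TtlCache`, `ttlMs`) and the lazy-eviction strategy are assumptions.

```typescript
// Minimal TTL cache sketch. Entries expire ttlMs milliseconds after insertion;
// expired entries are evicted lazily on the next lookup.
// Names and structure are illustrative, not mcp-searxng's actual code.
class TtlCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  // `now` is injectable so the cache can be tested with a fake clock.
  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (this.now() >= entry.expiresAt) {
      this.store.delete(key); // lazily evict on expired lookup
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V): void {
    this.store.set(key, { value, expiresAt: this.now() + this.ttlMs });
  }
}

// Example: cache fetched page content for 5 minutes, using a fake clock.
let clock = 0;
const cache = new TtlCache<string>(300_000, () => clock);
cache.set("https://example.com", "# Example markdown");
console.log(cache.get("https://example.com")); // → "# Example markdown"
clock += 300_001; // advance past the TTL
console.log(cache.get("https://example.com")); // → undefined
```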
How It Works
mcp-searxng is a standalone MCP server — a separate Node.js process that your AI assistant connects to for web search. It queries any SearXNG instance via its HTTP JSON API.
Not a SearXNG plugin: this project cannot be installed as a native SearXNG plugin. Instead, point it at any existing SearXNG instance by setting SEARXNG_URL.
AI Assistant (e.g. Claude)
│ MCP protocol
▼
mcp-searxng (this project — Node.js process)
│ HTTP JSON API (SEARXNG_URL)
▼
SearXNG instance
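The HTTP leg of the diagram is an ordinary GET against SearXNG's /search endpoint with format=json. The query parameters below (q, format, pageno, time_range, language, safesearch) are SearXNG's documented parameters; the helper function itself is an illustrative sketch, not mcp-searxng's actual code.

```typescript
// Sketch of the URL mcp-searxng might build when querying a SearXNG instance.
// Parameter names match SearXNG's search API; the function is illustrative.
function buildSearchUrl(
  base: string,
  query: string,
  opts: { pageno?: number; time_range?: string; language?: string; safesearch?: number } = {}
): string {
  const url = new URL("/search", base);
  url.searchParams.set("q", query);
  url.searchParams.set("format", "json"); // requires JSON enabled in settings.yml
  if (opts.pageno) url.searchParams.set("pageno", String(opts.pageno));
  if (opts.time_range) url.searchParams.set("time_range", opts.time_range);
  if (opts.language) url.searchParams.set("language", opts.language);
  if (opts.safesearch !== undefined) url.searchParams.set("safesearch", String(opts.safesearch));
  return url.toString();
}

// Example:
const u = buildSearchUrl("http://localhost:8080", "mcp servers", { pageno: 2, time_range: "month" });
console.log(u);
// → http://localhost:8080/search?q=mcp+servers&format=json&pageno=2&time_range=month
```

The server would then fetch this URL and relay the JSON result set back over MCP.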
Tools
searxng_web_search
- Execute web searches with pagination
- Inputs:
  - query (string): The search query. This string is passed to external search services.
  - pageno (number, optional): Search page number, starting at 1 (default: 1)
  - time_range (string, optional): Filter results by time range, one of "day", "month", "year" (default: none)
  - language (string, optional): Language code for results (e.g. "en", "fr", "de") or "all" (default: "all")
  - safesearch (number, optional): Safe search filter level (0: none, 1: moderate, 2: strict; default: instance setting)
web_url_read
- Read a URL and convert its content to markdown, with advanced content extraction options
- Inputs:
  - url (string): The URL to fetch and process
  - startChar (number, optional): Starting character position for content extraction (default: 0)
  - maxLength (number, optional): Maximum number of characters to return
  - section (string, optional): Extract content under a specific heading (matches heading text)
  - paragraphRange (string, optional): Return specific paragraph ranges (e.g. '1-5', '3', '10-')
  - readHeadings (boolean, optional): Return only a list of headings instead of full content
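The paragraphRange values above ('1-5', '3', '10-') could be handled as in the following sketch. The helper names (`parseParagraphRange`, `selectParagraphs`) are hypothetical, and treating blank lines as paragraph breaks is an assumption; only the accepted range syntax comes from the tool description.

```typescript
// Sketch of parsing web_url_read's paragraphRange syntax.
// Returns a 1-based inclusive [start, end]; end === Infinity means "to the end".
// Helper names are illustrative, not mcp-searxng's actual code.
function parseParagraphRange(range: string): [number, number] {
  const m = range.trim().match(/^(\d+)(?:-(\d*))?$/);
  if (!m) throw new Error(`invalid paragraph range: ${range}`);
  const start = Number(m[1]);
  if (m[2] === undefined) return [start, start]; // '3'   → paragraph 3 only
  if (m[2] === "") return [start, Infinity];     // '10-' → 10 through the end
  return [start, Number(m[2])];                  // '1-5' → paragraphs 1..5
}

// Assumption: paragraphs are separated by one or more blank lines.
function selectParagraphs(markdown: string, range: string): string {
  const [start, end] = parseParagraphRange(range);
  const paras = markdown.split(/\n{2,}/);
  return paras.slice(start - 1, end === Infinity ? undefined : end).join("\n\n");
}

// Example:
const doc = "one\n\ntwo\n\nthree\n\nfour";
console.log(selectParagraphs(doc, "2-3")); // → "two\n\nthree"
```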
Installation
NPM (global install)
npm install -g mcp-searxng
{
"mcpServers": {
"searxng": {
"command": "mcp-searxng",
"env": {
"SEARXNG_URL": "YOUR_SEARXNG_INSTANCE_URL"
}
}
}
}
Docker
Pre-built image:
docker pull isokoliuk/mcp-searxng:latest
{
"mcpServers": {
"searxng": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"-e", "SEARXNG_URL",
"isokoliuk/mcp-searxng:latest"
],
"env": {
"SEARXNG_URL": "YOUR_SEARXNG_INSTANCE_URL"
}
}
}
}
To pass additional env vars, add -e VAR_NAME to args and the variable to env.
Build locally:
docker build -t mcp-searxng:latest -f Dockerfile .
Use the same config above, replacing isokoliuk/mcp-searxng:latest with mcp-searxng:latest.
Docker Compose
docker-compose.yml:
services:
mcp-searxng:
image: isokoliuk/mcp-searxng:latest
stdin_open: true
environment:
- SEARXNG_URL=YOUR_SEARXNG_INSTANCE_URL
# Add optional variables as needed — see CONFIGURATION.md
MCP client config:
{
"mcpServers": {
"searxng": {
"command": "docker-compose",
"args": ["run", "--rm", "mcp-searxng"]
}
}
}
HTTP Transport
By default the server uses STDIO. Set MCP_HTTP_PORT to enable HTTP mode:
{
"mcpServers": {
"searxng-http": {
"command": "mcp-searxng",
"env": {
"SEARXNG_URL": "YOUR_SEARXNG_INSTANCE_URL",
"MCP_HTTP_PORT": "3000"
}
}
}
}
Endpoints: POST/GET/DELETE /mcp (MCP protocol), GET /health (health check)
Test it:
MCP_HTTP_PORT=3000 SEARXNG_URL=http://localhost:8080 mcp-searxng
curl http://localhost:3000/health
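In HTTP mode the /mcp endpoint speaks JSON-RPC 2.0 per the MCP specification. The sketch below builds a tools/call request body for searxng_web_search; the MCP handshake (initialize, session headers) required by the Streamable HTTP transport is omitted, and the `toolsCallRequest` helper is illustrative, not part of the project.

```typescript
// Sketch of the JSON-RPC 2.0 request body an MCP client would POST to /mcp
// to invoke a tool. Only the message shape is shown; the MCP initialization
// handshake and transport headers are omitted.
function toolsCallRequest(id: number, name: string, args: Record<string, unknown>) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

const body = toolsCallRequest(1, "searxng_web_search", {
  query: "model context protocol",
  pageno: 1,
});
console.log(JSON.stringify(body));
```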
Configuration
Set SEARXNG_URL to your SearXNG instance URL. All other variables are optional.
Full environment variable reference: CONFIGURATION.md
Troubleshooting
403 Forbidden from SearXNG
Your SearXNG instance likely has JSON format disabled. Edit settings.yml (usually /etc/searxng/settings.yml):
search:
formats:
- html
- json
Restart SearXNG (docker restart searxng) then verify:
curl 'http://localhost:8080/search?q=test&format=json'
You should receive a JSON response. If not, confirm the file is correctly mounted and YAML indentation is valid.
See also: SearXNG settings docs · discussion
Contributing
See CONTRIBUTING.md
License
MIT — see LICENSE for details.