Tavily
A comprehensive search API for real-time web search, data extraction, and crawling, requiring a Tavily API key.
Tavily MCP Server
An MCP server implementation built on fastmcp that supports both SSE and stdio transports. To use this server, you need a Tavily account and a Tavily API key, which must be set in the TAVILY_API_KEY environment variable.
The Tavily MCP server provides:
- Four tools: search, extract, map, and crawl
- Real-time web search capabilities through the tavily-search tool
- Intelligent data extraction from web pages via the tavily-extract tool
- A web mapping tool that builds a structured map of a website
- A web crawler that systematically explores websites
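Over the wire, MCP tool invocations use JSON-RPC. As an illustrative sketch, a call to the tavily-search tool might look like the following (the "query" argument name is an assumption based on Tavily's search API, not confirmed by this README):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "tavily-search",
    "arguments": { "query": "model context protocol" }
  }
}
```

MCP clients such as Claude Desktop construct these requests for you; you normally only supply the configuration shown below.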
Prerequisites
- git installed (to clone the repository).
- uv installed.
- docker installed (optional; only needed if you plan to run the SSE server inside a container).
To install uv on Linux and macOS, run this in your terminal:
curl -LsSf https://astral.sh/uv/install.sh | sh
Environment Variables
Copy the .env.example file to .env, then set your TAVILY_API_KEY inside it:
TAVILY_API_KEY=<YOUR-API-KEY>
Optionally, you can also configure the port if you plan to use SSE:
TAVILY_MCP_PORT=<PORT>
Running the SSE server
From inside the repository, run:
uv run --env-file .env tavily-mcp-sse
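Once running, clients connect to the server's SSE endpoint. A minimal sketch of constructing that URL, assuming fastmcp's default /sse path and a fallback port of 8000 when TAVILY_MCP_PORT is unset (both are assumptions, not stated in this README):

```shell
# Fall back to 8000 if TAVILY_MCP_PORT is unset (assumed default).
PORT="${TAVILY_MCP_PORT:-8000}"

# /sse is fastmcp's conventional SSE path; treat it as an assumption.
SSE_URL="http://127.0.0.1:${PORT}/sse"
echo "$SSE_URL"
```

Point your MCP client at this URL, or try `curl -N "$SSE_URL"` for a quick connectivity check.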
Running on STDIO
To use the stdio transport, add the following to your MCP client's configuration (for Claude Desktop, this is claude_desktop_config.json):
{
"mcpServers": {
"tavily-mcp-server": {
"command": "uv",
"args": [
"run",
"--directory",
"<LOCATION-TO-THE-REPO>",
"tavily-mcp-stdio"
],
"env": {
"TAVILY_API_KEY": "<YOUR-API-KEY>"
}
}
}
}
Docker SSE Server
First, build the image from the Dockerfile in this repository:
docker build -t tavily-mcp .
Then run the container, loading the environment variables from the .env file:
docker run --name tavily-mcp \
-p 127.0.0.1:8000:8000 \
--env-file .env \
tavily-mcp
Alternatively, pass the environment variables directly:
docker run --name tavily-mcp \
-p 127.0.0.1:8000:8000 \
-e TAVILY_API_KEY=<YOUR-API-KEY> \
tavily-mcp
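For repeatable setups, the same container can be described with Docker Compose. A sketch, assuming the image is built from this repository's Dockerfile and the .env file from earlier (the service name is illustrative):

```yaml
services:
  tavily-mcp:
    build: .
    ports:
      - "127.0.0.1:8000:8000"
    env_file:
      - .env
```

Start it with `docker compose up --build`.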
Related Servers
Crawleo MCP Server
Crawleo MCP - Web Search & Crawl for AI. Enables AI assistants to access real-time web data through native tool integration. Two tools:
- web.search - real-time web search with flexible formatting: search from any country/language, device-specific results (desktop, mobile, tablet), multiple output formats (Enhanced HTML for AI, raw HTML, Markdown, plain text), an auto-crawl option for full content extraction, and multi-page search support.
- web.crawl - deep content extraction: clean content from any URL, JavaScript rendering support, Markdown conversion, screenshot capture, and multi-URL support.
Features: zero data retention (complete privacy), real-time rather than cached results, AI-optimized Enhanced HTML mode, global coverage (any country/language), device-specific search, flexible output formats, cost-effective (5-10x cheaper than competitors), and simple Claude Desktop integration. Suited for research, content analysis, data extraction, AI agents, RAG pipelines, and multi-device testing.
NPMLens MCP
NPMLens MCP lets your coding agent (such as Claude, Cursor, Copilot, Gemini, or Codex) search the npm registry and fetch package context (README, downloads, GitHub info, usage snippets). It acts as a Model Context Protocol (MCP) server, giving your AI assistant a structured way to discover libraries and integrate them quickly.
Mastra Docs Server
Provides AI assistants with direct access to Mastra.ai's complete knowledge base.
Google Scholar MCP
An MCP server for searching Google Scholar, built for AI assistants and automation workflows that need papers, authors, citations, and BibTeX entries.
mcpdoc
Access website documentation for AI search engines (llms.txt files) over MCP.
arXiv Research Assistant
Interact with the arXiv.org paper database. Supports keyword search, paper lookups, author searches, and trend analysis.
Unsplash MCP Server
Search and integrate images from Unsplash using its official API.
mcp-seo-audit
SEO audit and Google Search Console MCP server with 23 tools. Search analytics, URL inspection, Indexing API, Core Web Vitals (CrUX), striking distance keywords, keyword cannibalization detection, branded query analysis, and automated site audits.
Ripgrep Search
Efficiently search Obsidian vaults using the ripgrep tool.
MCP-MCP
A meta-server for discovering and provisioning other MCP servers from a large database.