Deep Research MCP Server
Deep Research is an agent-based tool that provides web search and advanced research capabilities. It leverages HuggingFace's smolagents and is implemented as an MCP server.
This project is based on HuggingFace's open_deep_research example.
Features
- Web search and information gathering
- PDF and document analysis
- Image analysis and description
- YouTube transcript retrieval
- Archive site search
Requirements
- Python 3.11 or higher
- uv package manager
- The following API keys:
  - OpenAI API key
  - HuggingFace token
  - Serper API key
Installation
- Clone the repository:
git clone https://github.com/Hajime-Y/deep-research-mcp.git
cd deep-research-mcp
- Create a virtual environment and install dependencies:
uv venv
source .venv/bin/activate # For Linux or Mac
# .venv\Scripts\activate # For Windows
uv sync
Environment Variables
Create a .env file in the root directory of the project and set the following environment variables:
OPENAI_API_KEY=your_openai_api_key
HF_TOKEN=your_huggingface_token
SERPER_API_KEY=your_serper_api_key
You can obtain a SERPER_API_KEY by signing up at Serper.dev.
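Since the server cannot run without these keys, a quick preflight check can save a confusing startup failure. Below is a minimal sketch using only the Python standard library; the `missing_env_vars` helper is illustrative and not part of this project, but the variable names match the .env file above:

```python
import os

# The three variables the server expects, as set in .env above.
REQUIRED_VARS = ("OPENAI_API_KEY", "HF_TOKEN", "SERPER_API_KEY")

def missing_env_vars(required=REQUIRED_VARS):
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

if __name__ == "__main__":
    missing = missing_env_vars()
    if missing:
        raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
    print("All required environment variables are set.")
```

Run this after sourcing your .env (for example via `uv run` with python-dotenv, or `export $(cat .env | xargs)`) to confirm the keys are visible to the process.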
Usage
Start the MCP server:
uv run deep_research.py
This will launch the deep_research agent as an MCP server.
Docker Usage
You can also run this MCP server in a Docker container:
# Build the Docker image
docker build -t deep-research-mcp .
# Run with required API keys
docker run -p 8080:8080 \
-e OPENAI_API_KEY=your_openai_api_key \
-e HF_TOKEN=your_huggingface_token \
-e SERPER_API_KEY=your_serper_api_key \
deep-research-mcp
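If you prefer Docker Compose, the same container can be described declaratively. This is a sketch assuming the image built above and a .env file in the project root; the service name and port mapping are illustrative:

```yaml
services:
  deep-research-mcp:
    build: .
    image: deep-research-mcp
    ports:
      - "8080:8080"
    # Reads OPENAI_API_KEY, HF_TOKEN, and SERPER_API_KEY from .env
    env_file:
      - .env
```

With this file in place, `docker compose up` replaces the two manual commands above.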
Registering with MCP Clients
To register this Docker container as an MCP server in different clients:
Claude Desktop
Add the following to your Claude Desktop configuration file (typically located at ~/.config/Claude/claude_desktop_config.json on Linux, ~/Library/Application Support/Claude/claude_desktop_config.json on macOS, or %APPDATA%\Claude\claude_desktop_config.json on Windows):
{
"mcpServers": {
"deep-research-mcp": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e", "OPENAI_API_KEY=your_openai_api_key",
"-e", "HF_TOKEN=your_huggingface_token",
"-e", "SERPER_API_KEY=your_serper_api_key",
"deep-research-mcp"
]
}
}
}
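To keep secrets out of the args array, Claude Desktop configurations also accept an env object; when `docker run` is given `-e KEY` with no value, it forwards that variable from the client process into the container. A sketch of that variant (same server name and image as above):

```json
{
  "mcpServers": {
    "deep-research-mcp": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "OPENAI_API_KEY",
        "-e", "HF_TOKEN",
        "-e", "SERPER_API_KEY",
        "deep-research-mcp"
      ],
      "env": {
        "OPENAI_API_KEY": "your_openai_api_key",
        "HF_TOKEN": "your_huggingface_token",
        "SERPER_API_KEY": "your_serper_api_key"
      }
    }
  }
}
```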
Cursor IDE
For Cursor IDE, add the following configuration:
{
"mcpServers": {
"deep-research-mcp": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e", "OPENAI_API_KEY=your_openai_api_key",
"-e", "HF_TOKEN=your_huggingface_token",
"-e", "SERPER_API_KEY=your_serper_api_key",
"deep-research-mcp"
]
}
}
}
Using with Remote MCP Server
If you're running the MCP server on a remote machine or exposing it as a service, you can use the URL-based configuration:
{
"mcpServers": {
"deep-research-mcp": {
"url": "http://your-server-address:8080/mcp",
"type": "sse"
}
}
}
Key Components
- deep_research.py: Entry point for the MCP server
- create_agent.py: Agent creation and configuration
- scripts/: Various tools and utilities
  - text_web_browser.py: Text-based web browser
  - text_inspector_tool.py: File inspection tool
  - visual_qa.py: Image analysis tool
  - mdconvert.py: Converts various file formats to Markdown
License
This project is provided under the Apache License 2.0.
Acknowledgements
This project uses code from HuggingFace's smolagents and Microsoft's autogen projects.