# mcp-jinaai-search

Efficient web search optimized for LLM-friendly content using the Jina AI API.
## ⚠️ Notice

**This repository is no longer maintained.**

The functionality of this tool is now available in `mcp-omnisearch`, which combines multiple MCP tools in one unified package. Please use `mcp-omnisearch` instead.
A Model Context Protocol (MCP) server for integrating Jina.ai's Search API with LLMs. This server provides efficient and comprehensive web search capabilities, optimised for retrieving clean, LLM-friendly content from the web.
## Features
- 🔍 Advanced web search through Jina.ai Search API
- 🚀 Fast and efficient content retrieval
- 📄 Clean text extraction with preserved structure
- 🧠 Content optimised for LLMs
- 🌐 Support for various content types including documentation
- 🏗️ Built on the Model Context Protocol
- 🔄 Configurable caching for performance
- 🖼️ Optional image and link gathering
- 🌍 Localisation support through browser locale
- 🎯 Token budget control for response size
## Configuration
This server requires configuration through your MCP client. Here are examples for different environments:
### Cline Configuration

Add this to your Cline MCP settings:

```json
{
  "mcpServers": {
    "jinaai-search": {
      "command": "npx",
      "args": ["-y", "mcp-jinaai-search"],
      "env": {
        "JINAAI_API_KEY": "your-jinaai-api-key"
      }
    }
  }
}
```
### Claude Desktop with WSL Configuration

For WSL environments, add this to your Claude Desktop configuration:

```json
{
  "mcpServers": {
    "jinaai-search": {
      "command": "wsl.exe",
      "args": [
        "bash",
        "-c",
        "JINAAI_API_KEY=your-jinaai-api-key npx mcp-jinaai-search"
      ]
    }
  }
}
```
### Environment Variables

The server requires the following environment variable:

- `JINAAI_API_KEY`: Your Jina.ai API key (required)
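
For a quick check outside an MCP client, the server can be launched directly from the command line with the key set inline (this assumes the package is published on npm under the name above; replace the placeholder with a real Jina.ai API key):

```bash
JINAAI_API_KEY=your-jinaai-api-key npx -y mcp-jinaai-search
```

The server communicates over stdio, so it will wait for MCP protocol messages rather than printing a prompt.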
## API

The server implements a single MCP tool with configurable parameters:

### search

Search the web and get clean, LLM-friendly content using Jina.ai Reader. Returns the top 5 results with URLs and clean content.
Parameters:

- `query` (string, required): Search query
- `format` (string, optional): Response format (`"json"` or `"text"`). Defaults to `"text"`
- `no_cache` (boolean, optional): Bypass cache for fresh results. Defaults to `false`
- `token_budget` (number, optional): Maximum number of tokens for this request
- `browser_locale` (string, optional): Browser locale for rendering content
- `stream` (boolean, optional): Enable stream mode for large pages. Defaults to `false`
- `gather_links` (boolean, optional): Gather all links at the end of the response. Defaults to `false`
- `gather_images` (boolean, optional): Gather all images at the end of the response. Defaults to `false`
- `image_caption` (boolean, optional): Caption images in the content. Defaults to `false`
- `enable_iframe` (boolean, optional): Extract content from iframes. Defaults to `false`
- `enable_shadow_dom` (boolean, optional): Extract content from shadow DOM. Defaults to `false`
- `resolve_redirects` (boolean, optional): Follow redirect chains to the final URL. Defaults to `true`
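
As an illustration, the arguments for a `search` tool call combining several of these parameters might look like the following (the values are placeholders, and the exact request envelope depends on your MCP client):

```json
{
  "name": "search",
  "arguments": {
    "query": "model context protocol",
    "format": "json",
    "token_budget": 4000,
    "gather_links": true,
    "browser_locale": "en-US"
  }
}
```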
## Development

### Setup

1. Clone the repository
2. Install dependencies:

   ```bash
   pnpm install
   ```

3. Build the project:

   ```bash
   pnpm run build
   ```

4. Run in development mode:

   ```bash
   pnpm run dev
   ```
### Publishing

1. Create a changeset:

   ```bash
   pnpm changeset
   ```

2. Version the package:

   ```bash
   pnpm version
   ```

3. Build and publish:

   ```bash
   pnpm release
   ```
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## License
MIT License - see the LICENSE file for details.
## Acknowledgments
- Built on the Model Context Protocol
- Powered by Jina.ai Search API
## Related Servers

- **Embedding MCP Server**: An MCP server powered by txtai for semantic search, knowledge graphs, and AI-driven text processing.
- **Semantic API**: Natural language API discovery; search 700+ API capabilities, get endpoints, auth setup, and code snippets.
- **Gemini Web Search**: Performs web searches using the Gemini Web Search Tool via the local gemini-cli.
- **Coles and Woolworths MCP Server**: Search for products and compare prices at Coles and Woolworths supermarkets in Australia.
- **PubMed MCP Server**: A server for searching, retrieving, and analyzing articles from the PubMed database.
- **Web3 Research MCP**: A free and local tool for in-depth crypto research.
- **MCP Lucene Server**: A Model Context Protocol (MCP) server that exposes Apache Lucene's full-text search capabilities through a conversational interface, allowing AI assistants (like Claude) to help users search, index, and manage document collections without requiring technical knowledge of Lucene or search engines.
- **Console MCP Server**: Bridge external console processes with Copilot by searching through JSON log files.
- **Deep Research**: A server for conducting deep research and generating reports.
- **Perplexity MCP Server**: Adds Perplexity AI as a tool provider for Claude Desktop.