# Perplexica Search
Perform conversational searches with the Perplexica AI-powered answer engine.
## Perplexica MCP Server
A Model Context Protocol (MCP) server for interacting with local Perplexica instances. This server provides tools to perform AI-powered searches using your local Perplexica installation.
## Features
- Search: Perform AI-powered searches with various focus modes
- Streaming Support: Get real-time streaming responses
- Multiple Focus Modes: Support for web search, academic search, writing assistant, and more
- Customizable Models: Configure chat and embedding models
- Conversation History: Maintain context across searches
## Installation
### From Source
```bash
git clone https://github.com/armand0e/perplexica-mcp.git
cd perplexica-mcp
pip install -e .
```
### From Git (Direct Install)
```bash
pip install git+https://github.com/armand0e/perplexica-mcp.git
```
## Configuration
The server connects to a local Perplexica instance. By default, it expects Perplexica to be running on http://localhost:3000.
You can configure the base URL by setting the `PERPLEXICA_BASE_URL` environment variable:

```bash
export PERPLEXICA_BASE_URL=http://localhost:3001
```
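As a sketch of how a client (or the server itself) might resolve this setting, the snippet below reads the environment variable and falls back to the documented default. The `/api/search` endpoint path is an assumption about Perplexica's HTTP API; verify it against your Perplexica version.

```python
import os

# Resolve the Perplexica base URL; fall back to the documented default.
base_url = os.environ.get("PERPLEXICA_BASE_URL", "http://localhost:3000")

# "/api/search" is an assumed endpoint path (not confirmed by this README);
# adjust it if your Perplexica installation differs.
search_endpoint = f"{base_url.rstrip('/')}/api/search"
print(search_endpoint)
```

With the variable unset, this yields `http://localhost:3000/api/search`.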
## Usage
### With Claude Desktop
Add this to your Claude Desktop configuration file (`claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "perplexica": {
      "command": "python",
      "args": ["-m", "perplexica_mcp"],
      "env": {
        "PERPLEXICA_BASE_URL": "http://localhost:3000"
      }
    }
  }
}
```
## Available Tools
- `perplexica_search`: Perform AI-powered searches with various focus modes
- `perplexica_get_models`: Get available chat and embedding models
## Focus Modes
- `webSearch`: General web search
- `academicSearch`: Academic and research-focused search
- `writingAssistant`: Writing and content creation assistance
- `wolframAlphaSearch`: Mathematical and computational queries
- `youtubeSearch`: YouTube video search
- `redditSearch`: Reddit discussion search
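The focus mode is passed as part of the search request. A minimal sketch of building and validating such a request body, assuming field names `focusMode`, `query`, and `history` (check them against your Perplexica version's API):

```python
import json

# Focus modes listed in this README.
FOCUS_MODES = {
    "webSearch", "academicSearch", "writingAssistant",
    "wolframAlphaSearch", "youtubeSearch", "redditSearch",
}

def build_search_payload(query, focus_mode="webSearch", history=None):
    """Build a search request body, rejecting unknown focus modes early."""
    if focus_mode not in FOCUS_MODES:
        raise ValueError(f"unknown focus mode: {focus_mode!r}")
    # Field names here are assumptions about the Perplexica search API.
    return {"focusMode": focus_mode, "query": query, "history": history or []}

print(json.dumps(build_search_payload("largest moon of Saturn", "academicSearch")))
```

Passing `history` with prior turns is how conversation context is maintained across searches.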
## Requirements
- Python 3.8+
- A running Perplexica instance (see Perplexica GitHub)
## Related Servers

- **Ollama Deep Researcher**: Conducts deep research using local Ollama LLMs, leveraging Tavily and Perplexity for comprehensive search capabilities.
- **Perplexity MCP Server**: Performs real-time internet research with source citations using the Perplexity API.
- **Wikipedia**: Retrieves information from Wikipedia to provide context to Large Language Models (LLMs).
- **Geocoding**: Provides geocoding services by integrating with the Nominatim API.
- **Bucketeer Docs Local MCP Server**: A local server for querying Bucketeer documentation, which automatically fetches and caches content from its GitHub repository.
- **firefox-devtools-mcp**: A Model Context Protocol server for Firefox DevTools that enables AI assistants to inspect and control the Firefox browser through the Remote Debugging Protocol.
- **Web3 Research MCP**: A free, local tool for in-depth crypto research.
- **Zenn Articles**: A server for searching articles on the Zenn blogging platform.
- **Wikipedia MCP Server**: A server that enables LLMs to query and retrieve information from Wikipedia.
- **signalfuse-mcp**: Crypto trading signals, sentiment, macro-regime analysis, web search, and code execution via x402 micropayments on Base.