# InspireHEP MCP Server

An MCP server that integrates InspireHEP high-energy physics literature with LLMs. Search papers, explore citations, retrieve author metrics, and generate formatted references.
## Installation

```bash
# Using pip
pip install inspirehep-mcp

# Or run directly with uvx (no install needed)
uvx inspirehep-mcp
```

### Install from source

```bash
git clone https://github.com/MohamedElashri/inspirehep-mcp.git
cd inspirehep-mcp
uv sync
uv run inspirehep-mcp
```
## Integration

### Claude Desktop / Cursor / Windsurf

Add to your MCP client config:

```json
{
  "mcpServers": {
    "inspirehep": {
      "command": "uvx",
      "args": ["inspirehep-mcp"]
    }
  }
}
```
### Claude Code

**Option A: Using the CLI**

```bash
# Global scope (available across all projects)
claude mcp add --scope user inspirehep -- uvx inspirehep-mcp

# Project scope (shared via .mcp.json, checked into source control)
claude mcp add --scope project inspirehep -- uvx inspirehep-mcp
```
**Option B: Manual configuration**

For global scope, add to `~/.claude.json`:

```json
{
  "mcpServers": {
    "inspirehep": {
      "command": "uvx",
      "args": ["inspirehep-mcp"]
    }
  }
}
```
For project scope, create `.mcp.json` in your project root:

```json
{
  "mcpServers": {
    "inspirehep": {
      "command": "uvx",
      "args": ["inspirehep-mcp"]
    }
  }
}
```
### Gemini CLI

**Option A: Using the CLI**

```bash
# Project scope (default)
gemini mcp add inspirehep uvx inspirehep-mcp

# User/global scope
gemini mcp add -s user inspirehep uvx inspirehep-mcp
```

**Option B: Manual configuration**

Add to `~/.gemini/settings.json` (user scope) or `.gemini/settings.json` (project scope):
```json
{
  "mcpServers": {
    "inspirehep": {
      "command": "uvx",
      "args": ["inspirehep-mcp"]
    }
  }
}
```
## Tools

| Tool | Description |
|---|---|
| `search_papers` | Search papers by topic, author, collaboration, or free text |
| `get_paper_details` | Get full metadata for a paper by Inspire ID, arXiv ID, or DOI |
| `get_author_papers` | Retrieve an author's publications and citation metrics |
| `get_citations` | Explore the citation graph: who cites a paper, or what it cites |
| `search_by_collaboration` | Find publications from ATLAS, CMS, LHCb, etc. |
| `get_paper_figures` | Retrieve figures and download URLs for a paper |
| `get_references` | Generate BibTeX, LaTeX, or JSON reference lists |
| `get_bibtex` | Retrieve a BibTeX citation entry by DOI, arXiv ID, or Inspire ID |
| `server_stats` | Monitor cache hit rates and API performance |
## Configuration

All settings are configured via environment variables (prefix `INSPIREHEP_`):

| Variable | Default | Description |
|---|---|---|
| `INSPIREHEP_REQUESTS_PER_SECOND` | `1.5` | API rate limit (requests per second) |
| `INSPIREHEP_CACHE_TTL` | `86400` | Cache TTL in seconds (24 h) |
| `INSPIREHEP_CACHE_MAX_SIZE` | `512` | Maximum number of cached entries |
| `INSPIREHEP_CACHE_PERSISTENT` | `false` | Enable the SQLite persistent cache |
| `INSPIREHEP_CACHE_DB_PATH` | `inspirehep_cache.db` | SQLite cache file path |
| `INSPIREHEP_API_TIMEOUT` | `30` | HTTP request timeout in seconds |
| `INSPIREHEP_LOG_LEVEL` | `INFO` | Logging level |
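As a configuration sketch, enabling the persistent cache when running the server directly (the values and cache path here are illustrative, not recommendations):

```bash
# Illustrative values; adjust to your environment
export INSPIREHEP_CACHE_PERSISTENT=true
export INSPIREHEP_CACHE_DB_PATH="$HOME/.cache/inspirehep_cache.db"
export INSPIREHEP_REQUESTS_PER_SECOND=2.0
uvx inspirehep-mcp
```

When the server is launched by an MCP client instead, set these variables in the client's config (most MCP clients support an `env` map alongside `command` and `args`).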
## Development

```bash
# Run tests
uv run pytest tests/ -v

# Run with coverage
uv run pytest tests/ --cov=inspirehep_mcp --cov-report=term-missing

# Unit tests only (no network)
uv run pytest tests/test_utils.py tests/test_cache.py tests/test_errors.py tests/test_config.py
```
## License

This project is licensed under the AGPL-3.0 License. See the LICENSE file for details.