Leeroopedia
The Brain that turns Generalist Agents into ML Experts.
Documentation Index
Fetch the complete documentation index at: https://docs.leeroopedia.com/llms.txt Use this file to discover all available pages before exploring further.
Leeroopedia MCP
Give your AI coding agent access to curated ML/AI knowledge
$20 free credit on sign-up. That's plenty of searches, plans, and diagnoses. Skip the guesswork on your next fine-tuning run or inference deployment. No credit card required. Get your API key →
What is Leeroopedia?
Your ML & AI Knowledge Wiki. Learnt by AI, built by AI, for AI.
Expert-level knowledge across the full ML & AI stack: fine-tuning and distributed training, inference serving and GPU kernel optimization, building agents and RAG pipelines. 1000+ frameworks and libraries, all in one place.
This MCP server turns your AI coding agent (Claude Code, Cursor, Claude Desktop, ChatGPT, OpenAI Codex, ...) into an ML/AI expert engineer.
Browse the full knowledge base at leeroopedia.com.
Want to go end-to-end?
Leeroopedia gives your agent the knowledge. Kapso gives it the ability to act on it: research, experiment, and deploy. Together: a complete ML/AI engineer agent.
Connect to Your Agents
Use our hosted server for zero setup. Just paste this URL into any MCP client that supports remote servers:
https://mcp.leeroopedia.com/mcp?token=kpsk_your_key_here
Or see the per-client guides below for detailed instructions (including local setup).
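The hosted URL above is simply the MCP endpoint with your API key passed as a `token` query parameter. A minimal Python sketch of building it (the endpoint and parameter name come from the URL above; `kpsk_...` keys are issued at sign-up):

```python
from urllib.parse import urlencode

MCP_ENDPOINT = "https://mcp.leeroopedia.com/mcp"

def hosted_server_url(api_key: str) -> str:
    """Build the hosted Leeroopedia MCP URL for a given API key."""
    # The key is passed as the `token` query parameter, as shown above.
    return f"{MCP_ENDPOINT}?{urlencode({'token': api_key})}"

print(hosted_server_url("kpsk_your_key_here"))
# → https://mcp.leeroopedia.com/mcp?token=kpsk_your_key_here
```

Most clients only need this one string; no local process or extra configuration is required.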
Set up with Claude Code · Set up with Cursor · Set up with Claude Desktop · Set up with OpenAI Codex · Set up with ChatGPT
Benchmarks
We measured the effect of Leeroopedia MCP on real ML tasks:
- ML Inference Optimization. Writing CUDA/Triton kernels for 10 KernelBench problems: 2.11x geomean speedup with Leeroopedia MCP vs 1.80x without (+17%).
- LLM Post-Training. End-to-end SFT + DPO + LoRA merge + vLLM serving + IFEval on 8×A100: 21.3 vs 18.5 IFEval strict-prompt accuracy, 34.6 vs 30.9 strict-instruction accuracy, and 272.7 vs 231.6 throughput.
- Self-Evolving RAG. A RAG service that automatically improves itself over multiple rounds: 45.16 vs 40.51 Precision@5, 40.32 vs 35.29 Recall@5, in 52 vs 62 minutes of wall time.
- Customer Support Agent. A multi-agent triage system classifying 200 tickets into 27 intents: 98 vs 83 on the benchmark, 11 s vs 61 s per query.
Available Tools
The server provides 8 agentic tools: search, plan, review, verify, diagnose, hypothesize, query hyperparameters, and retrieve pages.
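For a sense of what an agent does with these tools over the wire: MCP is JSON-RPC 2.0, and invoking a tool is a `tools/call` request. The sketch below builds such a payload for the `search` tool (the method name comes from the MCP specification; the `query` argument is an illustrative assumption, not Leeroopedia's documented schema):

```python
import json

def tool_call_request(tool_name: str, arguments: dict, req_id: int = 1) -> str:
    """Serialize an MCP `tools/call` request as JSON-RPC 2.0."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical invocation of the `search` tool from the list above.
print(tool_call_request("search", {"query": "QLoRA fine-tuning OOM on 8xA100"}))
```

In practice your MCP client library handles this framing for you; the sketch only shows the request shape the server receives.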
See all 8 tools with parameters and usage.
Quick Links
Connect in 2 minutes · See the results · All 8 tools explained
Related Servers
Alpha Vantage MCP Server
Sponsored. Access financial market data: realtime & historical stock, ETF, options, forex, crypto, commodities, fundamentals, technical indicators, and more.
MCP Server Creator
A meta-server for dynamically generating MCP server configurations and Python code.
AgentMode
An all-in-one MCP server for developers, connecting coding AI to databases, data warehouses, data pipelines, and cloud services.
Kafka Schema Registry
A comprehensive Model Context Protocol (MCP) server for Kafka Schema Registry.
FogBugz
A local MCP server for interacting with the FogBugz issue tracker through an LLM.
IdeaJarvis
IdeaJarvis is an idea workspace for product builders. Use AI to structure brainstorming into detailed PRDs, conduct comprehensive market research, build prototypes, and gather real community feedback—turning "what if" into "ready to launch."
Postman MCP Server
Interact with the Postman API via an MCP server. Requires a Postman API key.
Loggles
Loggles is a local-first log sink with an MCP interface that lets coding agents (Claude Code, Cursor) query application logs directly
MCP Gemini CLI
Integrate with Google Gemini through its command-line interface (CLI).
Memori MCP
With Memori's MCP server, your agent can retrieve relevant memories before answering and store durable facts after responding, keeping context across sessions without any SDK integration.
ZKshare
Stdio MCP server that exposes zkShare tools to AI clients: store encrypted context, proofs, semantic search, sharing, and sandbox calls via POST /api/v1/context with ZKSHARE_API_KEY.