mcpdoc

Access website documentation for AI search engines (llms.txt files) over MCP.

mcp-mcpdoc

Leverages llms.txt files to help agents (Claude, Windsurf, Cursor) fetch the most recent versions of web pages.

llms.txt files are added to websites much like robots.txt files, but they are intended for AI search engines rather than web crawlers.

An llms.txt file contains a breakdown of the site's URLs along with descriptions of each page.
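As a rough sketch, an llms.txt file is a Markdown-style list of links with descriptions. The entries below are illustrative, not the contents of any real file:

```
# Example Project

## Docs

- [Quickstart](https://example.com/docs/quickstart): Build your first agent
- [Concepts](https://example.com/docs/concepts): Core ideas and architecture
- [API Reference](https://example.com/docs/api): Full API documentation
```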

SETUP

  1. Assuming you already have uv installed.

  2. Create a directory and initialize the project:

uv init

  3. Create a venv:

uv venv

  4. Activate the venv:

source .venv/Scripts/activate

  5. Install the dependencies:

uv sync

  6. Get the full path to the uv installation:

which uv

CLAUDE Desktop integration

{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "/Users/XXX/AppData/Local/Microsoft/WinGet/Packages/astral-sh.uv_Microsoft.Winget.Source_8wekyb3d8bbwe/uvx",
      "args": [
        "--from", "/Users/XX/Desktop/git_personal/mcpdoc",
        "mcpdoc",
        "--urls", "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt LangChain:https://python.langchain.com/llms.txt",
        "--transport", "stdio",
        "--port", "8081",
        "--host", "localhost"
      ]
    }
  }
}

Note:

  1. The command field is the path you get from running "which uv" in the project folder.
  2. The path after --from is what you get from running "pwd" in the project folder. This points Claude to the project folder.

Basically, uvx activates the venv in the project folder and Python runs the mcpdoc command so that the server can start for the Claude Desktop host.
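Put differently, the config above amounts to Claude Desktop launching a command line like the following sketch (paths are placeholders here, standing in for the machine-specific paths in the config):

```
/path/to/uvx --from /path/to/mcpdoc mcpdoc \
  --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" \
  --transport stdio --port 8081 --host localhost
```

You can run a command like this yourself in a terminal to check that the server starts before wiring it into Claude Desktop.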

Related Servers