# Remote Files MCP
MCP server for monitoring remote file sources, detecting new files, and extracting content. Works with any storage backend via rclone (70+ providers) or custom commands.
## Features
- Multi-source monitoring — track multiple remote locations (Google Drive, S3, SFTP, local dirs, etc.)
- New file detection — state-based diffing detects new and changed files
- Content extraction — extracts text from DOCX, PDF, and plain text files
- Auto-cleanup — downloaded files are deleted after content extraction
- Hybrid transport — built-in rclone support + custom shell commands
## Prerequisites
- Node.js >= 18
- npm
- rclone (if using the rclone transport — the installer can install it for you)
## Installation
The interactive installer handles dependencies, build, configuration, and Claude Code registration:
```bash
cd remote-files
bash install.sh
```
The installer will:
- Verify Node.js and npm are available
- Ask you to choose a transport mode (rclone or custom)
- Install rclone if needed (via brew, apt, pacman, or the official install script)
- Walk you through rclone remote configuration if no remotes exist yet
- Install npm dependencies and build the project
- Create the config file at `~/.config/remote-files/config.json`
- Register the MCP server with Claude Code (if the CLI is available)
### Manual installation
If you prefer to set things up yourself:
```bash
cd remote-files
npm install
npm run build
claude mcp add --transport stdio remote-files -- node /absolute/path/to/remote-files/dist/index.js
```
Then create `~/.config/remote-files/config.json` manually (see Configuration below).
## Configuration

Create `~/.config/remote-files/config.json`:
```json
{
  "sources": {
    "my-drive": {
      "provider": "rclone",
      "remote": "gdrive:",
      "path": "Documents/Reports",
      "flags": ["--drive-shared-with-me"],
      "exclude": ["*.tmp"],
      "excludeFrom": "/path/to/exclude-patterns.txt"
    },
    "my-server": {
      "provider": "custom",
      "listCommand": "ssh server 'find /data -type f -printf \"%s %P\\n\"'",
      "downloadCommand": "scp server:/data/$FILE $DEST/"
    }
  },
  "settings": {
    "tempDir": "/tmp/remote-files",
    "stateDir": "~/.local/share/remote-files/state",
    "autoCleanup": true,
    "maxContentLength": 102400
  }
}
```
Override the config path with the `REMOTE_FILES_CONFIG` environment variable.
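For example, to point the server at an alternate config file before launching it (the path below is purely illustrative):

```bash
# Hypothetical alternate location; when set, this overrides the default
# ~/.config/remote-files/config.json lookup.
export REMOTE_FILES_CONFIG=/opt/remote-files/config.json

# Then start the server as registered with Claude Code, e.g.:
# node /absolute/path/to/remote-files/dist/index.js
```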
### rclone provider
Requires rclone installed and configured. Fields:
| Field | Required | Description |
|---|---|---|
| `remote` | yes | rclone remote name (e.g. `gdrive:`, `s3:`) |
| `path` | yes | Path within the remote |
| `flags` | no | Extra rclone flags |
| `exclude` | no | Exclude patterns |
| `excludeFrom` | no | Path to an exclude-patterns file |
### custom provider
For any backend not covered by rclone. Fields:
| Field | Required | Description |
|---|---|---|
| `listCommand` | yes | Shell command that outputs `<size> <path>` lines |
| `downloadCommand` | yes | Shell command with `$FILE` and `$DEST` variables |
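As a sketch, a `listCommand` for a local directory must emit one `<size> <path>` line per file, with the path relative to the source root. This example assumes GNU `find` (the `-printf` flag is not available on BSD/macOS `find`), and the demo directory is hypothetical:

```bash
# Set up a throwaway source directory with one known file
rm -rf /tmp/remote-files-demo
mkdir -p /tmp/remote-files-demo/reports
printf 'hello' > /tmp/remote-files-demo/reports/a.txt   # 5 bytes

# Emit one "<size> <path>" line per file; %P strips the root prefix
find /tmp/remote-files-demo -type f -printf '%s %P\n'
# → 5 reports/a.txt
```

Any command with this output contract works, which is what makes the `custom` provider usable for backends rclone does not cover.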
## MCP Tools
### check_sources
Check for new/changed files without downloading.
Parameters:
- `source?` — specific source name (omit for all)
- `include_pattern?` — glob filter (e.g. `"*.docx"`)
### init_source
Initialize a source baseline. All current files are marked as known.
Parameters:
- `source` — source name to initialize
### fetch_file
Download a file, extract text, and auto-delete the local copy.
Parameters:
- `source` — source name
- `path` — file path (from `check_sources`)
- `keep_local` — if true, keep the file on disk (default: `false`)
Returns extracted text for DOCX/PDF/TXT files. For unknown formats, it returns `local_path` so Claude can read the file directly.
## Workflow
- First time: Claude calls `init_source` to create a baseline
- Ongoing: Claude calls `check_sources` to find new files
- Per file: Claude calls `fetch_file` to get content and summarize
- Files are auto-deleted after extraction
## Uninstall
```bash
claude mcp remove remote-files
rm -rf ~/.config/remote-files
rm -rf ~/.local/share/remote-files
```