Remote Files MCP
MCP server for monitoring remote file sources, detecting new files, and extracting content. Works with any storage backend via rclone (70+ providers) or custom commands.
Features
- Multi-source monitoring — track multiple remote locations (Google Drive, S3, SFTP, local dirs, etc.)
- New file detection — state-based diffing detects new and changed files
- Content extraction — extracts text from DOCX, PDF, and plain text files
- Auto-cleanup — downloaded files are deleted after content extraction
- Hybrid transport — built-in rclone support + custom shell commands
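The "state-based diffing" above can be sketched as follows. This is an illustrative model only (the type and function names are not the server's actual internals): assume the server stores a path → size map per source and compares it against a fresh listing.

```typescript
// Minimal sketch of state-based new/changed file detection.
type Listing = Record<string, number>; // path -> size in bytes

function diffListings(previous: Listing, current: Listing) {
  const added: string[] = [];
  const changed: string[] = [];
  for (const [filePath, size] of Object.entries(current)) {
    if (!(filePath in previous)) added.push(filePath); // not seen before
    else if (previous[filePath] !== size) changed.push(filePath); // size drift
  }
  return { added, changed };
}

const before: Listing = { "Reports/q1.docx": 1024 };
const after: Listing = { "Reports/q1.docx": 2048, "Reports/q2.docx": 512 };
console.log(diffListings(before, after));
// added: ["Reports/q2.docx"], changed: ["Reports/q1.docx"]
```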
Prerequisites
- Node.js >= 18
- npm
- rclone (if using the rclone transport — the installer can install it for you)
Installation
The interactive installer handles dependencies, build, configuration, and Claude Code registration:
```bash
cd remote-files
bash install.sh
```
The installer will:
- Verify Node.js and npm are available
- Ask you to choose a transport mode (rclone or custom)
- Install rclone if needed (via brew, apt, pacman, or the official install script)
- Walk you through rclone remote configuration if no remotes exist yet
- Install npm dependencies and build the project
- Create the config file at ~/.config/remote-files/config.json
- Register the MCP server with Claude Code (if the CLI is available)
Manual installation
If you prefer to set things up yourself:
```bash
cd remote-files
npm install
npm run build
claude mcp add --transport stdio remote-files -- node /absolute/path/to/remote-files/dist/index.js
```
Then create ~/.config/remote-files/config.json manually (see Configuration below).
Configuration
Create ~/.config/remote-files/config.json:
```json
{
  "sources": {
    "my-drive": {
      "provider": "rclone",
      "remote": "gdrive:",
      "path": "Documents/Reports",
      "flags": ["--drive-shared-with-me"],
      "exclude": ["*.tmp"],
      "excludeFrom": "/path/to/exclude-patterns.txt"
    },
    "my-server": {
      "provider": "custom",
      "listCommand": "ssh server 'find /data -type f -printf \"%s %P\\n\"'",
      "downloadCommand": "scp server:/data/$FILE $DEST/"
    }
  },
  "settings": {
    "tempDir": "/tmp/remote-files",
    "stateDir": "~/.local/share/remote-files/state",
    "autoCleanup": true,
    "maxContentLength": 102400
  }
}
```
The config path can be overridden with the `REMOTE_FILES_CONFIG` environment variable.
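A sketch of how that override could be resolved (the function name and fallback logic are illustrative assumptions, not the server's actual code):

```typescript
import * as os from "os";
import * as path from "path";

// REMOTE_FILES_CONFIG wins; otherwise fall back to the default location.
function resolveConfigPath(env: Record<string, string | undefined> = process.env): string {
  const override = env.REMOTE_FILES_CONFIG;
  if (override) return override;
  return path.join(os.homedir(), ".config", "remote-files", "config.json");
}

console.log(resolveConfigPath({ REMOTE_FILES_CONFIG: "/etc/rf.json" })); // /etc/rf.json
```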
rclone provider
Requires rclone installed and configured. Fields:
| Field | Required | Description |
|---|---|---|
| `remote` | yes | rclone remote name (e.g. `gdrive:`, `s3:`) |
| `path` | yes | Path within the remote |
| `flags` | no | Extra rclone flags |
| `exclude` | no | Exclude patterns |
| `excludeFrom` | no | Path to an exclude-patterns file |
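One plausible way these fields map onto an rclone invocation is building an argument list for `rclone lsjson` (the exact subcommand and flag order the server uses are assumptions; `--exclude` and `--exclude-from` are real rclone filter flags):

```typescript
// Sketch: turn an rclone source config into listing arguments.
interface RcloneSource {
  remote: string;
  path: string;
  flags?: string[];
  exclude?: string[];
  excludeFrom?: string;
}

function rcloneListArgs(src: RcloneSource): string[] {
  const args = ["lsjson", "--recursive", src.remote + src.path];
  for (const pattern of src.exclude ?? []) args.push("--exclude", pattern);
  if (src.excludeFrom) args.push("--exclude-from", src.excludeFrom);
  args.push(...(src.flags ?? [])); // provider-specific flags pass through
  return args;
}

console.log(rcloneListArgs({ remote: "gdrive:", path: "Documents/Reports", exclude: ["*.tmp"] }));
```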
custom provider
For any backend not covered by rclone. Fields:
| Field | Required | Description |
|---|---|---|
| `listCommand` | yes | Shell command that outputs `<size> <path>` lines |
| `downloadCommand` | yes | Shell command with `$FILE` and `$DEST` variables |
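A `listCommand` therefore just has to print one `<size> <path>` line per file, as the `find ... -printf "%s %P\n"` example in the config above does. A sketch of parsing that output into a path → size map (names are illustrative, not the server's internals):

```typescript
// Parse "<size> <path>" lines; paths may contain spaces, so split only
// on the first space after the numeric size.
function parseListing(output: string): Record<string, number> {
  const files: Record<string, number> = {};
  for (const line of output.split("\n")) {
    const match = line.match(/^(\d+) (.+)$/);
    if (match) files[match[2]] = Number(match[1]);
  }
  return files;
}

const sample = "1024 data/report.pdf\n512 data/my notes.txt\n";
console.log(parseListing(sample));
// { "data/report.pdf": 1024, "data/my notes.txt": 512 }
```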
MCP Tools
check_sources
Check for new/changed files without downloading.
Parameters:
- `source?` — specific source name (omit for all)
- `include_pattern?` — glob filter (e.g. `"*.docx"`)
init_source
Initialize a source baseline. All current files are marked as known.
Parameters:
- `source` — source name to initialize
fetch_file
Download a file, extract text, and auto-delete the local copy.
Parameters:
- `source` — source name
- `path` — file path (from check_sources)
- `keep_local` — if true, keep the file on disk (default: false)
Returns extracted text for DOCX/PDF/TXT. For unknown formats, returns `local_path` for Claude to read directly.
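The download/extract/cleanup ordering can be sketched with a `try`/`finally` so the temp copy never outlives extraction. This is illustrative only: `fetchAndExtract` and the `download` callback are hypothetical stand-ins, and `readFile` stands in for the real DOCX/PDF extraction step.

```typescript
import { promises as fs } from "fs";
import * as os from "os";
import * as path from "path";

async function fetchAndExtract(
  download: (destDir: string) => Promise<string>, // returns the local file path
  keepLocal = false
): Promise<string> {
  const destDir = await fs.mkdtemp(path.join(os.tmpdir(), "remote-files-"));
  const localPath = await download(destDir);
  try {
    return await fs.readFile(localPath, "utf8"); // stand-in for text extraction
  } finally {
    // Auto-cleanup: delete the temp copy unless keep_local was requested.
    if (!keepLocal) await fs.rm(destDir, { recursive: true, force: true });
  }
}
```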
Workflow
- First time: Claude calls `init_source` to create a baseline
- Ongoing: Claude calls `check_sources` to find new files
- Per file: Claude calls `fetch_file` to get content and summarize
- Files are auto-deleted after extraction
Uninstall
```bash
claude mcp remove remote-files
rm -rf ~/.config/remote-files
rm -rf ~/.local/share/remote-files
```