Remote Files MCP
MCP server for monitoring remote file sources, detecting new files, and extracting content. Works with any storage backend via rclone (70+ providers) or custom commands.
Features
- Multi-source monitoring — track multiple remote locations (Google Drive, S3, SFTP, local dirs, etc.)
- New file detection — state-based diffing detects new and changed files
- Content extraction — extracts text from DOCX, PDF, and plain text files
- Auto-cleanup — downloaded files are deleted after content extraction
- Hybrid transport — built-in rclone support + custom shell commands
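The state-based diff can be pictured with plain shell tools — a hypothetical sketch, assuming the saved state is a sorted "<size> <path>" listing (file names here are invented for illustration):

```shell
# Saved baseline listing vs. a fresh listing of the same source
printf '100 a.docx\n200 b.pdf\n' > /tmp/rf-baseline.txt
printf '100 a.docx\n250 b.pdf\n300 c.txt\n' > /tmp/rf-current.txt

# Lines unique to the current listing are new or changed files:
# b.pdf changed size, c.txt is new, a.docx is unchanged
comm -13 /tmp/rf-baseline.txt /tmp/rf-current.txt
```

Because size is part of each line, a changed file shows up as a "new" line even when its path is unchanged.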
Prerequisites
- Node.js >= 18
- npm
- rclone (if using the rclone transport — the installer can install it for you)
Installation
The interactive installer handles dependencies, build, configuration, and Claude Code registration:
cd remote-files
bash install.sh
The installer will:
- Verify Node.js and npm are available
- Ask you to choose a transport mode (rclone or custom)
- Install rclone if needed (via brew, apt, pacman, or the official install script)
- Walk you through rclone remote configuration if no remotes exist yet
- Install npm dependencies and build the project
- Create the config file at ~/.config/remote-files/config.json
- Register the MCP server with Claude Code (if the CLI is available)
Manual installation
If you prefer to set things up yourself:
cd remote-files
npm install
npm run build
claude mcp add --transport stdio remote-files -- node /absolute/path/to/remote-files/dist/index.js
Then create ~/.config/remote-files/config.json manually (see Configuration below).
Configuration
Create ~/.config/remote-files/config.json:
{
  "sources": {
    "my-drive": {
      "provider": "rclone",
      "remote": "gdrive:",
      "path": "Documents/Reports",
      "flags": ["--drive-shared-with-me"],
      "exclude": ["*.tmp"],
      "excludeFrom": "/path/to/exclude-patterns.txt"
    },
    "my-server": {
      "provider": "custom",
      "listCommand": "ssh server 'find /data -type f -printf \"%s %P\\n\"'",
      "downloadCommand": "scp server:/data/$FILE $DEST/"
    }
  },
  "settings": {
    "tempDir": "/tmp/remote-files",
    "stateDir": "~/.local/share/remote-files/state",
    "autoCleanup": true,
    "maxContentLength": 102400
  }
}
Override the config path with the REMOTE_FILES_CONFIG environment variable.
rclone provider
Requires rclone to be installed and configured. Fields:
| Field | Required | Description |
|---|---|---|
| remote | yes | rclone remote name (e.g. gdrive:, s3:) |
| path | yes | Path within the remote |
| flags | no | Extra rclone flags |
| exclude | no | Exclude patterns |
| excludeFrom | no | Path to exclude file |
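Taken together, the fields above might translate into an rclone invocation roughly like this — a sketch only, since the exact subcommand the server runs is not documented here, and the remote and path values come from the config example above:

```shell
rclone lsl gdrive:Documents/Reports \
  --drive-shared-with-me \
  --exclude "*.tmp" \
  --exclude-from /path/to/exclude-patterns.txt
```

rclone lsl prints size, modification time, and path for each file, which is the kind of listing a size-based diff needs.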
custom provider
For any backend not covered by rclone. Fields:
| Field | Required | Description |
|---|---|---|
| listCommand | yes | Shell command that outputs <size> <path> lines |
| downloadCommand | yes | Shell command with $FILE and $DEST variables |
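The two commands can be exercised locally before wiring them to a real backend — a hypothetical stand-in with invented paths, using GNU find (the -printf flag is a GNU extension):

```shell
mkdir -p /tmp/rf-src /tmp/rf-dst
printf 'hello' > /tmp/rf-src/report.txt

# listCommand equivalent: one "<size> <path>" line per file,
# with paths relative to the source root (%P strips the prefix)
(cd /tmp/rf-src && find . -type f -printf '%s %P\n')

# downloadCommand equivalent: the server substitutes $FILE and $DEST
FILE=report.txt DEST=/tmp/rf-dst
cp "/tmp/rf-src/$FILE" "$DEST/"
```

This mirrors the config example's ssh/scp pair: listCommand reports what exists, downloadCommand fetches one file into the temp directory.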
MCP Tools
check_sources
Check for new/changed files without downloading.
Parameters:
- source? — specific source name (omit for all)
- include_pattern? — glob filter (e.g. "*.docx")
init_source
Initialize a source baseline. All current files are marked as known.
Parameters:
- source — source name to initialize
fetch_file
Download a file, extract text, and auto-delete the local copy.
Parameters:
- source — source name
- path — file path (from check_sources)
- keep_local — if true, keep the file on disk (default: false)
Returns extracted text for DOCX/PDF/TXT. For unknown formats, returns local_path for Claude to read directly.
Workflow
- First time: Claude calls init_source to create a baseline
- Ongoing: Claude calls check_sources to find new files
- Per file: Claude calls fetch_file to get content and summarize
- Files are auto-deleted after extraction
Uninstall
claude mcp remove remote-files
rm -rf ~/.config/remote-files
rm -rf ~/.local/share/remote-files