HDFS MCP Server
Access and manage files on HDFS clusters using the MCP protocol, supporting operations like upload, download, move, and copy.
HDFS MCP Server is a controller based on MCP (Model Context Protocol) that provides access to HDFS clusters through the MCP protocol. The server supports basic HDFS operations such as file upload, download, move, and copy, and provides friendly error handling and connection testing capabilities.
Requirements
- Python 3.11 or higher
- Hadoop client installed and configured
- `uv` package manager
Installation
- Clone the repository:

  ```bash
  git clone https://github.com/will-sh/hdfs-mcp.git
  cd hdfs-mcp
  ```

- Ensure Python 3.11 is active. The project specifies Python 3.11 in the `.python-version` file. If you use `pyenv`, it will automatically use this version when you enter the directory. If you don't have Python 3.11 installed, you can install it using:

  ```bash
  # Example using pyenv
  pyenv install 3.11
  ```

- Create and activate a virtual environment using `uv`:

  ```bash
  uv venv
  source .venv/bin/activate    # macOS/Linux
  # .\.venv\Scripts\activate   # Windows
  ```

- Install dependencies using `uv`:

  ```bash
  uv pip sync
  ```
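Optionally, you can smoke-test the installed environment by launching the server directly. This is a rough check, not a documented workflow: it assumes `hdfs.py` is the entry point (as in the MCP configuration below) and that the server communicates over stdio, so it will simply start and wait for an MCP client until you stop it. The hostname and port are placeholders.

```bash
# Run from the project root with the virtual environment activated.
# HDFS_NAMENODE and NAMENODE_PORT are the same variables used in the MCP configuration below.
HDFS_NAMENODE=namenode.example.com NAMENODE_PORT=8020 uv run hdfs.py
# Press Ctrl+C to stop the server.
```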
MCP Configuration
```json
{
  "mcpServers": {
    "hdfs-controller": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/your/hdfs-mcp",
        "run",
        "hdfs.py"
      ],
      "env": {
        "HDFS_NAMENODE": "your_namenode_hostname",
        "NAMENODE_PORT": "your_namenode_port"
      }
    }
  }
}
```
Replace the following with your actual configuration:
- `/path/to/your/hdfs-mcp`: your project's actual path
- `your_namenode_hostname`: your HDFS NameNode hostname
- `your_namenode_port`: your HDFS NameNode port (defaults to 8020 if not specified)
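If you are unsure which hostname and port to use, the local Hadoop client configuration usually already knows them. The commands below are a quick lookup; the hostname is a placeholder and the `nc` check assumes netcat is installed:

```bash
# Print the default filesystem URI, e.g. hdfs://namenode.example.com:8020
hdfs getconf -confKey fs.defaultFS

# Print the configured NameNode hostname(s)
hdfs getconf -namenodes

# Optionally confirm the NameNode RPC port is reachable (requires netcat)
nc -z -v namenode.example.com 8020
```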
Features
The HDFS MCP server provides the following HDFS operations:
- List directory contents
- Read file contents
- Create directories
- Delete files/directories
- Upload files to HDFS
- Download files from HDFS
- Get file/directory information
- Get disk usage
- Get cluster status
- Copy/move files within HDFS
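Because the server relies on the local Hadoop client, the operations above map roughly onto the standard `hdfs` CLI commands shown below. This mapping is illustrative only and the paths are placeholders; it is not necessarily the exact set of invocations the server issues.

```bash
hdfs dfs -ls /data                         # list directory contents
hdfs dfs -cat /data/file.txt               # read file contents
hdfs dfs -mkdir -p /data/new_dir           # create directories
hdfs dfs -rm -r /data/old_dir              # delete files/directories
hdfs dfs -put local.txt /data/             # upload a file to HDFS
hdfs dfs -get /data/file.txt ./            # download a file from HDFS
hdfs dfs -stat "%n %b %y" /data/file.txt   # file/directory information
hdfs dfs -du -h /data                      # disk usage
hdfs dfsadmin -report                      # cluster status (may require extra privileges)
hdfs dfs -cp /data/a.txt /backup/          # copy within HDFS
hdfs dfs -mv /data/a.txt /archive/         # move within HDFS
```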
Usage
- Ensure the Hadoop client is properly installed and configured
- Ensure the `HADOOP_HOME` environment variable is set
- Ensure the `hdfs` command is in your system PATH (the quick check below verifies all three)
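A minimal sanity check for these prerequisites, assuming a Unix-like shell:

```bash
echo "$HADOOP_HOME"   # should print your Hadoop installation directory
command -v hdfs       # should print the path to the hdfs executable
hdfs version          # should print the Hadoop client version
```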
Troubleshooting
If you encounter connection issues, check the following (the commands after this list can help):
- HDFS NameNode accessibility
- Port configuration
- Network connectivity
- Hadoop client configuration
- Kerberos ticket validity (on secured clusters)
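The hostname and port below are placeholders, `nc` requires netcat, and `klist` only applies to Kerberos-secured clusters:

```bash
hdfs dfs -ls /                        # end-to-end check: client config, network, and NameNode
nc -z -v namenode.example.com 8020    # is the NameNode RPC port reachable?
klist                                 # do you hold a valid Kerberos ticket?
```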
Notes
- Ensure you have sufficient permissions to access the HDFS cluster
- Large file operations may take longer; please be patient
- It's recommended to test the connection before performing other operations