HDFS MCP Server
Access and manage files on HDFS clusters using the MCP protocol, supporting operations like upload, download, move, and copy.
HDFS MCP Server is a controller based on MCP (Model Context Protocol) that provides access to HDFS clusters through the MCP protocol. The server supports basic HDFS operations such as file upload, download, move, and copy, and provides friendly error handling and connection testing capabilities.
Requirements
- Python 3.11 or higher
- Hadoop client installed and configured
- uv package manager
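You can quickly confirm these prerequisites from a shell (a minimal check; command names assume a standard installation):
python3 --version   # should report 3.11 or higher
hadoop version      # confirms the Hadoop client is installed
uv --version        # confirms the uv package manager is available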
Installation
- Clone the repository:
  git clone https://github.com/will-sh/hdfs-mcp.git
  cd hdfs-mcp
- Ensure Python 3.11 is active: the project specifies Python 3.11 in the .python-version file. If you use pyenv, it will automatically use this version when you enter the directory. If you don't have Python 3.11 installed, you can install it using:
  # Example using pyenv
  pyenv install 3.11
- Create and activate a virtual environment using uv:
  uv venv
  source .venv/bin/activate   # macOS/Linux
  # .\.venv\Scripts\activate  # Windows
- Install dependencies using uv:
  uv pip sync
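Optionally, you can launch the server once from the project root to confirm it starts. The environment variables below are the same placeholders used in the MCP configuration; if the server uses the standard stdio transport (as the command-based configuration below suggests), it will simply wait for client input, so press Ctrl+C to exit:
HDFS_NAMENODE=your_namenode_hostname NAMENODE_PORT=your_namenode_port uv run hdfs.py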
MCP Configuration
{
  "mcpServers": {
    "hdfs-controller": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/your/hdfs-mcp",
        "run",
        "hdfs.py"
      ],
      "env": {
        "HDFS_NAMENODE": "your_namenode_hostname",
        "NAMENODE_PORT": "your_namenode_port"
      }
    }
  }
}
Replace the following with your actual configuration:
- /path/to/your/hdfs-mcp: Replace with your project's actual path
- your_namenode_hostname: Replace with your HDFS NameNode hostname
- your_namenode_port: Replace with your HDFS NameNode port (if not specified, the default is 8020)
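For reference, a filled-in configuration might look like this (the path, hostname, and port below are example values, not defaults to copy verbatim):
{
  "mcpServers": {
    "hdfs-controller": {
      "command": "uv",
      "args": ["--directory", "/home/alice/hdfs-mcp", "run", "hdfs.py"],
      "env": {
        "HDFS_NAMENODE": "namenode.example.com",
        "NAMENODE_PORT": "8020"
      }
    }
  }
}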
Features
The HDFS MCP provides the following HDFS operations:
- List directory contents
- Read file contents
- Create directories
- Delete files/directories
- Upload files to HDFS
- Download files from HDFS
- Get file/directory information
- Get disk usage
- Get cluster status
- Copy/move files within HDFS
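These operations rely on the Hadoop client (see Usage below). For orientation, the hdfs dfs commands here are the conventional CLI equivalents of each operation; they are illustrative, not necessarily the exact invocations the server issues:
hdfs dfs -ls /data                  # list directory contents
hdfs dfs -cat /data/report.txt      # read file contents
hdfs dfs -mkdir -p /data/new_dir    # create directories
hdfs dfs -rm -r /data/old_dir       # delete files/directories
hdfs dfs -put local.csv /data/      # upload a file to HDFS
hdfs dfs -get /data/report.txt ./   # download a file from HDFS
hdfs dfs -du -h /data               # disk usage
hdfs dfsadmin -report               # cluster status (requires sufficient privileges)
hdfs dfs -cp /data/a.txt /backup/   # copy within HDFS
hdfs dfs -mv /data/a.txt /archive/  # move within HDFS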
Usage
- Ensure the Hadoop client is properly installed and configured
- Ensure the HADOOP_HOME environment variable is set
- Ensure the hdfs command is in your system PATH
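A quick sanity check for these prerequisites might look like this (illustrative; exact output depends on your environment):
echo $HADOOP_HOME   # should print your Hadoop installation path
which hdfs          # should resolve to a binary on your PATH
hdfs dfs -ls /      # should list the HDFS root if the client is configured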
Troubleshooting
If you encounter connection issues, check:
- HDFS NameNode accessibility
- Port configuration
- Network connectivity
- Hadoop client configuration
- Kerberos ticket validity
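The commands below can help isolate each of these causes (illustrative; replace the hostname and port with your NameNode's values, and note that nc and klist may not be available on every system):
nc -zv namenode.example.com 8020     # checks network reachability of the NameNode port
hdfs getconf -confKey fs.defaultFS   # shows the NameNode URI the client is configured with
hdfs dfsadmin -report                # queries cluster status through the Hadoop client
klist                                # verifies that a Kerberos ticket is present and not expired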
Notes
- Ensure you have sufficient permissions to access the HDFS cluster
- Large file operations may take longer; please be patient
- It's recommended to test the connection before operations