Dataset Viewer
Interact with the Hugging Face Dataset Viewer API to browse, filter, and get statistics for datasets.
Dataset Viewer MCP Server
An MCP server for interacting with the Hugging Face Dataset Viewer API, providing capabilities to browse and analyze datasets hosted on the Hugging Face Hub.
Features
Resources
- Uses the `dataset://` URI scheme for accessing Hugging Face datasets
- Supports dataset configurations and splits
- Provides paginated access to dataset contents
- Handles authentication for private datasets
- Supports searching and filtering dataset contents
- Provides dataset statistics and analysis
Tools
The server provides the following tools:
- `validate` - Check if a dataset exists and is accessible
  - Parameters:
    - `dataset`: Dataset identifier (e.g. 'stanfordnlp/imdb')
    - `auth_token` (optional): For private datasets
- `get_info` - Get detailed information about a dataset
  - Parameters:
    - `dataset`: Dataset identifier
    - `auth_token` (optional): For private datasets
- `get_rows` - Get paginated contents of a dataset
  - Parameters:
    - `dataset`: Dataset identifier
    - `config`: Configuration name
    - `split`: Split name
    - `page` (optional): Page number (0-based)
    - `auth_token` (optional): For private datasets
- `get_first_rows` - Get the first rows from a dataset split
  - Parameters:
    - `dataset`: Dataset identifier
    - `config`: Configuration name
    - `split`: Split name
    - `auth_token` (optional): For private datasets
- `get_statistics` - Get statistics about a dataset split
  - Parameters:
    - `dataset`: Dataset identifier
    - `config`: Configuration name
    - `split`: Split name
    - `auth_token` (optional): For private datasets
- `search_dataset` - Search for text within a dataset
  - Parameters:
    - `dataset`: Dataset identifier
    - `config`: Configuration name
    - `split`: Split name
    - `query`: Text to search for
    - `auth_token` (optional): For private datasets
- `filter` - Filter rows using SQL-like conditions
  - Parameters:
    - `dataset`: Dataset identifier
    - `config`: Configuration name
    - `split`: Split name
    - `where`: SQL WHERE clause (e.g. "score > 0.5")
    - `orderby` (optional): SQL ORDER BY clause
    - `page` (optional): Page number (0-based)
    - `auth_token` (optional): For private datasets
- `get_parquet` - Download the entire dataset in Parquet format
  - Parameters:
    - `dataset`: Dataset identifier
    - `auth_token` (optional): For private datasets
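These tools correspond to the public Hugging Face Dataset Viewer REST API at `datasets-server.huggingface.co`. The following sketch is illustrative only (it is not part of this server) and shows the kind of requests that `get_rows` and `filter` map onto, using the documented `/rows` and `/filter` endpoints; the exact parameters this server forwards may differ.

```python
# Illustrative sketch of the public Dataset Viewer REST API that the
# MCP tools wrap. Requires the `requests` package.
import os
import requests

API = "https://datasets-server.huggingface.co"
token = os.environ.get("HUGGINGFACE_TOKEN")  # only needed for private datasets
headers = {"Authorization": f"Bearer {token}"} if token else {}

# Roughly what `get_rows` does: fetch one page of rows from a split.
rows = requests.get(
    f"{API}/rows",
    params={
        "dataset": "stanfordnlp/imdb",
        "config": "plain_text",
        "split": "train",
        "offset": 0,
        "length": 10,
    },
    headers=headers,
).json()
print(rows["rows"][0]["row"])

# Roughly what `filter` does: apply a SQL-like WHERE clause server-side.
filtered = requests.get(
    f"{API}/filter",
    params={
        "dataset": "stanfordnlp/imdb",
        "config": "plain_text",
        "split": "train",
        "where": '"label" = 1',
        "offset": 0,
        "length": 10,
    },
    headers=headers,
).json()
print(len(filtered["rows"]))
```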
Installation
Prerequisites
- Python 3.12 or higher
- uv - Fast Python package installer and resolver
Setup
- Clone the repository:

```bash
git clone https://github.com/privetin/dataset-viewer.git
cd dataset-viewer
```

- Create a virtual environment and install the project:

```bash
# Create virtual environment
uv venv

# Activate virtual environment
# On Unix:
source .venv/bin/activate
# On Windows:
.venv\Scripts\activate

# Install in development mode
uv pip install -e .
```
Configuration
Environment Variables
- `HUGGINGFACE_TOKEN`: Your Hugging Face API token, used for accessing private datasets
Claude Desktop Integration
Add the following to your Claude Desktop config file:
- On Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- On macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`

```json
{
  "mcpServers": {
    "dataset-viewer": {
      "command": "uv",
      "args": [
        "--directory",
        "parent_to_repo/dataset-viewer",
        "run",
        "dataset-viewer"
      ]
    }
  }
}
```
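You can also exercise the server without Claude Desktop. The sketch below is a minimal illustration using the MCP Python SDK (the `mcp` package), assuming the same `dataset-viewer` entry point and placeholder directory used in the config above; the tool name and arguments follow the tool list earlier on this page.

```python
# Illustrative MCP client sketch: launch the server over stdio and call
# the `validate` tool. Requires the `mcp` Python SDK.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uv",
    args=["--directory", "parent_to_repo/dataset-viewer", "run", "dataset-viewer"],
)

async def main():
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "validate", {"dataset": "stanfordnlp/imdb"}
            )
            print(result.content)

asyncio.run(main())
```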
License
MIT License - see LICENSE for details
Related Servers
ODBC MCP Server
Enables LLM tools to query databases using ODBC connections.
ClickHouse
Query your ClickHouse database server.
Knowledge Graph Memory Server
Enables persistent memory for Claude using a knowledge graph stored in local JSON files.
Redash
Execute queries and retrieve results using the Redash API.
Atlan
The official MCP server from Atlan, which enables you to bring the power of metadata to your AI tools.
RentCast
Access property data, valuations, and market statistics using the RentCast API.
Dremio
Integrate Large Language Models (LLMs) with Dremio, a data lakehouse platform.
MCP PGVector Server
Provides semantic search capabilities for PostgreSQL databases using the pgvector extension, with support for multiple embedding providers.
Tableau MCP Server
Interact with Tableau Server using natural language to query data and perform administrative tasks.
Teradata
A collection of tools for managing the Teradata platform, addressing data quality, and reading from and writing to the Teradata database.