Jupyter Earth MCP Server is a Model Context Protocol (MCP) server implementation that provides a set of tools for geospatial analysis in Jupyter notebooks.
The following demo uses the Earthdata MCP server to search for datasets and data granules on NASA Earthdata, this MCP server to download the data into Jupyter, and the Jupyter MCP Server to run further analysis.
Make sure you have the following packages installed. The collaboration package is needed so that modifications made to the notebook can be followed thanks to Jupyter Real Time Collaboration.
```bash
pip install jupyterlab==4.4.1 jupyter-collaboration==4.0.2 ipykernel
pip uninstall -y pycrdt datalayer_pycrdt
pip install datalayer_pycrdt==0.12.17
```
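If you want to double-check the environment, here is a minimal sketch (assuming you run it in the same Python environment you just installed into) that prints the installed versions of the pinned packages:

```python
# Sanity check: print the versions of the packages pinned above.
from importlib.metadata import version

for pkg in ("jupyterlab", "jupyter-collaboration", "datalayer_pycrdt"):
    print(pkg, version(pkg))
```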
Then, start JupyterLab with the following command.
```bash
jupyter lab --port 8888 --IdentityProvider.token MY_TOKEN --ip 0.0.0.0
```

You can also run `make jupyterlab`.
> [!NOTE]
>
> The `--ip` is set to `0.0.0.0` to allow the MCP server running in a Docker container to access your local JupyterLab.
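Before wiring up Claude Desktop, you can verify that JupyterLab is reachable with the token you chose. This is a minimal sketch assuming the port `8888` and token `MY_TOKEN` from the command above, and that the `requests` package is available:

```python
# Minimal connectivity check against the Jupyter Server REST API.
import requests

response = requests.get(
    "http://localhost:8888/api/status",
    headers={"Authorization": "token MY_TOKEN"},  # same token passed to `jupyter lab`
)
response.raise_for_status()
print(response.json())  # reports server status, e.g. connection and kernel counts
```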
Claude Desktop can be downloaded from this page for macOS and Windows.

For Linux, we had success using this UNOFFICIAL build script based on nix:
```bash
# ⚠️ UNOFFICIAL
# You can also run `make claude-linux`
NIXPKGS_ALLOW_UNFREE=1 nix run github:k3d3/claude-desktop-linux-flake \
  --impure \
  --extra-experimental-features flakes \
  --extra-experimental-features nix-command
```
To use this with Claude Desktop, add the following to your `claude_desktop_config.json` (read more on the MCP documentation website).
> [!IMPORTANT]
>
> Ensure the port of the `SERVER_URL` and the `TOKEN` match those used in the `jupyter lab` command. The `NOTEBOOK_PATH` should be relative to the directory where JupyterLab was started.
On macOS and Windows:

```json
{
  "mcpServers": {
    "jupyter-earth": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "SERVER_URL",
        "-e",
        "TOKEN",
        "-e",
        "NOTEBOOK_PATH",
        "datalayer/jupyter-earth-mcp-server:latest"
      ],
      "env": {
        "SERVER_URL": "http://host.docker.internal:8888",
        "TOKEN": "MY_TOKEN",
        "NOTEBOOK_PATH": "notebook.ipynb"
      }
    }
  }
}
```
On Linux, the container needs host networking to reach JupyterLab on `localhost`; you can write the configuration with the following commands:

```bash
CLAUDE_CONFIG=${HOME}/.config/Claude/claude_desktop_config.json
cat <<EOF > $CLAUDE_CONFIG
{
  "mcpServers": {
    "jupyter-earth": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "SERVER_URL",
        "-e",
        "TOKEN",
        "-e",
        "NOTEBOOK_PATH",
        "--network=host",
        "datalayer/jupyter-earth-mcp-server:latest"
      ],
      "env": {
        "SERVER_URL": "http://localhost:8888",
        "TOKEN": "MY_TOKEN",
        "NOTEBOOK_PATH": "notebook.ipynb"
      }
    }
  }
}
EOF
cat $CLAUDE_CONFIG
```
The server currently offers 1 tool:

`download_earth_data_granules` - Download Earth data granules from NASA Earthdata (see the sketch after the parameter list).

- `folder_name` (string): Local folder name to save the data.
- `short_name` (string): Short name of the Earth dataset to download.
- `count` (int): Number of data granules to download.
- `temporal` (tuple, optional): Temporal range in the format `(date_from, date_to)`.
- `bounding_box` (tuple, optional): Bounding box in the format `(lower_left_lon, lower_left_lat, upper_right_lon, upper_right_lat)`.
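To make the parameter semantics concrete, here is a rough sketch of the kind of code such a download maps onto, using the `earthaccess` library. This is an illustration under assumptions only: the dataset short name below is just an example, and the server's actual implementation may differ.

```python
# Hypothetical illustration of what download_earth_data_granules corresponds to.
import earthaccess

earthaccess.login()  # uses your NASA Earthdata Login credentials

granules = earthaccess.search_data(
    short_name="GLDAS_NOAH025_3H",          # example dataset short name (assumption)
    count=5,                                 # number of granules to download
    temporal=("2024-01-01", "2024-01-31"),   # optional (date_from, date_to)
    bounding_box=(-10.0, 35.0, 5.0, 45.0),   # optional (llon, llat, ulon, ulat)
)
earthaccess.download(granules, "./gldas_data")  # saved under folder_name
```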
You can build the Docker image from source.

```bash
make build-docker
```