cellrank-MCP
A natural language interface for single-cell RNA sequencing (scRNA-Seq) analysis using the CellRank toolkit, exposed through the Model Context Protocol (MCP).
🪩 What can it do?
- IO module, for reading and writing scRNA-Seq data
- Preprocessing module, like filtering, quality control, normalization, scaling, highly variable genes, PCA, neighbors, ...
- Tool module, like clustering, differential expression, etc.
- Plotting module, like violin, heatmap, and dot plots
❓ Who is this for?
- Anyone who wants to do scRNA-Seq analysis using natural language!
- Agent developers who want to call cellrank's functions for their applications
🌐 Where to use it?
You can use cellrank-mcp in most AI clients, plugins, or agent frameworks that support MCP:
- AI clients, like Cherry Studio
- Plugins, like Cline
- Agent frameworks, like Agno
📚 Documentation
scmcphub's complete documentation is available at https://docs.scmcphub.org
🎬 Demo
A demo showing scRNA-Seq cell cluster analysis in an AI client (Cherry Studio) using natural language, powered by cellrank-mcp.
🏎️ Quickstart
Install
Install from PyPI
pip install cellrank-mcp
You can test it by running:
cellrank-mcp run
Run cellrank-mcp locally
Refer to the following configuration in your MCP client:
Check the installation path:
$ which cellrank-mcp
/home/test/bin/cellrank-mcp
"mcpServers": {
"cellrank-mcp": {
"command": "/home/test/bin/cellrank-mcp",
"args": [
"run"
]
}
}
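If your client expects a complete JSON configuration file rather than a fragment, the block above sits inside a top-level object. A full file might look like this (the command path is the example output of `which` above and will differ on your machine):

```json
{
  "mcpServers": {
    "cellrank-mcp": {
      "command": "/home/test/bin/cellrank-mcp",
      "args": [
        "run"
      ]
    }
  }
}
```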
Run cellrank-mcp remotely
Refer to the following configuration in your MCP client:
Run it on your server:
cellrank-mcp run --transport shttp --port 8000
Then configure the MCP client in your local AI client, like this:
"mcpServers": {
"cellrank-mcp": {
"url": "http://localhost:8000/mcp"
}
}
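If the server runs on a different machine, replace `localhost` with that machine's hostname or IP address. As a complete JSON configuration file, the remote setup might look like:

```json
{
  "mcpServers": {
    "cellrank-mcp": {
      "url": "http://localhost:8000/mcp"
    }
  }
}
```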
🤝 Contributing
If you have any questions, feel free to submit an issue or contact me ([email protected]). Contributions to the code are also welcome!
Citing
If you use cellrank-mcp in your research, please consider citing the following work:
Weiler, P., Lange, M., Klein, M. et al. CellRank 2: unified fate mapping in multiview single-cell data. Nat Methods 21, 1196–1205 (2024). https://doi.org/10.1038/s41592-024-02303-9