Access YouTube video transcripts and translations using the YouTube Translate API.
A Model Context Protocol (MCP) server for accessing the YouTube Translate API, allowing you to obtain transcripts, translations, and summaries of YouTube videos.
To install youtube-translate-mcp for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @brianshin22/youtube-translate-mcp --client claude
This package requires Python 3.12 or higher. Install it with:
# Using uv (recommended)
uv pip install youtube-translate-mcp
# Using pip
pip install youtube-translate-mcp
Or install from source:
# Clone the repository
git clone https://github.com/yourusername/youtube-translate-mcp.git
cd youtube-translate-mcp
# Using uv (recommended)
uv pip install -e .
# Using pip
pip install -e .
To run the server:
# Using stdio transport (default)
YOUTUBE_TRANSLATE_API_KEY=your_api_key youtube-translate-mcp
# Using SSE transport
YOUTUBE_TRANSLATE_API_KEY=your_api_key youtube-translate-mcp --transport sse --port 8000
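When using the SSE transport, you can confirm the server is listening before wiring up a client. The /sse path below is an assumption based on the common MCP SSE convention; adjust it if this server exposes a different endpoint:
# Check that the SSE endpoint is streaming events (-N disables curl's output buffering)
curl -N http://localhost:8000/sse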
You can also run the server using Docker:
# Build the Docker image
docker build -t youtube-translate-mcp .
# Run with stdio transport
docker run -e YOUTUBE_TRANSLATE_API_KEY=your_api_key youtube-translate-mcp
# Run with SSE transport
docker run -p 8000:8000 -e YOUTUBE_TRANSLATE_API_KEY=your_api_key youtube-translate-mcp --transport sse
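If port 8000 is already taken on the host, the same flags shown above can be combined to remap it; this is just a sketch reusing the options from the earlier examples:
# Expose the SSE server on host port 9000 while the container listens on 8000
docker run -p 9000:8000 -e YOUTUBE_TRANSLATE_API_KEY=your_api_key youtube-translate-mcp --transport sse --port 8000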
YOUTUBE_TRANSLATE_API_KEY: Required. Your API key for accessing the YouTube Translate API.
This package includes a smithery.yaml file for easy deployment with Smithery. To deploy, set the YOUTUBE_TRANSLATE_API_KEY configuration parameter to your YouTube Translate API key.
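For example, if your version of the Smithery CLI supports supplying configuration at install time, the key might be passed like this (the --config flag and the exact JSON shape are assumptions; check the Smithery documentation for your CLI version):
npx -y @smithery/cli install @brianshin22/youtube-translate-mcp --client claude --config '{"YOUTUBE_TRANSLATE_API_KEY":"your_api_key"}'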
# Create and activate a virtual environment using uv (recommended)
uv venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install dependencies using uv
uv pip install -e .
# Alternatively, with standard tools
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
pip install -e .
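With the virtual environment activated, you can run the server directly from your working copy; this is a sketch that assumes the package exposes the youtube_translate_mcp module entry point used in the Claude Desktop configuration below:
# Run the development copy against your API key
YOUTUBE_TRANSLATE_API_KEY=your_api_key python -m youtube_translate_mcp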
To test with Claude Desktop (macOS/Windows only), you'll need to add your server to the Claude Desktop configuration file located at ~/Library/Application Support/Claude/claude_desktop_config.json on macOS or %APPDATA%\Claude\claude_desktop_config.json on Windows.
Use this method if you want to test your local development version:
{
"mcpServers": {
"youtube-translate": {
"command": "uv",
"args": [
"--directory",
"/ABSOLUTE/PATH/TO/youtube-translate-mcp",
"run",
"-m", "youtube_translate_mcp"
],
"env": {
"YOUTUBE_TRANSLATE_API_KEY": "YOUR_API_KEY"
}
}
}
}
Make sure to replace /ABSOLUTE/PATH/TO/youtube-translate-mcp with the actual path to your project directory.
If you prefer to test using Docker (recommended for more reproducible testing):
{
"mcpServers": {
"youtube-translate": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"YOUTUBE_TRANSLATE_API_KEY",
"youtube-translate-mcp"
],
"env": {
"YOUTUBE_TRANSLATE_API_KEY": "YOUR_API_KEY"
}
}
}
}
Replace YOUR_API_KEY with your actual YouTube Translate API key.
For more information on using MCP servers with Claude Desktop, see the MCP documentation.
MIT