MCP Image Extractor
MCP server for extracting images from local files, URLs, or base64 strings and converting them to base64 for LLM analysis.
This MCP server provides tools for AI assistants to:
- Extract images from local files
- Extract images from URLs
- Process base64-encoded images
How it looks in Cursor: (screenshot)
Suitable use cases:
- Analyzing Playwright test result screenshots
Installation
Recommended: Using npx in mcp.json (Easiest)
The recommended way to install this MCP server is using npx directly in your .cursor/mcp.json file:
{
"mcpServers": {
"image-extractor": {
"command": "npx",
"args": [
"-y",
"mcp-image-extractor"
]
}
}
}
This approach:
- Automatically installs the latest version
- Does not require global installation
- Works reliably across different environments
Alternative: Local Path Installation
If you prefer to use a local installation of the package, you can clone the repository and point to the built files:
{
"mcpServers": {
"image-extractor": {
"command": "node",
"args": ["/full/path/to/mcp-image-extractor/dist/index.js"],
"disabled": false
}
}
}
Manual Installation
# Clone and install
git clone https://github.com/ifmelate/mcp-image-extractor.git
cd mcp-image-extractor
npm install
npm run build
npm link
This will make the mcp-image-extractor command available globally.
Then configure in .cursor/mcp.json:
{
"mcpServers": {
"image-extractor": {
"command": "mcp-image-extractor",
"disabled": false
}
}
}
Troubleshooting for Cursor users: If you see a "Failed to create client" error, try the local path installation method above, or make sure you're using the correct path to the executable.
Available Tools
extract_image_from_file
Extracts an image from a local file and converts it to base64.
Parameters:
file_path (required): Path to the local image file
Note: All images are automatically resized to optimal dimensions (max 512x512) for LLM analysis to limit the size of the base64 output and optimize context window usage.
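As a sketch of what happens under the hood, an MCP client invokes this tool with a JSON-RPC tools/call request along these lines (the request id and file path are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "extract_image_from_file",
    "arguments": {
      "file_path": "images/photo.jpg"
    }
  }
}
```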
extract_image_from_url
Extracts an image from a URL and converts it to base64.
Parameters:
url (required): URL of the image to extract
Note: All images are automatically resized to optimal dimensions (max 512x512) for LLM analysis to limit the size of the base64 output and optimize context window usage.
extract_image_from_base64
Processes a base64-encoded image for LLM analysis.
Parameters:
base64 (required): Base64-encoded image data
mime_type (optional, default: "image/png"): MIME type of the image
Note: All images are automatically resized to optimal dimensions (max 512x512) for LLM analysis to limit the size of the base64 output and optimize context window usage.
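To call this tool, the client needs the image bytes already encoded as base64. A minimal sketch of preparing that payload in Python (the helper name and the stand-in image bytes are illustrative, not part of the server):

```python
import base64

def image_to_base64(data: bytes) -> str:
    """Encode raw image bytes as the string expected by the tool's base64 parameter."""
    return base64.b64encode(data).decode("ascii")

# Stand-in image data for the example: the 8-byte PNG file signature.
png_signature = b"\x89PNG\r\n\x1a\n"

# Arguments in the shape the extract_image_from_base64 tool expects.
arguments = {
    "base64": image_to_base64(png_signature),
    "mime_type": "image/png",
}
print(arguments["base64"])  # iVBORw0KGgo=
```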
Example Usage
Here's an example of how to use the tools from Claude:
Please extract the image from this local file: images/photo.jpg
Claude will automatically use the extract_image_from_file tool to load and analyze the image content.
Please extract the image from this URL: https://example.com/image.jpg
Claude will automatically use the extract_image_from_url tool to fetch and analyze the image content.
Docker
Build and run with Docker:
docker build -t mcp-image-extractor .
docker run -p 8000:8000 mcp-image-extractor
License
MIT