# GemForge (Gemini Tools)

Integrates Google's Gemini for advanced codebase analysis, web search, and processing of text, PDFs, and images.

## Overview

GemForge-Gemini-Tools-MCP: enterprise-grade Gemini integration for your favorite MCP agents. Supercharge Claude, Roo Code, and Windsurf with codebase analysis, live search, text/PDF/image processing, and more.
## Quick Navigation
- Features
- Quick Start
- Configuration
- Tools
- Heavy-Duty Reliability
- Deployment
- Examples
- Community
- Documentation
## Why GemForge?
GemForge is the essential bridge between Google's Gemini AI and the MCP ecosystem:
- Real-Time Web Access: Fetch breaking news, market trends, and current data with `gemini_search`
- Advanced Reasoning: Process complex logic problems with step-by-step thinking via `gemini_reason`
- Code Mastery: Analyze full repositories, generate solutions, and debug code with `gemini_code`
- Multi-File Processing: Handle 60+ file formats, including PDFs and images, with `gemini_fileops`
- Intelligent Model Selection: Automatically routes each task to the optimal Gemini model
- Enterprise-Ready: Robust error handling, rate-limit management, and API fallback mechanisms
## Quick Start

### One-Line Install

```shell
npx @gemforge/mcp-server@latest init
```

### Manual Setup
1. Create a configuration file (`claude_desktop_config.json`):

   ```json
   {
     "mcpServers": {
       "GemForge": {
         "command": "node",
         "args": ["./dist/index.js"],
         "env": {
           "GEMINI_API_KEY": "your_api_key_here"
         }
       }
     }
   }
   ```
2. Install and run:

   ```shell
   npm install gemforge-mcp
   npm start
   ```
## Heavy-Duty Reliability
GemForge is built for production environments:
- Support for 60+ File Types: Process everything from code to documents to images
- Automatic Model Fallbacks: Continues functioning even during rate limits or service disruptions
- Enterprise-Grade Error Logging: Detailed diagnostics for troubleshooting
- API Resilience: Exponential backoff, retry logic, and seamless model switching
- Full Repository Support: Analyze entire codebases with configurable inclusion/exclusion patterns
- XML Content Processing: Specialized handling for structured data
## Key Tools
| Tool | Description | Key Capability |
|---|---|---|
| `gemini_search` | Web-connected information retrieval | Real-time data access |
| `gemini_reason` | Complex problem solving with step-by-step logic | Transparent reasoning process |
| `gemini_code` | Deep code understanding and generation | Full repository analysis |
| `gemini_fileops` | Multi-file processing across 60+ formats | Document comparison and transformation |
### Example: Real-Time Search

```json
{
  "toolName": "gemini_search",
  "toolParams": {
    "query": "Latest advancements in quantum computing",
    "enable_thinking": true
  }
}
```
### Example: Code Analysis

```json
{
  "toolName": "gemini_code",
  "toolParams": {
    "question": "Identify improvements and new features",
    "directory_path": "path/to/project",
    "repomix_options": "--include \"**/*.js\" --no-gitignore"
  }
}
```
### Example: Multi-File Comparison

```json
{
  "toolName": "gemini_fileops",
  "toolParams": {
    "file_path": ["contract_v1.pdf", "contract_v2.pdf"],
    "operation": "analyze",
    "instruction": "Compare these contract versions and extract all significant changes."
  }
}
```
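`gemini_reason` is the one tool without an example above. A sketch of a call, by analogy with the other tools' request shape; the `problem` and `show_steps` parameter names are assumptions, not documented here:

```json
{
  "toolName": "gemini_reason",
  "toolParams": {
    "problem": "A train leaves at 9:00 at 80 km/h; a second leaves at 9:30 at 100 km/h on the same track. When does the second train catch up?",
    "show_steps": true
  }
}
```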
## Configuration

GemForge offers flexible configuration options.

### Environment Variables

```shell
GEMINI_API_KEY=your_api_key_here  # Required: Gemini API key
GEMINI_PAID_TIER=true             # Optional: set to true on the paid tier (better rate limits)
DEFAULT_MODEL_ID=gemini-2.5-pro   # Optional: override default model selection
LOG_LEVEL=info                    # Optional: logging verbosity (debug, info, warn, error)
```
### Claude Desktop Integration

```json
{
  "mcpServers": {
    "GemForge": {
      "command": "node",
      "args": ["./dist/index.js"],
      "env": {
        "GEMINI_API_KEY": "your_api_key_here"
      }
    }
  }
}
```
### Advanced Model Selection

GemForge intelligently selects the best model for each task:

- `gemini_search`: uses `gemini-2.5-flash` for speed and search integration
- `gemini_reason`: uses `gemini-2.5-pro` for deep reasoning capabilities
- `gemini_code`: uses `gemini-2.5-pro` for complex code understanding
- `gemini_fileops`: selects between `gemini-2.0-flash-lite` and `gemini-1.5-pro` based on file size

Override with the `model_id` parameter in any tool call, or set the `DEFAULT_MODEL_ID` environment variable.
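For instance, a harder search query can be pinned to the pro model; `query` and `model_id` are the parameters shown elsewhere in this README, and the model name assumes the list above:

```json
{
  "toolName": "gemini_search",
  "toolParams": {
    "query": "Compare recent EU and US regulatory approaches to AI",
    "model_id": "gemini-2.5-pro"
  }
}
```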
## Deployment

### Smithery.ai

One-click deployment via Smithery.ai.

### Docker

```shell
docker run -e GEMINI_API_KEY=your_api_key ghcr.io/pv-bhat/gemforge:latest
```
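The optional environment variables from the Configuration section can be passed the same way with additional `-e` flags; a sketch combining the documented variables:

```shell
docker run \
  -e GEMINI_API_KEY=your_api_key \
  -e GEMINI_PAID_TIER=true \
  -e LOG_LEVEL=debug \
  ghcr.io/pv-bhat/gemforge:latest
```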
### Self-Hosted

Use our MCP.so Directory listing for integration instructions.
## What Sets GemForge Apart?
- Cross-Ecosystem Power: Bridge Google's AI with Claude and other MCP agents
- Multi-File Analysis: Compare documents, images, or code versions
- Smart Routing: Automatic model selection based on task requirements
- Production-Ready: Built for enterprise environments

## Community & Support
- Join Us: MCP Discord | GemForge Discord
- Contribute: GitHub Discussions
- Feedback: Open an issue or share thoughts on Discord
## Documentation
Visit our Documentation Site for:
- Advanced usage tutorials
- API reference
- Troubleshooting tips
## License
Licensed under the MIT License. See LICENSE for details.
## Acknowledgments
- Google Gemini API for providing the underlying AI capabilities
- Model Context Protocol (MCP) for standardizing AI tool interfaces