Ask Gemini MCP
An MCP server for AI-to-AI collaboration via the Gemini CLI. Available on npm: ask-gemini-mcp. Works with Claude Code, Claude Desktop, Cursor, Warp, Copilot, and 40+ other MCP clients. Leverage Gemini's massive 1M+ token context window for large file and codebase analysis while your primary AI handles interaction and code editing.
Why?
- Get a second opinion — Ask Gemini to review your coding approach before committing to it
- Debate plans — Send architecture proposals to Gemini for critique and alternative suggestions
- Review changes — Have Gemini analyze diffs or modified files to catch issues your primary AI might miss
- Massive context — Gemini reads entire codebases (1M+ tokens) that would overflow other models
Quick Start
Claude Code
# Project scope (available in current project only)
claude mcp add gemini-cli -- npx -y ask-gemini-mcp
# User scope (available across all projects)
claude mcp add --scope user gemini-cli -- npx -y ask-gemini-mcp
Claude Desktop
Add to your config file (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):
{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}
Other config file locations:
- Windows: %APPDATA%\Claude\claude_desktop_config.json
- Linux: ~/.config/claude/claude_desktop_config.json
Cursor
Add to .cursor/mcp.json in your project (or ~/.cursor/mcp.json for global):
{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}
Codex CLI
Add to ~/.codex/config.toml (or .codex/config.toml in your project):
[mcp_servers.gemini-cli]
command = "npx"
args = ["-y", "ask-gemini-mcp"]
Or via CLI:
codex mcp add gemini-cli -- npx -y ask-gemini-mcp
OpenCode
Add to opencode.json in your project (or ~/.config/opencode/opencode.json for global):
{
  "mcp": {
    "gemini-cli": {
      "type": "local",
      "command": ["npx", "-y", "ask-gemini-mcp"]
    }
  }
}
Any MCP Client (STDIO Transport)
Most MCP clients accept a generic stdio server definition; adapt the key names to your client:
{
  "transport": {
    "type": "stdio",
    "command": "npx",
    "args": ["-y", "ask-gemini-mcp"]
  }
}
Prerequisites
- Node.js v20.0.0 or higher (LTS)
- Google Gemini CLI installed and authenticated
Tools
| Tool | Purpose |
|---|---|
| ask-gemini | Send prompts to Gemini CLI. Supports @ file syntax, model selection, sandbox mode, and changeMode for structured edits |
| fetch-chunk | Retrieve subsequent chunks from cached large responses |
| ping | Connection test: verify MCP setup without using Gemini tokens |
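Under the hood, an MCP client invokes these tools with standard JSON-RPC `tools/call` requests; your client builds this envelope for you, so the sketch below is illustrative only. The `ping` tool is a good first call since it verifies the setup without spending Gemini tokens:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ping",
    "arguments": {}
  }
}
```

A successful response confirms the server is reachable before you send real prompts to ask-gemini.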
Usage Examples
File analysis (@ syntax):
ask gemini to analyze @src/main.js and explain what it does
use gemini to summarize @. the current directory
Code review:
ask gemini to review the changes in @src/auth.ts for security issues
use gemini to compare @old.js and @new.js
General questions:
ask gemini about best practices for React state management
Sandbox mode:
use gemini sandbox to create and run a Python script
Models
| Model | Use Case |
|---|---|
| gemini-3.1-pro-preview | Default; best-quality reasoning |
| gemini-3-flash-preview | Faster responses, large codebases |
The server automatically falls back to Flash when Pro quota is exceeded.
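Since ask-gemini supports model selection, you can also pin a model explicitly rather than relying on the fallback. Assuming the tool exposes `prompt` and `model` arguments (argument names are an assumption for illustration; check the tool schema your client reports), a call pinning Flash might look like:

```json
{
  "name": "ask-gemini",
  "arguments": {
    "prompt": "summarize @src/ at a high level",
    "model": "gemini-3-flash-preview"
  }
}
```

Pinning Flash is useful for very large codebases where speed matters more than reasoning depth.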
Contributing
Contributions are welcome! See open issues for things to work on.
License
MIT License. See LICENSE for details.
Disclaimer: This is an unofficial, third-party tool and is not affiliated with, endorsed by, or sponsored by Google.