ask-gemini-mcp

MCP server that enables AI assistants to interact with Google Gemini CLI

An MCP server for AI-to-AI collaboration via the Gemini CLI. Available on npm: ask-gemini-mcp. Works with Claude Code, Claude Desktop, Cursor, Warp, Copilot, and 40+ other MCP clients. Leverage Gemini's massive 1M+ token context window for large file and codebase analysis while your primary AI handles interaction and code editing.

Why?

  • Get a second opinion — Ask Gemini to review your coding approach before committing to it
  • Debate plans — Send architecture proposals to Gemini for critique and alternative suggestions
  • Review changes — Have Gemini analyze diffs or modified files to catch issues your primary AI might miss
  • Massive context — Gemini reads entire codebases (1M+ tokens) that would overflow other models

Quick Start

Claude Code

# Project scope (available in current project only)
claude mcp add gemini-cli -- npx -y ask-gemini-mcp

# User scope (available across all projects)
claude mcp add --scope user gemini-cli -- npx -y ask-gemini-mcp

Claude Desktop

Add to your config file (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/claude/claude_desktop_config.json

Cursor

Add to .cursor/mcp.json in your project (or ~/.cursor/mcp.json for global):

{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}

Codex CLI

Add to ~/.codex/config.toml (or .codex/config.toml in your project):

[mcp_servers.gemini-cli]
command = "npx"
args = ["-y", "ask-gemini-mcp"]

Or via CLI:

codex mcp add gemini-cli -- npx -y ask-gemini-mcp

OpenCode

Add to opencode.json in your project (or ~/.config/opencode/opencode.json for global):

{
  "mcp": {
    "gemini-cli": {
      "type": "local",
      "command": ["npx", "-y", "ask-gemini-mcp"]
    }
  }
}

Any MCP Client (STDIO Transport)

{
  "transport": {
    "type": "stdio",
    "command": "npx",
    "args": ["-y", "ask-gemini-mcp"]
  }
}

Prerequisites

  • Node.js 18+ (the server is launched via npx)
  • Google Gemini CLI installed and authenticated

Tools

Tool         Purpose
ask-gemini   Send prompts to Gemini CLI. Supports @ file syntax, model selection, sandbox mode, and changeMode for structured edits
fetch-chunk  Retrieve subsequent chunks from cached large responses
ping         Connection test; verifies your MCP setup without using Gemini tokens
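Under the hood, every tool is invoked through a standard MCP tools/call request over the JSON-RPC stdio transport. A minimal sketch of a ping call (the assumption that ping takes no arguments is inferred from its description, not confirmed against the server's schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ping",
    "arguments": {}
  }
}
```

Your MCP client sends requests like this for you; the ping tool is useful because it round-trips through the server without spending any Gemini tokens.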

Usage Examples

File analysis (@ syntax):

  • ask gemini to analyze @src/main.js and explain what it does
  • use gemini to summarize the current directory with @.

Code review:

  • ask gemini to review the changes in @src/auth.ts for security issues
  • use gemini to compare @old.js and @new.js

General questions:

  • ask gemini about best practices for React state management

Sandbox mode:

  • use gemini sandbox to create and run a Python script

Models

Model                    Use Case
gemini-3.1-pro-preview   Default; best-quality reasoning
gemini-3-flash-preview   Faster responses, large codebases

The server automatically falls back to Flash when Pro quota is exceeded.
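To override the default for a single call, the client can pass a model argument to ask-gemini. A hedged sketch of such a tools/call request (the prompt and model argument names are assumptions inferred from the tool description above, not a confirmed API):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "ask-gemini",
    "arguments": {
      "prompt": "summarize @src/main.js",
      "model": "gemini-3-flash-preview"
    }
  }
}
```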

Contributing

Contributions are welcome! See open issues for things to work on.

License

MIT License. See LICENSE for details.

Disclaimer: This is an unofficial, third-party tool and is not affiliated with, endorsed, or sponsored by Google.

Related Servers