A server for integrating with the Google Gemini CLI to perform AI-powered tasks.
📚 View Full Documentation - Search me!, Examples, FAQ, Troubleshooting, Best Practices
This is a simple Model Context Protocol (MCP) server that allows AI assistants to interact with the Gemini CLI. It enables the AI to leverage Gemini's massive token window for large-scale analysis, especially of large files and codebases, using the @ syntax to point Gemini at specific files and directories.
Goal: Use Gemini's powerful analysis capabilities directly in Claude Code to save tokens and analyze large files.
Before using this tool, ensure you have Node.js and the Google Gemini CLI installed. Then register the server with Claude Code:

claude mcp add gemini-cli -- npx -y gemini-mcp-tool

Type /mcp inside Claude Code to verify the gemini-cli MCP is active.
If you already have it configured in Claude Desktop:

"gemini-cli": {
  "command": "npx",
  "args": ["-y", "gemini-mcp-tool"]
}

you can import that entry into Claude Code with:

claude mcp add-from-claude-desktop
Alternatively, register the MCP server with your MCP client manually. For Claude Desktop, add this configuration to your config file:
{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "gemini-mcp-tool"]
    }
  }
}
If you installed globally, use this configuration instead:
{
  "mcpServers": {
    "gemini-cli": {
      "command": "gemini-mcp"
    }
  }
}
Configuration file locations:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
- Linux: ~/.config/claude/claude_desktop_config.json
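As a sketch (assuming the standard Claude Desktop config paths listed above), the npx-based entry can be merged into an existing config file with a few lines of Python. The file path below is a throwaway demo location; point it at your real config instead:

```python
import json
import tempfile
from pathlib import Path

def add_gemini_cli(path: Path) -> dict:
    """Merge the gemini-cli server entry into a config file, creating it if absent."""
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})["gemini-cli"] = {
        "command": "npx",
        "args": ["-y", "gemini-mcp-tool"],
    }
    path.write_text(json.dumps(config, indent=2))
    return config

# Demo against a temporary file; substitute your OS's config path from the list above.
demo_path = Path(tempfile.mkdtemp()) / "claude_desktop_config.json"
merged = add_gemini_cli(demo_path)
```

Merging rather than overwriting preserves any other MCP servers already registered in the file.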
After updating the configuration, restart Claude Desktop (or your terminal session, for Claude Code) so the changes take effect.
Type /gemini-cli and commands will populate in Claude Code's interface.

Example prompts:
- ask gemini to analyze @src/main.js and explain what it does
- use gemini to summarize @. the current directory
- analyze @package.json and tell me about dependencies
- ask gemini to search for the latest tech news
- use gemini to explain div centering
- ask gemini about best practices for React development related to @file_im_confused_about
The sandbox mode allows you to safely test code changes, run scripts, or execute potentially risky operations in an isolated environment.
- use gemini sandbox to create and run a Python script that processes data
- ask gemini to safely test @script.py and explain what it does
- use gemini sandbox to install numpy and create a data visualization
- test this code safely: Create a script that makes HTTP requests to an API
These tools are designed to be used by the AI assistant.
- ask-gemini: Asks Google Gemini for its perspective. Can be used for general questions or complex analysis of files.
  - prompt (required): The analysis request. Use the @ syntax to include file or directory references (e.g., @src/main.js explain this code) or ask general questions (e.g., Please use a web search to find the latest news stories).
  - model (optional): The Gemini model to use. Defaults to gemini-2.5-pro.
  - sandbox (optional): Set to true to run in sandbox mode for safe code execution.
- sandbox-test: Safely executes code or commands in Gemini's sandbox environment. Always runs in sandbox mode.
  - prompt (required): Code testing request (e.g., Create and run a Python script that... or @script.py Run this safely).
  - model (optional): The Gemini model to use.
- Ping: A simple test tool that echoes back a message.
- Help: Shows the Gemini CLI help text.

You can use these commands directly in Claude Code's interface (compatibility with other clients has not been tested).
- /analyze
  - prompt (required): The analysis prompt. Use @ syntax to include files (e.g., /analyze prompt:@src/ summarize this directory) or ask general questions (e.g., /analyze prompt:Please use a web search to find the latest news stories).
- /sandbox
  - prompt (required): Code testing request (e.g., /sandbox prompt:Create and run a Python script that processes CSV data or /sandbox prompt:@script.py Test this script safely).
- /ping
  - message (optional): A message to echo back.

Contributions are welcome! Please see our Contributing Guidelines for details on how to submit pull requests, report issues, and contribute to the project.
This project is licensed under the MIT License. See the LICENSE file for details.
Disclaimer: This is an unofficial, third-party tool and is not affiliated with, endorsed, or sponsored by Google.