
Lykhoyda/ask-llm


Ask Gemini MCP


MCP server that connects any AI client to the Google Gemini CLI

An MCP server for AI-to-AI collaboration via the Gemini CLI. Available on npm: ask-gemini-mcp. Works with Claude Code, Claude Desktop, Cursor, Warp, Copilot, and 40+ other MCP clients. Leverage Gemini's massive 1M+ token context window for large file and codebase analysis while your primary AI handles interaction and code editing.

[Screenshot: ask-gemini-mcp MCP server]

Why?

  • Get a second opinion — Ask Gemini to review your coding approach before committing to it
  • Debate plans — Send architecture proposals to Gemini for critique and alternative suggestions
  • Review changes — Have Gemini analyze diffs or modified files to catch issues your primary AI might miss
  • Massive context — Gemini reads entire codebases (1M+ tokens) that would overflow other models

Quick Start

Claude Code

# Project scope (available in current project only)
claude mcp add gemini-cli -- npx -y ask-gemini-mcp

# User scope (available across all projects)
claude mcp add --scope user gemini-cli -- npx -y ask-gemini-mcp

Claude Desktop

Add to your config file (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}
Other config file locations
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/claude/claude_desktop_config.json

Cursor

Add to .cursor/mcp.json in your project (or ~/.cursor/mcp.json for global):

{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}

Codex CLI

Add to ~/.codex/config.toml (or .codex/config.toml in your project):

[mcp_servers.gemini-cli]
command = "npx"
args = ["-y", "ask-gemini-mcp"]

Or via CLI:

codex mcp add gemini-cli -- npx -y ask-gemini-mcp

OpenCode

Add to opencode.json in your project (or ~/.config/opencode/opencode.json for global):

{
  "mcp": {
    "gemini-cli": {
      "type": "local",
      "command": ["npx", "-y", "ask-gemini-mcp"]
    }
  }
}

Any MCP Client (STDIO Transport)

{
  "transport": {
    "type": "stdio",
    "command": "npx",
    "args": ["-y", "ask-gemini-mcp"]
  }
}
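Over stdio, MCP is plain JSON-RPC 2.0: the client spawns the server process and exchanges one JSON message per line on stdin/stdout. A minimal sketch of the two messages a generic client would send to verify the connection (message shapes follow the MCP specification; the client name and version are placeholders, not code from this repo):

```python
import json

def jsonrpc(method, params, msg_id):
    """Serialize a JSON-RPC 2.0 request as a single line for the stdio transport."""
    return json.dumps(
        {"jsonrpc": "2.0", "id": msg_id, "method": method, "params": params}
    )

# 1. Handshake, sent once after spawning `npx -y ask-gemini-mcp`.
initialize = jsonrpc("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1.0"},
}, msg_id=1)

# 2. Call the server's `ping` tool to confirm the setup without spending Gemini tokens.
ping = jsonrpc("tools/call", {"name": "ping", "arguments": {}}, msg_id=2)
```

In practice your MCP client handles this handshake for you; the sketch only shows what travels over the pipe.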

Prerequisites

  • Node.js (the server is run via npx)
  • Gemini CLI, installed and authenticated

Tools

| Tool | Purpose |
| --- | --- |
| `ask-gemini` | Send prompts to the Gemini CLI. Supports `@` file syntax, model selection, sandbox mode, and `changeMode` for structured edits |
| `fetch-chunk` | Retrieve subsequent chunks from cached large responses |
| `ping` | Connection test — verify your MCP setup without using Gemini tokens |
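fetch-chunk exists because a Gemini reply can exceed what a single MCP response comfortably carries. One way to picture the mechanism (a simplified sketch of the idea, not the server's actual cache; names and chunk size here are illustrative):

```python
import uuid

CHUNK_SIZE = 4  # characters per chunk; kept tiny for the demo, a real cache would use far more
_cache = {}

def cache_response(text):
    """Split a long response into chunks; return (cache_key, first_chunk, total_chunks)."""
    chunks = [text[i:i + CHUNK_SIZE] for i in range(0, len(text), CHUNK_SIZE)]
    key = str(uuid.uuid4())
    _cache[key] = chunks
    return key, chunks[0], len(chunks)

def fetch_chunk(key, index):
    """Return chunk `index` (1-based) from a previously cached response."""
    return _cache[key][index - 1]
```

The first tool call returns chunk 1 plus the cache key and total count, so the client knows whether (and how) to request the rest.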

Usage Examples

File analysis (@ syntax):

  • ask gemini to analyze @src/main.js and explain what it does
  • use gemini to summarize @. (the current directory)

Code review:

  • ask gemini to review the changes in @src/auth.ts for security issues
  • use gemini to compare @old.js and @new.js

General questions:

  • ask gemini about best practices for React state management

Sandbox mode:

  • use gemini sandbox to create and run a Python script

Models

| Model | Use Case |
| --- | --- |
| `gemini-3.1-pro-preview` | Default — best quality reasoning |
| `gemini-3-flash-preview` | Faster responses, large codebases |

The server automatically falls back to Flash when Pro quota is exceeded.
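The fallback can be pictured as a simple retry wrapper (a sketch of the idea, not the server's code; the error type and the `run` callback are assumptions standing in for the real Gemini CLI invocation):

```python
PRO = "gemini-3.1-pro-preview"
FLASH = "gemini-3-flash-preview"

class QuotaExceededError(Exception):
    """Hypothetical error raised when the Pro quota is exhausted."""

def ask_with_fallback(prompt, run):
    """Try Pro first; fall back to Flash when the Pro quota is exceeded.

    `run(model, prompt)` stands in for invoking the Gemini CLI.
    Returns (model_used, response).
    """
    try:
        return PRO, run(PRO, prompt)
    except QuotaExceededError:
        return FLASH, run(FLASH, prompt)
```

The caller still gets an answer either way; only the model name in the result reveals that a fallback occurred.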

Contributing

Contributions are welcome! See open issues for things to work on.

License

MIT License. See LICENSE for details.

Disclaimer: This is an unofficial, third-party tool and is not affiliated with, endorsed, or sponsored by Google.
