MCP Gemini CLI
A simple MCP server wrapper for Google's Gemini CLI that enables AI assistants to use Gemini's capabilities through the Model Context Protocol.
What it does
This server exposes two tools that interact with Gemini CLI:
- googleSearch: Asks Gemini to perform a Google search using your query
- geminiChat: Sends prompts directly to Gemini for general conversations
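Under the Model Context Protocol, a client invokes these tools through a tools/call request. As a rough sketch (the tool name and argument come from the list above; the envelope is the standard MCP JSON-RPC shape), such a request looks like this:
// What an MCP client sends to invoke googleSearch over JSON-RPC
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "googleSearch",
    arguments: { query: "latest TypeScript 5.0 features" },
  },
};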
Prerequisites
- Gemini CLI installed and configured (optional if you use the --allow-npx flag)
🚀 Quick Start with Claude Code
1. Add the MCP server
claude mcp add -s project gemini-cli -- npx @choplin/mcp-gemini-cli --allow-npx
Or configure your MCP client with the settings shown in the Installation Options section below.
2. Try it out
Example prompts:
- Search: "Search for the latest TypeScript 5.0 features using Google"
- Chat: "Ask Gemini to explain the difference between async/await and promises in JavaScript"
🔧 Installation Options
Using npx with --allow-npx flag
{
  "mcpServers": {
    "mcp-gemini-cli": {
      "command": "npx",
      "args": ["@choplin/mcp-gemini-cli", "--allow-npx"]
    }
  }
}
Local Development
- Clone and install:
git clone https://github.com/choplin/mcp-gemini-cli
cd mcp-gemini-cli
bun install
- Add to Claude Desktop config:
{
  "mcpServers": {
    "mcp-gemini-cli": {
      "command": "bun",
      "args": ["run", "/path/to/mcp-gemini-cli/index.ts"]
    }
  }
}
🛠️ Available Tools
1. googleSearch
Performs a Google search using Gemini CLI.
Parameters:
- query (required): The search query
- limit (optional): Maximum number of results
- sandbox (optional): Run in sandbox mode
- yolo (optional): Skip confirmations
- model (optional): Gemini model to use (default: "gemini-2.5-pro")
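As a rough illustration, a call that sets every parameter could look like the sketch below (the parameter names and default model come from the list above; the value types assumed for limit, sandbox, and yolo are illustrative):
// Illustrative call exercising every documented parameter
googleSearch({
  query: "TypeScript 5.0 release notes",
  limit: 3, // assumed: a number capping the results
  sandbox: true, // assumed: boolean toggling sandbox mode
  yolo: false, // assumed: boolean that skips confirmations when true
  model: "gemini-2.5-pro",
});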
2. geminiChat
Have a conversation with Gemini.
Parameters:
- prompt (required): The conversation prompt
- sandbox (optional): Run in sandbox mode
- yolo (optional): Skip confirmations
- model (optional): Gemini model to use (default: "gemini-2.5-pro")
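Likewise, a hedged sketch of a geminiChat call that sets every parameter (values are illustrative; boolean types for sandbox and yolo are assumed):
// Illustrative call exercising every documented parameter
geminiChat({
  prompt: "Summarize the main ideas behind the Model Context Protocol",
  sandbox: false, // assumed: boolean toggling sandbox mode
  yolo: true, // assumed: boolean that skips confirmations
  model: "gemini-2.5-flash",
});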
🛠️ Example Usage
googleSearch
// Simple search
googleSearch({ query: "latest AI news" });
// Search with limit
googleSearch({
  query: "TypeScript best practices",
  limit: 5,
});
geminiChat
// Simple chat
geminiChat({ prompt: "Explain quantum computing in simple terms" });
// Using a different model
geminiChat({
  prompt: "Write a haiku about programming",
  model: "gemini-2.5-flash",
});
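Both tools return their output as an MCP tool result. Assuming the standard MCP result shape ({ content: [{ type: "text", text: ... }] }), which this README does not spell out, the text can be read out roughly like this:
// Sketch: extract text from an MCP tool result (shape assumed, not confirmed by this README)
const result = await geminiChat({ prompt: "Explain quantum computing in simple terms" });
const text = result.content
  .filter((part) => part.type === "text")
  .map((part) => part.text)
  .join("\n");
console.log(text);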
📝 Development
Note: Development requires the Bun runtime.
Run in Development Mode
bun run dev
Run Tests
bun test
Build for Production
# Development build
bun run build
# Production build (minified)
bun run build:prod
Linting & Formatting
# Lint code
bun run lint
# Format code
bun run format
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.