🐍 HYDRACODE
Multi-headed AI coding assistant with CLI, Telegram bot, MCP support, and persistent memory.
Features
- Multi-Provider Support: OpenAI, Anthropic, Groq, Together, Ollama, LM Studio, and more
- Smart Router Mode: Automatically routes tasks to appropriate models based on complexity
- Telegram Bot: Use your AI assistant from anywhere via Telegram
- MCP Integration: Connect to Model Context Protocol servers for extended capabilities
- Persistent Memory: Remember conversations, preferences, and context across sessions
- Tool System: File operations, bash commands, search, and more
Installation
```
npm install -g hydracode-cli
```
Quick Start
```
# Run the setup wizard
hydracode

# Or start directly if you have OPENAI_API_KEY set
OPENAI_API_KEY=sk-... hydracode
```
Configuration
First Run Setup
On first run, HYDRACODE will guide you through setup:
- Choose your LLM provider (OpenAI, Groq, Anthropic, etc.)
- Enter your API key
- Select a model
Environment Variables
```
# OpenAI
OPENAI_API_KEY=sk-...

# Groq (fast & free)
GROQ_API_KEY=gsk_...

# Anthropic
ANTHROPIC_API_KEY=sk-ant-...

# Or set via CLI
hydracode config --api-key YOUR_KEY --provider groq
```
Commands
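To see which of these provider keys are visible in your current shell before launching the CLI, a quick check helps (plain bash using indirect expansion, nothing hydracode-specific):

```shell
# Report which provider API keys are set in the current environment
for var in OPENAI_API_KEY GROQ_API_KEY ANTHROPIC_API_KEY; do
  if [ -n "${!var}" ]; then
    echo "$var is set"
  else
    echo "$var is not set"
  fi
done
```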
Session
- `/help` - Show available commands
- `/clear` - Clear conversation history
- `/exit` - Exit HYDRACODE
Configuration
- `/model` - Change model & provider
- `/bio` - Set custom instructions for the AI
- `/config` - Show current configuration
Memory System
- `/memory` - Show memory status
- `/memory on` - Enable persistent memory
- `/memory off` - Disable memory
MCP (Model Context Protocol)
- `/mcp presets` - List available server presets
- `/mcp add brave-search` - Add a preset server
- `/mcp add myserver python server.py` - Add a custom server
- `/mcp env server-name API_KEY=xxx` - Set API keys
- `/mcp connect` - Connect to all servers
- `/mcp tools` - List available tools
Telegram Bot
- `/gateway setup` - Configure the Telegram bot
- `/serve` - Start the Telegram bot from the CLI
Router Mode
- `/routermode` - Configure smart model routing

Router mode routes tasks to different models based on complexity:
- LOW: Simple questions → fast/cheap model
- MID: Moderate tasks → balanced model
- HIGH: Complex tasks → powerful model
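The tiering above can be sketched as a small shell function. This is a hypothetical illustration of the routing idea, not hydracode's actual implementation; the model names are examples taken from the providers table:

```shell
# Hypothetical sketch of complexity-based routing:
# map a complexity tier to a model name.
route_model() {
  case "$1" in
    LOW)  echo "gpt-4o-mini" ;;  # simple questions -> fast/cheap
    MID)  echo "gpt-4o"      ;;  # moderate tasks   -> balanced
    HIGH) echo "o1-preview"  ;;  # complex tasks    -> powerful
    *)    echo "gpt-4o"      ;;  # fall back to the balanced model
  esac
}

route_model LOW    # -> gpt-4o-mini
route_model HIGH   # -> o1-preview
```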
MCP Server Presets
| Preset | Description | API Key |
|---|---|---|
| brave-search | Web search | BRAVE_API_KEY |
| github | GitHub API | GITHUB_TOKEN |
| filesystem | File access | None |
| puppeteer | Browser automation | None |
| fetch | HTTP requests | None |
| sqlite | SQLite database | None |
| postgres | PostgreSQL | POSTGRES_CONNECTION_STRING |
| slack | Slack workspace | SLACK_BOT_TOKEN |
Examples
Basic Usage
```
hydracode
# > Create a Python web scraper that extracts headlines from news sites
```
With Groq (Fast & Free)
```
GROQ_API_KEY=gsk_xxx hydracode
# > Explain how React hooks work
```
Telegram Bot
```
hydracode
# /gateway setup
# (follow prompts to set up Telegram bot)
# /serve
```
MCP Web Search
```
hydracode
# /mcp add brave-search
# /mcp env brave-search BRAVE_API_KEY=your_key
# /mcp connect
# > Search for the latest news about AI
```
Supported Providers
| Provider | Models | Notes |
|---|---|---|
| OpenAI | gpt-4o, gpt-4o-mini, o1-preview | Best overall quality |
| Groq | llama-3.3-70b, mixtral | Fast & free tier |
| Anthropic | claude-3.5-sonnet, claude-3-opus | Great for coding |
| Together | Llama, Mistral, etc. | Many open models |
| Ollama | Any local model | Run models locally |
| LM Studio | Any local model | Local with GUI |
License
MIT