HYDRACODE - Multi-headed AI coding assistant with CLI, Telegram bot, MCP support, and persistent memory

Package Exports

  • hydracode-cli
  • hydracode-cli/dist/index.js

This package does not declare an exports field, so the exports above were automatically detected and optimized by JSPM instead. If a package subpath is missing, it is recommended to open an issue on the original package (hydracode-cli) requesting support for the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.

Readme

🐍 HYDRACODE

Multi-headed AI coding assistant with CLI, Telegram bot, MCP support, and persistent memory.

Features

  • Multi-Provider Support: OpenAI, Anthropic, Groq, Together, Ollama, LM Studio, and more
  • Smart Router Mode: Automatically routes tasks to appropriate models based on complexity
  • Telegram Bot: Use your AI assistant from anywhere via Telegram
  • MCP Integration: Connect to Model Context Protocol servers for extended capabilities
  • Persistent Memory: Remember conversations, preferences, and context across sessions
  • Tool System: File operations, bash commands, search, and more

Installation

npm install -g hydracode-cli

Quick Start

# Run the setup wizard
hydracode

# Or start directly if you have OPENAI_API_KEY set
OPENAI_API_KEY=sk-... hydracode

Configuration

First Run Setup

On first run, HYDRACODE will guide you through setup:

  1. Choose your LLM provider (OpenAI, Groq, Anthropic, etc.)
  2. Enter your API key
  3. Select a model

Environment Variables

# OpenAI
OPENAI_API_KEY=sk-...

# Groq (fast & free)
GROQ_API_KEY=gsk_...

# Anthropic
ANTHROPIC_API_KEY=sk-ant-...

# Or set via CLI
hydracode config --api-key YOUR_KEY --provider groq

Commands

Session

  • /help - Show available commands
  • /clear - Clear conversation history
  • /exit - Exit HYDRACODE

Configuration

  • /model - Change model & provider
  • /bio - Set custom instructions for the AI
  • /config - Show current configuration

Memory System

  • /memory - Show memory status
  • /memory on - Enable persistent memory
  • /memory off - Disable memory
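
The README does not describe how persistent memory is stored. Purely as an illustration of the idea (remembering key/value context across sessions), here is a minimal sketch using a JSON file; the file path, schema, and function names are assumptions, not HYDRACODE's actual storage format:

```typescript
// Illustrative sketch only: persistent memory as a JSON file.
// The path and schema are hypothetical, not HYDRACODE's real format.
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

type Memory = Record<string, string>;

const memoryFile = path.join(os.tmpdir(), "hydracode-memory-demo.json");

function loadMemory(): Memory {
  try {
    // Re-read the file on each call so memory survives process restarts.
    return JSON.parse(fs.readFileSync(memoryFile, "utf8"));
  } catch {
    return {}; // first run: no memory yet
  }
}

function remember(key: string, value: string): void {
  const mem = loadMemory();
  mem[key] = value;
  fs.writeFileSync(memoryFile, JSON.stringify(mem, null, 2));
}
```

Because the store lives on disk rather than in process memory, anything saved with `remember` is still available after the assistant is restarted.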

MCP (Model Context Protocol)

  • /mcp presets - List available server presets
  • /mcp add brave-search - Add a preset server
  • /mcp add myserver python server.py - Add a custom server
  • /mcp env server-name API_KEY=xxx - Set API keys
  • /mcp connect - Connect to all servers
  • /mcp tools - List available tools

Telegram Bot

  • /gateway setup - Configure Telegram bot
  • /serve - Start Telegram bot from CLI

Router Mode

  • /routermode - Configure smart model routing
  • Routes tasks to different models based on complexity:
    • LOW: Simple questions → fast/cheap model
    • MID: Moderate tasks → balanced model
    • HIGH: Complex tasks → powerful model
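
The README does not document how the router estimates complexity. Purely as illustration, the LOW/MID/HIGH tiering above could be sketched with a naive heuristic like the one below; all names, thresholds, and keywords here are assumptions, not HYDRACODE's actual routing logic:

```typescript
// Hypothetical sketch of complexity-based routing, NOT HYDRACODE's
// real implementation: map an estimated task complexity to a model tier.
type Tier = "LOW" | "MID" | "HIGH";

interface RouteConfig {
  LOW: string;  // fast/cheap model
  MID: string;  // balanced model
  HIGH: string; // powerful model
}

// Naive heuristic: long prompts or coding-heavy keywords raise the tier.
function estimateTier(prompt: string): Tier {
  const codeHints = /\b(refactor|debug|implement|architecture)\b/i;
  if (prompt.length > 500 || codeHints.test(prompt)) return "HIGH";
  if (prompt.length > 100) return "MID";
  return "LOW";
}

function routeModel(prompt: string, config: RouteConfig): string {
  return config[estimateTier(prompt)];
}
```

With such a scheme, a short factual question would be served by the cheap model while a refactoring request would go to the most capable one, keeping cost proportional to task difficulty.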

MCP Server Presets

  Preset        Description          API Key
  brave-search  Web search           BRAVE_API_KEY
  github        GitHub API           GITHUB_TOKEN
  filesystem    File access          None
  puppeteer     Browser automation   None
  fetch         HTTP requests        None
  sqlite        SQLite database      None
  postgres      PostgreSQL           POSTGRES_CONNECTION_STRING
  slack         Slack workspace      SLACK_BOT_TOKEN

Examples

Basic Usage

hydracode
# > Create a Python web scraper that extracts headlines from news sites

With Groq (Fast & Free)

GROQ_API_KEY=gsk_xxx hydracode
# > Explain how React hooks work

Telegram Bot

hydracode
# /gateway setup
# (follow prompts to set up Telegram bot)
# /serve
With MCP (Web Search)

hydracode
# /mcp add brave-search
# /mcp env brave-search BRAVE_API_KEY=your_key
# /mcp connect
# > Search for the latest news about AI

Supported Providers

  Provider    Models                             Notes
  OpenAI      gpt-4o, gpt-4o-mini, o1-preview    Best overall quality
  Groq        llama-3.3-70b, mixtral             Fast & free tier
  Anthropic   claude-3.5-sonnet, claude-3-opus   Great for coding
  Together    Llama, Mistral, etc.               Many open models
  Ollama      Any local model                    Run models locally
  LM Studio   Any local model                    Local with GUI

License

MIT