Cordenex — AI-Powered Autonomous Coding Assistant with Parallel Execution

Package Exports

    This package does not declare an exports field, so the exports above were automatically detected and optimized by JSPM instead. If a package subpath is missing, consider filing an issue against the original package (@cortiqa/cordenex) asking it to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.


    🤖 Cordenex v2 — Autonomous AI Software Engineer


    Cordenex is a high-performance, autonomous AI coding assistant designed for the terminal. Inspired by the logic of Claude Code and the power of specialized agentic workflows, Cordenex brings Anthropic-level engineering capabilities to your local environment.

    "Stop chatting with code. Start engineering with it."


    🔥 Key Pillars of Cordenex v2

    1. 🧠 Project Intelligence & Memory

    • CORDENEX.md (Project Memory): Define project-specific rules, conventions, and tech-stack details that persist across every AI turn.
    • Auto-Context Detection: Cordenex automatically analyzes your project (languages, frameworks, dependencies) to provide hyper-relevant suggestions without you explaining anything.
    • Session Journaling: Maintains a rolling "Journal" of what was accomplished, providing continuity even if you restart the terminal.
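
    As a concrete illustration, a minimal CORDENEX.md might look like the following. The headings and fields here are purely illustrative assumptions, not an official schema:

```markdown
# CORDENEX.md (Project Memory)

## Tech Stack
- Go 1.22, chi router, PostgreSQL via sqlc

## Conventions
- Table-driven tests for every exported function
- Wrap errors with fmt.Errorf("context: %w", err)

## Rules
- Never edit generated files under internal/db/
```

    Because this file persists across sessions, rules written here apply to every AI turn without restating them in each prompt.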

    2. ⚡ Autonomous Execution Engine

    • Parallel Tool Execution: Executes multiple file operations, shell commands, and searches concurrently to save time.
    • Self-Healing Workflows: If a tool fails (e.g., missing directory, syntax error), Cordenex automatically diagnoses the issue, fixes its own mistakes, and retries.
    • Smarter Mode Switching: Automatically switches between Chat Mode (for explanations) and Agent Mode (for complex migrations) based on the task complexity.

    3. 🛡️ Safety & Permission Tiers

    • Tiered Permissions:
      • Read-only: Always allowed (safe).
      • Write: Needs approval by default.
      • Dangerous: Shell access and deletions always ask for confirmation.
    • Smart Approvals: Approve a batch of parallel actions once, not one by one.
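
    The tier logic above can be sketched as a small decision function. This is an assumption-laden illustration in Go (the `Tier` type and `needsApproval` helper are invented names, not Cordenex internals); the permission modes match the ones listed under /permissions below:

```go
package main

import "fmt"

// Tier models the three permission levels (illustrative names).
type Tier int

const (
	ReadOnly  Tier = iota // always allowed
	Write                 // needs approval by default
	Dangerous             // shell access and deletions: always confirm
)

// needsApproval reports whether an action at tier t must be confirmed
// by the user under the given permission mode
// ("default", "allow-write", or "allow-all").
func needsApproval(t Tier, mode string) bool {
	switch t {
	case ReadOnly:
		return false
	case Write:
		return mode == "default"
	default: // Dangerous actions always require confirmation
		return true
	}
}

func main() {
	fmt.Println(needsApproval(Write, "default"))      // approval needed
	fmt.Println(needsApproval(Write, "allow-write"))  // pre-approved
	fmt.Println(needsApproval(Dangerous, "allow-all")) // still confirms
}
```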

    4. 🛠️ Developer-First Experience

    • Rich Interactive UI: Features a streaming interface with real-time Thinking Steps (Reasoning blocks).
    • Universal Provider Support: Anthropic (Claude 3.5), OpenAI (GPT-4o), Google (Gemini 2.0), Groq (Llama 3.3 for speed), and Ollama for full offline privacy.
    • Undo System: Accidentally broke something? Use /undo to revert file changes instantly.

    🚀 Installation & Setup

    1. Install Globally

    npm install -g @cortiqa/cordenex

    2. Set Up Your Provider

    Cordenex supports all major AI backends. We recommend Claude 3.5 Sonnet (via Anthropic) or Llama 3.3 (via Groq) for the best results.

    # Set your API Key
    cordenex config set providers.anthropic.api_key your_key_here
    
    # Set default model
    cordenex switch claude-3-5-sonnet

    💻 Commands Reference

    • cordenex chat: Start the main interactive engineering session.
    • cordenex ask "prompt": Run a one-shot task and exit.
    • cordenex init: Initialize CORDENEX.md and project rules.
    • cordenex auth login: Authenticate with Cortiqa Cloud.
    • cordenex config set <key> <val>: Configure settings (UI, AI, logging).
    • cordenex models: List all supported and configured models.

    ⚡ Slash Commands (Within Chat)

    • /undo: Revert the last set of file changes.
    • /memory: View or edit your project-specific CORDENEX.md.
    • /permissions: Change safety level (allow-write, allow-all, default).
    • /changes: Show a summary of files modified in this session.
    • /clear: Reset the conversation history.
    • /model <id>: Switch models on the fly.

    🏗️ Architecture

    Cordenex v2 is built in Go for maximum speed and zero-dependency binaries.

    • internal/core: The brain. Contains the Engine (LLM logic), Orchestrator (mode switching), and Journal (memory).
    • internal/tools: The hands. Filesystem operations, Shell execution, and Context retrieval.
    • internal/providers: The interface. Unified API for OpenAI, Anthropic, Gemini, etc.

    For deeper technical details, see ARCHITECTURE.md.


    🤝 Contributing

    We welcome contributions!

    1. Fork the repo.
    2. Run go build ./cmd/cordenex to verify your changes build.
    3. Submit a PR.

    Built with ❤️ by Cortiqa Website | GitHub