TzamunCode - AI Coding Assistant powered by local models (npm installer wrapper)

Package Exports

    This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (tzamuncode) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.


    TzamunCode CLI 🚀

    AI Coding Assistant powered by local models - Built in Saudi Arabia 🇸🇦

    TzamunCode is a privacy-first AI coding assistant that runs entirely on your local infrastructure using Ollama and vLLM. No cloud dependencies, no API costs, complete control.

    ✨ Features

    • 🤖 Agentic AI - Multi-step planning and execution
    • 📝 Code Generation - Create files, functions, and entire projects
    • ✏️ Multi-file Editing - Edit multiple files in one operation
    • 🔧 Tool Calling - Git operations, file search, command execution
    • 🎯 Context Aware - Understands your project structure
    • 🔒 Privacy First - Everything runs locally
    • ⚑ Fast - Powered by vLLM for optimized inference
    • 🌍 Multi-model - Use any Ollama model (15+ available)

    🚀 Quick Start

    Installation

    # Clone the repository
    git clone https://github.com/tzamun/tzamuncode-cli.git
    cd tzamuncode-cli
    
    # Install
    pip install -e .
    
    # Or install from PyPI (when published)
    pip install tzamuncode

    Prerequisites

    • Python 3 with pip (the CLI is installed via pip)
    • A running Ollama server with at least one model pulled
    • Optional: a vLLM server for faster inference

    Basic Usage

    # Start interactive chat
    tzamuncode chat
    
    # Generate code
    tzamuncode generate "Create a Flask REST API with authentication"
    
    # Edit a file
    tzamuncode edit app.py "Add error handling to all routes"
    
    # Explain code
    tzamuncode explain main.py
    
    # Quick alias
    tzc chat
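    Under the hood these commands talk to a locally running model server. As an illustration only (TzamunCode's internals are not shown in this README), here is a minimal sketch of what a one-shot `generate` call against a default local Ollama server could look like. The `build_generate_payload` and `generate` helpers are hypothetical; the `/api/generate` endpoint, the default port 11434, and the payload shape come from Ollama's REST API.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default address


def build_generate_payload(prompt: str, model: str = "qwen2.5:32b") -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for a single JSON response instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "qwen2.5:32b") -> str:
    """Send one generation request to a locally running Ollama server."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(build_generate_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

    Because everything goes through this local HTTP endpoint, no prompt or code ever leaves your machine.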

    📖 Documentation

    Commands

    chat - Interactive Chat

    tzamuncode chat
    tzamuncode chat --model qwen2.5:32b

    generate - Code Generation

    tzamuncode generate "Create a Python web scraper"
    tzamuncode generate "Add unit tests for user.py" --output tests/

    edit - File Editing

    tzamuncode edit app.py "Refactor to use async/await"
    tzamuncode edit . "Add type hints to all functions"

    explain - Code Explanation

    tzamuncode explain complex_function.py
    tzamuncode explain --detailed auth.py

    review - Code Review

    tzamuncode review pull_request.diff
    tzamuncode review --strict src/

    Configuration

    Create ~/.tzamuncode/config.yaml:

    # Default model
    model: qwen2.5:32b
    
    # Ollama settings
    ollama:
      base_url: http://localhost:11434
      timeout: 120
    
    # vLLM settings (optional, for faster inference)
    vllm:
      enabled: true
      base_url: http://localhost:8000
      model: deepseek-coder-7b
    
    # Preferences
    preferences:
      show_diff: true
      auto_apply: false
      max_context: 64000
      temperature: 0.7
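    The README does not say how a partial config file interacts with the built-in defaults. A plausible sketch (helper names and the `DEFAULTS` values are illustrative, mirroring the sample config above) is that user settings are deep-merged over the defaults, so you only need to write the keys you want to change:

```python
# Illustrative defaults mirroring the sample config above; the real
# defaults used by TzamunCode may differ.
DEFAULTS = {
    "model": "qwen2.5:32b",
    "ollama": {"base_url": "http://localhost:11434", "timeout": 120},
    "preferences": {"show_diff": True, "auto_apply": False},
}


def merge_config(defaults: dict, overrides: dict) -> dict:
    """Recursively overlay user settings on top of the defaults."""
    merged = dict(defaults)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_config(merged[key], value)  # deep-merge nested sections
        else:
            merged[key] = value  # scalar override wins outright
    return merged


# e.g. a user who only overrides the model and the Ollama timeout:
user_config = {"model": "deepseek-coder-7b", "ollama": {"timeout": 300}}
config = merge_config(DEFAULTS, user_config)
```

    Here `config["ollama"]["base_url"]` keeps its default while the timeout and model take the user's values.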

    πŸ—οΈ Architecture

    ┌─────────────────────────────────────┐
    │     TzamunCode CLI                  │
    │  (Typer + Rich UI)                  │
    └─────────────────────────────────────┘
               ↓
    ┌─────────────────────────────────────┐
    │   Agentic Layer (LangChain)         │
    │  - Multi-step planning              │
    │  - Tool calling                     │
    │  - Context management               │
    └─────────────────────────────────────┘
               ↓
    ┌─────────────────────────────────────┐
    │   AI Backend                        │
    │  - Ollama (15+ models)              │
    │  - vLLM (fast inference)            │
    └─────────────────────────────────────┘
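    The agentic layer's plan-then-execute cycle can be sketched as a toy loop. The tool functions and planner output below are hypothetical stand-ins; TzamunCode's actual LangChain-based implementation is not shown in this README.

```python
# Hypothetical tools of the kind the feature list mentions
# (file search, command execution).
def search_files(pattern: str) -> str:
    return f"found 2 files matching {pattern!r}"


def run_command(cmd: str) -> str:
    return f"ran {cmd!r}"


TOOLS = {"search_files": search_files, "run_command": run_command}


def run_plan(plan: list[tuple[str, str]]) -> list[str]:
    """Execute each (tool, argument) step and collect the results.

    In a real agent, each result would be fed back into the model's
    context so it can decide or refine the next step.
    """
    context = []
    for tool_name, arg in plan:
        result = TOOLS[tool_name](arg)
        context.append(result)
    return context


# A plan the model might emit for "summarize recent changes":
results = run_plan([("search_files", "*.py"), ("run_command", "git status")])
```

    The key design point the diagram implies is the separation of concerns: the CLI handles presentation, the agentic layer handles planning and tool dispatch, and the backend only generates text.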

    🤝 Contributing

    We welcome contributions! See CONTRIBUTING.md for guidelines.

    📄 License

    MIT License - see LICENSE for details.

    🌟 Built by Tzamun Arabia IT Co.

    TzamunCode is part of the Tzamun AI ecosystem:

    • TzamunAI - AI platform with 15+ models
    • TzamunERP - ERPNext + AI integration
    • Auxly - AI coding assistant for IDEs
    • AccessHub - Privileged Access Management

    Visit tzamun.com to learn more.


    Made with ❤️ in Saudi Arabia 🇸🇦