JSPM

Found 10 results for vram

llm-checker

Intelligent CLI tool with AI-powered model selection that analyzes your hardware and recommends optimal LLM models for your system

  • v3.5.12
  • 63.16
  • Published

llm-pulse

Zero-config CLI for monitoring your local LLM hardware, runtimes, and model compatibility

  • v0.7.4
  • 44.42
  • Published

@zakkster/lite-vram

Enterprise-grade VRAM management for HTML5 games. Features hardware tier detection, priority-based hysteresis eviction, and safe scene streaming.

  • v1.0.6
  • 43.92
  • Published

can-i-run-ai

Detects your hardware and shows which AI models you can run locally

  • v1.2.0
  • 37.11
  • Published

ollama-checker

Intelligent CLI tool with AI-powered model selection that analyzes your hardware and recommends optimal LLM models for your system

  • v3.0.7
  • 31.47
  • Published

@zakkster/lite-sprite-cache

Zero-GC off-thread ImageBitmap loader with strict VRAM limits, URL deduplication, and LRU eviction to prevent mobile browser crashes.

  • v1.0.0
  • 30.61
  • Published
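The "strict VRAM limits ... and LRU eviction" that lite-sprite-cache advertises can be sketched as a byte-budgeted LRU cache. This is a generic illustration of the technique, not the package's actual API; the class and method names here are invented for the example.

```python
from collections import OrderedDict

class ByteBudgetLRU:
    """Evict least-recently-used entries once a byte budget is exceeded.

    A hypothetical sketch of budget-based LRU eviction; lite-sprite-cache's
    real implementation and interface may differ.
    """
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.items = OrderedDict()  # url -> size in bytes

    def put(self, url, size):
        if url in self.items:             # URL deduplication: refresh, don't re-add
            self.items.move_to_end(url)
            return
        self.items[url] = size
        self.used += size
        while self.used > self.budget:    # evict from the least-recently-used end
            _, evicted_size = self.items.popitem(last=False)
            self.used -= evicted_size

    def get(self, url):
        if url in self.items:
            self.items.move_to_end(url)   # mark as most recently used
            return True
        return False
```

With a 100-byte budget, inserting a 60-byte entry and then a 50-byte one evicts the first, keeping usage under the cap.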

ownrig-mcp

OwnRig MCP Server — AI hardware compatibility data for Claude, ChatGPT, Cursor, and any MCP-compatible assistant. 50 models, 25 devices, 9 machines, 663 compatibility entries.

  • v1.0.0
  • 29.53
  • Published

vramancer

Can my machine run this model? Estimate VRAM/RAM, tok/s, and TTFT.

  • v1.2.0
  • 18.78
  • Published
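The estimate vramancer promises ("can my machine run this model?") typically comes down to weight memory plus KV-cache memory. The sketch below uses the common back-of-envelope formula with assumed Llama-7B-style defaults (32 layers, 32 KV heads, head dim 128, fp16 KV cache); it is not vramancer's actual method, and real tools add runtime-specific overheads.

```python
def estimate_vram_gib(n_params, weight_bits=16, n_layers=32, n_kv_heads=32,
                      head_dim=128, context_len=4096, kv_bytes=2, overhead=1.2):
    """Rough VRAM estimate in GiB: weights + KV cache, times a fudge factor.

    Defaults are assumptions resembling a 7B Llama-style model; the 1.2x
    overhead factor for activations/buffers is a hypothetical choice.
    """
    weight_mem = n_params * weight_bits / 8          # bytes for the weights
    # KV cache: 2 tensors (K and V) per layer, per head, per position
    kv_mem = 2 * n_layers * n_kv_heads * head_dim * context_len * kv_bytes
    return (weight_mem + kv_mem) * overhead / 2**30
```

For example, a 7B model quantized to 4 bits with a 4096-token context comes out around 6.3 GiB under these assumptions, which is why such models fit on 8 GB cards.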

vram-calculator-mcp-server

Model Context Protocol server for AI VRAM calculation and GPU recommendation

  • v1.0.0
  • 15.19
  • Published

gpuz-cli

A simple CLI tool to monitor GPU memory usage.

  • v1.0.3
  • 8.38
  • Published