stably

4.12.4-rc.0 · License: UNLICENSED · Downloads: 39417

AI-powered E2E Playwright testing CLI. Stably can understand your codebase, edit/run tests, and handle complex test scenarios for you.

Package Exports

  • stably


Stably

Code. Ship. Test.

Documentation · Homepage


Stably CLI

This package extends Playwright to add new AI functionality. To get started quickly, please see the AI-assisted setup guide; otherwise, continue reading below.

Installation

npm i -g stably

🎭 Note
We let you bring your own Playwright version. This does mean that Playwright must first be set up (our CLI can help you do that).
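Since the note above says Playwright must already be set up, a quick preflight check can tell whether a project is ready. This is an illustrative sketch only; the helper name and sample file are made up and are not part of the stably CLI:

```shell
# Hypothetical preflight: does a package.json declare @playwright/test?
# (Helper name and sample file are illustrative, not part of stably.)
check_playwright() {
  grep -q '"@playwright/test"' "$1" 2>/dev/null && echo "found" || echo "missing"
}

printf '{"devDependencies":{"@playwright/test":"^1.49.0"}}\n' > /tmp/pkg-with-pw.json
check_playwright /tmp/pkg-with-pw.json   # prints "found"
check_playwright /tmp/no-such-file.json  # prints "missing"
```

If the check prints "missing", set up Playwright first (the stably CLI can help with that, per the note above).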

Usage

Below is a short list of common commands. For a complete list, run stably --help.

  • npx stably: Starts our REPL, which helps with test creation and modification
  • npx stably test: Runs tests locally or in CI
  • npx stably --help: Prints the full list of commands

Authentication

Stably CLI supports two auth modes:

  • API key via env vars (highest priority): set STABLY_API_KEY and STABLY_PROJECT_ID
  • Browser login (OAuth): run npx stably login (credentials are stored locally)

If both are present, the CLI will honor the environment variables and warn that stored OAuth credentials are being ignored. To switch back to browser login, unset the env vars.
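The precedence above can be sketched as a small shell function. This is illustrative only; the function name and sample values are made up, and the real check lives inside the CLI:

```shell
# Sketch of the documented precedence: env vars win over stored OAuth credentials.
# (Function name and values are illustrative, not the CLI's implementation.)
auth_mode() {
  if [ -n "$STABLY_API_KEY" ] && [ -n "$STABLY_PROJECT_ID" ]; then
    echo "api-key"   # env vars take priority; stored OAuth creds are ignored
  else
    echo "oauth"     # fall back to credentials saved by `npx stably login`
  fi
}

STABLY_API_KEY=sk-test STABLY_PROJECT_ID=proj-test auth_mode   # prints "api-key"
unset STABLY_API_KEY STABLY_PROJECT_ID
auth_mode                                                      # prints "oauth"
```

Note that both variables must be set for API-key mode to apply; unsetting either one switches the CLI back to browser login.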

Cheap local/CI agent testing (Ollama)

You can run the agent against Ollama via its Anthropic-compatible API (cheap, local, and deterministic enough for smoke tests). This is dev-only behavior (it is ignored in NODE_ENV=production builds).

  • STABLY_BYPASS_AI_PROXY=1 (dev-only; ensures we don't override ANTHROPIC_BASE_URL)
  • STABLY_USE_OLLAMA=1 (dev-only convenience; ignored when NODE_ENV=production)
  • STABLY_AGENT_MODEL=qwen2.5-coder:0.5b (fastest/cheapest suggested default)
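The dev-only gating described above can be sketched like this (illustrative; the helper name is made up, and the real check is inside the CLI):

```shell
# Sketch of the documented behavior: STABLY_USE_OLLAMA is honored only outside
# production builds. (Illustrative helper, not the CLI's source.)
resolve_backend() {
  if [ "$NODE_ENV" = "production" ]; then
    echo "default"   # dev-only flags are ignored in production
  elif [ "$STABLY_USE_OLLAMA" = "1" ]; then
    echo "ollama"
  else
    echo "default"
  fi
}

NODE_ENV=production STABLY_USE_OLLAMA=1 resolve_backend   # prints "default"
NODE_ENV=development STABLY_USE_OLLAMA=1 resolve_backend  # prints "ollama"
```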

If you want a convenience helper that starts Ollama (if needed), pulls the model, and prints the exact env vars to use:

pnpm ollama:dev-env

Notes:

  • On Linux, this script will try to install Ollama automatically (requires sudo or root).
  • On macOS, it will try Homebrew if available (brew install ollama / brew install --cask ollama); otherwise it will prompt you to install Ollama manually.
  1. Start Ollama and pull the tiny model:

ollama serve
ollama pull qwen2.5-coder:0.5b

  2. Run the CLI against Ollama (Anthropic-compatible API):

STABLY_BYPASS_AI_PROXY=1 STABLY_USE_OLLAMA=1 STABLY_AGENT_MODEL=qwen2.5-coder:0.5b pnpm dev