
High-performance WebGPU particle engine with compute shaders, GPU instancing, GLB model support, and skeletal animations

Package Exports

  • hz-particles
  • hz-particles/r3f

hz-particles

A high-performance WebGPU particle engine for the web.

Features

  • WebGPU Compute Shaders — GPU-accelerated particle physics simulation
  • GPU Instancing — Efficient rendering of thousands of particles
  • GLB Model Support — Use 3D models as particle shapes with automatic texture extraction
  • Skeletal Animations — Animated GLB models with full animation control
  • Scene Serialization — Save/load particle configurations as JSON
  • Multiple Emitter Shapes — Sphere, cube, cylinder, circle, square emission patterns
  • Real-time Physics — Gravity, attractors, drag, velocity, and lifetime management
  • 3D Object Support — Static 3D objects alongside particle systems

Requirements

WebGPU-compatible browser required:

  • Chrome/Edge 113+ (stable)
  • Firefox Nightly (experimental)
  • Safari Technology Preview (experimental)

Check browser support: caniuse.com/webgpu
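Because WebGPU support is still uneven across browsers, it is worth gating initialization behind a feature check and showing a fallback message otherwise. A minimal sketch (the function takes a navigator-like object as a parameter purely so the check is easy to unit-test; in the browser you would pass the global `navigator`):

```javascript
// Returns true when the given navigator-like object exposes the WebGPU API.
// Accepting the object as a parameter (instead of reading the global
// `navigator` directly) keeps the check easy to unit-test.
function supportsWebGPU(nav) {
  return Boolean(nav && nav.gpu && typeof nav.gpu.requestAdapter === 'function');
}

// Typical usage in the browser:
// if (!supportsWebGPU(navigator)) {
//   showFallbackMessage('This demo requires a WebGPU-capable browser.');
// }
```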

Installation

npm install hz-particles

Quick Start

Option 1: Using the built-in WebGPU helper

import { initWebGPU, ParticleSystemManager } from 'hz-particles';

// 1. Setup WebGPU context with the convenience helper
const canvas = document.getElementById('webgpu-canvas');
canvas.width = 800;
canvas.height = 600;
const { device, context, format } = await initWebGPU(canvas);

// 2. Create particle system manager
const manager = new ParticleSystemManager(device);

// 3. Create a particle system
const systemId = manager.createParticleSystem({
  maxParticles: 10000,
  particleCount: 1000,
});

// 4. Get active system and initialize
const system = manager.getActiveSystem();
await system.initComputePipeline(device);

// Optional: Load a texture
const response = await fetch('particle-texture.png');
const blob = await response.blob();
const imageBitmap = await createImageBitmap(blob);
await system.setTexture(imageBitmap);

// 5. Render loop
let lastTime = performance.now();

function render() {
  const currentTime = performance.now();
  const deltaTime = (currentTime - lastTime) / 1000;
  lastTime = currentTime;

  // Update physics
  manager.updateAllSystems(deltaTime);

  // Render
  const commandEncoder = device.createCommandEncoder();
  const renderPass = commandEncoder.beginRenderPass({
    colorAttachments: [{
      view: context.getCurrentTexture().createView(),
      clearValue: { r: 0, g: 0, b: 0, a: 1 },
      loadOp: 'clear',
      storeOp: 'store',
    }],
  });

  system.render(renderPass);
  renderPass.end();
  device.queue.submit([commandEncoder.finish()]);

  requestAnimationFrame(render);
}

render();

Option 2: Using an existing WebGPU setup

If you already have a WebGPU device and context (e.g., from an existing project or framework), you can skip initWebGPU() and use the library directly:

import { ParticleSystemManager, createRenderTextures, createDepthTexture } from 'hz-particles';

// Assume you already have these from your existing WebGPU setup
const device = yourExistingDevice;
const context = yourExistingContext;
const format = navigator.gpu.getPreferredCanvasFormat();
const canvas = yourExistingCanvas;

// Create particle system manager with your device
const manager = new ParticleSystemManager(device);

// Create a particle system
const systemId = manager.createParticleSystem({
  maxParticles: 10000,
  particleCount: 1000,
});

// Initialize and use as shown in Option 1
const system = manager.getActiveSystem();
await system.initComputePipeline(device);

// Your render loop...

This approach is ideal for integrating into existing WebGPU applications or frameworks, or when you need custom WebGPU initialization.

React Three Fiber Integration

While hz-particles uses native WebGPU (not Three.js internally), you can integrate it into React Three Fiber applications:

import { useEffect, useRef } from 'react';
import { Canvas } from '@react-three/fiber';
import { initWebGPU, ParticleSystemManager } from 'hz-particles';

function ParticleLayer() {
  const canvasRef = useRef();
  const engineRef = useRef();

  useEffect(() => {
    let animationId;

    async function init() {
      // Create dedicated WebGPU canvas
      const canvas = canvasRef.current;
      canvas.width = window.innerWidth;
      canvas.height = window.innerHeight;
      const { device, context, format } = await initWebGPU(canvas);

      // Setup particle system
      const manager = new ParticleSystemManager(device);
      manager.createParticleSystem({ particleCount: 1000 });
      const system = manager.getActiveSystem();
      await system.initComputePipeline(device);

      engineRef.current = { device, context, manager, system };

      // Render loop
      let lastTime = performance.now();
      function render() {
        const { device, context, manager, system } = engineRef.current;
        const currentTime = performance.now();
        const deltaTime = (currentTime - lastTime) / 1000;
        lastTime = currentTime;

        manager.updateAllSystems(deltaTime);

        const commandEncoder = device.createCommandEncoder();
        const renderPass = commandEncoder.beginRenderPass({
          colorAttachments: [{
            view: context.getCurrentTexture().createView(),
            clearValue: { r: 0, g: 0, b: 0, a: 0 }, // Transparent
            loadOp: 'clear',
            storeOp: 'store',
          }],
        });
        system.render(renderPass);
        renderPass.end();
        device.queue.submit([commandEncoder.finish()]);

        animationId = requestAnimationFrame(render);
      }
      render();
    }

    init();

    return () => {
      if (animationId) cancelAnimationFrame(animationId);
    };
  }, []);

  return (
    <canvas
      ref={canvasRef}
      id="webgpu-canvas"
      style={{
        position: 'absolute',
        top: 0,
        left: 0,
        width: '100%',
        height: '100%',
        pointerEvents: 'none',
      }}
    />
  );
}

// Usage in App
function App() {
  return (
    <div style={{ position: 'relative', width: '100vw', height: '100vh' }}>
      {/* R3F scene */}
      <Canvas>
        <ambientLight />
        <mesh>
          <boxGeometry />
          <meshStandardMaterial />
        </mesh>
      </Canvas>

      {/* WebGPU particles overlay */}
      <ParticleLayer />
    </div>
  );
}

API Reference

initWebGPU(canvas)

Convenience helper to initialize WebGPU context and device. Library consumers with their own WebGPU setup do NOT need to use this function — you can pass your existing device, context, format, and canvas directly to the library's components.

async function initWebGPU(canvas: HTMLCanvasElement): Promise<{
  device: GPUDevice,
  context: GPUCanvasContext,
  format: GPUTextureFormat,
  canvas: HTMLCanvasElement
}>

Parameters:

  • canvas (required) — Canvas element for WebGPU rendering. You must size the canvas before calling this function.

Returns: Object with WebGPU device, canvas context, texture format, and canvas element

Throws:

  • Error if canvas is not provided
  • Error if WebGPU is not supported

Note: This function does NOT handle canvas sizing. You should set canvas.width and canvas.height before calling initWebGPU().
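For readers unfamiliar with WebGPU setup, the helper's behavior can be approximated as follows. This is an illustrative sketch of the standard WebGPU initialization sequence, not the library's actual source; the injectable `gpu` parameter (defaulting to `navigator.gpu`) is an addition made here for testability:

```javascript
// Illustrative sketch of what a WebGPU init helper typically does.
// Not the library's actual source. The `gpu` parameter (defaulting to
// navigator.gpu in the browser) is injected so the logic can be
// exercised without a real GPU.
async function initWebGPUSketch(canvas, gpu = globalThis.navigator?.gpu) {
  if (!canvas) throw new Error('A canvas element is required');
  if (!gpu) throw new Error('WebGPU is not supported in this environment');

  const adapter = await gpu.requestAdapter();
  if (!adapter) throw new Error('No suitable GPU adapter found');

  const device = await adapter.requestDevice();
  const context = canvas.getContext('webgpu');
  const format = gpu.getPreferredCanvasFormat();

  // Associate the device with the canvas; 'premultiplied' alpha allows
  // compositing the canvas over other page content.
  context.configure({ device, format, alphaMode: 'premultiplied' });

  return { device, context, format, canvas };
}
```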


ParticleSystem

Core particle system class managing particle simulation and rendering.

class ParticleSystem {
  constructor(device: GPUDevice, config?: object)
}

Constructor Parameters:

  • device — GPUDevice instance
  • config (optional) — Configuration object:
    • maxParticles (number, default 10000) — Maximum particle buffer size
    • particleCount (number, default 100) — Initial active particle count

Key Methods:

  • async initComputePipeline(device: GPUDevice) — Initialize GPU compute and render pipelines. Must be called before rendering.

  • async setTexture(imageBitmap: ImageBitmap) — Set particle texture from ImageBitmap.

  • async setGLBModel(arrayBuffer: ArrayBuffer) — Use GLB model geometry as particle shape. Automatically extracts textures.

  • updateParticles(deltaTime: number) — Execute physics simulation step on GPU.

  • spawnParticles() — Emit new particles according to emitter configuration.

  • setGravity(value: number) — Set gravity strength (default 9.8).

  • setAttractor(strength: number, position: [number, number, number]) — Set attractor point with strength and 3D position.

  • render(renderPass: GPURenderPassEncoder) — Render particles to the current render pass.


ParticleSystemManager

Manages multiple particle systems within a single scene.

class ParticleSystemManager {
  constructor(device: GPUDevice)
}

Key Methods:

  • createParticleSystem(config?: object): number — Create new particle system. Returns system ID.

  • getActiveSystem(): ParticleSystem | null — Get currently active particle system instance.

  • getActiveConfig(): object | null — Get active system configuration.

  • setActiveSystem(index: number): boolean — Switch active system by index. Returns success status.

  • removeSystem(index: number): boolean — Remove system by index. Returns success status.

  • updateAllSystems(deltaTime: number) — Update physics for all systems.

  • getSystemsList(): Array<{name: string, id: number, index: number, isActive: boolean}> — Get list of all systems with metadata.

  • duplicateActiveSystem(): number — Clone active system. Returns new system ID.

  • async replaceSystems(sceneData: object): boolean — Load scene from serialized data. Returns success status.


parseGLB(arrayBuffer)

Parse GLB binary format and extract geometry data.

async function parseGLB(arrayBuffer: ArrayBuffer): Promise<{
  positions: Float32Array,
  normals: Float32Array,
  indices: Uint16Array | Uint32Array,
  texCoords: Float32Array | null,
  vertexCount: number,
  indexCount: number,
  animationData: object | null,
  hasBaseColorTexture: boolean
}>

Parameters:

  • arrayBuffer — GLB file as ArrayBuffer

Returns: Parsed geometry and animation data
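Before handing a file to parseGLB, it can be useful to validate the 12-byte GLB container header. The sketch below follows the public glTF 2.0 Binary (GLB) specification and is not part of this library's API:

```javascript
// Validates the 12-byte GLB container header per the glTF 2.0 spec:
// bytes 0-3: magic 0x46546C67 ("glTF"), bytes 4-7: version (2),
// bytes 8-11: total file length in bytes. All fields are little-endian.
function readGLBHeader(arrayBuffer) {
  if (arrayBuffer.byteLength < 12) {
    throw new Error('Buffer too small to be a GLB file');
  }
  const view = new DataView(arrayBuffer);
  const magic = view.getUint32(0, true);
  if (magic !== 0x46546c67) {
    throw new Error('Not a GLB file (bad magic number)');
  }
  return {
    version: view.getUint32(4, true),
    length: view.getUint32(8, true),
  };
}
```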


GLBAnimator

Handles skeletal animation playback for animated GLB models.

class GLBAnimator {
  constructor(animationData: object)

  currentTime: number
  playing: boolean
  speed: number     // default 1.0
  loop: boolean     // default true
}

Key Methods:

  • setRestPose(positions: Float32Array, normals: Float32Array) — Set T-pose/bind pose for animation.

  • update(deltaTime: number): { positions: Float32Array, normals: Float32Array, changed: boolean } — Advance animation by deltaTime. Returns deformed geometry and change status.

  • setAnimation(index: number) — Switch to animation clip by index.

  • getAnimationNames(): string[] — Get list of available animation clip names.


Secondary Exports

The following modules are also exported for advanced usage:

  • ParticleEmitter — Emission shape configuration (sphere, cube, cylinder, circle, square)
  • ParticlePhysics — Physics simulation parameter management
  • ParticleTextureManager — Texture loading utilities
  • Objects3DManager — 3D object scene management
  • saveScene(manager) — Export scene as JSON file download
  • loadScene(event, manager) — Load scene from a file input change event and apply it to the given manager
  • extractGLBTexture(arrayBuffer) — Extract base color texture from GLB file
  • Shader exports — WGSL shader code for compute and rendering pipelines
  • Geometry helpers — Primitive shape generators (cube, sphere, etc.)
  • Render pipeline creators — Low-level WebGPU pipeline construction

GLB Models & Animation

Load and animate 3D models:

import { parseGLB, GLBAnimator } from 'hz-particles';

// Load GLB file
const response = await fetch('model.glb');
const arrayBuffer = await response.arrayBuffer();
const glbData = await parseGLB(arrayBuffer);

// Use as particle shape
await system.setGLBModel(arrayBuffer);

// Setup animation (if model has animations)
if (glbData.animationData) {
  const animator = new GLBAnimator(glbData.animationData);
  animator.setRestPose(glbData.positions, glbData.normals);
  animator.playing = true;
  animator.loop = true;
  animator.speed = 1.0;

  // In render loop:
  const { positions, normals, changed } = animator.update(deltaTime);
  if (changed) {
    // Update particle system with new geometry
    // (See full API documentation for geometry update methods)
  }
}

Scene Save/Load

Serialize and restore particle configurations:

import { saveScene, loadScene } from 'hz-particles';

// Save current scene
const button = document.getElementById('save-button');
button.addEventListener('click', () => {
  saveScene(manager);  // Downloads scene.json
});

// Load scene from file input
const input = document.getElementById('file-input');
input.addEventListener('change', async (event) => {
  const success = await loadScene(event, manager);
  if (success) {
    console.log('Scene loaded successfully');
  }
});

Online Editor

Try the interactive particle editor at http://localhost:8110/editor when running the Docker container (run docker-compose up from the repository root).

The editor provides a visual interface for:

  • Real-time particle system configuration
  • Emitter shape selection and tuning
  • GLB model import and animation control
  • Scene preset library
  • Export/import scene JSON

TypeScript

Type declarations are not yet included in this package. For TypeScript projects, you can suppress the import error:

// @ts-ignore
import { ParticleSystem, initWebGPU } from 'hz-particles';

Community-contributed type definitions are welcome via PR.
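Alternatively, until official types land, you can add a minimal ambient declaration file to your project (e.g. hz-particles.d.ts) instead of suppressing the error on every import. The signatures below are inferred from this README and may not match the actual implementation exactly:

```typescript
// hz-particles.d.ts — minimal ambient declarations inferred from this
// README; adjust if they drift from the actual library surface.
declare module 'hz-particles' {
  export function initWebGPU(canvas: HTMLCanvasElement): Promise<{
    device: GPUDevice;
    context: GPUCanvasContext;
    format: GPUTextureFormat;
    canvas: HTMLCanvasElement;
  }>;

  export class ParticleSystem {
    constructor(device: GPUDevice, config?: object);
    initComputePipeline(device: GPUDevice): Promise<void>;
    setTexture(imageBitmap: ImageBitmap): Promise<void>;
    render(renderPass: GPURenderPassEncoder): void;
  }

  export class ParticleSystemManager {
    constructor(device: GPUDevice);
    createParticleSystem(config?: object): number;
    getActiveSystem(): ParticleSystem | null;
    updateAllSystems(deltaTime: number): void;
  }
}
```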

Contributing

Contributions are welcome! Please follow these steps:

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

MIT License - see LICENSE file for details.

Copyright (c) 2025 HZ