homography (JSPM)

  • License: MIT

Perform Affine, Projective or Piecewise Affine transformations over any Image or HTMLElement from only a set of reference points. High-Performance and easy-to-use.

Package Exports

  • homography

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (homography) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.

Readme

Homography.js

Homography.js is a lightweight, High-Performance library for implementing homographies in JavaScript or Node.js. It is designed to be easy to use (even for developers who are not familiar with Computer Vision) and able to run in real-time applications (even on low-spec devices such as budget smartphones). It allows you to perform Affine, Projective or Piecewise Affine warpings over any Image or HTMLElement in your application by only setting a small set of reference points. Additionally, Image warpings can be made persistent (independent of any CSS property), so they can be easily drawn in a canvas, mixed or downloaded. Homography.js is built in a way that frees the user from all the pain-in-the-ass details of homography operations, such as thinking about output dimensions, input coordinate ranges, dealing with unexpected shifts, pads, crops or unfilled pixels in the output image, or even knowing what a Transform Matrix is.

Features

  • Apply different warpings to any Image or HTMLElement by just setting two sets of reference points.
  • Perform Affine, Projective or Piecewise Affine transforms, or just set Auto and let the library decide which transform to apply based on the reference points you provide (see the sketch right after this list).
  • Simplify how you deal with canvas drawings or subsequent Computer Vision problems by making your Image transforms persistent and independent of any CSS property.
  • Forget all the pain-in-the-ass details of homography operations, even if you only have a fuzzy idea of what a homography is.
  • Avoid warping delays in real-time applications thanks to a design focused on High-Performance.
  • Support for running in the backend with Node.js.
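
A minimal sketch of that "Auto" behaviour, consistent with the Usage examples below, where the default "auto" transform is resolved as an Affine warp for three reference point pairs and a Projective one is automatically detected for four corner pairs; the exact selection rule is an assumption here:

    // "auto" is the default transform type.
    const homography = new Homography();  // equivalent to new Homography("auto")

    // Three point pairs: presumably resolved as an Affine transform.
    homography.setReferencePoints([[0, 0], [0, 1], [1, 0]],
                                  [[0, 0], [1/2, 1], [1, 1/8]]);

    // Four (or more) point pairs: presumably detected as Projective
    // (or Piecewise Affine for denser reference-point grids).
    // homography.setReferencePoints(fourCornerSrcPoints, fourCornerDstPoints);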

Installation
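
The Node.js example below imports the package as 'homography-js', so for Node.js it can presumably be installed from npm:

    npm install homography-js

In the browser, the Homography class can be loaded as an ES module (for example via the JSPM CDN that serves this package).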

Usage

In the Browser

Perform a basic Piecewise Affine Transform from four source points.

    // Select the image you want to warp
    const image = document.getElementById("myImage");
    
    // Define the reference points. In this case using normalized coordinates (from 0.0 to 1.0).
    const srcPoints = [[0, 0], [0, 1], [1, 0], [1, 1]];
    const dstPoints = [[1/5, 1/5], [0, 1/2], [1, 0], [6/8, 6/8]];
    
    // Create a Homography object for a "piecewiseaffine" transform (it could be reused later)
    const homography = new Homography("piecewiseaffine");
    // Set the reference points
    homography.setReferencePoints(srcPoints, dstPoints);
    // Warp your image
    const resultImage = homography.warp(image);
    ...
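
In the browser, warp() returns a persistent ImageData object (see the Performance notes below), so it can be painted straight onto a canvas. A minimal sketch, assuming a <canvas id="myCanvas"> large enough for the warped output:

    // Draw the warped ImageData onto a canvas (hypothetical canvas id).
    const canvas = document.getElementById("myCanvas");
    const ctx = canvas.getContext("2d");
    // ImageData can be drawn directly, at negligible cost, through putImageData.
    ctx.putImageData(resultImage, 0, 0);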

Perform a complex Piecewise Affine Transform from a large set of pointsInY * pointsInX reference points.

    ...
    // Define a set of reference points that match a sinusoidal form.
    // In this case in image coordinates (x: from 0 to width, y: from 0 to height) for convenience.
    let srcPoints = [], dstPoints = [];
    for (let y = 0; y <= h; y += h/pointsInY){
        for (let x = 0; x <= w; x += w/pointsInX){
            srcPoints.push([x, y]); // Add (x, y) as a source point
            dstPoints.push([x, amplitude+y+Math.sin((x*n)/Math.PI)*amplitude]); // Apply a sine offset to y
        }
    }
    // Set the reference points (reuse the previous Homography object)
    homography.setReferencePoints(srcPoints, dstPoints);
    // Warp your image. As no image is given, it will reuse the one from the previous example.
    const resultImage = homography.warp();
    ...
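
The snippet above assumes a few variables that it does not define. For reference, hypothetical values (these would be declared before the loop):

    // Hypothetical values for the variables assumed by the snippet above.
    const w = 400, h = 400;                // input image width and height
    const pointsInX = 10, pointsInY = 10;  // density of the reference-point grid
    const amplitude = 20;                  // sine amplitude, in pixels
    const n = 2;                           // frequency factor of the sine wave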
    

Perform a simple Affine Transform and apply it to an HTMLElement.

    ...
    // Set the reference points from which to estimate the transform
    const srcPoints = [[0, 0], [0, 1], [1, 0]];
    const dstPoints = [[0, 0], [1/2, 1], [1, 1/8]];

    // Don't specify the type of transform to apply; let the library decide it by itself.
    const homography = new Homography(); // Default transform value is "auto".
    // Apply the transform over an HTMLElement from the DOM.
    homography.transformHTMLElement(document.getElementById("inputText"), srcPoints, dstPoints);
    ...
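
If the reference points are already set (as in the call above), the same estimated transform can presumably be reused on other elements too; a minimal sketch under that assumption, with a hypothetical element id:

    // Reuse the already-estimated transform on another element (hypothetical id).
    homography.transformHTMLElement(document.getElementById("anotherElement"));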

Calculate 250 different Projective Transforms, apply them over the same input Image and draw them on a canvas.

    // (This snippet assumes an async context for the awaits below, and that w and h hold the
    //  input image dimensions while inputImg is the HTMLImageElement to warp.)
    const ctx = document.getElementById("exampleCanvas").getContext("2d");

    // Build the initial reference points (in this case, in image coordinates just for convenience)
    const srcPoints = [[0, 0], [0, h], [w, 0], [w, h]];
    let dstPoints = [[0, 0], [0, h], [w, 0], [w, h]];
    // Create the homography object (it is not necessary to set the transform to "projective", as it will be automatically detected)
    const homography = new Homography();
    // Set the static parameters of the whole transform sequence (it will improve the performance of subsequent warpings)
    homography.setSourcePoints(srcPoints);
    homography.setImage(inputImg);

    // Set the parameters for building the future dstPoints at each frame (5 movements of 50 frames each)
    const framesPerMovement = 50;
    const movements = [[[0, h/5], [0, -h/5], [0, 0], [0, 0]],
                       [[w, 0], [w, 0], [-w, 0], [-w, 0]],
                       [[0, -h/5], [0, h/5], [0, h/5], [0, -h/5]],
                       [[-w, 0], [-w, 0], [w, 0], [w, 0]],
                       [[0, 0], [0, 0], [0, -h/5], [0, h/5]]];

    for(let movement = 0; movement<movements.length; movement++){
        for (let step = 0; step<framesPerMovement; step++){
            // Create the new dstPoints (in Computer Vision applications these points will usually come from webcam detections)
            for (let point = 0; point<srcPoints.length; point++){
                dstPoints[point][0] += movements[movement][point][0]/framesPerMovement;
                dstPoints[point][1] += movements[movement][point][1]/framesPerMovement;
            }

            // Update the destiny points and calculate the new warping.
            homography.setDestinyPoints(dstPoints);
            const img = homography.warp(); // warp() with no parameters reuses the previously set image
            // Clear the canvas and draw the new image (using putImageData instead of drawImage for performance reasons)
            ctx.clearRect(0, 0, w, h);
            ctx.putImageData(img, Math.min(dstPoints[0][0], dstPoints[2][0]), Math.min(dstPoints[0][1], dstPoints[2][1]));
            await new Promise(resolve => setTimeout(resolve, 0.1)); // Just a trick for forcing the canvas to refresh
        }
    }

* Just pay attention to the use of setSourcePoints(srcPoints), setImage(inputImg), setDestinyPoints(dstPoints) and warp(). The rest of the code just generates a coherent sequence of destiny points and draws the results.

With Node.js

    // Import the Homography class and the loadImage function
    import { Homography, loadImage } from 'homography-js';
    // Import fs just for writing the warped image to disk
    import fs from 'fs';

    // Define the source and destiny points
    const sourcePoints = [[0, 0], [0, 1], [1, 0], [1, 1]];
    const dstPoints = [[1/10, 1/2], [0, 1], [9/10, 1/2], [1, 1]];
    // Create the homography object and set the reference points
    const homography = new Homography();
    homography.setReferencePoints(sourcePoints, dstPoints);
    // Here, in the backend, we can use `await loadImage(<img_path>)` instead of an HTMLImageElement
    homography.setImage(await loadImage('./testImg.png'));
    // And when warping, we get a pngjs image (from the 'pngjs2' package) instead of an ImageData
    const pngImage = homography.warp();
    // Just for visualizing the result, we write it to a file.
    pngImage.pipe(fs.createWriteStream("transformedImage.png"));

Performance

Benchmark results for every kind of transformation.
  • The Image Data Warping section indicates the time for calculating the transformation matrix between a pair of Source and Destiny reference points and applying this transform over an image of size NxN. It generates a persistent ImageData object that can be directly drawn on any Canvas at a negligible computational cost, through context.putImageData(imgData, x, y).
  • 400x400 ↦ NxN indicates the size of the input image and the size of the expected output image. The CSS Transform Calculation section does not include this information, since these sizes do not affect its performance.
  • The First Frame column indicates the time for calculating a single image warping, while the Rest of Frames column indicates the time for calculating each of multiple different warpings of the same input image. Frame Rate (1/Rest of Frames) indicates the number of frames that can be calculated per second (a minimal timing sketch follows this list).
  • You can test the exact performance on your target device just by executing benchmark.html. Take into account that this execution can take some minutes, since it executes 2,000 frames for each single warping experiment, and 200,000 frames for each CSS experiment.
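
A minimal sketch of how the First Frame / Rest of Frames figures could be reproduced by hand (this is not benchmark.html; it assumes the homography object and dstPoints from the Usage examples above are already set up with an image):

    // Time the first warp separately, then average many warpings of the same image
    // with slightly different destiny points on each frame.
    let t0 = performance.now();
    homography.warp();
    const firstFrame = performance.now() - t0;

    const frames = 2000;               // 2,000 frames per warping experiment, as stated above
    t0 = performance.now();
    for (let i = 0; i < frames; i++) {
        dstPoints[0][1] += 1 / frames; // hypothetical tiny per-frame change
        homography.setDestinyPoints(dstPoints);
        homography.warp();
    }
    const restOfFrames = (performance.now() - t0) / frames;
    console.log(`First Frame: ${firstFrame.toFixed(1)} ms, ` +
                `Rest of Frames: ${restOfFrames.toFixed(3)} ms ` +
                `(~${Math.round(1000 / restOfFrames)} fps)`);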

Performance tests on an Average Desktop PC.

Intel Core i5-7500 Quad-Core. Chrome 92.0.4515.107. Windows 10.
Image Data Warping

  400x400 ↦ 200x200
  Transform                       First Frame   Rest of Frames   Frame Rate
  Affine                          5 ms          0.7 ms           1,439 fps
  Projective                      6 ms          1.9 ms           527.4 fps
  Piecewise Aff. (2 Triangles)    7 ms          1.1 ms           892.9 fps
  Piecewise Aff. (360 Tri.)       26 ms         2.1 ms           487 fps
  Piecewise Aff. (~23,000 Tri.)   257 ms        24.3 ms          41.2 fps

  400x400 ↦ 400x400
  Transform                       First Frame   Rest of Frames   Frame Rate
  Affine                          14 ms         2.7 ms           366.7 fps
  Projective                      21 ms         7.2 ms           139.7 fps
  Piecewise Aff. (2 Triangles)    19 ms         4.4 ms           227.9 fps
  Piecewise Aff. (360 Tri.)       21 ms         4.6 ms           216.1 fps
  Piecewise Aff. (~23,000 Tri.)   228 ms        11.5 ms          87.1 fps

  400x400 ↦ 800x800
  Transform                       First Frame   Rest of Frames   Frame Rate
  Affine                          13 ms         10.8 ms          92.6 fps
  Projective                      30 ms         27.5 ms          36.3 fps
  Piecewise Aff. (2 Triangles)    40 ms         16.5 ms          60.6 fps
  Piecewise Aff. (360 Tri.)       41 ms         22.4 ms          44.6 fps
  Piecewise Aff. (~23,000 Tri.)   289 ms        62 ms            16.1 fps

CSS Transform Calculation

  Transform     First Frame   Rest of Frames   Frame Rate
  Affine        4 ms          0.00014 ms       1,696,136.44 fps
  Projective    4 ms          0.016 ms         61,650.38 fps

Performance tests on a budget smartphone (a fairly beaten-up one).

Xiaomi Redmi Note 5. Chrome 92.0.4515.115. Android 8.1.0
Image Data Warping

  400x400 ↦ 200x200
  Transform                       First Frame   Rest of Frames   Frame Rate
  Affine                          25 ms         4.5 ms           221.5 fps
  Projective                      38 ms         15.5 ms          64.4 fps
  Piecewise Aff. (2 Triangles)    35 ms         8.8 ms           113.9 fps
  Piecewise Aff. (360 Tri.)       151 ms        14.3 ms          70 fps
  Piecewise Aff. (~23,000 Tri.)   1.16 s        162 ms           6.15 fps

  400x400 ↦ 400x400
  Transform                       First Frame   Rest of Frames   Frame Rate
  Affine                          84 ms         16.9 ms          59.11 fps
  Projective                      150 ms        56.8 ms          17.6 fps
  Piecewise Aff. (2 Triangles)    316 ms        31.7 ms          31.6 fps
  Piecewise Aff. (360 Tri.)       138 ms        30.2 ms          33 fps
  Piecewise Aff. (~23,000 Tri.)   1.16 s        75 ms            13.3 fps

  400x400 ↦ 800x800
  Transform                       First Frame   Rest of Frames   Frame Rate
  Affine                          127 ms        64.7 ms          15.46 fps
  Projective                      232 ms        216 ms           4.62 fps
  Piecewise Aff. (2 Triangles)    138 ms        118 ms           8.5 fps
  Piecewise Aff. (360 Tri.)       274 ms        149 ms           6.7 fps
  Piecewise Aff. (~23,000 Tri.)   1.47 s        435 ms           2.3 fps

CSS Transform Calculation

  Transform     First Frame   Rest of Frames   Frame Rate
  Affine        21 ms         0.0104 ms        96,200.10 fps
  Projective    22 ms         0.025 ms         40,536.71 fps