Gaze-detection
Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences!
Demo
Visit https://gaze-keyboard.netlify.app/ (Works well on mobile too!!) 😃
Inspired by the Android application "Look to speak".
Uses TensorFlow.js's face landmark detection model.
Detection
This tool detects when the user looks right, left, up, or straight ahead.
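Predictions are returned as one of four strings: 'RIGHT', 'LEFT', 'TOP', or 'STRAIGHT' (see the code sample below). A minimal sketch of dispatching on them, with hypothetical handler functions:

const handleGaze = (direction) => {
  switch (direction) {
    case "LEFT":
      moveSelectionLeft(); // hypothetical handler
      break;
    case "RIGHT":
      moveSelectionRight(); // hypothetical handler
      break;
    case "TOP":
      confirmSelection(); // hypothetical handler
      break;
    case "STRAIGHT":
      // user is looking straight at the screen; no action
      break;
  }
};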
How to use
Install
As a module:
npm install gaze-detection --save
Code sample
Start by importing it:
import gaze from "gaze-detection";
Load the machine learning model:
await gaze.loadModel();
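Loading the model typically downloads weights over the network, so it can take a moment and may fail offline. A minimal sketch of wrapping the call, assuming it runs inside an async function and that a #status element exists in your page:

const status = document.querySelector("#status");
status.textContent = "Loading model...";
try {
  await gaze.loadModel();
  status.textContent = "Model ready";
} catch (error) {
  status.textContent = "Model failed to load";
  console.error(error);
}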
Then, set up the camera feed needed for the detection. The setUpCamera method needs a video HTML element and, optionally, a camera device ID if you want to use a camera other than the default webcam.
const videoElement = document.querySelector("video");

const init = async () => {
  // Using the default webcam
  await gaze.setUpCamera(videoElement);

  // Or, using another camera input device
  const mediaDevices = await navigator.mediaDevices.enumerateDevices();
  const camera = mediaDevices.find(
    (device) =>
      device.kind === "videoinput" &&
      device.label.includes(/* The label from the list of available devices */)
  );
  await gaze.setUpCamera(videoElement, camera.deviceId);
};
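Note that browsers only expose the camera in a secure context (HTTPS or localhost) and after the user grants permission, so it is common to start from a user action. A minimal sketch, assuming a #start button exists in your page:

const startButton = document.querySelector("#start");
startButton.addEventListener("click", async () => {
  try {
    await init();
  } catch (error) {
    // Thrown if the user denies camera access or no camera is found
    console.error("Camera setup failed:", error);
  }
});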
Run the predictions:
let raf;

const predict = async () => {
  const gazePrediction = await gaze.getGazePrediction();
  console.log("Gaze direction: ", gazePrediction); // will return 'RIGHT', 'LEFT', 'STRAIGHT' or 'TOP'
  if (gazePrediction === "RIGHT") {
    // do something when the user looks to the right
  }
  raf = requestAnimationFrame(predict);
};

predict();
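Because requestAnimationFrame runs up to roughly 60 times per second, the same direction is reported many times in a row. If you only want to react when the gaze changes, a minimal sketch (onGazeChange is a hypothetical callback):

let previousDirection;

const predictOnChange = async () => {
  const direction = await gaze.getGazePrediction();
  if (direction !== previousDirection) {
    previousDirection = direction;
    onGazeChange(direction); // hypothetical callback
  }
  raf = requestAnimationFrame(predictOnChange);
};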
Stop the detection:
cancelAnimationFrame(raf);
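Putting it all together, a minimal start/stop sketch (the #start and #stop buttons are hypothetical; the gaze calls are the ones shown above):

import gaze from "gaze-detection";

let raf;

const start = async () => {
  await gaze.loadModel();
  await gaze.setUpCamera(document.querySelector("video"));
  const predict = async () => {
    const direction = await gaze.getGazePrediction();
    if (direction === "RIGHT") {
      // do something when the user looks to the right
    }
    raf = requestAnimationFrame(predict);
  };
  predict();
};

const stop = () => cancelAnimationFrame(raf);

document.querySelector("#start").addEventListener("click", start);
document.querySelector("#stop").addEventListener("click", stop);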