Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences!
Visit https://gaze-keyboard.netlify.app/ (works well on mobile too!)
Inspired by the Android application "Look to Speak".
Uses TensorFlow.js's face landmark detection model.
This tool detects when the user looks right, left, up, or straight ahead.
As a module:
```bash
npm install gaze-detection --save
```
Start by importing it:
```js
import gaze from "gaze-detection";
```
Load the machine learning model:
```js
await gaze.loadModel();
```
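Loading the model is asynchronous and only needs to happen once, before the camera setup and prediction steps below. A minimal sketch wrapping the call with error handling (`loadGazeModel` is a placeholder name, not part of the library):

```js
// Sketch: load the model once at startup.
// `loadGazeModel` is a placeholder name.
const loadGazeModel = async () => {
  try {
    await gaze.loadModel();
    console.log("Gaze model loaded");
  } catch (error) {
    console.error("Failed to load the gaze model:", error);
  }
};
```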
Then, set up the camera feed needed for the detection. The `setUpCamera` method needs a video HTML element and, optionally, a camera device ID if you want to use a camera other than the default webcam.
```js
const videoElement = document.querySelector("video");

const init = async () => {
  // Using the default webcam
  await gaze.setUpCamera(videoElement);

  // Or, using another camera input device
  const mediaDevices = await navigator.mediaDevices.enumerateDevices();
  const camera = mediaDevices.find(
    (device) =>
      device.kind === "videoinput" &&
      device.label.includes(/* The label from the list of available devices */)
  );
  await gaze.setUpCamera(videoElement, camera.deviceId);
};
```
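To find the label to match against, you can list the available video input devices first. A small helper sketch using the standard `enumerateDevices` Web API (`listCameras` is a placeholder name):

```js
// Sketch: log the available video input devices so you can pick
// the label to match against above. Note that labels are empty
// until the user has granted camera permission.
const listCameras = async () => {
  const devices = await navigator.mediaDevices.enumerateDevices();
  devices
    .filter((device) => device.kind === "videoinput")
    .forEach((device) => console.log(device.label, device.deviceId));
};
```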
Run the predictions:
```js
let raf;

const predict = async () => {
  const gazePrediction = await gaze.getGazePrediction();
  console.log("Gaze direction: ", gazePrediction); // will return 'RIGHT', 'LEFT', 'STRAIGHT' or 'TOP'
  if (gazePrediction === "RIGHT") {
    // do something when the user looks to the right
  }
  // keep a reference to the frame request so the loop can be cancelled later
  raf = requestAnimationFrame(predict);
};

predict();
```
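For instance, one way to act on every direction is a small dispatch function; `handleGaze` is a placeholder name and the branch actions are illustrative:

```js
// Sketch: map each direction returned by getGazePrediction()
// to an action. `handleGaze` and the actions are placeholders.
const handleGaze = (direction) => {
  switch (direction) {
    case "RIGHT":
      // e.g. move a selection to the right
      break;
    case "LEFT":
      // e.g. move a selection to the left
      break;
    case "TOP":
      // e.g. confirm the current selection
      break;
    case "STRAIGHT":
      // e.g. idle while the user looks straight ahead
      break;
  }
};
```

You could then call `handleGaze(gazePrediction)` inside the `predict` loop above.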
Stop the detection:
```js
cancelAnimationFrame(raf);
```
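Putting the steps together, here is a minimal end-to-end sketch; it assumes the page contains a `<video>` element and uses only the calls shown above:

```js
import gaze from "gaze-detection";

const videoElement = document.querySelector("video");
let raf;

const predict = async () => {
  const gazePrediction = await gaze.getGazePrediction();
  if (gazePrediction === "LEFT") {
    // e.g. move a selection to the left
  }
  raf = requestAnimationFrame(predict);
};

const init = async () => {
  await gaze.loadModel();
  await gaze.setUpCamera(videoElement);
  predict();
};

// To stop the detection later, e.g. from a button handler:
// cancelAnimationFrame(raf);
init();
```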