Novel Biological Prompting Technology: Automatic Directional Control by Eye-Tracking Sensor
Abstract
Biological prompting and human teleoperation have recently attracted considerable research interest. Hands-free, voice-free command and control of vehicles, aircraft, machines, robots, and similar systems would offer users great convenience and enable multi-tasking. In this work, we focus on a new biological prompting function, namely automatic directional control using instantaneous eye-gaze data, and investigate how advanced machine-learning models can be applied to design novel automatic gaze-direction identification techniques. We examine the main difficulty in dealing with eye-tracking data, namely contamination by outliers and variations, and propose transforming the original data from the time domain into the (statistical) density domain, which yields robust density features that mitigate outliers and variations. Based on these new input features, we formulate gaze-direction identification as a multi-class classification problem and design a corresponding random-forest classifier. For identifying five gaze directions (center, up, down, left, and right), our proposed approach achieves up to 78% accuracy, outperforming other existing machine-learning models on the same training and test data.
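To make the pipeline concrete, the following Python sketch illustrates the general idea of mapping raw time-domain gaze samples into density-domain features and feeding them to a random-forest classifier. It is a minimal sketch only: the abstract does not specify the exact density estimator or feature layout, so the kernel density estimate, the 8x8 evaluation grid, the normalized coordinate range, and the helper names `density_features` and `train_classifier` are all illustrative assumptions, not the authors' actual method.

```python
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.ensemble import RandomForestClassifier

def density_features(gaze_xy, grid_size=8):
    """Map a window of raw gaze samples (N x 2, time domain) to a
    fixed-length density-domain feature vector; a density estimate
    damps the effect of outliers and sample-to-sample variation.
    (Assumed KDE-on-a-grid transform, for illustration only.)"""
    kde = gaussian_kde(gaze_xy.T)                 # KDE over (x, y) gaze points
    xs = np.linspace(-1.0, 1.0, grid_size)        # assumed normalized screen coords
    xx, yy = np.meshgrid(xs, xs)
    grid = np.vstack([xx.ravel(), yy.ravel()])
    return kde(grid)                              # density evaluated on the grid

def train_classifier(windows, labels):
    """windows: list of (N_i x 2) gaze windows; labels: 0..4 for the
    five directions (center, up, down, left, right)."""
    X = np.stack([density_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf
```

Because every window is reduced to the same fixed-length density vector regardless of its duration or sampling jitter, a stray outlier sample shifts the feature vector only slightly, which is the robustness property the density-domain transform is meant to provide.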