Driver distraction detection and recognition
Aviation and Technology; Mechanical Engineering
Proceedings of the ASME 2020 International Mechanical Engineering Congress and Exposition. Volume 14: Safety Engineering, Risk, and Reliability Analysis
Statistics show that human error is the leading cause of traffic accidents. Modern vehicles are equipped to protect occupants in the event of a crash, and in recent years advanced vehicles have added driver behavior monitoring systems, many of which have proven effective in preventing accidents. However, these systems do not provide a complete solution and can only detect driver fatigue or driver distraction. This project aims to build an AI model that senses driver distraction, identifies the kind of distraction using a Kinect sensor and a Brio camera, and redirects the driver’s attention back to driving. The system is divided into three sub-segments: arm position (arms up or down, arms right or left), facial expressions (blinking and mouth movement), and head orientation. Each segment provides information for gauging driver distraction based on depth data from the Kinect sensor and color data from the Brio camera, respectively. Testing on a driving simulator was completed with 4 drivers of diverse ethnicity, sex, and age, producing over 240 minutes of recorded material. Since all segments were recorded and processed separately, they can be further developed into different outcomes and implemented in real car systems.
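The abstract describes three independently processed cue streams (arm position, facial expressions, head orientation) that together indicate distraction, but does not specify how they are fused. The sketch below is a hypothetical, minimal rule-based fusion to illustrate the idea; the cue names, thresholds, and labels are assumptions for illustration, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class FrameCues:
    """Hypothetical per-frame outputs of the three sub-segments."""
    arm_off_wheel: bool      # arm-position segment (Kinect depth data)
    eye_closed_ratio: float  # fraction of recent frames with eyes closed (blink segment)
    head_yaw_deg: float      # head orientation relative to the road ahead

def classify_distraction(cues: FrameCues,
                         eye_thresh: float = 0.4,
                         yaw_thresh_deg: float = 30.0) -> str:
    """Illustrative rule-based fusion of the three cues into one label.

    Thresholds are placeholders; a real system would tune them on
    recorded driving data.
    """
    if cues.eye_closed_ratio > eye_thresh:
        return "drowsy"                 # prolonged eye closure dominates
    if abs(cues.head_yaw_deg) > yaw_thresh_deg:
        return "looking-away"           # head turned away from the road
    if cues.arm_off_wheel:
        return "manual-distraction"     # e.g. phone use, reaching
    return "attentive"
```

Because each segment is evaluated separately, the rules can be reordered or replaced by a learned classifier without changing the per-segment processing, which matches the paper's point that the segments can be recombined into different outcomes.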
Kiran Kumar Chinta and Fred Barez. "Driver distraction detection and recognition" Proceedings of the ASME 2020 International Mechanical Engineering Congress and Exposition. Volume 14: Safety Engineering, Risk, and Reliability Analysis (2020). https://doi.org/10.1115/IMECE2020-24474