Cheng Wendong, Fu Rui, Yuan Wei, Liu Zhuofan, Zhang Mingfang, Liu Tong. Driver Attention Distraction Detection and Hierarchical Prewarning Based on Machine Vision[J]. Journal of Computer-Aided Design & Computer Graphics, 2016, 28(8): 1287-1296.

Driver Attention Distraction Detection and Hierarchical Prewarning Based on Machine Vision

  • Machine vision is the main approach to driver attention distraction (DAD) detection. In existing work, however, targets such as the eyes and lips are easily disturbed by lighting changes and occlusion, and current head pose models struggle to meet the robustness and accuracy requirements of DAD detection. To address this, a DAD detection and hierarchical prewarning method based on nostril recognition is proposed. First, the driver's face region is detected by a fusion algorithm that combines Adaboost with an adaptive skin model, after the image is preprocessed with BF-SSR illumination equalization. The nostrils are then located within the face region according to their clustering characteristics of color, area, and roundness. Next, a head pose model for yaw and pitch is built from the nostril coordinates on the imaging plane, and the problem of initializing the model parameters during translational head motion is solved. On this basis, an attention classification model is established with a support vector machine (SVM) trained on head rotation angles and nostril coordinate offsets. Finally, a hierarchical DAD prewarning scheme is designed according to the duration, distribution proportion, and necessity level of visual attention diversion. Experimental results demonstrate that the method is robust to illumination, eyeglasses, and head rotation. The average angle errors for head yaw and pitch are 5.5° and 4.9°, respectively; the SVM classifier identifies attention regions with 85.8% accuracy, and the DAD prewarning accuracy reaches 85.4%. (An illustrative sketch of two of these steps is given below.)
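The following is a minimal, hypothetical sketch (not the authors' code) of two steps described in the abstract: filtering nostril candidates inside the detected face region by area and roundness, and training an SVM attention-region classifier on head rotation angles and nostril coordinate offsets. All thresholds, the feature layout, and the synthetic training data are assumptions made for illustration.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def nostril_candidates(binary_mask, min_area=20, max_area=400, min_roundness=0.6):
    """Keep dark blobs whose area and roundness (4*pi*A / P^2) resemble nostrils.

    `binary_mask` is a binarized face-region image in which nostril-like dark
    regions are foreground; the thresholds are illustrative only.
    """
    contours, _ = cv2.findContours(binary_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    keep = []
    for c in contours:
        area = cv2.contourArea(c)
        perim = cv2.arcLength(c, True)
        if perim == 0 or not (min_area <= area <= max_area):
            continue
        roundness = 4.0 * np.pi * area / (perim ** 2)  # 1.0 for a perfect circle
        if roundness >= min_roundness:
            keep.append(c)
    return keep

# Attention-region classifier: each feature vector is assumed to be
# [yaw, pitch, dx, dy], where (dx, dy) is the nostril-midpoint offset from its
# calibrated position on the image plane. Training data and labels below are
# random placeholders standing in for labeled driving data.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 4))                 # placeholder features
y_train = (np.abs(X_train[:, 0]) > 1).astype(int)   # placeholder labels: 0 = road ahead, 1 = distracted
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)

sample = np.array([[12.0, -3.0, 8.0, 2.0]])         # yaw deg, pitch deg, dx px, dy px
print("attention region:", clf.predict(sample)[0])
```

In practice the classifier would be trained on angles and offsets measured from the head pose model rather than synthetic values, and its per-frame output would feed the hierarchical prewarning logic based on glance duration and distribution.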
