CERES

Browsing by Author "Dong, Kuo"

Now showing 1 - 3 of 3
  • Item (Open Access)
    A dual-cameras-based driver gaze mapping system with an application on non-driving activities monitoring
    (IEEE, 2019-09-13) Yang, Lichao; Dong, Kuo; Dmitruk, Arkadiusz Jan; Brighton, James; Zhao, Yifan
    Characterisation of the driver's non-driving activities (NDAs) is of great importance to the design of the take-over control strategy in Level 3 automation. Gaze estimation is a typical approach to monitoring the driver's behaviour, since eye gaze is normally engaged with human activity. However, current eye gaze tracking techniques are either costly or intrusive, which limits their applicability in vehicles. This paper proposes a low-cost and non-intrusive dual-camera-based gaze mapping system that visualises the driver's gaze using a heat map. The challenges introduced by complex head movement during NDAs and by camera distortion are addressed by proposing a nonlinear polynomial model to establish the relationship between the face features and the eye gaze on the simulated driver's view. The Root Mean Square Error of this system in the in-vehicle experiment is 7.80±5.99 pixels in the X direction and 4.64±3.47 pixels in the Y direction, at an image resolution of 1440 x 1080 pixels. The system is successfully demonstrated in evaluating three NDAs with visual attention. This technique, acting as a generic tool to monitor the driver's visual attention, will have wide applications in NDA characterisation for intelligent design of take-over strategy and driving environment awareness for current and future automated vehicles.
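The abstract describes the mapping only as a nonlinear polynomial model from face features to gaze coordinates, evaluated by per-axis RMSE. A minimal sketch under that assumption follows; the feature construction, polynomial degree, and function names are illustrative, not taken from the paper:

```python
import numpy as np

def polynomial_features(x, y, degree=2):
    # Build 2-D polynomial terms up to `degree` (assumed form; the
    # abstract only states the model is a nonlinear polynomial).
    return np.column_stack([
        x**i * y**j
        for i in range(degree + 1)
        for j in range(degree + 1 - i)
    ])

def fit_gaze_map(face_xy, gaze_xy, degree=2):
    # Least-squares fit from face-feature coordinates (N x 2) to
    # gaze pixel coordinates (N x 2) on the driver's view.
    A = polynomial_features(face_xy[:, 0], face_xy[:, 1], degree)
    coeffs, *_ = np.linalg.lstsq(A, gaze_xy, rcond=None)
    return coeffs

def predict_gaze(face_xy, coeffs, degree=2):
    A = polynomial_features(face_xy[:, 0], face_xy[:, 1], degree)
    return A @ coeffs

def rmse_per_axis(pred, truth):
    # Per-axis RMSE, matching how the abstract reports X and Y error.
    return np.sqrt(np.mean((pred - truth) ** 2, axis=0))
```

In this sketch the heat-map visualisation step is omitted; only the calibration fit and the error metric reported in the abstract are shown.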
  • Item (Open Access)
    Recognition of visual-related non-driving activities using a dual-camera monitoring system
    (Elsevier, 2021-03-25) Yang, Lichao; Dong, Kuo; Ding, Yan; Brighton, James; Zhan, Zhenfei; Zhao, Yifan
    For a Level 3 automated vehicle, according to the SAE International Automation Levels definition (J3016), identifying the non-driving activities (NDAs) that the driver is engaging with is of great importance in the design of an intelligent take-over interface. Much of the existing literature focuses on the driver take-over strategy and the associated Human-Machine Interaction design. This paper proposes a dual-camera-based framework to identify and track NDAs that require visual attention. This is achieved by mapping the driver's gaze, using a nonlinear system identification approach, onto the object scene recognised by a deep learning algorithm. A novel gaze-based region of interest (ROI) selection module is introduced and contributes about a 30% improvement in average success rate and about a 60% reduction in average processing time compared to the results without this module. The framework has been successfully demonstrated to identify five types of NDAs requiring visual attention, with an average success rate of 86.18%. The outcome of this research could be applicable to the identification of other NDAs, and tracking NDAs within a certain time window could potentially be used to evaluate the driver's attention level in both automated and human-driven vehicles.
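The gaze-based ROI selection described above can be pictured as cropping the recognition input to a window centred on the mapped gaze point, which is one plausible way such a module cuts processing time. A hedged sketch; the ROI dimensions and function name are assumptions, as the abstract does not specify them:

```python
import numpy as np

def gaze_roi(frame, gaze_xy, roi_size=(320, 240)):
    # Crop a region of interest centred on the mapped gaze point,
    # clamped so the crop stays inside the frame. The downstream
    # object recogniser then only processes this sub-image.
    h, w = frame.shape[:2]
    rw, rh = roi_size
    x0 = int(np.clip(gaze_xy[0] - rw // 2, 0, max(w - rw, 0)))
    y0 = int(np.clip(gaze_xy[1] - rh // 2, 0, max(h - rh, 0)))
    return frame[y0:y0 + rh, x0:x0 + rw]
```

Restricting recognition to a gaze-centred crop shrinks the pixel area the detector must process, which is consistent with the reported reductions in processing time, though the paper's actual module may differ.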
  • Item (Open Access)
    Supplementary videos for the paper "A dual-cameras based driver gaze mapping system with an application on non-driving activities monitoring"
    (Cranfield University, 2019-07-04 14:09) Yang, Lichao; Dong, Kuo; Dmitruk, Arkadiusz; Brighton, James; Zhao, Yifan
    These videos are supplementary materials demonstrating the driver's gaze mapping system proposed in the paper "A dual-cameras based driver gaze mapping system with an application on non-driving activities monitoring".
