CERES

Browsing by Author "Mouzakitis, Alexandros"

Now showing 1 - 5 of 5
  • Analysis of autopilot disengagements occurring during autonomous vehicle testing (Open Access)
    (IEEE, 2017-12-20) Lv, Chen; Cao, Dongpu; Zhao, Yifan; Auger, Daniel J.; Sullman, Mark; Wang, Huaji; Millen Dutka, Laura; Skrypchuk, Lee; Mouzakitis, Alexandros
    In present-day highly-automated vehicles, there are occasions when the driving system disengages and the human driver is required to take over. This is of great importance to a vehicle's safety and ride comfort. In the U.S. state of California, the Autonomous Vehicle Testing Regulations require every manufacturer testing autonomous vehicles on public roads to submit an annual report summarizing the disengagements of the technology experienced during testing. On 1 January 2016, seven manufacturers submitted their first disengagement reports: Bosch, Delphi, Google, Nissan, Mercedes-Benz, Volkswagen, and Tesla Motors. This work analyses the data from these disengagement reports with the aim of gaining a better understanding of the situations in which a driver is required to take over, as this is potentially useful in improving the Society of Automotive Engineers (SAE) Level 2 and Level 3 automation technologies. Disengagement events from testing are classified into different groups based on their attributes, and the causes of disengagement are investigated and compared in detail. The mechanisms of, and the time taken for, the take-over transitions that occurred during disengagements are studied. Finally, recommendations for OEMs, manufacturers, and government organizations are also discussed.
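
    As an illustration of the grouping step described above, the following is a minimal pandas sketch, not the paper's actual pipeline; the file name and the columns 'manufacturer', 'cause' and 'takeover_time_s' are hypothetical.

        # Sketch only: file name and column names are assumed, not from the paper.
        import pandas as pd

        df = pd.read_csv("disengagement_reports_2016.csv")  # hypothetical export

        # Count disengagement events per manufacturer and cause.
        counts = df.groupby(["manufacturer", "cause"]).size().unstack(fill_value=0)

        # Summarise take-over transition times within each cause group.
        takeover_stats = df.groupby("cause")["takeover_time_s"].describe()

        print(counts)
        print(takeover_stats)
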
  • Characterization of driver neuromuscular dynamics for human-automation collaboration design of automated vehicles (Open Access)
    (IEEE, 2018-03-05) Lv, Chen; Wang, Huaji; Cao, Dongpu; Zhao, Yifan; Auger, Daniel J.; Sullman, Mark; Matthias, Rebecca; Skrypchuk, Lee; Mouzakitis, Alexandros
    In order to design an advanced human-automation collaboration system for highly automated vehicles, research into the driver's neuromuscular dynamics is needed. In this paper, a dynamic model of drivers' neuromuscular interaction with a steering wheel is first established. The transfer function and the natural frequency of the system are analyzed. In order to identify the key parameters of the driver-steering-wheel interacting system and investigate its properties under different situations, driver-in-the-loop experiments are carried out. Each test subject was instructed to complete two steering tasks, namely passive and active steering. Furthermore, during the experiments, subjects manipulated the steering wheel with two distinct postures and three different hand positions. Based on the experimental results, the key parameters of the transfer function model are identified using the Gauss-Newton algorithm. The system properties of the estimated model are then investigated. The characteristics of the driver neuromuscular system are discussed and compared with respect to different steering tasks, hand positions and driver postures. These experimental results, with the identified system properties, provide a good foundation for the development of a haptic take-over control system for automated vehicles.
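
    The identification step can be sketched briefly. Below is a minimal Python stand-in that fits a hypothetical second-order inertia-damping-stiffness model, J*theta'' + B*theta' + K*theta = T(t), of the driver's arm on the steering wheel; SciPy's Levenberg-Marquardt solver (a damped Gauss-Newton variant) stands in for the paper's Gauss-Newton algorithm, and all signals and values are synthetic.

        # Sketch only: model structure, signals and values are assumed.
        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import least_squares

        def simulate(params, t, torque):
            """Steering angle response of J*th'' + B*th' + K*th = T(t)."""
            J, B, K = params
            def rhs(x, ti):
                theta, omega = x
                T = np.interp(ti, t, torque)
                return [omega, (T - B * omega - K * theta) / J]
            return odeint(rhs, [0.0, 0.0], t)[:, 0]

        def residuals(params, t, torque, theta_measured):
            return simulate(params, t, torque) - theta_measured

        # In practice t, torque, theta_measured come from driver-in-the-loop logs.
        t = np.linspace(0, 5, 500)
        torque = np.sin(2 * np.pi * t)
        theta_measured = simulate([0.1, 0.5, 2.0], t, torque)  # synthetic data

        fit = least_squares(residuals, x0=[0.05, 0.1, 1.0], method="lm",
                            args=(t, torque, theta_measured))
        print("Identified J, B, K:", fit.x)
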
  • Data for "An Orientation Sensor based Head Tracking System for Driver Behaviour Monitoring" (Open Access)
    (Cranfield University, 2017-11-21 13:42) Zhao, Yifan; Görne, Lorenz; Yuen, Iek-Man; Cao, Dongpu; Sullman, Mark; Auger, Daniel; Lv, Chen; Wang, Huaji; Matthias, Rebecca; Skrypchuk, Lee; Mouzakitis, Alexandros
    Data used for this paper - files created in MATLAB.
  • An orientation sensor based head tracking system for driver behaviour monitoring (Open Access)
    (MDPI, 2017-11-22) Zhao, Yifan; Görne, Lorenz; Yuen, Iek-Man; Cao, Dongpu; Sullman, Mark; Auger, Daniel J.; Lv, Chen; Wang, Huaji; Matthias, Rebecca; Skrypchuk, Lee; Mouzakitis, Alexandros
    Although at present legislation does not allow drivers in a Level 3 autonomous vehicle to engage in a secondary task, there may come a time when it does. Monitoring the behaviour of drivers engaging in various non-driving activities (NDAs) is crucial to deciding how well the driver will be able to take over control of the vehicle. One limitation of commonly used camera-based face tracking systems is that sufficient features of the face must be visible, which limits the detectable angle of head movement, and thereby the measurable NDAs, unless multiple cameras are used. This paper proposes a novel orientation sensor based head tracking system that includes twin devices, one of which measures the movement of the vehicle while the other measures the absolute movement of the head. Measurement errors in the shaking and nodding axes were less than 0.4°, while the error in the rolling axis was less than 2°. Comparison with a camera-based system, through in-house and on-road tests, showed that the main advantage of the proposed system is its ability to detect angles larger than 20° in the shaking and nodding axes. Finally, a case study demonstrated that the shaking and nodding angles produced by the proposed system can effectively characterise drivers' behaviour while engaged in the NDAs of chatting to a passenger and playing on a smartphone.
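
    The twin-device idea reduces to computing the head orientation relative to the vehicle. A minimal sketch follows, assuming both sensors report [x, y, z, w] quaternions in a shared world frame and a yaw/pitch/roll (shaking/nodding/rolling) Euler convention; both are assumptions, not the paper's stated conventions.

        # Sketch only: quaternion order and axis conventions are assumed.
        from scipy.spatial.transform import Rotation as R

        def head_relative_to_vehicle(q_head, q_vehicle):
            """Head angles relative to the cabin, from two world-frame quaternions."""
            rel = R.from_quat(q_vehicle).inv() * R.from_quat(q_head)
            yaw, pitch, roll = rel.as_euler("zyx", degrees=True)
            return yaw, pitch, roll  # shaking, nodding, rolling

        # Example: vehicle yawed 10 deg and head yawed 35 deg in the world frame,
        # so the head is turned about 25 deg relative to the cabin.
        q_vehicle = R.from_euler("z", 10, degrees=True).as_quat()
        q_head = R.from_euler("z", 35, degrees=True).as_quat()
        print(head_relative_to_vehicle(q_head, q_vehicle))
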
  • A refined non-driving activity classification using a two-stream convolutional neural network (Open Access)
    (IEEE, 2020-06-29) Yang, Lichao; Yang, Tingyu; Liu, Haochen; Shan, Xiaocai; Brighton, James; Skrypchuk, Lee; Mouzakitis, Alexandros; Zhao, Yifan
    It is of great importance to monitor the driver's status to achieve an intelligent and safe take-over transition in a Level 3 automated driving vehicle. We present a camera-based system that recognises the non-driving activities (NDAs) which may lead to different cognitive capabilities for take-over, based on a fusion of spatial and temporal information. The region of interest (ROI) is automatically selected based on the extracted masks of the driver and the object/device being interacted with. Then, the RGB image of the ROI (the spatial stream) and its associated current and historical optical flow frames (the temporal stream) are fed into a two-stream convolutional neural network (CNN) for the classification of NDAs. Such an approach is able to identify not only the object/device but also the mode of interaction between the object and the driver, which enables a refined NDA classification. In this paper, we evaluated the performance of classifying 10 NDAs involving two types of devices (tablet and phone) and five types of tasks (emailing, reading, watching videos, web-browsing and gaming) for 10 participants. Results show that the proposed system improves the average classification accuracy from 61.0%, when using a single spatial stream, to 90.5%.
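
    To make the two-stream fusion concrete, here is a minimal PyTorch sketch: one small CNN over the ROI's RGB crop (spatial stream), one over a stack of optical-flow frames (temporal stream), and a 10-way classifier over the concatenated features. The layer sizes, the 5-frame flow stack and the late-concatenation fusion are illustrative assumptions, not the paper's architecture.

        # Sketch only: layer sizes and fusion scheme are assumed.
        import torch
        import torch.nn as nn

        def stream(in_channels):
            """A tiny convolutional feature extractor ending in a 64-d vector."""
            return nn.Sequential(
                nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )

        class TwoStreamNDA(nn.Module):
            def __init__(self, num_classes=10, flow_frames=5):
                super().__init__()
                self.spatial = stream(3)                 # RGB ROI crop
                self.temporal = stream(2 * flow_frames)  # x/y flow per frame
                self.classifier = nn.Linear(64 + 64, num_classes)

            def forward(self, rgb, flow):
                fused = torch.cat([self.spatial(rgb), self.temporal(flow)], dim=1)
                return self.classifier(fused)

        model = TwoStreamNDA()
        rgb = torch.randn(1, 3, 224, 224)    # one ROI crop
        flow = torch.randn(1, 10, 224, 224)  # 5 flow frames x 2 channels
        print(model(rgb, flow).shape)        # torch.Size([1, 10])
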
