
    Measuring Drivers’ Physiological Response to Different Vehicle Controllers in Highly Automated Driving (HAD): Opportunities for Establishing Real-Time Values of Driver Discomfort

    This study investigated how driver discomfort was influenced by different types of automated vehicle (AV) controllers, compared to manual driving, and whether this response changed in different road environments, using heart-rate variability (HRV) and electrodermal activity (EDA). A total of 24 drivers were subjected to manual driving and four AV controllers: two modelled to depict “human-like” driving behaviour, one conventional lane-keeping assist controller, and a replay of their own manual drive. Each drive lasted ~15 min and consisted of rural and urban environments, which differed in terms of average speed, road geometry and roadside furniture. Drivers showed higher skin conductance response (SCR) and lower HRV during manual driving, compared to the automated drives. There were no significant differences in discomfort between the AV controllers. SCRs and subjective discomfort ratings showed significantly higher discomfort in the faster rural environments, compared with the urban environments. Our results suggest that SCR values are more sensitive than HRV-based measures to continuously evolving situations that induce discomfort. Further research is warranted to investigate the value of this metric for assessing real-time driver discomfort levels, which may help improve acceptance of AV controllers.
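    The discomfort indicators referenced above (HRV and SCR) are summary metrics derived from raw physiological recordings. The sketch below is purely illustrative and is not the processing pipeline used in the study: it computes RMSSD (a common time-domain HRV measure) from RR intervals and counts SCR peaks in an EDA trace; the sampling rate, peak threshold and synthetic signals are assumptions.

```python
# Illustrative sketch of HRV and SCR summary metrics (not the study's pipeline).
import numpy as np
from scipy.signal import find_peaks

def rmssd(rr_intervals_ms: np.ndarray) -> float:
    """Root mean square of successive differences of RR intervals (ms)."""
    diffs = np.diff(rr_intervals_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

def count_scr_peaks(eda_us: np.ndarray, fs: float, min_amp_us: float = 0.05) -> int:
    """Count skin conductance response peaks above a minimal prominence,
    spaced at least 1 s apart (threshold values are assumptions)."""
    peaks, _ = find_peaks(eda_us, prominence=min_amp_us, distance=int(fs))
    return len(peaks)

if __name__ == "__main__":
    # Synthetic data purely for demonstration.
    rng = np.random.default_rng(0)
    rr = 800 + rng.normal(0, 40, size=300)            # RR intervals in ms
    fs = 32.0                                          # assumed EDA sampling rate (Hz)
    t = np.arange(0, 60, 1 / fs)
    eda = 2.0 + 0.1 * np.sin(0.05 * t) + rng.normal(0, 0.01, t.size)  # microsiemens
    print(f"RMSSD: {rmssd(rr):.1f} ms")
    print(f"SCR peaks in 60 s: {count_scr_peaks(eda, fs)}")
```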

    Physiological indicators of driver workload during car-following scenarios and takeovers in highly automated driving

    This driving simulator study, conducted as part of the Horizon2020-funded L3Pilot project, investigated how different car-following situations affected driver workload in the context of vehicle automation. Electrocardiogram (ECG) and electrodermal activity (EDA)-based physiological metrics were used as objective indicators of workload, along with self-reported workload ratings. A total of 32 drivers were divided into two equal groups, based on whether they engaged in a non-driving related task (NDRT) during automation (SAE Level 3) or monitored the drive (SAE Level 2). Drivers in both groups completed two counterbalanced experimental drives, lasting ~18 min each, with Short (0.5 s) and Long (1.5 s) time headway (THW) conditions during automated car-following (ACF), each followed by a takeover that happened with or without a lead vehicle. Results showed that driver workload due to the NDRT was significantly higher than during both monitoring of the drive in ACF and manual car-following (MCF). Furthermore, a lead vehicle maintaining a shorter THW can significantly increase driver workload during takeover scenarios, potentially affecting driver safety. This warrants further research into the safe time headway thresholds that automated vehicles should maintain, without placing additional cognitive or attentional demands on the driver. Our results also indicated that ECG and EDA signals are sensitive to variations in workload, warranting further investigation into combining these two signals to assess driver workload in real time, so that future driver monitoring systems can respond appropriately to the driver's limitations and predict their performance in the driving task, if and when they have to resume manual control of the vehicle after a period of automated driving.
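    Time headway, the quantity behind the Short (0.5 s) and Long (1.5 s) conditions above, is simply the bumper-to-bumper gap divided by the following vehicle's speed. A minimal sketch, using assumed example values rather than data from the study:

```python
# Minimal time-headway (THW) sketch; the setpoints match the study's conditions,
# but the gap and speed values below are assumptions for illustration.
def time_headway(gap_m: float, ego_speed_mps: float) -> float:
    """Time headway in seconds; undefined (inf) when the ego vehicle is stopped."""
    return float("inf") if ego_speed_mps <= 0 else gap_m / ego_speed_mps

def classify_condition(thw_s: float, short_s: float = 0.5, long_s: float = 1.5) -> str:
    """Label a THW sample by the nearer of the study's Short/Long setpoints."""
    return "Short" if abs(thw_s - short_s) < abs(thw_s - long_s) else "Long"

if __name__ == "__main__":
    gap, speed = 14.0, 27.8            # ~100 km/h with a 14 m gap (assumed values)
    thw = time_headway(gap, speed)
    print(f"THW = {thw:.2f} s -> closer to the {classify_condition(thw)} condition")
```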

    Impact of waiting times on risky driver behaviour at railway level crossings

    Increased road and rail traffic in Australia results in actively protected crossings being closed for extended periods during peak hours, which in turn causes road congestion. Extended warning/waiting times at level crossings are known to affect drivers’ decision making with regard to violating crossing rules. Excessive waiting times could lead to non-compliant behaviour by motorists, resulting in incidents, including injuries and fatalities. However, the correlation between waiting time and rule violation is not well documented, although a range of personal and environmental factors is known to influence rule non-compliance. This raises the question of whether longer waiting times affect motorists’ assessment of risk, and how long motorists are prepared to wait at level crossings before undertaking risky behaviour. A driving simulator study was used to obtain objective measures of railway level crossing (RLX) rule violations. Sixty participants each completed six driving tasks that varied in terms of waiting times, and compliance with road rules at the level crossing during the simulated drives was examined. The main finding was that longer waiting times increased the likelihood of risky driving behaviour, particularly for waiting times longer than three minutes. Risky driving behaviours included entering the activated crossing before the boom gates were down, entering the crossing after the train had passed but before signals were deactivated, and stopping or reversing on the crossing. The results suggest that, where possible, waiting times should be standardized at values lower than three minutes in order to reduce the likelihood of risky road user behaviour.
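    As a rough illustration of how the violations listed above might be flagged from simulator logs, the sketch below compares a vehicle's crossing-entry time against the warning-sequence timestamps; the event structure and field names are hypothetical and are not the study's actual coding scheme.

```python
# Hypothetical coding of crossing-entry events against the warning sequence.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CrossingEvent:
    t_warning_on: float      # s, lights and bells activate
    t_booms_down: float      # s, boom gates fully lowered
    t_train_passed: float    # s, train has cleared the crossing
    t_warning_off: float     # s, signals deactivate
    t_enter_crossing: float  # s, vehicle enters the crossing zone

def violation_type(e: CrossingEvent) -> Optional[str]:
    """Map an entry time onto two of the risky behaviours examined in the study."""
    if e.t_warning_on <= e.t_enter_crossing < e.t_booms_down:
        return "entered activated crossing before boom gates were down"
    if e.t_train_passed <= e.t_enter_crossing < e.t_warning_off:
        return "entered after train passage but before signals deactivated"
    return None

if __name__ == "__main__":
    event = CrossingEvent(t_warning_on=180.0, t_booms_down=186.0,
                          t_train_passed=290.0, t_warning_off=300.0,
                          t_enter_crossing=183.5)
    waiting_time_s = event.t_warning_off - event.t_warning_on
    print(f"Waiting time: {waiting_time_s:.0f} s "
          f"({'above' if waiting_time_s > 180 else 'within'} the three-minute threshold)")
    print(violation_type(event) or "compliant")
```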

    Acclimation-induced changes in cell membrane composition and influence on cryotolerance of in vitro shoots of native plant species

    Cell membranes are the primary sites of cryopreservation injury, and measuring changes to membrane composition arising from cold acclimation may provide a rationale for optimising cryopreservation methods. Shoot tips from two south-west Western Australian species, Grevillea scapigera and Loxocarya cinerea, and from Arabidopsis thaliana (reference species) were cryopreserved using the droplet vitrification protocol. Two pre-conditioning regimes were compared: a constant temperature regime (CT, 23 °C with a 12 h light/dark cycle) and an alternating temperature regime (AT, 20/10 °C with a 12 h light/dark cycle). Soluble sugars, sterols and phospholipids present in the shoot tips were analysed. AT pre-conditioning (acclimation) resulted in a modest decrease in cryotolerance in A. thaliana, increased cryotolerance in G. scapigera, and increased survival of the non-frozen control explants of L. cinerea, in comparison with CT pre-conditioning. Increased cryotolerance was accompanied by higher total sugar, sterol and phospholipid contents, as well as an increase in strongly hydrating phospholipid classes such as phosphatidylcholine. The double bond index of the bound fatty acyl chains of phospholipids was greater after AT pre-conditioning, mostly due to a higher amount of monoenes in A. thaliana and of trienes in G. scapigera and L. cinerea. These findings suggest that AT pre-conditioning treatments for in vitro plants can have a positive influence on cryotolerance in some plant species, and that this may be related to the observed changes in the overall composition of cell membranes. However, alternative factors (e.g. oxidative stress) may be equally important in other species (e.g. L. cinerea).
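    The double bond index (DBI) mentioned above is commonly computed as the mol%-weighted mean number of double bonds per fatty acyl chain. A minimal sketch of that common formulation, using an invented composition rather than data from the study:

```python
# DBI = sum(mol%_i * double_bonds_i) / 100, a common formulation of the index.
def double_bond_index(composition) -> float:
    """composition: mapping of fatty acid name -> (mol %, number of double bonds)."""
    return sum(pct * bonds for pct, bonds in composition.values()) / 100.0

if __name__ == "__main__":
    # Hypothetical phospholipid-bound fatty acyl composition (mol %), for illustration.
    acyl = {
        "16:0": (25.0, 0),   # palmitic, saturated
        "18:1": (15.0, 1),   # oleic, monoene
        "18:2": (35.0, 2),   # linoleic, diene
        "18:3": (25.0, 3),   # linolenic, triene
    }
    print(f"DBI = {double_bond_index(acyl):.2f}")   # -> 1.60 for this composition
```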