
    Recognizing behavioral factors while driving: A multimodal corpus to monitor the driver's affective state

    No full text
    The presented study concentrates on the collection of multimodal real-world in-car audio, video and physiological signal recordings of the driver's emotional state while driving. Three sensor systems were integrated into the car, and four emotionally relevant driver states were defined: neutral, positive, frustration and anxiety. To gather emotional data that is as natural as possible, the subjects needed to be unbiased and were therefore kept unaware of the detailed research objective. The emotions were induced using so-called Wizard-of-Oz experiments, in which the drivers believed they were interacting with an automated technical system that was in fact controlled by a human. Additionally, on-board interviews were conducted during the drives by an instructed psychologist. To evaluate the collected data, questionnaires were filled out by the subjects before, during and after the data collection. These cover the drivers' perceived state of emotion, stress, sleepiness and thermal sensation, as well as detailed questionnaires on their driving experience, attitude towards technology and Big Five (OCEAN) personality traits. Afterwards, the data was annotated by expert labelers. The statistical analyses of these results will be presented in the full paper.

    Adaptive Transitions for Automation in Cars, Trucks, Busses and Motorcycles

    No full text
    Automated vehicles are entering the roads, and automation is applied to cars, trucks, buses and even motorcycles today. High automation foresees transitions during driving in both directions. The driver's and rider's state becomes a critical parameter, since automation makes it possible to intervene safely and hand control over to the automation when manual driving is no longer performed safely. When control transitions from automation back to manual, an appropriate driver state needs to be verified before the automated control is released. The detection of driver states during manual and automated driving and an appropriate design of the HMI are crucial steps to support these transitions. State-of-the-art SAE Level 3 systems do not take the driver state, personal preferences or predictions of road conditions into account. The ADAS&ME project, funded by the H2020 programme of the European Commission, proposes an innovative and fully adaptive human-machine interaction framework able to support driver/rider state monitoring-based transitions in automated driving. The HMI framework is applied in the target vehicles (passenger car, truck, bus and motorcycle) and in seven different use cases.