
    Continuous Mental Effort Evaluation during 3D Object Manipulation Tasks based on Brain and Physiological Signals

    Designing 3D User Interfaces (UI) requires adequate evaluation tools to ensure good usability and user experience. While many evaluation tools are already available and widely used, existing approaches generally cannot provide continuous and objective measures of usability qualities during interaction without interrupting the user. In this paper, we propose to use brain (with ElectroEncephaloGraphy) and physiological (ElectroCardioGraphy, Galvanic Skin Response) signals to continuously assess the mental effort made by the user to perform 3D object manipulation tasks. We first show how this mental effort (a.k.a. mental workload) can be estimated from such signals, and then measure it on 8 participants during an actual 3D object manipulation task with an input device known as the CubTile. Our results suggest that monitoring workload enables us to continuously assess the ease of use of the 3DUI and/or interaction technique. Overall, this suggests that this new measure could become a useful addition to the repertoire of available evaluation tools, enabling a finer-grained assessment of the ergonomic qualities of a given 3D user interface. (Published in INTERACT, Sep 2015, Bamberg, Germany)
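
    A minimal sketch of how such a continuous workload index could be estimated offline from EEG, assuming pre-cut epochs and band-power features fed to a linear classifier. The epoch shapes, channel count, labels, and frequency bands below are illustrative assumptions, not the authors' actual pipeline:

        # Hypothetical workload estimation from EEG band power (illustrative only).
        import numpy as np
        from scipy.signal import welch
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        fs = 256                                    # assumed sampling rate (Hz)
        epochs = np.random.randn(80, 32, fs * 2)    # placeholder: 80 epochs, 32 channels, 2 s
        labels = np.random.randint(0, 2, 80)        # 0 = low workload, 1 = high workload

        def bandpower(epoch, lo, hi):
            """Mean power in [lo, hi] Hz for each channel of one epoch."""
            freqs, psd = welch(epoch, fs=fs, nperseg=fs)
            band = (freqs >= lo) & (freqs <= hi)
            return psd[:, band].mean(axis=1)

        # Theta (4-7 Hz) and alpha (8-12 Hz) power are classic workload correlates.
        features = np.array([np.concatenate([bandpower(e, 4, 7), bandpower(e, 8, 12)])
                             for e in epochs])

        clf = LinearDiscriminantAnalysis()
        print("CV accuracy:", cross_val_score(clf, features, labels, cv=5).mean())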

    Recent advances in EEG-based neuroergonomics for Human-Computer Interaction

    Human-Computer Interfaces (HCI) are increasingly ubiquitous in multiple applications, including industrial design, education, art, and entertainment. As such, HCI can be used by very different users, with very different skills and needs. This requires user-centered design approaches and appropriate evaluation methods to maximize User eXperience (UX). Existing evaluation methods include behavioral studies, testbeds, questionnaires, and inquiries, among others. While useful, such methods suffer from several limitations, as they can be ambiguous, lack real-time recordings, or disrupt the interaction. Neuroergonomics can be an adequate tool to complement traditional evaluation methods. Notably, Electroencephalography (EEG)-based evaluation of UX has the potential to address the limitations above by providing objective, real-time, and non-disruptive metrics of the ergonomic quality of a given HCI (Frey 2014). In this abstract, we present an overview of our recent work in that direction. In particular, we show how EEG signals can be processed to derive metrics characterizing 1) how the user perceives the HCI display (HCI output) and 2) how the user interacts with the HCI (HCI input).

    Framework for Electroencephalography-based Evaluation of User Experience

    Measuring brain activity with electroencephalography (EEG) is mature enough to assess mental states. Combined with existing methods, such a tool can be used to strengthen the understanding of user experience. We contribute a set of methods to continuously estimate the user's mental workload, attention, and recognition of interaction errors during different interaction tasks. We validate these measures in a controlled virtual environment and show how they can be used to compare different interaction techniques or devices, here a keyboard and a touch-based interface. Thanks to such a framework, EEG becomes a promising method for improving the overall usability of complex computer systems. (In ACM CHI '16 - SIGCHI Conference on Human Factors in Computing Systems, May 2016, San Jose, United States)
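
    One plausible way to compare two interaction devices with such continuous estimates is to aggregate per-epoch classifier outputs per condition and test the difference. The sketch below assumes workload probabilities in [0, 1] and uses placeholder numbers, not the paper's data or code:

        # Hypothetical comparison of two interaction conditions using per-epoch
        # workload estimates (e.g., classifier probabilities); values are placeholders.
        import numpy as np
        from scipy.stats import mannwhitneyu

        keyboard_wl = np.random.beta(2, 5, size=60)   # placeholder workload indices
        touch_wl    = np.random.beta(3, 4, size=60)

        stat, p = mannwhitneyu(keyboard_wl, touch_wl, alternative="two-sided")
        print(f"median keyboard={np.median(keyboard_wl):.2f}, "
              f"touch={np.median(touch_wl):.2f}, p={p:.3f}")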

    Measuring User Experience Through Psychophysiological Signals

    This research aimed to identify the main equipment used in the measurement of physiological signals that can be used to evaluate user satisfaction in usability tests. A systematic literature review was conducted to identify the ten physiological signal measurement technologies discussed in this article.

    EEG-based neuroergonomics for 3D user interfaces: opportunities and challenges

    3D user interfaces (3DUI) are increasingly used in a number of applications, spanning from entertainment to industrial design. However, 3D interaction tasks are generally more complex for users, since interacting with a 3D environment is more cognitively demanding than perceiving and interacting with a 2D one. As such, it is essential to be able to evaluate user experience finely, in order to propose seamless interfaces. To do so, a promising research direction is to measure users' inner state based on brain signals acquired during interaction, following a neuroergonomics approach. Combined with existing methods, such a tool can be used to strengthen the understanding of user experience. In this paper, we review the work being undertaken in this area: what has been achieved and the new challenges that arise. We describe how a mobile brain imaging technique such as electroencephalography (EEG) brings continuous and non-disruptive measures. EEG-based evaluation of users can give insights into multiple dimensions of the user experience, with realistic interaction tasks or novel interfaces. We investigate four constructs: workload, attention, error recognition, and visual comfort. Ultimately, these metrics could help reduce users' burden when they interact with computers.

    Classifying EEG Signals during Stereoscopic Visualization to Estimate Visual Comfort

    With stereoscopic displays, a sensation of depth that is too strong can impede visual comfort and may result in fatigue or pain. We used electroencephalography (EEG) to develop a novel brain-computer interface that monitors users' states in order to reduce visual strain. We present the first system that discriminates comfortable conditions from uncomfortable ones during stereoscopic vision using EEG. In particular, we show that either changes in event-related potentials' (ERPs) amplitudes or changes in EEG oscillation power following the presentation of stereoscopic objects can be used to estimate visual comfort. Our system reacts within 1 s to depth variations, achieving 63% accuracy on average (up to 76%), and 74% on average when 7 consecutive variations are measured (up to 93%). Performance is stable (≈62.5%) when simplified signal processing is used to simulate online analyses or when the number of EEG channels is reduced. This study could lead to adaptive systems that automatically adjust stereoscopic displays to users and viewing conditions. For example, it could be possible to match the stereoscopic effect to the user's state by modifying the overlap of the left and right images according to the classifier output.
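
    A rough sketch of this kind of single-trial pipeline, assuming 1 s post-stimulus EEG epochs, simple windowed ERP amplitude features, a shrinkage LDA classifier, and score averaging over 7 consecutive depth variations. Shapes, channel counts, and labels are placeholders rather than the study's actual data or code:

        # Illustrative sketch: comfortable vs. uncomfortable depth changes from
        # 1 s post-stimulus EEG epochs, with pooling over consecutive decisions.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        fs = 128
        epochs = np.random.randn(200, 16, fs)       # placeholder: 200 trials, 16 channels, 1 s
        labels = np.random.randint(0, 2, 200)       # 0 = comfortable, 1 = uncomfortable

        # ERP features: mean amplitude in eight consecutive 125 ms windows per channel.
        wins = epochs.reshape(200, 16, 8, 16).mean(axis=3).reshape(200, -1)

        clf = make_pipeline(StandardScaler(),
                            LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto"))
        clf.fit(wins[:150], labels[:150])

        # Pooling: average the decision scores of 7 consecutive depth variations.
        scores = clf.decision_function(wins[150:157])
        print("pooled decision:", "uncomfortable" if scores.mean() > 0 else "comfortable")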

    Electroencephalography (EEG)-based Brain-Computer Interfaces

    Brain-Computer Interfaces (BCI) are systems that can translate the brain activity patterns of a user into messages or commands for an interactive application. The brain activity processed by BCI systems is usually measured using electroencephalography (EEG). In this article, we aim to provide an accessible and up-to-date overview of EEG-based BCI, with a main focus on its engineering aspects. We notably introduce some basic neuroscience background and explain how to design an EEG-based BCI, in particular reviewing which signal processing, machine learning, software, and hardware tools to use. We present Brain-Computer Interface applications, highlight some limitations of current systems, and suggest some perspectives for the field.
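
    A common textbook-style EEG classification pipeline of the kind such overviews describe combines Common Spatial Patterns (CSP) with Linear Discriminant Analysis. The sketch below uses synthetic data and assumes the MNE-Python and scikit-learn libraries; it is a generic example, not code from the article:

        # Generic EEG-based BCI classification pipeline: CSP spatial filtering + LDA.
        import numpy as np
        from mne.decoding import CSP
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.pipeline import Pipeline
        from sklearn.model_selection import cross_val_score

        X = np.random.randn(100, 22, 500)     # placeholder: 100 epochs, 22 channels, 500 samples
        y = np.random.randint(0, 2, 100)      # e.g., left- vs right-hand motor imagery

        pipe = Pipeline([
            ("csp", CSP(n_components=4, log=True)),   # spatial filters -> log-variance features
            ("lda", LinearDiscriminantAnalysis()),
        ])

        print("CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean())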

    Improving Mobile MOOC Learning via Implicit Physiological Signal Sensing

    Massive Open Online Courses (MOOCs) have become a promising solution for delivering high-quality education on a large scale at low cost in recent years. Despite their great potential, today's MOOCs also suffer from challenges such as low student engagement, lack of personalization, and, most importantly, the lack of a direct, immediate feedback channel from students to instructors. This dissertation explores the use of physiological signals implicitly collected via a "sensorless" approach as a rich feedback channel to understand, model, and improve learning in mobile MOOC contexts. I first demonstrate AttentiveLearner, a mobile MOOC system which captures learners' physiological signals implicitly during learning on unmodified mobile phones. AttentiveLearner uses on-lens finger gestures for video control and monitors learners' photoplethysmography (PPG) signals based on the fingertip transparency change captured by the back camera. Through a series of usability studies and follow-up analyses, I show that the tangible video control interface of AttentiveLearner is intuitive and easy to operate, and that the PPG signals implicitly captured by AttentiveLearner can be used to infer both learners' cognitive states (boredom and confusion levels) and divided attention (multitasking and external auditory distractions). Building on top of AttentiveLearner, I design, implement, and evaluate a novel intervention technology, Context and Cognitive State triggered Feed-Forward (C2F2), which infers and responds to learners' boredom and disengagement events in real time via a combination of PPG-based cognitive state inference and learning topic importance monitoring. C2F2 proactively reminds a student of important upcoming content (feed-forward interventions) when disengagement is detected. A 48-participant user study shows that C2F2 on average improves learning gains by 20.2% compared with a non-interactive baseline system and is especially effective for bottom performers (improving their learning gains by 41.6%). Finally, to gain a holistic understanding of the dynamics of MOOC learning, I investigate the temporal dynamics of the affective states of MOOC learners in a 22-participant study. Through both a quantitative analysis of the temporal transitions of affective states and a qualitative analysis of subjective feedback, I investigate differences between mobile MOOC learning and complex learning activities in terms of affect dynamics, and discuss pedagogical implications in detail.
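
    Camera-based PPG of this kind is typically reduced to a one-dimensional brightness signal from which heart rate and related features are derived. The sketch below shows one plausible way to do that with band-pass filtering and peak detection; the frame rate, filter band, and synthetic signal are assumptions, and this is not AttentiveLearner's actual processing:

        # Hypothetical heart-rate extraction from a camera-based PPG signal
        # (mean fingertip brightness over time); illustrative only.
        import numpy as np
        from scipy.signal import butter, filtfilt, find_peaks

        fps = 30.0                                   # assumed camera frame rate
        t = np.arange(0, 30, 1 / fps)                # 30 s of placeholder samples
        ppg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)  # ~72 bpm

        # Band-pass around plausible heart rates (0.7-3 Hz, i.e. 42-180 bpm).
        b, a = butter(3, [0.7, 3.0], btype="band", fs=fps)
        filtered = filtfilt(b, a, ppg)

        # Inter-beat intervals from peak locations give beats per minute.
        peaks, _ = find_peaks(filtered, distance=fps * 0.4)
        ibi = np.diff(peaks) / fps
        print("estimated heart rate: %.1f bpm" % (60.0 / ibi.mean()))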