A Classification Model for Sensing Human Trust in Machines Using EEG and GSR
Today, intelligent machines \emph{interact and collaborate} with humans in a
way that demands a greater level of trust between human and machine. A first
step towards building intelligent machines that are capable of building and
maintaining trust with humans is the design of a sensor that will enable
machines to estimate human trust levels in real time. In this paper, two
approaches for developing classifier-based empirical trust sensor models are
presented that specifically use electroencephalography (EEG) and galvanic skin
response (GSR) measurements. Human subject data collected from 45 participants
is used for feature extraction, feature selection, classifier training, and
model validation. The first approach considers a general set of
psychophysiological features across all participants as the input variables and
trains a classifier-based model for each participant, resulting in a trust
sensor model based on the general feature set (i.e., a "general trust sensor
model"). The second approach considers a customized feature set for each
individual and trains a classifier-based model using that feature set,
resulting in improved mean accuracy but at the expense of an increase in
training time. This work represents the first use of real-time
psychophysiological measurements for the development of a human trust sensor.
Implications of the work, in the context of trust management algorithm design
for intelligent machines, are also discussed.
Comment: 20 pages
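The two approaches above (a shared "general" feature set versus a per-participant customized set, with one classifier trained per participant) can be sketched roughly as follows. This is a minimal illustration on synthetic data: the feature count, the mean-gap selection rule, and the nearest-centroid classifier are all stand-in assumptions, not the paper's actual features or models.

```python
import random
import statistics

random.seed(0)
N_FEATURES = 6                        # e.g. EEG band powers + GSR statistics (assumed)
GENERAL_SET = list(range(N_FEATURES))

def make_trials(informative, n=40):
    """Synthetic (features, label) pairs; label 1 = trust, 0 = distrust."""
    trials = []
    for _ in range(n):
        label = random.randint(0, 1)
        x = [random.gauss(0, 1) for _ in range(N_FEATURES)]
        for f in informative:         # these features shift with the trust label
            x[f] += 2.0 * label
        trials.append((x, label))
    return trials

def select_features(trials, k=2):
    """Customized set: keep the k features with the largest class-mean gap."""
    gaps = []
    for f in range(N_FEATURES):
        m1 = statistics.mean(x[f] for x, y in trials if y == 1)
        m0 = statistics.mean(x[f] for x, y in trials if y == 0)
        gaps.append((abs(m1 - m0), f))
    return [f for _, f in sorted(gaps, reverse=True)[:k]]

def train_centroids(trials, feats):
    """One centroid per class over the chosen feature subset."""
    def centroid(label):
        rows = [[x[f] for f in feats] for x, y in trials if y == label]
        return [statistics.mean(col) for col in zip(*rows)]
    return centroid(0), centroid(1)

def accuracy(trials, feats, centroids):
    correct = 0
    for x, y in trials:
        v = [x[f] for f in feats]
        d = [sum((a - b) ** 2 for a, b in zip(v, c)) for c in centroids]
        correct += (d[1] < d[0]) == (y == 1)   # closer to class-1 centroid?
    return correct / len(trials)

for pid in range(3):                  # three illustrative "participants"
    informative = random.sample(range(N_FEATURES), 2)  # differs per person
    train, test = make_trials(informative), make_trials(informative)
    acc_general = accuracy(test, GENERAL_SET, train_centroids(train, GENERAL_SET))
    custom = select_features(train)
    acc_custom = accuracy(test, custom, train_centroids(train, custom))
    print(f"participant {pid}: general={acc_general:.2f} customized={acc_custom:.2f}")
```

Because each participant's informative features differ, the per-participant selection step is what the abstract's accuracy/training-time trade-off hinges on: the customized model must repeat feature selection for every individual before training.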
Neuroimaging study designs, computational analyses and data provenance using the LONI pipeline.
Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges: management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large-scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu
Electrophysiological Studies of Visual Attention and of Emotion Regulation
Electrophysiological methods, such as electroencephalography (EEG) and electrocardiography (ECG), measure biological activity that allows us to infer underlying cognitive processes. In the first study, we use EEG to track feature-based attention (FBA), a form of visual attention that helps one detect objects with a particular color, motion, or orientation. We explore the use of steady-state visual evoked potentials (SSVEPs), generated by flicker presented peripherally, to track attention in a visual search task presented centrally. Classification results show that one can track an observer’s attended color, which suggests that these methods may provide a viable means for tracking FBA in a real-time task. In the second study, we use cardiovascular measures to examine influences of the emotion regulation strategy of reappraisal. We examine cooperation and cardiovascular responses in individuals who were defected on by their opponent in the first round of an iterated Prisoner’s Dilemma. We find significant differences between the emotion regulation conditions using the biopsychosocial (BPS) model of challenge and threat: participants primed with the reappraisal strategy were weakly comparable with a threat state of the BPS model, and participants without an emotion regulation strategy were weakly comparable with a challenge state of the BPS model. In the third study, we use EEG to study the chromatic sensitivity of FBA for color during a visual search task. We use SSVEP responses evoked through peripheral flicker to measure the spectral tuning of color detection mechanisms and how attentional selection is affected by distractor color. Our results show smaller responses for the distractor colors and suggest that feature-based attention to a particular color involves chromatic mechanisms that both enhance the response to a target and minimize responses to distractors.
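The SSVEP "frequency tagging" readout behind the attention-tracking studies above can be sketched as a single-bin DFT at each flicker frequency: the attended color's tag shows a larger evoked amplitude. The sampling rate, tag frequencies, attention gain, and synthetic "EEG" below are assumptions for illustration, not the studies' parameters.

```python
import math

FS = 250.0                     # sampling rate in Hz (assumed)
F_RED, F_GREEN = 12.0, 15.0    # flicker tags for the two colors (assumed)

def dft_amplitude(signal, freq, fs):
    """Single-bin DFT: amplitude of `signal` at `freq` Hz."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    im = sum(-s * math.sin(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    return 2 * math.hypot(re, im) / n

def synth_eeg(attended_freq, other_freq, n=1000):
    """Attended flicker gets a larger evoked response (the gain is an assumption)."""
    return [1.0 * math.sin(2 * math.pi * attended_freq * i / FS)
            + 0.4 * math.sin(2 * math.pi * other_freq * i / FS)
            for i in range(n)]

def decode_attended(signal):
    """Classify the attended color as whichever tag frequency is stronger."""
    a_red = dft_amplitude(signal, F_RED, FS)
    a_green = dft_amplitude(signal, F_GREEN, FS)
    return "red" if a_red > a_green else "green"

print(decode_attended(synth_eeg(F_RED, F_GREEN)))    # prints "red"
print(decode_attended(synth_eeg(F_GREEN, F_RED)))    # prints "green"
```

The four-second window (1000 samples at 250 Hz) contains an integer number of cycles of both tags, so each single-bin DFT recovers its sinusoid's amplitude cleanly; real-time use would apply the same readout to short sliding windows of recorded EEG.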
Discovering Gender Differences in Facial Emotion Recognition via Implicit Behavioral Cues
We examine the utility of implicit behavioral cues in the form of EEG brain
signals and eye movements for gender recognition (GR) and emotion recognition
(ER). Specifically, the examined cues are acquired via low-cost, off-the-shelf
sensors. We asked 28 viewers (14 female) to recognize emotions from unoccluded
(no mask) as well as partially occluded (eye and mouth masked) emotive faces.
Obtained experimental results reveal that (a) reliable GR and ER is achievable
with EEG and eye features, (b) differential cognitive processing especially for
negative emotions is observed for males and females and (c) some of these
cognitive differences manifest under partial face occlusion, as typified by the
eye and mouth mask conditions.
Comment: To be published in the Proceedings of the Seventh International Conference on Affective Computing and Intelligent Interaction.