2,603 research outputs found
No Grice: Computers that Lie, Deceive and Conceal
In the future, our daily-life interactions with other people, with computers, robots, and smart environments will be recorded and interpreted by computers or by intelligence embedded in environments, furniture, robots, displays, and wearables. Sensors record our activities, our behavior, and our interactions. Fusing and reasoning about this information makes it possible, using computational models of human behavior and activities, to provide context- and person-aware interpretations of human behavior and activities, including determination of attitudes, moods, and emotions. Sensors include cameras, microphones, eye trackers, position and proximity sensors, tactile sensors, smell sensors, et cetera. Sensors can be embedded in an environment, but they can also move around, for example as part of a mobile social robot, as part of devices we carry with us, or embedded in our clothes or bodies.

Our daily-life behavior and interactions are thus recorded and interpreted. How can we use such environments, and how can such environments use us? Do we always want to cooperate with these environments, and do these environments always want to cooperate with us? In this paper we argue that there are many reasons why users, or rather the human partners of these environments, want to keep information about their intentions and emotions hidden from these smart environments. On the other hand, their artificial interaction partners may have similar reasons not to give away all the information they have, or to treat their human partner as an opponent rather than as someone to be supported by smart technology.

We elaborate on this by surveying examples of human-computer interaction in which there is not necessarily a goal to be explicit about intentions and feelings. In subsequent sections we look at (1) the computer as a conversational partner, (2) the computer as a butler or diary companion, (3) the computer as a teacher or trainer acting in a virtual training environment (a serious game), (4) sports applications (which are not necessarily different from serious-game or educational environments), and (5) games and entertainment applications.
Neurophysiological Assessment of Affective Experience
In the field of Affective Computing, the affective experience (AX) of the user during interaction with computers is of great interest, and the automatic recognition of the user's affective state, or emotion, is one of the big challenges. In this proposal I focus on affect recognition via physiological and neurophysiological signals. Long-standing evidence from psychophysiological research, and more recently from research in affective neuroscience, suggests that both body and brain physiology can indicate the current affective state of a subject. However, several questions about the classification of AX remain unanswered. The principal possibility of AX classification has been demonstrated repeatedly, but its generalisation across different task contexts, eliciting stimulus modalities, subjects, or time is seldom addressed. In this proposal I discuss a possible agenda for the further exploration of physiological and neurophysiological correlates of AX across different elicitation modalities and task contexts.
Bridging the gap between emotion and joint action
Our daily human life is filled with myriad joint action moments, be it children playing, adults working together (e.g., in team sports), or strangers navigating through a crowd. Joint action brings individuals (and the embodiment of their emotions) together in space and in time. Yet little is known about how individual emotions propagate through embodied presence in a group, and how joint action changes individual emotion. In fact, the multi-agent component is largely missing from neuroscience-based approaches to emotion, and conversely, joint action research has not yet found a way to include emotion as one of the key parameters for modeling socio-motor interaction. In this review, we first identify this gap and then marshal evidence from various branches of science showing a strong entanglement between emotion and acting together. We propose an integrative approach to bridge the gap, highlight five research avenues for doing so in behavioral neuroscience and the digital sciences, and address some of the key challenges in the area faced by modern societies.
A Review of Verbal and Non-Verbal Human-Robot Interactive Communication
In this paper, an overview of human-robot interactive communication is presented, covering verbal as well as non-verbal aspects of human-robot interaction. Following a historical introduction and a motivation towards fluid human-robot communication, ten desiderata are proposed, which provide an organizational axis for both recent and future research on human-robot communication. The ten desiderata are then examined in detail, culminating in a unifying discussion and a forward-looking conclusion.
Electrophysiological Studies of Visual Attention and of Emotion Regulation
Electrophysiological methods, such as electroencephalography (EEG) and electrocardiography (ECG), measure biological activity that allows us to infer underlying cognitive processes. In the first study, we use EEG to track feature-based attention (FBA), a form of visual attention that helps one detect objects with a particular color, motion, or orientation. We explore the use of steady-state visual evoked potentials (SSVEPs), generated by flicker presented peripherally, to track attention in a visual search task presented centrally. Classification results show that one can track an observer's attended color, which suggests that these methods may provide a viable means of tracking FBA in a real-time task. In the second study, we use cardiovascular measures to examine influences of the emotion regulation strategy of reappraisal. We examine cooperation and cardiovascular responses in individuals whose opponent defected in the first round of an iterated Prisoner's Dilemma. Using the biopsychosocial (BPS) model of challenge and threat, we find significant differences between the emotion regulation conditions: participants primed with the reappraisal strategy were weakly comparable with a threat state of the BPS model, and participants without an emotion regulation strategy were weakly comparable with a challenge state. In the third study, we use EEG to study the chromatic sensitivity of FBA for color during a visual search task. We use SSVEP responses evoked through peripheral flicker to measure the spectral tuning of color detection mechanisms and how attentional selection is affected by distractor color. We find smaller responses for the distractor colors, suggesting that feature-based attention to a particular color involves chromatic mechanisms that both enhance the response to a target and minimize responses to distractors.
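The SSVEP method described above relies on frequency tagging: each stimulus flickers at its own rate, and attending to a stimulus boosts the EEG response at that stimulus's flicker frequency. As a minimal sketch of the general idea (not the study's actual pipeline; the sampling rate, frequencies, amplitudes, and simple FFT-bin decoder here are all illustrative assumptions), a toy single-channel decoder might look like:

```python
import numpy as np

def ssvep_power(signal, fs, freq):
    """Spectral power of `signal` at `freq` Hz, from the nearest DFT bin."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - freq))]

def decode_attended_frequency(signal, fs, tag_freqs):
    """Return the tag frequency with the strongest SSVEP response."""
    powers = {f: ssvep_power(signal, fs, f) for f in tag_freqs}
    return max(powers, key=powers.get)

# Toy "EEG": attention to the 12 Hz flicker boosts its SSVEP amplitude
# relative to the unattended 15 Hz flicker (amplitudes are made up).
fs = 250                              # sampling rate in Hz (assumed)
t = np.arange(0, 4, 1.0 / fs)         # one 4-second epoch
rng = np.random.default_rng(0)
eeg = (1.5 * np.sin(2 * np.pi * 12 * t)    # attended 12 Hz stimulus
       + 0.5 * np.sin(2 * np.pi * 15 * t)  # unattended 15 Hz stimulus
       + rng.normal(0, 1, t.size))         # background noise

print(decode_attended_frequency(eeg, fs, [12, 15]))  # → 12
```

A real pipeline would average over multiple electrodes and epochs and use a more robust detector (e.g., canonical correlation analysis), but the core decision, comparing response power at the candidate tag frequencies, is the same.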
- …