Learning Experiences in Programming: The Motivating Effect of a Physical Interface
A study of undergraduate students learning to program compared a physical interface with a screen-based equivalent to gain insight into what makes for an engaging learning experience. Emotions characterized by the HUMAINE scheme were analysed, identifying links between the emotions experienced during programming and their origins. By capturing learners' emotional experiences immediately after a programming session, evidence was collected of the very positive emotions experienced by learners developing a program using a physical interface (Arduino) compared with a similar program developed using the screen-based equivalent.
A Model of Emotion as Patterned Metacontrol
Adaptive systems use feedback as a key strategy to cope with uncertainty and change in their environments. The information fed back from the sensorimotor loop into the control architecture can be used to change different elements of the controller at four different levels: the parameters of the control model, the control model itself, the functional organization of the agent, and the functional components of the agent. The complexity of such a space of potential configurations is daunting. The only viable alternative for the agent, in practical, economical, and evolutionary terms, is to reduce the dimensionality of the configuration space. This reduction is achieved both by functionalisation (or, more precisely, by interface minimization) and by patterning, i.e. selection among a predefined set of organisational configurations. This analysis lets us state the central problem of how autonomy emerges from the integration of the cognitive, emotional and autonomic systems in strictly functional terms: autonomy is achieved by the closure of functional dependency. In this paper we present a general model of how biological emotional systems operate according to this theoretical analysis, and show how the model is also applicable to a wide spectrum of artificial systems.
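As a concrete illustration of the patterning idea, the sketch below shows a toy metacontroller that, instead of searching the full configuration space, maps an appraisal of the situation onto one of a small, predefined repertoire of organisational patterns. All names, patterns and thresholds are hypothetical, chosen for illustration only, and are not taken from the paper.

```python
# Illustrative sketch: dimensionality reduction of a controller's
# configuration space by "patterning", i.e. selecting among a fixed
# repertoire of organisational configurations.
from dataclasses import dataclass
from typing import Dict

@dataclass
class Configuration:
    """One predefined organisational pattern of the agent."""
    name: str
    controller_params: Dict[str, float]

# A small, fixed repertoire stands in for the (intractably large)
# space of all possible controller configurations.
PATTERNS = {
    "explore": Configuration("explore", {"gain": 0.2, "noise": 0.8}),
    "exploit": Configuration("exploit", {"gain": 0.9, "noise": 0.1}),
    "defend":  Configuration("defend",  {"gain": 1.0, "noise": 0.0}),
}

def select_pattern(appraisal: Dict[str, float]) -> Configuration:
    """Map an emotional appraisal of the situation to a whole pattern,
    instead of tuning each controller parameter independently."""
    if appraisal.get("threat", 0.0) > 0.7:
        return PATTERNS["defend"]
    if appraisal.get("uncertainty", 0.0) > 0.5:
        return PATTERNS["explore"]
    return PATTERNS["exploit"]

print(select_pattern({"threat": 0.1, "uncertainty": 0.9}).name)  # explore
```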
Emotional Brain-Computer Interfaces
Research in brain-computer interfaces (BCIs) has increased significantly in recent years. In addition to their initial role as assistive devices for the physically challenged, BCIs are now proposed for a wider range of applications. As in any HCI application, BCIs can benefit from adapting their operation to the emotional state of the user. BCIs have the advantage of access to brain activity, which can provide significant insight into the user's emotional state. This information can be utilized in two ways. 1) Knowledge of the influence of the emotional state on brain activity patterns can allow the BCI to adapt its recognition algorithms, so that the user's intention is still correctly interpreted despite signal deviations induced by the subject's emotional state. 2) The ability to recognize emotions can be used in BCIs to provide the user with more natural ways of controlling the BCI through affective modulation. Thus, controlling a BCI by recollecting a pleasant memory becomes possible and can potentially lead to higher information transfer rates. These two approaches to emotion utilization in BCI are elaborated in detail in this paper in the framework of noninvasive EEG-based BCIs.
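A minimal sketch of the first strategy, assuming a simple per-state feature baseline: the BCI re-references its input features to a neutral calibration before intent classification, so that emotional-state-induced signal shifts are compensated. The baseline dictionary and the linear correction are illustrative assumptions, not the method of the paper.

```python
# Sketch: compensating EEG features for emotional-state-induced drift
# before the intent classifier sees them.
import numpy as np

# Hypothetical per-state EEG feature baselines, e.g. band powers
# recorded during calibration in a neutral vs. stressed condition.
BASELINES = {
    "neutral":  np.array([1.0, 0.8, 0.5]),
    "stressed": np.array([1.4, 0.6, 0.9]),
}

def correct_features(features: np.ndarray, emotional_state: str) -> np.ndarray:
    """Re-reference features to the neutral baseline so the intent
    classifier sees comparable inputs regardless of emotional state."""
    shift = BASELINES[emotional_state] - BASELINES["neutral"]
    return features - shift

features = np.array([1.5, 0.7, 1.0])            # current trial's features
print(correct_features(features, "stressed"))   # [1.1 0.9 0.6]
```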
A formal model of emotional-response, inspired from human cognition and emotion systems
In this paper, we use the formalisms of decision-making theory together with theories from psychology, physiology and cognition to propose a macro model of human emotional response. We believe that such a formalism can fill the gap between psychology, cognitive science and AI, and can be useful in the design of human-like agents.
This model can be used in a wide variety of applications such as artificial agents, user interfaces, and intelligent tutoring systems. Using the proposed model, machines can be endowed with human-like behaviors such as mood, personality and biological responses, enabling such systems to adapt their responses and behaviors. In situations where there are multiple ways of performing an action, the model can also support the decision-making process.
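As a toy illustration of that last point (structure and parameters assumed, not taken from the paper), the sketch below shows how a mood state and a personality trait could bias the selection among several possible responses to a stimulus.

```python
# Hypothetical sketch: mood and a reactivity trait modulate which of
# several candidate responses an agent selects.
from dataclasses import dataclass

@dataclass
class Agent:
    mood: float          # -1 (negative) .. +1 (positive)
    reactivity: float    # personality trait: 0 (calm) .. 1 (reactive)

def emotional_response(agent: Agent, stimulus_valence: float) -> str:
    """Pick one of several possible actions, biased by mood and trait:
    the larger the mismatch between stimulus and mood, and the more
    reactive the agent, the stronger the response."""
    intensity = agent.reactivity * abs(stimulus_valence - agent.mood)
    if intensity > 0.8:
        return "strong reaction"
    if intensity > 0.4:
        return "moderate reaction"
    return "mild reaction"

print(emotional_response(Agent(mood=-0.5, reactivity=0.9), 0.6))  # strong reaction
```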
Brain–computer interface game applications for combined neurofeedback and biofeedback treatment for children on the autism spectrum
Individuals with Autism Spectrum Disorder (ASD) show deficits in social and communicative skills, including imitation, empathy, and shared attention, as well as restricted interests and repetitive patterns of behaviors. Evidence for and against the idea that dysfunctions in the mirror neuron system are involved in imitation and could be one underlying cause of ASD is discussed in this review. Neurofeedback interventions have reduced symptoms in children with ASD through self-regulation of brain rhythms. However, cortical deficiencies are not the only cause of these symptoms. Peripheral physiological activity, such as the heart rate, is closely linked to neurophysiological signals and associated with social engagement. Therefore, a combined approach targeting the interplay between brain, body and behavior could be more effective. Brain-computer interface applications for combined neurofeedback and biofeedback treatment for children with ASD are currently nonexistent. To facilitate their use, we have designed an innovative game that includes social interactions and provides neural- and body-based feedback that corresponds directly to the underlying significance of the trained signals as well as to the behavior that is reinforced.
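One way such a combined signal could drive a game, sketched here under assumed weights and normalisation (not the paper's design), is to fuse a neural measure such as mu-rhythm suppression with a heart-rate-based calmness score into a single reward value:

```python
# Illustrative sketch: fusing a neurofeedback measure and a biofeedback
# measure into one 0..1 reward signal that a game could act on.
def game_feedback(mu_suppression: float, heart_rate: float,
                  resting_hr: float = 70.0, w_brain: float = 0.6) -> float:
    """Return a 0..1 reward: higher when mu suppression is strong and
    heart rate stays near the resting baseline (calm engagement).
    Weights, baseline and scaling are hypothetical."""
    brain_score = max(0.0, min(1.0, mu_suppression))
    calm_score = max(0.0, 1.0 - abs(heart_rate - resting_hr) / 30.0)
    return w_brain * brain_score + (1.0 - w_brain) * calm_score

print(game_feedback(mu_suppression=0.7, heart_rate=78.0))  # ~0.71
```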
A dataset of continuous affect annotations and physiological signals for emotion analysis
From a computational viewpoint, emotions continue to be intriguingly hard to understand. In research, direct, real-time inspection in realistic settings is not possible; discrete, indirect, post-hoc recordings are therefore the norm. As a result, proper emotion assessment remains a problematic issue. The Continuously Annotated Signals of Emotion (CASE) dataset provides a solution: it focuses on real-time continuous annotation of emotions, as experienced by the participants while watching various videos. For this purpose, a novel, intuitive joystick-based annotation interface was developed that allows simultaneous reporting of valence and arousal, dimensions that are otherwise often annotated independently. In parallel, eight high-quality, synchronized physiological recordings (1000 Hz, 16-bit ADC) were made: ECG, BVP, EMG (3x), GSR (or EDA), respiration and skin temperature. The dataset consists of the physiological and annotation data from 30 participants, 15 male and 15 female, who watched several validated video stimuli. The validity of the emotion induction, as exemplified by the annotation and physiological data, is also presented. Dataset available at: https://rmc.dlr.de/download/CASE_dataset/CASE_dataset.zi
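For readers wanting to work with data of this kind, the sketch below shows one way to align lower-rate joystick annotations with the 1000 Hz physiology by interpolation. The column names, the synthetic data, and the assumed 20 Hz annotation rate are illustrative only; consult the dataset's documentation for the actual file layout.

```python
# Sketch: give every 1000 Hz physiological sample a valence/arousal
# label by interpolating the continuous joystick annotations.
import numpy as np
import pandas as pd

# Synthetic stand-ins for the real files: annotations (assumed 20 Hz)
# and one physiological channel (1000 Hz), timestamps in seconds.
ann = pd.DataFrame({"t": np.arange(0, 10, 0.05),
                    "valence": np.random.rand(200),
                    "arousal": np.random.rand(200)})
phys = pd.DataFrame({"t": np.arange(0, 10, 0.001),
                     "ecg": np.random.randn(10000)})

# Upsample the annotations onto the physiology clock by linear
# interpolation, yielding one label per physiological sample.
phys["valence"] = np.interp(phys["t"], ann["t"], ann["valence"])
phys["arousal"] = np.interp(phys["t"], ann["t"], ann["arousal"])
print(phys.head())
```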
