Exploring the Affective Loop
Research in psychology and neurology shows that both body and mind are
involved when experiencing emotions (Damasio 1994, Davidson et al.
2003). People are also very physical when they try to communicate their
emotions. Somewhere between being consciously and unconsciously
aware of it, we produce both verbal and physical signs to make
other people understand how we feel. Simultaneously, this production of
signs involves us in a stronger personal experience of the emotions we
express.
Emotions are also communicated in the digital world, but there is little
focus on users' personal as well as physical experience of emotions in
the available digital media. In order to explore whether and how we can
expand existing media, we have designed, implemented and evaluated
/eMoto/, a mobile service for sending affective messages to others. With
eMoto, we explicitly aim to address both cognitive and physical
experiences of human emotions. Through combining affective gestures for
input with affective expressions that make use of colors, shapes and
animations for the background of messages, the interaction "pulls" the
user into an /affective loop/. In this thesis we define what we mean by
affective loop and present a user-centered design approach expressed
through four design principles inspired by previous work within Human
Computer Interaction (HCI) but adjusted to our purposes; /embodiment/
(Dourish 2001) as a means to address how people communicate emotions in
real life, /flow/ (Csikszentmihalyi 1990) to reach a state of
involvement that goes further than the current context, /ambiguity/ of
the designed expressions (Gaver et al. 2003) to allow for open-ended
interpretation by the end-users instead of simplistic, one-emotion
one-expression pairs and /natural but designed expressions/ to address
people's natural couplings between cognitively and physically
experienced emotions. We also present results from an end-user study of
eMoto indicating that subjects became both physically and emotionally
involved in the interaction, and that the designed "openness" and
ambiguity of the expressions were appreciated and understood by our
subjects. Through the user study, we identified four potential design
problems that have to be tackled in order to achieve an affective loop
effect: the extent to which users /feel in control/ of the interaction,
/harmony and coherence/ between cognitive and physical expressions,
/timing/ of expressions and feedback in a communicational setting, and
the effects of users' /personality/ on their emotional expressions and
experiences of the interaction.
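As a rough illustration of the gesture-to-expression coupling the abstract describes, the sketch below maps normalized gesture features to background properties. The feature names, thresholds, and color mapping are invented for illustration and are not eMoto's actual implementation.

```python
# Hypothetical sketch of an affective gesture-to-background mapping,
# in the spirit of eMoto's coupling of gestural input with colors,
# shapes and animations. All names and thresholds are illustrative.

def gesture_to_background(pressure: float, shaking: float) -> dict:
    """Map normalized gesture features (0..1) to background properties.

    High pressure and shaking suggest high arousal (warm colors, sharp
    shapes); low values suggest low arousal (cool colors, soft shapes).
    """
    arousal = min(1.0, 0.5 * pressure + 0.5 * shaking)
    # Interpolate hue from a calm blue (240 degrees) toward an excited red (0).
    hue = 240.0 * (1.0 - arousal)
    shape = "spiky" if arousal > 0.6 else "wavy" if arousal > 0.3 else "soft"
    speed = round(0.2 + 1.8 * arousal, 2)  # animation speed factor
    return {"hue": hue, "shape": shape, "animation_speed": speed}

print(gesture_to_background(pressure=0.9, shaking=0.8))
```

The point of such a mapping is deliberate ambiguity: the output constrains the expression's energy without fixing a single emotion label.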
KEER2022
Pre-title: KEER2022. Diversities. Resource description: 25 July 202
A sound is worth a thousand words: exploring the taxonomic and causal link between emotions and sound objects
The object of this investigation is the relation between emotion and sound, and how the
latter can be understood through an emotion-oriented study. Psychological investigations
strive to understand how the world affects people and how people, in turn, understand the
world on the grounds of their own reflections and interpretations. Thus, an emotional
understanding of sound is inevitably linked to the concept of perceived emotion. This
dissertation's purpose is to understand whether there is a taxonomic relation between sounds
and perceived emotions. To this aim, emotional semantics and proposals for emotional
categorization are approached, as well as studies on sound categorization and its relation with
experiments between emotion and sound or music. Two studies investigated the
aforementioned themes. In Experiment 1, participants rated sound-image pairs in a
causal-oriented environment, followed by a similar recall task, with the aim of understanding the
connection between the listener and a sound's semantic content. In Experiment 2, participants
rated a group of sounds, half of which were masked to hide their semantic content, with the
goal of understanding the importance of semantic content in auditory stimuli. Taken together,
the data suggest that some emotions cannot be transmitted by sound alone and that it takes a
combination of the listener, the context, and the sound's physical features in order to get a
complete understanding of perceived emotions.
BRAIN-COMPUTER MUSIC INTERFACING: DESIGNING PRACTICAL SYSTEMS FOR CREATIVE APPLICATIONS
Brain-computer music interfacing (BCMI) presents a novel approach to music making, as it requires only the brainwaves of a user to control musical parameters. This presents immediate benefits for users with motor disabilities that may otherwise prevent them from engaging in traditional musical activities such as composition, performance or collaboration with other musicians. BCMI systems with active control, where a user can make cognitive choices that are detected within brain signals, provide a platform for developing new approaches towards accomplishing these activities. BCMI systems that use passive control present an interesting alternative to active control, where control over music is accomplished by harnessing brainwave patterns that are associated with subconscious mental states. Recent developments in brainwave measuring technologies, in particular electroencephalography (EEG), have made brainwave interaction with computer systems more affordable and accessible, and the time is ripe for research into the potential such technologies can offer for creative applications for users of all abilities.
This thesis presents an account of BCMI development that investigates methods of active, passive and hybrid (multiple control methods) control that include control over electronic music, acoustic instrumental music, multi-brain systems and combining methods of brainwave control.
In practice there are many obstacles associated with detecting useful brainwave signals, in particular when scaling systems otherwise designed for medical studies for use outside of laboratory settings. Two key areas are addressed throughout this thesis. Firstly, improving the accuracy of meaningful brain signal detection in BCMI, and secondly, exploring the creativity available in user control through ways in which brainwaves can be mapped to musical features.
Six BCMIs are presented in this thesis, each with the objective of exploring a unique aspect of user control. Four of these systems are designed for live BCMI concert performance, one evaluates a proof-of-concept through end-user testing and one is designed as a musical composition tool.
The thesis begins by exploring the field of brainwave detection and control and identifies the steady-state visually evoked potential (SSVEP) method of eliciting brainwave control as a suitable technique for use in BCMI. In an attempt to improve the signal accuracy of the SSVEP technique, a new modular hardware unit is presented that provides accurate SSVEP stimuli, suitable for live music performance. Experimental data confirm the performance of the unit in tests across three different EEG hardware platforms. Results across 11 users indicate that a mean accuracy of 96% and an average response time of 3.88 seconds are attainable with the system. These results contribute to the development of the BCMI for Activating Memory, a multi-user system. Once a stable SSVEP platform is developed, control is extended through the integration of two more brainwave control techniques: affective (emotional) state detection and motor imagery response. In order to ascertain the suitability of the former, an experiment confirms the accuracy of EEG when measuring affective states in response to music in a pilot study.
This thesis demonstrates how a range of brainwave detection methods can be used for creative control in musical applications. Video and audio excerpts of BCMI pieces are also included in the Appendices.
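The SSVEP control described above rests on the fact that a stimulus flickering at a given frequency raises EEG power at that frequency. A minimal, illustrative detection sketch (not the thesis's actual system, and ignoring filtering, harmonics, and artifact rejection) could look like:

```python
import numpy as np

def detect_ssvep(eeg: np.ndarray, fs: float, stim_freqs: list) -> float:
    """Return the stimulus frequency with the largest spectral power in `eeg`."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    powers = []
    for f in stim_freqs:
        idx = int(np.argmin(np.abs(freqs - f)))  # nearest FFT bin to stimulus
        powers.append(spectrum[idx])
    return stim_freqs[int(np.argmax(powers))]

# Synthetic 2-second signal at 256 Hz dominated by a 15 Hz component plus noise.
fs = 256.0
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 15.0 * t) + 0.3 * rng.standard_normal(t.size)
print(detect_ssvep(eeg, fs, [10.0, 12.0, 15.0]))  # -> 15.0
```

A real system would average over multiple electrodes and epochs; the sketch only conveys why accurate, stable stimulus frequencies (the role of the hardware unit above) matter for detection accuracy.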
Co-Design with Myself: A Brain-Computer Interface Design Tool that Predicts Live Emotion to Enhance Metacognitive Monitoring of Designers
Intuition, metacognition, and subjective uncertainty interact in complex ways
to shape the creative design process. Design intuition, a designer's innate
ability to generate creative ideas and solutions based on implicit knowledge
and experience, is often evaluated and refined through metacognitive
monitoring. This self-awareness and management of cognitive processes can be
triggered by subjective uncertainty, reflecting the designer's self-assessed
confidence in their decisions. Despite their significance, few creativity
support tools have targeted the enhancement of these intertwined components
using biofeedback, particularly the affect associated with these processes. In
this study, we introduce "Multi-Self," a BCI-VR design tool designed to amplify
metacognitive monitoring in architectural design. Multi-Self evaluates
designers' affective responses (valence and arousal) to their work, providing real-time,
visual biofeedback. A proof-of-concept pilot study with 24 participants
assessed its feasibility. While responses on feedback accuracy were mixed, most
participants found the tool useful, reporting that it sparked metacognitive
monitoring, encouraged exploration of the design space, and helped modulate
subjective uncertainty.
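As a rough illustration of how a predicted (valence, arousal) pair might drive a simple visual feedback cue, consider the sketch below. The quadrant labels and intensity formula are assumptions for illustration, not Multi-Self's actual mapping.

```python
# Illustrative sketch (not Multi-Self's code): turning a predicted
# (valence, arousal) pair, each in [-1, 1], into a visual feedback cue.

def affect_to_feedback(valence: float, arousal: float) -> dict:
    """Label the circumplex quadrant and derive a display intensity in [0, 1]."""
    quadrant = {
        (True, True): "excited",    # +valence, +arousal
        (True, False): "content",   # +valence, -arousal
        (False, True): "tense",     # -valence, +arousal
        (False, False): "bored",    # -valence, -arousal
    }[(valence >= 0, arousal >= 0)]
    # Distance from the neutral origin, normalized by the maximum sqrt(2).
    intensity = round((valence ** 2 + arousal ** 2) ** 0.5 / 2 ** 0.5, 2)
    return {"label": quadrant, "intensity": intensity}

print(affect_to_feedback(0.6, -0.4))
```

Visualizing intensity alongside a coarse label, rather than a single emotion word, mirrors the hedged, open-ended feedback that supports metacognitive monitoring rather than dictating an interpretation.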
An aesthetics of touch: investigating the language of design relating to form
How well can designers communicate qualities of touch?
This paper presents evidence that they have some capability to do so, much of which appears to have been learned, but that at present they make limited use of such language. Interviews with graduate designer-makers suggest that they are aware of and value the importance of touch and materiality in their work, but lack a vocabulary that fully relates to their detailed explanations of other aspects such as their intent or selection of materials. We believe that more attention should be paid to the verbal dialogue that happens in the design process, particularly as other researchers show that making-based learning also has a strong verbal element to it. However, verbal language alone does not appear to be adequate for a comprehensive language of touch. Graduate designer-makers' descriptive practices combined non-verbal manipulation within verbal accounts. We thus argue that haptic vocabularies do not simply describe material qualities, but rather are situated competences that physically demonstrate the presence of haptic qualities. Such competences are more important than groups of verbal vocabularies in isolation. Design support for developing and extending haptic competences must take this wide range of considerations into account to comprehensively improve designers' capabilities.
Common Emotion Modeling in Distinct Medium Analysis and Matching
With the ever-growing amount of digital information and multimedia on the World Wide Web and the current trend towards personalizing technology, users find themselves wanting a more intuitive way of finding related information, and not just any information but relevant information that is personal to them. One way to personalize and filter the information is by extracting mood (affect) information, allowing the user to search based on current mood. The artificial intelligence field has done extensive research and continues to discover and improve mood extraction techniques for each distinct medium. This paper will explore how to link and integrate the mood extraction of several distinct mediums (audio, image, and text) by utilizing a common emotion model that is customizable to the user. This project will allow the user to provide an input medium and find a matching output of a different medium based on default settings or user customization.
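The matching idea described above can be reduced to a minimal sketch: if every item, whatever its medium, is projected into a shared valence/arousal space by its medium-specific extractor, cross-medium retrieval becomes a nearest-neighbor search. All names and mood values below are illustrative, not from the paper.

```python
# Hedged sketch of cross-medium matching via a common emotion model:
# items from any medium live in one (valence, arousal) space, so a query
# in one medium can retrieve the closest item of another medium.

def nearest_match(query_va, candidates):
    """Return the candidate id whose (valence, arousal) is closest to the query."""
    def dist(va):
        return (va[0] - query_va[0]) ** 2 + (va[1] - query_va[1]) ** 2
    return min(candidates, key=lambda item: dist(item[1]))[0]

# Mood extracted from an input song (medium: audio) -- illustrative values.
song_mood = (0.8, 0.6)  # happy, energetic

# Candidate images with pre-extracted moods (medium: image).
images = [
    ("sunset.jpg", (0.7, 0.2)),
    ("party.jpg", (0.9, 0.7)),
    ("rainy_street.jpg", (-0.5, -0.3)),
]

print(nearest_match(song_mood, images))  # -> 'party.jpg'
```

User customization, as the abstract proposes, could then amount to warping this shared space (e.g. re-weighting valence versus arousal) before the distance is computed.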