6 research outputs found
Neuron Depot: keeping your colleagues in sync by combining modern cloud storage services, the local file system, and simple web applications
Neuroscience today deals with a "data deluge" derived from the availability of high-throughput sensors of brain structure and brain activity, and increased computational resources for detailed simulations with complex output. We report here (1) a novel approach to data sharing between collaborating scientists that brings together file system tools and cloud technologies, (2) a service implementing this approach, called NeuronDepot, and (3) an example application of the service to a complex use case in the neurosciences. The main drivers for our approach are to facilitate collaborations with a transparent, automated data flow that shields scientists from having to learn new tools or data structuring paradigms. Using NeuronDepot is simple: a one-time data assignment by the originator and cloud-based syncing make experimental and modeling data available across the collaboration with minimal overhead. Since data sharing is cloud-based, our approach opens up the possibility of using the new software developments and hardware scalability associated with elastic cloud computing. We provide an implementation that relies on existing synchronization services and is usable from all devices via a reactive web interface. We motivate our solution by solving the practical problems of the GinJang project, a collaboration of three universities across eight time zones with a complex workflow encompassing data from electrophysiological recordings, imaging, morphological reconstructions, and simulations.
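The abstract above describes the sharing pattern only at a high level: a one-time assignment of a dataset by its originator, plus automatic cloud-based syncing to all collaborators via existing synchronization services. The Python sketch below illustrates that general pattern under assumptions of our own; the synced folder path, the manifest format, and the function names (assign_dataset, changed_files, file_digest) are hypothetical and are not part of the NeuronDepot implementation.

import hashlib
import json
import shutil
from pathlib import Path

# Hypothetical root of a folder managed by an existing sync client
# (e.g. a Dropbox- or ownCloud-style directory); not a NeuronDepot path.
SYNCED_ROOT = Path.home() / "CloudSync" / "ginjang-shared"

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def assign_dataset(local_dir: Path, dataset_name: str) -> Path:
    """One-time assignment: copy a local dataset into the synced folder and
    write a manifest of per-file hashes next to it. The sync client then
    propagates the folder to every collaborator."""
    target = SYNCED_ROOT / dataset_name
    shutil.copytree(local_dir, target, dirs_exist_ok=True)
    manifest = {
        str(p.relative_to(target)): file_digest(p)
        for p in target.rglob("*")
        if p.is_file() and p.name != "manifest.json"
    }
    (target / "manifest.json").write_text(json.dumps(manifest, indent=2))
    return target

def changed_files(dataset_dir: Path, last_seen: dict) -> list[str]:
    """Compare the current manifest against one recorded earlier to list
    files that have been added or modified since then."""
    current = json.loads((dataset_dir / "manifest.json").read_text())
    return [name for name, digest in current.items()
            if last_seen.get(name) != digest]

# Example usage (paths are placeholders):
#   shared = assign_dataset(Path("recordings/session-042"), "ephys-session-042")
#   new_or_modified = changed_files(shared, last_seen={})

In this sketch the transport and conflict handling are left entirely to whatever synchronization client manages the shared folder, mirroring the abstract's reliance on existing synchronization services; the manifest merely gives collaborators a cheap way to see what has changed since they last looked.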
On the role of mentalizing processes in aesthetic appreciation: An ERP study
We used event-related brain potentials to explore the impact of mental perspective taking on processes of aesthetic appreciation of visual art. Participants (nonexperts) were first presented with information about the life and attitudes of a fictitious artist. Subsequently, they were cued trial-wise to make an aesthetic judgment regarding an image depicting a piece of abstract art either from their own perspective or from the imagined perspective of the fictitious artist (i.e., theory-of-mind condition). Positive self-referential judgments were made more quickly and negative self-referential judgments were made more slowly than the corresponding judgments from the imagined perspective. Event-related potential analyses revealed significant differences between the two tasks both within the preparation period (i.e., during the cue-stimulus interval) and within the stimulus presentation period. For the theory-of-mind condition we observed a relative centro-parietal negativity during the preparation period (700–330 ms preceding picture onset) and a relative centro-parietal positivity during the stimulus presentation period (700–1100 ms after stimulus onset). These findings suggest that different subprocesses are involved in aesthetic appreciation and judgment of visual abstract art from one’s own vs. from another person’s perspective.
Sequential modulation of distractor-interference produced by semantic generalization of stimulus features
Sequential modulations of distractor-related interference (i.e., a reduced congruency effect after incongruent as compared to congruent predecessor trials, a.k.a. the Gratton effect) have been taken to reflect conflict-induced attentional focusing. To dismiss an alternative interpretation based on integration and retrieval of low-level features, it is important to exert experimental control over stimulus and response feature sequences. This has been achieved by considering only trials associated with complete feature changes. Furthermore, distractors from two different perceptual dimensions, such as stimulus location and shape, have been combined in the same experiment to investigate the question of specificity vs. generality of conflict adaptation. With this method, feature sequence control can be exerted, in principle, without disregarding data from feature repetition trials. However, such control may be insufficient when the distractor dimensions overlap semantically. In two experiments we found evidence consistent with the assumption that semantic generalization of stimulus features, such as between a stimulus presented at a left-sided location and a stimulus shape pointing to the left, may yield a between-dimension Gratton effect. These findings raise doubts about inferring generalized attentional conflict adaptation when semantically related distractor dimensions are used.
Being Moved: Linguistic Representation and Conceptual Structure
This study explored the organisation of the semantic field and the conceptual structure of moving experiences by investigating German-language expressions referring to the emotional state of being moved. We used present and past participles of eight psychological verbs as primes in a free word-association task, as these grammatical forms place their conceptual focus on the eliciting situation and on the felt emotional state, respectively. By applying a taxonomy of basic knowledge types and computing the Cognitive Salience Index, we identified joy and sadness as key emotional ingredients of being moved, and significant life events and art experiences as main elicitors of this emotional state. Metric multidimensional scaling analyses of the semantic field revealed that the core terms designate a cluster of emotional states characterised by low degrees of arousal and slightly positive valence, the latter due to a nearly balanced representation of positive and negative elements in the conceptual structure of being moved.
Towards a neural chronometric framework for the aesthetic experience of music
Music is often studied as a cognitive domain alongside language. The emotional aspects of music have also been shown to be important, but views on their nature diverge. For instance, the specific emotions that music induces and how they relate to emotional expression are still under debate. Here we propose a mental and neural chronometry of the aesthetic experience of music, initiated and mediated by external and internal contexts such as intentionality, background mood, attention, and expertise. The initial stages necessary for an aesthetic experience of music are feature analysis, integration across modalities, and cognitive processing on the basis of long-term knowledge. These stages are common to individuals belonging to the same musical culture. The initial emotional reactions to music include the startle reflex, core ‘liking’, and arousal. Subsequently, discrete emotions are perceived and induced. Presumably, somatomotor processes synchronizing the body with the music also come into play here. The subsequent stages, in which cognitive, affective, and decisional processes intermingle, require controlled cross-modal neural processes to result in aesthetic emotions, aesthetic judgments, and conscious liking. These latter aesthetic stages often require attention, intentionality, and expertise for their full actualization.
A functional MRI study of happy and sad emotions in music with and without lyrics
Musical emotions, such as happiness and sadness, have been investigated using instrumental music devoid of linguistic content. However, pop and rock, the most common musical genres, utilize lyrics for conveying emotions. Using participants’ self-selected musical excerpts, we studied their behavior and brain responses to elucidate how lyrics interact with musical emotion processing, as reflected by emotion recognition and activation of limbic areas involved in affective experience. We extracted samples from subjects’ selections of sad and happy pieces and sorted them according to the presence of lyrics. Acoustic feature analysis showed that music with lyrics differed from music without lyrics in spectral centroid, a feature related to perceptual brightness, whereas sad music with lyrics did not diverge from happy music without lyrics, indicating the role of other factors in emotion classification. Behavioral ratings revealed that happy music without lyrics induced stronger positive emotions than happy music with lyrics. We also acquired functional magnetic resonance imaging (fMRI) data while subjects performed affective tasks regarding the music. First, using ecological and acoustically variable stimuli, we broadened previous findings about the brain processing of musical emotions and of songs versus instrumental music. Additionally, contrasts between sad music with versus without lyrics recruited the parahippocampal gyrus, the amygdala, the claustrum, the putamen, the precentral gyrus, the medial and inferior frontal gyri (including Broca’s area), and the auditory cortex, while the reverse contrast produced no activations. Happy music without lyrics activated structures of the limbic system and the right pars opercularis of the inferior frontal gyrus, whereas auditory regions alone responded to happy music with lyrics. These findings point to the role of acoustic cues for the experience of happiness in music and to the importance of lyrics for sad musical emotions.