37 research outputs found

    Haptic Aesthetics and Bodily Properties of Ori Gersht’s Digital Art: A Behavioral and Eye-Tracking Study.

    Experimental aesthetics has shed light on the involvement of pre-motor areas in the perception of abstract art. However, the contribution of texture perception to aesthetic experience is still understudied. We hypothesized that digital screen-based art, despite its immateriality, might suggest potential sensorimotor stimulation. Original born-digital works of art were selected and manipulated by the artist himself. Five behavioral parameters (Beauty, Liking, Touch, Proximity, and Movement) were investigated under four experimental conditions, crossing Resolution (high/low) with Magnitude (entire image/detail). These conditions were expected to modulate the amount of material and textural information afforded by the image. While the Detail condition afforded less content-related information, our results show that it augmented the image’s haptic appeal. High Resolution improved the haptic and aesthetic properties of the images. Furthermore, aesthetic ratings correlated positively with sensorimotor ratings. Our results demonstrate a close relationship between the aesthetic and sensorimotor/haptic qualities of the images, empirically establishing a link between beholders’ bodily involvement and their aesthetic judgment of visual works of art. In addition, we found that beholders’ oculomotor behavior is selectively modulated by the perceptual manipulations performed. The eye-tracking results indicate that observation of the entire, original images is the only condition in which the latency of the first fixation is shorter when participants gaze at the left side of the images. These results thus demonstrate the existence of a left-side bias during the observation of digital works of art, particularly while participants are observing the original versions.

    The consequences of COVID-19 on social interactions: an online study on face covering

    The COVID-19 pandemic has dramatically changed the nature of our social interactions. To understand how protective equipment and distancing measures influence the ability to comprehend others' emotions and, thus, to interact effectively with others, we carried out an online study across the Italian population during the first pandemic peak. Participants were shown static facial expressions (Angry, Happy, and Neutral) covered by a sanitary mask or by a scarf. They were asked to evaluate the expressed emotions and to assess the degree to which they would adopt physical and social distancing measures for each stimulus. Results demonstrate that, despite the covering of the lower face, participants correctly recognized the facial expressions of emotions, with a polarizing effect on emotional valence ratings found in females. Notably, while females' ratings for physical and social distancing were driven by the emotional content of the stimuli, males were influenced by the "covered" condition. The results also show the impact of the pandemic on the anxiety and fear experienced by participants. Taken together, our results offer novel insights into the impact of the COVID-19 pandemic on social interactions, providing a deeper understanding of the way people react to different kinds of protective face covering.

    Emotional body postures affect inhibitory control only when task-relevant

    A classical theoretical frame for interpreting motor reactions to emotional stimuli holds that such stimuli, particularly those that are threat-related, are processed preferentially, i.e., they are capable of capturing and grabbing attention automatically. Recent research has challenged this view, showing that the task relevance of emotional stimuli is crucial for obtaining a reliable behavioral effect. Such evidence indicated that emotional facial expressions do not automatically influence motor responses in healthy young adults, but do so only when intrinsically pertinent to the subject’s ongoing goals. Given the theoretical relevance of these findings, it is essential to assess their generalizability to different, socially relevant emotional stimuli such as emotional body postures. To address this issue, we compared the performance of 36 right-handed participants in two versions of a Go/No-go task. In the Emotional Discrimination task, participants were required to withhold their responses at the display of emotional body postures (fearful or happy) and to move at the presentation of neutral postures. In the control task, by contrast, the same images were shown, but participants had to respond according to the color of the actor’s or actress’s t-shirt, disregarding the emotional content. Results showed that participants made more commission errors (instances in which they moved even though the No-go signal was presented) for happy than for fearful body postures in the Emotional Discrimination task. However, this difference disappeared in the control task. Such evidence indicates that, like facial expressions, emotional body expressions do not influence motor control automatically, but only when they are task-relevant.

    Shedding light on typical species: implications for habitat monitoring

    Habitat monitoring in Europe is regulated by Article 17 of the Habitats Directive, which suggests the use of typical species to assess habitat conservation status. Yet, the Directive uses the term “typical” species without providing a definition, either for its use in reporting or for its use in impact assessments. To address the issue, an online workshop was organized by the Italian Society for Vegetation Science (SISV) to shed light on the diversity of perspectives regarding the different concepts of typical species and to discuss the possible implications for habitat monitoring. To this aim, we surveyed 73 people with widely varying degrees of expertise in the field of vegetation science by means of a tailored questionnaire composed of six questions. We analysed the data using Pearson's Chi-squared test to verify whether the answers diverged from a random distribution, and we checked the effect of the surveyees' degree of experience on the results. We found that most of the surveyees agreed on the use of the phytosociological method for habitat monitoring and of diagnostic and characteristic species to evaluate the structural and functional conservation status of habitats. With this contribution, we shed light on the meaning of “typical” species in the context of habitat monitoring.

    Notulae to the Italian native vascular flora: 1

    In this contribution, new data concerning the Italian distribution of the native vascular flora are presented. It includes new records, exclusions, and confirmations pertaining to the Italian administrative regions for taxa in the genera Arundo, Bromopsis, Cistus, Crocus, Festuca, Galeopsis, Genista, Lamium, Leucanthemum, Nerium, Orobanche, Peucedanum, Pilosella, Polycnemum, Stipa, and Viola.

    Notulae to the Italian alien vascular flora: 1

    In this contribution, new data concerning the Italian distribution of the alien vascular flora are presented. It includes new records, exclusions, and confirmations for Italy or for Italian administrative regions for taxa in the genera Agave, Arctotheca, Berberis, Bidens, Cardamine, Catalpa, Cordyline, Cotoneaster, Dichondra, Elaeagnus, Eragrostis, Impatiens, Iris, Koelreuteria, Lamiastrum, Lantana, Ligustrum, Limnophila, Lonicera, Lycianthes, Maclura, Mazus, Paspalum, Pelargonium, Phyllanthus, Pyracantha, Ruellia, Sorghum, Symphyotrichum, Triticum, Tulbaghia, and Youngia.

    Audio-visuomotor processing in the musician's brain: an ERP study on professional violinists and clarinetists

    The temporal dynamics of brain activation during visual and auditory perception of congruent vs. incongruent musical video clips were investigated in 12 musicians from the Milan Conservatory of Music and 12 controls. A total of 368 videos of a clarinetist and a violinist playing the same score on their instruments were presented. The sounds were similar in pitch, intensity, rhythm, and duration. To produce an audiovisual discrepancy, the visual information was incongruent in pitch with the soundtrack in half of the trials. ERPs were recorded from 128 sites. Only in musicians, and only for their own instruments, did the incongruent audiovisual information elicit an N400-like negative deflection. SwLORETA applied to the N400 response identified the areas mediating multimodal motor processing: the prefrontal cortex, the right superior and middle temporal gyri, the premotor cortex, the inferior frontal and inferior parietal areas, the EBA, the somatosensory cortex, the cerebellum, and the SMA. The data indicate the existence of audiomotor mirror neurons responding to incongruent visual and auditory information, thus suggesting that they may encode multimodal representations of musical gestures and sounds. These systems may underlie the ability to learn to play a musical instrument.

    Comprehending body language and mimics: an ERP and neuroimaging study on Italian actors and viewers

    In this study, the neural mechanism subserving the ability to understand people's emotional and mental states by observing their body language (facial expression, body posture, and mimics) was investigated in healthy volunteers. ERPs were recorded in 30 Italian university students while they evaluated 280 pictures of highly ecological displays of emotional body language acted out by 8 male and female Italian actors. Pictures were briefly flashed and preceded by short verbal descriptions (e.g., "What a bore!") that were incongruent half of the time (e.g., a picture of a very attentive and concentrated person shown after that verbal description). ERP data and source reconstruction indicated that the first recognition of incongruent body language occurred 300 ms post-stimulus. swLORETA performed on the N400 identified the strongest generators of this effect in the right rectal gyrus (BA11) of the ventromedial orbitofrontal cortex, the bilateral uncus (limbic system) and the cingulate cortex, the cortical areas devoted to face and body processing (STS, FFA, EBA), and the premotor cortex (BA6), which is involved in action understanding. These results indicate that facial and body mimics undergo prioritized processing that is mostly represented in the affective brain and is rapidly compared with verbal information. This process is likely able to regulate social interactions by providing on-line information about the sincerity and trustworthiness of others.

    "Embodied Body Language": An electrical neuroimaging study with emotional faces and bodies

    To date, most investigations in the field of affective neuroscience have focused mainly on the processing of facial expressions, overlooking emotional body language (EBL) notwithstanding its capability to express our emotions. Few electrophysiological studies have investigated the time course and the neural correlates of EBL and the integration of face- and body-related emotional information. The aim of the present study was to investigate both the time course and the neural correlates underlying the integration of affective information conveyed by faces and bodies. We analysed EEG activity evoked during an expression-matching task, which required judging the emotional congruence between sequentially presented pairs of stimuli belonging to the same category (face-face or body-body) or to different categories (face-body or body-face). We focused on the N400 time window, and results showed that incongruent stimuli elicited a modulation of the N400 in all comparisons except the body-face condition. This modulation was mainly detected in the middle temporal gyrus and within regions related to the mirror mechanism. More specifically, while the perception of incongruent facial expressions activates somatosensory-related representations, incongruent emotional body postures also require the activation of motor and premotor representations, suggesting a close link between emotion and action.