18 research outputs found

    Surmising synchrony of sound and sight: Factors explaining variance of audiovisual integration in hurdling, tap dancing and drumming

    Auditory and visual percepts are integrated even when they are not perfectly temporally aligned, especially when the visual signal precedes the auditory signal. This window of temporal integration for asynchronous audiovisual stimuli is relatively well examined in the case of speech, while other natural action-induced sounds have been widely neglected. Here, we studied the detection of audiovisual asynchrony in three different whole-body actions with natural action-induced sounds: hurdling, tap dancing and drumming. In Study 1, we examined whether audiovisual asynchrony detection, assessed by a simultaneity judgment task, differs as a function of sound production intentionality. Based on previous findings, we expected auditory and visual signals to be integrated over a wider temporal window for actions that create sounds intentionally (tap dancing) than for actions that create sounds incidentally (hurdling). While percentages of perceived synchrony differed in the expected way, we identified two further factors, high event density and low rhythmicity, that also induced higher synchrony ratings. We therefore systematically varied event density and rhythmicity in Study 2, this time using drumming stimuli to exert full control over these variables, together with the same simultaneity judgment task. Results suggest that high event density leads to a bias to integrate rather than segregate auditory and visual signals, even at relatively large asynchronies. Rhythmicity had a similar, albeit weaker, effect when event density was low. Our findings demonstrate that shorter asynchronies and visual-first asynchronies lead to higher synchrony ratings of whole-body actions, pointing to clear parallels with audiovisual integration in speech perception. Overconfidence in the naturally expected, that is, synchrony of sound and sight, was stronger for intentional (vs. incidental) sound production and for movements with high (vs. low) rhythmicity, presumably because both encourage predictive processes. In contrast, high event density appears to increase synchrony judgments simply because it makes the detection of audiovisual asynchrony more difficult. More studies using real-life audiovisual stimuli with varying event densities and rhythmicities are needed to fully uncover the general mechanisms of audiovisual integration.
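
    To make the temporal-window estimation concrete, here is a minimal analysis sketch, assuming simultaneity-judgment data of the kind collected in these studies: it fits a Gaussian-shaped synchrony curve to the proportion of "synchronous" responses across SOAs. The data values, function name and choice of fit are illustrative assumptions, not the authors' actual analysis.

    import numpy as np
    from scipy.optimize import curve_fit

    def synchrony_curve(soa, peak, center, width):
        # Gaussian-shaped proportion of "synchronous" responses as a function of SOA (ms).
        return peak * np.exp(-((soa - center) ** 2) / (2 * width ** 2))

    # Hypothetical group data: negative SOAs = visual signal leads, positive = auditory leads.
    soas = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400], dtype=float)
    p_sync = np.array([0.35, 0.55, 0.80, 0.92, 0.95, 0.85, 0.60, 0.40, 0.25])

    # Fit the temporal integration window; a negative center reflects a visual-lead bias,
    # while the width indexes how tolerant observers are of audiovisual asynchrony.
    (peak, center, width), _ = curve_fit(synchrony_curve, soas, p_sync, p0=[1.0, -50.0, 150.0])
    print(f"peak={peak:.2f}, center={center:.0f} ms, width={width:.0f} ms")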

    Dual-Tasking in the Near-Hand Space: Effects of Stimulus-Hand Proximity on Between-Task Shifts in the Psychological Refractory Period Paradigm

    Two decades of research indicate that visual processing is typically enhanced for items in the space near the hands (near-hand space). Enhanced attention and cognitive control, among other mechanisms, have been thought to be responsible for the observed effects. As accumulating experimental evidence and recent theories of dual-tasking suggest an involvement of cognitive control and attentional processes during dual-tasking, dual-task performance may be modulated in the near-hand space. We therefore performed a series of three experiments that tested whether the near-hand space affects the shift between task-component processing in two visual-manual tasks. We applied a Psychological Refractory Period (PRP) paradigm with varying stimulus-onset asynchrony (SOA) and manipulated stimulus-hand proximity by placing the hands either at the sides of a computer screen (near-hand condition) or on the lap (far-hand condition). In Experiment 1, Task 1 was a number categorization task (odd vs. even) and Task 2 was a letter categorization task (vowel vs. consonant). Stimulus presentation was spatially segregated: Stimulus 1 appeared first on the right side of the screen, followed by Stimulus 2 on the left side. In Experiment 2, we replaced Task 2 with a color categorization task (orange vs. blue). In Experiment 3, Stimulus 1 and Stimulus 2 were presented centrally as a single bivalent stimulus. The classic PRP effect appeared in all three experiments, with Task 2 performance declining at short SOAs while Task 1 performance remained relatively unaffected by task overlap. In none of the three experiments did stimulus-hand proximity affect the size of the PRP effect. Our results indicate that the switching operation between two tasks in the PRP paradigm is neither optimized nor disturbed by being processed in the near-hand space.
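
    For readers less familiar with the PRP logic, the toy calculation below works through the standard central-bottleneck account of why Task 2 slows at short SOAs while Task 1 is largely unaffected. The stage durations and function name are hypothetical and are not taken from the paper.

    def rt2_bottleneck(soa, p1=100, c1=200, p2=100, c2=150, m2=100):
        # Predicted Task-2 reaction time (ms from Stimulus-2 onset) under a central
        # bottleneck: Task 2's central stage cannot start before Task 1's has finished.
        central_start = max(soa + p2, p1 + c1)
        return central_start - soa + c2 + m2

    for soa in (50, 150, 300, 600, 900):
        print(f"SOA = {soa:3d} ms -> predicted RT2 = {rt2_bottleneck(soa):.0f} ms")
    # Short SOAs inflate RT2 (slope of roughly -1); at long SOAs RT2 levels off at the
    # duration of Task 2's own stages, reproducing the classic PRP pattern.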

    Maternal exercise before and during pregnancy does not impact offspring exercise or body composition in mice

    Background: The genome, the environment, and their interactions simultaneously regulate complex traits such as body composition and voluntary exercise levels. One such environmental influence is the maternal milieu (i.e., the in utero environment or maternal care). Variability in the maternal environment may directly impact the mother and simultaneously has the potential to influence the physiology and/or behavior of offspring in utero, after birth, and into adulthood. Here, we used a murine model to examine the effects of maternal voluntary exercise (no wheel running, wheel running prior to gestation, or wheel running prior to and throughout gestation) on offspring weight and body composition (% fat tissue and % lean tissue) throughout development (~3 to ~9 weeks of age). Additionally, we examined the effects of ~6 weeks of maternal exercise (prior to and during gestation) on offspring exercise levels at ~9 weeks of age.
    Results: We observed no substantial effects of maternal exercise on male or female offspring body composition throughout development, or on the propensity of offspring to engage in voluntary wheel running. At the level of the individual, correlational analyses revealed some statistically significant relationships between maternal and offspring exercise levels, likely reflecting previously known heritability estimates for such traits.
    Conclusions: The current results conflict with previous findings in human and mouse models demonstrating that maternal exercise has the potential to alter offspring phenotypes. We discuss our negative findings in the context of the timing of the maternal exercise and the level of biological organization of the phenotypes examined in the offspring.

    Lexical olfaction recruits olfactory orbitofrontal cortex in metaphorical and literal contexts

    The investigation of specific lexical categories has substantially contributed to advancing our knowledge of how meaning is neurally represented. One sensory domain that has received particularly little attention is olfaction. This study investigates the neural representation of lexical olfaction. In an fMRI experiment, participants read olfactory metaphors, their literal paraphrases, and literal olfactory sentences. Regions of interest were defined by a functional localizer run of odor processing. We observed activation in secondary olfactory areas during both metaphorical and literal olfactory processing, thus extending previous findings to the novel source domain of olfaction. Previously reported enhanced activation in emotion-related areas due to metaphoricity could not be replicated. Finally, no activation of primary olfactory cortex was found during lexical olfaction processing. We suggest that this absence is due to olfactory hedonicity being crucial for understanding the meaning of the current olfactory expressions. Consequently, the processing of olfactory hedonicity recruits secondary olfactory areas.

    Touching events predict human action segmentation in brain and behavior


    Incidental or intentional? Different brain responses to one's own action sounds in hurdling vs. tap dancing

    Most human actions produce concomitant sounds. Action sounds can either be part of the action goal (GAS, goal-related action sounds), as in tap dancing, or a mere by-product of the action (BAS, by-product action sounds), as in hurdling. It is currently unclear whether these two types of action sounds, incidental or intentional, differ in their neural representation, and whether their impact on the evaluation of an action's performance diverges. Here, we examined whether auditory information is a more important factor for positive action quality ratings during the observation of tap dancing compared to hurdling. Moreover, we tested whether observation of tap dancing vs. hurdling led to stronger attenuation in primary auditory cortex, and to a stronger mismatch signal when sounds did not match expectations. We recorded individual point-light videos of newly trained participants performing tap dancing and hurdling. In the subsequent functional magnetic resonance imaging (fMRI) session, participants were presented with the videos of their own actions, including the corresponding action sounds, and were asked to rate the quality of their performance. Videos were either in their original form or scrambled in the visual modality, the auditory modality, or both. As hypothesized, behavioral results showed significantly lower rating scores in the GAS condition compared to the BAS condition when the auditory modality was scrambled. Functional MRI contrasts between BAS and GAS actions revealed higher activation of primary auditory cortex in the BAS condition, speaking in favor of stronger attenuation in GAS, as well as stronger activation of the posterior superior temporal gyri and the supplementary motor area in GAS. The results suggest that the processing of self-generated action sounds depends on whether we intend to produce a sound with our action, and that action sounds may be more prone to be used as sensory feedback when they are part of the explicit action goal. Our findings contribute to a better understanding of the function of action sounds for learning and controlling sound-producing actions.

    Using enriched semantic event chains to model human action prediction based on (minimal) spatial information

    Predicting other people's upcoming actions is key to successful social interactions. Previous studies have started to disentangle the various sources of information that action observers exploit, including objects, movements, contextual cues and features of the acting person's identity. Here we focus on the role of static and dynamic inter-object spatial relations that change during an action. We designed a virtual reality setup and tested recognition speed for ten different manipulation actions. Importantly, all objects were abstracted by emulating them with cubes, so that participants could not infer an action from object information. Instead, participants had to rely only on the limited information provided by the changes in the spatial relations between the cubes. In spite of these constraints, participants were able to predict actions after observing, on average, less than 64% of an action's duration. Furthermore, we employed a computational model, the so-called enriched Semantic Event Chain (eSEC), which incorporates information about different types of spatial relations: (a) objects' touching/untouching, (b) static spatial relations between objects and (c) dynamic spatial relations between objects during an action. Assuming the eSEC as an underlying model, we show, using information-theoretical analysis, that humans mostly rely on a mixed-cue strategy when predicting actions. Machine-based action prediction is able to produce faster decisions based on individual cues. We argue that the human strategy, though slower, may be particularly beneficial for the prediction of natural and more complex actions with more variable or partial sources of information. Our findings contribute to the understanding of how individuals can infer an observed action's goal even before full goal accomplishment, and may open new avenues for building robots for conflict-free human-robot cooperation.
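
    As a rough illustration of the event-chain idea, the sketch below encodes each action as a sequence of pairwise touching ("T") / not-touching ("N") events between abstracted objects and matches a partially observed prefix against a small action library. The action names, object labels and matching rule are simplified assumptions for illustration and do not reproduce the published eSEC implementation, which also uses static and dynamic spatial relations.

    # Each action is a chain of (object pair, new relation) events; objects are
    # abstracted, as with the cubes in the experiment. Chains below are made up.
    LIBRARY = {
        "put on top": [("hand-cube", "T"), ("cube-table", "N"), ("cube-support", "T"), ("hand-cube", "N")],
        "push apart": [("hand-cube", "T"), ("cube-cube2", "N"), ("hand-cube", "N")],
    }

    def predict(observed_events):
        # Return every stored action whose event chain starts with the observed prefix.
        return [name for name, chain in LIBRARY.items()
                if chain[:len(observed_events)] == observed_events]

    # After only two relation changes the ongoing action is already uniquely identified,
    # mirroring how observers can commit to a prediction well before the action ends.
    print(predict([("hand-cube", "T"), ("cube-table", "N")]))  # -> ['put on top']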