
    Dynamic patterns make the premotor cortex interested in objects: Influence of stimulus and task revealed by fMRI

    Research in monkeys and humans indicates that the ventrolateral premotor cortex (PMv) underlies not only the preparation of manual movements, but also the perceptual representation of pragmatic object properties. However, visual stimuli without any pragmatic meaning were recently found to elicit selective PMv responses if they were subjected to a perceivable pattern of change. We used functional magnetic resonance imaging (fMRI) to investigate whether perceptual representations in the PMv might apply not only to pragmatic, but also to dynamic stimulus properties. To this end, a sequential figure matching task that required the processing of dynamic features was contrasted with a non-figure control task (Experiment 1) and an individual figure matching task (Experiment 2). In order to control for potential influences of stimulus properties that might be associated with pragmatic attributes, different types of abstract visual stimuli were employed. The experiments yielded two major findings: if their dynamic properties are attended, then abstract 2D visual figures are sufficient to trigger activation within premotor areas involved in hand-object interaction. Moreover, these premotor activations are independent of stimulus properties that might relate to pragmatic features. The results imply that the PMv is engaged in the processing of stimuli that are usually or actually embedded within either a pragmatic or a dynamic context.

    A blueprint for target motion: fMRI reveals perceived sequential complexity to modulate premotor cortex

    The execution of movements that are guided by an increasingly complex target motion is known to draw on premotor cortices. Whole-brain functional magnetic resonance imaging was used to investigate whether, in the absence of any movement, attending to and predicting increasingly complex target motion also rely on premotor cortices. Complexity was varied as a function of the number of sequential elements and the amount of dynamic sequential trend in a pulsing target motion. As a result, serial prediction caused activations in premotor and parietal cortices, particularly within the right hemisphere. Parametric analyses revealed that the right ventrolateral premotor cortex and the right anterior intraparietal sulcus were the only areas that, in addition, covaried positively with both behavioral and physical measures of sequential complexity. Further areas that covaried positively with increasing task difficulty reflected influences of both the number and the trend manipulation. In particular, increasing element number drew on dorsal premotor and corresponding posterior intraparietal regions, whereas increasing trend drew on the visual motion area and area V4. The present findings demonstrate that premotor involvement directly reflects perceptual complexity in attended and predicted target motion. It is suggested that when we try to predict how a target will move, the motor system generates a “blueprint” of the observed motion that allows potential sensorimotor integration. In the absence of any motor requirement, this blueprint appears to be not a by-product of motor planning, but rather the basis for target motion prediction.

    Burlesque operetta in three acts - written by Rezső Báron and Pál Fellner - music composed by Rezső Báron

    Municipal Theatre, Debreczen. Friday, 5 September 1913: A novelty, performed here for the first time. University and National Library, University of Debrecen.

    Premotor cortex in observing erroneous action: An fMRI study

    The lateral premotor cortex (PMC) is involved during action observation in monkeys and humans, reflecting a matching process between observed actions and their corresponding motor schemata. In the present study, functional magnetic resonance imaging (fMRI) was used to investigate whether paying attention to the two observable action components, objects and movements, modulates premotor activation during the observation of actions. Participants were asked to classify presented movies as showing correct actions, erroneous actions, or senseless movements. Erroneous actions were incorrect either with regard to the employed objects or to the performed movements. The experiment yielded two major results: (1) The ventrolateral premotor cortex (vPMC) and the anterior part of the intraparietal sulcus (aIPS) are strongly activated during the observation of actions in humans. Premotor activation was dominantly located within Brodmann Area (BA) 6, and sometimes extended into BA 44. (2) The presentation of object errors and movement errors allowed us to disentangle brain activations corresponding to the analysis of movements and objects in observed actions. Left premotor areas were more involved in the analysis of objects, whereas right premotor areas were dominant in the analysis of movements. It is suggested that the analysis of categorical information, like objects, and that of coordinate information, like movements, are pronounced in different hemispheres.

    VMEXT: A Visualization Tool for Mathematical Expression Trees

    Mathematical expressions can be represented as a tree consisting of terminal symbols, such as identifiers or numbers (leaf nodes), and functions or operators (non-leaf nodes). Expression trees are an important mechanism for storing and processing mathematical expressions as well as the most frequently used visualization of the structure of mathematical expressions. Typically, researchers and practitioners manually visualize expression trees using general-purpose tools. This approach is laborious, redundant, and error-prone. Manual visualizations represent a user's notion of what the markup of an expression should be, but not necessarily what the actual markup is. This paper presents VMEXT - a free and open source tool to directly visualize expression trees from parallel MathML. VMEXT simultaneously visualizes the presentation elements and the semantic structure of mathematical expressions to enable users to quickly spot deficiencies in the Content MathML markup that do not affect the presentation of the expression. Identifying such discrepancies previously required reading the verbose and complex MathML markup. VMEXT also allows one to visualize similar and identical elements of two expressions. Visualizing expression similarity can support developers in designing retrieval approaches and enable improved interaction concepts for users of mathematical information retrieval systems. We demonstrate VMEXT's visualizations in two web-based applications. The first application presents the visualizations alone. The second application shows a possible integration of the visualizations in systems for mathematical knowledge management and mathematical information retrieval. The application converts LaTeX input to parallel MathML, computes basic similarity measures for mathematical expressions, and visualizes the results using VMEXT.
    Comment: 15 pages, 4 figures, Intelligent Computer Mathematics - 10th International Conference, CICM 2017, Edinburgh, UK, July 17-21, 2017, Proceedings
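    The expression-tree representation described above can be sketched in a few lines of Python. This is an illustrative sketch only, not VMEXT's actual API or data model: the `Node` class and `render` helper are hypothetical names, chosen to show how operators become non-leaf nodes and identifiers or numbers become leaves.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Node:
        """One node of an expression tree (hypothetical sketch, not VMEXT code)."""
        label: str                                   # operator, identifier, or number
        children: List["Node"] = field(default_factory=list)

        def is_leaf(self) -> bool:
            # Terminal symbols (identifiers, numbers) have no children.
            return not self.children

    def render(node: Node, depth: int = 0) -> str:
        """Render the tree as an indented outline, one node per line."""
        lines = ["  " * depth + node.label]
        for child in node.children:
            lines.extend(render(child, depth + 1).splitlines())
        return "\n".join(lines)

    # The expression a + b * 2: '+' at the root, '*' as an inner
    # operator node, and the leaves a, b, and 2 as terminal symbols.
    tree = Node("+", [Node("a"), Node("*", [Node("b"), Node("2")])])
    print(render(tree))
    ```

    Running the sketch prints the tree as an indented outline, with each operator's operands nested one level below it; VMEXT renders the same structure graphically and additionally links it to the parallel Presentation MathML markup.
    
    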

    Why you think Milan is larger than Modena: Neural correlates of the recognition heuristic

    When ranking two alternatives by some criterion and only one of the alternatives is recognized, participants overwhelmingly adopt the strategy, termed the recognition heuristic (RH), of choosing the recognized alternative. Understanding the neural correlates underlying decisions that follow the RH could help determine whether people make judgments about the RH's applicability or simply choose the recognized alternative. We measured brain activity by using functional magnetic resonance imaging while participants indicated which of two cities they thought was larger (Experiment 1) or which city they recognized (Experiment 2). In Experiment 1, increased activation was observed within the anterior frontomedian cortex (aFMC), precuneus, and retrosplenial cortex when participants followed the RH compared to when they did not. Experiment 2 revealed that RH decisional processes cannot be reduced to recognition memory processes. As the aFMC has previously been associated with self-referential judgments, we conclude that RH decisional processes involve an assessment of the applicability of the RH.

    How instructions modify perception: An fMRI study investigating brain areas involved in attributing human agency

    Behavioural studies suggest that the processing of movement stimuli is influenced by beliefs about the agency behind these actions. The current study examined how activity in social and action related brain areas differs when participants were instructed that identical movement stimuli were either human or computer generated. Participants viewed a series of point-light animation figures derived from motion-capture recordings of a moving actor, while functional magnetic resonance imaging (fMRI) was used to monitor patterns of neural activity. The stimuli were scrambled to produce a range of stimulus realism categories; furthermore, before each trial participants were told that they were about to view either a recording of human movement or a computer-simulated pattern of movement. Behavioural results suggested that agency instructions influenced participants' perceptions of the stimuli. The fMRI analysis indicated different functions within the paracingulate cortex: ventral paracingulate cortex was more active for human compared to computer agency instructed trials across all stimulus types, whereas dorsal paracingulate cortex was activated more highly in conflicting conditions (human instruction, low realism, or vice versa). These findings support the hypothesis that ventral paracingulate cortex encodes stimuli deemed to be of human origin, whereas dorsal paracingulate cortex is involved more in the ascertainment of human or intentional agency during the observation of ambiguous stimuli. Our results highlight the importance of prior instructions or beliefs on movement processing and the role of the paracingulate cortex in integrating prior knowledge with bottom-up stimuli.

    Surmising synchrony of sound and sight: Factors explaining variance of audiovisual integration in hurdling, tap dancing and drumming.

    Auditory and visual percepts are integrated even when they are not perfectly temporally aligned with each other, especially when the visual signal precedes the auditory signal. This window of temporal integration for asynchronous audiovisual stimuli is relatively well examined in the case of speech, while other natural action-induced sounds have been widely neglected. Here, we studied the detection of audiovisual asynchrony in three different whole-body actions with natural action-induced sounds: hurdling, tap dancing, and drumming. In Study 1, we examined whether audiovisual asynchrony detection, assessed by a simultaneity judgment task, differs as a function of sound production intentionality. Based on previous findings, we expected that auditory and visual signals should be integrated over a wider temporal window for actions creating sounds intentionally (tap dancing), compared to actions creating sounds incidentally (hurdling). While percentages of perceived synchrony differed in the expected way, we identified two further factors, namely high event density and low rhythmicity, to induce higher synchrony ratings as well. Therefore, we systematically varied event density and rhythmicity in Study 2, this time using drumming stimuli to exert full control over these variables, and the same simultaneity judgment tasks. Results suggest that high event density leads to a bias to integrate rather than segregate auditory and visual signals, even at relatively large asynchronies. Rhythmicity had a similar, albeit weaker, effect when event density was low. Our findings demonstrate that shorter asynchronies and visual-first asynchronies lead to higher synchrony ratings of whole-body action, pointing to clear parallels with audiovisual integration in speech perception. Overconfidence in the naturally expected, that is, synchrony of sound and sight, was stronger for intentional (vs. incidental) sound production and for movements with high (vs. low) rhythmicity, presumably because both encourage predictive processes. In contrast, high event density appears to increase synchronicity judgments simply because it makes the detection of audiovisual asynchrony more difficult. More studies using real-life audiovisual stimuli with varying event densities and rhythmicities are needed to fully uncover the general mechanisms of audiovisual integration.