
    The Harvard Beat Assessment Test (H-BAT): a battery for assessing beat perception and production and their dissociation

    Humans have the abilities to perceive, produce, and synchronize with a musical beat, yet there are widespread individual differences. To investigate these abilities and to determine whether a dissociation between beat perception and production exists, we developed the Harvard Beat Assessment Test (H-BAT), a new battery that assesses beat perception and production abilities. H-BAT consists of four subtests: (1) music tapping test (MTT), (2) beat saliency test (BST), (3) beat interval test (BIT), and (4) beat finding and interval test (BFIT). MTT measures the degree of tapping synchronization with the beat of music, whereas BST, BIT, and BFIT measure perception and production thresholds via psychophysical adaptive staircase methods. We administered the H-BAT to thirty individuals and investigated the performance distribution across these individuals in each subtest. There was a wide distribution in individual abilities to tap in synchrony with the beat of music during the MTT. The degree of synchronization consistency was negatively correlated with thresholds in the BST, BIT, and BFIT: a lower degree of synchronization was associated with higher perception and production thresholds. H-BAT can be a useful tool for determining an individual's ability to perceive and produce a beat within a single session.
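
    The adaptive staircase procedure mentioned in this abstract can be illustrated with a minimal sketch. This is not the H-BAT implementation; the function name, starting level, step size, and the 2-down-1-up rule are illustrative assumptions (2-down-1-up is a common choice that converges near the 70.7%-correct point).

```python
def staircase_threshold(respond, start=100.0, step=10.0, n_reversals=8):
    """Estimate a perceptual threshold with a 2-down-1-up adaptive
    staircase: the stimulus level is lowered after two consecutive
    correct responses and raised after one error.  `respond(level)`
    returns True when the participant answers correctly."""
    level = start
    correct_run = 0
    direction = None
    reversals = []                      # levels at which direction flipped
    while len(reversals) < n_reversals:
        if respond(level):
            correct_run += 1
            if correct_run == 2:        # two correct -> make task harder
                correct_run = 0
                if direction == 'up':
                    reversals.append(level)
                direction = 'down'
                level = max(level - step, 0.0)
        else:                           # one error -> make task easier
            correct_run = 0
            if direction == 'down':
                reversals.append(level)
            direction = 'up'
            level += step
    # threshold estimate: mean level at the last six reversals
    return sum(reversals[-6:]) / len(reversals[-6:])
```

    For a deterministic observer who answers correctly whenever the level is at least 40, the staircase settles into an oscillation around that point and the reversal average lands just below it.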

    Smoothness perception: investigation of beat rate effect on frame rate perception

    Despite the complexity of the Human Visual System (HVS), research over the last few decades has highlighted a number of its limitations. These limitations can be exploited in computer graphics to significantly reduce computational cost, and thus required rendering time, without a viewer perceiving any difference in resultant image quality. Furthermore, cross-modal interaction between different modalities, such as the influence of audio on visual perception, has also been shown to be significant both in psychology and in computer graphics. In this paper we investigate the effect of beat rate on temporal visual perception, i.e. frame rate perception. For the visual quality and perception evaluation, a series of psychophysical experiments was conducted and the data analysed. The results indicate that beat rates in some cases do affect temporal visual perception and that certain beat rates can be used to reduce the amount of rendering required to achieve perceptually high quality. This is another step towards a comprehensive understanding of auditory-visual cross-modal interaction and could potentially be used in high-fidelity interactive multi-sensory virtual environments.

    Global timing: a conceptual framework to investigate the neural basis of rhythm perception in humans and non-human species

    Timing cues are an essential feature of music. To understand how the brain gives rise to our experience of music, we must appreciate how acoustical temporal patterns are integrated over the range of several seconds in order to extract global timing. In music perception, global timing comprises three distinct but often interacting percepts: temporal grouping, beat, and tempo. What directions may we take to further elucidate where and how the global timing of music is processed in the brain? The present perspective addresses this question and describes our current understanding of the neural basis of global timing perception.

    Neural responses to sounds presented on and off the beat of ecologically valid music

    The tracking of rhythmic structure is a vital component of speech and music perception. It is known that sequences of identical sounds can give rise to the percept of alternating strong and weak sounds, and that this percept is linked to enhanced cortical and oscillatory responses. The neural correlates of the perception of rhythm elicited by ecologically valid, complex stimuli, however, remain unexplored. Here we report the effects of a stimulus' alignment with the beat on the brain's processing of sound. Human subjects listened to short popular music pieces while simultaneously hearing a target sound. Cortical and brainstem electrophysiological onset responses to the sound were enhanced when it was presented on the beat of the music, as opposed to shifted away from it. Moreover, the size of the effect of alignment with the beat on the cortical response correlated strongly with the ability to tap to a beat, suggesting that the ability to synchronize to the beat of simple isochronous stimuli and the ability to track the beat of complex, ecologically valid stimuli may rely on overlapping neural resources. These results suggest that the perception of musical rhythm may have robust effects on processing throughout the auditory system.

    Adaptive Frequency Neural Networks for Dynamic Pulse and Metre Perception.

    Beat induction, the means by which humans listen to music and perceive a steady pulse, is achieved via a perceptual and cognitive process. Computationally modelling this phenomenon is an open problem, especially when processing expressive shaping of the music such as tempo change. To meet this challenge we propose Adaptive Frequency Neural Networks (AFNNs), an extension of Gradient Frequency Neural Networks (GFNNs). GFNNs are based on neurodynamic models and have been applied successfully to a range of difficult music perception problems, including those with syncopated and polyrhythmic stimuli. AFNNs extend GFNNs by applying a Hebbian learning rule to the oscillator frequencies. Thus the frequencies in an AFNN adapt to the stimulus through an attraction to local areas of resonance, allowing a great reduction in the dimensionality of the network. Where previous work with GFNNs has focused on frequency and amplitude responses, we also consider phase information as critical for pulse perception. Evaluating the time-based output, we find significantly improved responses of AFNNs compared to GFNNs to stimuli with both steady and varying pulse frequencies. This leads us to believe that AFNNs could replace the linear filtering methods commonly used in beat tracking and tempo estimation systems, and lead to more accurate methods.
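
    The Hebbian attraction of oscillator frequencies toward areas of resonance can be sketched with a single adaptive phase oscillator in the style of dynamic Hebbian learning (Righetti et al., 2006). This is a much simplified stand-in for an AFNN, not the authors' model; the function name, gains, and step sizes are assumptions.

```python
import math

def adapt_oscillator_frequency(omega_input, omega0, K=10.0,
                               dt=5e-4, t_end=60.0):
    """Adaptive phase oscillator with Hebbian frequency learning:
        phi'   = omega - K * F(t) * sin(phi)
        omega' =       - K * F(t) * sin(phi)
    driven by F(t) = sin(omega_input * t).  The intrinsic frequency
    omega is attracted to the frequency of the driving signal."""
    phi, omega, t = 0.0, omega0, 0.0
    trace = []
    for _ in range(int(t_end / dt)):    # forward Euler integration
        coupling = K * math.sin(omega_input * t) * math.sin(phi)
        phi += dt * (omega - coupling)
        omega += dt * (-coupling)       # Hebbian frequency adaptation
        t += dt
        trace.append(omega)
    # average over the last 5 s to smooth the residual oscillation
    tail = trace[-int(5.0 / dt):]
    return sum(tail) / len(tail)
```

    Driving the oscillator at 8 rad/s from an initial 5 rad/s, omega settles close to 8 rad/s. A full AFNN would run a bank of such oscillators and, as the abstract notes, also exploit their phase information.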

    Feel the beat: using cross-modal rhythm to integrate perception of objects, others, and self

    For a robot to be capable of development, it must be able to explore its environment and learn from its experiences. It must find (or create) opportunities to experience the unfamiliar in ways that reveal properties valid beyond the immediate context. In this paper, we develop a novel method for using the rhythm of everyday actions as a basis for identifying the characteristic appearance and sounds associated with objects, people, and the robot itself. Our approach is to identify and segment groups of signals in individual modalities (sight, hearing, and proprioception) based on their rhythmic variation, then to identify and bind causally-related groups of signals across different modalities. By including proprioception as a modality, this cross-modal binding method applies to the robot itself, and we report a series of experiments in which the robot learns about the characteristics of its own body.
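
    As a toy illustration of the binding idea, one can estimate each stream's dominant rhythmic period by autocorrelation and group streams whose periods agree. The period estimator and the grouping rule below are illustrative assumptions, not the method of the paper.

```python
def dominant_period(signal, min_lag=2, max_lag=None):
    """Estimate a stream's dominant period (in samples) as the lag
    that maximizes the autocorrelation of the mean-removed signal."""
    n = len(signal)
    max_lag = max_lag or n // 2
    mean = sum(signal) / n
    x = [s - mean for s in signal]

    def ac(lag):
        return sum(x[i] * x[i + lag] for i in range(n - lag))

    return max(range(min_lag, max_lag + 1), key=ac)

def bind_by_rhythm(streams, tol=1):
    """Group named streams whose dominant periods agree within `tol`
    samples, i.e. bind signals that share a common rhythm."""
    periods = {name: dominant_period(s) for name, s in streams.items()}
    groups = []
    for name, p in periods.items():
        for g in groups:
            if abs(periods[g[0]] - p) <= tol:
                g.append(name)          # same rhythm -> same group
                break
        else:
            groups.append([name])       # start a new group
    return groups
```

    Two square-wave streams sharing a 10-sample period bind together, while a 16-sample stream stays separate, regardless of phase offsets between the shared-rhythm streams.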

    Regularity and asynchrony when tapping to tactile, auditory and combined pulses

    This research is carried out with the aim of developing assistive technology that helps users follow the beat in music, which is of interest to cochlear implant users. The envisioned technology would use tactile feedback on each musical beat. However, this raises fundamental questions about uni- and cross-modal perception which are not addressed in a similar context in the literature. The aims of this study were (i) to find out how well users are able to follow tactile pulses, and (ii) to gain insight into the differences between auditory, tactile and combined auditory-tactile feedback. A tapping experiment was organized with 27 subjects. They were requested to tap along with an auditory pulse, a tactile pulse and a combined auditory-tactile pulse at three different tempi. An evaluation with respect to regularity and asynchrony followed. Subjects were found to perform significantly better in terms of regularity and asynchrony in the auditory and auditory-tactile conditions than in the tactile-only condition. Mean negative asynchrony (MNA) for the auditory and combined (auditory-tactile) conditions was in the range of previous studies. The MNAs for the tactile conditions showed a remarkable dependence on tempo: in the 90 BPM condition a clear anticipation (-20 ms) was reported, in the 120 BPM condition the mean was around zero, and the 150 BPM condition showed a positive MNA (a reaction rather than an anticipation). This effect could be incorporated into the design of such assistive technology.
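
    The two outcome measures can be computed from raw pulse and tap times in a few lines. This is a generic sketch, not the study's analysis code; the function name and the rule of matching each tap to its nearest pulse are assumptions.

```python
def tapping_stats(pulse_times, tap_times):
    """Return (mean asynchrony, regularity) for one tapping trial.
    Asynchrony: signed offset of each tap from its nearest pulse
    (negative = anticipation, the classic MNA).  Regularity: standard
    deviation of the inter-tap intervals (lower = more regular)."""
    asynchronies = []
    for tap in tap_times:
        nearest = min(pulse_times, key=lambda p: abs(tap - p))
        asynchronies.append(tap - nearest)
    mna = sum(asynchronies) / len(asynchronies)

    itis = [b - a for a, b in zip(tap_times, tap_times[1:])]
    mean_iti = sum(itis) / len(itis)
    sd_iti = (sum((x - mean_iti) ** 2 for x in itis) / len(itis)) ** 0.5
    return mna, sd_iti
```

    For a 90 BPM pulse train (inter-onset interval 60/90 s) and taps that all lead the pulse by 20 ms, this yields an MNA of -0.02 s and a regularity (SD) of zero.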

    A Simpler Explanation for Vestibular Influence on Beat Perception: No Specialized Unit Needed

    Some researchers have hypothesized the existence of a specialized brain unit for beat perception in music which is directly influenced by vestibular stimulation arising from motion. They also suggest that this unit is involved in the entrainment of movement to music. However, the data used to support this hypothesis may be explained by a simpler phenomenon: the audiogravic and audiogyral effects. These effects are not related to beat perception at all but concern changes in perceived sound under acceleration. If the perception of a sound changes as a consequence of acceleration of the vestibular system, and those accelerations are timed to coincide with particular beats in a stream of unaccented beats, then those beats will actually sound different. The detection of a given meter in that unaccented stream will therefore arise from this change in sound processing, with no need for a specialized brain mechanism for beat perception. There is no direct evidence supporting the existence of an innate brain unit.

    Tactus ≠ Tempo: Some Dissociations Between Attentional Focus, Motor Behavior, and Tempo Judgment

    Three experiments explored the relationships between surface rhythmic activity, tactus or beat rate, attentional focus, sensorimotor synchronization (tapping), and tempo perception. All involved a rhythmic standard followed by a comparison; the experimental task was a judgment of “slower, same, or faster.” In Experiment 1 participants simply judged relative speed; they focused on the beat level in Experiment 2, and they tapped along as they made their judgments in Experiment 3. In all three experiments judgments were highly accurate (89-97% correct, relative to beat-level inter-onset interval) when the standard and comparison involved the same pattern at the same tempo, and accuracy remained high for the same pattern at different tempos (80-83% correct). Performance degraded significantly in other contexts, especially for different patterns at the same tempo. A main effect for pattern (two levels: same vs. different) and a pattern × tempo interaction were observed in all three experiments; a main effect for tempo (collapsed to two levels: same vs. different) occurred only in Experiment 1. Analysis of a subset of the experimental conditions indicated that surface activity was of greater salience than the beat level in some contexts. Tapping along (Experiment 3) did not improve overall performance any more than simply focusing on the tactus level (Experiment 2), and a possible biasing effect of tapping rate on tempo judgment was observed. Thus there is an apparent dissociation between tactus rate, attentional focus, tapping behavior and tempo judgment, suggesting that our perception of musical speed or tempo is more than simple apprehension of the tactus rate.