73 research outputs found

    Deep Machine Learning Techniques for the Detection and Classification of Sperm Whale Bioacoustics

    Full text link
    We implemented Machine Learning (ML) techniques to advance the study of sperm whale (Physeter macrocephalus) bioacoustics. This entailed employing Convolutional Neural Networks (CNNs) to construct an echolocation click detector designed to classify spectrograms generated from sperm whale acoustic data according to the presence or absence of a click. The click detector achieved 99.5% accuracy in classifying 650 spectrograms. The successful application of CNNs to clicks reveals the potential of future studies to train CNN-based architectures to extract finer-scale details from cetacean spectrograms. Long short-term memory (LSTM) and gated recurrent unit (GRU) recurrent neural networks were trained to perform classification tasks, including (1) “coda type classification” where we obtained 97.5% accuracy in categorizing 23 coda types from a Dominica dataset containing 8,719 codas and 93.6% accuracy in categorizing 43 coda types from an Eastern Tropical Pacific (ETP) dataset with 16,995 codas; (2) “vocal clan classification” where we obtained 95.3% accuracy for two clan classes from Dominica and 93.1% for four ETP clan types; and (3) “individual whale identification” where we obtained 99.4% accuracy using two Dominica sperm whales. These results demonstrate the feasibility of applying ML to sperm whale bioacoustics and establish the validity of constructing neural networks to learn meaningful representations of whale vocalizations.
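
    The click detector described in this abstract is, in outline, an image-classification CNN applied to spectrogram patches labeled "click" or "no click". The sketch below is a minimal illustration of that idea in PyTorch; the layer sizes, input dimensions, and class labels are assumptions for demonstration, not the authors' architecture.

```python
# Minimal sketch (not the authors' code): a small CNN that labels spectrogram
# patches as "click" vs. "no click". Layer sizes and input shape are assumed.
import torch
import torch.nn as nn

class ClickDetectorCNN(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1-channel spectrogram in
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),   # pool to 1x1 so any spectrogram size works
            nn.Flatten(),
            nn.Linear(32, n_classes),  # logits for no-click / click
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example: a batch of 8 spectrograms, 128 frequency bins x 256 time frames (assumed shape).
model = ClickDetectorCNN()
logits = model(torch.randn(8, 1, 128, 256))
predicted = logits.argmax(dim=1)  # 0 = no click, 1 = click (assumed label order)
```

    A model of this form would be trained with a standard cross-entropy loss on labeled spectrograms; the sequence-level tasks in the abstract (coda type, vocal clan, individual identity) would instead feed inter-click interval sequences to LSTM or GRU layers.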

    Automatic visual bias of perceived auditory location

    Full text link
    Studies of reactions to audiovisual spatial conflict (the "ventriloquism" effect) are generally presented as informative about the processes of intermodal coordination. However, most of the literature has failed to isolate genuine perceptual effects from voluntary postperceptual adjustments. A new approach, based on psychophysical staircases, is applied to the case of the immediate visual bias of auditory localization. Subjects have to judge the apparent origin of stereophonically controlled sound bursts as left or right of a median reference line. Successive trials belong to one of two staircases, which start at extreme left and right locations, respectively, and move progressively toward the median on the basis of the subjects' responses. Response reversals occur for locations farther from center when a central lamp is flashed in synchrony with the bursts than without flashes (Experiment 1), revealing an attraction of the sounds toward the flashes. The effect cannot originate in voluntary postperceptual decision, since the occurrence of a response reversal implies that the subject is uncertain about the direction of the target sound. The attraction is contingent on sound-flash synchronization, as early response reversals no longer occurred when the inputs from the two modalities were desynchronized (Experiment 2). Taken together, the results show that the visual bias of auditory localization observed repeatedly in less controlled conditions is due at least in part to an automatic attraction of the apparent location of sounds toward spatially discordant but temporally correlated visual inputs.
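
    The double-staircase logic described in this abstract can be illustrated with a small simulation. The sketch below is hypothetical and not the study's procedure or data: the starting eccentricities, step size, noise level, and the strength of the flash's central attraction are all assumed parameters chosen only to show why a central flash would produce earlier response reversals.

```python
# Minimal sketch (assumed parameters, not the study's code): two staircases
# start at extreme left and right and step a sound toward the midline until
# the simulated listener's left/right response first reverses. A central
# flash is modeled as a fractional attraction of the perceived location to 0.
import random

def simulated_response(position_deg, attraction=0.0, noise_sd=2.0):
    # Negative positions are left of the midline; Gaussian noise models
    # localization uncertainty, attraction pulls the percept toward center.
    perceived = position_deg * (1.0 - attraction) + random.gauss(0.0, noise_sd)
    return "left" if perceived < 0 else "right"

def run_staircase(start_deg, step_deg, attraction=0.0):
    # Step the sound toward the midline; return the position of the first
    # reversal, i.e., the first response opposite to the true side.
    expected = "left" if start_deg < 0 else "right"
    position = start_deg
    while True:
        if simulated_response(position, attraction) != expected:
            return position
        position = position - step_deg if position > 0 else position + step_deg

def mean_reversal(start_deg, step_deg, attraction, n=500):
    # Average absolute eccentricity of the first reversal over n simulated runs.
    return sum(abs(run_staircase(start_deg, step_deg, attraction)) for _ in range(n)) / n

random.seed(1)
print("Mean reversal eccentricity, sound alone:",
      round(mean_reversal(20.0, 1.0, attraction=0.0), 1), "deg")
print("Mean reversal eccentricity, with central flash:",
      round(mean_reversal(20.0, 1.0, attraction=0.5), 1), "deg")
```

    Under these assumptions the flash condition yields reversals at larger eccentricities, mirroring the pattern the abstract reports for Experiment 1.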

    Psychology and legal change: On the limits of a factual jurisprudence.

    Full text link

    Impact of New Technologies on Geomatics in 2010

    No full text