
    Natural ultrasonic echoes from wing beating insects are encoded by collicular neurons in the CF-FM bat, Rhinolophus ferrumequinum

    1. Acoustic reflections of an 80 kHz ultrasonic signal from a wing-beating moth were recorded from six different incident angles and analyzed in the spectral and time domains. The recorded echoes, as well as independent components of the amplitude and frequency modulations of the echoes, were employed as acoustic stimuli during single-unit studies. 2. The responses of single inferior colliculus neurons to these stimuli were recorded from four horseshoe bats, Rhinolophus ferrumequinum, a species which uses a long constant-frequency (CF) sound with a final frequency-modulated (FM) sweep during echolocation. All neurons responding to wing-beat echoes reliably encoded the fundamental wing-beat frequency as well as the finer frequency and amplitude modulations. 3. These neurons may provide the bat with a neural mechanism to detect periodically moving targets against a cluttered background and to discriminate various insect species on the basis of their wing-beat patterns.

    A computer vision approach to classification of birds in flight from video sequences

    Bird populations are an important bio-indicator, so collecting reliable data is useful for ecologists helping to conserve and manage fragile ecosystems. However, existing manual monitoring methods are labour-intensive, time-consuming, and error-prone. The aim of our work is to develop a reliable system capable of automatically classifying individual bird species in flight from videos. This is challenging but appropriate for use in the field, since there is often a requirement to identify birds in flight rather than when stationary. We present our work in progress, which uses combined appearance and motion features for classification, and report experimental results across seven species using a Normal Bayes classifier with majority voting, achieving a classification rate of 86%.
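    As an editorial illustration of the pipeline this abstract sketches (per-frame classification followed by majority voting over a flight sequence), the minimal Python example below uses scikit-learn's GaussianNB as a stand-in for the Normal Bayes classifier; the feature dimensions and data are placeholders, not the paper's.

```python
# Hypothetical sketch: per-frame classification with majority voting over a clip.
# GaussianNB stands in for the paper's Normal Bayes classifier; feature
# extraction from video frames is assumed to happen elsewhere.
import numpy as np
from sklearn.naive_bayes import GaussianNB


def classify_sequence(train_features, train_labels, sequence_features):
    """Predict one species label for a flight sequence by majority vote.

    train_features    : (n_samples, n_features) appearance+motion descriptors
    train_labels      : (n_samples,) integer species labels
    sequence_features : (n_frames, n_features) descriptors for one video clip
    """
    clf = GaussianNB()
    clf.fit(train_features, train_labels)

    per_frame = clf.predict(sequence_features)        # one label per frame
    labels, counts = np.unique(per_frame, return_counts=True)
    return labels[np.argmax(counts)]                   # majority vote

# Toy usage with random placeholder data (2 classes, 16-D features).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 16))
y_train = rng.integers(0, 2, size=200)
clip = rng.normal(size=(30, 16))
print(classify_sequence(X_train, y_train, clip))
```

    Voting across frames lets evidence accumulated over the whole clip outweigh occasional misclassified frames.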

    Classification of bird species from video using appearance and motion features

    The monitoring of bird populations can provide important information on the state of sensitive ecosystems; however, the manual collection of reliable population data is labour-intensive, time-consuming, and potentially error-prone. Automated monitoring using computer vision is therefore an attractive proposition, which could facilitate the collection of detailed data on a much larger scale than is currently possible. A number of existing algorithms are able to classify bird species from individual high-quality, detailed images, often using manual inputs (such as a priori parts labelling). However, deployment in the field necessitates fully automated in-flight classification, which remains an open challenge due to poor image quality, high and rapid variation in pose, and the similar appearance of some species. We address this as a fine-grained classification problem and have collected a video dataset of thirteen bird classes (ten species and another with three colour variants) for training and evaluation. We present our proposed algorithm, which selects effective features from a large pool of appearance and motion features. We compare our method to others which use appearance features only, including image classification using state-of-the-art deep convolutional neural networks (CNNs). Using our algorithm we achieved a 90% correct classification rate, and we show that using effectively selected motion and appearance features together can produce results which outperform state-of-the-art single-image classifiers. We also show that the most significant motion features improve correct classification rates by 7% compared to using appearance features alone.
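    A hedged sketch of the general idea of selecting effective features from a pooled appearance-plus-motion descriptor; the ANOVA-based SelectKBest criterion and the linear SVM below are illustrative assumptions, not the selection method or classifier reported in the paper, and the data are random placeholders.

```python
# Hypothetical sketch: pick the most discriminative features from a combined
# appearance+motion pool, then classify. The selection criterion (ANOVA F-score)
# and classifier (linear SVM) are assumptions for illustration only.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Placeholder data: 13 classes; appearance (e.g. colour/texture) features
# concatenated with motion (e.g. wing-beat statistics) features.
appearance = rng.normal(size=(390, 256))
motion = rng.normal(size=(390, 64))
X = np.hstack([appearance, motion])
y = rng.integers(0, 13, size=390)

# Keep the k most discriminative features from the combined pool,
# then train a classifier on the reduced descriptor.
pipeline = make_pipeline(SelectKBest(f_classif, k=100), SVC(kernel="linear"))
print(cross_val_score(pipeline, X, y, cv=5).mean())
```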

    CC Sculptoris: A superhumping intermediate polar

    We present high-speed optical, spectroscopic, and Swift X-ray observations made during the dwarf nova superoutburst of CC Scl in November 2011. An orbital period of 1.383 h and a superhump period of 1.443 h were measured, but the principal new finding is that CC Scl is a previously unrecognised intermediate polar, with a white dwarf spin period of 389.49 s that is seen in both the optical and Swift X-ray light curves only during the outburst. In this it closely resembles the old nova GK Per, but unlike the latter it has one of the shortest orbital periods among intermediate polars. Comment: Accepted for publication in MNRAS; 11 pages, 19 figures.
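    The two periods quoted above are commonly summarised by the superhump period excess; the value below is derived here from the quoted numbers as a worked example and is not itself stated in the abstract.

```latex
% Superhump period excess (standard definition); periods taken from the abstract,
% the ~4.3% value is computed here and is not quoted in the paper.
\epsilon = \frac{P_{\mathrm{sh}} - P_{\mathrm{orb}}}{P_{\mathrm{orb}}}
         = \frac{1.443\,\mathrm{h} - 1.383\,\mathrm{h}}{1.383\,\mathrm{h}}
         \approx 0.043
```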

    3D pose estimation of flying animals in multi-view video datasets

    Flying animals such as bats, birds, and moths are actively studied by researchers wanting to better understand these animals’ behavior and flight characteristics. Towards this goal, multi-view videos of flying animals have been recorded both in laboratory conditions and in natural habitats. The analysis of these videos has shifted over time from manual inspection by scientists to more automated and quantitative approaches based on computer vision algorithms. This thesis describes a study on the largely unexplored problem of 3D pose estimation of flying animals in multi-view video data. This problem has received little attention in the computer vision community, where few flying-animal datasets exist. Additionally, published solutions from researchers in the natural sciences have not taken full advantage of advancements in computer vision research. This thesis addresses this gap by proposing three different approaches for 3D pose estimation of flying animals in multi-view video datasets, which evolve from successful pose estimation paradigms used in computer vision. The first approach models the appearance of a flying animal with a synthetic 3D graphics model and then uses a Markov Random Field to model 3D pose estimation over time as a single optimization problem. The second approach builds on the success of Pictorial Structures models and further improves them for the case where only a sparse set of landmarks is annotated in the training data. The proposed approach first discovers parts from regions of the training images that are not annotated. The discovered parts are then used to generate more accurate appearance likelihood terms, which in turn produce more accurate landmark localizations. The third approach takes advantage of the success of deep learning models and adapts existing deep architectures to perform landmark localization. Both the second and third approaches perform 3D pose estimation by first obtaining accurate localization of key landmarks in individual views, and then using calibrated cameras and camera geometry to reconstruct the 3D position of the key landmarks. This thesis shows that the proposed algorithms generate first-of-a-kind and leading results on real-world datasets of bats and moths, respectively. Furthermore, a variety of resources are made freely available to the public to further strengthen the connection between research communities.
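    As an editorial illustration of the final step described above (reconstructing a landmark's 3D position from its localizations in calibrated views), the following minimal Python/OpenCV sketch triangulates a single landmark seen by two cameras; the projection matrices and image coordinates are placeholder values, not from the thesis.

```python
# Hypothetical sketch of the 2D-to-3D reconstruction step: given one landmark
# localized in two calibrated views, triangulate its 3D position.
# The projection matrices and pixel coordinates below are placeholders.
import numpy as np
import cv2

# 3x4 projection matrices P = K [R | t] for two calibrated cameras.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))]).astype(np.float64)
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])]).astype(np.float64)

# Matching 2D coordinates of the same landmark (e.g. a wing tip) in each view,
# shaped (2, n_points) as cv2.triangulatePoints expects.
pts1 = np.array([[0.32], [0.15]], dtype=np.float64)
pts2 = np.array([[0.27], [0.15]], dtype=np.float64)

# Triangulate to homogeneous coordinates, then normalise to get (X, Y, Z).
point_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
point_3d = (point_h[:3] / point_h[3]).ravel()
print(point_3d)
```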