
    Animal Welfare Assessment

    This Special Issue provides a collection of recent research and reviews investigating many areas of welfare assessment, such as novel approaches and technologies used to evaluate the welfare of farmed, captive, or wild animals. Research in this Special Issue includes welfare assessment of pilot whales, finishing pigs, commercial turkey flocks, and dairy goats; the use of sensors and wearable technologies, such as heart rate monitors to assess sleep in dairy cows, and ear tag sensors and machine learning to assess commercial pig behaviour; non-invasive measures, such as video monitoring of behaviour, computer vision to analyse video footage of red foxes, remote camera traps of free-roaming wild horses, and infrared thermography of effort and recovery in sport horses; telomere length and regulatory genes as novel biomarkers of stress in broiler chickens; the effects of environment on the growth physiology and behaviour of laboratory rare minnows and of housing system on anxiety, stress, fear, and immune function in laying hens; and discussions of natural behaviour in farm animal welfare and of maintaining the health, welfare, and productivity of commercial pig herds.

    Experimental Improvements in Pullet Rearing


    Housing Environment and Farm Animals' Well-Being

    This reprint contains articles from the Special Issue of Animals “Housing Environment and Farm Animals' Well-Being”, including original research, reviews, and communications related to livestock and poultry environmental management, air quality control, emissions mitigation, and assessment of animal health and well-being.

    Visually guided flight in birds using the budgerigar (Melopsittacus undulatus) as a model system

    Keywords: Edge detection, Optic flow, Obstacle avoidance

    Exploring Motion Signatures for Vision-Based Tracking, Recognition and Navigation

    As cameras become increasingly common in intelligent systems, algorithms and systems for understanding video data become increasingly important, with applications including object detection, tracking, scene understanding, and robot navigation. Beyond static appearance, video data contain rich motion information about the environment. Biological visual systems, such as human and animal eyes, are highly sensitive to motion, which has inspired active research on vision-based motion analysis in recent years. The main focus of motion analysis has been on low-level motion representations of pixels and image regions; however, motion signatures can benefit a broader range of applications if further in-depth analysis techniques are developed. In this dissertation, we discuss how to exploit motion signatures to solve problems in two applications: object recognition and robot navigation. First, we use bird species recognition as the application for exploring motion signatures in object recognition. We begin with a study of the periodic wingbeat motion of flying birds. To analyze the wing motion of a flying bird, we establish kinematic models of bird wings and derive the wingbeat periodicity observed in image frames after perspective projection. Time series of salient extremities on bird images are extracted, and the wingbeat frequency is recovered for species classification. Physical experiments show that the frequency-based recognition method is robust to segmentation errors and to measurement loss of up to 30%. In addition to the wing motion, the body motion of the bird is analyzed to extract the flying velocity in 3D space. An interacting multiple-model approach is then designed to capture the combined object motion patterns under different environmental conditions. The proposed systems and algorithms are tested in physical experiments, and the results show a false positive rate of around 20% with a false negative rate close to zero.
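The frequency-based recognition step described above can be sketched as follows; this is a minimal illustration, not the dissertation's actual pipeline, and the synthetic wingtip-height signal, sampling rate, and function name are assumptions made for the example.

```python
import numpy as np

def dominant_frequency(signal, fps):
    """Estimate the dominant periodic frequency (Hz) of a 1-D time series
    from its magnitude spectrum, ignoring the zero-frequency (DC) bin."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()                   # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return freqs[np.argmax(spectrum[1:]) + 1]         # skip the DC bin

# Illustrative example: a 4 Hz "wingbeat" tracked at 30 frames per second,
# with additive noise standing in for extraction errors.
np.random.seed(0)
fps = 30
t = np.arange(0, 4, 1.0 / fps)
wingtip_height = np.sin(2 * np.pi * 4 * t) + 0.2 * np.random.randn(t.size)
print(round(dominant_frequency(wingtip_height, fps), 1))  # prints 4.0
```

The recovered frequency would then feed a classifier, since wingbeat frequency ranges differ across bird species.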
Second, we explore motion signatures for vision-based vehicle navigation. We observe that motion vectors (MVs) encoded in Moving Picture Experts Group (MPEG) videos provide rich information about motion in the environment, which can be used to reconstruct the vehicle's ego-motion and the structure of the scene. However, MVs suffer from high noise levels. To handle this challenge, an error propagation model for MVs is first proposed. Several steps, including MV merging, plane-at-infinity elimination, and planar region extraction, are designed to further reduce noise. The extracted planes are used as landmarks in an extended Kalman filter (EKF) for simultaneous localization and mapping. Results show that the algorithm performs localization and plane mapping with a relative trajectory error below 5.1%. Exploiting the fact that MVs encode both environment information and moving obstacles, we further propose to track moving objects alongside localization and mapping. This enables the two critical navigation functionalities, localization and obstacle avoidance, to be performed in a single framework. MVs are labeled as stationary or moving according to their consistency with geometric constraints, so the extracted planes are separated into moving objects and the stationary scene. Multiple EKFs are used to track the static scene and the moving objects simultaneously. In physical experiments, we show a detection rate of 96.6% for moving objects and a mean absolute localization error below 3.5 meters.
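The stationary/moving labeling of MVs can be illustrated with a simplified sketch. Here the dissertation's plane-based geometric constraints are replaced by a much cruder stand-in, a per-axis median approximating the dominant camera-induced motion, and the threshold value is an assumption chosen for the example.

```python
import numpy as np

def label_motion_vectors(mvs, threshold=2.0):
    """Split block motion vectors into stationary scene vs moving objects.

    Simplified stand-in for geometric-constraint consistency checks: the
    dominant (camera-induced) motion is approximated by the per-axis
    median MV, which is robust to moving-object outliers, and blocks
    deviating from it by more than `threshold` pixels are labeled moving.
    """
    mvs = np.asarray(mvs, dtype=float)        # (N, 2) vectors in pixels
    global_motion = np.median(mvs, axis=0)
    residuals = np.linalg.norm(mvs - global_motion, axis=1)
    return np.where(residuals > threshold, "moving", "stationary")

# Illustrative example: the camera pans, so most blocks share an MV near
# (-3, 0); one block covers an object moving independently.
vecs = [[-3, 0], [-3, 0], [-3, 1], [-2, 0], [8, 1]]
print(label_motion_vectors(vecs))
# ['stationary' 'stationary' 'stationary' 'stationary' 'moving']
```

In the framework described above, the stationary vectors would feed the localization-and-mapping EKF while the moving ones seed per-object trackers.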