7 research outputs found

    Fast estimation of plant growth dynamics using deep neural networks

    No full text
    Background: In recent years, there has been increasing interest in plant behaviour as represented by growth-driven responses. These are generally classified into nastic (internally driven) and tropic (environmentally driven) movements. Nastic movements include circumnutations, circular movements of plant organs commonly associated with search and exploration, while tropisms refer to the directed growth of plant organs toward or away from environmental stimuli, such as light and gravity. Tracking these movements is therefore fundamental to the study of plant behaviour. Convolutional neural networks, as used for human and animal pose estimation, offer an interesting avenue for plant tracking. Here we adopted the Social LEAP Estimates Animal Poses (SLEAP) framework for plant tracking and evaluated it on time-lapse videos spanning a variety of parameters: (i) organ types and imaging angles (e.g., top-view crown leaves vs. side-view shoots and roots), (ii) lighting conditions (full spectrum vs. IR), (iii) plant morphologies and scales (100 μm-scale Arabidopsis seedlings vs. cm-scale sunflowers and beans), and (iv) movement types (circumnutations, tropisms and twining).

    Results: Overall, we found SLEAP to be accurate in tracking side views of shoots and roots, requiring only a small number of user-labelled frames for training. Top views of plant crowns made up of multiple leaves proved more challenging, due to the changing 2D morphology of leaves and the occlusion of overlapping leaves. These cases required a larger number of labelled frames, and the choice of labelling "skeleton" had a great impact on prediction accuracy: a more complex skeleton with fewer individuals (tracking individual plants) gave better results than a simpler skeleton with more individuals (tracking individual leaves).

    Conclusions: In all, these results suggest SLEAP is a robust and versatile tool for high-throughput automated tracking of plants, presenting a new avenue for research focusing on plant dynamics.
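    To make the workflow concrete, the following is a minimal sketch of running SLEAP inference on a time-lapse video through its Python API. The model and video paths are hypothetical, and exact call signatures may vary across SLEAP versions; treat this as an assumption-laden illustration, not the authors' pipeline.

        # Minimal sketch of SLEAP inference on a plant time-lapse video.
        # Assumes SLEAP (https://sleap.ai) is installed; paths are hypothetical.
        import sleap

        # Load a model trained after labelling a handful of frames in the
        # SLEAP GUI; multi-stage (centroid + centered-instance) models can
        # be passed as a list of paths.
        predictor = sleap.load_model("models/plant_shoot_model")

        # Load the time-lapse video and predict poses for every frame.
        video = sleap.load_video("timelapse_sideview.mp4")
        labels = predictor.predict(video)

        # labels.numpy() returns an array of shape (frames, instances,
        # nodes, 2) holding the (x, y) coordinates of each skeleton node,
        # from which circumnutation or tropism trajectories can be read.
        tracks = labels.numpy()
        print(tracks.shape)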

    Fast animal pose estimation using deep neural networks

    No full text
    This dataset contains videos of freely moving fruit flies, as well as trained networks and body position estimates for all ~21 million frames. Download the README.txt file for a detailed description of this dataset's content. See the code repository (https://github.com/talmo/leap) for usage examples of these files.

    Recent work quantifying postural dynamics has attempted to define the repertoire of behaviors performed by an animal. However, a major drawback of these techniques has been their reliance on dimensionality reduction of images, which destroys information about which parts of the body are used in each behavior. To address this issue, we introduce a deep learning-based method for pose estimation, LEAP (LEAP Estimates Animal Pose). LEAP automatically predicts the positions of animal body parts using a deep convolutional neural network with as little as 10 frames of labeled data for training. This framework consists of a graphical interface for interactive labeling of body parts and software for training the network and fast prediction on new data (1 hr to train, 185 Hz predictions). We validate LEAP using videos of freely behaving fruit flies (Drosophila melanogaster) and track 32 distinct points on the body to fully describe the pose of the head, body, wings, and legs with an error rate of <3% of the animal's body length. We recapitulate a number of reported findings on insect gait dynamics and show LEAP's applicability as the first step in unsupervised behavioral classification. Finally, we extend the method to more challenging imaging situations (pairs of flies moving on a mesh-like background) and movies from freely moving mice (Mus musculus), where we track the full conformation of the head, body, and limbs.
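    The "<3% of body length" figure is a localization error normalized by body size. The sketch below shows one conventional way to compute such a metric with NumPy; the arrays and the choice of body-length landmarks are illustrative assumptions, not LEAP's actual evaluation code.

        # Sketch: mean localization error as a percentage of body length.
        # Synthetic arrays stand in for real annotations and predictions.
        import numpy as np

        rng = np.random.default_rng(0)
        n_frames, n_parts = 1000, 32

        # Ground-truth and predicted (x, y) positions per body part.
        truth = rng.uniform(0, 192, size=(n_frames, n_parts, 2))
        pred = truth + rng.normal(scale=1.5, size=truth.shape)

        # Mean Euclidean error in pixels over all frames and parts.
        err_px = np.linalg.norm(pred - truth, axis=-1).mean()

        # Normalize by a body-length estimate, taken here (by assumption)
        # as the mean distance between two landmark parts, 0 and 1.
        body_len = np.linalg.norm(truth[:, 0] - truth[:, 1], axis=-1).mean()
        print(f"error: {100 * err_px / body_len:.2f}% of body length")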

    Open-Source Tools for Behavioral Video Analysis: Setup, Methods, and Development

    Full text link
    Recently developed methods for video analysis, especially models for pose estimation and behavior classification, are transforming behavioral quantification to be more precise, scalable, and reproducible in fields such as neuroscience and ethology. These tools overcome long-standing limitations of manual scoring of video frames and traditional "center of mass" tracking algorithms to enable video analysis at scale. The expansion of open-source tools for video acquisition and analysis has led to new experimental approaches to understanding behavior. Here, we review currently available open-source tools for video analysis, how to set them up in a lab that is new to video recording methods, and some issues that should be addressed by developers and advanced users, including the need to openly share datasets and code, how to compare algorithms and their parameters, and the need for documentation and community-wide standards. We hope to encourage more widespread use and continued development of these tools, which have tremendous potential for accelerating scientific progress in understanding the brain and behavior.

    Comment: 20 pages, 2 figures, 2 tables; this is a commentary on video methods for analyzing behavior in animals that emerged from a working group organized by the OpenBehavior project (openbehavior.com).

    Automated gesture tracking in head-fixed mice

    No full text
    Background: The preparation consisting of a head-fixed mouse on a spherical or cylindrical treadmill offers unique advantages in a variety of experimental contexts. Head fixation provides the mechanical stability necessary for optical and electrophysiological recording and stimulation, and it can be combined with virtual environments such as T-mazes, enabling these types of recording during diverse behaviors.

    New method: We present a low-cost, easy-to-build acquisition system, along with scalable computational methods, to quantitatively measure behavior (locomotion and paw, whisker, and tail motion patterns) in head-fixed mice locomoting on cylindrical or spherical treadmills.

    Existing methods: Several custom supervised and unsupervised methods have been developed for measuring behavior in mice. However, to date there is no low-cost, turn-key, general-purpose, and scalable system for acquiring and quantifying behavior in mice.

    Results: We benchmark our algorithms against ground-truth data generated either by manual labeling or by simpler methods of feature extraction, and demonstrate that they achieve good performance in both supervised and unsupervised settings.

    Conclusions: We present a low-cost suite of tools for behavioral quantification, which serve as valuable complements to the recording and stimulation technologies being developed for the head-fixed mouse preparation.
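    As a rough illustration of the "simpler methods of feature extraction" mentioned above, the sketch below computes a per-frame motion-energy trace within a region of interest using OpenCV frame differencing. The video path and ROI coordinates are hypothetical, and this is an assumed baseline, not the authors' published code.

        # Sketch: motion energy in a region of interest (e.g., around a paw)
        # computed by frame differencing. Illustrative baseline only.
        import cv2
        import numpy as np

        cap = cv2.VideoCapture("headfixed_session.avi")  # hypothetical path
        y0, y1, x0, x1 = 200, 320, 100, 260              # hypothetical ROI

        energy = []
        prev = None
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)[y0:y1, x0:x1]
            if prev is not None:
                # Mean absolute intensity change between consecutive frames.
                energy.append(float(np.mean(cv2.absdiff(gray, prev))))
            prev = gray
        cap.release()

        motion = np.asarray(energy)  # one value per frame transition
        print(motion.shape)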