2,330 research outputs found

    Multi-camera Realtime 3D Tracking of Multiple Flying Animals

    Automated tracking of animal movement allows analyses that would not otherwise be possible by providing great quantities of data. The additional capability of tracking in realtime - with minimal latency - opens up the experimental possibility of manipulating sensory feedback, thus allowing detailed explorations of the neural basis for control of behavior. Here we describe a new system capable of tracking the position and body orientation of animals such as flies and birds. The system operates with less than 40 ms latency and can track multiple animals simultaneously. To achieve these results, a multi-target tracking algorithm was developed based on the extended Kalman filter and the nearest neighbor standard filter data association algorithm. In one implementation, an eleven-camera system is capable of tracking three flies simultaneously at 60 frames per second using a gigabit network of nine standard Intel Pentium 4 and Core 2 Duo computers. This manuscript presents the rationale and details of the algorithms employed and shows three implementations of the system. An experiment was performed using the tracking system to measure the effect of visual contrast on the flight speed of Drosophila melanogaster. At low contrasts, speed is more variable and faster on average than at high contrasts. Thus, the system is already a useful tool to study the neurobiology and behavior of freely flying animals. If combined with other techniques, such as `virtual reality'-type computer graphics or genetic manipulation, the tracking system would offer a powerful new way to investigate the biology of flying animals. Comment: 18 pages with 9 figures
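    The tracking pipeline described above combines a per-target Kalman-style filter with nearest-neighbor data association. The following is a minimal sketch of those two ingredients, not the paper's implementation: it uses a plain linear Kalman filter (the extended filter reduces to this for a linear constant-velocity model), and all function names are hypothetical.

    ```python
    import numpy as np

    def kf_predict(x, P, F, Q):
        """Predict step: propagate state estimate and covariance."""
        return F @ x, F @ P @ F.T + Q

    def kf_update(x, P, z, H, R):
        """Update step: correct the prediction with measurement z."""
        y = z - H @ x                       # innovation
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        return x + K @ y, (np.eye(len(x)) - K @ H) @ P

    def nearest_neighbor_associate(predictions, detections):
        """Greedy nearest-neighbor association: each predicted position
        claims the closest still-unclaimed detection."""
        assignments = {}
        free = set(range(len(detections)))
        for i, p in enumerate(predictions):
            if not free:
                break
            j = min(free, key=lambda k: np.linalg.norm(detections[k] - p))
            assignments[i] = j
            free.remove(j)
        return assignments
    ```

    In a multi-target loop, each tracked animal would be predicted forward one frame, associated with the nearest detection, and then updated with it.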

    Neuron-level dynamics of oscillatory network structure and markerless tracking of kinematics during grasping

    Oscillatory synchrony is proposed to play an important role in flexible sensory-motor transformations. It is assumed that changes in the oscillatory network structure at the level of single neurons lead to flexible information processing. Yet, how the oscillatory network structure at the neuron level changes with different behavior remains elusive. To address this gap, we examined changes in the fronto-parietal oscillatory network structure at the neuron level, while monkeys performed a flexible sensory-motor grasping task. We found that neurons formed separate subnetworks in the low frequency and beta bands. The beta subnetwork was active during steady states and the low frequency network during active states of the task, suggesting that both frequencies are mutually exclusive at the neuron level. Furthermore, both frequency subnetworks reconfigured at the neuron level for different grip and context conditions, an effect that was mostly lost at any scale larger than single neurons in the network. Our results, therefore, suggest that the oscillatory network structure at the neuron level meets the necessary requirements for the coordination of flexible sensory-motor transformations. In addition, tracking hand kinematics is a crucial experimental requirement for analyzing neuronal control of grasp movements. To this end, a 3D markerless, gloveless hand tracking system was developed using computer vision and deep learning techniques.

    Multi-camera real-time three-dimensional tracking of multiple flying animals

    Automated tracking of animal movement allows analyses that would not otherwise be possible by providing great quantities of data. The additional capability of tracking in real time—with minimal latency—opens up the experimental possibility of manipulating sensory feedback, thus allowing detailed explorations of the neural basis for control of behaviour. Here, we describe a system capable of tracking the three-dimensional position and body orientation of animals such as flies and birds. The system operates with less than 40 ms latency and can track multiple animals simultaneously. To achieve these results, a multi-target tracking algorithm was developed based on the extended Kalman filter and the nearest neighbour standard filter data association algorithm. In one implementation, an 11-camera system is capable of tracking three flies simultaneously at 60 frames per second using a gigabit network of nine standard Intel Pentium 4 and Core 2 Duo computers. This manuscript presents the rationale and details of the algorithms employed and shows three implementations of the system. An experiment was performed using the tracking system to measure the effect of visual contrast on the flight speed of Drosophila melanogaster. At low contrasts, speed is more variable and faster on average than at high contrasts. Thus, the system is already a useful tool to study the neurobiology and behaviour of freely flying animals. If combined with other techniques, such as ‘virtual reality’-type computer graphics or genetic manipulation, the tracking system would offer a powerful new way to investigate the biology of flying animals.

    MScMS-II: an innovative IR-based indoor coordinate measuring system for large-scale metrology applications

    Reflecting the current strong interest in large-scale metrology applications across many fields of manufacturing industry, technologies and techniques for dimensional measurement have recently shown substantial improvement. Ease of use, logistic and economic issues, as well as metrological performance, are assuming an increasingly important role among system requirements. This paper describes the architecture and the working principles of a novel infrared (IR) optical-based system, designed to perform low-cost and easy indoor coordinate measurements of large-size objects. The system consists of a distributed network-based layout, whose modularity allows it to fit differently sized and shaped working volumes by adequately increasing the number of sensing units. Unlike existing spatially distributed metrological instruments, the remote sensor devices are intended to provide embedded data elaboration capabilities, in order to share the overall computational load. The overall system functionalities, including distributed layout configuration, network self-calibration, 3D point localization, and measurement data elaboration, are discussed. A preliminary metrological characterization of system performance, based on experimental testing, is also presented.
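    The 3D point localization step mentioned in the abstract can be illustrated by a generic least-squares intersection of sensing rays: given each sensing unit's position and the direction in which it observes the target, the point minimizing the summed squared perpendicular distances to all rays has a closed-form linear solution. This is a textbook sketch under assumed inputs, not the MScMS-II algorithm; the function name is hypothetical.

    ```python
    import numpy as np

    def triangulate_rays(origins, directions):
        """Least-squares 3D point closest to a set of rays.

        Each ray has origin o_i and direction d_i. Minimizing the sum of
        squared perpendicular distances leads to the linear system
        A p = b with A = sum_i (I - d_i d_i^T) and b = sum_i (I - d_i d_i^T) o_i.
        """
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for o, d in zip(origins, directions):
            d = np.asarray(d, float)
            d = d / np.linalg.norm(d)
            M = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
            A += M
            b += M @ np.asarray(o, float)
        return np.linalg.solve(A, b)
    ```

    With two or more non-parallel rays the system is well conditioned; adding more sensing units simply accumulates further terms into A and b, which matches the modular, distributed layout described above.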

    Real-Time Multi-Fisheye Camera Self-Localization and Egomotion Estimation in Complex Indoor Environments

    In this work, a real-time-capable multi-fisheye camera self-localization and egomotion estimation framework is developed. The thesis covers all aspects, ranging from omnidirectional camera calibration to the development of a complete multi-fisheye camera SLAM system based on a generic multi-camera bundle adjustment method.
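    At the core of any bundle adjustment method is a reprojection residual: the difference between where a 3D point projects into a camera and where it was actually observed. The sketch below uses a simple pinhole model for clarity; the thesis itself targets fisheye/omnidirectional cameras, whose projection model differs, and all names here are illustrative assumptions.

    ```python
    import numpy as np

    def reprojection_residuals(points3d, observations, R, t, f, c):
        """Per-point 2D residuals that a bundle adjustment would minimize.

        R, t map world coordinates into the camera frame; f is the focal
        length in pixels and c the principal point (pinhole sketch only).
        """
        res = []
        for X, uv in zip(points3d, observations):
            Xc = R @ X + t                   # world -> camera frame
            u = f * Xc[0] / Xc[2] + c[0]     # perspective projection
            v = f * Xc[1] / Xc[2] + c[1]
            res.append([uv[0] - u, uv[1] - v])
        return np.asarray(res)
    ```

    A bundle adjuster stacks these residuals over all cameras and points and minimizes their squared norm jointly over poses and structure, typically with a sparse Levenberg-Marquardt solver.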

    3D Hand Movement Measurement Framework for Studying Human-Computer Interaction

    In order to develop better touch and gesture user interfaces, it is important to be able to measure how humans move their hands while interacting with technical devices. The recent advances in high-speed imaging technology and in image-based object tracking techniques have made it possible to accurately measure hand movement from videos without the need for data gloves or other sensors that would limit natural hand movements. In this paper, we propose a complete framework to measure hand movements in 3D in human-computer interaction situations. The framework includes the composition of the measurement setup, selecting the object tracking methods, post-processing of the motion trajectories, 3D trajectory reconstruction, and characterizing and visualizing the movement data. We demonstrate the framework in a context where 3D touch screen usability is studied with 3D stimuli.
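    The 3D trajectory reconstruction step in such a framework is commonly done by triangulating each tracked 2D point from two calibrated camera views. A standard way to do this is the linear direct linear transform (DLT), sketched below under assumed inputs (the paper does not specify its triangulation method, and the function name is hypothetical):

    ```python
    import numpy as np

    def triangulate_dlt(P1, P2, uv1, uv2):
        """Two-view linear (DLT) triangulation.

        P1, P2 are 3x4 projection matrices; uv1, uv2 are the matched
        image coordinates of the same point. Returns the 3D point as the
        null vector of the stacked constraint matrix.
        """
        A = np.vstack([
            uv1[0] * P1[2] - P1[0],
            uv1[1] * P1[2] - P1[1],
            uv2[0] * P2[2] - P2[0],
            uv2[1] * P2[2] - P2[1],
        ])
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]
        return X[:3] / X[3]   # dehomogenize
    ```

    Applied frame by frame to a tracked hand point, this yields the raw 3D trajectory, which the framework then smooths and characterizes in its post-processing stages.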

    Decomposition of 3D joint kinematics of walking in Drosophila melanogaster

    Animals exhibit a rich repertoire of locomotive behaviors. In the context of legged locomotion, i.e. walking, animals can change their heading direction, traverse diverse substrates at different speeds, or even compensate for the loss of a leg. This versatility emerges from the fact that biological limbs have more joints and/or more degrees of freedom (DOF), i.e. independent directions of motion, than required for any single movement task. However, this further entails that multiple, or even infinitely many, joint configurations can result in the same leg stepping pattern during walking. How the nervous system deals with such kinematic redundancy still remains unknown. One proposed hypothesis is that the nervous system does not control individual DOFs, but uses flexible combinations of groups of anatomical or functional DOFs, referred to as motor synergies. Drosophila melanogaster represents an excellent model organism for studying the motor control of walking, not least because of the extensive genetic toolbox available, which, among other things, allows the identification and targeted manipulation of individual neurons or muscles. However, their tiny size and capacity for relatively rapid leg movements hampered research on the kinematics at the level of leg joints due to technical limitations until recently. Hence, the main objective of this dissertation was to investigate the three-dimensional (3D) leg joint kinematics of Drosophila during straight walking. For this, I first established a motion capture setup for Drosophila which allowed the accurate reconstruction of the leg joint positions in 3D with high temporal resolution (400 Hz). Afterwards, I created a kinematic leg model based on anatomical landmarks, i.e. joint condyles, extracted from micro computed-tomography scan data. This step was essential insofar as the actual DOFs of the leg joints in Drosophila were previously unknown.
By using this kinematic model, I found that a mobile trochanter-femur joint can best explain the leg movements of the front legs, but is not mandatory in the other leg pairs. Additionally, I demonstrate that rotations of the femur-tibia plane in the middle legs arise from interactions between two joints, suggesting that the natural orientation of joint rotational axes can extend the leg movement repertoire without increasing the number of elements to be controlled. Furthermore, each leg pair exhibited distinct joint kinematics in terms of the joint DOFs employed and their angle time courses during swing and stance phases. Since it is proposed that the nervous system could use motor synergies to solve the redundancy problem, I finally aimed to identify kinematic synergies based on the joint angles obtained from the kinematic model. By applying principal component analysis to the mean joint angle sets of leg steps, I found that three kinematic synergies are sufficient to reconstruct the movements of the tarsus tip during stepping for all leg pairs. This suggests that the problem of controlling seven to eight joint DOFs can in principle be reduced to three control parameters. In conclusion, this dissertation provides detailed insights into the leg joint kinematics of Drosophila during forward walking which are relevant for deciphering motor control of walking in insects. When combined with the extensive genetic toolbox offered by Drosophila as a model organism, the experimental platform presented here, i.e. the 3D motion capture setup and the kinematic leg model, can facilitate investigations of Drosophila walking behavior in the future.
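    The synergy analysis described above, i.e. principal component analysis of joint angle sets, can be sketched with a plain SVD-based PCA. This is a generic illustration of the technique on an assumed (steps x joint-DOF) angle matrix, not the dissertation's analysis pipeline, and the function name is hypothetical.

    ```python
    import numpy as np

    def kinematic_synergies(angles, n_components=3):
        """PCA on a (steps x joint-DOF) matrix of joint angles.

        Returns the synergy basis (principal components), the per-step
        scores, and the fraction of variance the components explain.
        """
        mean = angles.mean(axis=0)
        Xc = angles - mean                          # centre each DOF
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        explained = S**2 / np.sum(S**2)
        basis = Vt[:n_components]                   # synergies
        scores = Xc @ basis.T                       # per-step weights
        return basis, scores, explained[:n_components].sum()
    ```

    If three components capture nearly all the variance, stepping movements can be reconstructed as `scores @ basis + mean`, i.e. from three control parameters per step, mirroring the reduction from seven to eight joint DOFs reported above.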

    MoveBox: Democratizing MoCap for the Microsoft Rocketbox Avatar Library

    This paper presents MoveBox, an open-source toolbox for animating motion-captured (MoCap) movements onto the Microsoft Rocketbox library of avatars. Motion capture is performed in real time using a single depth sensor, such as Azure Kinect or Windows Kinect V2, or extracted offline from existing RGB videos by leveraging deep-learning computer vision techniques. Our toolbox enables real-time animation of the user’s avatar by converting the transformations between systems that have different joints and hierarchies. Additional features of the toolbox include recording, playback and looping of animations; basic audio lip sync, blinking and resizing of avatars; and finger and hand animations. Our main contribution lies both in the creation of this open-source tool and in its validation on different devices, together with a discussion of MoveBox’s capabilities by end users.
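    The abstract's key technical step, converting transformations between skeletons with different joints and hierarchies, amounts at its simplest to a per-joint retargeting map. The sketch below is a deliberately simplified illustration of that idea, not MoveBox's actual algorithm (which must also handle differing bone orientations and proportions); all names are hypothetical.

    ```python
    def retarget_pose(source_pose, joint_map, default_pose):
        """Copy per-joint transforms from a captured pose onto an avatar
        whose skeleton uses different joint names.

        source_pose: {source_joint_name: transform} from the sensor.
        joint_map:   {source_joint_name: target_joint_name}.
        default_pose: target avatar's rest transforms; joints with no
        mapped source data keep these defaults.
        """
        target = dict(default_pose)
        for src_joint, tgt_joint in joint_map.items():
            if src_joint in source_pose:
                target[tgt_joint] = source_pose[src_joint]
        return target
    ```

    Running this every frame on the sensor's skeleton output yields a pose in the avatar's own joint namespace, which the animation system can then apply directly.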