2 research outputs found

    Data from: How oscillating aerodynamic forces explain the timbre of the hummingbird's hum and other animals in flapping flight

    The source of the hummingbird's distinctive hum is not fully understood, but there are clues to its origin in the acoustic nearfield. Hence, we studied six freely hovering Anna's hummingbirds, performing acoustic nearfield holography in vivo using a 2176-microphone array while also directly measuring the 3D aerodynamic forces with a new aerodynamic force platform. We corroborate the acoustic measurements by developing a first-principles acoustic model that integrates the aerodynamic forces with wing kinematics, which shows how the timbre of the hummingbird's hum arises from the oscillating lift and drag forces on each wing. Comparing birds and insects, we find that the characteristic humming timbre and radiated power of their flapping wings originate from the higher harmonics in the aerodynamic forces that support their body weight. Our model analysis across insects and birds shows that allometric deviation makes larger birds quieter and elongated flies louder, while also clarifying complex bioacoustic behavior.
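    The link between force harmonics and timbre can be illustrated with a generic compact-dipole sound model: far-field pressure is proportional to the time derivative of the oscillating wing force, so the n-th force harmonic is amplified roughly n-fold in the radiated spectrum. The sketch below is not the authors' first-principles model; the wingbeat frequency, observer geometry, and harmonic amplitudes are illustrative assumptions only.

```python
# Minimal sketch (not the authors' model): far-field pressure radiated by a
# compact oscillating point force (acoustic dipole), showing why higher force
# harmonics dominate the hum's timbre. All numbers below are illustrative
# assumptions, not measured values.
import numpy as np

f0 = 40.0      # assumed wingbeat frequency, Hz
c = 343.0      # speed of sound, m/s
r = 0.3        # assumed observer distance, m
theta = 0.0    # angle between force axis and observer direction, rad

# Hypothetical wing-force harmonic amplitudes |F_n| in newtons, n = 1..5
F_harm = np.array([0.040, 0.015, 0.006, 0.003, 0.001])

t = np.linspace(0.0, 0.1, 20000)   # 100 ms of signal
F = sum(A * np.cos(2 * np.pi * (n + 1) * f0 * t)
        for n, A in enumerate(F_harm))

# Far-field dipole pressure: p ~ cos(theta) / (4*pi*c*r) * dF/dt, so the n-th
# force harmonic picks up an extra factor of 2*pi*n*f0 in the pressure spectrum.
p = np.cos(theta) / (4 * np.pi * c * r) * np.gradient(F, t)

for n, A in enumerate(F_harm, start=1):
    amp = A * 2 * np.pi * n * f0 * np.cos(theta) / (4 * np.pi * c * r)
    print(f"harmonic {n}: |F_n| = {A:.3f} N -> pressure amplitude ~ {amp:.2e} Pa")
```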

    Aircraft Marshaling Signals Dataset of FMCW Radar and Event-Based Camera for Sensor Fusion

    The advent of neural networks capable of learning salient features from variance in radar data has expanded the breadth of radar applications, often as an alternative sensor or a complementary modality to camera vision. Gesture recognition for command and control is the most commonly explored application. Nevertheless, more suitable benchmarking datasets are needed to assess and compare the merits of the different proposed solutions. Furthermore, most publicly available radar datasets used in gesture recognition provide little diversity, do not provide access to raw ADC data, and are not significantly challenging. To address these shortcomings, we created and made available a new dataset that combines two synchronized modalities, radar and a dynamic vision camera, covering 10 aircraft marshalling signals recorded from 13 people at several distances and angles. Moreover, we propose a sparse encoding of the time-domain (ADC) signals that achieves a dramatic data rate reduction (>76%) while retaining the efficacy of the downstream FFT processing (<2% accuracy loss on recognition tasks). Finally, we demonstrate early sensor fusion results based on fusing compressed radar data, encoded as range-Doppler maps, with dynamic vision data. This approach achieves higher accuracy than either modality alone.
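    For orientation, the conventional range-Doppler pipeline for one raw FMCW ADC frame is a windowed FFT over fast time (range) followed by an FFT over slow time (Doppler). The sketch below pairs this with a naive top-k sparsification of the raw frame purely as a stand-in; the dataset's actual sparse encoding scheme, the frame dimensions, and the 24% keep ratio (mirroring the reported >76% reduction) are assumptions, and the random input is only a placeholder for real chirp data.

```python
# Minimal sketch of conventional FMCW range-Doppler processing plus a naive
# top-k sparsification of the raw ADC frame. The dataset's actual sparse
# encoding differs; frame shape, keep ratio, and the random input are all
# assumptions used only to make the example self-contained.
import numpy as np

n_chirps, n_samples = 128, 256   # chirps per frame, ADC samples per chirp (assumed)
rng = np.random.default_rng(0)
adc_frame = rng.standard_normal((n_chirps, n_samples))  # placeholder for one raw frame

def range_doppler(frame: np.ndarray) -> np.ndarray:
    """Range FFT over fast time (samples), then Doppler FFT over slow time (chirps)."""
    range_fft = np.fft.fft(frame * np.hanning(frame.shape[1]), axis=1)
    doppler_fft = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)
    return np.abs(doppler_fft)

def sparsify_topk(frame: np.ndarray, keep_ratio: float = 0.24) -> np.ndarray:
    """Zero all but the largest-magnitude samples (~76% of values dropped)."""
    k = int(frame.size * keep_ratio)
    thresh = np.sort(np.abs(frame).ravel())[-k]
    return np.where(np.abs(frame) >= thresh, frame, 0.0)

rd_full = range_doppler(adc_frame)
rd_sparse = range_doppler(sparsify_topk(adc_frame))
err = np.linalg.norm(rd_full - rd_sparse) / np.linalg.norm(rd_full)
# Real chirp returns concentrate energy and compress far better than random noise.
print(f"relative range-Doppler error after naive sparsification: {err:.1%}")
```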