
    Now Hear This! Orientation and Behavioral Responses of Hatchling Loggerhead Sea Turtles, Caretta caretta, to Environmental Acoustic Cues

    Although the visual and geologic orientation cues utilized by sea turtle hatchlings during seafinding, when they move from the nest to the sea after hatching, have been well studied, the potential for auditory stimuli to act as an orientation cue has not been well explored. Over the past several decades our knowledge of the auditory capacity of sea turtles has increased greatly, yet little is known about the biological significance of this sensory ability. To investigate whether hatchlings can use ocean sounds during seafinding, we measured the behavioral responses of hatchling loggerhead sea turtles (Caretta caretta), collected from nesting beaches in North Carolina, to beach wave sound recorded on a nesting beach during the summer of 2015. The highest sound energy of beach waves occurs at low frequencies that overlap with the most sensitive hearing range of loggerhead hatchlings (range of frequency detection: 50-1600 Hz, maximum sensitivity: 50-400 Hz). In our experiment, we placed turtles in a V-maze that isolated them from visual, vibratory, and chemical cues. One end of the V held a speaker producing beach wave sounds recorded from nesting beaches, while the other end held sound-reducing foam. We examined the phonotaxic behaviors of the hatchlings at two sound pressure levels (68 dB re: 20 μPa and 64 dB re: 20 μPa, measured directly in front of the speaker). At the higher sound pressure level (68 dB re: 20 μPa), hatchlings exhibited no phonotaxic response (p=1.0); yet at the reduced sound pressure level (64 dB re: 20 μPa), hatchlings exhibited a negative phonotaxic response (p=0.005). In control trials, hatchlings oriented to the two sides of the V-maze equally (p=0.701), suggesting the hatchlings in the lower volume treatment group were responding negatively to the sound.
These results indicate the need for further auditory orientation experiments to better understand hatchling behavioral responses to environmental acoustic cues and to address possible impacts of anthropogenic beach sounds that have the potential to disorient hatchlings during seafinding.
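The playback levels above can be expressed as absolute pressures using the standard dB SPL definition; the sketch below is illustrative only (the levels are quoted from the abstract, but the conversion is not part of the study's methods):

```python
def db_spl_to_pascals(db_spl, p_ref=20e-6):
    """Convert a sound pressure level in dB (re: a reference pressure,
    20 uPa by convention for airborne sound) to RMS pressure in pascals."""
    return p_ref * 10 ** (db_spl / 20)

# The two playback levels used in the V-maze trials.
p_high = db_spl_to_pascals(68)  # 68 dB re: 20 uPa, about 0.050 Pa
p_low = db_spl_to_pascals(64)   # 64 dB re: 20 uPa, about 0.032 Pa
```

The 4 dB difference between conditions thus corresponds to roughly a 1.6-fold difference in sound pressure at the speaker.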

    The effect of hearing aid microphone mode on performance in an auditory orienting task

    OBJECTIVES: Although directional microphones on a hearing aid provide a signal-to-noise ratio benefit in a noisy background, the amount of benefit is dependent on how close the signal of interest is to the front of the user. It is assumed that when the signal of interest is off-axis, users can reorient themselves to the signal to make use of the directional microphones to improve signal-to-noise ratio. The present study tested this assumption by measuring the head-orienting behavior of bilaterally fit hearing-impaired individuals with their microphones set to omnidirectional and directional modes. The authors hypothesized that listeners using directional microphones would have greater difficulty in rapidly and accurately orienting to off-axis signals than they would when using omnidirectional microphones. DESIGN: The authors instructed hearing-impaired individuals to turn and face a female talker in simultaneous surrounding male-talker babble. Participants pressed a button when they felt they were accurately oriented in the direction of the female talker. Participants completed three blocks of trials with their hearing aids in omnidirectional mode and three blocks in directional mode, with mode order randomized. Using a Vicon motion tracking system, the authors measured head position and computed fixation error, fixation latency, trajectory complexity, and proportion of misorientations. RESULTS: Results showed that for larger off-axis target angles, listeners using directional microphones took longer to reach their targets than they did when using omnidirectional microphones, although they were just as accurate. They also used more complex movements and frequently made initial turns in the wrong direction. For smaller off-axis target angles, this pattern was reversed, and listeners using directional microphones oriented more quickly and smoothly to the targets than when using omnidirectional microphones. 
CONCLUSIONS: The authors argue that an increase in movement complexity indicates a switch from a simple orienting movement to a search behavior. For the most off-axis target angles, listeners using directional microphones appear not to know which direction to turn, so they pick a direction at random and simply rotate their heads until the signal becomes more audible. The changes in fixation latency and head orientation trajectories suggest that the decrease in off-axis audibility is a primary concern in the use of directional microphones, and listeners could experience a loss of initial target speech while turning toward a new signal of interest. If hearing-aid users are to receive maximum directional benefit in noisy environments, both adaptive directionality in hearing aids and clinical advice on using directional microphones should take head movement and orientation behavior into account.
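The orienting metrics described above (fixation error, fixation latency, trajectory complexity, misorientation) can be illustrated with a minimal sketch. The function, threshold, and yaw trace below are hypothetical, not the authors' actual Vicon analysis pipeline:

```python
def fixation_metrics(yaw_samples, target_deg, dt, tolerance_deg=5.0):
    """From a sequence of head-yaw angles (degrees, sampled every dt
    seconds), estimate fixation error, fixation latency, and a crude
    trajectory-complexity measure (number of direction reversals)."""
    error = abs(yaw_samples[-1] - target_deg)  # fixation error at trial end

    # Latency: first time the head comes within tolerance of the target.
    latency = None
    for i, yaw in enumerate(yaw_samples):
        if abs(yaw - target_deg) <= tolerance_deg:
            latency = i * dt
            break

    # Complexity: count sign changes in the angular velocity.
    reversals = 0
    for a, b, c in zip(yaw_samples, yaw_samples[1:], yaw_samples[2:]):
        if (b - a) * (c - b) < 0:
            reversals += 1

    # Misorientation: the initial turn goes away from the target side.
    misoriented = (yaw_samples[1] - yaw_samples[0]) * target_deg < 0
    return error, latency, reversals, misoriented

# A listener who first turns the wrong way, then corrects toward +60 deg.
trace = [0, -5, -10, -5, 10, 30, 50, 58, 60]
err, lat, rev, miso = fixation_metrics(trace, 60, dt=0.1)
```

A trace like this would register one direction reversal and an initial misorientation, the pattern the study reports for large off-axis angles in directional mode.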

    The Impact of Anthropogenic Noise on Fish Behavior, Communication, and Development

    Noise pollution is pervasive in nearly all aquatic and terrestrial ecosystems and was labeled a pollutant of global concern by the World Health Organization in 2011. In the past few decades, underwater ambient noise levels have risen almost 30 dB SPL re: 1 µPa in the frequency range in which most fish produce and detect acoustic stimuli, owing to increases in shipping, oil exploration, and pile driving. Changes to the natural soundscape can impact almost all aspects of an animal's life. My dissertation research takes an integrative, whole-animal approach to examining how increased background noise impacts fish behavior, physiology, development, and communication. First, I found that social interactions occurring in noisy conditions were less effective. Males spent more time distracted or stressed during territorial fights, resulting in a longer time to fight resolution. Males also changed when and how they courted gravid females. Female hearing capabilities were significantly reduced following noise exposure. Changes to male signal production, female detection capabilities, and possibly the signal itself all interfere with effective social communication. Cumulatively, this resulted in a lower incidence of spawning during noise. Noise exposure also hindered mouthbrooding and maternal care behaviors. Females exposed to noise during brooding were more likely to cannibalize or prematurely release under-developed juveniles. Juveniles that were exposed to noise during development had lower growth rates, higher mortality, and altered social and startle behaviors. Finally, I found that fish possess all components of the proposed inner ear CRF-signaling system and that its expression is mediated by sex, reproductive state, and noise exposure. Because noise-induced changes in expression are dependent on physiological state, it is possible that noise-induced threshold shifts could also be modulated by reproductive condition.
Overall, these results provide one of the most comprehensive whole-animal pictures of how increased background noise impacts fish. By examining subtle, sub-lethal changes to behavior, physiology, and communication, we can better inform conservation efforts before human-influenced noise levels reach potentially lethal levels.
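The roughly 30 dB rise in ambient level quoted above is a logarithmic measure, and it reads very differently as a linear ratio. As a small sketch using standard decibel arithmetic (not taken from the dissertation):

```python
def db_to_pressure_ratio(db):
    """A level change of `db` decibels corresponds to this linear
    ratio of RMS sound pressures (20*log10 convention)."""
    return 10 ** (db / 20)

def db_to_intensity_ratio(db):
    """The same change expressed as a ratio of acoustic intensities
    (10*log10 convention)."""
    return 10 ** (db / 10)

# A 30 dB rise is about a 31.6x increase in sound pressure
# and a 1000x increase in acoustic intensity.
pressure_ratio = db_to_pressure_ratio(30)
intensity_ratio = db_to_intensity_ratio(30)
```

Note also that underwater levels are referenced to 1 µPa rather than the 20 µPa used in air, so dB values across the two media are not directly comparable.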

    Aerospace Medicine and Biology: A continuing bibliography with indexes, supplement 165, March 1977

    This bibliography lists 198 reports, articles, and other documents introduced into the NASA scientific and technical information system in February 1977.

    Olfaction, navigation, and the origin of isocortex


    Predicting perceptual transparency of head-worn devices

    Acoustically transparent head-worn devices are a key component of auditory augmented reality systems, in which both real and virtual sound sources are presented to a listener simultaneously. Head-worn devices can exhibit high transparency simply through their physical design but in practice will always obstruct the sound field to some extent. In this study, a method for predicting the perceptual transparency of head-worn devices is presented using numerical analysis of device measurements, testing both coloration and localization in the horizontal and median planes. First, listening experiments are conducted to assess perceived coloration and localization impairments. Second, head-related transfer functions of a dummy head wearing the head-worn devices are measured, and auditory models are used to numerically quantify the introduced perceptual effects. The results show that the tested auditory models are capable of predicting perceptual transparency and are therefore robust in applications for which they were not initially designed.
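One simple numerical proxy for the coloration component described above is the log-magnitude difference between the open-ear transfer function and the same measurement with the device in place. The sketch below is an illustrative assumption, not the auditory models the study actually used:

```python
import math

def coloration_proxy(open_ear_mag, device_mag):
    """RMS log-magnitude difference (dB) between an open-ear HRTF and
    the same measurement with the head-worn device in place, evaluated
    over matching frequency bins. Larger values suggest stronger
    audible coloration (lower perceptual transparency)."""
    diffs = [20 * math.log10(d / o) for o, d in zip(open_ear_mag, device_mag)]
    return math.sqrt(sum(x * x for x in diffs) / len(diffs))

# Toy magnitude responses at a few frequency bins (linear gain):
open_ear = [1.0, 1.0, 1.0, 1.0]
device_on = [1.0, 0.9, 0.8, 1.1]  # mild spectral shading from the device
score = coloration_proxy(open_ear, device_on)
```

A perfectly transparent device would score 0 dB under this measure; the study's point is that perceptually validated auditory models can refine such raw spectral differences into predictions of what listeners actually hear.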

    Electrophysiologic assessment of (central) auditory processing disorder in children with non-syndromic cleft lip and/or palate

    Session 5aPP - Psychological and Physiological Acoustics: Auditory Function, Mechanisms, and Models (Poster Session). Cleft of the lip and/or palate is a common congenital craniofacial malformation worldwide, particularly non-syndromic cleft lip and/or palate (NSCL/P). Though middle ear deficits in this population have been universally noted in numerous studies, other auditory problems, including inner ear deficits or cortical dysfunction, are rarely reported. A higher prevalence of educational problems has been noted in children with NSCL/P compared to craniofacially normal children. These high-level cognitive difficulties cannot be entirely attributed to peripheral hearing loss. Recently it has been suggested that children with NSCL/P may be more prone to abnormalities in the auditory cortex. The aim of the present study was to investigate whether school-age children with NSCL/P have a higher prevalence of indications of (central) auditory processing disorder [(C)APD] compared to normal age-matched controls when assessed using auditory event-related potential (ERP) techniques. School children (6 to 15 years) with NSCL/P and normal controls matched for age and gender were recruited. Auditory ERP recordings included the auditory brainstem response and late event-related potentials, including the P1-N1-P2 complex and P300 waveforms. Initial findings from the present study are presented, and their implications for further research in this area and for clinical intervention are outlined. © 2012 Acoustical Society of America

    3D-Sonification for Obstacle Avoidance in Brownout Conditions

    Helicopter brownout is a phenomenon that occurs when making landing approaches in dusty environments, whereby sand or dust particles become swept up in the rotor outwash. Brownout is characterized by partial or total obscuration of the terrain, which degrades the visual cues necessary for hovering and safe landing. Furthermore, the motion of the dust cloud produced during brownout can lead to the pilot experiencing motion cue anomalies such as vection illusions. In this context, the stability and guidance control functions can be intermittently or continuously degraded, potentially leading to undetected surface hazards and obstacles as well as unnoticed drift. Safe and controlled landing in brownout can be achieved using an integrated presentation of LADAR and RADAR imagery and aircraft state symbology. However, though detected by the LADAR and displayed on the sensor image, small obstacles can be difficult to discern from the background, so changes in obstacle elevation may go unnoticed. Moreover, pilot workload associated with tracking the displayed symbology is often so high that the pilot cannot give sufficient attention to the LADAR/RADAR image. This paper documents a simulation evaluating the use of 3D auditory cueing for obstacle avoidance in brownout as a replacement for, or complement to, LADAR/RADAR imagery.
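As an illustration of the kind of spatial audio cue such a system could generate, an obstacle bearing can be mapped to a stereo pan. This is a hypothetical sketch using a constant-power pan law, not the paper's actual 3D sonification scheme (which would more plausibly use HRTF-based rendering):

```python
import math

def bearing_to_stereo_gains(bearing_deg):
    """Map an obstacle bearing (-90 = hard left, 0 = dead ahead,
    +90 = hard right) to (left, right) channel gains using a
    constant-power pan law, so perceived loudness stays roughly
    constant as the cue moves across the stereo field."""
    theta = (bearing_deg + 90) / 180 * (math.pi / 2)  # 0..pi/2
    return math.cos(theta), math.sin(theta)

left, right = bearing_to_stereo_gains(0)  # dead ahead: equal gains
# Constant power: left**2 + right**2 == 1 at every bearing.
```

Even this crude mapping conveys the core idea: the obstacle's direction is carried by the audio channel balance, freeing visual attention for the sensor imagery.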

    A comparison of hearing and auditory functioning between dogs and humans

    Given the range of tasks that require dogs and humans to work effectively together, it is important for us to appreciate the similarities and differences in hearing ability across the two species, as well as the limits of our knowledge of this comparative information. Humans often assume that dogs' hearing abilities are similar to their own and try to communicate with them verbally as they do with other humans. In the first part of this review, we compare the auditory system of the two species in relation to their ability to function generally as a sound amplification and detection system before considering the specific capacities of the system in the second part. We then examine the factors that disturb hearing function before reviewing a range of potentially problematic behavioral responses that are closely associated with the functioning of the auditory system. Finally, we consider important aspects of comparative auditory perception and related cognitive processes. A major observation of this review is how little research has been done in investigating the auditory capabilities of the dog. There may be significant mismatches between what we expect dogs (and perhaps specific types of dog, given historic functional breed selection) can hear versus what they can actually hear. This has significant implications for what should be considered if we wish to select specific dogs for work associated with particular hearing abilities and to protect and maintain their hearing throughout life. Only with a more complete understanding of dogs' hearing ability compared with our own can we more fully appreciate perceptual and associated cognitive differences between the species, alongside behavioral differences that might occur when we are exposed to a given soundscape.