Birdsong fails to support object categorization in human infants.
Recent evidence reveals a precocious link between language and cognition in human infants: listening to their native language supports infants' core cognitive processes, including object categorization, in a way that other acoustic signals (e.g., time-reversed speech; sine-wave tone sequences) do not. Moreover, language is not the only signal that confers this cognitive advantage: listening to vocalizations of non-human primates also supports object categorization in 3- and 4-month-olds. Here, we move beyond primate vocalizations to clarify the breadth of acoustic signals that promote infant cognition. We ask whether listening to birdsong, another naturally produced animal vocalization, also supports object categorization in 3- and 4-month-old infants. We report that listening to zebra finch song failed to confer a cognitive advantage. This outcome brings us closer to identifying a boundary condition on the range of non-linguistic acoustic signals that initially support infant cognition.
Face-related ERPs are modulated by point of gaze
This study examined the influence of gaze fixation on face-sensitive ERPs. A fixation crosshair presented prior to face onset directed visual attention to the upper, central, or lower face region while ERPs were recorded. This manipulation modulated a face-sensitive component (N170) but not an early sensory component (P1): fixations on the upper and lower face regions elicited enhanced N170 amplitudes and longer N170 latencies. These results extend extant hemodynamic research by demonstrating effects at early, basic stages of face processing. The findings distinguish attention to facial features in the context of a whole face from attention to isolated features, and they inform electrophysiological studies of face processing in clinical populations.