    A Bird’s Eye View of Human Language Evolution

    Comparative studies of linguistic faculties in animals pose an evolutionary paradox: language involves certain perceptual and motor abilities, but it is not clear that these serve as more than an input–output channel for the externalization of language proper. Strikingly, the capability for auditory–vocal learning is not shared with our closest relatives, the apes, but is present in such remotely related groups as songbirds and marine mammals. There is increasing evidence for behavioral, neural, and genetic similarities between speech acquisition and birdsong learning. At the same time, researchers have applied formal linguistic analysis to the vocalizations of both primates and songbirds. What have all these studies taught us about the evolution of language? Is the comparative study of an apparently species-specific trait like language feasible? We argue that comparative analysis remains an important method for the evolutionary reconstruction and causal analysis of the mechanisms underlying language. On the one hand, common descent has been important in the evolution of the brain, such that avian and mammalian brains may be largely homologous, particularly in the case of brain regions involved in auditory perception, vocalization, and auditory memory. On the other hand, there has been convergent evolution of the capacity for auditory–vocal learning, and possibly for structuring of external vocalizations, such that apes lack the abilities that are shared between songbirds and humans. However, significant limitations to this comparative analysis remain. While all birdsong may be classified in terms of a particularly simple kind of concatenation system, the regular languages, there is no compelling evidence to date that birdsong matches the characteristic syntactic complexity of human language, arising from the composition of smaller forms like words and phrases into larger ones.
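The claim that birdsong falls within the regular languages can be made concrete with a small sketch: a regular language is exactly a set of strings recognizable by a finite-state acceptor. The syllable inventory, states, and transitions below are purely illustrative assumptions, not drawn from any real species; the point is that such a machine captures repetition and fixed ordering, but not the unbounded hierarchical nesting attributed to human syntax.

```python
# A finite-state acceptor over a hypothetical song "syllable" alphabet.
# DELTA maps (current state, syllable) -> next state; sequences that
# drive the machine into the "accept" state belong to the (regular)
# song language. All names here are illustrative assumptions.

DELTA = {
    ("start", "i"): "intro",       # song opens with an intro note
    ("intro", "i"): "intro",       # intro notes may repeat
    ("intro", "a"): "motif",       # a-b motif begins
    ("motif", "b"): "motif_end",
    ("motif_end", "a"): "motif",   # the a-b motif may loop
    ("motif_end", "e"): "accept",  # terminal note ends the song
}

def accepts(syllables):
    """Return True if the syllable sequence is in the song language."""
    state = "start"
    for s in syllables:
        state = DELTA.get((state, s))
        if state is None:          # no transition: sequence rejected
            return False
    return state == "accept"

print(accepts(list("iiabe")))    # True: intro-intro-motif-end
print(accepts(list("iababe")))   # True: the motif loops once
print(accepts(list("abe")))      # False: missing intro note
```

Because the machine has a fixed, finite set of states, it cannot count or match nested dependencies of unbounded depth, which is precisely the limitation the abstract contrasts with human phrase structure.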

    Testing the Template Hypothesis of Vocal Learning in Songbirds.

    The auditory forebrain regions NCM and CMM of songbirds are associated with perception and complex auditory processing. Expression of the immediate-early gene ZENK varies in response to different sounds, and two hypotheses have been proposed to explain this. First, ZENK may reflect access to a stored representation of song memories. Second, ZENK may reflect attention. I tested these hypotheses by measuring ZENK in response to tutored heterospecific or isolate songs compared to non-tutored wild-type song. Young zebra finch females were exposed to different tutoring conditions and later to different playbacks, and ZENK expression in CMM and NCM was measured. ZENK responses varied across playback stimuli in some brain regions, but did not interact with tutoring conditions. These results do not support the hypothesis that ZENK activation reflects auditory memories.

    How Could Language Have Evolved?

    The evolution of the faculty of language largely remains an enigma. In this essay, we ask why. Language's evolutionary analysis is complicated because it has no equivalent in any nonhuman species. There is also no consensus regarding the essential nature of the language “phenotype.” According to the “Strong Minimalist Thesis,” the key distinguishing feature of language (and what evolutionary theory must explain) is hierarchical syntactic structure. The faculty of language is likely to have emerged quite recently in evolutionary terms, some 70,000–100,000 years ago, and does not seem to have undergone modification since then, though individual languages do of course change over time, operating within this basic framework. The recent emergence of language and its stability are both consistent with the Strong Minimalist Thesis, which has at its core a single repeatable operation that takes exactly two syntactic elements a and b and assembles them to form the set {a, b}.
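The single operation at the core of the Strong Minimalist Thesis (often called Merge) can be sketched in a few lines. This is a toy illustration only: the function name and the use of `frozenset` to model an unordered set are our assumptions, not part of the thesis itself. What it shows is that hierarchy falls out of re-applying one two-argument operation to its own outputs.

```python
def merge(a, b):
    """Take exactly two syntactic objects and form the set {a, b}.

    frozenset models the unordered, duplicate-free set of the thesis
    and is hashable, so the result can itself be merged again.
    """
    return frozenset({a, b})

# Hierarchical structure arises purely from re-application:
#   merge("the", "apple")  -> {the, apple}
#   merge("ate", ...)      -> {ate, {the, apple}}
noun_phrase = merge("the", "apple")
verb_phrase = merge("ate", noun_phrase)
clause = merge(merge("the", "dog"), verb_phrase)

# The output is unordered: merge(a, b) == merge(b, a),
# and the nested set structure encodes the hierarchy.
```

Note that because the result is a set, linear word order is not represented at all, which is consistent with the view that ordering belongs to externalization rather than to syntax proper.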

    RoboFinch: A versatile audio-visual synchronised robotic bird model for laboratory and field research on songbirds

    1. Singing in birds is accompanied by beak, head and throat movements. These visual cues have long been hypothesised to be an important facilitator in vocal communication, including social interactions and song acquisition, but they have seen little experimental study.

    2. To address whether audio-visual cues are relevant for birdsong, we used high-speed video recording, 3D scanning, 3D printing technology and colour-realistic painting to create RoboFinch, an open-source adult-mimicking robot that matches the temporal and chromatic properties of songbird vision. We exposed several groups of juvenile zebra finches during their song developmental phase to one of six singing robots that moved their beaks in synchrony with their song, and compared them with birds in a non-synchronised treatment and two control treatments.

    3. Juveniles in the synchronised treatment approached the robot setup from the start of the experiment and progressively increased the time they spent singing, in contrast to the other treatment groups. Interestingly, birds in the synchronised group seemed to actively listen during tutor song playback, as they sang less during the actual song playback than birds in the asynchronous and audio-only control treatments.

    4. Our open-source RoboFinch setup thus provides an unprecedented tool for the systematic study of the functionality and integration of audio-visual cues associated with song behaviour. Realistic head and beak movements aligned to specific song elements may allow future studies to assess the importance of multisensory cues during song development, sexual signalling and social behaviour. All software and assembly instructions are open source, and the robot can easily be adapted to other species. Experimental manipulations of stimulus combinations and synchronisation can further elucidate how audio-visual cues are integrated by receivers and how they may enhance signal detection, recognition, learning and memory.

    Social Cognition and the Evolution of Language: Constructing Cognitive Phylogenies

    Human language and social cognition are closely linked: advanced social cognition is necessary for children to acquire language, and language allows forms of social understanding (and, more broadly, culture) that would otherwise be impossible. Both “language” and “social cognition” are complex constructs, involving many independent cognitive mechanisms, and the comparative approach provides a powerful route to understanding the evolution of such mechanisms. We provide a broad comparative review of mechanisms underlying social intelligence in vertebrates, with the goal of determining which human mechanisms are broadly shared, which have evolved in parallel in other clades, and which, potentially, are uniquely developed in our species. We emphasize the importance of convergent evolution for testing hypotheses about neural mechanisms and their evolution.

    No Need to Talk, I Know You: Familiarity Influences Early Multisensory Integration in a Songbird's Brain

    It is well known that visual information can affect auditory perception, as in the famous “McGurk effect,” but little is known concerning the processes involved. To address this issue, we used the best-developed animal model for studying language-related processes in the brain: songbirds. European starlings were exposed to audiovisual versus auditory-only playback of conspecific songs while electrophysiological recordings were made in their primary auditory area (Field L). The results show that the audiovisual condition modulated the auditory responses. Both enhancement and suppression were observed, depending on stimulus familiarity: seeing a familiar bird suppressed auditory responses, while seeing an unfamiliar bird enhanced them, suggesting that unisensory perception may suffice for familiar stimuli while redundancy may be required for unfamiliar ones. This is, to our knowledge, the first evidence that multisensory integration may occur in a low-level, putatively unisensory area of a non-mammalian vertebrate brain, and that stimulus familiarity may influence the modulation of auditory responses by vision.