291 research outputs found

    Genetic influences in different aspects of language development: The etiology of language skills in 4.5 year-old twins

    The genetic and environmental etiologies of diverse aspects of language ability and disability, including articulation, phonology, grammar, vocabulary, and verbal memory, were investigated in a U.K. sample of 787 pairs of 4.5-year-old same-sex and opposite-sex twins. Moderate genetic influence was found for all aspects of language in the normal range. A similar pattern was found at the low end of the distribution with the exception of two receptive measures. Environmental influence was mainly due to nonshared factors, unique to the individual, with little influence from shared environment for most measures. Genetic and environmental influences on language ability and disability are quantitatively and qualitatively similar for males and females.
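    The variance decomposition behind findings like these can be illustrated with the classical Falconer shorthand, which splits trait variance into additive genetic (A), shared environmental (C), and nonshared environmental (E) components from monozygotic (MZ) and dizygotic (DZ) twin correlations. The sketch below is a minimal illustration of that logic; the correlation values are hypothetical placeholders, not figures from the study, which fits full structural equation models rather than these point estimates.

```python
# Classical Falconer estimates of ACE variance components from twin correlations.
# The correlation values below are hypothetical illustrations, not study results.

def ace_estimates(r_mz: float, r_dz: float) -> dict:
    """Estimate additive genetic (A), shared (C), and nonshared (E) environment
    proportions of variance from MZ and DZ twin correlations."""
    a2 = 2 * (r_mz - r_dz)   # heritability: twice the MZ-DZ correlation gap
    c2 = 2 * r_dz - r_mz     # shared environment: similarity not due to genes
    e2 = 1 - r_mz            # nonshared environment (plus measurement error)
    return {"A": a2, "C": c2, "E": e2}

# Hypothetical correlations for a single language measure in 4.5-year-old twins.
print(ace_estimates(r_mz=0.60, r_dz=0.40))  # roughly {'A': 0.4, 'C': 0.2, 'E': 0.4}
```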

    Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
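    The time-window analysis described above can be sketched as a simple binning of fixations into the three windows and a comparison of looking time to prosody-congruent versus incongruent faces. The data structure and values below are hypothetical and stand in only for the general logic, not the study's actual analysis pipeline.

```python
# Sketch of a time-window gaze analysis: bin fixations into the three windows
# used in the study and total looking time to prosody-congruent vs. incongruent
# faces. The fixation records below are hypothetical examples.

WINDOWS = [(0, 1250), (1250, 2500), (2500, 5000)]   # ms, as in the study design

# Each fixation: (onset in ms, duration in ms, landed on prosody-congruent face?)
fixations = [
    (100, 300, True), (450, 200, False), (1400, 500, True),
    (2000, 250, False), (2700, 800, True), (3600, 400, False),
]

def looks_per_window(fixations):
    """Sum fixation durations on congruent and incongruent faces per time window,
    assigning each fixation to the window containing its onset."""
    totals = {w: {"congruent": 0, "incongruent": 0} for w in WINDOWS}
    for onset, duration, congruent in fixations:
        for lo, hi in WINDOWS:
            if lo <= onset < hi:
                key = "congruent" if congruent else "incongruent"
                totals[(lo, hi)][key] += duration
                break
    return totals

for window, ms in looks_per_window(fixations).items():
    print(window, ms)
```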

    Linguistic theory, linguistic diversity and Whorfian economics

    Languages vary greatly in their words, sounds and sentence structures. Linguistic theory has shown that many aspects of variation are superficial and may not reflect underlying formal similarities between languages, which are relevant to how humans learn and process language. In this chapter, I show both how languages can vary and how the surface variations can be manifestations of underlying similarities. Economists have sometimes adopted a ‘Whorfian’ view that differences in languages can cause differences in how their speakers think and behave. Psychological experiments have shown both support for this hypothesis and evidence against it. Specific arguments that language causes thought, which have been made in recent economics papers, are examined in the light of what linguistics tells us about superficial and underlying variation.

    Integrating Mechanisms of Visual Guidance in Naturalistic Language Production

    Situated language production requires the integration of visual attention and linguistic processing. Previous work has not conclusively disentangled the role of perceptual scene information and structural sentence information in guiding visual attention. In this paper, we present an eye-tracking study that demonstrates that three types of guidance, perceptual, conceptual, and structural, interact to control visual attention. In a cued language production experiment, we manipulate perceptual (scene clutter) and conceptual guidance (cue animacy), and measure structural guidance (syntactic complexity of the utterance). Analysis of the time course of language production, before and during speech, reveals that all three forms of guidance affect the complexity of visual responses, quantified in terms of the entropy of attentional landscapes and the turbulence of scan patterns, especially during speech. We find that perceptual and conceptual guidance mediate the distribution of attention in the scene, whereas structural guidance closely relates to scan-pattern complexity. Furthermore, the eye-voice spans of the cued object and its perceptual competitor are similar, with latency mediated by both perceptual and structural guidance. These results rule out a strict interpretation of structural guidance as the single dominant form of visual guidance in situated language production. Rather, the phase of the task and the associated demands of cross-modal cognitive processing determine the mechanisms that guide attention.
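    One of the complexity measures mentioned, the entropy of an attentional landscape, can be understood as a Shannon entropy over a fixation-probability map covering the scene. The sketch below is a minimal illustration under assumed details: the grid resolution, scene dimensions, and fixation data are hypothetical, and the Gaussian smoothing typically applied to such maps is omitted.

```python
import math

# Shannon entropy (in bits) of an attentional landscape: fixation durations
# binned on a coarse grid over the scene and normalized to a probability map.
# Grid size, scene dimensions, and fixations below are hypothetical.

GRID_W, GRID_H = 8, 6   # coarse grid over the scene

def attention_entropy(fixations, width=800, height=600):
    """Bin (x, y, duration) fixations into grid cells and return the Shannon
    entropy of the resulting dwell-time distribution."""
    counts = [[0.0] * GRID_W for _ in range(GRID_H)]
    for x, y, duration in fixations:
        col = min(int(x / width * GRID_W), GRID_W - 1)
        row = min(int(y / height * GRID_H), GRID_H - 1)
        counts[row][col] += duration          # weight cells by dwell time
    total = sum(sum(row) for row in counts)
    entropy = 0.0
    for row in counts:
        for cell in row:
            if cell > 0:
                p = cell / total
                entropy -= p * math.log2(p)
    return entropy

# Hypothetical scan pattern: fixations clustered on two regions of the scene.
print(attention_entropy([(120, 80, 300), (130, 95, 250), (600, 400, 500)]))
```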

    The interaction of visual and linguistic saliency during syntactic ambiguity resolution

    Psycholinguistic research using the visual world paradigm has shown that the processing of sentences is constrained by the visual context in which they occur. Recently, there has been growing interest in the interactions observed when both language and vision provide relevant information during sentence processing. In three visual world experiments on syntactic ambiguity resolution, we investigate how visual and linguistic information influence the interpretation of ambiguous sentences. We hypothesize that (1) visual and linguistic information both constrain which interpretation is pursued by the sentence processor, and (2) the two types of information act upon the interpretation of the sentence at different points during processing. In Experiment 1, we show that visual saliency is utilized to anticipate the upcoming arguments of a verb. In Experiment 2, we operationalize linguistic saliency using intonational breaks and demonstrate that these give prominence to linguistic referents. These results confirm prediction (1). In Experiment 3, we manipulate visual and linguistic saliency together and find that both types of information are used, but at different points in the sentence, to incrementally update its current interpretation. This finding is consistent with prediction (2). Overall, our results suggest an adaptive processing architecture in which different types of information are used when they become available, optimizing different aspects of situated language processing.

    Grammatical gender and linguistic relativity: A systematic review

    Many languages assign nouns to a grammatical gender class, such that ‘bed’ might be assigned masculine gender in one language (e.g. Italian) but feminine gender in another (e.g. Spanish). In the context of research assessing the potential for language to influence thought (the linguistic relativity hypothesis), a number of scholars have investigated whether grammatical gender assignment ‘rubs off’ on concepts themselves, such that Italian speakers might conceptualise beds as more masculine than Spanish speakers do. We systematically reviewed 43 pieces of empirical research examining grammatical gender and thought, which together tested 5,895 participants. We classified the findings in terms of their support for this hypothesis, and assessed the results against parameters previously identified as potentially influencing outcomes. Overall, we found that support was strongly task- and context-dependent, and rested heavily on outcomes that have clear and equally viable alternative explanations. We also argue that it remains unclear whether grammatical gender is in fact a useful tool for investigating relativity.

    Basketball Game as Psychology Experiment
