Learning to use gestures in narratives: developmental trends in formal and semantic gesture competence
This study analyses how children develop competence in the formal and semantic aspects of gesture, focusing on the use of representational gestures in a narrative context. A group of 30 Italian children aged 4 to 10 years was videotaped while retelling a video cartoon to an adult. Gestures were coded according to the parameters used in sign language analysis and analysed in terms of the acquisition of their properties, the accuracy of their execution and the correctness of content representation. The development of symbolic competence was also investigated, in relation both to the use of some of these parameters and to the representational strategies adopted. Results indicate a developmental trend in all the phenomena investigated and point out some formal similarities between gesture and sign languages
Gesture production and comprehension in children with specific language impairment
Children with specific language impairment (SLI) have difficulties with spoken language. However, some recent research suggests that these impairments reflect underlying cognitive limitations. Studying gesture may inform us clinically and theoretically about the nature of the association between language and cognition. A total of 20 children with SLI and 19 typically developing (TD) peers were assessed on a novel measure of gesture production. Children were also assessed for sentence comprehension errors in a speech-gesture integration task. Children with SLI performed as well as their peers on gesture production but performed less well when comprehending integrated speech and gesture. Error patterns revealed a significant group interaction: children with SLI made more gesture-based errors, whilst TD children made semantically based ones. Children with SLI accessed and produced lexically encoded gestures despite having impaired spoken vocabulary, and this group also showed stronger associations between gesture and language than TD children. When comprehension breaks down in children with SLI, gesture may be relied on over speech, whilst TD children show a preference for spoken cues. The findings suggest that for children with SLI, gesture still scaffolds language development more than it does for TD peers, who have outgrown their earlier reliance on gesture. Future clinical implications may include standardized assessment of symbolic gesture and classroom-based gesture support for clinical groups
Gestural abilities of children with specific language impairment
Background: Specific language impairment (SLI) is diagnosed when language is significantly below chronological age expectations in the absence of other developmental disorders, sensory impairments or global developmental delays. It has been suggested that gesture may enhance communication in children with SLI by providing an alternative means to convey words or extend utterances. However, gesture is a complex task that requires the integration of social, cognitive and motor skills, skills that some children with SLI may find challenging. In addition, there is reason to believe that language and gesture form an integrated system, leading to the prediction that children with SLI may also have difficulties with gestural communication. Aims: To explore the link between language and gesture in children with poor language skills. Methods & Procedures: Fifteen children with SLI and 14 age-matched typically developing (TD) children participated in this study. The children completed measures of expressive and receptive vocabulary, non-verbal cognition, motor control, gesture comprehension and gesture production. Outcomes & Results: TD children achieved significantly higher scores on measures of gesture production and gesture comprehension relative to children with SLI. Significant correlations between both measures of vocabulary and both measures of gesture suggest a tight link between language and gesture. Conclusions & Implications: The findings support the idea that gesture and language form one integrated communication system, rather than two separate communication modalities. This implies that children with SLI may have underlying deficits that impact not only on language but also on gesture production and comprehension
Simultaneity as an emergent property of efficient communication in language: A comparison of silent gesture and sign language
Sign languages use multiple articulators and iconicity in the visual modality, which allow linguistic units to be organized not only linearly but also simultaneously. Recent research has shown that users of an established sign language such as LIS (Italian Sign Language) use simultaneous and iconic constructions as a modality-specific resource to achieve communicative efficiency when they are required to encode informationally rich events. However, it remains to be explored whether such simultaneous and iconic constructions can be recruited for communicative efficiency even without a linguistic system (i.e., in silent gesture) or whether they are specific to linguistic patterning (i.e., in LIS). In the present study, we conducted the same experiment as in Slonimska et al. with 23 Italian speakers using silent gesture and compared the results of the two studies. The findings showed that while simultaneity was afforded by the visual modality to some extent, its use in silent gesture was nevertheless less frequent and qualitatively different from its use within a linguistic system. Thus, the use of simultaneous and iconic constructions for communicative efficiency constitutes an emergent property of sign languages. The present study highlights the importance of studying modality-specific resources and their use for linguistic expression in order to promote a more thorough understanding of the language faculty and its modality-specific adaptive capabilities
Deaf children's non-verbal working memory is impacted by their language experience
Several recent studies have suggested that deaf children perform more poorly on working memory tasks compared to hearing children, but these studies have not been able to determine whether this poorer performance arises directly from deafness itself or from deaf children's reduced language exposure. The issue remains unresolved because findings come mostly from (1) tasks that are verbal as opposed to non-verbal, and (2) involve deaf children who use spoken communication and therefore may have experienced impoverished input and delayed language acquisition. This is in contrast to deaf children who have been exposed to a sign language since birth from Deaf parents (and who therefore have native language-learning opportunities within a normal developmental timeframe for language acquisition). A more direct, and therefore stronger, test of the hypothesis that the type and quality of language exposure impact working memory is to use measures of non-verbal working memory (NVWM) and to compare hearing children with two groups of deaf signing children: those who have had native exposure to a sign language, and those who have experienced delayed acquisition and reduced quality of language input compared to their native-signing peers. In this study we investigated the relationship between NVWM and language in three groups aged 6–11 years: hearing children (n = 28), deaf children who were native users of British Sign Language (BSL; n = 8), and deaf children who used BSL but who were not native signers (n = 19). We administered a battery of non-verbal reasoning, NVWM, and language tasks. We examined whether the groups differed on NVWM scores, and whether scores on language tasks predicted scores on NVWM tasks. For the two executive-loaded NVWM tasks included in our battery, the non-native signers performed less accurately than the native signer and hearing groups (who did not differ from one another). Multiple regression analysis revealed that scores on the vocabulary measure predicted scores on those two executive-loaded NVWM tasks (with age and non-verbal reasoning partialled out). Our results suggest that whatever the language modality—spoken or signed—rich language experience from birth, and the good language skills that result from this early age of acquisition, play a critical role in the development of NVWM and in performance on NVWM tasks
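As a rough illustration only (not the authors' analysis code), the sketch below shows how a multiple regression of the kind described above can be set up: an NVWM score is predicted from vocabulary, with age and non-verbal reasoning entered as covariates so that the vocabulary coefficient reflects its unique contribution once those variables are partialled out. All variable names and data here are hypothetical.

```python
# Hypothetical sketch of the regression described above: predicting
# executive-loaded NVWM scores from vocabulary, with age and non-verbal
# reasoning entered as covariates. Column names and data are invented
# for illustration; they are not the study's actual variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 55  # roughly the combined sample size reported above

# Simulated scores standing in for the real measures
age = rng.uniform(6, 11, n)
nonverbal_iq = rng.normal(100, 15, n)
vocab = 0.5 * age + rng.normal(0, 1, n)
nvwm = 0.3 * vocab + 0.2 * age + 0.01 * nonverbal_iq + rng.normal(0, 1, n)

df = pd.DataFrame({"nvwm": nvwm, "vocab": vocab,
                   "age": age, "nonverbal_iq": nonverbal_iq})

# With age and non-verbal reasoning in the model, the vocab coefficient
# estimates vocabulary's unique association with NVWM performance.
model = smf.ols("nvwm ~ vocab + age + nonverbal_iq", data=df).fit()
print(model.summary())
```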
Cross-linguistic views of gesture usage
People have stereotypes about gesture usage. For instance, speakers in East Asia are assumed to gesticulate little, and Italians are believed to gesticulate more than the British. Despite the prevalence of such views, studies that investigate these stereotypes are scarce. The present study examined people's views on spontaneous gestures by collecting data from five different countries. A total of 363 undergraduate students from five countries (France, Italy, Japan, the Netherlands and the USA) participated in this study. Data were collected through a two-part questionnaire. Part 1 asked participants to rate two characteristics of gesture, frequency and size, for 13 different languages. Part 2 asked them about their views on factors that might affect the production of gestures. The results showed that most participants in this study believe that Italian, Spanish and American English speakers produce larger gestures more frequently than speakers of other languages. They also showed that each cultural group, even within Europe, puts weight on a slightly different aspect of gesture
RNA editing
RNA editing describes molecular processes that post-transcriptionally alter the nucleotide sequence of an RNA molecule relative to its coding DNA, a change that is often necessary for establishing the RNA's biological function. It occurs throughout eukaryotes and in some RNA viruses, but not in prokaryotes. Nuclear, mitochondrial and plastid transcripts are all subject to editing, and the edited RNAs are not necessarily protein-coding. RNA editing falls into two main groups: substitution editing, in which nucleotides are exchanged, and insertion/deletion editing, which changes the number of nucleotides in the edited transcript. In mammals, the main substitution-editing systems are the APOBEC and ADAR protein families: APOBEC is a cytidine deaminase that carries out C→U deamination, whereas ADARs are adenosine deaminases that convert A→I. In protozoa, mitochondrial RNA is edited: trypanosomatids characteristically insert and delete uridines in protein-coding transcripts, while in the amoeba Acanthamoeba castellanii tRNAs are edited at the 5'-end of the acceptor stem. In plants, the most common modification is C→U editing of mitochondrial and plastid transcripts. RNA viruses such as Ebolavirus and hepatitis delta virus also undergo RNA editing, carried out by their own or by host proteins, with substantial effects on their virulence. Because RNA editing is a complex and energetically expensive process, it is an open question how it became so widespread. The constructive neutral evolution model offers an answer by viewing RNA editing as a system that takes over part of the cell's error-correction burden: deleterious mutations are corrected not at the DNA level but at the RNA level, which in turn permits DNA-encoded information to degenerate progressively
Neurophysiological evidence for rapid processing of verbal and gestural information in understanding communicative actions
During everyday social interaction, gestures are a fundamental part of human communication. The communicative pragmatic role of hand gestures and their interaction with spoken language has been documented at the earliest stage of language development, in which two types of indexical gestures are most prominent: the pointing gesture for directing attention to objects and the give-me gesture for making requests. Here we study, in adult human participants, the neurophysiological signatures of gestural-linguistic acts of communicating the pragmatic intentions of naming and requesting by simultaneously presenting written words and gestures. Already at ~150 ms, brain responses diverged between naming and request actions expressed by word-gesture combination, whereas the same gestures presented in isolation elicited their earliest neurophysiological dissociations significantly later (at ~210 ms). There was an early enhancement of request-evoked brain activity as compared with naming, which was due to sources in the frontocentral cortex, consistent with access to action knowledge in request understanding. In addition, an enhanced N400-like response indicated late semantic integration of gesture-language interaction. The present study demonstrates that word-gesture combinations used to express communicative pragmatic intentions speed up the brain correlates of comprehension processes – compared with gesture-only understanding – thereby calling into question current serial linguistic models viewing pragmatic function decoding at the end of a language comprehension cascade. Instead, information about the social-interactive role of communicative acts is processed instantaneously
