The use and function of gestures in word-finding difficulties in aphasia
Background: Gestures are spontaneous hand and arm movements that are part of everyday communication. The roles of gestures in communication are disputed. Most agree that they augment the information conveyed in speech. More contentiously, some argue that they facilitate speech, particularly when word-finding difficulties (WFD) occur. Exploring gestures in aphasia may further illuminate their role.
Aims: This study explored the spontaneous use of gestures in the conversation of participants with aphasia (PWA) and neurologically healthy participants (NHP). It aimed to examine the facilitative role of gesture by determining whether gestures particularly accompanied WFD and whether those difficulties were resolved.
Methods & Procedures: Spontaneous conversation data were collected from 20 PWA and 21 NHP. Video samples were analysed for gesture production, speech production, and WFD. Analysis 1 examined whether the production of semantically rich gestures in these conversations was affected by whether the person had aphasia, and/or whether there were difficulties in the accompanying speech. Analysis 2 identified all WFD in the data and examined whether these were more likely to be resolved if accompanied by a gesture, again for both groups of participants.
Outcomes & Results: Semantically rich gestures were frequently employed by both groups of participants, but with no effect of group. There was an effect of the accompanying speech, with gestures occurring most commonly alongside resolved WFD. An interaction showed that this was particularly the case for PWA. NHP, on the other hand, employed semantically rich gestures most frequently alongside fluent speech. Analysis 2 showed that WFD were common in both groups of participants. Unsurprisingly, these were more likely to be resolved for NHP than PWA. For both groups, resolution was more likely if a WFD was accompanied by a gesture.
Conclusions: These findings shed light on the different functions of gesture within conversation. They highlight the importance of gesture during WFD, both in aphasic and neurologically healthy language, and suggest that gesture may facilitate word retrieval.
What can co-speech gestures in aphasia tell us about the relationship between language and gesture?: A single case study of a participant with Conduction Aphasia
Cross-linguistic evidence suggests that language typology influences how people gesture when using ‘manner-of-motion’ verbs (Kita, 2000; Kita & Özyürek, 2003) and that this is due to ‘online’ lexical and syntactic choices made at the time of speaking (Kita, Özyürek, Allen, Brown, Furman & Ishizuka, 2007). This paper attempts to relate these findings to the co-speech iconic gesture used by an English speaker with conduction aphasia (LT) and five controls describing a Sylvester and Tweety cartoon. LT produced co-speech gestures with distinct patterns, which we relate to different aspects of her language impairment and to the lexical and syntactic choices she made during her narrative.
Event Processing through naming: Investigating event focus in two people with aphasia
Some people with aphasia may have trouble with verbs because of fundamental difficulties in processing situations in a way that maps readily onto language. This paper describes a novel assessment, the Order of Naming Test, which explores the conceptual processing of events through the order in which people name the entities involved. The performance of non-brain-damaged control participants is described. The responses of two people with non-fluent aphasia are then discussed. Both 'Helen' and 'Ron' showed significant difficulty with verbs and sentences. Ron also had trouble on a range of tasks tapping aspects of event processing, despite intact non-verbal cognition. While Helen's performance on the Order of Naming Test was very similar to that of the controls, Ron's differed in a number of respects, suggesting that he was less focused on the main participant entities. However, certain aspects of his responses pointed to covert event processing abilities that might be fruitfully exploited in therapy.
The role of semantically rich gestures in aphasic conversation
Introduction
Gestures play an important role in everyday communication (Kendon, 1997). They provide additional information to conversation partners about the meaning of verbal utterances and help to clarify even abstract concepts. There is evidence that gestures are not simply produced for the benefit of the listener but also support the speaker (Krauss, Chen, & Chawla, 1996; McNeill, Cassell, & McCullough, 1994). The relationship between speech and gesture is of great theoretical interest. Indeed the strong ties between speech and gesture have stimulated discussions about the neurological links between the modalities and the possible gestural origins of language.
Because of the importance of gesture in communication, several studies have investigated the use of gestures in aphasia (see Rose, 2006 for review). It is important to know how people with aphasia (PWA) use gesture as an accompaniment to speech, as a compensatory modality and during word-finding difficulties. Such knowledge can contribute to potential treatment regimes and may point to strategies that can assist everyday communication. Studying gesture use in people with compromised language can also contribute to the theoretical debate about the relationship between the modalities.
Most studies to date have focused on the effects of gesture in structured naming tasks, rather than in more natural conversation.
Methods
Aims
This study examines the natural conversational use of gestures in aphasic speech. This presentation focuses on the following research questions:
(1) To what extent do PWA and neurologically healthy participants (NHP) employ semantically rich gestures (i.e., gestures that convey stand-alone meanings or reflect an aspect of the spoken discourse)? What impact does their semantic competence have on gesture production?
(2) Do semantically rich gestures take different roles during conversation (facilitative, communicative, augmentative, compensatory)?
(3) Do different topics, for example narrative (i.e., telling about a life event) and procedural (i.e., describing a process), elicit different gesture patterns?
Procedures
Language and conversation data were collected from 20 PWA and 21 NHP. PWA also completed extensive background testing, including tests of lexical semantics and non-verbal semantics.
Conversation samples of eight minutes in total were collected. Video samples were transcribed and analysed for gesture production, speech production, and word-finding difficulties. Semantically rich gestures, which reflect concrete or abstract reference in the discourse (iconic, metaphoric, and air writing & number gestures) or convey meaning in their own right (pantomime and emblem gestures), were contrasted with semantically empty gestures, which refer to places or objects (deictic gestures) or mark speech rhythm (beat gestures). The roles of semantically rich gestures were coded to determine whether participants use gesture mainly to supplement speech, to replace speech, or to facilitate lexical retrieval.
The following methods are used in the analysis (a schematic sketch of the decision rules follows the list):
(1) All semantically rich gestures are identified within the conversation.
(2) Semantically rich gestures that occur during a word-finding difficulty (i.e., within three seconds of word-finding behaviour and before the next utterance) are categorised as either facilitative or communicative.
a. If the word-finding difficulty is resolved, the gesture is categorised as facilitative.
b. If the word-finding difficulty is not resolved (by the speaker), the gesture is categorised as communicative.
(3) All other semantically rich gestures are categorised as either augmentative or compensatory.
a. If a gesture occurs alongside speech and supplements it, it is categorised as augmentative.
b. If a gesture is produced to replace speech, it is categorised as compensatory.
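These decision rules amount to a simple classification procedure. The sketch below is illustrative only: the class names, fields, and timestamped records are hypothetical, and the "before the next utterance" constraint is simplified to the three-second window, so it should be read as one possible rendering of the coding scheme rather than the analysis software actually used.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class WFD:
    onset: float    # time (s) at which the word-finding behaviour begins
    resolved: bool  # True if the speaker eventually produces the target word

@dataclass
class Gesture:
    onset: float              # time (s) at which the gesture begins
    semantically_rich: bool   # iconic, metaphoric, pantomime, emblem, air writing & number
    replaces_speech: bool     # produced instead of, rather than alongside, speech

def classify_role(gesture: Gesture, wfds: List[WFD], window: float = 3.0) -> Optional[str]:
    """Assign one of the four conversational roles to a semantically rich gesture."""
    if not gesture.semantically_rich:
        return None  # deictic and beat gestures are not role-coded
    # Rule 2: the gesture falls within `window` seconds of a word-finding behaviour
    for wfd in wfds:
        if 0.0 <= gesture.onset - wfd.onset <= window:
            return "facilitative" if wfd.resolved else "communicative"
    # Rule 3: all remaining semantically rich gestures
    return "compensatory" if gesture.replaces_speech else "augmentative"
```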
Results
The data analysis is ongoing, and results for both PWA and NHP will be available for presentation at the conference.
Preliminary results indicate that PWA used significantly more semantically rich gestures than semantically empty ones (t(15) = 5.229, p < .05). Surprisingly, semantic impairment did not correlate with the use of semantically rich gestures (rs = .053, n.s.; rs = .171, n.s.). Overall, semantically rich gestures took different roles (χ²(3) = 34.956, p < .05). More semantically rich gestures were produced during procedural than narrative conversation (t(15) = -2.538, p < .05).
Discussion
Semantically rich gestures play an important role in conversation for PWA. They take different roles, with many gestures produced alongside speech (augmentative gestures) and others facilitating lexical access (facilitative gestures); only a small number replace speech (compensatory gestures). Finding out more about the different roles of gestures in speech production helps us to better understand the relationship between language and gesture, which is vital if gestures are to be incorporated into aphasia therapy.
A Systematically Conducted Scoping Review of the Evidence and Fidelity of Treatments for Verb Deficits in Aphasia: Verb-in-Isolation Treatments
Purpose: Aphasia research demonstrates increasing interest in the treatment of verb retrieval deficits. This systematically conducted scoping review reports on the level and fidelity of the current evidence for verb treatments; on its effectiveness regarding the production of trained and untrained verbs, functional communication, sentences, and discourse; and on the potential active ingredients. Recommendations to guide clinical decision making and future research are made.
Method: The computerized database search included studies from January 1980 to September 2018. The level of evidence of each study was documented, as was fidelity in terms of treatment delivery, enactment, and receipt. Studies were also categorized according to the treatment methods used.
Results: Thirty-seven studies were accepted into the review, and all but 1 constituted a low level of evidence. Thirty-three studies (89%) described treatment in sufficient detail to allow replication, dosage was poorly reported, and the fidelity of treatment was rarely assessed. The most commonly reported treatment techniques were phonological and semantic cueing in 25 (67.5%) and 20 (54%) studies, respectively. Retrieval of trained verbs improved for 80% of participants, and improvements generalized to untrained verbs for 15% of participants. There was not sufficient detail to evaluate the impact of treatment on sentence production, functional communication, and discourse.
Conclusions: The evidence for verb treatments is predominantly of a low level. There are encouraging findings in terms of treatments being replicable; however, this is tempered by poor monitoring of treatment fidelity. The quality of verb treatment research would be improved by researchers reaching consensus regarding outcome measures (including generalization to, e.g., sentences and discourse), by manualizing treatment to facilitate implementation, and by exploring the opinions of participants. Finally, while treatment is largely effective in improving production of trained verbs, the lack of generalization to untrained items leads to the recommendation that personally relevant verbs be prioritized.
Semantisch reiche Gesten und ihre Funktion im Gespräch [Semantically rich gestures and their function in conversation]
Background and research questions
Everyday communication is not limited to the verbal exchange of information; gestures also play an essential role. When speakers can see each other, gestures provide the conversation partner with additional information and help to explain abstract content. Studies indicate that gestures are not produced solely for the benefit of the conversation partner but also support the speaker. Because of their importance in conversation, gesture production in aphasia has become a central topic of research. It is important to find out how people with aphasia (compared with neurologically healthy speakers) use gestures in conversation, both alongside speech and as a replacement for speech. This knowledge can inform potential treatment approaches and point to strategies that support everyday communication. Furthermore, investigating gesture production in people with impaired language can also contribute to the theoretical debate about the relationship between language and gesture. Most studies to date have concentrated predominantly on the effects of gesture in structured naming tasks. This study examines the use of co-speech gesture in natural conversation in order to address a number of relevant research questions: (1) To what extent do people with aphasia and control participants produce semantically rich gestures? (2) What effect do semantic abilities have on gesture production? (3) Do semantically rich gestures take on different functions in conversation? Are there different distribution patterns for people with aphasia and control participants? (4) Do different conversation topics (i.e., narrative and procedural) elicit different gestures?
Methods
Twenty people with aphasia and 21 neurologically healthy control participants took part in the study. A series of linguistic and cognitive tests was administered beforehand, including tests of verbal and non-verbal semantic abilities. In total, sixteen minutes of narrative and procedural conversation data were collected. Video data were transcribed and analysed for speech and gesture production. Semantically rich gestures (iconic, metaphoric, pantomime and emblem gestures, and air writing & numbers) were contrasted with semantically empty gestures (deictic gestures, beats, and other gestures). In addition, word-finding difficulties (WFD) and their combination with gestures were identified in order to determine the different functions of semantically rich gestures (facilitative, communicative, augmentative, and compensatory).
Results
Both people with aphasia and control participants used significantly more semantically rich gestures than semantically empty ones (F(1,37) = 22.057, p < .001). This was particularly the case in the procedural conversations (F(1,39) = 61.485, p < .001). Differences emerged in the functions of the gestures: both groups used around 50% of all gestures in resolving WFD. Whereas the control participants used the other half predominantly alongside speech to convey additional information, people with aphasia largely gestured during unresolved WFD. Speech-replacing gestures occurred only very rarely in both groups. Surprisingly, no relationship was found for the people with aphasia between verbal and non-verbal semantic abilities and the production of these gestures (rs(17) = .230, n.s.; rs(17) = .362, n.s.).
Discussion
The study confirms previous findings that there is no difference between people with aphasia and control participants in the number and type of gestures produced. Differences only became apparent when gesture function was considered further: whereas control participants used gestures predominantly to support themselves or to illustrate what they were saying, people with aphasia gestured more for their conversation partner and drew the partner into the active search during WFD.
References
Kendon, A. (1997). Gesture. Annual Review of Anthropology, 26, 109-128.
Krauss, R. M., Chen, Y., & Chawla, P. (1996). Nonverbal behavior and nonverbal communication: What do conversational hand gestures tell us? In M. Zanna (Ed.), Advances in experimental social psychology (pp. 389-450). San Diego, CA: Academic Press.
McNeill, D., Cassell, J., & McCullough, K.-E. (1994). Communicative effects of speech-mismatched gestures. Research on Language and Social Interaction, 27(3), 223-237.
Rose, M. L. (2006). The utility of arm and hand gestures in the treatment of aphasia. Advances in Speech-Language Pathology, 8(2), 92-109.
Wilkinson, R. (2010). Interaction-focused intervention: A conversation analytic approach to aphasia therapy. Journal of Interactional Research in Communication Disorders, 1(1), 45-68.
The City Gesture Checklist: The development of a novel gesture assessment
Background
People with aphasia rely on gesture more than healthy controls to get their message across, but use a limited range of gesture types. Gesture therapy is thus a potential avenue of intervention for people with aphasia. However, no current gesture assessment evaluates how people with aphasia use gesture. Such a tool could inform therapy targets and measure outcomes. In gesture research, many different coding categories are used to describe gesture forms and functions. These coding methods are prohibitively time-consuming to use in clinical practice. There is therefore a need for a ‘quick and dirty’ method of assessing gesture use.
Aims
To investigate current practice among UK‐based clinicians (speech and language therapists) in relation to gesture assessment and therapy, to synthesize gesture‐coding frameworks used in aphasia research, to develop a gesture checklist based on the synthesized coding frameworks suitable for use in clinical practice, and to investigate the interrater reliability (IRR) of the checklist among experienced and unfamiliar users.
Methods & Procedures
The research team synthesized seven gesture‐coding frameworks and trialled three resulting prototype checklists at a co‐design workshop with 20 clinicians. Attending clinicians were also consulted about their current clinical gesture practice using a questionnaire. A final City Gesture Checklist (CGC) was developed based upon outcomes and feedback from the workshop. The IRR of the CGC was evaluated between the research team and 11 further clinicians within a second workshop. Both groups used the CGC to count gestures in video clips of people with aphasia talking to a conversation partner.
Main Contribution
A total of 18 workshop attendees completed the current practice questionnaire. Of these, 10 reported assessing gesture informally and five also used formal assessment. Gesture-coding synthesis highlighted six main categories of gesture form. Clinicians at the co-design workshop provided feedback on the prototype checklists regarding the relevance and usability of the gesture categories, layout, use of images and instructions. A final version of the CGC was created incorporating their recommendations. The IRR for the CGC was moderate for both the researchers and the clinicians.
Conclusions & Implications
The CGC can be used to assess the types of gesture that people with aphasia produce. The IRR was moderate amongst both experienced users and new users who had received no training. Future research directions include investigating how to improve IRR, evaluating intra-rater reliability and sensitivity to change, and exploring use of the CGC in clinical practice.
To the sentence and beyond: a single case therapy report for mild aphasia
Background: Mild aphasia has received limited attention in the research literature, with few published treatment studies despite significant disruption of communication reported by affected individuals. This includes difficulty understanding and producing grammatically complex language, and consequent discourse and/or conversational difficulties. The limited research may be due to a lack of clarity regarding the deficits underlying the disorder, with linguistic and/or cognitive impairments implicated, as well as limited research and treatment resources being targeted at those with more severe deficits
- …