Investigating the dual function of gesture in blind and visually impaired children. (Poster)
Co-speech gesture research explores the role of gesture in communication, i.e. whether gestures are intended for the listener/audience (e.g. Mol et al., 2009; Alibali et al., 2001; Holler & Beattie, 2003) or support the process of speech production (Kita & Davies, 2009; Hostetter et al., 2007). To investigate the role of gesture in communication, we turn to blind and visually impaired speakers, whose opportunities to learn gestures visually are limited (cf. Iverson & Goldin-Meadow, 1998, 2001). The present study aims to provide insight into the nature and occurrence of co-speech gestures in the spontaneous speech of blind, severely visually impaired and sighted individuals. Participants were asked to read a short story (either in print or in Braille) and to re-tell it to the interviewer. Care was taken to establish an environment in which the participants would feel safe and would not refrain from gesturing for fear of hurting themselves or others. We predicted that if blind speakers did not gesture as much as their visually impaired peers, this would suggest that gesture is to some extent acquired through visual instruction. However, following Iverson et al. (2000) and Iverson and Goldin-Meadow (1998), we hypothesized that despite the absence of visual gestural stimuli during the language-learning process, gesture is present in the language of the blind participants, but with differences in gesture form, types and functions. The present study aims to explore and categorize these differences, with regard to how sensory references are visible in the gestures of participants with various degrees of sight impairment. Regardless of these dissimilarities, the presence of gesture in both blind and visually impaired individuals points towards a dual function of co-speech gestures, i.e. as a device for both the speaker and their interlocutor.
HAND GESTURE RECOGNITION
Sign language is the most important means of communication for hearing- and speech-impaired people, so it becomes necessary to create a bridge between two people who want to communicate. Many algorithms have been developed in recent years to help people who do not know sign language, but very few with good results exist. The difficult part of hand gesture recognition is the segmentation of the hand, i.e. separating the hand from the background, and then identifying the gesture. This paper describes some possible approaches to segmentation using RGB colour spaces and models, and presents the algorithm with the highest accuracy for this task. Experiments were conducted for different gestures and accuracy results were obtained. The algorithms were implemented in the MATLAB programming language.
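The abstract does not give the paper's exact segmentation rule, so as a rough illustration of what explicit RGB skin-colour segmentation looks like, here is a minimal sketch in Python/NumPy rather than MATLAB. The thresholds below are a widely used rule of thumb for skin pixels, not values taken from the paper:

```python
import numpy as np

def segment_skin_rgb(image):
    """Return a boolean mask of likely skin pixels in an RGB image.

    Heuristic (an assumption, not the paper's algorithm): skin pixels
    tend to satisfy R > G > B with sufficient brightness and a clear
    separation between the red and green channels.
    """
    r = image[..., 0].astype(int)
    g = image[..., 1].astype(int)
    b = image[..., 2].astype(int)
    mask = (r > 95) & (g > 40) & (b > 20) & \
           (r > g) & (r > b) & ((r - g) > 15)
    return mask

# Tiny synthetic example: one skin-like pixel, one dark background pixel.
img = np.array([[[200, 120, 90], [30, 30, 30]]], dtype=np.uint8)
print(segment_skin_rgb(img))  # skin-like pixel -> True, dark pixel -> False
```

In practice such a fixed-threshold rule is sensitive to lighting, which is presumably why the paper compares several colour spaces and models before selecting the most accurate one.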
Words are not enough: Empowering people with aphasia in the design process
The use and function of gestures in word-finding difficulties in aphasia
Background: Gestures are spontaneous hand and arm movements that are part of everyday communication. The roles of gestures in communication are disputed. Most agree that they augment the information conveyed in speech. More contentiously, some argue that they facilitate speech, particularly when word-finding difficulties (WFD) occur. Exploring gestures in aphasia may further illuminate their role.
Aims: This study explored the spontaneous use of gestures in the conversation of participants with aphasia (PWA) and neurologically healthy participants (NHP). It aimed to examine the facilitative role of gesture by determining whether gestures particularly accompanied WFD and whether those difficulties were resolved.
Methods & Procedures: Spontaneous conversation data were collected from 20 PWA and 21 NHP. Video samples were analysed for gesture production, speech production, and WFD. Analysis 1 examined whether the production of semantically rich gestures in these conversations was affected by whether the person had aphasia, and/or whether there were difficulties in the accompanying speech. Analysis 2 identified all WFD in the data and examined whether these were more likely to be resolved if accompanied by a gesture, again for both groups of participants.
Outcomes & Results: Semantically rich gestures were frequently employed by both groups of participants, but with no effect of group. There was an effect of the accompanying speech, with gestures occurring most commonly alongside resolved WFD. An interaction showed that this was particularly the case for PWA. NHP, on the other hand, employed semantically rich gestures most frequently alongside fluent speech. Analysis 2 showed that WFD were common in both groups of participants. Unsurprisingly, these were more likely to be resolved for NHP than PWA. For both groups, resolution was more likely if a WFD was accompanied by a gesture.
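As a rough sketch of the comparison underlying Analysis 2, resolution rates for gestured versus ungestured word-finding difficulties can be tallied from coded conversation data like this (the records below are invented for illustration; they are not the study's data):

```python
def resolution_rates(wfd_records):
    """Compute the proportion of resolved WFD with and without gesture.

    wfd_records: list of (gestured: bool, resolved: bool) pairs,
    one per coded word-finding difficulty.
    """
    counts = {True: [0, 0], False: [0, 0]}  # gestured -> [resolved, total]
    for gestured, resolved in wfd_records:
        counts[gestured][1] += 1
        counts[gestured][0] += int(resolved)
    return {g: r / t for g, (r, t) in counts.items() if t}

# Invented sample: gestured WFD resolved 2/3, ungestured resolved 1/3.
sample = [(True, True), (True, True), (True, False),
          (False, True), (False, False), (False, False)]
print(resolution_rates(sample))
```

The study's actual conclusion rests on comparing such rates (with appropriate statistics) across both participant groups; this sketch only shows the shape of the tally.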
Conclusions: These findings shed light on the different functions of gesture within conversation. They highlight the importance of gesture during WFD in both aphasic and neurologically healthy language, and suggest that gesture may facilitate word retrieval.
Imitation, mirror neurons and autism
Various deficits in the cognitive functioning of people with autism have been documented in recent years, but these provide only partial explanations for the condition. We focus instead on an imitative disturbance involving difficulties both in copying actions and in inhibiting more stereotyped mimicking, such as echolalia. A candidate for the neural basis of this disturbance may be found in a recently discovered class of neurons in frontal cortex, 'mirror neurons' (MNs). These neurons show activity in relation both to specific actions performed by self and matching actions performed by others, providing a potential bridge between minds. MN systems exist in primates without imitative and 'theory of mind' abilities, and we suggest that in order for them to have become utilized to perform social cognitive functions, sophisticated cortical neuronal systems have evolved in which MNs function as key elements. Early developmental failures of MN systems are likely to result in a consequent cascade of developmental impairments characterised by the clinical syndrome of autism.
Enactivism and ethnomethodological conversation analysis as tools for expanding Universal Design for Learning: the case of visually impaired mathematics students
Blind and visually impaired mathematics students must rely on accessible materials such as tactile diagrams to learn mathematics. However, these compensatory materials are frequently found to offer students inferior opportunities for engaging in mathematical practice and do not allow sensorily heterogeneous students to collaborate. Such prevailing problems of access and interaction are central concerns of Universal Design for Learning (UDL), an engineering paradigm for inclusive participation in cultural praxis like mathematics. Rather than directly adapt existing artifacts for broader usage, the UDL process begins by interrogating the praxis these artifacts serve and then radically re-imagining tools and ecologies to optimize usability for all learners. We argue for the utility of two additional frameworks to enhance UDL efforts: (a) enactivism, a cognitive-sciences view of learning, knowing, and reasoning as modal activity; and (b) ethnomethodological conversation analysis (EMCA), which investigates participants' multimodal methods for coordinating action and meaning. Combined, these approaches help frame the design and evaluation of opportunities for heterogeneous students to learn mathematics collaboratively in inclusive classrooms by coordinating perceptuo-motor solutions to joint manipulation problems. We contextualize the thesis with a proposal for a pluralist design for proportions, in which a pair of students jointly operate an interactive technological device.
Affective Medicine: a review of Affective Computing efforts in Medical Informatics
Background: Affective computing (AC) is concerned with emotional interactions performed with and through computers. It is defined as "computing that relates to, arises from, or deliberately influences emotions". AC enables investigation and understanding of the relation between human emotions and health, as well as the application of assistive and useful technologies in the medical domain. Objectives: 1) To review the general state of the art in AC and its applications in medicine, and 2) to establish synergies between the research communities of AC and medical informatics. Methods: Aspects related to the human affective state as a determinant of human health are discussed, coupled with an illustration of significant AC research and related literature output. Moreover, affective communication channels are described and their range of application fields is explored through illustrative examples. Results: The presented conferences, European research projects and research publications illustrate the recent increase of interest in the AC area by the medical community. Tele-home healthcare, ambient intelligence (AmI), ubiquitous monitoring, e-learning and virtual communities with emotionally expressive characters for elderly or impaired people are a few of the areas where the potential of AC has been realized and applications have emerged. Conclusions: A number of gaps can potentially be overcome through the synergy of AC and medical informatics. The application of AC technologies parallels the advancement of the existing state of the art and the introduction of new methods. The body of work and projects reviewed in this paper attests to an ambitious and optimistic synergetic future for the field of affective medicine.