Stutter-like dysfluencies in Flemish sign language users
Although stuttering is primarily considered a disorder of speech, stutter-like dysfluencies have been reported during non-speech activities such as musical expression and sign language. We recently conducted a questionnaire study aimed at documenting the possible occurrence and nature of stutter-like dysfluencies in Flemish Sign Language. A questionnaire was sent to 66 individuals who have knowledge of Flemish Sign Language and regularly come into contact with its users: 38 Flemish Sign Language interpreters and 28 employees of special needs schools for deaf and partially deaf pupils. The questionnaire consisted of three parts. First, the participants were asked about their occupational activities. The second part focused on the research questions: the interviewees were asked whether they had ever noticed dysfluencies in the manual communication of deaf and partially deaf signers. If so, they were to indicate on a list which types of dysfluencies they had perceived and to specify whether the dysfluencies generally occurred at the beginning, in the middle, or at the end of a sign movement. Finally, the participants were asked to provide details (such as gender, age, nature of the fluency problems, awareness, secondary behaviour, and influencing factors) on each deaf or partially deaf person they considered to be dysfluent in the manual mode. Of the 66 individuals surveyed, 13 (20%) responded. Of those 13 respondents, nine (69%) reported having observed dysfluencies in the manual communication of Flemish Sign Language users. Concerning the nature of these dysfluencies, participants mostly perceived ‘involuntary interjections’, ‘repetitions of sign movement’, ‘unusual body movements’ and ‘poor fluidity of the sign’. Looking at the distribution of the dysfluencies within the sign movement, fluency failures can occur at various loci, but there seems to be a slight preponderance for the initial position.
Individuals considered to be dysfluent in the manual mode are often male. They can be aware of their fluency problems and, if so, often demonstrate secondary stuttering behaviour. Events accompanied by stress, fatigue or emotion increase the manual dysfluencies, at least in some cases. The current study revealed mainly features that are typical of stuttering, but also some features unlike those usually observed in stutterers. If dysfluencies in manual communication can be regarded as stuttering, this has implications for our perception of the stuttering phenomenon. One possibility is to hold on to the idea that stuttering is ‘first and foremost a disorder of speech’. Alternatively, instead of being a (speech) disorder in itself, stuttered speech and manual dysfluencies could be considered symptoms of an underlying disturbance in motor functioning. In that case, one would expect to encounter stutter-like dysfluencies in all sorts of behaviour demanding extensive motor planning
NEW shared & interconnected ASL resources: SignStream® 3 Software; DAI 2 for web access to linguistically annotated video corpora; and a sign bank
2017 marked the release of a new version of SignStream® software, designed to facilitate linguistic analysis of ASL video. SignStream® provides an intuitive interface for labeling and time-aligning manual and non-manual components of the signing. Version 3 has many new features. For example, it enables representation of morpho-phonological information, including display of handshapes. An expanding ASL video corpus, annotated through use of SignStream®, is shared publicly on the Web. This corpus (video plus annotations) is Web-accessible—browsable, searchable, and downloadable—thanks to a new, improved version of our Data Access Interface: DAI 2. DAI 2 also offers Web access to a brand new Sign Bank, containing about 10,000 examples of about 3,000 distinct signs, as produced by up to 9 different ASL signers. This Sign Bank is also directly accessible from within SignStream®, thereby boosting the efficiency and consistency of annotation; new items can also be added to the Sign Bank. Soon to be integrated into SignStream® 3 and DAI 2 are visualizations of computer-generated analyses of the video: graphical display of eyebrow height, eye aperture, an
On modality in Georgian Sign Language (GESL)
Modality is one of the most fascinating and complex areas of language studies. This paper illustrates the types of modal constructions in Georgian Sign Language (GESL), including negative forms. GESL expresses modality semantics with a combination of manual and facial signs. Modals in GESL can occur in pre-verbal, clause-final, or clause-initial position, as in many other sign languages (SLs). GESL modal constructions show a specific tense-related negation strategy. Modal constructions in this language often use combinations of modal signs with an equal value
Scalable ASL sign recognition using model-based machine learning and linguistically annotated corpora
We report on the high success rates of our new, scalable, computational approach for sign recognition from monocular video, exploiting linguistically annotated ASL datasets with multiple signers. We recognize signs using a hybrid framework combining state-of-the-art learning methods with features based on what is known about the linguistic composition of lexical signs. We model and recognize the sub-components of sign production, with attention to hand shape, orientation, location, motion trajectories, plus non-manual features, and we combine these within a CRF framework. The effect is to make the sign recognition problem robust, scalable, and feasible with relatively small datasets compared with those required for purely data-driven methods. From a 350-sign vocabulary of isolated, citation-form lexical signs from the American Sign Language Lexicon Video Dataset (ASLLVD), including both 1- and 2-handed signs, we achieve a top-1 accuracy of 93.3% and a top-5 accuracy of 97.9%. The high probability with which we can produce 5 sign candidates that contain the correct result opens the door to potential applications, as it is reasonable to provide a sign lookup functionality that offers the user 5 possible signs, in decreasing order of likelihood, with the user then asked to select the desired sign
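The 5-candidate lookup and top-k evaluation described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: the function names, the toy gloss scores, and the assumption that the recognizer outputs a score per sign gloss are all illustrative.

```python
def top_k_signs(scores, k=5):
    """Return the k sign glosses with the highest scores, in decreasing
    order of likelihood (the ranked candidate list shown to the user)."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [gloss for gloss, _ in ranked[:k]]

def top_k_accuracy(predictions, gold_labels, k=5):
    """Fraction of test items whose correct gloss appears among the
    top-k candidates (how top-1 and top-5 accuracy are computed)."""
    hits = sum(1 for preds, gold in zip(predictions, gold_labels)
               if gold in preds[:k])
    return hits / len(gold_labels)

# Toy per-gloss scores for one video clip (illustrative values only)
scores = {"BOOK": 0.61, "HOUSE": 0.21, "CAT": 0.10,
          "DOG": 0.05, "TREE": 0.02, "CAR": 0.01}
print(top_k_signs(scores, k=3))  # ['BOOK', 'HOUSE', 'CAT']
```

In a lookup application, the list returned by `top_k_signs` would be presented to the user, who then selects the intended sign.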
A STUDY ON THE TECHNIQUES OF TEACHING ENGLISH TO HEARING IMPAIRED STUDENTS OF SMPLB PANCABHAKTI MAGETAN
Being able to use English is very important in this era of globalization. Endless information from mass media, written or spoken in English, is available to people who know or understand English. Nowadays, English is one of the important subjects that must be taught as a compulsory subject from primary school to university level. Therefore, the government has included this subject in the SLB, the schools for students with disabilities. Consequently, the English teacher should be able to use appropriate techniques to teach hearing-impaired students.
This study aimed at getting information about the techniques of teaching English, the most dominant technique, and the strengths and weaknesses of the techniques used by the teacher at SMPLB Panca Bhakti Magetan.
The method used was descriptive qualitative, because the data would be reported in written form rather than statistical form. The subject of the study was the English teacher at SMPLB Panca Bhakti Magetan.
The findings of the study revealed that the techniques used by the teacher to teach English at SMPLB Panca Bhakti Magetan were the total communication and manual communication techniques. The most dominant technique was the manual communication technique. The strengths of the total communication technique were that hearing-impaired students could learn lip reading, so they could communicate with people who cannot use finger spelling or sign language, and that they would pay more attention to the materials. However, the total communication technique took a long time, because the hearing-impaired students often had difficulty reading the teacher’s lips, and presenting sign and speech together was also difficult for both the teacher and the hearing-impaired students. The manual communication technique made it easy for the hearing-impaired students to understand the materials, because they used this technique every day. It was suited to the students’ abilities, but they had to learn a lot of sign language, which made the technique inefficient. Besides, it was difficult for them to communicate with hearing people, because not everyone understands sign language or finger spelling.
Therefore, the English teacher of hearing-impaired students should expand her knowledge by attending English trainings, seminars, or workshops to increase her professionalism. The teacher should also motivate the hearing-impaired students to use a dictionary when they do not understand the meaning of sentences or words
Stutter-like dysfluencies in Flemish sign language users
The purpose of this communication is to report on the occurrence of stutter-like behaviour in Flemish Sign Language users. A questionnaire was sent to 38 Flemish Sign Language interpreters and 28 employees of special needs schools for deaf and partially deaf pupils, inquiring whether they had ever observed dysfluencies in the manual communication of the deaf and partially deaf. Of the 13 individuals who responded, nine indicated that they had perceived such behaviour. The characteristics of the observed dysfluencies are summarized and their implications are discussed
A survey on mouth modeling and analysis for Sign Language recognition
© 2015 IEEE. Around 70 million Deaf people worldwide use Sign Languages (SLs) as their native languages. At the same time, they have limited reading/writing skills in the spoken language. This puts them at a severe disadvantage in many contexts, including education, work, and usage of computers and the Internet. Automatic Sign Language Recognition (ASLR) can support the Deaf in many ways, e.g. by enabling the development of systems for Human-Computer Interaction in SL and translation between sign and spoken language. Research in ASLR usually revolves around automatic understanding of manual signs. Recently, the ASLR research community has started to appreciate the importance of non-manuals, since they are related to the lexical meaning of a sign, the syntax and the prosody. Non-manuals include body and head pose, movement of the eyebrows and the eyes, as well as blinks and squints. Arguably, the mouth is one of the most involved parts of the face in non-manuals. Mouth actions related to ASLR can be either mouthings, i.e. visual syllables produced with the mouth while signing, or non-verbal mouth gestures. Both are very important in ASLR. In this paper, we present the first survey on mouth non-manuals in ASLR. We start by showing why mouth motion is important in SL and the relevant techniques that exist within ASLR. Since limited research has been conducted regarding automatic analysis of mouth motion in the context of ASLR, we proceed by surveying relevant techniques from the areas of automatic mouth expression and visual speech recognition which can be applied to the task. Finally, we conclude by presenting the challenges and potentials of automatic analysis of mouth motion in the context of ASLR
A Cross-Linguistic Preference For Torso Stability In The Lexicon: Evidence From 24 Sign Languages
When the arms move in certain ways, they can cause the torso to twist or rock. Such extraneous torso movement is undesirable, especially during sign language communication, when torso position may carry linguistic significance, so we expend effort to resist it when it is not intended. This so-called “reactive effort” has only recently been identified by Sanders and Napoli (2016), but their preliminary work on three genetically unrelated languages suggests that the effects of reactive effort can be observed cross-linguistically by examination of sign language lexicons. In particular, the frequency of different kinds of manual movements in the lexicon correlates with the amount of reactive effort needed to resist movement of the torso. Following this line of research, we present evidence from 24 sign languages confirming that there is a cross-linguistic preference for minimizing the reactive effort needed to keep the torso stable
