3 research outputs found

    ์žฅ์• ์ธ ์‚ฌ์šฉ์ž๋“ค์„ ๋Œ€์ƒ์œผ๋กœ ํ•œ ์Œ์„ฑ ์‚ฌ์šฉ์ž ์ธํ„ฐํŽ˜์ด์Šค์˜ ์‚ฌ์šฉ์ž ๊ฒฝํ—˜์— ๋Œ€ํ•œ ์‚ฌ์šฉ์ž ์ค‘์‹ฌ ์—ฐ๊ตฌ: ์ง€๋Šฅํ˜• ๊ฐœ์ธ ๋น„์„œ๋ฅผ ์ค‘์‹ฌ์œผ๋กœ

    Get PDF
    Doctoral dissertation, Seoul National University Graduate School, Interdisciplinary Program in Cognitive Science, February 2021.
In recent years, research on Voice User Interfaces (VUIs) has been actively conducted. VUIs have many advantages that can be very useful for the general public as well as for elderly people and people with disabilities. The VUI is considered well suited to promoting universal access to information for individuals with disabilities, decreasing the gap between users with and without disabilities. In this respect, many researchers have been trying to apply the VUI to various areas to increase the independence and quality of life of people with disabilities. However, previous studies of VUIs for people with disabilities have usually focused on the development and evaluation of new systems, and empirical studies are limited. Only a few studies have addressed the User Experience (UX) of VUIs for people with disabilities. The situation is no different for studies of Intelligent Personal Assistants (IPAs), one of the most widely used kinds of VUI today. Although IPAs have the potential to be practically useful for users with disabilities, because they can perform more varied tasks than simple VUIs, research on the UX of IPAs for these users has received little attention, focusing instead on young and middle-aged adults without disabilities as end users. Many previous studies have suggested that IPAs would be helpful to people with disabilities, yet only a few have been conducted from the perspective of users with disabilities, especially in terms of UX. Investigating usability and UX for users with disabilities is known to be more difficult and delicate than doing so for users without disabilities. Research on the UX of IPAs for users with disabilities should therefore be conducted more closely, to understand their interactions with IPAs.
The purpose of the research in this dissertation is to investigate the UX of VUIs for users with disabilities, focusing on IPAs. The research consists of three independent main studies. Study 1 investigates the UX of commercially available VUIs for users with disabilities by examining acceptance, focusing on the differences between users with different types of disabilities and identifying the reasons why they do or do not use VUIs. A questionnaire survey was conducted with users with disabilities who had used one or more VUIs, and the collected data were analyzed statistically and qualitatively. The results show the acceptance of VUIs and the relationships between the acceptance factors for users with disabilities, with some differences between users with different types of disabilities. They also provide insights into the UX of VUIs from the perspective of users with disabilities, showing that the acceptance factors can serve as criteria for comprehending the issues. Study 2 investigates the UX of IPAs through semantic network analysis of online reviews written by users. Before investigating the UX of IPAs for users with disabilities, this study proposes important factors for the UX of IPAs by examining the UX of IPAs for users without disabilities. As a case study, online reviews of smart speakers were collected from the internet. The collected text data were then preprocessed and structured, with words of similar meaning clustered into one representative keyword. The frequency of each keyword was calculated, and the 50 most frequent keywords were used for the analysis, as they were considered the core keywords. Based on these keywords, a network was visualized and centrality was measured. The results show that most users were satisfied with IPAs, although they felt that their performance was not completely reliable.
In addition, the results show that the aesthetic aspects of IPAs are important for users' enjoyment and, especially, their satisfaction. The study proposes eleven important factors to be considered for the UX of IPAs and, among them, suggests ten to be considered in the design of IPAs to improve UX and satisfy users. Study 3, building on Study 1 and Study 2, investigates the UX of IPAs for users with disabilities and identifies how the use of IPAs affects their quality of life, including comparisons with users without disabilities. A questionnaire survey and a written interview were conducted with users with disabilities and users without disabilities who had used one or more smart speakers, and the collected data were analyzed statistically and qualitatively. The results show that, regardless of disability, most users share the main UX of IPAs and can benefit from their use. They also show that analysis of qualitative data is essential in studies of users with disabilities, offering various insights into the UX of IPAs from their perspective and revealing clear differences in the UX of IPAs between users with and without disabilities. The study proposes important factors for the UX of IPAs for users with and without disabilities, based on the factors discussed in Study 2. It also discusses various design implications for the UX of IPAs and provides three important implications for improving UX, focusing on the interaction design of IPAs for all potential users, not only users with disabilities. Each study provides design implications: Study 1 discusses design implications for the UX of VUIs for users with disabilities, and Study 2 suggests design implications for the UX of IPAs, focusing on users without disabilities.
Study 3 discusses various design implications for the UX of IPAs and proposes three specific implications focusing on the interaction design of IPAs for all potential users. Reflecting these implications in the interaction design of IPAs can be expected to help all potential users, not just users with disabilities.
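The keyword-network pipeline described for Study 2 (preprocess review text, cluster words of similar meaning into representative keywords, keep only the most frequent keywords, build a co-occurrence network, and measure centrality) can be sketched roughly as follows. This is a minimal illustration: the reviews, synonym map, stopword list, and top-5 cutoff (the study used the top 50) are invented for the example and are not the dissertation's actual data or code.

```python
# Sketch of a semantic network analysis over product reviews:
# tokenize, merge synonyms into representative keywords, keep the most
# frequent keywords, link keywords that co-occur in a review, and
# compute degree centrality. All data below is made up for illustration.
from collections import Counter, defaultdict
from itertools import combinations

reviews = [
    "the speaker sound is great and the voice assistant is helpful",
    "voice recognition fails sometimes but the design looks nice",
    "helpful assistant and nice sound the design is pretty",
]

# Words with similar meanings are clustered into one representative keyword.
synonyms = {"looks": "design", "pretty": "nice"}
stopwords = {"the", "and", "is", "but", "a"}

def keywords(text):
    words = [synonyms.get(w, w) for w in text.lower().split()]
    return [w for w in words if w not in stopwords]

docs = [keywords(r) for r in reviews]

# Keep only the most frequent keywords as the core keywords.
freq = Counter(w for doc in docs for w in doc)
top = {w for w, _ in freq.most_common(5)}

# Co-occurrence network: keywords appearing in the same review are linked;
# the edge weight counts how many reviews they share.
edges = defaultdict(int)
for doc in docs:
    for pair in combinations(sorted(set(doc) & top), 2):
        edges[pair] += 1

# Degree centrality: fraction of the other core keywords each keyword
# is directly connected to.
neighbors = defaultdict(set)
for a, b in edges:
    neighbors[a].add(b)
    neighbors[b].add(a)
centrality = {w: len(nb) / (len(neighbors) - 1) for w, nb in neighbors.items()}
```

Degree centrality is only one option; betweenness or eigenvector centrality plug into the same co-occurrence structure if a different notion of keyword importance is wanted.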
๋˜ํ•œ, ์ด ์—ฐ๊ตฌ์˜ ๊ฒฐ๊ณผ๋“ค์€ ์žฅ์• ์ธ ์‚ฌ์šฉ์ž๋“ค์˜ ์ž…์žฅ์—์„œ IPAs์˜ UX์™€ ๊ด€๋ จ๋œ ๋‹ค์–‘ํ•œ ์ธ์‚ฌ์ดํŠธ๋“ค๊ณผ ํ•จ๊ป˜ ๋‘ ์‚ฌ์šฉ์ž ๊ทธ๋ฃน ๊ฐ„ ๋ช…ํ™•ํ•œ ์ฐจ์ด๊ฐ€ ์žˆ๋‹ค๋Š” ๊ฒƒ์„ ๋ณด์—ฌ์คŒ์œผ๋กœ์จ ์ด๋“ค์„ ๋Œ€์ƒ์œผ๋กœ ํ•œ ์—ฐ๊ตฌ์—์„œ ์ •์„ฑ์  ๋ฐ์ดํ„ฐ์— ๋Œ€ํ•œ ๋ถ„์„์ด ํ•„์ˆ˜์ ์ด๋ผ๋Š” ์‚ฌ์‹ค์„ ๋ณด์—ฌ์ค€๋‹ค. ์ด ์—ฐ๊ตฌ๋Š”, ์—ฐ๊ตฌ 2์—์„œ ๋…ผ์˜๋œ ์š”์ธ๋“ค์„ ๋ฐ”ํƒ•์œผ๋กœ, ์žฅ์• ์ธ ์‚ฌ์šฉ์ž๋“ค๊ณผ ๋น„์žฅ์• ์ธ ์‚ฌ์šฉ์ž๋“ค์„ ์œ„ํ•œ IPAs์˜ UX์— ์žˆ์–ด ์ค‘์š” ์š”์ธ๋“ค์„ ์ œ์•ˆํ•œ๋‹ค. ๋˜ํ•œ ์ด ์—ฐ๊ตฌ๋Š” IPA์˜ UX์— ๋Œ€ํ•œ ๋‹ค์–‘ํ•œ ๋””์ž์ธ ํ•จ์˜๋“ค(implications)์„ ๋…ผ์˜ํ•˜๊ณ , ์žฅ์• ๊ฐ€ ์žˆ๋Š” ์‚ฌ์šฉ์ž๋“ค๋ฟ๋งŒ ์•„๋‹ˆ๋ผ ๋ชจ๋“  ์ž ์žฌ์  ์‚ฌ์šฉ์ž๋“ค์„ ๊ณ ๋ คํ•œ IPA์˜ ์ƒํ˜ธ ์ž‘์šฉ ์„ค๊ณ„์— ์ค‘์ ์„ ๋‘” ๊ตฌ์ฒด์ ์ธ ์„ธ ๊ฐœ์˜ ๋””์ž์ธ ํ•จ์˜๋“ค์„ ์ œ๊ณตํ•œ๋‹ค. ๊ฐ ์—ฐ๊ตฌ๋Š” ๋””์ž์ธ ํ•จ์˜๋“ค์„ ์ œ๊ณตํ•œ๋‹ค. ์—ฐ๊ตฌ 1์—์„œ๋Š” ์žฅ์• ์ธ ์‚ฌ์šฉ์ž๋“ค์„ ๋Œ€์ƒ์œผ๋กœ VUIs์˜ UX๋ฅผ ์œ„ํ•œ ๋””์ž์ธ ํ•จ์˜๋“ค์„ ๋…ผ์˜ํ•œ๋‹ค. ์—ฐ๊ตฌ 2์—์„œ๋Š” ๋น„์žฅ์• ์ธ ์‚ฌ์šฉ์ž๋“ค์—๊ฒŒ ์ดˆ์ ์„ ๋‘๊ณ  IPAs์˜ UX๋ฅผ ์œ„ํ•œ ๋””์ž์ธ ํ•จ์˜๋“ค์„ ์ œ์‹œํ•œ๋‹ค. ์—ฐ๊ตฌ 3์—์„œ๋Š” ์žฅ์• ์ธ ์‚ฌ์šฉ๋“ค๋งŒ์ด ์•„๋‹Œ ๋ชจ๋“  ์ž ์žฌ์  ์‚ฌ์šฉ์ž๋“ค์—๊ฒŒ ๋„์›€์ด ๋  ์ˆ˜ ์žˆ๋Š” ๋‹ค์–‘ํ•œ ๋””์ž์ธ ํ•จ์˜๋“ค์„ ๋…ผ์˜ํ•˜๊ณ  IPA์˜ ์ƒํ˜ธ ์ž‘์šฉ ์„ค๊ณ„์— ์ค‘์ ์„ ๋‘” ๊ตฌ์ฒด์ ์ธ ์„ธ ๊ฐœ์˜ ๋””์ž์ธ ํ•จ์˜๋“ค์„ ์ œ์•ˆํ•œ๋‹ค. ์ด๋Ÿฌํ•œ ํ•จ์˜๋“ค์„ IPAs์˜ ๋””์ž์ธ์— ๋ฐ˜์˜ํ•˜๋Š” ๊ฒƒ์€ ์žฅ์• ์ธ ์‚ฌ์šฉ์ž๋“ค๋ฟ๋งŒ ์•„๋‹ˆ๋ผ ์ž ์žฌ์ ์ธ ๋ชจ๋“  ์‚ฌ์šฉ์ž๋“ค์—๊ฒŒ ๋„์›€์ด ๋  ๊ฒƒ์ด๋‹ค.ABSTRACT I CONTENTS V LIST OF TABLES VIII LIST OF FIGURES X CHAPTER 1 INTRODUCTION 1 1.1. Research Background 1 1.2. Research Objective 4 1.3. Outline of this Dissertation 7 CHAPTER 2 LITERATURE REVIEW 10 2.1. People with Disabilities and Research Methods for Them 10 2.1.1. People with Disabilities 10 2.1.2. Research Methods for People with Disabilities 11 2.2. Conceptual Frameworks 13 2.2.1. User Experience of Voice User Interfaces 13 2.2.2. 
Design Approaches for Accessibility 18 2.3. Related Work 22 2.3.1. Previous Studies Related to Voice User Interfaces 22 2.3.2. Previous Studies Related to Intelligent Personal Assistants 25 CHAPTER 3 INVESTIGATION ON USER EXPERIENCE OF VOICE USER INTERFACES FOR USERS WITH DISABILITIES BY EXAMINING ACCEPTANCE 31 3.1. Introduction 31 3.2. Method 35 3.2.1. Participants 35 3.2.2. Procedure 35 3.2.3. Questionnaire 36 3.2.4. Analysis 38 3.2.4.1. Statistical Analysis 38 3.2.4.1. Qualitative Analysis 38 3.3. Results 38 3.3.1. Reliability Analysis and Validity Analysis 38 3.3.2. Descriptive Analysis and Independent Two-Sample T-Test 39 3.3.3. Multiple Regression Analysis 39 3.3.4. Analysis on Comments of the Participants 44 3.4. Discussion 45 3.4.1. User Experience of Voice User Interfaces for Users with Disabilities 45 3.4.2. Reasons of Users with Disabilities for Using Voice User Interfaces or not 48 3.4.3. Design Implications on Voice User Interfaces for Users with Disabilities 50 3.5. Conclusion 51 CHAPTER 4 INVESTIGATION ON USER EXPERIENCE OF INTELLIGENT PERSONAL ASSISTANTS FROM ONLINE REVIEWS BY IDENTIFYING IMPORTANT FACTORS 54 4.1. Introduction 54 4.2. Method 56 4.2.1. Data Collection 56 4.2.2. Preprocessing and Structuring Data 57 4.2.3. Analysis 57 4.3. Results 60 4.3.1. Analysis on Frequency of the Keywords and Categorizing the Keywords 61 4.3.2. Visualization of the Network 61 4.3.3. Analysis on Centrality of the Keywords 65 4.4. Discussion 65 4.4.1. User Experience of Intelligent Personal Assistants through Semantic Network Analysis from Online Reviews 65 4.4.2. Important Factors for User Experience of Intelligent Personal Assistants and Design Implications 70 4.5. Conclusion 74 CHAPTER 5 INVESTIGATION ON USER EXPERIENCE OF INTELLIGENT PERSONAL ASSISTANTS AND EFFECTS ON QUALITY OF LIFE FOR USERS WITH DISABILITIES BY COMPARING WITH USERS WITH NON-DISABILITIES 76 5.1. Introduction 76 5.2. Method 78 5.2.1. Participants 78 5.2.2. Procedure 79 5.2.3. 
Questionnaire 79 5.2.4. Written Interview 81 5.2.5. Analysis 84 5.2.5.1. Statistical Analysis 84 5.2.5.2. Qualitative Analysis 84 5.3. Results 85 5.3.1. Reliability Analysis and Validity Analysis 85 5.3.2. Descriptive Analysis and Mann-Whitney U-test 85 5.3.2.1. User Experience of Intelligent Personal Assistants 85 5.3.2.2. Effects of the Use of Intelligent Personal Assistants on Quality of Life 87 5.3.3. Analysis on the Written Interview 89 5.3.3.1. Analysis on Issues Related to User Experience from the Written Interview 89 5.3.3.2. Semantic Network Analysis on the Written Interview 91 5.4. Discussion 99 5.4.1. User Experience of Intelligent Personal Assistants 99 5.4.1.1. Discussion on the Statistical Analysis 99 5.4.1.2. Discussion on the Analysis on the Written Interview 106 5.4.2. Effects of the Use of Intelligent Personal Assistants on Quality of Life 110 5.4.2.1. Discussion on the Statistical Analysis 110 5.4.2.2. Discussion on the Analysis on the Written Interview 111 5.4.3. Design Implications for User Experience of Intelligent Personal Assistants for Users with Disabilities 112 5.5. Conclusion 115 CHAPTER 6 DISCUSSION AND CONCLUSION 118 6.1. Summary of this Research 118 6.2. Contributions of this Research 121 6.3. Limitations of this Research and Future Studies 124 BIBLIOGRAPHY 126 APPENDIX 143 ABSTRACT IN KOREAN (๊ตญ๋ฌธ ์ดˆ๋ก) 181Docto

    Anรกlisis de interfaces basadas en movimientos de iris y de cabeza para personas con parรกlisis cerebral

    Get PDF
    This document is the final report of a Master's thesis submitted for the degree of Máster en Ingeniería de Sistemas Electrónicos (Master in Electronic Systems Engineering). The work is titled "Análisis de interfaces basadas en movimientos de iris y de cabeza para personas con parálisis cerebral" (Analysis of interfaces based on iris and head movements for people with cerebral palsy). It was carried out by Alejandro Clemotte under the supervision of Dr. Rafael Raya, Dr. Ramón Ceres, and Dr. Ricardo de Córdoba between 2011 and 2013, at the facilities of the bioengineering group of the Consejo Superior de Investigaciones Científicas (GBIO-CSIC) [1], within the Master in Electronic Systems Engineering of the Universidad Politécnica de Madrid. Technological advances can improve people's quality of life. Sometimes, however, technology is not within everyone's reach: people with motor, hearing, or speech limitations cannot access these benefits because of the lack of interfaces adapted to their capabilities. The computer in particular is a technological tool that supports countless tasks, whether social, rehabilitative, or recreational [2], yet it is hardly accessible to people with limited capabilities. It is therefore important to develop tools that provide universal access. The work consists of a practical study of the performance of people with cerebral palsy and people without disabilities while they carry out target-reaching tasks. The tasks are performed with two alternative computer interfaces. The technical limitations of each interface are analyzed by defining special metrics, and a conceptual proposal is made for reducing those limitations, with the aim of improving computer accessibility for this group of people with disabilities.
Calificaciรณn del tribunal: 10 con matrรญcula de hono

    Evaluation methodology for eye-trackers as alternative access device for people with cerebral palsy

    Full text link
    The procedures for evaluating alternative computer access systems are neither rigorous, systematic, nor formal. We present a methodology to evaluate user-computer interaction when people with cerebral palsy (CP) use eye-trackers as an alternative access device, based on three metrics: calibration failure rate, click error rate, and click time. We validated the methodology by comparing three commercial eye-trackers with nine participants severely affected by CP. The results indicate that calibration is a very critical process in these scenarios, as reflected by the high rate of calibration failures measured.
The participants with CP also have a high click error rate, indicating that using eye-trackers to reach an object on the screen is a complex process at these levels of motor disability. Click times are similar across all eye-trackers and participants. This study is not intended to discourage the use of eye-trackers in the population with CP, but to establish methodological guidelines for the effective evaluation of commercial devices, which can be an interesting alternative for computer access for this population. This research was funded by the INTERPLAY (RTC2014-1812-1) and INTERAAC (RTC-2015-4327-1) projects.
Clemotte, A.; Velasco, M.; Raya, R.; Ceres, R.; De Córdoba, R.; Rocón, E. (2017). Metodología de Evaluación de Eye-trackers como Dispositivos de Acceso Alternativo para Personas con Parálisis Cerebral. Revista Iberoamericana de Automática e Informática Industrial, 14(4), 384-393. https://doi.org/10.1016/j.riai.2017.07.004
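The three metrics the methodology proposes (calibration failure rate, click error rate, and click time) can be computed from per-trial logs along the following lines. This is an illustrative sketch only: the trial record layout and the example values are assumptions made for the example, not the paper's actual data schema or results.

```python
# Sketch of the three evaluation metrics over target-reaching trials.
# The Trial record and the example values are invented for illustration.
from dataclasses import dataclass

@dataclass
class Trial:
    calibrated: bool      # did eye-tracker calibration succeed for this trial?
    clicked_target: bool  # did the click land on the intended target?
    click_time_s: float   # time from target onset to click, in seconds

trials = [
    Trial(True, True, 1.8),
    Trial(True, False, 3.2),
    Trial(False, False, 0.0),  # calibration failed; no usable click
    Trial(True, True, 2.0),
]

def calibration_failure_rate(trials):
    # Fraction of trials in which calibration could not be completed.
    return sum(not t.calibrated for t in trials) / len(trials)

def click_error_rate(trials):
    # Only calibrated trials produce clicks that can hit or miss the target.
    done = [t for t in trials if t.calibrated]
    return sum(not t.clicked_target for t in done) / len(done)

def mean_click_time(trials):
    # Average click time over successful clicks only.
    ok = [t for t in trials if t.calibrated and t.clicked_target]
    return sum(t.click_time_s for t in ok) / len(ok)
```

Separating the calibration outcome from the click outcome matters here: a failed calibration produces no meaningful click, so the click error rate and click time are computed only over the calibrated trials.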