
    Novel Multimodal Feedback Techniques for In-Car Mid-Air Gesture Interaction

    This paper presents an investigation into the effects of different feedback modalities on mid-air gesture interaction with in-car infotainment systems. Driver distraction is the most common cause of car crashes and near-crash events. Mid-air interaction can reduce driver distraction by lowering the visual demand of infotainment systems. Despite a range of available modalities, feedback in mid-air gesture systems is generally provided through visual displays. We conducted a simulated driving study to investigate how different types of multimodal feedback can support in-air gestures. We consider the effects of different feedback modalities on eye-gaze behaviour and on the driving and gesturing tasks. We found that feedback modality influenced gesturing behaviour. However, drivers corrected falsely executed gestures more often in non-visual conditions. Our findings show that non-visual feedback can significantly reduce visual distraction.

    Evaluating Simultaneous Visual Instructions with Kindergarten Children on Touchscreen Devices

    A myriad of educational applications using tablets and multi-touch technology for kindergarten children have been developed in the last decade. However, despite the possible benefits of using visual prompts to communicate information to kindergarteners, these visual techniques have not yet been fully studied. This article therefore investigates kindergarten children's ability to understand and follow several visual prompts about how to proceed and interact in a virtual 2D world. The results show that kindergarteners can effectively understand several visual prompts with different communication purposes even when they are used simultaneously. The results also show that using the evaluated visual prompts to communicate information during play reduces the number of interventions of a technical nature, fostering dialogues related to the learning activity guided by instructors or caregivers. Hence, this work is a starting point for designing dialogic learning scenarios tailored to kindergarten children. This work is supported by the Spanish Ministry of Economy and Competitiveness and funded by the European Regional Development Fund (ERDF-FEDER) under Project TIN2014-60077-R; by the VALi+d program of the Conselleria d'Educació, Cultura i Esport (Generalitat Valenciana) under fellowship ACIF/2014/214; and by the FPU program of the Spanish Ministry of Education, Culture, and Sport under fellowship FPU14/00136. Nácher, V.; García-Sanjuan, F.; Jaén Martínez, F. J. (2020). Evaluating Simultaneous Visual Instructions with Kindergarten Children on Touchscreen Devices. International Journal of Human-Computer Interaction, 36(1), 41-54. https://doi.org/10.1080/10447318.2019.1597576

    May the Force Be with You: Ultrasound Haptic Feedback for Mid-Air Gesture Interaction in Cars

    The use of ultrasound haptic feedback for mid-air gestures in cars has been proposed to give users a sense of control over their intended actions and to add touch to an otherwise touchless interaction. However, the impact of ultrasound feedback delivered to the gesturing hand on lane deviation, eyes-off-the-road time (EORT) and perceived mental demand has not yet been measured. This paper investigates the impact of uni- and multimodal presentation of ultrasound feedback on the primary driving task and the secondary gesturing task in a simulated driving environment. The multimodal combinations paired ultrasound with visual, auditory, and peripheral-light feedback. We found that ultrasound feedback presented uni-modally and bi-modally resulted in significantly less EORT than visual feedback. Our results suggest that multimodal ultrasound feedback for mid-air interaction decreases EORT without compromising driving performance or mental demand, and can thus increase safety while driving.

    Research on Application of Cognitive-Driven Human-Computer Interaction

    Human-computer interaction is an important research topic in human-factors engineering for intelligent manufacturing. Natural human-computer interaction conforms to users' habitual cognition and can efficiently handle imprecise information, thereby improving user experience and reducing cognitive load. By analysing the information-interaction process, users' cognition of the interaction experience, and the principles of human-computer interaction systems, we establish a cognitive-driven model of information transmission in human-computer interaction. We survey the main interaction modes in current human-computer interaction systems and discuss their application status, technical requirements and open problems. We then discuss methods for analysing and evaluating interaction modes at three levels, namely subjective evaluation, physiological measurement and mathematical evaluation, so as to promote the understanding of imprecise information, achieve interaction self-adaptation and guide the design and optimisation of human-computer interaction systems. Based on the state of human-computer interaction in intelligent environments, we identify current research hotspots, open problems and development trends.