
    Designing information feedback within hybrid physical/digital interactions

    Whilst digital and physical interactions were once treated as separate design challenges, there is a growing need to consider them together to allow the creation of hybrid digital/physical experiences. For example, digital games can now include physical objects (with digital properties) or digital objects (with physical properties), both of which may be used to provide input, output, or in-game information in various combinations. In this paper we consider how users perceive and understand interactions that include physical/digital objects through the design of a novel game, which allows us to consider: i) the character of the space or spaces in which we interact; ii) how users perceive their operation; and iii) how we can design such objects to extend the bandwidth of information we provide to the user/player. The prototype is used as the focus of a participatory design workshop in which players experimented with, and discussed, physical ways of representing virtual in-game information. The results have been used to provide a framing for designers approaching information feedback in this domain, and highlight the requirement for further design research.

    Interactive spaces for children: gesture elicitation for controlling ground mini-robots

    Interactive spaces for education are emerging as a mechanism for fostering children's natural ways of learning by means of play and exploration in physical spaces. The advanced interaction modalities and devices for such environments need to be both motivating and intuitive for children. Among the wide variety of interactive mechanisms, robots have been a popular research topic in the context of educational tools due to their attractiveness for children. However, few studies have focused on how children would naturally interact with, and explore, interactive environments that include robots. While there is abundant research on full-body interaction and intuitive manipulation of robots by adults, no similar research has been done with children. This paper therefore describes a gesture elicitation study that identified the preferred gestures and body language used by children to control ground robots. The results of the elicitation study were used to define a gestural language that covers the gesture preferences of each age group and gender, with a good acceptance rate in the 6-12 age range. The study also revealed that interactive spaces in which robots are controlled through body gestures are motivating and promising scenarios for collaborative or remote learning activities. This work is funded by the European Regional Development Fund (ERDF-FEDER) and supported by the Spanish MINECO (TIN2014-60077-R). The work of Patricia Pons is supported by a national grant from the Spanish MECD (FPU13/03831). Special thanks are due to the children and teachers of the Col·legi Públic Vicente Gaos for their valuable collaboration and dedication. Pons Tomás, P.; Jaén Martínez, FJ. (2020). Interactive spaces for children: gesture elicitation for controlling ground mini-robots. Journal of Ambient Intelligence and Humanized Computing. 11(6):2467-2488.
    https://doi.org/10.1007/s12652-019-01290-6
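    Elicitation studies like this one are usually analysed with an agreement measure over the gestures each referent elicits. As a hedged illustration only (not the authors' actual analysis code), the widely used Vatavu-Wobbrock agreement rate AR for a single referent can be computed as follows:

```python
from collections import Counter

def agreement_rate(proposals):
    """Vatavu-Wobbrock agreement rate AR for one referent.

    `proposals` is the list of gesture labels proposed by participants.
    AR = (|P| / (|P| - 1)) * sum((|P_i| / |P|)**2) - 1 / (|P| - 1),
    where the P_i are the groups of identical proposals.
    """
    n = len(proposals)
    if n < 2:
        return 1.0  # a single proposal trivially agrees with itself
    groups = Counter(proposals)
    s = sum((size / n) ** 2 for size in groups.values())
    return (n / (n - 1)) * s - 1 / (n - 1)

# Hypothetical example: three children propose "push", one proposes "point".
print(agreement_rate(["push", "push", "push", "point"]))  # ~0.5
```

    AR ranges from 0 (every participant proposed something different) to 1 (full consensus), which is how per-referent gesture preferences by age group and gender can be compared quantitatively.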

    Using mixed-reality to develop smart environments

    Smart homes, smart cars and smart classrooms are now a reality as the world becomes increasingly interconnected by ubiquitous computing technology. The next step is to interconnect such environments; however, there are a number of significant barriers to advancing research in this area, most notably the lack of available environments, standards and tools. A possible solution is the use of simulated spaces; nevertheless, as realistic as we strive to make them, they are at best only approximations to the real spaces, with important differences such as utilising idealised rather than noisy sensor data. In this respect, an improvement on simulation is emulation, which uses specially adapted physical components to imitate real systems and environments. In this paper we present our work-in-progress towards the creation of a development tool for intelligent environments based on the interconnection of simulated, emulated and real intelligent spaces using a distributed model of mixed reality. To do so, we propose the use of physical/virtual components (xReality objects) that can be combined through a 3D graphical user interface and share real-time information. We present three scenarios of interconnected real and emulated spaces, used for education, achieving integration between real and virtual worlds.
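    The core idea of an xReality object is a shared piece of state that a real, emulated or simulated backend and its 3D virtual counterpart both observe. A minimal sketch of how such an object might look (every name here is an illustrative assumption, not the paper's actual API):

```python
class XRealityObject:
    """Illustrative shared-state object bridging a real, emulated, or
    simulated backend with its virtual counterpart in other spaces."""

    def __init__(self, name, backend):
        self.name = name
        self.backend = backend        # "real" | "emulated" | "simulated"
        self.state = {}
        self._subscribers = []

    def subscribe(self, callback):
        # Other spaces register here to mirror this object's state
        # in real time (e.g. the 3D GUI, or a remote classroom).
        self._subscribers.append(callback)

    def update(self, **changes):
        # A sensor reading (real), an imitated reading (emulated), or a
        # modelled value (simulated) propagates to every linked space.
        self.state.update(changes)
        for cb in self._subscribers:
            cb(self.name, dict(self.state))

# Hypothetical example: a real lamp mirrored into a simulated classroom.
log = []
lamp = XRealityObject("lamp", backend="real")
lamp.subscribe(lambda name, state: log.append((name, state)))
lamp.update(brightness=0.8)
```

    The point of the sketch is that the consuming space never needs to know whether the backend is real, emulated or simulated; only the fidelity of the data differs.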

    Tangible Interaction and Learning: The Case for a Hybrid Approach

    Research involving tangible interaction and children has often focused on how tangibles might support or improve learning compared to more traditional methods. In this paper, we review three of our research studies involving tangible computer programming that have addressed this question in a variety of learning environments with a diverse population of children. Through these studies, we identify situations in which tangible interaction seems to offer advantages for learning; however, we also identify situations in which tangible interaction proves less useful and an alternative interaction style provides a more appropriate medium for learning. Thus, we advocate for a hybrid approach: one that offers teachers and learners the flexibility to select the most appropriate interaction style to meet the needs of a specific situation.

    Towards Intelligent Playful Environments for Animals based on Natural User Interfaces

    The study of animals' interactions with technology and the development of animal-centered technological systems have gained attention since the emergence of the research area of Animal Computer Interaction (ACI). ACI aims to improve animals' welfare and wellbeing in several scenarios by developing suitable technology for the animal following an animal-centered approach. Among the research lines ACI is exploring, there has been significant interest in animals' playful interactions with technology. Technologically mediated playful activities have the potential to provide mental and physical stimulation for animals in different environmental contexts, which could in turn help to improve their wellbeing. As we embark on the era of the Internet of Things, current technological playful activities for animals have not yet explored the development of pervasive solutions that could provide animals with more adaptation to their preferences while offering more varied technological stimuli. Instead, playful technology for animals is usually based on digital interactions rather than exploring tangible devices or augmenting the interactions with different stimuli. In addition, these playful activities are predefined and do not change over time, and they require a human to provide the device or technology to the animal. If humans could focus on their participation as active players of an interactive system aimed at animals, instead of being concerned about holding a device for the animal or keeping the system running, this might help to create stronger bonds between species and foster better relationships with animals. Moreover, animals' mental and physical stimulation are important aspects that could be fostered if the playful systems designed for them offered a varied range of outputs, were tailored to the animal's behaviors, and prevented the animal from getting used to the system and losing interest. Therefore, this thesis proposes the design and development of technological playful environments based on Natural User Interfaces that can adapt and react to the animals' natural interactions. These pervasive scenarios would allow animals to play by themselves or with a human, providing more engaging and dynamic playful activities that are capable of adapting over time. Pons Tomás, P. (2018). Towards Intelligent Playful Environments for Animals based on Natural User Interfaces [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/113075
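    The habituation-avoidance idea in the abstract, varying the stimulus when the animal's engagement drops, can be sketched as a small control loop. This is a hedged illustration under assumed names and thresholds, not the thesis' actual system:

```python
import random

class PlayfulEnvironment:
    """Illustrative adaptive play loop: switch to a different stimulus
    when engagement falls, to keep the animal from losing interest.
    The stimulus list and threshold are assumptions for the sketch."""

    STIMULI = ["moving light", "sound", "rolling ball"]

    def __init__(self, threshold=0.4):
        self.threshold = threshold
        self.current = random.choice(self.STIMULI)

    def step(self, engagement):
        # `engagement` in [0, 1] would come from tracking the animal's
        # natural interactions (approach, pawing, gaze, ...).
        if engagement < self.threshold:
            others = [s for s in self.STIMULI if s != self.current]
            self.current = random.choice(others)  # vary the stimulus
        return self.current

# Example: engagement drops, so the environment switches stimulus.
env = PlayfulEnvironment()
first = env.current
switched = env.step(0.1)
print(switched != first)  # True
```

    A real system would replace the scalar engagement value with behaviour recognition over sensor data, but the loop structure (sense, compare, adapt the output) is the same.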

    Tangible user interfaces: past, present and future directions

    Get PDF
    In the last two decades, Tangible User Interfaces (TUIs) have emerged as a new interface type that interlinks the digital and physical worlds. Drawing upon users' knowledge and skills of interaction with the real non-digital world, TUIs show a potential to enhance the way in which people interact with and leverage digital information. However, TUI research is still in its infancy, and extensive research is required in order to fully understand the implications of tangible user interfaces, to develop technologies that further bridge the digital and the physical, and to guide TUI design with empirical knowledge. This paper examines the existing body of work on Tangible User Interfaces. We start by sketching the history of tangible user interfaces, examining the intellectual origins of this field. We then present TUIs in a broader context, survey application domains, and review frameworks and taxonomies. We also discuss conceptual foundations of TUIs, including perspectives from cognitive science, psychology, and philosophy. Methods and technologies for designing, building, and evaluating TUIs are also addressed. Finally, we discuss the strengths and limitations of TUIs and chart directions for future research

    CoVR: A Large-Scale Force-Feedback Robotic Interface for Non-Deterministic Scenarios in VR

    Full text link
    We present CoVR, a novel robotic interface providing strong kinesthetic feedback (100 N) in a room-scale VR arena. It consists of a physical column mounted on a 2D Cartesian ceiling robot (XY displacements) with the capacity of (1) resisting body-scaled users' actions such as pushing or leaning; (2) acting on the users by pulling or transporting them; and (3) carrying multiple, potentially heavy objects (up to 80 kg) that users can freely manipulate or make interact with each other. We describe its implementation and define a trajectory generation algorithm based on a novel user intention model to support non-deterministic scenarios, where users are free to interact with any virtual object of interest with no regard to the scenario's progress. A technical evaluation and a user study demonstrate the feasibility and usability of CoVR, as well as the relevance of whole-body interactions involving strong forces, such as being pulled or transported.Comment: 10 pages (without references), 14 pages total
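The trajectory-generation idea described above (steering the column toward the virtual object the user most likely intends to reach, without a scripted scenario) can be illustrated with a minimal sketch. This is not CoVR's published algorithm; the gaze-alignment score, distance penalty, and step size are assumptions made for the example.

```python
import math

def predict_target(user_pos, gaze_dir, objects):
    """Pick the candidate object position that best matches the user's
    intention: high gaze alignment (cosine) and short distance win."""
    best, best_score = None, -math.inf
    for pos in objects:
        dx, dy = pos[0] - user_pos[0], pos[1] - user_pos[1]
        dist = math.hypot(dx, dy)
        if dist == 0:
            return pos  # user is already at the object
        align = (dx * gaze_dir[0] + dy * gaze_dir[1]) / dist  # cosine similarity
        score = align - 0.1 * dist  # prefer aligned, nearby objects
        if score > best_score:
            best, best_score = pos, score
    return best

def step_toward(robot_pos, target, max_step=0.5):
    """Move the XY ceiling robot one bounded step toward the target."""
    dx, dy = target[0] - robot_pos[0], target[1] - robot_pos[1]
    dist = math.hypot(dx, dy)
    if dist <= max_step:
        return target
    return (robot_pos[0] + max_step * dx / dist,
            robot_pos[1] + max_step * dy / dist)
```

In a non-deterministic scenario, the controller would re-run `predict_target` every frame as the user moves and looks around, so the column is already in place when the user commits to an object.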

    Computational Thinking Unplugged: Comparing the Impact on Confidence and Competence from Analog and Digital Resources in Computer Science Professional Development for Elementary Teachers

    Get PDF
    The demand for computer science instruction is increasing across the K-12 spectrum, but in many cases elementary teachers are ill prepared to teach the subject. Based on prior research showing a preference for analog interfaces, this study compared the impact of analog and digital interface modalities on teachers’ confidence and competence gains in professional development on computational thinking conceived within the framework of cognitive acceleration. The analog group used the Robot Turtles board game and the digital group used the Scratch Jr. app on iPads while receiving the same professional development content. A single-case experimental design with multiple baselines to establish control, together with appropriate randomization techniques, was used to allow for generalization of findings and identification of a functional relationship. Teachers were assessed using the Elementary Teacher Computer Programming Self-Efficacy Scale for confidence and the Computational Thinking Test for competence. The results indicated a significant and higher effect size on confidence for the analog cases as compared to the digital. Visual analysis confirmed these findings and provided emerging support for a functional relationship. Recommendations for modifications to current professional development, classroom instruction, and policy making practices to adopt an analog-first approach to computer science based on the foundational concepts of computational thinking were identified based on these findings

    Children's Acceptance of a Collaborative Problem Solving Game Based on Physical Versus Digital Learning Spaces

    Full text link
    [EN] Collaborative problem solving (CPS) is an essential soft skill that should be fostered from a young age. Research shows that a good way of teaching such skills is through video games; however, the success and viability of this method may be affected by the technological platform used. In this work we propose a gameful approach to train CPS skills in the form of the CPSbot framework and describe a study involving 80 primary school children on user experience and acceptance of a game, Quizbot, using three different technological platforms: two purely digital (tabletop and handheld tablets) and another based on tangible interfaces and physical spaces. The results show that physical spaces proved to be more effective than the screen-based platforms in several ways, as well as being considered more fun and easier to use by the children. Finally, we propose a set of design considerations for future gameful CPS systems based on the observations made during this study.Spanish Ministry of Economy and Competitiveness and the European Regional Development Fund (project TIN2014-60077-R); Spanish Ministry of Education, Culture and Sport (with fellowship FPU14/00136) and Conselleria d'Educacio, Cultura i Esport (Generalitat Valenciana, Spain) (grant ACIF/2014/214).Jurdi, S.; García Sanjuan, F.; Nácher-Soler, VE.; Jaén Martínez, FJ. (2018). Children's Acceptance of a Collaborative Problem Solving Game Based on Physical Versus Digital Learning Spaces. Interacting with Computers. 30(3):187-206. https://doi.org/10.1093/iwc/iwy006