Remote interspecies interactions: Improving humans and animals wellbeing through mobile playful spaces
[EN] Play is an essential activity for both humans and animals, as it provides stimulation and favors cognitive, physical and social development. This paper proposes a novel pervasive playful environment that allows hospitalized children to participate in remote interspecies play with dogs in a dog daycare facility, while also allowing the dogs to play by themselves with the pervasive system. The aim of this playful interactive space is to help improve both children's and animals' wellbeing and their relationships by means of technologically mediated play, while creating a solid knowledge base to define the future of pervasive interactive environments for animals.
This work is supported by the European Regional Development Fund (ERDF-FEDER), Spain, and the Spanish MINECO (TIN2014-60077-R). The work of Patricia Pons is supported by the Spanish MECD (FPU13/03831). Special thanks to the dogs and children who participated in our study, the dogs' owners and the children's families. The authors also gratefully acknowledge the teachers of the Unidad Pedagógica Hospitalaria La Fe and Oncología Pediátrica La Fe, as well as Olga and Astrid from Buma's Doggy Daycare facility, for their invaluable support, collaboration and dedication.
Pons Tomás, P.; Carrion-Plaza, A.; Jaén Martínez, FJ. (2019). Remote interspecies interactions: Improving humans and animals wellbeing through mobile playful spaces. Pervasive and Mobile Computing. 52:113-130. https://doi.org/10.1016/j.pmcj.2018.12.003
Towards Intelligent Playful Environments for Animals based on Natural User Interfaces
The study of animals' interactions with technology and the development of animal-centered technological systems has been gaining attention since the emergence of the research area of Animal Computer Interaction (ACI). ACI aims to improve animals' welfare and wellbeing in several scenarios by developing suitable technology for the animal following an animal-centered approach. Among all the research lines ACI is exploring, there has been significant interest in animals' playful interactions with technology. Technologically mediated playful activities have the potential to provide mental and physical stimulation for animals in different environmental contexts, which could in turn help to improve their wellbeing.
As we embark on the era of the Internet of Things, current technological playful activities for animals have not yet explored pervasive solutions that could adapt to each animal's preferences while offering more varied technological stimuli. Instead, playful technology for animals is usually based on digital interactions rather than exploring tangible devices or augmenting the interactions with different kinds of stimuli. In addition, these playful activities are predefined and do not change over time, and they require a human to provide the device or technology to the animal. If humans could focus on participating as active players in an interactive system aimed at animals, instead of being concerned with holding a device for the animal or keeping the system running, this might help create stronger bonds between species and foster better relationships with animals. Moreover, animals' mental and physical stimulation are important aspects that could be fostered if the playful systems designed for them offered a varied range of outputs, were tailored to the animal's behaviors, and prevented the animal from getting used to the system and losing interest.
Therefore, this thesis proposes the design and development of technological playful environments based on Natural User Interfaces that can adapt and react to the animals' natural interactions. These pervasive scenarios would allow animals to play by themselves or with a human, providing more engaging and dynamic playful activities that are capable of adapting over time.
Pons Tomás, P. (2018). Towards Intelligent Playful Environments for Animals based on Natural User Interfaces [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/113075
Interactive spaces for children: gesture elicitation for controlling ground mini-robots
[EN] Interactive spaces for education are emerging as a mechanism for fostering children's natural ways of learning by means of play and exploration in physical spaces. The advanced interactive modalities and devices for such environments need to be both motivating and intuitive for children. Among the wide variety of interactive mechanisms, robots have been a popular research topic in the context of educational tools due to their attractiveness for children. However, few studies have focused on how children would naturally interact and explore interactive environments with robots. While there is abundant research on full-body interaction and intuitive manipulation of robots by adults, no similar research has been done with children. This paper therefore describes a gesture elicitation study that identified the preferred gestures and body language communication used by children to control ground robots. The results of the elicitation study were used to define a gestural language that covers the different preferences of the gestures by age group and gender, with a good acceptance rate in the 6-12 age range. The study also revealed interactive spaces with robots using body gestures as motivating and promising scenarios for collaborative or remote learning activities.This work is funded by the European Development Regional Fund (EDRF-FEDER) and supported by the Spanish MINECO (TIN2014-60077-R). The work of Patricia Pons is supported by a national grant from the Spanish MECD (FPU13/03831). Special thanks are due to the children and teachers of the Col-legi Public Vicente Gaos for their valuable collaboration and dedication.Pons TomĂĄs, P.; JaĂŠn MartĂnez, FJ. (2020). Interactive spaces for children: gesture elicitation for controlling ground mini-robots. Journal of Ambient Intelligence and Humanized Computing. 11(6):2467-2488. 
https://doi.org/10.1007/s12652-019-01290-6
Potilaskokemus kuvina: Valokuvausmenetelmä lasten kokemusten tutkimiseksi Lastensairaalassa [Patient experience in pictures: A photography method for studying children's experiences at the Children's Hospital]
Healthcare organizations have recently started to collect information about the experiences of their patients. Current feedback methods have been developed mainly for adult participants, and organizations lack appropriate instruments for collecting the perceptions of children. Moreover, children's patient experience has so far received little attention in the academic literature.
This thesis focused on exploring how children's patient experience can be studied at the Children's Hospital. The study was conducted as part of the LAPSUS research project. The research questions were the following: (1) Which research approaches and techniques are applicable for studying 6- to 10-year-old children's patient experience? (2) Based on the empirical study, how suitable is the photo elicitation technique for studying children's patient experience?
Four potential research techniques were identified in the literature study and evaluated with medical experts. For the empirical study, the photo elicitation technique was chosen and tested in two different units of the Children's Hospital in Helsinki. Eight child patients participated in the study. The data consisted of qualitative photo elicitation interviews and 64 photographs portraying children's positive and negative experiences during hospitalization. In the analysis, the data were thematically categorized; additionally, feedback from the participants and hospital personnel was scrutinized.
The results of the study demonstrate the importance of engaging children in studies pertaining to their care. Children have unique experiences which can be utilized in improving the healthcare service. The positive photographs indicated that children value toys and other entertainment, good hospital facilities, friendly nursing staff, and painless procedures. Conversely, the negative photographs emphasized the unpleasant nature of invasive operations and the hospital environment.
This thesis provided the Children's Hospital with a novel way to access patients' perceptions in an age-appropriate manner. The photo elicitation technique will help the hospital identify gaps in the service and improve the child-friendliness of the care. Utilizing photography is applicable and fun from the perspectives of both patients and hospital personnel. Future work is needed to fit the photographing technique into the hospital's routines and to make the necessary adjustments to the instrument.
A systematic review of game technologies for pediatric patients
[EN] Children in hospital are subjected to multiple negative stimuli that may hinder their development and social interactions. Although game technologies are thought to improve children's experience in hospital, there is a lack of information on how they can be used effectively. This paper presents a systematic review of the literature on the existing approaches in this context to identify gaps for future research. A total of 1305 studies were identified, of which 75 were thoroughly analyzed according to our review protocol. The results show that the most common approach is to design mono-user games with traditional computers or monitor-based video consoles, which serve as a distractor or a motivator for physical rehabilitation for primary school children undergoing fearful procedures such as venipuncture, or those suffering chronic, neurological, or traumatic diseases/injuries. We conclude that, on the one hand, game technologies seem to present physical and psychological benefits to pediatric patients, although more research is needed on this. On the other hand, future designers of games for pediatric hospitalization should consider: (1) developing games for kindergarten patients and adolescents; (2) addressing the psychological impact caused by long-term hospitalization; (3) using collaboration as an effective game strategy to reduce patient isolation; (4) pursuing purposes other than distraction, such as socialization, coping with emotions, or fostering physical mobility; (5) including parents/caregivers and hospital staff in the game activities; and (6) exploiting new technological artifacts such as robots and tangible interactive elements to encourage intrinsic motivation. This work is supported by the Spanish Ministry of Economy and Competitiveness and the European Development Regional Fund (EDRF-FEDER) with Project TIN2014-60077-R. El Jurdi, S.; Montaner-Marco, J.; García Sanjuan, F.; Jaén Martínez, FJ.; Nácher-Soler, VE. (2018).
A systematic review of game technologies for pediatric patients. Computers in Biology and Medicine. 97:89-112. https://doi.org/10.1016/j.compbiomed.2018.04.019
Patient Engagement to Improve Medication Safety in the Hospital
Purpose: There is a pressing need to enhance patient safety in the hospital environment. While there are many initiatives that focus on improving patient safety, few have studied engaging patients themselves in patient safety efforts. This work was motivated by the belief that patients can contribute valuable information to their care and, when equipped with the right tools, can play a role in improving medication safety in the hospital.
Methods: This research had three aims and used a mixed-methods approach to better understand the concept of engaging patients to improve medication safety. To gain insight into whether patients could beneficially contribute to the safety of their hospital care, my first aim was to understand current perspectives on the sharing of clinical information with patients while they were in the hospital. To accomplish this aim, I conducted surveys with clinicians and enrolled patients in a short field study in which they received full access to their clinical chart. In Aim 2, I conducted a study to establish whether the Patient Activation Measure (PAM), a common measure of patient engagement in the outpatient setting, is reliable and valid in the inpatient setting. Building on the knowledge from Aim 1 and using the PAM instrument from Aim 2, my third aim evaluated the impact of providing patients with access to a medication review tool while they were preparing to be admitted to the hospital. Aim 3 was achieved through a randomized controlled trial (RCT) involving 65 patients I recruited from the emergency department at Columbia University Medical Center. I also conducted a survey of admitting clinicians whose patients participated in the trial to identify the impact on clinician practices and to elicit feedback on their perceptions of the intervention.
Results: My research findings suggest that increased patient information sharing in the inpatient setting is beneficial and desirable to patients, and generally acceptable to clinicians. The clinician survey from Aim 1 showed that most respondents were comfortable with the idea of providing patients with their clinical information. Some expressed reservations that patients might misunderstand information and become unnecessarily alarmed or offended. In the patient field study from Aim 1, patients reported perceiving the information they received as highly useful, even if they did not fully understand complex medical terms. My primary contribution in Aim 2 was to provide sound evidence that the Patient Activation Measure is a valid and reliable tool for use in the inpatient setting. Establishing the validity and reliability of the PAM instrument in the inpatient setting was essential for conducting the RCT in Aim 3, and it will provide a foundation for future clinicians and research investigators to measure and understand hospital patients' levels of engagement.
The results from the RCT in Aim 3 did not support my primary hypothesis that clinicians who had patients participate in their medication review process using an informatics tool would make more changes to the home medication list than clinicians who had patients in the control group. However, the results did suggest that most hospital patients are knowledgeable, willing, and able to contribute useful and important information to the medication reconciliation process. Interestingly, the clinicians I surveyed seemed far less convinced that their patients would be able to beneficially participate in the medication reconciliation process due to low health literacy and other barriers. Nevertheless, the clinicians did seem to believe that in theory, at least, patient involvement in the medication reconciliation process could have positive impacts on their workflow and potentially save them time.
Conclusion: The overall theme resulting from my research is that patients can be a valuable resource to improve patient safety in the hospital. Patients are generally knowledgeable and willing to more actively participate in their hospital care. By developing the structures and processes to facilitate greater patient engagement, hospitals can provide an extra layer of safety and error prevention, particularly with respect to the medications patients take at home. As with any medical treatment, active participation in patient safety efforts may not be possible for all patients. However, I believe that if the culture of a hospital encourages openness and transparency, and if patients are given the proper tools and information, the quality and safety of hospital care will improve.
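Reliability analyses like the PAM validation in Aim 2 typically report an internal-consistency statistic such as Cronbach's alpha. The sketch below is a generic illustration of that statistic in plain Python; the item layout and data are assumptions, not the dissertation's actual analysis.

```python
# Cronbach's alpha: internal consistency of a multi-item questionnaire.
# Illustrative only; real PAM scoring uses the licensed 13-item instrument.

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]  # per-respondent sums
    return (k / (k - 1)) * (1 - item_vars / var(totals))
```

For perfectly correlated items the statistic reaches 1.0; values above roughly 0.7 are conventionally read as acceptable reliability.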
Rafigh: A Living Media System for Motivating Target Application Use for Children
Digital living media systems combine living media such as plants, animals, and fungi with computational components. In this dissertation, I respond to the question of how digital living media systems can better motivate children to use target applications (i.e., learning and/or therapeutic applications). To address this question, I employed a participatory design approach, incorporating input from children, parents, speech-language pathologists, and teachers into the design of a new system. Rafigh is a digital embedded system that uses the growth of a living mushroom colony to provide positive reinforcement to children when they conduct target activities. The growth of the mushrooms is affected by the amount of water administered to them, which in turn corresponds to the time children spend on target applications.
I used an iterative design process to develop and evaluate three Rafigh prototypes. The evaluations showed that the system must be robust and customizable, and should include compelling engagement mechanisms to keep the children interested. I evaluated Rafigh using two case studies conducted in participants' homes. In each case study, two siblings and their parent interacted with Rafigh over two weeks, and the parents identified a series of target applications that Rafigh should motivate the children to use. The study showed that Rafigh motivated the children to spend significantly more time on target applications during the intervention phase, and that it successfully engaged one out of two child participants in each case study, who showed signs of responsibility, empathy, and curiosity towards the living media. The study also showed that the majority of participants correctly described the relationship between using target applications and mushroom growth. Further, Rafigh encouraged more communication and collaboration between the participants. Rafigh's slow responsivity did not impact the engagement of one out of two child participants in each case study and might even have contributed to their investment in the project. Finally, Rafigh's presence as an ambient physical object allowed users to interact with it freely and as part of their home environment.
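The core reinforcement loop described above (time on target applications determines the water given to the mushrooms) can be sketched as a simple aggregation. All names, the target-application list, and the linear capped mapping are illustrative assumptions, not Rafigh's actual implementation.

```python
# Sketch of a Rafigh-style reinforcement mapping: minutes spent on target
# applications earn a (capped) daily water dose for the mushroom colony.
# The app names, rate, and cap below are hypothetical.
from collections import defaultdict

def daily_watering(sessions, ml_per_minute=0.5, daily_cap_ml=60.0):
    """sessions: iterable of (day, app, minutes); only target apps earn water."""
    target_apps = {"spelling_game", "speech_practice"}  # hypothetical targets
    minutes_per_day = defaultdict(float)
    for day, app, minutes in sessions:
        if app in target_apps:
            minutes_per_day[day] += minutes
    # Cap the dose so a long session cannot overwater the colony.
    return {day: min(m * ml_per_minute, daily_cap_ml)
            for day, m in minutes_per_day.items()}
```

The cap reflects a design constraint implied by the living medium itself: reinforcement must stay within what keeps the mushrooms healthy, regardless of how long the child plays.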
Wearable continuous vital sign monitoring for deterioration detection and clinical outcomes in hospitalised patients
Current practice uses physiological early warning scoring (EWS) systems to monitor "standard" vital signs, including heart rate (HR), respiratory rate (RR), blood pressure (BP), oxygen saturation (SpO2) and temperature, coupled with a graded response such as referral for a senior review or an increased monitoring frequency. Early detection of the deteriorating patient is a known challenge within hospital environments, as EWS depends on physiological observations being taken at the correct frequency, tailored to specific patient needs; this can be time-consuming for healthcare professionals, resulting in missed or incomplete observations. Wearable monitoring systems (WMS) may have the potential to fill the gap in vital sign monitoring between traditional intermittent manual measurements and continuous automatic monitoring. However, evidence on the feasibility and impact of WMS implementation remains scarce. The virtual High Dependency Unit (vHDU) project was designed to develop and test the feasibility of deploying a WMS in the hospital ward environment. This doctoral work critically analyses the roadmap work of the vHDU project, comprising ten publications distributed across seven chapters. Chapter 1 (with 3 publications) includes a systematic review and meta-analysis identifying the lack of statistical evidence of the impact of WMS on early deterioration detection and associated clinical outcomes, highlighting the need for high-quality randomised controlled trials (RCTs). It also supports the use of WMS as a complement to, and not a substitute for, standard and direct care. Chapter 2 explores clinical staff and patient perceptions of current vital sign monitoring practices, as well as their early thoughts on the use of WMS in the hospital environment, through a qualitative interview study.
WMS were seen positively by both clinical and patient groups as a potential tool to bridge the gap between manual observations and traditional wired continuous automatic systems, as long as they neither add more noise to the wards nor replace direct contact with clinical staff. In chapter 3, the wearability of 7 commercially available wearables (monitoring HR, RR and SpO2) was assessed, advocating for the use of pulse oximeters without a fingertip probe and a small chest patch to improve patients' wear times. Out of these, five devices were submitted to measurement accuracy testing (chapter 4, with 3 publications) under movement and controlled hypoxaemia, resulting in the validation of a chest patch (monitoring HR and RR) and proof of the diagnostic accuracy of the 3 pulse oximeters (monitoring pulse rate, PR, and SpO2) under test. These results were timely for the final selection of the devices to be integrated into our WMS, namely the vHDU system, explored in chapter 5, which outlines the process of its development and rapid deployment in COVID-19 isolation wards in our local hospital during the pandemic. This work is now converging in the design of a feasibility RCT to test the impact of the vHDU system (now augmented with blood pressure and temperature monitoring, completing all 5 vital signs) versus standard care in an unbiased environment (chapter 6). This will also ascertain the feasibility of a multicentre RCT that may, in the future, contribute the much-needed statistical evidence to the research question of my systematic review and meta-analysis, highlighted in chapter 1. Finally, chapter 7 includes a critical reflection on the vHDU project and the overall doctoral work, as well as its contributions to the field of wearable monitoring.
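The graded EWS logic described above (each vital sign scored by the band it falls into, sub-scores summed into an aggregate that triggers escalation) can be sketched as a banded lookup. The thresholds below are a simplified illustration loosely modelled on NEWS2-style bands, not a clinical instrument, and the function names are this sketch's own.

```python
# Illustrative early warning score (EWS) calculator over the five "standard"
# vital signs. Band boundaries are simplified examples, not clinical guidance.

def band(value, bands):
    """Return the sub-score of the first (low, high, score) band containing value."""
    for low, high, score in bands:
        if low <= value <= high:
            return score
    raise ValueError(f"value {value} outside defined bands")

def ews(hr, rr, sbp, spo2, temp):
    """Sum per-vital sub-scores; higher totals would trigger a graded response."""
    total = 0
    total += band(rr,   [(12, 20, 0), (9, 11, 1), (21, 24, 2), (0, 8, 3), (25, 99, 3)])
    total += band(spo2, [(96, 100, 0), (94, 95, 1), (92, 93, 2), (0, 91, 3)])
    total += band(sbp,  [(111, 219, 0), (101, 110, 1), (91, 100, 2), (0, 90, 3), (220, 400, 3)])
    total += band(hr,   [(51, 90, 0), (41, 50, 1), (91, 110, 1), (111, 130, 2), (0, 40, 3), (131, 300, 3)])
    total += band(temp, [(36.1, 38.0, 0), (35.1, 36.0, 1), (38.1, 39.0, 1), (39.1, 45.0, 2), (20.0, 35.0, 3)])
    return total
```

A WMS feeding continuous measurements into such a function is precisely what allows scoring between scheduled manual observations, which is the monitoring gap the thesis targets.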
- …