8 research outputs found

    Developing a depth-based tracking system for interactive playful environments with animals

    © ACM 2015. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in the Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology (ACM), p. 59. http://dx.doi.org/10.1145/2832932.2837007
    [EN] Digital games for animals within Animal Computer Interaction are usually single-device oriented; however, richer interactions could be delivered by considering multimodal environments and expanding the number of technological elements involved. In these playful ecosystems, animals could be either alone or accompanied by human beings, but in both cases the system should react properly to the interactions of all the players, creating more engaging and natural games. Technologically mediated playful scenarios for animals will therefore require contextual information about the game participants, such as their location or body posture, in order to suitably adapt the system's reactions. This paper presents a depth-based tracking system for cats capable of detecting their location, body posture and field of view. The proposed system could also be extended to locate humans and detect their gestures, as well as to track small robots, becoming a promising component in the creation of intelligent interspecies playful environments.
    Work supported by the Spanish Ministry of Economy and Competitiveness and funded by the ERDF-FEDER (TIN2014-60077-R). The work of Patricia Pons has been supported by a national grant from the Spanish MECD (FPU13/03831). Alejandro Catalá also received support from a VALi+d fellowship from the GVA (APOSTD/2013/013). Special thanks to our cat participants, their owners, and our feline caretakers and therapists.
    Pons Tomás, P.; Jaén Martínez, FJ.; Catalá Bolós, A. (2015). Developing a depth-based tracking system for interactive playful environments with animals. ACM. https://doi.org/10.1145/2832932.2837007
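    The abstract above describes depth-based detection of a cat's location and body posture but includes no implementation details. As a purely illustrative sketch, not the authors' system, the snippet below shows one common starting point for this kind of tracking: subtracting a depth frame from an empty-scene background to find the foreground blob and a rough height proxy for posture. The frame size, threshold, and function names are assumptions for illustration only.

```python
import numpy as np

# Hypothetical depth frames (in millimetres), e.g. from a Kinect-like sensor.
# 'background' is an empty-scene depth image captured beforehand.
def locate_foreground_blob(depth_frame: np.ndarray,
                           background: np.ndarray,
                           min_diff_mm: float = 60.0):
    """Return the centroid (row, col) and mean height of the foreground
    region, or None if nothing is detected.

    Illustrative background-subtraction approach, not the published system:
    pixels significantly closer to the sensor than the empty-scene
    background are treated as the animal.
    """
    diff = background.astype(np.float32) - depth_frame.astype(np.float32)
    mask = diff > min_diff_mm            # pixels that "stick out" of the floor
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    centroid = (rows.mean(), cols.mean())      # rough 2-D location
    mean_height = float(diff[mask].mean())     # crude proxy for body posture
    return centroid, mean_height

# Example with synthetic data: a flat floor at 2000 mm and a 250 mm-tall blob.
background = np.full((424, 512), 2000, dtype=np.uint16)
frame = background.copy()
frame[200:240, 250:300] = 1750
print(locate_foreground_blob(frame, background))
```

    A real system would additionally segment multiple blobs, filter by size, and track them over time, but the thresholding step above conveys the basic idea of depth-based foreground detection.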

    Smart Computing and Sensing Technologies for Animal Welfare: A Systematic Review

    Animals play a profoundly important and intricate role in our lives today. Dogs have been human companions for thousands of years, but they now work closely with us to assist the disabled and in combat and search-and-rescue situations. Farm animals are a critical part of the global food supply chain, and there is increasing consumer interest in organically fed and humanely raised livestock, and in how this impacts our health and environmental footprint. Wild animals are threatened with extinction by human-induced factors and by shrinking and compromised habitats. This review sets out to systematically survey the existing literature on smart computing and sensing technologies for domestic, farm and wild animal welfare. We use the notion of animal welfare in broad terms, reviewing technologies for assessing whether animals are healthy, free of pain and suffering, and positively stimulated in their environment. The notion of smart computing and sensing is likewise used in broad terms, referring to computing and sensing systems that are not isolated but interconnected with communication networks, and capable of remote data collection, processing, exchange and analysis. We review smart technologies for domestic animals, indoor and outdoor animal farming, as well as animals in the wild and in zoos. The findings of this review are expected to motivate future research and contribute to data, information and communication management as well as policy for animal welfare.

    Pushing boundaries of RE: Requirement elicitation for non-human users


    Assessing machine learning classifiers for the detection of animals' behavior using depth-based tracking

    [EN] There is growing interest in the automatic detection of animals' behaviors and body postures within the field of Animal Computer Interaction, and in the benefits this could bring to animal welfare, enabling remote communication, welfare assessment, detection of behavioral patterns, interactive and adaptive systems, etc. Most work on animals' behavior recognition relies on wearable sensors to gather information about the animals' postures and movements, which is then processed using machine learning techniques. However, non-wearable mechanisms such as depth-based tracking could also make use of machine learning techniques and classifiers for the automatic detection of animals' behavior. These systems also offer the advantage of working in set-ups in which wearable devices would be difficult to use. This paper presents a depth-based tracking system for the automatic detection of animals' postures and body parts, as well as an exhaustive evaluation of the performance of several classification algorithms based on both a supervised and a knowledge-based approach. The evaluation of the depth-based tracking system and the different classifiers shows that the proposed system is promising for advancing research on animals' behavior recognition within and outside the field of Animal Computer Interaction. © 2017 Elsevier Ltd. All rights reserved.
    This work is funded by the European Regional Development Fund (ERDF-FEDER) and supported by the Spanish MINECO through project TIN2014-60077-R. It also received support from a postdoctoral fellowship within the VALi+d Program of the Conselleria d'Educació, Cultura i Esport (Generalitat Valenciana) awarded to Alejandro Catalá (APOSTD/2013/013). The work of Patricia Pons is supported by a national grant from the Spanish MECD (FPU13/03831). Special thanks to our cat participants and their owners, and many thanks to our feline caretakers and therapists, Olga, Asier and Julia, for their valuable collaboration and their dedication to animal wellbeing.
    Pons Tomás, P.; Jaén Martínez, FJ.; Catalá Bolós, A. (2017). Assessing machine learning classifiers for the detection of animals' behavior using depth-based tracking. Expert Systems with Applications. 86:235-246. https://doi.org/10.1016/j.eswa.2017.05.063
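    The abstract mentions an evaluation of several classification algorithms on depth-derived posture data but, naturally, does not reproduce the experimental code. The sketch below is a minimal, hypothetical illustration of how such a supervised comparison could be set up with scikit-learn; the synthetic features, labels, and chosen classifiers are assumptions and do not correspond to the paper's dataset or its actual algorithm selection.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for depth-derived features (e.g. body height, body
# length, head-to-tail angle) and posture labels (0 = lying, 1 = sitting,
# 2 = standing). A real experiment would use features extracted by the
# depth-based tracking system described in the abstract.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
y = rng.integers(0, 3, size=300)

classifiers = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "k_nearest_neighbours": KNeighborsClassifier(n_neighbors=5),
}

# Compare classifiers with cross-validated accuracy, as a stand-in for the
# kind of supervised evaluation the paper reports.
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```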

    Remote interspecies interactions: Improving humans and animals wellbeing through mobile playful spaces

    [EN] Play is an essential activity for both humans and animals, as it provides stimulation and favors cognitive, physical and social development. This paper proposes a novel pervasive playful environment that allows hospitalized children to participate in remote interspecies play with dogs in a dog daycare facility, while also allowing the dogs to play by themselves with the pervasive system. The aim of this playful interactive space is to help improve both the children's and the animals' wellbeing and their relationships by means of technologically mediated play, while creating a solid knowledge base to define the future of pervasive interactive environments for animals.
    This work is supported by the European Regional Development Fund (ERDF-FEDER), Spain, and the Spanish MINECO (TIN2014-60077-R). The work of Patricia Pons is supported by the Spanish MECD (FPU13/03831). Special thanks to the dogs and children who participated in our study, the dogs' owners and the children's families. The authors also gratefully acknowledge the teachers of the Unidad Pedagogica Hospitalaria La Fe and Oncologia Pediatrica La Fe, and also Olga and Astrid from Buma's Doggy Daycare facility, for their invaluable support, collaboration and dedication.
    Pons Tomás, P.; Carrion-Plaza, A.; Jaén Martínez, FJ. (2019). Remote interspecies interactions: Improving humans and animals wellbeing through mobile playful spaces. Pervasive and Mobile Computing. 52:113-130. https://doi.org/10.1016/j.pmcj.2018.12.003
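    The abstract describes a pervasive environment in which a child's remote input triggers play events for dogs in a separate facility, without detailing the architecture. The toy sketch below is a purely hypothetical illustration of that remote-play idea: an in-memory queue relays play commands from a simulated "child" client to a simulated "dog-side" play device. All names and the queue-based design are assumptions; a real deployment would use a networked service rather than threads in one process.

```python
import queue
import threading
import time

# Hypothetical relay between a remote child's device and a play device in
# the dog facility. Here a local queue stands in for the network link, only
# to illustrate the remote interspecies play concept.
play_commands: "queue.Queue[str]" = queue.Queue()

def child_client() -> None:
    """Simulates a hospitalized child sending play commands remotely."""
    for command in ("launch_ball", "wiggle_toy", "dispense_treat"):
        play_commands.put(command)
        time.sleep(0.1)

def play_device() -> None:
    """Simulates the device in the dog daycare executing received commands."""
    for _ in range(3):
        command = play_commands.get()
        print(f"dog-side device executing: {command}")
        play_commands.task_done()

sender = threading.Thread(target=child_client)
receiver = threading.Thread(target=play_device)
sender.start()
receiver.start()
sender.join()
receiver.join()
```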

    Developing for non-human users: Reflecting on practical implications in the ubiquitous computing era

    Advances in modern technology, such as the Internet of Things (IoT) and ubiquitous computing, open up exciting new opportunities for technology for animals. This is evidenced by the explosion of products and gadgets available for pets, digital enrichment for captive animals in zoos, sensor-based smart farming, etc. At the same time, the emerging discipline of Animal-Computer Interaction (ACI) marks a new era in the design and development of animal technologies, promoting a more animal-centric approach that considers the needs of the animal in the development process. In this article, we reflect on the ways in which ideas of animal-centric development may impact the development of technology for animals in practice. We start by looking at the process of development for and with animals, and propose a development model facilitating the principles of Agility, Welfare of Animals, and eXperts' involvement (AWAX) within the development lifecycle. While promoting the animal-centric approach, it is important to acknowledge that an animal usually uses technology through humans and in a particular environment. We further extend the AWAX model to include considerations of the human in the loop and the environment, and discuss some practical implications of this view, including aspects such as security and privacy.