
    Smart Computing and Sensing Technologies for Animal Welfare: A Systematic Review

    Animals play a profoundly important and intricate role in our lives today. Dogs have been human companions for thousands of years, and now also work closely with us to assist people with disabilities and in combat and search-and-rescue situations. Farm animals are a critical part of the global food supply chain, and there is increasing consumer interest in organically fed and humanely raised livestock and in how livestock farming affects our health and environmental footprint. Wild animals are threatened with extinction by human-induced factors and by shrinking, compromised habitats. This review systematically surveys the existing literature on smart computing and sensing technologies for domestic, farm and wild animal welfare. We use the notion of animal welfare in broad terms, reviewing technologies for assessing whether animals are healthy, free of pain and suffering, and positively stimulated in their environment. We likewise use the notion of smart computing and sensing broadly, to refer to computing and sensing systems that are not isolated but interconnected with communication networks and capable of remote data collection, processing, exchange and analysis. We review smart technologies for domestic animals, indoor and outdoor animal farming, and animals in the wild and in zoos. The findings of this review are expected to motivate future research and to contribute to data, information and communication management, as well as policy, for animal welfare.

    Assessing machine learning classifiers for the detection of animals' behavior using depth-based tracking

    There is growing interest in the automatic detection of animals' behaviors and body postures within the field of Animal Computer Interaction, and in the benefits this could bring to animal welfare: enabling remote communication, welfare assessment, detection of behavioral patterns, and interactive and adaptive systems. Most work on animal behavior recognition relies on wearable sensors to gather information about the animals' postures and movements, which is then processed using machine learning techniques. However, non-wearable mechanisms such as depth-based tracking could also make use of machine learning techniques and classifiers for the automatic detection of animals' behavior, and they offer the advantage of working in set-ups in which wearable devices would be difficult to use. This paper presents a depth-based tracking system for the automatic detection of animals' postures and body parts, together with an exhaustive evaluation of the performance of several classification algorithms based on both a supervised and a knowledge-based approach. The evaluation of the depth-based tracking system and the different classifiers shows that the proposed system is promising for advancing research on animal behavior recognition within and outside the field of Animal Computer Interaction. (C) 2017 Elsevier Ltd. All rights reserved.
    This work is funded by the European Regional Development Fund (EDRF-FEDER) and supported by the Spanish MINECO through project TIN2014-60077-R. It also received support from a postdoctoral fellowship within the VALi+d Program of the Conselleria d'Educacio, Cultura i Esport (Generalitat Valenciana) awarded to Alejandro Catala (APOSTD/2013/013). The work of Patricia Pons is supported by a national grant from the Spanish MECD (FPU13/03831). Special thanks to our cat participants and their owners, and many thanks to our feline caretakers and therapists, Olga, Asier and Julia, for their valuable collaboration and their dedication to animal wellbeing.
    Pons Tomás, P.; Jaén Martínez, FJ.; Catalá Bolós, A. (2017). Assessing machine learning classifiers for the detection of animals' behavior using depth-based tracking. Expert Systems with Applications, 86:235-246. https://doi.org/10.1016/j.eswa.2017.05.063
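    To make the classification step concrete, the sketch below compares a few supervised scikit-learn classifiers on depth-derived posture features using 10-fold cross-validation. This is not the authors' code: the feature dimensions, class count and classifier choices are assumptions made only for illustration.

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from sklearn.neighbors import KNeighborsClassifier

    # Placeholder data: each row would hold geometric descriptors extracted from
    # the depth tracker (e.g. body height, elongation, head-to-tail distance),
    # and y would hold manually labelled postures (4 classes assumed here).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 6))
    y = rng.integers(0, 4, size=300)

    classifiers = {
        "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
        "svm_rbf": SVC(kernel="rbf", C=1.0),
        "knn": KNeighborsClassifier(n_neighbors=5),
    }

    for name, clf in classifiers.items():
        scores = cross_val_score(clf, X, y, cv=10)  # 10-fold cross-validation
        print(f"{name}: mean accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")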

    Developing a depth-based tracking system for interactive playful environments with animals

    © ACM 2015. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in the Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology (p. 59), http://dx.doi.org/10.1145/2832932.2837007.
    Digital games for animals within Animal Computer Interaction are usually single-device oriented; however, richer interactions could be delivered by considering multimodal environments and expanding the number of technological elements involved. In these playful ecosystems, animals could be either alone or accompanied by human beings, but in both cases the system should react properly to the interactions of all the players, creating more engaging and natural games. Technologically mediated playful scenarios for animals will therefore require contextual information about the game participants, such as their location or body posture, in order to suitably adapt the system's reactions. This paper presents a depth-based tracking system for cats capable of detecting their location, body posture and field of view. The proposed system could also be extended to locate and detect human gestures and track small robots, becoming a promising component in the creation of intelligent interspecies playful environments.
    Work supported by the Spanish Ministry of Economy and Competitiveness and funded by the EDRF-FEDER (TIN2014-60077-R). The work of Patricia Pons has been supported by a national grant from the Spanish MECD (FPU13/03831). Alejandro Catalá also received support from a VALi+d fellowship from the GVA (APOSTD/2013/013). Special thanks to our cat participants, their owners, and our feline caretakers and therapists.
    Pons Tomás, P.; Jaén Martínez, FJ.; Catalá Bolós, A. (2015). Developing a depth-based tracking system for interactive playful environments with animals. ACM. https://doi.org/10.1145/2832932.2837007
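    As a rough illustration of how a depth camera can localise an animal and estimate its posture without wearables, the minimal sketch below segments the animal from a background depth image and applies a crude height-based posture heuristic. It is not the published system; the input format and every threshold are made-up placeholders.

    import numpy as np

    def locate_animal(frame, background, min_diff_mm=40, min_pixels=200):
        # 'background' is assumed to be a depth image (in mm) of the empty play
        # area and 'frame' the current depth image from the sensor.
        diff = background.astype(np.int32) - frame.astype(np.int32)
        mask = diff > min_diff_mm          # pixels closer to the camera than the background
        if mask.sum() < min_pixels:        # ignore noise-sized blobs
            return None
        rows, cols = np.nonzero(mask)
        centroid = (rows.mean(), cols.mean())   # rough location on the floor plane
        est_height_mm = int(diff[mask].max())   # tallest point above the floor
        return centroid, est_height_mm

    def posture_from_height(height_mm):
        # Very coarse heuristic: a standing cat rises higher above the floor than
        # a crouching or lying one (threshold values are illustrative only).
        if height_mm > 250:
            return "standing"
        if height_mm > 120:
            return "sitting"
        return "lying"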

    Seven Years after the Manifesto: Literature Review and Research Directions for Technologies in Animal Computer Interaction

    As technologies diversify and become embedded in everyday life, the technologies we expose animals to, and the new technologies being developed for animals within the field of Animal Computer Interaction (ACI), are increasing. As we approach seven years since the ACI manifesto, which grounded the field within Human Computer Interaction and Computer Science, this thematic literature review looks at the technologies developed for (non-human) animals. The technologies analysed include tangible and physical, haptic and wearable, olfactory, screen and tracking technologies. The conversation explores what exactly ACI is, whilst questioning what it means to be animal by considering the impact of, and the loop between, machine and animal interactivity. The findings of this review are expected to form the first grounding foundation of ACI technologies, informing future research in animal computing as well as suggesting areas for future exploration.

    Improved Activity Recognition Combining Inertial Motion Sensors and Electroencephalogram Signals

    Human activity recognition and neural activity analysis are the basis for human computational neuroethology research, which deals with the simultaneous analysis of behavioral ethogram descriptions and neural activity measurements. Wireless electroencephalography (EEG) and wireless inertial measurement units (IMUs) allow experimental data recording with improved ecological validity, where subjects can carry out natural activities while data recording remains minimally invasive. Specifically, we aim to show that EEG and IMU data fusion allows improved human activity recognition in a natural setting. We defined an experimental protocol composed of natural sitting, standing and walking activities, and recruited subjects at two sites: in-house (N = 4) and out-house (N = 12) populations with different demographics. Experimental protocol data capture was carried out with validated commercial systems. Classifier model training and validation were carried out with the scikit-learn open-source machine learning Python package. EEG features consist of the amplitudes of the standard EEG frequency bands; inertial features are the instantaneous positions of the tracked body points after moving-average smoothing to remove noise. We carry out three validation processes: (a) 10-fold cross-validation per experimental protocol repetition, (b) inference of the ethograms, and (c) transfer learning from each experimental protocol repetition to the remaining repetitions. The in-house accuracy results were lower and much more variable than the out-house session results. In general, random forest was the best-performing classifier model. The best cross-validation results, ethogram accuracy, and transfer learning were achieved with the fusion of EEG and IMU data. Transfer learning performed poorly compared to classification on the same protocol repetition, but its accuracy was still greater than 0.75 on average for the out-house data sessions. Transfer learning accuracy among repetitions of the same subject was above 0.88 on average. Ethogram prediction accuracy was above 0.96 on average. Therefore, we conclude that wireless EEG and IMUs allow the definition of natural experimental designs with high ecological validity toward human computational neuroethology research, and that the fusion of EEG and IMU signals improves activity and ethogram recognition.
    This work has been partially supported by FEDER funds through MINECO project TIN2017-85827-P. Special thanks to Naiara Vidal from IMH, who conducted the recruitment process in the framework of the Langileok project funded by the Elkartek program. This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No. 777720.
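    A hedged sketch of the fusion-and-classification step described above, using the random forest classifier, 10-fold cross-validation and scikit-learn package named in the abstract; the window counts, feature dimensions, channel counts and label encoding are assumptions, not values from the paper.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Placeholder windows: X_eeg would hold per-window amplitudes of the standard
    # EEG bands per channel, X_imu the smoothed positions of the tracked body
    # points, and y the activity labels.
    rng = np.random.default_rng(42)
    n_windows = 600
    X_eeg = rng.normal(size=(n_windows, 5 * 8))   # 5 bands x 8 channels (assumed)
    X_imu = rng.normal(size=(n_windows, 3 * 4))   # x, y, z for 4 tracked points (assumed)
    y = rng.integers(0, 3, size=n_windows)        # 0 = sitting, 1 = standing, 2 = walking

    clf = RandomForestClassifier(n_estimators=200, random_state=0)

    for name, X in (("EEG only", X_eeg),
                    ("IMU only", X_imu),
                    ("EEG + IMU fusion", np.hstack([X_eeg, X_imu]))):
        scores = cross_val_score(clf, X, y, cv=10)  # 10-fold cross-validation per repetition
        print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")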