770 research outputs found

    Exploring Audio Sensing in Detecting Social Interactions Using Smartphone Devices

    In recent years, the rapid proliferation of smartphones has provided a powerful and portable platform for sensing systems that run continuously and provide feedback in real time. Mobile crowd-sensing of human behaviour is an emerging computing paradigm that poses the challenge of sensing the everyday social interactions of people who carry smartphones. In this setting, on-board sensors such as the microphone are used to infer social relationships between people in diverse social settings, where environmental factors can be dynamic and building infrastructure can vary. Typical approaches to detecting social interactions between people use co-location as a proxy for real-world interaction. Such approaches can under-perform in challenging situations where multiple social interactions occur in close proximity to each other, for example when people are in a queue at the supermarket but not part of the same social interaction. Other approaches are limited by the requirement that every participant in a social interaction carries a smartphone at all times and has the sensing app installed. The problem here is the feasibility of the sensing system, which relies heavily on each participant's smartphone acting as a node in a social graph, connected by edges weighted by inter-device proximity; when users uninstall the app or disable background sensing, the system cannot accurately determine the correct number of participants.

    In this thesis, we present two novel approaches to detecting co-located social interactions using smartphones. The first uses WiFi and audio signals to distinguish social groups interacting within a few metres of each other with 88% precision. We conducted preliminary experiments using WiFi as a proxy for co-location between people who are socially interacting. Initial results showed that, in more challenging scenarios, WiFi is not accurate enough to determine whether people are interacting within the same social group. We therefore used audio as a second modality, capturing the sound patterns of conversations to identify and segment social groups in close proximity to each other. Through a range of real-world experiments (social interactions in meeting, coffee shop and conference scenarios), we demonstrate a technique that combines WiFi fingerprinting with sound fingerprinting to identify these social groups. We built a system that performs well, then reduced its power consumption and improved its performance to 88% precision in the most challenging scenarios using duty cycling and data averaging techniques.

    The second approach explores the feasibility of detecting social interactions without requiring every social contact to carry a sensing device. This work explores supervised and unsupervised Deep Learning techniques before settling on an Autoencoder model for a Speaker Identification task. We demonstrate how machine learning can be applied to audio collected from a single device as a speaker identification framework. Speech is fed into our Autoencoder model and classified against a list of "social contacts" to determine whether the user has spoken to a person before. In this way, the system can count the number of social contacts belonging to the user and build a database of common social contacts. Using 100 randomly generated social conversations and state-of-the-art Deep Learning techniques, we show how this system can accurately distinguish new and existing speakers in a dataset of voices, counting the number of daily social interactions a user encounters with a precision of 75%. We then apply Hyperparameter Optimization to ensure the model is well suited to the task. Unlike most systems in the literature, this approach works without modifying the existing infrastructure of a building and without all participants needing to install the same app.
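    A minimal sketch of the general idea behind the second approach, written in PyTorch: a small autoencoder is trained to reconstruct fixed-length voice feature vectors, its bottleneck is used as a speaker embedding, and a new utterance is compared against stored "social contact" embeddings by cosine similarity, below a chosen threshold it is counted as a new contact. The architecture, feature dimension, embedding size, threshold and names (SpeakerAutoencoder, identify, NEW_SPEAKER_THRESHOLD) are illustrative assumptions, not details taken from the thesis.

```python
# Sketch only: assumed dimensions, threshold and training details, not the thesis's.
import torch
import torch.nn as nn
import torch.nn.functional as F

FEAT_DIM = 40                  # assumed: one averaged MFCC-style vector per utterance
EMB_DIM = 16                   # assumed bottleneck (embedding) size
NEW_SPEAKER_THRESHOLD = 0.75   # assumed similarity cut-off for "known contact"

class SpeakerAutoencoder(nn.Module):
    def __init__(self, feat_dim=FEAT_DIM, emb_dim=EMB_DIM):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, emb_dim))
        self.decoder = nn.Sequential(nn.Linear(emb_dim, 64), nn.ReLU(), nn.Linear(64, feat_dim))

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

def train(model, utterances, epochs=50, lr=1e-3):
    """Train the autoencoder to reconstruct utterance feature vectors."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        recon, _ = model(utterances)
        loss = F.mse_loss(recon, utterances)
        opt.zero_grad()
        loss.backward()
        opt.step()

def identify(model, utterance, contact_embeddings):
    """Return the index of the best-matching contact, or None for a new speaker."""
    model.eval()
    with torch.no_grad():
        _, z = model(utterance.unsqueeze(0))
    if not contact_embeddings:
        return None
    sims = [F.cosine_similarity(z, c.unsqueeze(0)).item() for c in contact_embeddings]
    best = max(range(len(sims)), key=lambda i: sims[i])
    return best if sims[best] >= NEW_SPEAKER_THRESHOLD else None

if __name__ == "__main__":
    torch.manual_seed(0)
    data = torch.randn(200, FEAT_DIM)        # stand-in feature vectors, not real speech
    model = SpeakerAutoencoder()
    train(model, data)
    with torch.no_grad():
        contacts = [model(data[i].unsqueeze(0))[1].squeeze(0) for i in (0, 1)]  # enrol two contacts
    print(identify(model, data[0], contacts))                  # should match contact 0
    print(identify(model, torch.randn(FEAT_DIM), contacts))    # may be flagged as a new speaker
```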

    Multimedia sensors embedded in smartphones for ambient assisted living and e-health

    The final publication is available at link.springer.com. Nowadays, the use of smartphones to make human life more comfortable is widespread, and there is particular interest in Ambient Assisted Living (AAL) and e-health applications. Sensor technology is growing, and the range of sensors embedded in smartphones can be very useful for AAL and e-health. While some sensors, such as the accelerometer, gyroscope or light sensor, are widely used in applications such as motion detection or light metering, others, such as the microphone and camera, can be used as multimedia sensors. This paper reviews published papers presenting proposals, designs and deployments that make use of multimedia sensors for AAL and e-health. We classify them according to their main use: sound gathered by the microphone and images recorded by the camera. We also include a comparative table and analyse the gathered information. Parra-Boronat, L.; Sendra, S.; Jimenez, JM.; Lloret, J. (2016). Multimedia sensors embedded in smartphones for ambient assisted living and e-health. Multimedia Tools and Applications 75(21):13271-13297. doi:10.1007/s11042-015-2745-8

    Context Awareness for Navigation Applications

    This thesis examines the topic of context awareness for navigation applications and asks the question, “What are the benefits and constraints of introducing context awareness in navigation?” Context awareness can be defined as a computer’s ability to understand the situation or context in which it is operating. In particular, we are interested in how context awareness can be used to understand the navigation needs of people using mobile computers, such as smartphones, but context awareness can also benefit other types of navigation users, such as maritime navigators. There are countless other potential applications of context awareness, but this thesis focuses on applications related to navigation. For example, if a smartphone-based navigation system can understand when a user is walking, driving a car, or riding a train, then it can adapt its navigation algorithms to improve positioning performance.

We argue that the primary set of tools available for generating context awareness is machine learning. Machine learning is, in fact, a collection of many different algorithms and techniques for developing “computer systems that automatically improve their performance through experience” [1]. This thesis systematically examines the ability of existing machine learning algorithms to endow computing systems with context awareness. Specifically, we apply machine learning techniques to tackle three different tasks that are related to context awareness and have applications in navigation: (1) to recognize the activity of a smartphone user in an indoor office environment, (2) to recognize the mode of motion that a smartphone user is undergoing outdoors, and (3) to determine the optimal path of a ship traveling through ice-covered waters. The diversity of these tasks was chosen intentionally to demonstrate the breadth of problems encompassed by the topic of context awareness.

During the course of studying context awareness, we adopted two conceptual “frameworks,” which we find useful for solidifying the abstract concepts of context and context awareness. The first framework is based strongly on the writings of a rhetorician from Hellenistic Greece, Hermagoras of Temnos, who defined seven elements of “circumstance”; we adopt these seven elements to describe contextual information. The second framework, which we dub the “context pyramid”, describes the processing of raw sensor data into contextual information in terms of six different levels. At the top of the pyramid is “rich context”, where the information is expressed in prose and the goal for the computer is to mimic the way that a human would describe a situation.

We are still a long way from computers being able to match a human’s ability to understand and describe context, but this thesis improves the state of the art in context awareness for navigation applications. For some particular tasks, machine learning has succeeded in outperforming humans, and in the future there are likely to be tasks in navigation where computers outperform humans. One example might be the route optimization task described above: many different types of information must be fused in non-obvious ways, and it may be that computer algorithms can find better routes through ice-covered waters than even well-trained human navigators. This thesis provides only preliminary evidence of this possibility, and future work is needed to further develop the techniques outlined here. The same can be said of the other two navigation-related tasks examined in this thesis.
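    A minimal sketch of how tasks (1) and (2) might look in code, under stated assumptions: windowed tri-axial accelerometer readings are reduced to a few statistical features and fed to an off-the-shelf supervised classifier. The feature set, window length, classifier choice (a random forest) and names (window_features, make_dataset) are illustrative assumptions, not the pipeline used in the thesis.

```python
# Sketch only: stand-in data and an assumed feature/classifier choice.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def window_features(acc_xyz):
    """acc_xyz: (n_samples, 3) accelerometer window -> small statistical feature vector."""
    mag = np.linalg.norm(acc_xyz, axis=1)       # acceleration magnitude per sample
    return np.array([
        mag.mean(), mag.std(), mag.min(), mag.max(),
        np.abs(np.diff(mag)).mean(),            # crude "jerkiness" measure
    ])

def make_dataset(windows, labels):
    """windows: list of (n_samples, 3) arrays; labels: e.g. 'walk', 'drive', 'train'."""
    X = np.stack([window_features(w) for w in windows])
    return X, np.asarray(labels)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in data: noisier windows imitate walking, smoother ones imitate driving.
    windows = [rng.normal(0, 3.0, (128, 3)) for _ in range(100)] + \
              [rng.normal(0, 0.5, (128, 3)) for _ in range(100)]
    labels = ["walk"] * 100 + ["drive"] * 100
    X, y = make_dataset(windows, labels)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
```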