11,988 research outputs found

    Context-awareness for mobile sensing: a survey and future directions

    The evolution of smartphones, together with their increasing computational power, has empowered developers to create innovative context-aware applications that recognize user-related social and cognitive activities in any situation and at any location. Awareness of context gives mobile devices the capability of being conscious of the physical environments or situations around their users, allowing network services to respond proactively and intelligently. The key idea behind context-aware applications is to encourage users to collect, analyze and share local sensory knowledge for large-scale community use by creating a smart network. The desired network is capable of making autonomous logical decisions to actuate environmental objects and also to assist individuals. However, many open challenges remain, most of which arise because the middleware services provided on mobile devices have limited resources in terms of power, memory and bandwidth. It is therefore critically important to study how these drawbacks can be addressed and resolved, and at the same time to better understand the opportunities for the research community to contribute to context-awareness. To this end, this paper surveys the literature over the period 1991-2014, from the emerging concepts to applications of context-awareness on mobile platforms, providing up-to-date research and future research directions. Moreover, it points out the challenges faced in this regard and addresses them by proposing possible solutions.

    Deep Thermal Imaging: Proximate Material Type Recognition in the Wild through Deep Learning of Spatial Surface Temperature Patterns

    We introduce Deep Thermal Imaging, a new approach for close-range automatic recognition of materials that enhances how people and ubiquitous technologies understand their proximal environment. Our approach uses a low-cost mobile thermal camera integrated into a smartphone to capture thermal textures. A deep neural network classifies these textures into material types. This approach works effectively without the need for ambient light sources or direct contact with materials. Furthermore, the use of a deep learning network removes the need to handcraft the set of features for different materials. We evaluated the performance of the system by training it to recognise 32 material types in both indoor and outdoor environments. Our approach produced recognition accuracies above 98% on 14,860 images of 15 indoor materials and above 89% on 26,584 images of 17 outdoor materials. We conclude by discussing its potential for real-time use in HCI applications and future directions.
    Comment: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems
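    The pipeline the abstract describes, a thermal texture patch in and a material label out, can be illustrated with a deliberately simplified sketch. The paper trains a deep neural network on spatial surface temperature patterns; the toy below substitutes a nearest-centroid classifier over two hand-picked patch statistics (mean and spread), and all temperature readings and material examples are hypothetical:

```python
import statistics

def patch_features(patch):
    """Summarise a thermal patch (a list of temperature readings, in deg C)
    by its mean and spread -- a crude stand-in for the spatial texture
    features the paper's deep network learns automatically."""
    return (statistics.mean(patch), statistics.pstdev(patch))

def train_centroids(labelled_patches):
    """labelled_patches: {material: [patch, ...]} -> {material: feature centroid}."""
    centroids = {}
    for material, patches in labelled_patches.items():
        feats = [patch_features(p) for p in patches]
        centroids[material] = tuple(
            sum(f[i] for f in feats) / len(feats) for i in range(2)
        )
    return centroids

def classify(patch, centroids):
    """Assign the patch to the material with the nearest feature centroid."""
    f = patch_features(patch)
    return min(
        centroids,
        key=lambda m: sum((a - b) ** 2 for a, b in zip(f, centroids[m])),
    )

# Hypothetical readings: metal conducts heat away (cool, uniform surface),
# wood retains a warmer, more varied temperature pattern.
training = {
    "metal": [[21.0, 21.1, 20.9, 21.0], [20.8, 21.0, 21.1, 20.9]],
    "wood":  [[24.5, 25.2, 23.8, 24.9], [24.0, 25.0, 24.6, 25.3]],
}
centroids = train_centroids(training)
print(classify([21.0, 20.9, 21.1, 21.0], centroids))  # prints "metal"
```

    A real reimplementation would replace the two statistics with a convolutional network over whole thermal images, which is what lets the approach scale to 32 materials.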

    Understanding face and eye visibility in front-facing cameras of smartphones used in the wild

    Commodity mobile devices are now equipped with high-resolution front-facing cameras, allowing applications in biometrics (e.g., FaceID in the iPhone X), facial expression analysis, or gaze interaction. However, it is unknown how often users hold devices in a way that allows capturing their face or eyes, and how this impacts detection accuracy. We collected 25,726 in-the-wild photos taken with the front-facing cameras of smartphones, along with associated application usage logs. We found that the full face is visible about 29% of the time, and that in most cases the face is only partially visible. Furthermore, we identified an influence of users' current activity; for example, when watching videos, the eyes but not the entire face are visible 75% of the time in our dataset. We found that a state-of-the-art face detection algorithm performs poorly on photos taken with front-facing cameras. We discuss how these findings impact mobile applications that leverage face and eye detection, and derive practical implications for addressing the limitations of the state of the art.
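    The activity-dependent visibility finding suggests a simple way to aggregate such photo logs. A minimal sketch follows; the log schema, visibility labels, and example entries are assumptions for illustration, not the paper's actual data format:

```python
from collections import Counter

def full_face_rate_by_activity(photo_log):
    """photo_log: iterable of (activity, visibility) records, where
    visibility is one of 'full_face', 'partial_face', 'eyes_only', 'none'.
    Returns, per activity, the fraction of photos showing the full face."""
    totals, full = Counter(), Counter()
    for activity, visibility in photo_log:
        totals[activity] += 1
        full[activity] += visibility == "full_face"  # bool counts as 0/1
    return {activity: full[activity] / totals[activity] for activity in totals}

# Hypothetical entries, echoing the paper's observation that the full
# face is rarely visible while the user is watching videos.
log = [
    ("messaging", "full_face"), ("messaging", "partial_face"),
    ("video", "eyes_only"), ("video", "eyes_only"),
    ("video", "full_face"), ("video", "none"),
]
rates = full_face_rate_by_activity(log)
print(rates)  # e.g. full face in 50% of messaging photos, 25% of video photos
```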

    A Smartphone-Based System for Outdoor Data Gathering Using a Wireless Beacon Network and GPS Data: From Cyber Spaces to Senseable Spaces

    Information and Communication Technologies (ICTs) and mobile devices are deeply influencing all facets of life, directly affecting the way people experience space and time. ICTs are also tools for supporting urban development, and they have been adopted as equipment for furnishing public spaces. Hence, ICTs have created a new paradigm of hybrid space that can be defined as Senseable Spaces. Even though there are relevant cases where the adoption of ICTs has made the use of public open spaces more “smart”, their interrelation and the recognition of their added value need to be further developed; this is one of the motivations for the research presented in this paper. The main goal of the work reported here is the deployment of a system composed of three connected elements (a real-world infrastructure, a data gathering system, and a data processing and analysis platform) for the analysis of human behavior in the open space of Cardeto Park in Ancona, Italy. For this purpose, and because of the complexity of this task, several actions have been carried out: the deployment of a complete real-world infrastructure in Cardeto Park, the implementation of an ad-hoc smartphone application for gathering participants’ data, and the development of a data pre-processing and analysis system for dealing with all the gathered data. A detailed description of these three aspects and the way in which they are connected to create a unique system is the main focus of this paper.
    This work has been supported by the COST Action TU1306 (CYBERPARKS: Fostering knowledge about the relationship between Information and Communication Technologies and Public Spaces supported by strategies to improve their use and attractiveness), the Spanish Ministry of Economy and Competitiveness under the ESPHIA project (ref. TIN2014-56042-JIN) and the TARSIUS project (ref. TIN2015-71564-C4-4-R), and the Basque Country Department of Education under the BLUE project (ref. PI-2016-0010). The authors would also like to thank the staff of UbiSive s.r.l. for their support in developing the application.
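    Beacon-based data gathering systems like this one typically locate participants by converting each beacon's received signal strength (RSSI) to an approximate distance. The abstract does not state which ranging model the system uses; a common choice, sketched here with assumed calibration values, is the log-distance path-loss model:

```python
def beacon_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Estimate the distance in metres to a BLE beacon from its RSSI using
    the log-distance path-loss model:

        d = 10 ** ((tx_power - rssi) / (10 * n))

    tx_power_dbm is the calibrated RSSI at 1 m (-59 dBm is a common default,
    assumed here; real deployments calibrate it per beacon), and n is the
    path-loss exponent (2.0 = free space; cluttered outdoor or indoor
    environments typically need 2.5-4.0)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

print(beacon_distance(-59))  # 1.0 m: the reading equals the 1 m calibration
print(beacon_distance(-79))  # 10.0 m in free space (each -20 dB = 10x range)
```

    Because RSSI is noisy in practice, such per-reading estimates are usually smoothed (e.g. averaged over a sliding window) before being fed to an analysis platform.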