256 research outputs found

    Towards Egocentric Person Re-identification and Social Pattern Analysis

    Full text link
    Wearable cameras capture a first-person view of the camera wearer's daily activities, offering a visual diary of the user's behaviour. Detecting the people the camera wearer interacts with is of high interest for the analysis of social interactions. Generally speaking, social events, lifestyle and health are highly correlated, but there is a lack of tools to monitor and analyse them. We consider that egocentric vision provides a tool to obtain this information and to understand users' social interactions. We propose a model that enables us to evaluate and visualize the social traits obtained by analysing the appearance of social interactions within egocentric photostreams. Given sets of egocentric images, we detect the faces that appear across the camera wearer's days and rely on clustering algorithms to group their feature descriptors in order to re-identify persons. The recurrence of detected faces within the photostreams allows us to form an idea of the user's social pattern of behaviour. We validated our model on several weeks of photostreams recorded by different camera wearers. Our findings indicate that social profiles are potentially useful for social behaviour interpretation.
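    The abstract does not specify the face detector, descriptors or clustering algorithm used; the sketch below only illustrates the re-identification step it describes, with the face_recognition library and scikit-learn's DBSCAN assumed as stand-ins rather than the authors' actual pipeline.

    # Sketch of the described pipeline: detect faces in a day's photostream,
    # embed them, and cluster the descriptors so recurring people can be
    # re-identified. face_recognition and DBSCAN are illustrative choices only.
    from pathlib import Path

    import face_recognition            # dlib-based detector + 128-d embeddings
    import numpy as np
    from sklearn.cluster import DBSCAN

    def cluster_faces(image_dir: str, eps: float = 0.5):
        encodings, sources = [], []
        for path in sorted(Path(image_dir).glob("*.jpg")):
            image = face_recognition.load_image_file(path)
            boxes = face_recognition.face_locations(image)
            for enc in face_recognition.face_encodings(image, boxes):
                encodings.append(enc)
                sources.append(path.name)
        if not encodings:
            return {}
        # Each cluster ideally corresponds to one re-identified person;
        # label -1 collects faces that could not be grouped.
        labels = DBSCAN(eps=eps, min_samples=3).fit_predict(np.array(encodings))
        people = {}
        for label, name in zip(labels, sources):
            people.setdefault(int(label), []).append(name)
        return people

    # Counting how often each cluster reappears across days then gives a rough
    # picture of the wearer's social pattern of behaviour.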

    Lifestyle understanding through the analysis of egocentric photo-streams

    Get PDF
    At 8:15, before going to work, Rose puts on her pullover and attaches to it a small portable camera that looks like a hanger. The camera will take two images per minute throughout the day and will record almost everything Rose experiences: the people she meets, how long she sits in front of her computer, what she eats, where she goes, etc. These images provide an objective description of Rose's experiences. This thesis addresses the development of automatic computer vision tools for the study of people's behaviours. To this end, we rely on the analysis of the visual data offered by these sequences of images collected by wearable cameras. Our models have proven to be a powerful tool for extracting information about the behaviour of people in society. Examples of applications include: 1) selected images used as cues to trigger autobiographical memory of past events, for the prevention of cognitive and functional decline and for memory enhancement in elderly people; 2) self-monitoring devices for people who want to increase their self-knowledge through quantitative analysis, expecting that it will lead to psychological well-being and an improved lifestyle; 3) businesses that already make use of such data about their employees and clients in order to improve productivity, well-being and customer satisfaction. The ultimate goal is to help people like Rose improve their quality of life by creating awareness of their habits and life balance.

    Visual Object Tracking in First Person Vision

    Get PDF
    The understanding of human-object interactions is fundamental in First Person Vision (FPV). Visual tracking algorithms that follow the objects manipulated by the camera wearer can provide useful information to effectively model such interactions. In recent years, the computer vision community has significantly improved the performance of tracking algorithms for a large variety of target objects and scenarios. Despite a few previous attempts to exploit trackers in the FPV domain, a methodical analysis of the performance of state-of-the-art trackers is still missing. This research gap raises the question of whether current solutions can be used “off-the-shelf” or whether more domain-specific investigations should be carried out. This paper aims to provide answers to such questions. We present the first systematic investigation of single object tracking in FPV. Our study extensively analyses the performance of 42 algorithms, including generic object trackers and baseline FPV-specific trackers. The analysis is carried out by focusing on different aspects of the FPV setting, introducing new performance measures, and in relation to FPV-specific tasks. The study is made possible through the introduction of TREK-150, a novel benchmark dataset composed of 150 densely annotated video sequences. Our results show that object tracking in FPV poses new challenges to current visual trackers. We highlight the factors causing such behavior and point out possible research directions. Despite these difficulties, we prove that trackers bring benefits to FPV downstream tasks requiring short-term object tracking. We expect that generic object tracking will gain popularity in FPV as new and FPV-specific methodologies are investigated.
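    The abstract does not detail the new performance measures introduced with TREK-150; as background, the sketch below shows only the standard overlap-based success measure commonly used to score single-object trackers against dense annotations, assumed here purely for illustration.

    # Sketch of a standard overlap-based tracker evaluation (not the specific
    # measures introduced with TREK-150): per-frame bounding-box IoU turned
    # into a success curve over overlap thresholds.
    import numpy as np

    def iou(box_a, box_b):
        """Intersection-over-union of two (x, y, w, h) boxes."""
        ax, ay, aw, ah = box_a
        bx, by, bw, bh = box_b
        ix1, iy1 = max(ax, bx), max(ay, by)
        ix2, iy2 = min(ax + aw, bx + bw), min(ay + ah, by + bh)
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        union = aw * ah + bw * bh - inter
        return inter / union if union > 0 else 0.0

    def success_curve(predictions, ground_truth, thresholds=np.linspace(0, 1, 21)):
        """Fraction of frames whose overlap exceeds each threshold, per sequence."""
        overlaps = np.array([iou(p, g) for p, g in zip(predictions, ground_truth)])
        return np.array([(overlaps > t).mean() for t in thresholds])

    # The mean of this curve (its area) is the usual per-sequence success score;
    # averaging over all 150 sequences would give a benchmark-level summary.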

    Mining reality to explore the 21st century student experience

    Get PDF
    Understanding student experience is a key aspect of higher education research. To date, the dominant methods for advancing this area have been surveys and interviews, methods that typically rely on post-event recollections or perceptions, which can be incomplete and unreliable. Advances in mobile sensor technologies afford the opportunity to capture continuous, naturally occurring student activity. In this thesis, I propose a new research approach for higher education that redefines student experience in terms of objective activity observation rather than a construct of perception. I argue that novel, technologically driven research practices such as ‘Reality Mining’ (the continuous capture of digital data from wearable devices and the use of multi-modal datasets captured over prolonged periods) offer a deeper, more accurate representation of students’ lived experience. To explore the potential of these new methods, I implemented and evaluated three approaches to gathering student activity and behaviour data. I collected data from 21 undergraduate health science students at the University of Otago over the period of a single semester (approximately four months). The data captured included GPS trace data from a smartphone app to explore student spaces and movements; photo data from a wearable auto-camera (which takes a photo from the wearer’s point of view every 30 seconds) to investigate student activities; and computer usage data captured via the RescueTime software to gain insight into students’ digital practices. I explored the findings of these three datasets, visualising the student experience in different ways to demonstrate different perspectives on student activity, and utilised a number of new analytical approaches (such as Computer Vision algorithms for automatically categorising photostream data) to make sense of the voluminous data generated. To help future researchers wanting to utilise similar techniques, I also outlined the limitations and challenges encountered in using these new methods and devices for research. The findings of the three method explorations offer some insights into various aspects of the student experience, but serve mostly to highlight the idiographic nature of student life. The principal finding of this research is that these types of ‘student analytics’ are most readily useful to the students themselves, for highlighting their practices and informing self-improvement. I look at this aspect through the lens of a movement called the ‘Quantified Self’, which promotes the use of self-tracking technologies for personal development. To conclude the thesis, I discuss broadly how these methods could feature in higher education research, for researchers, for the institution, and, most importantly, for the students themselves. To this end, I develop a conceptual framework derived from Tschumi’s (1976) Space-Event-Movement framework. At the same time, I take a critical perspective on the role of these types of personal analytics in the future of higher education, and question how involved the institution should be in the capture and utilisation of these data. Ultimately, there is value in exploring these data capture methods further, but always keeping the ‘student’ placed squarely at the centre of the ‘student experience’.
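    The abstract mentions Computer Vision algorithms for automatically categorising photostream data without naming them; the sketch below is one assumed way to perform such categorisation, using a pre-trained torchvision classifier, and is not necessarily the approach taken in the thesis.

    # Sketch: label wearable-camera photostream images with a pre-trained
    # ImageNet classifier (torchvision ResNet-18). Purely illustrative of the
    # "automatic categorisation" step; the thesis does not specify its method.
    from pathlib import Path

    import torch
    from PIL import Image
    from torchvision import models

    weights = models.ResNet18_Weights.DEFAULT
    model = models.resnet18(weights=weights).eval()
    preprocess = weights.transforms()        # resize, crop, normalise
    labels = weights.meta["categories"]      # ImageNet class names

    def categorise(image_dir: str, top_k: int = 1):
        """Map each photostream image to its most likely class label(s)."""
        results = {}
        for path in sorted(Path(image_dir).glob("*.jpg")):
            img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
            with torch.no_grad():
                probs = model(img).softmax(dim=1)[0]
            top = probs.topk(top_k)
            results[path.name] = [labels[int(i)] for i in top.indices]
        return results

    # Aggregating these labels over a day gives a coarse activity profile that
    # can be visualised alongside the GPS and computer-usage data.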

    The You-Turn in Philosophy of Mind: On the Significance of Experiences that Aren’t Mine.

    Get PDF
    Ph.D. Thesis, University of Hawaiʻi at Mānoa, 2018.