167 research outputs found

    Unsupervised routine discovery in egocentric photo-streams

    The routine of a person is defined by the occurrence of activities throughout different days and can directly affect the person's health. In this work, we address the recognition of routine-related days. To do so, we rely on egocentric images, which are recorded by a wearable camera and allow the life of the user to be monitored from a first-person perspective. We propose an unsupervised model that identifies routine-related days following an outlier-detection approach. We test the proposed framework on a total of 72 days of photo-streams, covering around two weeks of the life of 5 different camera wearers. Our model achieves an average of 76% accuracy and 68% weighted F-score across all users. Thus, we show that our framework is able to recognise routine-related days, opening the door to the understanding of people's behaviour.
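    The day-level outlier-detection idea can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the per-day feature vectors are synthetic placeholders, and the choice of Local Outlier Factor as the detector is an assumption.

    ```python
    # Sketch: flag non-routine days as outliers among day-level feature vectors.
    # The "day features" here are random placeholders standing in for real
    # per-day activity descriptors.
    import numpy as np
    from sklearn.neighbors import LocalOutlierFactor

    rng = np.random.default_rng(0)
    routine = rng.normal(0.0, 0.2, size=(12, 8))   # 12 mutually similar "routine" days
    anomalous = rng.normal(3.0, 0.2, size=(2, 8))  # 2 clearly different days
    days = np.vstack([routine, anomalous])

    lof = LocalOutlierFactor(n_neighbors=5)
    labels = lof.fit_predict(days)  # -1 = outlier (non-routine), 1 = inlier
    print(labels)
    ```

    Any density-based or isolation-based detector would fit the same slot; the key design choice is representing each day as a single feature vector so that whole days, not single images, are scored.
    
    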

    Topic modelling for routine discovery from egocentric photo-streams

    Developing tools to understand and visualize lifestyle is of high interest when addressing the improvement of people's habits and well-being. Routine, defined as the usual things a person does daily, helps describe an individual's lifestyle. With this paper, we are the first to address the development of tools for the automatic discovery of an individual's routine days from his/her egocentric images. In the proposed model, sequences of images are first characterized by semantic labels detected by pre-trained CNNs. These features are then organized into temporal-semantic documents and embedded into a topic-model space. Finally, Dynamic-Time-Warping and Spectral-Clustering methods are used for the final routine/non-routine day discrimination. Moreover, we introduce the new EgoRoutine dataset, a collection of 104 egocentric days with more than 100,000 images recorded by 7 users. Results show that routine can be discovered and behavioural patterns can be observed.
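    The final discrimination step described above can be sketched as pairwise DTW distances between per-day sequences, converted into an affinity matrix for spectral clustering. The toy sequences below are stand-ins for the paper's temporal-semantic documents, and the Gaussian-style affinity conversion is an assumption.

    ```python
    # Sketch: DTW distances between per-day label sequences, then spectral
    # clustering on the resulting affinity matrix to separate day types.
    import numpy as np
    from sklearn.cluster import SpectralClustering

    def dtw(a, b):
        """Classic dynamic-time-warping distance between two 1-D sequences."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    # Four toy "days": two with one activity profile, two with another.
    days = [np.zeros(20), np.zeros(25), np.ones(20) * 5, np.ones(22) * 5]
    n = len(days)
    dist = np.array([[dtw(days[i], days[j]) for j in range(n)] for i in range(n)])
    affinity = np.exp(-dist / (dist.max() + 1e-9))  # similarities in (0, 1]
    labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                                random_state=0).fit_predict(affinity)
    print(labels)
    ```

    DTW is used here precisely because days differ in length, so a fixed-dimensional distance such as Euclidean would not apply directly.
    
    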

    Behaviour understanding through the analysis of image sequences collected by wearable cameras

    Describing people's lifestyle has become a hot topic in the field of artificial intelligence. Lifelogging is the process of collecting personal activity data describing the daily behaviour of a person. Nowadays, the development of new technologies and the increasing use of wearable sensors allow us to automatically record data from our daily living. In this paper, we describe our automatic tools for the analysis of collected visual data that describe the daily behaviour of a person. For this analysis, we rely on sequences of images collected by wearable cameras, called egocentric photo-streams. These images are a rich source of information about the behaviour of the camera wearer, since they show an objective, first-person view of his or her lifestyle.

    Towards Eating Habits Discovery in Egocentric Photo-Streams

    Eating habits are learned throughout the early stages of our lives. However, it is not easy to be aware of how our food-related routine affects our healthy living. In this work, we address the unsupervised discovery of nutritional habits from egocentric photo-streams. We build a food-related behavioural pattern discovery model, which discloses nutritional routines from the activities performed throughout the days. To do so, we rely on Dynamic-Time-Warping for the evaluation of similarity among the collected days. Within this framework, we present a simple but robust and fast novel classification pipeline that outperforms the state of the art on food-related image classification, with a weighted accuracy and F-score of 70% and 63%, respectively. We then use the Isolation Forest method to identify days whose nutritional activities do not describe the person's habits, treating them as anomalies in the user's daily life. Furthermore, we show an application for the identification of food-related scenes in which the camera wearer eats in isolation. Results show the good performance of the proposed model and its relevance for visualizing the nutritional habits of individuals.
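    The anomalous-day step names Isolation Forest explicitly, so it can be sketched directly. The per-day features below (counts of food-related activities) are hypothetical, not the paper's actual descriptors.

    ```python
    # Sketch: Isolation Forest flags days whose food-related activity profile
    # deviates from the user's usual pattern. Features are hypothetical per-day
    # counts of detected food-related activities.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(1)
    usual = rng.poisson(lam=[3, 2, 1], size=(20, 3))  # typical daily activity counts
    unusual = np.array([[12, 0, 9]])                  # one clearly atypical day
    X = np.vstack([usual, unusual])

    clf = IsolationForest(contamination=0.05, random_state=0).fit(X)
    pred = clf.predict(X)  # -1 = anomalous day, 1 = habitual day
    print(pred[-1])
    ```

    The `contamination` parameter encodes a prior on how many days are expected to break the routine; in practice it would be tuned per user.
    
    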

    Towards Egocentric Person Re-identification and Social Pattern Analysis

    Wearable cameras capture a first-person view of the daily activities of the camera wearer, offering a visual diary of the user's behaviour. Detecting the appearance of the people the camera wearer interacts with is of high interest for the analysis of social interactions. Generally speaking, social events, lifestyle and health are highly correlated, but there is a lack of tools to monitor and analyse them. We consider that egocentric vision provides a tool to obtain information about, and to understand, users' social interactions. We propose a model that enables us to evaluate and visualize social traits obtained by analysing the appearance of social interactions within egocentric photo-streams. Given sets of egocentric images, we detect the appearance of faces within the days of the camera wearer and rely on clustering algorithms to group their feature descriptors in order to re-identify persons. The recurrence of detected faces within photo-streams allows us to shape an idea of the user's social pattern of behaviour. We validated our model over several weeks recorded by different camera wearers. Our findings indicate that social profiles are potentially useful for social behaviour interpretation.
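    The clustering-for-re-identification step can be sketched as grouping face feature descriptors so that detections of the same person fall in one cluster. The 2-D embeddings below are synthetic, and the choice of DBSCAN (which handles an unknown number of people) is an assumption; a real pipeline would use a face detector plus a deep feature extractor.

    ```python
    # Sketch: group face descriptors across days to re-identify people.
    # Each cluster label corresponds to one (re-identified) person.
    import numpy as np
    from sklearn.cluster import DBSCAN

    rng = np.random.default_rng(2)
    person_a = rng.normal([0, 0], 0.1, size=(10, 2))  # detections of person A
    person_b = rng.normal([5, 5], 0.1, size=(8, 2))   # detections of person B
    faces = np.vstack([person_a, person_b])

    clusters = DBSCAN(eps=0.8, min_samples=3).fit_predict(faces)
    print(clusters)  # same label => same person; -1 would mean "unmatched face"
    ```

    Counting how often each cluster recurs across days then yields the social pattern the abstract describes.
    
    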

    Lifestyle understanding through the analysis of egocentric photo-streams

    At 8:15, before going to work, Rose puts on her pullover and attaches to it a small portable camera that looks like a hanger. The camera will take two images per minute throughout the day and will record almost everything Rose experiences: the people she meets, how long she sits in front of her computer, what she eats, where she goes, etc. These images give an objective description of Rose's experiences. This thesis addresses the development of automatic computer vision tools for the study of people's behaviours. To this end, we rely on the analysis of the visual data offered by these sequences of images collected by wearable cameras. Our models have proven to be a powerful tool for extracting information about the behaviour of people in society. Examples of applications: 1) selected images as cues to trigger autobiographical memory about past events, for the prevention of cognitive and functional decline and for memory enhancement in elderly people; 2) self-monitoring devices for people who want to increase their self-knowledge through quantitative analysis, expecting that it will lead to psychological well-being and the improvement of their lifestyle; 3) businesses already making use of such data about their employees and clients in order to improve productivity, well-being and customer satisfaction. The ultimate goal is to help people like Rose improve their quality of life by creating awareness about their habits and life balance.

    Multimodal Egocentric Analysis of Focused Interactions

    Continuous detection of social interactions from wearable sensor data streams has a range of potential applications in domains including health and social care, security, and assistive technology. We contribute an annotated, multimodal data set capturing such interactions using video, audio, GPS, and inertial sensing. We present methods for the automatic detection and temporal segmentation of focused interactions using support vector machines and recurrent neural networks, with features extracted from both audio and video streams. A focused interaction occurs when co-present individuals, sharing a mutual focus of attention, interact by first establishing face-to-face engagement and direct conversation. We describe an evaluation protocol, including framewise, extended framewise, and event-based measures, and provide empirical evidence that the fusion of visual face-track scores with audio voice-activity scores provides an effective combination. The methods, contributed data set, and protocol together provide a benchmark for future research on this problem.
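    The audio-visual score fusion can be sketched as framewise late fusion: feed the visual face-track score and the audio voice-activity score into a single classifier. The scores below are synthetic, and the linear SVM is one of the classifiers the abstract names; the noise model is an assumption.

    ```python
    # Sketch: framewise late fusion of a visual face-track score and an audio
    # voice-activity score with a linear SVM.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(3)
    n = 200
    frames = rng.integers(0, 2, size=n)            # 1 = focused-interaction frame
    face = frames + rng.normal(0, 0.3, size=n)     # noisy visual score per frame
    voice = frames + rng.normal(0, 0.3, size=n)    # noisy audio score per frame
    X = np.column_stack([face, voice])             # two-score feature per frame

    clf = SVC(kernel="linear").fit(X, frames)
    fused_acc = clf.score(X, frames)
    print(round(fused_acc, 2))
    ```

    Because each modality's score is noisy but independently informative, the fused two-dimensional feature separates the classes better than either score alone would at the same noise level.
    
    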

    Automatic detection of routine patterns with egocentric images

    In recent years it has been shown that a record of a person's daily activities, composed mainly of a set of images together with other measurements from sensors such as GPS and accelerometers, constitutes an extensive volume of information that can contribute to treatments for diseases or conditions related to memory loss. One way to capture these events is through egocentric cameras, a type of camera that attempts to imitate an individual's field of view. Thanks to advances in artificial intelligence, Deep Learning algorithms for image classification turn the classification of these images into a semi-automatic task. Building on this premise, this work studies the feasibility of a solution capable of detecting and identifying routine activities from a set of images taken over a given period of time. Sociedad Argentina de Informática e Investigación Operativa