86 research outputs found

    Augmenting human memory using personal lifelogs

    Memory is a key human faculty supporting life activities, including social interaction, life management and problem solving. Unfortunately, our memory is not perfect. Normal individuals have occasional memory problems which can be frustrating, while those with memory impairments can experience a greatly reduced quality of life. Augmenting memory has the potential to make normal individuals more effective, and to give those with significant memory problems a higher general quality of life. Current technologies make it possible to automatically capture and store daily life experiences over an extended period, potentially even over a lifetime. This type of data collection, often referred to as a personal life log (PLL), can include continuously captured pictures or videos from a first-person perspective, scanned copies of archival material such as books, electronic documents read or created, and emails and SMS messages sent and received, along with context data such as time of capture and access, and location via GPS sensors. PLLs offer the potential for memory augmentation. Existing work on PLLs has focused on the technologies of data capture and retrieval, but little work has explored how these captured data and retrieval techniques can support the memory of ordinary people in actual use. In this paper, we explore normal people's needs for memory augmentation, drawing on the psychology literature on the mechanisms behind memory problems, and discuss the functions that PLLs can provide to support these memory augmentation needs. Based on this, we suggest guidelines for data capture, retrieval needs and computer-based interface design. Finally, we introduce our work-in-progress prototype PLL search system in the iCLIPS project as an example of augmenting human memory with PLLs and computer-based interfaces.

    Guidelines for the presentation and visualisation of lifelog content

    Lifelogs offer rich, voluminous sources of personal and social data for which visualisation is ideally suited to providing access, overview, and navigation. Through examples of our visualisation work within the domain of lifelogging, we explore the major axes on which lifelogs operate and, therefore, on which their visualisations should be contingent. We also explore the concept of ‘events’ as a way to significantly reduce the complexity of the lifelog for presentation and make it more human-oriented. Finally, we present some guidelines and goals which should be considered when designing presentation modes for lifelog content.

    Semantically enhancing multimedia lifelog events

    Lifelogging is the digital recording of our everyday behaviour in order to identify human activities and build applications that support daily life. Lifelogs represent a unique form of personal multimedia content in that they are temporal, synchronised, multi-modal and composed of multiple media. Analysing lifelogs with a view to supporting content-based access presents many challenges. These include the integration of heterogeneous input streams from different sensors, structuring a lifelog into events, representing events, and interpreting and understanding lifelogs. In this paper we demonstrate the potential of semantic web technologies for analysing lifelogs by automatically augmenting descriptions of lifelog events. We report on experiments and demonstrate how our results yield rich descriptions of multi-modal, multimedia lifelog content, opening up even greater possibilities for managing and using lifelogs.
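    The abstract does not detail how event descriptions are represented; a minimal sketch of attaching machine-readable descriptors (subject–predicate–object triples) to an event segmented from sensor streams might look like the following. The vocabulary prefixes and field names here are invented for illustration, not taken from the paper.

```python
# Hypothetical sketch: augment a lifelog event with semantic,
# machine-readable descriptions expressed as RDF-style triples.
# The "lifelog:" vocabulary terms are assumptions for illustration.

event = {
    "id": "event-042",
    "start": "2009-05-01T13:00",
    "end": "2009-05-01T13:45",
    "location": "Dublin City University",
    "images": ["img_1031.jpg", "img_1032.jpg"],
}

def to_triples(event):
    subject = event["id"]
    triples = [
        (subject, "rdf:type", "lifelog:Event"),
        (subject, "lifelog:startTime", event["start"]),
        (subject, "lifelog:endTime", event["end"]),
        (subject, "lifelog:atPlace", event["location"]),
    ]
    # One triple per captured image attached to the event.
    triples += [(subject, "lifelog:hasImage", img) for img in event["images"]]
    return triples

for t in to_triples(event):
    print(t)
```

    A triple representation like this is what makes automatic augmentation possible: external knowledge sources can be joined on the place and time values without any understanding of the raw sensor data.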

    Mining user activity as a context source for search and retrieval

    It is now generally accepted in information retrieval that a better understanding of users' context can help the search process, either at indexing time by including more metadata or at retrieval time by better modelling the user context. In this work we explore how activity recognition from tri-axial accelerometers can be employed to model a user's activity as a means of enabling context-aware information retrieval. We discuss how we can gather user activity automatically as a context source from a wearable mobile device, and we evaluate the accuracy of our proposed user activity recognition algorithm. Our technique can recognise four kinds of activities, which can be used to model part of an individual's current context. We discuss promising experimental results, possible approaches to improve our algorithms, and the impact of this work in modelling user context toward enhanced search and retrieval.
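    The abstract does not specify the recognition algorithm; a minimal illustrative sketch of accelerometer-based activity classification (per-window magnitude features plus threshold rules) is shown below. All thresholds and the four activity labels are assumptions for illustration, not values from the paper.

```python
import math

# Hypothetical sketch: classify a window of tri-axial accelerometer
# samples (x, y, z in units of g) into one of four coarse activities,
# using the mean and variance of the acceleration magnitude.

def magnitude(sample):
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def window_features(window):
    mags = [magnitude(s) for s in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return mean, var

def classify(window):
    mean, var = window_features(window)
    if var < 0.01:
        # Almost no movement energy in the window.
        return "sitting" if mean < 1.05 else "standing"
    return "walking" if var < 0.5 else "running"

still = [(0.0, 0.0, 1.0)] * 50                      # gravity only
walking = [(0.0, 0.0, 1.0), (0.3, 0.2, 1.4)] * 25   # periodic motion
print(classify(still))    # -> sitting
print(classify(walking))  # -> walking
```

    A real system would typically replace the hand-set thresholds with a trained classifier over richer features, but the windowing-and-features structure is the same.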

    From lifelog to diary: a timeline view for memory reminiscence

    As digital recording sensors and lifelogging devices become more prevalent, the suitability of lifelogging tools to act as a reminiscence support tool has become an important research challenge. This paper describes a first-generation memory reminiscence tool that utilises lifelogging sensors to record a digital diary of user activities and presents it as a narrative description of those activities. The automatically recognised daily activities are shown chronologically in the timeline view.
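    The core of such a diary view is ordering recognised activities by time. A minimal sketch of rendering timestamped activities as a chronological timeline (the event data and formatting are invented for illustration) could be:

```python
from datetime import datetime

# Hypothetical sketch: render automatically recognised activities
# as a chronological diary/timeline, sorted by timestamp.

events = [
    ("2009-05-01 12:30", "walking to the office"),
    ("2009-05-01 08:15", "commuting by bus"),
    ("2009-05-01 09:00", "working at desk"),
]

def timeline(events):
    ordered = sorted(events,
                     key=lambda e: datetime.strptime(e[0], "%Y-%m-%d %H:%M"))
    return [f"{ts}  {activity}" for ts, activity in ordered]

for line in timeline(events):
    print(line)
```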

    Exploring the technical challenges of large-scale lifelogging

    Ambiently and automatically maintaining a lifelog is an activity that may help individuals track their lifestyle, learning, health and productivity. In this paper we motivate and discuss the technical challenges of developing real-world lifelogging solutions, based on seven years of experience. The gathering, organisation, retrieval and presentation challenges of large-scale lifelogging are discussed, and we show how these can be addressed and the benefits that may accrue.

    Extended mind and artifactual autobiographical memory

    In this paper, I describe how artifacts and autobiographical memory are integrated into new systemic wholes, allowing us to remember our personal past in a more reliable and detailed manner. After discussing some empirical work on lifelogging technology, I elaborate on the dimension of autobiographical dependency, which is the degree to which we depend on an object to be able to remember a personal experience. When this dependency is strong, we integrate information in the embodied brain and in an object to reconstruct an autobiographical memory. In such cases, autobiographical memory is extended or distributed.

    Distributed selves: Personal identity and extended memory systems

    This paper explores the implications of extended and distributed cognition theory for our notions of personal identity. On an extended and distributed approach to cognition, external information is under certain conditions constitutive of memory. On a narrative approach to personal identity, autobiographical memory is constitutive of our diachronic self. In this paper, I bring these two approaches together and argue that external information can be constitutive of one’s autobiographical memory and thus also of one’s diachronic self. To develop this claim, I draw on recent empirical work in human-computer interaction, looking at lifelogging technologies in both healthcare and everyday contexts. I argue that personal identity can be reduced neither to psychological structures instantiated by the brain nor to biological structures instantiated by the organism, but should be seen as an environmentally-distributed and relational construct. In other words, the complex web of cognitive relations we develop and maintain with other people and technological artifacts partly determines our self. This view has conceptual, methodological, and normative implications: we should broaden our concepts of the self so as to include social and artifactual structures, focus on external memory systems in the (empirical) study of personal identity, and not interfere with people’s distributed minds and selves.

    Automatically Augmenting Lifelog Events Using Pervasively Generated Content from Millions of People

    In sensor research we take advantage of additional contextual sensor information to disambiguate potentially erroneous sensor readings or to make better informed decisions on a single sensor’s output. This use of additional information reinforces, validates, semantically enriches, and augments sensed data. Lifelog data is challenging to augment, as it tracks one’s life with many images, including the places one goes, making it non-trivial to find associated sources of information. We investigate realising the goal of pervasive user-generated content based on sensors, by augmenting passive visual lifelogs with “Web 2.0” content collected by millions of other individuals.
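    One natural way to join a lifelog event to community-contributed content is spatial matching on GPS coordinates. A sketch of selecting geotagged items near an event location follows; the 1 km radius and the sample items are assumptions for illustration, not from the paper.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical sketch: select community-contributed items (e.g.
# geotagged photos) that fall within a radius of a lifelog event.

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth, in km.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def nearby(event_lat, event_lon, items, radius_km=1.0):
    return [i for i in items
            if haversine_km(event_lat, event_lon,
                            i["lat"], i["lon"]) <= radius_km]

items = [
    {"title": "Spire of Dublin", "lat": 53.3498, "lon": -6.2603},
    {"title": "Howth cliffs", "lat": 53.3776, "lon": -6.0653},
]
print(nearby(53.3478, -6.2597, items))  # only the city-centre item
```

    A full system would also filter on capture time and content similarity, but a spatial join like this is the usual first step in finding candidate augmentation material.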