
    Baseline analysis of a conventional and virtual reality lifelog retrieval system

    Continuous media capture via wearable devices is currently one of the most popular methods of establishing a comprehensive record of an individual's life experience, referred to in the research community as a lifelog. These vast multimodal corpora include visual and other sensor data and are enriched by content analysis to generate as extensive a record of an individual's life experience as possible. However, interfacing with such datasets remains an active area of research, and despite the advent of new technology and a plethora of competing media for processing digital information, there has been little focus on newly emerging platforms such as virtual reality. In this work, we suggest that the increased immersion and additional spatial dimensions provided by virtual reality could offer significant benefits to users compared with more conventional access methodologies. Hence, we motivate virtual reality as a viable method of exploring multimedia archives (specifically lifelogs) by performing a baseline comparative analysis using a novel application prototype built for the HTC Vive and a conventional prototype built for a standard personal computer.

    NTCIR Lifelog: The First Test Collection for Lifelog Research

    Test collections have a long history of supporting repeatable and comparable evaluation in Information Retrieval (IR). However, thus far, no shared test collection exists for IR systems that are designed to index and retrieve multimodal lifelog data. In this paper we introduce the first test collection for personal lifelog data. The requirements for such a test collection are motivated, the process of creating it is described, an overview of the collection is given, and finally suggestions are made for possible applications of the test collection, which has been employed for the NTCIR-12 Lifelog task.

    LEMoRe: A lifelog engine for moments retrieval at the NTCIR-lifelog LSAT task

    Semantic image retrieval from large amounts of egocentric visual data requires leveraging powerful techniques to bridge the semantic gap. This paper introduces LEMoRe, a Lifelog Engine for Moments Retrieval, developed in the context of the Lifelog Semantic Access Task (LSAT) of the NTCIR-12 challenge, and discusses its performance variation across different trials. LEMoRe integrates classical image descriptors with high-level semantic concepts extracted by Convolutional Neural Networks (CNNs), powered by a graphical user interface that uses natural language processing. Although this is just a first attempt at interactive image retrieval from large egocentric datasets, and there is considerable room for improvement in both the system components and the user interface, the structure of the system itself and the way the individual components cooperate are very promising.
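    As a rough illustration of the fusion idea outlined above (classical image descriptors combined with CNN concept scores), the sketch below ranks egocentric images by a weighted sum of the two signals. All class names, fields and the weighting parameter are assumptions made for this sketch, not details of LEMoRe itself.

```python
# Minimal late-fusion sketch: CNN concept confidences + a classical descriptor
# similarity. Names and the weight alpha are illustrative assumptions only.
from dataclasses import dataclass
from typing import Dict, List
import numpy as np

@dataclass
class EgoImage:
    image_id: str
    concept_scores: Dict[str, float]   # CNN concept -> confidence in [0, 1]
    descriptor: np.ndarray             # e.g. a colour-histogram feature vector

def concept_score(image: EgoImage, query_concepts: List[str]) -> float:
    """Average CNN confidence over the concepts extracted from the query."""
    if not query_concepts:
        return 0.0
    return sum(image.concept_scores.get(c, 0.0) for c in query_concepts) / len(query_concepts)

def descriptor_score(image: EgoImage, example: np.ndarray) -> float:
    """Cosine similarity between classical descriptors (query-by-example)."""
    denom = np.linalg.norm(image.descriptor) * np.linalg.norm(example)
    return float(image.descriptor @ example / denom) if denom else 0.0

def rank(images: List[EgoImage], query_concepts: List[str],
         example_descriptor: np.ndarray, alpha: float = 0.7) -> List[EgoImage]:
    """Late fusion: weighted sum of semantic and low-level similarity."""
    return sorted(
        images,
        key=lambda im: alpha * concept_score(im, query_concepts)
                       + (1 - alpha) * descriptor_score(im, example_descriptor),
        reverse=True,
    )
```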

    Report on the Information Retrieval Festival (IRFest2017)

    The Information Retrieval Festival took place in April 2017 in Glasgow. The focus of the workshop was to bring together IR researchers from the various Scottish universities and beyond in order to facilitate more awareness, increased interaction and reflection on the status of the field and its future. The program included an industry session, research talks, demos and posters as well as two keynotes. The first keynote was delivered by Prof. Jaana Kekäläinen, who provided a historical, critical reflection on realism in Interactive Information Retrieval experimentation, while the second keynote was delivered by Prof. Maarten de Rijke, who argued for greater use of Artificial Intelligence in IR solutions and deployments. The workshop was followed by a "Tour de Scotland" in which delegates were taken from Glasgow to Aberdeen for the European Conference on Information Retrieval (ECIR 2017).

    Information access tasks and evaluation for personal lifelogs

    Emerging personal lifelog (PL) collections contain permanent digital records of information associated with individuals’ daily lives. This can include materials such as emails received and sent, web content and other documents with which they have interacted, photographs, videos and music experienced passively or created, logs of phone calls and text messages, and also personal and contextual data such as location (e.g. via GPS sensors), persons and objects present (e.g. via Bluetooth) and physiological state (e.g. via biometric sensors). PLs can be collected by individuals over very extended periods, potentially running to many years. Such archives have many potential applications, including helping individuals recover partially forgotten information, sharing experiences with friends or family, telling the story of one’s life, clinical applications for the memory impaired, and fundamental psychological investigations of memory. The Centre for Digital Video Processing (CDVP) at Dublin City University is currently engaged in the collection and exploration of applications of large PLs. We are collecting rich archives of daily life including textual and visual materials and contextual data. An important part of this work is to consider how the effectiveness of our ideas can be measured in terms of metrics and experimental design. While these studies have considerable similarity with traditional evaluation activities in areas such as information retrieval and summarization, the characteristics of PLs mean that new challenges and questions emerge. We are currently exploring these issues through a series of pilot studies and questionnaires. Our initial results indicate that there are many research questions to be explored and that the relationships between personal memory, context and content for these tasks are complex and fascinating.
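    To make the heterogeneity described above concrete, the sketch below shows one possible way to model a single PL item that unifies a content reference with contextual sensor data. The field names and types are illustrative assumptions, not the CDVP's actual schema.

```python
# Illustrative data model for one personal lifelog (PL) item; all fields are
# assumptions for this sketch, not a documented schema.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class LifelogItem:
    timestamp: datetime                     # when the item was captured
    source: str                             # "email", "web", "photo", "sms", ...
    content_ref: str                        # path or URI of the underlying content
    location: Optional[Tuple[float, float]] = None   # (latitude, longitude) from GPS
    people_present: List[str] = field(default_factory=list)  # e.g. Bluetooth device IDs
    heart_rate: Optional[float] = None      # physiological state, if a sensor was worn
    skin_temperature: Optional[float] = None
```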

    Examining the utility of affective response in search of personal lifelogs

    Personal lifelog archives contain digital records captured from an individual’s daily life, for example emails, documents edited, webpages downloaded and photographs taken. While capturing this information is becoming increasingly easy, subsequently locating interesting items within these archives is a significant challenge. One potential source of information for identifying items of importance to an individual is their affective state during the capture of the information. The strength of an individual’s affective response to their current situation can often be gauged from their physiological response. For this study we explored the utility of the following biometric features for indicating significant items: galvanic skin response (GSR), heart rate (HR) and skin temperature (ST). Significant or important events tend to raise an individual’s arousal level, causing a measurable biometric response. We examined the utility of using biometric response to identify significant items and to re-rank traditional information retrieval (IR) result sets. The results obtained indicate that skin temperature is the most useful of these signals for extracting interesting items from personal archives containing passively captured images, computer activity and SMS messages.
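    The re-ranking idea described above can be illustrated with a small sketch that linearly combines an item's original IR score with a normalised biometric signal recorded at capture time. The weighting, the min-max normalisation and the use of a raw skin-temperature reading as the arousal proxy are all assumptions made for illustration; in practice the signal would need per-user calibration.

```python
# Hedged sketch of biometric re-ranking: boost IR scores for items whose
# capture coincided with a stronger (normalised) biometric signal.
from typing import Dict, List, Tuple

def normalise(values: Dict[str, float]) -> Dict[str, float]:
    """Min-max normalise per-item biometric readings to [0, 1]."""
    if not values:
        return {}
    lo, hi = min(values.values()), max(values.values())
    span = (hi - lo) or 1.0
    return {k: (v - lo) / span for k, v in values.items()}

def rerank(ir_results: List[Tuple[str, float]],
           biometric_at_capture: Dict[str, float],
           beta: float = 0.3) -> List[Tuple[str, float]]:
    """Linearly combine the original IR score with the biometric signal."""
    signal = normalise(biometric_at_capture)
    rescored = [
        (item_id, (1 - beta) * ir_score + beta * signal.get(item_id, 0.0))
        for item_id, ir_score in ir_results
    ]
    return sorted(rescored, key=lambda pair: pair[1], reverse=True)
```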

    Multiple multimodal mobile devices: Lessons learned from engineering lifelog solutions

    For lifelogging, or the recording of one’s life history through digital means, to be successful, a range of separate multimodal mobile devices must be employed. These include smartphones such as the N95, the Microsoft SenseCam (a wearable passive photo capture device) and wearable biometric devices. Each collects a facet of the bigger picture, through, for example, personal digital photos, mobile messages and document access history, but unfortunately they operate independently and unaware of each other. This creates significant challenges for the practical application of these devices, the use and integration of their data, and their operation by a user. In this chapter we discuss the software engineering challenges, and their implications for individuals working on the integration of data from multiple ubiquitous mobile devices, drawing on our experience of working with such technology over the past several years to develop integrated personal lifelogs. The chapter serves as an engineering guide for those considering working in the domain of lifelogging and, more generally, for those working with multiple multimodal devices and the integration of their data.
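    One recurring integration task implied above is bringing the independent per-device records into a single time-ordered archive. The sketch below merges already time-sorted device streams by capture timestamp; the record layout and device names are assumptions for illustration only.

```python
# Minimal sketch: merge per-device lifelog streams (SenseCam photos, phone
# messages, biometric logs) into one time-ordered sequence. The tuple layout
# is an assumption for this sketch.
import heapq
from datetime import datetime
from typing import Iterable, Iterator, Tuple

Record = Tuple[datetime, str, dict]   # (capture time, device name, payload)

def merge_streams(*streams: Iterable[Record]) -> Iterator[Record]:
    """Merge per-device streams, each already sorted by capture time."""
    return heapq.merge(*streams, key=lambda rec: rec[0])

# Usage: for record in merge_streams(sensecam_records, phone_records): ...
```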

    Overview of the NTCIR-14 Lifelog-3 task

    Lifelog-3 was the third instance of the lifelog task at NTCIR. At NTCIR-14, the Lifelog-3 task explored three different lifelog data access challenges: the search challenge, the annotation challenge and the insights challenge. In this paper we review the activities of the participating teams who took part in the challenges and we suggest next steps for the community.

    Overview of NTCIR-13 Lifelog-2 Task

    In this paper we review the Lifelog-2 core task, which ran at NTCIR-13. We outline the test collection employed, along with the tasks, the submissions and the findings from this pilot task. We finish by suggesting future plans for the task.