
    VieLens: an interactive search engine for LSC2019

    With the appearance of many wearable devices such as smartwatches, recording glasses (such as Google Glass), and smartphones, digital personal profiles have become readily available. However, searching and navigating these multi-source, multi-modal, and often unstructured data to extract useful information remains a challenging task. The LSC2019 competition was therefore organized so that researchers can demonstrate novel search engines, as well as exchange ideas and collaborate on these types of problems. In this paper we present our approach to supporting interactive search of lifelog data with VieLens, an interactive retrieval system enhanced by natural language processing techniques to extend and improve search results, mainly in the context of a user's activities in their daily life.

    Exquisitor at the Lifelog Search Challenge 2020

    We present an enhanced version of Exquisitor, our interactive and scalable media exploration system. At its core, Exquisitor is an interactive learning system that uses relevance feedback on media items to build a model of the user's information need. Relying on efficient media representation and indexing, it facilitates real-time user interaction. The new features for the Lifelog Search Challenge 2020 include support for timeline browsing, search functionality for finding positive examples, and significant interface improvements. Participation in the Lifelog Search Challenge allows us to compare our paradigm, which relies predominantly on interactive learning, with more traditional search-based multimedia retrieval systems.
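The relevance-feedback loop described in this abstract can be illustrated with a minimal, hypothetical sketch (this is not Exquisitor's actual code, and the function names are illustrative): feedback on positive and negative items yields a linear preference vector, which then re-ranks the remaining collection.

```python
# Hedged sketch of relevance-feedback re-ranking, not the system's implementation.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def preference_vector(positives, negatives):
    """Difference between the mean positive and mean negative feature vectors."""
    dim = len(positives[0])
    pos_mean = [sum(v[i] for v in positives) / len(positives) for i in range(dim)]
    neg_mean = [sum(v[i] for v in negatives) / len(negatives) for i in range(dim)]
    return [p - n for p, n in zip(pos_mean, neg_mean)]

def rerank(items, positives, negatives):
    """Order (item_id, feature_vector) pairs by score under the learned preference."""
    w = preference_vector(positives, negatives)
    return sorted(items, key=lambda kv: dot(kv[1], w), reverse=True)
```

Each round of user feedback updates the preference vector, so the ranking sharpens as the session proceeds; real systems use richer learners (e.g. linear SVMs) over indexed features to keep this interactive at scale.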

    Introduction to the Sixth Annual Lifelog Search Challenge, LSC’23

    For the sixth time since 2018, the Lifelog Search Challenge (LSC) was organized as a comparative benchmarking exercise for various interactive lifelog search systems. The goal of this international competition is to test system capabilities to access large multimodal lifelogs. LSC'23 attracted twelve participating teams, each of whom had developed a competitive interactive lifelog retrieval system. The benchmark was organized in front of a live audience at the LSC workshop at ACM ICMR'23. As in previous editions, this introductory paper presents the LSC workshop and introduces the participating lifelog search systems.

    Myscéal: an experimental interactive lifelog retrieval system for LSC'20

    The Lifelog Search Challenge (LSC) is an annual comparative benchmarking activity for comparing approaches to interactive retrieval from multi-modal lifelogs. In an interactive search challenge, retrieval accuracy, search speed, and interface usability are key issues that every participant must address. In this paper, we introduce Myscéal, an interactive lifelog retrieval engine designed to support novice users in retrieving items of interest from a large multimodal lifelog. Additionally, we introduce a new similarity measure called “aTFIDF” to match a user's free-text information need with the multimodal lifelog index.
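The abstract does not define aTFIDF, so as background only, here is a minimal sketch of the plain TF-IDF weighting it builds on (a hypothetical illustration, not the paper's measure): documents are scored by how often they contain query terms, discounted by how common each term is across the collection.

```python
# Plain TF-IDF scoring sketch for free-text matching (illustrative only;
# the paper's "aTFIDF" is a variant whose details are not given here).
import math
from collections import Counter

def tfidf_scores(query_terms, docs):
    """Score each document (a list of terms) against the query terms."""
    n = len(docs)
    df = Counter()                      # document frequency per term
    for doc in docs:
        df.update(set(doc))
    scores = []
    for doc in docs:
        tf = Counter(doc)
        score = sum(
            (tf[t] / len(doc)) * math.log((n + 1) / (df[t] + 1))  # smoothed IDF
            for t in query_terms if t in tf
        )
        scores.append(score)
    return scores
```

Terms that appear in most documents contribute little, so distinctive query terms dominate the ranking, which is what makes the family of TF-IDF measures a natural fit for free-text lifelog queries.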

    Myscéal 2.0: a revised experimental interactive lifelog retrieval system for LSC'21

    Building an interactive retrieval system for lifelogging poses many challenges due to the massive amount of multi-modal personal data, in addition to the requirements of accuracy and rapid response for such a tool. The Lifelog Search Challenge (LSC) is an international lifelog retrieval competition that inspires researchers to develop systems that cope with these challenges and evaluates the effectiveness of their solutions. In this paper, we upgrade our previous Myscéal system and present Myscéal 2.0 for LSC'21, with improved features inspired by experiments with novice users. The experiments show that a novice user achieved, on average, more than half of an expert's score. To narrow this gap, some potential enhancements were identified and integrated into the enhanced version.

    Memento: a prototype lifelog search engine for LSC’21

    In this paper, we introduce a new lifelog retrieval system called Memento that leverages semantic representations of images and textual queries projected into a common latent space to facilitate effective retrieval. It bridges the semantic gap between complex visual scenes/events and user information needs expressed as textual and faceted queries. The system, developed for the 2021 Lifelog Search Challenge, also has a minimalist user interface that includes primary search, temporal search, and visual data filtering components.
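Retrieval in a common latent space, as described above, reduces at query time to nearest-neighbour search between embeddings. The following is a minimal, hypothetical sketch (not Memento's code; the embeddings are assumed to be precomputed by some joint text-image model):

```python
# Sketch of ranking images by cosine similarity to a query embedding,
# assuming both were projected into the same latent space upstream.
import math

def cosine(u, v):
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return sum(a * b for a, b in zip(u, v)) / (nu * nv) if nu and nv else 0.0

def retrieve(query_vec, image_vecs, k=3):
    """Return the ids of the k images most similar to the query embedding."""
    ranked = sorted(image_vecs.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [img_id for img_id, _ in ranked[:k]]
```

Production systems replace the linear scan with an approximate nearest-neighbour index, but the scoring idea is the same.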

    Graph-based indexing and retrieval of lifelog data

    Understanding the relationships between objects in an image is an important challenge because they can help to describe actions in the image. In this paper, a graphical data structure named “Scene Graph” is utilized to represent an encoded, informative visual relationship graph for an image, which we suggest has a wide range of potential applications. This scene graph is applied and tested in the popular domain of lifelogs, and specifically in the challenge of known-item retrieval from lifelogs. In this work, every lifelog image is represented by a scene graph, and at retrieval time, this scene graph is compared with a semantic graph parsed from the textual query. The result is combined with location or date information to determine the matching items. The experiment shows that this technique can outperform a conventional method.
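The comparison between an image's scene graph and the query's semantic graph can be sketched in a simplified, hypothetical form (not the paper's actual matching function): represent each graph as (subject, relation, object) triples and score an image by the fraction of query triples it contains.

```python
# Hedged sketch of scene-graph vs. query-graph matching over relation triples.
# Real systems use softer matching (synonyms, partial triples, graph embeddings).

def graph_overlap(image_graph, query_graph):
    """Fraction of query (subject, relation, object) triples present in the image graph."""
    if not query_graph:
        return 0.0
    image_triples = set(image_graph)
    hits = sum(1 for triple in query_graph if triple in image_triples)
    return hits / len(query_graph)
```

A retrieval engine would compute this score for every candidate image, then intersect the top results with the query's location or date constraints, as the abstract describes.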

    Flexible interactive retrieval SysTem 3.0 for visual lifelog exploration at LSC 2022

    Building a retrieval system with lifelogging data is more complicated than with ordinary data due to its redundancy, blurriness, massive volume, the various sources of information that accompany it, and especially the ad-hoc nature of queries. The Lifelog Search Challenge (LSC) is a benchmarking challenge that encourages researchers and developers to push the boundaries of lifelog retrieval. For LSC'22, we develop FIRST 3.0, a novel and flexible system that leverages expressive cross-domain embeddings to enhance the searching process. Our system aims to adaptively capture the semantics of an image at different levels of detail. We also propose to augment our system with an external search engine that supplies initial visual examples for unfamiliar concepts. Finally, we organize image data in hierarchical clusters based on their visual similarity and location to assist users in data exploration. Experiments show that our system is both fast and effective in handling various retrieval scenarios.
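The two-level organization mentioned above (grouping by location, then by visual similarity) can be sketched with a minimal, hypothetical example (not FIRST 3.0's actual code; the threshold and greedy assignment are illustrative):

```python
# Sketch of hierarchical grouping: images are bucketed by location first,
# then greedily clustered by cosine similarity of their visual embeddings.
import math
from collections import defaultdict

def cosine(u, v):
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return sum(a * b for a, b in zip(u, v)) / (nu * nv) if nu and nv else 0.0

def cluster_images(images, sim_threshold=0.8):
    """images: (img_id, location, embedding) triples -> {location: [[img_id, ...], ...]}."""
    by_location = defaultdict(list)
    for img_id, location, vec in images:
        by_location[location].append((img_id, vec))
    hierarchy = {}
    for location, members in by_location.items():
        clusters = []                      # each cluster keyed by its first member
        for img_id, vec in members:
            for cluster in clusters:
                if cosine(vec, cluster[0][1]) >= sim_threshold:
                    cluster.append((img_id, vec))
                    break
            else:
                clusters.append([(img_id, vec)])
        hierarchy[location] = [[i for i, _ in c] for c in clusters]
    return hierarchy
```

Greedy single-pass clustering keeps the sketch short; a real system would likely use a proper hierarchical clustering algorithm over the same similarity signal.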