231 research outputs found

    Layered evaluation of interactive adaptive systems: framework and formative methods


    User Controllability in a Hybrid Recommender System

    Since the introduction of Tapestry in 1990, research on recommender systems has traditionally focused on the development of algorithms whose goal is to increase the accuracy of predicting users’ taste based on historical data. In the last decade, this research has diversified, with human factors being one area that has received increased attention. Users’ characteristics, such as trusting propensity and interest in a domain, or systems’ characteristics, such as explainability and transparency, have been shown to have an effect on improving the user experience with a recommender. This dissertation investigates the role of controllability and user characteristics in the engagement and experience of users of a hybrid recommender system. A hybrid recommender is a system that integrates the results of different algorithms to produce a single set of recommendations. This research examines whether allowing the user to control the process of fusing or integrating different algorithms (i.e., different sources of relevance) results in increased engagement and a better user experience. The essential contribution of this dissertation is an extensive study of controllability in a hybrid fusion scenario. In particular, the introduction of an interactive Venn diagram visualization, combined with sliders explored in previous work, can provide an efficient visual paradigm for information filtering with a hybrid recommender that fuses different sources of relevance with overlapping recommended items. This dissertation also provides a three-fold evaluation of the user experience: objective metrics, subjective user perception, and behavioral measures.
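
    The fusion step described above can be pictured as a user-weighted linear blend of per-source relevance scores, with each slider setting a source's weight. The following is a minimal sketch under that assumption; the function and source names are illustrative, not taken from the dissertation.

```python
# Minimal sketch of slider-weighted hybrid fusion (illustrative assumption,
# not the dissertation's implementation).
from collections import defaultdict

def fuse_recommendations(source_scores, slider_weights, top_n=10):
    """Combine per-source relevance scores into a single ranked list.

    source_scores:  {source_name: {item_id: relevance score}}
    slider_weights: {source_name: weight set by the user's slider}
    """
    fused = defaultdict(float)
    for source, scores in source_scores.items():
        weight = slider_weights.get(source, 0.0)
        for item, score in scores.items():
            fused[item] += weight * score  # user-controlled linear blend
    return sorted(fused, key=fused.get, reverse=True)[:top_n]

# Example: three sources of relevance; the user boosts content-based relevance.
sources = {
    "content":       {"p1": 0.9, "p2": 0.4},
    "collaborative": {"p2": 0.8, "p3": 0.7},
    "social":        {"p1": 0.2, "p3": 0.6},
}
print(fuse_recommendations(sources, {"content": 0.7, "collaborative": 0.2, "social": 0.1}))
```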

    User-controllable personalization: A case study with SetFusion

    In this research we investigated the role of user controllability in personalized systems by implementing and studying a novel interactive recommender interface, SetFusion. We examined whether allowing the user to control the process of fusing or integrating different algorithms (i.e., different sources of relevance) resulted in increased engagement and a better user experience. The essential contribution of this research stems from the results of a user study (N=40) of controllability in a scenario where users could fuse different recommendation approaches, with the possibility of inspecting and filtering the recommended items. First, we introduce an interactive Venn diagram visualization, which, combined with sliders, can provide an efficient visual paradigm for information filtering. Second, we provide a three-fold evaluation of the user experience: objective metrics, subjective user perception, and behavioral measures. Through the analysis of these metrics, we confirmed results from recent studies, such as the effect of trusting propensity on accepting recommendations, and also unveiled the importance of features such as being a native speaker. Our results present several implications for the design and implementation of user-controllable personalized systems.
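
    As a sketch of the set logic such a Venn diagram rests on, the snippet below groups recommended items by the subset of sources that returned them, so that selecting a region of the diagram filters the list to that intersection. The names and data are illustrative, not the SetFusion implementation.

```python
# Minimal sketch of grouping recommendations by the sources that returned them,
# as an interactive Venn diagram would require (illustrative, not SetFusion's code).
from collections import defaultdict

def venn_regions(source_items):
    """source_items: {source_name: set of item_ids} -> {frozenset of sources: items}."""
    regions = defaultdict(set)
    for item in set().union(*source_items.values()):
        membership = frozenset(s for s, items in source_items.items() if item in items)
        regions[membership].add(item)
    return regions

regions = venn_regions({
    "content":       {"p1", "p2"},
    "collaborative": {"p2", "p3"},
    "social":        {"p1", "p2", "p3", "p4"},
})
# Clicking the region where all three sources overlap filters to those items.
print(regions[frozenset({"content", "collaborative", "social"})])  # {'p2'}
```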

    Interactive Explanation with Varying Level of Details in an Explainable Scientific Literature Recommender System

    Explainable recommender systems (RS) have traditionally followed a one-size-fits-all approach, delivering the same level of explanation detail to each user, without considering their individual needs and goals. Further, explanations in RS have so far been presented mostly in a static and non-interactive manner. To fill these research gaps, we aim in this paper to adopt a user-centered, interactive explanation model that provides explanations with different levels of detail and empowers users to interact with, control, and personalize the explanations based on their needs and preferences. We followed a user-centered approach to design interactive explanations with three levels of detail (basic, intermediate, and advanced) and implemented them in the transparent Recommendation and Interest Modeling Application (RIMA). We conducted a qualitative user study (N=14) to investigate the impact of providing interactive explanations with varying levels of detail on the users' perception of the explainable RS. Our study showed qualitative evidence that fostering interaction and giving users control in deciding which explanation they would like to see can meet the demands of users with different needs, preferences, and goals, and consequently can have positive effects on crucial aspects of explainable recommendation, including transparency, trust, satisfaction, and user experience.
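
    One way to realize explanations at three levels of detail is to generate progressively richer text from the same interest-overlap signal. The sketch below illustrates that idea; the field names and wording are assumptions and do not reproduce RIMA's actual interface.

```python
# Minimal sketch of serving explanations at basic/intermediate/advanced detail
# (illustrative assumption; not RIMA's actual explanation generation).
EXPLANATION_LEVELS = ("basic", "intermediate", "advanced")

def build_explanation(item, user_interests, level="basic"):
    overlap = sorted(set(item["keywords"]) & set(user_interests))
    if level == "basic":
        return f"Recommended because it matches your interests: {', '.join(overlap)}."
    if level == "intermediate":
        return (f"'{item['title']}' shares {len(overlap)} keyword(s) with your interest "
                f"model ({', '.join(overlap)}); items with more overlap rank higher.")
    # advanced: expose the underlying keyword scores so the user can inspect them
    scores = {k: item["keyword_scores"].get(k, 0.0) for k in overlap}
    return f"Keyword-level contributions to the score of '{item['title']}': {scores}"

paper = {"title": "Explaining Recommendations",
         "keywords": ["explainability", "recommender systems"],
         "keyword_scores": {"explainability": 0.8, "recommender systems": 0.5}}
for level in EXPLANATION_LEVELS:
    print(level, "->", build_explanation(paper, ["explainability", "user modeling"], level))
```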

    Supporting Personalized Music Exploration through a Genre Exploration Recommender

    Recommender systems have largely focused on the task of predicting users' current preferences and finding the most relevant items that users currently like. However, this approach is not sufficient, as users may want to explore and develop new preferences, for example about a new genre. Allowing users to explore new preferences has many advantages, such as helping users stay away from so-called "filter bubbles", supporting new preference exploration and development, and promoting under-explored niche tastes in addition to mainstream preferences. Therefore, in this dissertation, we explore how recommender systems can be leveraged to support users' new preference exploration in the context of music genre exploration. The research takes a multidisciplinary approach in which we explore music recommendation algorithms and interactive exploration interface design for supporting music genre exploration, paired with insights from individuals' music preference evolution and theories on decision making (such as digital nudges). For this purpose, we propose a music genre exploration tool and refine it over subsequent studies. We evaluate the tool with multiple single-session user-centric studies and one longitudinal user study on its long-term effectiveness in driving new preference exploration, using various types of users' objective behavior and their subjective user experience. From the studies, we find that users perceived the music genre exploration tool to be a new and helpful way to explore and develop new music tastes. By allowing users to make trade-offs between their current preferences and the new music genre they want to explore, the tool helps users take an easy, personalized first step out of their comfort zone and towards the new preferences. The newly designed interactive exploration interface improves the usability and helpfulness of genre exploration by improving transparency, controllability, and understandability. We further investigate individual differences during musical preference evolution by checking individuals' musical preference consistency and identify a relevant personal factor associated with this consistency (i.e., musical expertise). Our findings suggest that users with different musical expertise tend to show different musical exploration behavior. We further enhance the exploration tool with digital nudges to see whether they can promote more exploration from users and, based on insights on individual differences, how this differs among individuals with different expertise levels. Based on our findings, we discuss opportunities and implications for future recommender systems to support new preference exploration and development.
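
    The trade-off between current preferences and the genre to explore can be modeled as a single blending weight over feature vectors, which is roughly the kind of control such a tool exposes. The sketch below is an assumption for illustration, not the tool's actual algorithm.

```python
# Minimal sketch of blending current taste with a new genre's centroid and
# ranking tracks by closeness to the blended target (illustrative assumption).
import numpy as np

def exploration_target(user_profile, genre_centroid, exploration=0.3):
    """exploration=0 stays with current taste; exploration=1 moves fully to the new genre."""
    return (1 - exploration) * user_profile + exploration * genre_centroid

def recommend(tracks, target, top_n=5):
    """tracks: {track_id: feature vector}; rank by Euclidean distance to the target."""
    return sorted(tracks, key=lambda t: np.linalg.norm(tracks[t] - target))[:top_n]

user_profile = np.array([0.8, 0.2, 0.5])   # e.g. danceability, acousticness, energy
jazz_centroid = np.array([0.3, 0.7, 0.4])  # centroid of the genre to explore
tracks = {f"track_{i}": np.random.rand(3) for i in range(20)}
print(recommend(tracks, exploration_target(user_profile, jazz_centroid, exploration=0.3)))
```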

    Making Filter Bubbles Understandable

    Recommender systems tend to create filter bubbles and, as a consequence, lower diversity exposure, often without the user being aware of it. The biased preselection of content by recommender systems has called for approaches to deal with exposure diversity, such as giving users control over their filter bubble. We analyze how to make filter bubbles understandable and controllable by using interactive word clouds, following the idea of building trust in the system. On the basis of several prototypes, we performed exploratory research on how to design word clouds for the controllability of filter bubbles. Our findings can inform designers of interactive filter bubbles in personalized offers of broadcasters, publishers, and media houses.
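
    An interactive word cloud of this kind can be backed by simple term weights computed over the recommended content, which the user then adjusts to re-rank the feed. The sketch below illustrates that loop under assumed names; it does not reproduce the paper's prototypes.

```python
# Minimal sketch of a word cloud that summarizes the current bubble and lets
# the user re-weight terms to re-rank items (illustrative, not the prototypes).
from collections import Counter

def cloud_weights(items):
    """items: {item_id: list of content terms} -> term frequencies for the cloud."""
    return Counter(term for terms in items.values() for term in terms)

def rerank(items, user_weights):
    """Score each item by the user-adjusted weights of its terms."""
    score = lambda terms: sum(user_weights.get(t, 1.0) for t in terms)
    return sorted(items, key=lambda i: score(items[i]), reverse=True)

items = {"a": ["politics", "election"], "b": ["sports", "football"], "c": ["politics", "economy"]}
weights = dict(cloud_weights(items))  # displayed to the user as a word cloud
weights["politics"] = 0.1             # the user shrinks an over-represented term
print(rerank(items, weights))         # ['b', 'a', 'c']
```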

    Explanations in Music Recommender Systems in a Mobile Setting

    Every day, millions of users utilize their mobile phones to access music streaming services such as Spotify. However, these 'black boxes' seldom provide adequate explanations for their music recommendations. A systematic literature review revealed that there is a strong relationship between moods and music, and that explanations and interface design choices can affect how people perceive recommendations just as much as algorithm accuracy. However, little seems to be known about how to apply user-centric design approaches, which exploit affective information to present explanations, to mobile devices. In order to bridge these gaps, the work of Andjelkovic, Parra, & O'Donovan (2019) was extended and applied as non-interactive designs in a mobile setting. Three separate Amazon Mechanical Turk studies asked participants to compare the same three interface designs: baseline, textual, and visual (n=178). Each survey displayed a different playlist with either low, medium, or high music popularity. Results indicate that music familiarity may or may not influence the need for explanations, but explanations are important to users. Both explanatory designs fared equally well, and better than the baseline, and the use of affective information may help systems become more efficient, transparent, trustworthy, and satisfactory. Overall, there does not seem to be a 'one design fits all' solution for explanations in a mobile setting.

    Controllability and explainability in a hybrid social recommender system

    The growth in artificial intelligence (AI) technology has advanced many human-facing applications. Recommender systems are one of the most promising sub-domains of AI-driven applications, aiming to predict items or ratings based on user preferences. These systems are empowered by large-scale data and automated inference methods that bring useful but puzzling suggestions to the users. That is, the output is usually unpredictable and opaque, which may lead to user perceptions of the system that are confusing, frustrating, or even dangerous in many life-changing scenarios. Adding controllability and explainability are two promising approaches to improve human interaction with AI. However, the varying capability of AI-driven applications makes conventional design principles less useful. This brings tremendous opportunities as well as challenges for user interface and interaction design, which has been discussed in the human-computer interaction (HCI) community for over two decades. The goal of this dissertation is to build a framework for AI-driven applications that enables people to interact effectively with the system as well as interpret its output. Specifically, this dissertation explores how to bring controllability and explainability to a hybrid social recommender system, including several attempts at designing user-controllable and explainable interfaces that allow users to fuse multi-dimensional relevance and request explanations of the received recommendations. This work contributes to the HCI field by providing design implications for enhancing human-AI interaction and improving the transparency of AI-driven applications.
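
    Pairing user control with explainability can be as simple as reporting, on request, how much each relevance dimension (weighted by the user's own settings) contributed to a recommendation. The snippet below is a sketch under that assumption; the dimension names are illustrative, not the dissertation's system.

```python
# Minimal sketch of an on-demand explanation that reports per-dimension
# contributions under the user's current weights (illustrative assumption).
def explain(item_id, source_scores, slider_weights):
    parts = []
    for source, scores in source_scores.items():
        contribution = slider_weights.get(source, 0.0) * scores.get(item_id, 0.0)
        if contribution > 0:
            parts.append(f"{source} contributed {contribution:.2f}")
    return f"'{item_id}' was recommended because: " + "; ".join(parts) + "."

sources = {"publication similarity": {"alice": 0.9},
           "co-authorship":          {"alice": 0.4, "bob": 0.7},
           "topic overlap":          {"bob": 0.6}}
print(explain("alice", sources, {"publication similarity": 0.5,
                                 "co-authorship": 0.3,
                                 "topic overlap": 0.2}))
```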

    User Feedback in Controllable and Explainable Social Recommender Systems: a Linguistic Analysis

    Controllable and explainable intelligent user interfaces have been used to provide transparent recommendations. Many researchers have explored interfaces that support user control and provide explanations of the recommendation process and models. To extend these works to real-world decision-making scenarios, we need to further understand users' mental models of the enhanced system components. In this paper, we take a step in this direction by investigating the free-form feedback left by users of social recommender systems to specify their reasons for selecting prompted social recommendations. With a user study involving 50 subjects (N=50), we present the linguistic changes in using controllable and explainable interfaces for a social information-seeking task. Based on our findings, we discuss design implications for controllable and explainable recommender systems.
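
    A linguistic analysis of such free-form feedback can start from simple per-condition counts of word categories (e.g., first-person pronouns or explanation-related terms). The sketch below shows that kind of counting with made-up categories and feedback; it is not the study's analysis pipeline.

```python
# Minimal sketch of counting word categories in free-form feedback
# (illustrative categories and data, not the study's pipeline).
from collections import Counter
import re

CATEGORIES = {
    "first_person": {"i", "my", "me"},
    "explanation":  {"because", "since", "reason"},
}

def category_counts(feedback_texts):
    counts = Counter()
    for text in feedback_texts:
        for token in re.findall(r"[a-z']+", text.lower()):
            for category, words in CATEGORIES.items():
                if token in words:
                    counts[category] += 1
    return counts

print(category_counts(["I picked her because my interests overlap with hers.",
                       "Recommended since we share co-authors."]))
```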
