
    Designing an Adaptive Web Navigation Interface for Users with Variable Pointing Performance

    Many online services and products require users to point at and interact with user interface elements. For individuals whose pointing ability varies due to physical impairments, environmental factors, or age, using an input device (e.g., a computer mouse) to select elements on a website can be difficult. Adaptive user interfaces dynamically change their functionality in response to user behavior. They can support individuals with variable pointing abilities by 1) adapting dynamically to make element selection easier when a user is experiencing pointing difficulties, and 2) informing users about these pointing errors. While adaptive interfaces are increasingly prevalent on the Web, little is known about the preferences and expectations of users with variable pointing abilities, or about how to design systems that dynamically support them given these preferences. We conducted an investigation with 27 individuals who intermittently experience pointing problems to inform the design of an adaptive interface for web navigation. We used a functional high-fidelity prototype as a probe to gather information about user preferences and expectations. Our participants expected the system to recognize and integrate their preferences for how pointing tasks were carried out, preferred to receive information about system functionality, and wanted to be in control of the interaction. We used findings from the study to inform the design of an adaptive web navigation interface, PINATA, which tracks user pointing performance over time and provides dynamic notifications and assistance tailored to their specifications. Our work contributes to a better understanding of users' preferences and expectations for the design of an adaptive pointing system.
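
    To make the adaptation loop concrete, the sketch below shows one way such a monitor could work. It is a minimal illustration, not PINATA's published implementation: the class name, window size, and threshold are all assumptions, but the shape, tracking recent pointing errors and offering assistance only when a user-set threshold is crossed, mirrors the preference for user control reported in the study.

        from collections import deque

        class PointingMonitor:
            """Track recent pointing outcomes and flag sustained difficulty."""

            def __init__(self, window=20, error_threshold=0.3, assist_enabled=True):
                # All three settings are user-configurable, reflecting the
                # participants' wish to stay in control of the interaction.
                self.outcomes = deque(maxlen=window)  # True = missed/slipped click
                self.error_threshold = error_threshold
                self.assist_enabled = assist_enabled

            def record(self, missed):
                self.outcomes.append(missed)

            def error_rate(self):
                return sum(self.outcomes) / len(self.outcomes) if self.outcomes else 0.0

            def should_assist(self):
                # Adapt (e.g., enlarge targets, notify) only with enough
                # evidence and only if the user has opted in.
                return (self.assist_enabled
                        and len(self.outcomes) == self.outcomes.maxlen
                        and self.error_rate() >= self.error_threshold)

        monitor = PointingMonitor()
        for missed in [False, True, True, False] * 5:
            monitor.record(missed)
        if monitor.should_assist():
            print(f"offer assistance (recent error rate {monitor.error_rate():.0%})")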

    Analyzing User Behavior Patterns in Adaptive Exploratory Search Systems with LifeFlow

    Adaptive exploratory search is a method that can provide user-centered, personalized search results by incorporating interactive user interfaces. Analyzing the user behavior patterns of these systems can be complicated when they support transparent and controllable open user models. This paper suggests using a visualization tool to address the problem, as a complement to typical statistical analysis. By adopting an event sequence visualization tool called LifeFlow, we were able to easily identify interesting user behavior patterns, especially regarding open user model exploration.
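
    As a rough illustration of the aggregation behind a LifeFlow-style view (this is not LifeFlow's code, and the event names are invented), the sketch below counts shared prefixes of event sequences, which is the basis for collapsing many sessions into one tree of common paths.

        from collections import Counter

        # One event sequence per session, as logged by the search system.
        sessions = [
            ("search", "open_user_model", "adjust_weights", "search"),
            ("search", "open_user_model", "search"),
            ("search", "results", "search"),
        ]

        # Count shared prefixes: the aggregation that lets a LifeFlow-style
        # tree collapse many sessions into common paths.
        prefix_counts = Counter()
        for seq in sessions:
            for i in range(1, len(seq) + 1):
                prefix_counts[seq[:i]] += 1

        # Print the aggregated tree, indenting by depth.
        for prefix, count in sorted(prefix_counts.items()):
            print("  " * (len(prefix) - 1) + f"{prefix[-1]} ({count} sessions)")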

    Studying Interaction Methodologies in Video Retrieval

    So far, several approaches have been studied to bridge the Semantic Gap, the bottleneck in image and video retrieval. However, no approach has been successful enough to improve retrieval performance significantly. One reason is a lack of understanding of the user's interest, a major precondition for adapting results to a user. This is partly due to the lack of appropriate interfaces and the missing knowledge of how to interpret users' actions with these interfaces. In this paper, we propose to study the importance of various implicit indicators of relevance. Furthermore, we propose to investigate how this implicit feedback can be combined with static user profiles towards an adaptive video retrieval model.
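
    The proposed model can be pictured as a weighted combination of implicit indicators and a static profile score. The sketch below is only a guess at that combination: the indicator names, weights, and mixing factor are placeholders for exactly the quantities such a study would estimate.

        # Hypothetical implicit indicators observed for one video clip.
        indicators = {"play_time_ratio": 0.8, "replays": 2, "bookmarked": 1}

        # Assumed per-indicator weights: estimating these importances is
        # precisely what the proposed study is about.
        weights = {"play_time_ratio": 0.5, "replays": 0.2, "bookmarked": 0.3}

        def implicit_relevance(ind, w):
            return sum(w[k] * ind.get(k, 0) for k in w)

        profile_score = 0.6  # static profile's interest in the clip's topic
        alpha = 0.7          # mix between implicit evidence and the profile

        score = alpha * implicit_relevance(indicators, weights) \
                + (1 - alpha) * profile_score
        print(f"adaptive relevance score: {score:.2f}")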

    Engineering Adaptive Model-Driven User Interfaces

    Very large-scale software applications can encompass hundreds of complex user interfaces (UIs). Such applications are commonly sold as feature-bloated off-the-shelf products to be used by people with varying needs in terms of required features and layout preferences. Although many UI adaptation approaches have been proposed, several gaps and limitations, including extensibility and integration into legacy systems, still need to be addressed in state-of-the-art adaptive UI development systems. This paper presents Role-Based UI Simplification (RBUIS) as a mechanism for increasing usability through adaptive behaviour, by providing end-users with a minimal feature-set and an optimal layout based on the context-of-use. RBUIS uses an interpreted runtime model-driven approach based on the Cedar Architecture and is supported by an integrated development environment (IDE), Cedar Studio. RBUIS was evaluated by integrating it into OFBiz, an open-source ERP system. The integration method was assessed by establishing and applying technical metrics. Afterwards, a usability study was carried out to evaluate whether UIs simplified with RBUIS improve on their initial counterparts. This study used questionnaires, task completion times, output quality checks, and eye-tracking. The results showed that UIs simplified with RBUIS significantly improve end-user efficiency, effectiveness, and perceived usability.
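
    The core of RBUIS, assigning each role a minimal feature-set, can be sketched as a simple filter. The roles, features, and widgets below are invented, and Cedar's model-driven runtime is considerably richer (it also optimises layout), but the sketch shows the basic simplification step:

        # Invented role-to-feature mapping for an ERP order-entry screen.
        ROLE_FEATURES = {
            "cashier": {"new_order", "payment"},
            "manager": {"new_order", "payment", "discounts", "reports", "refunds"},
        }

        ALL_WIDGETS = [
            ("new_order", "button"), ("payment", "button"),
            ("discounts", "panel"), ("reports", "tab"), ("refunds", "button"),
        ]

        def simplify_ui(role):
            """Keep only the widgets this role needs (the layout-optimisation
            half of RBUIS is omitted here)."""
            allowed = ROLE_FEATURES.get(role, set())
            return [w for w in ALL_WIDGETS if w[0] in allowed]

        print(simplify_ui("cashier"))  # minimal feature-set for cashiers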

    Engineering adaptive user interfaces using monitoring-oriented programming

    User interfaces that adapt based on usage patterns, for example the frequency of use of certain features, have been proposed as a means of limiting the complexity of the user interface without specialising it unnecessarily to particular user profiles. From a software engineering perspective, however, adaptive user interfaces pose a challenge in code structuring: separating the different layers of user interface and application state and logic can introduce interdependencies that make software development and maintenance more challenging. In this paper we explore the use of monitoring-oriented programming to add adaptive features to user interfaces, an approach which has been touted as a means of separating certain layers of logic from the main system. We evaluate the approach both using standard software engineering measures and through a user acceptance experiment, in which a number of developers used the proposed approach to add adaptation logic to an existing application.
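
    The separation being evaluated can be sketched as follows: the application only emits events, and all adaptation logic lives in a detached monitor. This is an illustrative Python reduction of the idea, not the authors' monitoring-oriented tooling; the names and the promotion rule are assumptions.

        from collections import Counter

        # Minimal event bus: the application only emits events; every bit of
        # adaptation logic lives in a separately registered monitor.
        subscribers = []

        def emit(event):
            for handler in subscribers:
                handler(event)

        class FrequencyMonitor:
            """Promote a feature in the UI once it has been used often enough."""

            def __init__(self, promote_after=3):
                self.uses = Counter()
                self.promote_after = promote_after

            def __call__(self, event):
                self.uses[event] += 1
                if self.uses[event] == self.promote_after:
                    print(f"adapt UI: promote '{event}' to the toolbar")

        subscribers.append(FrequencyMonitor())

        # Application code remains oblivious to the adaptation layer.
        for _ in range(3):
            emit("export_pdf")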

    MARLUI: Multi-Agent Reinforcement Learning for Adaptive UIs

    Adaptive user interfaces (UIs) automatically change an interface to better support users' tasks. Recently, machine learning techniques have enabled the transition to more powerful and complex adaptive UIs. However, a core challenge for adaptive user interfaces is their reliance on high-quality user data that has to be collected offline for each task. We formulate UI adaptation as a multi-agent reinforcement learning problem to overcome this challenge. In our formulation, a user agent mimics a real user and learns to interact with a UI. Simultaneously, an interface agent learns UI adaptations to maximize the user agent's performance. The interface agent learns the task structure from the user agent's behavior and, based on that, can support the user agent in completing its task. Our method produces adaptation policies that are learned in simulation only and therefore needs no real user data. Our experiments show that the learned policies generalize to real users and achieve performance on par with data-driven supervised learning baselines.
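
    Structurally, the formulation boils down to a two-agent loop: the interface agent proposes an adaptation, the user agent acts on the adapted UI, and rewards drive both policies. The sketch below shows only that loop's shape with invented stand-in policies; in MARLUI both policies are learned by reinforcement learning, entirely in simulation.

        import random

        def marlui_step(user_policy, interface_policy, state):
            """One co-training step: the interface adapts, then the user acts."""
            adaptation = interface_policy(state)     # e.g., reorder menu items
            action = user_policy(state, adaptation)  # simulated user behaviour
            reward = 1.0 if action == state["target"] else -0.1
            return action, reward

        # Stand-in policies; in MARLUI both are neural policies trained with
        # reinforcement learning against each other, so no real user data
        # has to be collected offline.
        items = ["copy", "paste", "undo"]
        interface_policy = lambda state: sorted(items)         # trivial adaptation
        user_policy = lambda state, menu: random.choice(menu)  # noisy user agent

        action, reward = marlui_step(user_policy, interface_policy,
                                     {"target": "paste"})
        print(action, reward)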

    Exploring mobile news reading interactions for news app personalisation

    As news is increasingly accessed on smartphones and tablets, the need to personalise news app interactions is apparent. We report a series of three studies addressing key issues in the development of adaptive news app interfaces. We first surveyed users' news reading preferences and behaviours; analysis revealed three primary types of reader. We then implemented and deployed an Android news app that logs users' interactions with the app. We used the logs to train a classifier and showed that it can reliably recognise a user's reader type. Finally, we evaluated alternative adaptive user interfaces for each reader type. The evaluation demonstrates the differential benefit of the adaptation for different users of the news app and the feasibility of adaptive interfaces for news apps.
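
    The classifier step can be illustrated in a few lines of scikit-learn (an assumption; the abstract does not specify the classifier, and the features and labels below are invented stand-ins for the logged interaction measures and the three reader types):

        from sklearn.tree import DecisionTreeClassifier

        # Invented per-user features from the interaction logs:
        # [articles per session, mean read time (s), share of headline-only views]
        X = [[12, 30, 0.80], [3, 240, 0.10], [6, 90, 0.40],
             [14, 25, 0.90], [2, 300, 0.05], [7, 80, 0.50]]
        y = ["skimmer", "deep_reader", "browser",
             "skimmer", "deep_reader", "browser"]  # assumed reader-type labels

        clf = DecisionTreeClassifier(random_state=0).fit(X, y)

        # Route a new user to the interface variant for their predicted type.
        print(clf.predict([[10, 40, 0.70]])[0])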