19,127 research outputs found

    User Interface Plasticity: Model Driven Engineering to the Limit!

    Keynote paper. Ten years ago, I introduced the notion of user interface plasticity to denote the capacity of user interfaces to adapt, or to be adapted, to the context of use while preserving usability. The Model Driven Engineering (MDE) approach, which has been used for user interface generation in HCI since the early eighties, has recently been revived to address this complex problem. Although MDE has produced interesting and convincing results for conventional WIMP user interfaces, it has not yet fully delivered on its theoretical promises. In this paper, we discuss how to push MDE to the limit in order to reconcile high-level modeling techniques with low-level programming and go beyond WIMP user interfaces.
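    As a rough illustration of the plasticity idea discussed in this keynote, the sketch below shows, under invented names and a deliberately toy rendering, how a single abstract interactor model could be mapped to different concrete user interfaces (including a non-WIMP, voice one) depending on the context of use. It is a minimal TypeScript sketch of the general MDE-for-UI pattern, not code from the paper.

```typescript
// Hypothetical sketch of model-driven UI plasticity: one abstract model,
// several concrete renderings chosen from the context of use.

// Abstract UI model: a platform-independent description of an interaction task.
interface AbstractInteractor {
  id: string;
  label: string;
  options: string[];
}

// Context of use, reduced here to the platform dimension only.
interface ContextOfUse {
  platform: "desktop" | "phone" | "voice";
}

// A concrete rendering is just a string in this toy example; a real
// transformation would produce widget trees, markup, or audio prompts.
function render(model: AbstractInteractor, ctx: ContextOfUse): string {
  switch (ctx.platform) {
    case "desktop":
      return `<combobox id="${model.id}" label="${model.label}">` +
             model.options.map(o => `<item>${o}</item>`).join("") +
             `</combobox>`;
    case "phone":
      // Smaller screen: the same choice becomes a full-screen list.
      return `<list id="${model.id}" title="${model.label}">` +
             model.options.map(o => `<row>${o}</row>`).join("") +
             `</list>`;
    case "voice":
      // Beyond WIMP: the same abstract interactor becomes a spoken prompt.
      return `Say one of: ${model.options.join(", ")} to ${model.label}.`;
  }
}

const destination: AbstractInteractor = {
  id: "dest",
  label: "choose a destination",
  options: ["home", "office", "airport"],
};
console.log(render(destination, { platform: "desktop" }));
console.log(render(destination, { platform: "voice" }));
```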

    Combining brain-computer interfaces and assistive technologies: state-of-the-art and challenges

    In recent years, new research has brought the field of EEG-based Brain-Computer Interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user-machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles of human-computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices.
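    To make the notion of a hybrid BCI architecture and confidence-based reliability slightly more concrete, here is a minimal, hypothetical TypeScript sketch (names, weights, and thresholds are invented, not taken from the paper) in which the output of an EEG classifier is fused with a conventional assistive-technology switch, and a command is only issued when the combined confidence crosses a threshold.

```typescript
// Hypothetical hybrid BCI sketch: fuse an EEG classifier's confidence with a
// residual-motor switch, and only act when the combined evidence is strong.

interface EegDecision {
  command: "left" | "right" | "stop";
  confidence: number; // classifier posterior in [0, 1]
}

interface SwitchInput {
  pressed: boolean; // e.g. a sip-and-puff or button assistive device
}

// Simple weighted fusion: the switch confirms (or fails to confirm) the EEG decision.
function fuse(eeg: EegDecision, sw: SwitchInput, threshold = 0.7): string | null {
  // Treat the switch press as additional evidence for the decoded command.
  const combined = sw.pressed ? Math.min(1, eeg.confidence + 0.2) : eeg.confidence;
  // Confidence gating: below threshold, do nothing rather than issue a
  // possibly wrong command (important for e.g. wheelchair control).
  return combined >= threshold ? eeg.command : null;
}

console.log(fuse({ command: "left", confidence: 0.65 }, { pressed: true }));  // "left"
console.log(fuse({ command: "left", confidence: 0.65 }, { pressed: false })); // null
```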

    Xplain: an Editor for building Self-Explanatory User Interfaces by Model-Driven Engineering

    Modern User Interfaces (UI) must deal with the increasing complexity of applications in terms of functionality as well as new properties such as plasticity. The plasticity of a UI denotes its capacity to adapt to the context of use while preserving its quality. Efforts in plasticity have focused on the (meta-)modeling of the UI, but quality remains largely unaddressed. This paper describes on-going research on a method for developing Self-Explanatory User Interfaces, as well as an editor that implements this method. Self-explanation refers to the capacity of a UI to provide the end-user with information about its rationale (what is the purpose of the UI?), its design rationale (why is the UI structured into this set of workspaces? what is the purpose of this button?), its current state (why is the menu disabled?) as well as the evolution of that state (how can I enable this feature?). Explanations are provided by embedded models.
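    The kind of question a self-explanatory UI is expected to answer ("why is the menu disabled?", "how can I enable this feature?") can be pictured with a small, hypothetical sketch in which each widget carries an embedded model of its enabling preconditions and the explanation is derived from whichever preconditions currently fail. All names below are illustrative, not Xplain's actual metamodel.

```typescript
// Hypothetical sketch of an embedded model used to explain UI state:
// each widget carries preconditions, and explanations are derived from
// whichever preconditions currently fail.

interface Precondition {
  description: string;   // reason shown when the precondition fails
  holds: () => boolean;  // evaluated against the application state
  howToSatisfy: string;  // answer to "how can I enable this feature?"
}

interface WidgetModel {
  name: string;
  purpose: string;       // answer to "what is the purpose of this button?"
  preconditions: Precondition[];
}

function whyDisabled(w: WidgetModel): string {
  const failing = w.preconditions.filter(p => !p.holds());
  if (failing.length === 0) return `${w.name} is enabled.`;
  return `${w.name} is disabled because ` +
         failing.map(p => p.description).join(" and ") + ". " +
         "To enable it: " + failing.map(p => p.howToSatisfy).join("; ") + ".";
}

// Toy application state and embedded model.
let documentOpen = false;
const saveButton: WidgetModel = {
  name: "Save",
  purpose: "writes the current document to disk",
  preconditions: [{
    description: "no document is open",
    holds: () => documentOpen,
    howToSatisfy: "open or create a document first",
  }],
};
console.log(whyDisabled(saveButton)); // explains why Save is disabled
documentOpen = true;
console.log(whyDisabled(saveButton)); // "Save is enabled."
```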

    Adapting Component-based User Interfaces at Runtime using Observers

    Model-driven engineering (MDE) already plays a key role in Human-Computer Interaction for the automatic generation of end-user interfaces from their abstract and platform-independent specifications. Moreover, MDE techniques and tools are proving to be very useful for adapting the final user interfaces at runtime according to the current context properties: platform, user roles, component states, etc. In this paper we propose a mechanism to adapt user interfaces at runtime. These user interfaces are (re)generated through the dynamic composition of user-interface software components, depending on the observed properties of the environment and of the components’ behaviour. Funding: Ministerio de Ciencia e Innovación TIN2010-15588, TRA2009-0309, TIN2008-00889-E, TIN2008-03107; Junta de Andalucía TIC-6114, P07-TIC-0318.
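    A very reduced, hypothetical sketch of the observer-based mechanism (the names and the composition rule are ours, not the paper's): observers watch context properties, and whenever one changes, the user interface is recomposed from the components whose requirements match the new context.

```typescript
// Hypothetical sketch of observer-driven UI recomposition at runtime:
// observers watch context properties and trigger regeneration of the UI
// from the components that match the current context.

type Context = { platform: string; userRole: string };

interface UiComponent {
  name: string;
  matches: (ctx: Context) => boolean; // does this component fit the context?
}

class ContextObservable {
  private observers: Array<(ctx: Context) => void> = [];
  constructor(private ctx: Context) {}
  observe(fn: (ctx: Context) => void) { this.observers.push(fn); }
  update(change: Partial<Context>) {
    this.ctx = { ...this.ctx, ...change };
    this.observers.forEach(fn => fn(this.ctx)); // notify on every change
  }
}

// Registry of available user-interface software components.
const registry: UiComponent[] = [
  { name: "AdminPanel",   matches: c => c.userRole === "admin" },
  { name: "TouchToolbar", matches: c => c.platform === "tablet" },
  { name: "StatusBar",    matches: () => true },
];

// (Re)compose the UI: keep only the components that fit the context.
function compose(ctx: Context): string[] {
  return registry.filter(c => c.matches(ctx)).map(c => c.name);
}

const context = new ContextObservable({ platform: "desktop", userRole: "guest" });
context.observe(ctx => console.log("UI recomposed:", compose(ctx)));
context.update({ platform: "tablet" }); // -> ["TouchToolbar", "StatusBar"]
context.update({ userRole: "admin" });  // -> ["AdminPanel", "TouchToolbar", "StatusBar"]
```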

    Self-Explanatory User Interfaces by Model-Driven Engineering

    Modern User Interfaces (UI) must deal with the increasing complexity of applications as well as new features such as the capacity of UIs to be dynamically adapted to the context of use. This complexity does not necessarily imply better quality. Thus, it becomes necessary to help users understand the UIs. This paper describes on-going research about Self-Explanatory User Interfaces (SE-UI) by Model-Driven Engineering (MDE). Self-explanation refers to the capacity of a UI to provide the end-user with information about its rationale (what is the purpose of the UI?), its design rationale (why is the UI structured into this set of workspaces? what is the purpose of this button?), its current state (why is the menu disabled?) as well as the evolution of that state (how can I enable this feature?). Explanations are provided by embedded models. We explore model-driven engineering to understand why and how this approach can successfully overcome shortcomings in UI quality.

    The simplicity project: easing the burden of using complex and heterogeneous ICT devices and services

    As of today, to exploit the variety of different "services", users need to configure each of their devices using different procedures and must explicitly select among heterogeneous access technologies and protocols. In addition, users are authenticated and charged by different means. The lack of implicit human-computer interaction, context-awareness and standardisation places an enormous burden of complexity on the shoulders of end users. The IST-Simplicity project aims at alleviating these problems by: i) automatically creating and customizing a user communication space; ii) adapting services to user terminal characteristics and to users' preferences; iii) orchestrating network capabilities. The aim of this paper is to present the technical framework of the IST-Simplicity project. The paper provides a thorough analysis and qualitative evaluation of the different technologies, standards and works presented in the literature related to the Simplicity system to be developed.
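    As a loose illustration of point ii) above, adapting a service to terminal characteristics and user preferences could look like the following hypothetical sketch; the terminal-profile fields and the adaptation rules are invented for illustration and are not part of the Simplicity framework.

```typescript
// Hypothetical sketch: adapt a service's delivery to the terminal profile
// and the user's preferences.

interface TerminalProfile {
  screenWidthPx: number;
  supportsVideo: boolean;
  bandwidthKbps: number;
}

interface UserPreferences {
  preferredLanguage: string;
  textOnly: boolean;
}

interface DeliveryPlan {
  mediaType: "video" | "image" | "text";
  language: string;
  layout: "single-column" | "multi-column";
}

function adaptService(t: TerminalProfile, p: UserPreferences): DeliveryPlan {
  // Pick the richest media the terminal and the user's preferences allow.
  const mediaType: DeliveryPlan["mediaType"] = p.textOnly
    ? "text"
    : t.supportsVideo && t.bandwidthKbps > 500 ? "video" : "image";
  return {
    mediaType,
    language: p.preferredLanguage,
    // Narrow screens get a single-column layout.
    layout: t.screenWidthPx < 480 ? "single-column" : "multi-column",
  };
}

console.log(adaptService(
  { screenWidthPx: 320, supportsVideo: false, bandwidthKbps: 64 },
  { preferredLanguage: "en", textOnly: false },
)); // -> { mediaType: "image", language: "en", layout: "single-column" }
```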

    How Assessing Plasticity Design Choices Can Improve UI Quality: A Case Study

    In Human-Computer Interaction, plasticity refers to the capacity of User Interfaces (UIs) to withstand variations in the context of use while preserving quality in use. Frequently, ensuring a more or less smooth transition from one context of use to another (from the end-user perspective) is handled ad hoc. To support a more systematic approach for characterizing UI tuning in terms of quality in use along context-of-use variations, we present an exploratory study deliberately focused on platform aspects. The design process of this particular case study is detailed, and all design decisions have been recorded in terms of their influence on UI ergonomic quality, using Ergonomic Criteria. The interesting result is that most design choices made when changing the platform led to a reexamination of the initial designs. Ongoing work supports the insight that considering plasticity helps explicitly broaden UI design choices and sharpen the solution.
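    Recording design decisions together with their influence on ergonomic quality, as the case study does, can be pictured with the small hypothetical sketch below. The criterion names follow the commonly used Ergonomic Criteria (e.g. Guidance, Workload), but the rating scheme and data structures are invented for illustration.

```typescript
// Hypothetical sketch of logging design decisions against ergonomic criteria
// when a UI is re-targeted to another platform.

type Criterion = "Guidance" | "Workload" | "Adaptability" | "Error Management";

interface DesignDecision {
  description: string;                            // what changed for the new platform
  impact: Partial<Record<Criterion, -1 | 0 | 1>>; // degraded, neutral, improved
}

const decisionLog: DesignDecision[] = [
  { description: "Replace the toolbar with a hamburger menu on the phone",
    impact: { Workload: 1, Guidance: -1 } },
  { description: "Split the form across two screens",
    impact: { Workload: -1, "Error Management": 1 } },
];

// Summarize how the platform change affected each criterion overall.
function summarize(log: DesignDecision[]): Record<string, number> {
  const totals: Record<string, number> = {};
  for (const d of log) {
    for (const [criterion, delta] of Object.entries(d.impact)) {
      totals[criterion] = (totals[criterion] ?? 0) + (delta ?? 0);
    }
  }
  return totals;
}

console.log(summarize(decisionLog)); // { Workload: 0, Guidance: -1, "Error Management": 1 }
```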