64 research outputs found

    Comparing Evaluation Methods for Encumbrance and Walking on Interaction with Touchscreen Mobile Devices

    In this paper, two walking evaluation methods were compared to assess the effects of encumbrance while the preferred walking speed (PWS) was controlled. Users frequently carry cumbersome objects (e.g. shopping bags) while using mobile devices, which can cause interaction difficulties and erroneous input. The two methods used to control the PWS were walking on a treadmill and walking around a predefined ground route while following a pacesetter. The results from our target acquisition experiment showed that for ground walking at 100% of PWS, accuracy dropped to 36% when carrying a bag in the dominant hand and to 34% when holding a box under the dominant arm. We also discuss the advantages and limitations of each evaluation method for examining encumbrance, and suggest that treadmill walking is not the most suitable approach if walking speed is an important factor in future mobile studies.
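    The two methods differ mainly in how a participant's preferred walking speed (PWS) is imposed: a treadmill fixes the belt speed directly, while a ground route relies on a pacesetter and lap timing. The arithmetic involved can be sketched as follows (the speeds, fractions and lap length are illustrative assumptions, not values from the paper):

```python
# Pacing targets as fractions of a participant's preferred walking speed
# (PWS), and actual speed recovered from lap times on a ground route.
# All numbers are illustrative, not taken from the paper.

def pacing_targets(pws_m_per_s, fractions=(0.8, 1.0, 1.2)):
    """Treadmill/pacesetter speeds at given fractions of PWS."""
    return {f: round(pws_m_per_s * f, 2) for f in fractions}

def ground_speed(lap_length_m, lap_time_s):
    """Average walking speed achieved on a predefined ground route."""
    return lap_length_m / lap_time_s

print(pacing_targets(1.4))        # e.g. a PWS of 1.4 m/s
print(ground_speed(25.0, 20.0))   # 25 m lap in 20 s -> 1.25 m/s
```

    Comparing the speed recovered from lap times against the pacing target gives a simple check of how well the pacesetter method actually controlled walking speed.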

    The effects of encumbrance and mobility on interactions with touchscreen mobile devices

    Mobile handheld devices such as smartphones have become convenient everyday tools, allowing users to make calls, reply to emails, find nearby services and much more. The increase in functionality and availability of mobile applications also allows mobile devices to be used in many different everyday situations (for example, while on the move and carrying items). While previous work has investigated interaction difficulties in walking situations, there is a lack of empirical work in the literature on mobile input when users are physically constrained by other activities. As a result, how users input on touchscreen handheld devices in encumbered and mobile contexts is less well known and deserves more attention, to examine usability issues that are often ignored. This thesis investigates targeting performance on touchscreen mobile phones in one common encumbered situation: when users are carrying everyday objects while on the move. To identify the typical objects held during mobile interactions and define a set of common encumbrance scenarios to evaluate in subsequent user studies, Chapter 3 describes an observational study that examined users in different public locations. The results showed that people most frequently carried different types of bags and boxes. To measure how much tapping performance on touchscreen mobile phones is affected, Chapter 4 examines a range of encumbrance scenarios, including holding a bag in-hand or a box underarm, on either the dominant or non-dominant side, during target selections on a mobile phone. Users are likely to switch to a more effective input posture when encumbered and on the move, so Chapter 5 investigates one- and two-handed encumbered interactions and evaluates situations where both hands are occupied with multiple objects.
    Touchscreen devices afford various multi-touch input types, so Chapter 6 compares the performance of four main one- and two-finger gesture inputs: tapping, dragging, spreading & pinching, and rotating, while walking and encumbered. Several main evaluation approaches have been used in previous walking studies, but more attention is required when the effects of encumbrance are also being examined. Chapter 7 examines the appropriateness of two methods (ground and treadmill walking) for encumbered and walking studies, justifies the need to control walking speed, and examines the effects of varying walking speed (i.e. walking slower or faster than normal) on encumbered targeting performance. The studies all showed a reduction in targeting performance when users were walking and encumbered, so Chapter 8 explores two ways to improve target selections. The first approach defines a target size, based on the results collected from the earlier studies, to increase tapping accuracy; a novel interface arrangement was then designed to use screen space more effectively. The second approach evaluates a benchmark pointing technique, which has been shown to improve the selection of small targets, to see if it is useful in walking and encumbered contexts.
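    The first improvement approach above derives a target size from previously observed tapping errors. One simple way to do this, sketched here with synthetic data and an assumed 95% coverage criterion (neither taken from the thesis), is to size targets so that a chosen fraction of observed radial miss distances falls inside them:

```python
import numpy as np

# Sketch: deriving a recommended target size from observed touch offsets,
# in the spirit of defining a target size from earlier study results.
# The offset distribution and coverage level are illustrative assumptions.

rng = np.random.default_rng(1)
offsets = rng.normal(0, 2.5, size=(500, 2))      # touch - target centre, in mm
radial_error = np.linalg.norm(offsets, axis=1)   # radial miss distance per tap

def target_diameter(radial_error, coverage=0.95):
    """Smallest circular target diameter capturing `coverage` of taps."""
    return 2 * float(np.quantile(radial_error, coverage))

print(round(target_diameter(radial_error), 1))   # recommended diameter, mm
```

    Interface elements sized this way trade screen space for accuracy, which is why the thesis pairs the derived size with an interface arrangement that uses screen space more effectively.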

    SemanticLock: An authentication method for mobile devices using semantically-linked images

    We introduce SemanticLock, a single-factor graphical authentication solution for mobile devices. SemanticLock uses a set of graphical images as password tokens that construct a semantically memorable story representing the user's password. All that is required to use our solution is a familiar and quick action of dragging or dropping the images into their respective positions on the touchscreen, either in a continuous flow or in discrete movements. The authentication strength of SemanticLock is based on the large number of possible semantic constructs derived from the positioning of the image tokens and the type of images selected. SemanticLock has high resistance to smudge attacks and equally exhibits a higher level of memorability due to its graphical paradigm. In a three-week user study with 21 participants comparing SemanticLock against other authentication systems, we discovered that SemanticLock outperformed PIN and matched PATTERN on speed, memorability, user acceptance and usability. Furthermore, qualitative tests also showed that SemanticLock was rated higher in likeability. SemanticLock was also evaluated while participants walked unencumbered and walked encumbered carrying "everyday" items, to analyse the effects of such activities on its usage.
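    The claimed authentication strength rests on the size of the space of possible constructs. A back-of-envelope estimate, assuming a hypothetical scheme where k distinct images are chosen from a pool of n and arranged in order (the pool size and token count below are illustrative, not SemanticLock's actual parameters):

```python
from math import comb, factorial, log2

# Back-of-envelope password-space estimate for a graphical scheme in the
# spirit of SemanticLock: choose k image tokens from a pool of n and place
# them in an ordered arrangement. Parameters are illustrative assumptions.

def password_space(n_images, k_chosen):
    """Number of ordered selections of k distinct images out of n."""
    return comb(n_images, k_chosen) * factorial(k_chosen)

space = password_space(16, 4)        # e.g. 16 images, 4-token story
print(space, round(log2(space), 1))  # size and equivalent bits of entropy
```

    Even these modest illustrative parameters yield about 15.4 bits, somewhat more than a 4-digit PIN's 10,000 combinations (about 13.3 bits); real semantic constraints would shrink the effective space, which is why memorability and smudge resistance matter alongside raw size.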

    Evaluating Conversational User Interfaces when Mobile


    Extending mobile touchscreen interaction

    Touchscreens have become the de facto interface for mobile devices, and are penetrating further beyond their core application domain of smartphones. This work presents a design space for extending touchscreen interaction, to which new solutions may be mapped. Specific touchscreen enhancements in the domains of manual input, visual output and haptic feedback are explored, and quantitative and experiential findings are reported. Particular areas covered are unintentional interaction, screen locking, stereoscopic displays and pico-projection. In addition, the novel interaction approaches of finger identification and on-screen physical guides are also explored. The use of touchscreens in car dashboards and smart handbags is evaluated as domain-specific use cases. This work draws together solutions from the broad area of mobile touchscreen interaction. Fruitful directions for future research are identified, and information is provided for future researchers addressing those topics.

    Understanding in-context interaction: An investigation into on-the-go mobile search

    Recent years have seen a profound change in how most users interact with search engines: the majority of search requests now come from mobile devices, which are used in a number of distracting contexts. This use of mobile devices in various situational contexts away from a desk presents a range of novel challenges for users and, consequently, possibilities for interface improvements. However, there is at present a lack of work that evaluates interaction in such contexts to understand what effects context and mobility have on behaviour, errors and, ultimately, users' search performance. Through a controlled study, in which we simulate walking conditions on a treadmill and an obstacle course, we use a combination of interaction logs and multiple video streams to capture interaction behaviour as participants (n = 24) complete simple search tasks. Using a bespoke tagging tool to analyse these recordings, we investigate how situational context and distractions impact user behaviour and performance, contrasting this with users in a baseline, seated condition. Our findings provide insights into the issues these common contexts cause, how users adapt, and how such interfaces could be improved.

    Behaviour-aware mobile touch interfaces

    Mobile touch devices have become ubiquitous everyday tools for communication and information, as well as for capturing, storing and accessing personal data. They are often seen as personal devices, linked to individual users, who access the digital part of their daily lives via hand-held touchscreens. This personal use and the importance of the touch interface motivate the main assertion of this thesis: mobile touch interaction can be improved by enabling user interfaces to assess and take into account how the user performs these interactions. This thesis introduces the new term "behaviour-aware" to characterise such interfaces. These behaviour-aware interfaces aim to improve interaction by utilising behaviour data: since users perform touch interactions for their main tasks anyway, inferring extra information from those touches may, for example, save users' time and reduce distraction, compared to explicitly asking them for this information (e.g. user identity, hand posture, further context). Behaviour-aware user interfaces may utilise this information in different ways, in particular to adapt to users and contexts. Important questions for this research thus concern understanding behaviour details and influences, modelling that behaviour, and integrating inference and (re)action into the user interface. In several studies covering both analyses of basic touch behaviour and a set of specific prototype applications, this thesis addresses these questions and explores three application areas and goals:
    1) Enhancing input capabilities: modelling users' individual touch targeting behaviour to correct future touches and increase touch accuracy. The research reveals challenges and opportunities of behaviour variability arising from factors including target location, size and shape, hand and finger, stylus use, mobility, and device size. The work further informs modelling and inference based on targeting data, and presents approaches for simulating touch targeting behaviour and detecting behaviour changes.
    2) Facilitating privacy and security: observing touch targeting and typing behaviour patterns to implicitly verify user identity or distinguish multiple users during use. The research shows and addresses mobile-specific challenges, in particular changing hand postures. It also reveals that touch targeting characteristics provide useful biometric value both in the lab and in everyday typing. Influences of common evaluation assumptions are assessed and discussed as well.
    3) Increasing expressiveness: enabling interfaces to pass on behaviour variability from input to output space, studied with a keyboard that dynamically alters the font based on current typing behaviour. Results show that with these fonts users can distinguish basic contexts as well as individuals. They also explicitly control font influences for personal communication with creative effects.
    This thesis further contributes concepts and implemented tools for collecting touch behaviour data, analysing and modelling touch behaviour, and creating behaviour-aware and adaptive mobile touch interfaces. Together, these contributions support researchers and developers in investigating and building such user interfaces. Overall, this research shows how variability in mobile touch behaviour can be addressed and exploited for the benefit of users. The thesis further discusses opportunities for transfer and reuse of touch behaviour models and information across applications and devices, for example to address trade-offs between privacy/security and usability. Finally, the work concludes by reflecting on the general role of behaviour-aware user interfaces, proposing to view them as a way of embedding expectations about user input into interactive artefacts.
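    The first area, correcting touches from a model of individual targeting behaviour, can be illustrated with a common baseline approach: fitting a per-user affine map from recorded touch points to intended target centres, then applying it to new touches. This is a minimal sketch on synthetic data, not the thesis's actual models:

```python
import numpy as np

# Minimal sketch of a linear touch-offset model: learn a per-user mapping
# from recorded touch locations to intended target centres, then use it
# to correct future touches. Synthetic data with a simulated systematic
# offset of (+2, -3); not the thesis's actual models or data.

rng = np.random.default_rng(0)
targets = rng.uniform(0, 100, size=(200, 2))   # intended target centres
touches = targets + np.array([2.0, -3.0]) + rng.normal(0, 1.0, (200, 2))

# Fit touch -> target as an affine map with least squares.
X = np.hstack([touches, np.ones((len(touches), 1))])  # add bias column
W, *_ = np.linalg.lstsq(X, targets, rcond=None)

def correct(touch_xy):
    """Apply the learned correction to a new touch point."""
    return np.append(touch_xy, 1.0) @ W

print(correct(np.array([52.0, 47.0])))  # should land near (50, 50)
```

    The same fitted map only holds while behaviour stays stable, which is why the thesis also covers detecting behaviour changes (e.g. a switch of hand posture) before applying a correction.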

    Addressing the Challenges of Situationally-Induced Impairments and Disabilities in Mobile Interaction

    Situationally-induced impairments and disabilities (SIIDs) make it difficult for users of interactive computing systems to perform tasks due to context (e.g. listening to a phone call in a noisy crowd) rather than as a result of a congenital or acquired impairment (e.g. hearing damage). SIIDs are a significant concern given the ubiquity of technology across a wide range of contexts. Considering our daily reliance on technology, and mobile technology in particular, it is increasingly important that we fully understand and model how SIIDs occur. Similarly, we must identify appropriate methods for sensing and adapting technology to reduce the effects of SIIDs. In this workshop, we will bring together researchers working on understanding, sensing, modelling, and adapting technologies to ameliorate the effects of SIIDs. The workshop will provide a venue to identify existing research gaps, new directions for future research, and opportunities for future collaboration.