
    Towards an account of intuitiveness

    Intuitive systems are usable systems. Design guidelines advocate intuitiveness and vendors claim it, but what does it mean for a user interface, interactive system, or device to be intuitive? A review of the use of the term 'intuitive' indicates that it has two distinct but overlapping meanings: intuitiveness based on familiarity and intuitiveness reflecting our embodiment (and frequently both). While everyday usage indicates that familiarity means either a passing acquaintance or an intimacy with something or someone, it will be concluded that familiarity might best be equated with 'know-how', which in turn is based on a deep, often tacit, understanding. The intuitive nature of tangible user interfaces will in turn be attributed to embodiment rather than tangibility per se. Merleau-Ponty writes that it is through our bodies that we 'prehend' the world. A number of disciplines now regard action and perception as so closely coupled that they are better considered as a dyad than separately. A modified treatment of action-perception coupling, with familiarity providing an epistemic core, is proposed as the basis of intuitiveness.

    Understanding and Rejecting Errant Touches on Multi-touch Tablets

    Given the pervasiveness of multi-touch tablets, pen-based applications have rapidly moved onto this new platform. Users draw both with bare fingers and with capacitive pens, as they would on paper. Unlike paper, however, these tablets cannot distinguish legitimate finger or pen input from accidental touches by other parts of the user's hand; this thesis refers to this as the errant touch rejection problem. In this thesis, I design, implement, and evaluate a new approach, bezel-focus rejection, for preventing errant touches on multi-touch tablets. I began the research by conducting a formal study to collect and characterize errant touches; the analysis of the collected data guided the design of the rejection technique. I conclude the research by developing bezel-focus rejection and evaluating its performance. The results show that bezel-focus rejection yields a high rejection rate for errant touches and makes users more inclined to rest their hands on the tablet than the comparison techniques. This research makes two major contributions to the Human-Computer Interaction (HCI) community. First, the proposed errant touch rejection approach can be applied to other pen-based note-taking applications. Second, the experimental results can serve as a guide to others developing similar techniques.
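    As an illustration of what such a rejection policy might look like, here is a minimal Python sketch in the spirit of bezel-focus rejection. The screen dimensions, bezel band, focus radius, and the rule of rejecting edge touches while accepting only touches near the pen tip are assumptions for illustration, not the thesis's actual algorithm.

        # Hypothetical errant-touch filter in the spirit of bezel-focus
        # rejection; thresholds and rules are illustrative assumptions.
        from dataclasses import dataclass

        SCREEN_W, SCREEN_H = 1280.0, 800.0  # assumed tablet resolution (px)
        BEZEL = 40.0                        # assumed bezel band width (px)
        FOCUS_RADIUS = 120.0                # assumed focus radius around pen (px)

        @dataclass
        class Touch:
            x: float  # touch-down position, screen coordinates (px)
            y: float

        def in_bezel(t: Touch) -> bool:
            # True if the touch lies in the bezel band along any screen edge,
            # where a resting palm typically lands.
            return (t.x < BEZEL or t.x > SCREEN_W - BEZEL or
                    t.y < BEZEL or t.y > SCREEN_H - BEZEL)

        def accept(t: Touch, pen_x: float, pen_y: float) -> bool:
            # Reject touches that start in the bezel band, and accept the
            # rest only inside a focus region around the current pen tip.
            if in_bezel(t):
                return False
            return (t.x - pen_x) ** 2 + (t.y - pen_y) ** 2 <= FOCUS_RADIUS ** 2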

    Barehand Mode Switching in Touch and Mid-Air Interfaces

    Raskin defines a mode as a distinct setting within an interface in which the same user input produces results different from those it would produce in other settings. Most interfaces have multiple modes in which input is mapped to different actions, and mode switching is simply the transition from one mode to another. In touch interfaces, the current mode can change how a single touch is interpreted: for example, it could draw a line, pan the canvas, select a shape, or enter a command. In Virtual Reality (VR), a hand-gesture-based 3D modelling application may have different modes for object creation, selection, and transformation; depending on the mode, the movement of the hand is interpreted differently. One of the crucial factors determining the effectiveness of an interface is user productivity, and the mode-switching time of different input techniques, whether in a touch interface or a mid-air interface, affects that productivity. Moreover, when touch and mid-air interfaces such as VR are combined, making informed decisions about mode assignment becomes even more complicated. This thesis provides an empirical investigation characterizing the mode-switching phenomenon in barehand touch-based and mid-air interfaces. It explores the potential of using these input spaces together for a productivity application in VR, and it concludes with a step towards defining and evaluating the multi-faceted mode concept, its characteristics, and its utility when designing user interfaces more generally.
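    Raskin's definition reduces to a simple dispatch on the current mode: identical input, different action. A minimal Python sketch (the mode names and action strings are assumptions for illustration):

        # Minimal illustration of Raskin's definition: the same drag input
        # yields a different action in each mode. Mode names are assumed.
        from enum import Enum, auto

        class Mode(Enum):
            DRAW = auto()
            PAN = auto()
            SELECT = auto()

        def on_drag(mode: Mode, dx: float, dy: float) -> str:
            # Dispatch one physical gesture according to the current mode.
            if mode is Mode.DRAW:
                return f"draw a line segment along ({dx}, {dy})"
            if mode is Mode.PAN:
                return f"pan the canvas by ({dx}, {dy})"
            return f"grow the selection by ({dx}, {dy})"

        # Identical input, different results before and after a mode switch:
        print(on_drag(Mode.DRAW, 10, 5))
        print(on_drag(Mode.PAN, 10, 5))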

    Mobile Interaction Using Paperweight Metaphor

    On small-screen devices such as PDAs and mobile phones, browsing large content such as WWW pages forces the user to alternate frequently between scrolling and other operations. This paper proposes an interface technique, based on a paperweight metaphor, for switching intuitively between scrolling and editing of the displayed content. When writing on a sheet of paper lying on a slippery desk, moving only the pen tip can cause the paper to slide instead of taking ink, so a person naturally presses the paper down with the palm to hold it still while writing. Using this palm-as-paperweight action as a metaphor, an interface can switch smoothly between scrolling and editing. We therefore propose attaching a touch sensor to the area of a pen-enabled device, such as a PDA, where the user's palm rests: while the palm is not touching the sensor, dragging with the pen scrolls the content, and while the palm holds the device down as it would hold paper, pen input is treated as editing. We implemented this interface, built prototype applications for browsing maps, WWW pages, and photographs that switch between scrolling and other operation modes based on the sensor output, and evaluated them.
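    The switching logic described above reduces to a single conditional on the palm sensor; a minimal Python sketch (the function and parameter names are assumptions for illustration):

        # Sketch of the paperweight-metaphor mode switch: the palm sensor
        # plays the role of the paperweight. Names are assumed.
        def on_pen_drag(palm_on_sensor: bool, dx: float, dy: float) -> str:
            if palm_on_sensor:
                # Palm holds the "paper" still, so the pen stroke is ink.
                return f"ink stroke along ({dx}, {dy})"
            # No palm contact: the "paper" is free to slide, so the drag
            # scrolls the content instead.
            return f"scroll content by ({dx}, {dy})"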