113 research outputs found

    The "Seen but Unnoticed" Vocabulary of Natural Touch: Revolutionizing Direct Interaction with Our Devices and One Another (UIST 2021 Vision)

    This UIST Vision argues that "touch" input and interaction remains in its infancy when viewed in the context of the seen but unnoticed vocabulary of natural human behaviors, activity, and environments that surround direct interaction with displays. Unlike status-quo touch interaction -- a shadowplay of fingers on a single screen -- I argue that our perspective of direct interaction should encompass the full rich context of individual use (whether via touch, sensors, or in combination with other modalities), as well as collaborative activity where people are engaged in local (co-located), remote (tele-present), and hybrid work. We can further view touch through the lens of the "Society of Devices," where each person's activities span many complementary, oft-distinct devices that offer the right task affordance (input modality, screen size, aspect ratio, or simply a distinct surface with dedicated purpose) at the right place and time. While many hints of this vision already exist (see references), I speculate that a comprehensive program of research to systematically inventory, sense, and design interactions around such human behaviors and activities -- one that fully embraces touch as a multi-modal, multi-sensor, multi-user, and multi-device construct -- could revolutionize both individual and collaborative interaction with technology.
    Comment: 5 pages. Non-archival UIST Vision paper accepted and presented at the 34th Annual ACM Symposium on User Interface Software and Technology (UIST 2021) by Ken Hinckley. This is the definitive "published" version, as the Association for Computing Machinery (ACM) does not archive UIST Vision papers.

    Thumb + Pen Interaction on Tablets

    Modern tablets support simultaneous pen and touch input, but it remains unclear how to best leverage this capability for bimanual input when the non-preferred hand holds the tablet. We explore Thumb + Pen interactions that support simultaneous pen and touch interaction, with both hands, in such situations. Our approach engages the thumb of the device-holding hand, such that the thumb interacts with the touch screen in an indirect manner, thereby complementing the direct input provided by the preferred hand. For instance, the thumb can determine how pen actions (articulated with the opposite hand) are interpreted. Alternatively, the pen can point at an object while the thumb manipulates one or more of its parameters through indirect touch. Our techniques combine, in novel ways, concepts drawn from marking menus, spring-loaded modes, indirect input, and multi-touch conventions. Our overall approach takes the form of a set of probes, each representing a meaningfully distinct class of application. They serve as an initial exploration of the design space, at a level that will help determine the feasibility of supporting bimanual interaction in such contexts and the viability of the Thumb + Pen techniques in doing so.
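    A minimal sketch in Python of the spring-loaded, thumb-driven pen modes described above. This is an illustration rather than the paper's implementation: the mode names, quadrant labels, and event-handler signatures are hypothetical.

    from enum import Enum, auto

    class PenMode(Enum):
        INK = auto()        # default: the pen draws ink
        LASSO = auto()      # while the thumb holds the (hypothetical) "select" quadrant
        HIGHLIGHT = auto()  # while the thumb holds the (hypothetical) "highlight" quadrant

    class ThumbPenController:
        # Illustrative mapping from thumb marking-menu quadrants to pen modes.
        QUADRANT_MODES = {"select": PenMode.LASSO, "highlight": PenMode.HIGHLIGHT}

        def __init__(self):
            self.mode = PenMode.INK

        def on_thumb_down(self, quadrant: str) -> None:
            # Spring-loaded: the mode is active only while the thumb stays in contact.
            self.mode = self.QUADRANT_MODES.get(quadrant, PenMode.INK)

        def on_thumb_up(self) -> None:
            # Releasing the thumb reverts to the default inking mode.
            self.mode = PenMode.INK

        def on_pen_stroke(self, stroke) -> str:
            # The preferred hand's pen stroke is interpreted under the thumb-held mode.
            return f"apply {self.mode.name.lower()} to a stroke of {len(stroke)} points"

    controller = ThumbPenController()
    controller.on_thumb_down("select")                    # thumb rests on the 'select' quadrant
    print(controller.on_pen_stroke([(0, 0), (10, 10)]))   # -> apply lasso to a stroke of 2 points
    controller.on_thumb_up()                              # mode springs back to inking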

    The uncanny valley of haptics

    During teleoperation and virtual reality experiences, enhanced haptic feedback incongruent with other sensory cues can reduce subjective realism, producing an uncanny valley of haptics.

    Outline Pursuits: Gaze-assisted Selection of Occluded Objects in Virtual Reality

    In 3D environments, objects can be difficult to select when they overlap, as this reduces the available target area and increases selection ambiguity. We introduce Outline Pursuits, which extends a primary pointing modality for gaze-assisted selection of occluded objects. Candidate targets within a pointing cone are presented with an outline that is traversed by a moving stimulus. This affords completion of the selection by gaze attention to the intended target's outline motion, detected by matching the user's smooth pursuit eye movement. We demonstrate two techniques built on this concept, one with a controller as the primary pointer, and one in which Outline Pursuits are combined with head pointing for hands-free selection. Compared with conventional raycasting, the techniques require less movement for selection, as users do not need to reposition themselves for a better line of sight, and selection time and accuracy are less affected when targets become highly occluded.
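    A minimal sketch in Python (with NumPy) of the smooth-pursuit matching step the abstract describes: the candidate whose outline stimulus best correlates with the recent gaze trajectory is selected. The correlation measure and the 0.8 threshold are illustrative assumptions, not the authors' parameters.

    import numpy as np

    def pursuit_correlation(gaze_xy: np.ndarray, stimulus_xy: np.ndarray) -> float:
        # Mean Pearson correlation of the x and y coordinates over a time window.
        # Both arrays have shape (n_samples, 2) and are time-aligned.
        rx = np.corrcoef(gaze_xy[:, 0], stimulus_xy[:, 0])[0, 1]
        ry = np.corrcoef(gaze_xy[:, 1], stimulus_xy[:, 1])[0, 1]
        return (rx + ry) / 2.0

    def select_target(gaze_xy: np.ndarray, candidates: dict, threshold: float = 0.8):
        # candidates maps a target id to the trajectory of the moving stimulus that
        # traverses that target's outline, sampled over the same window as the gaze.
        scores = {tid: pursuit_correlation(gaze_xy, traj) for tid, traj in candidates.items()}
        best = max(scores, key=scores.get)
        return best if scores[best] >= threshold else None

    If no candidate's outline motion correlates strongly enough with the user's smooth pursuit, no selection is made, which guards against accidental activation.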

    Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection

    Eye gaze involves the coordination of eye and head movement to acquire gaze targets, but existing approaches to gaze pointing are based on eye-tracking in abstraction from head motion. We propose to leverage the synergetic movement of eye and head, and identify design principles for Eye&Head gaze interaction. We introduce three novel techniques that build on the distinction of head-supported versus eyes-only gaze to enable dynamic coupling of gaze and pointer, hover interaction, visual exploration around pre-selections, and iterative and fast confirmation of targets. We demonstrate Eye&Head interaction on applications in virtual reality, and evaluate our techniques against baselines in pointing and confirmation studies. Our results show that Eye&Head techniques enable novel gaze behaviours that provide users with more control and flexibility in fast gaze pointing and selection.
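    A minimal sketch in Python of the head-supported versus eyes-only distinction the techniques build on. The velocity threshold is an assumed, illustrative value; the paper's actual classification may differ.

    # Assumed, illustrative threshold above which the head is considered to be turning.
    HEAD_MOVING_DPS = 5.0

    def classify_gaze_shift(head_velocity_dps: float) -> str:
        # head_velocity_dps: head angular velocity in degrees per second.
        # A fuller classifier would also consider eye-in-head eccentricity.
        if head_velocity_dps > HEAD_MOVING_DPS:
            return "head-supported"  # the head turns to carry the gaze shift
        return "eyes-only"           # the eyes move within a roughly stationary head

    def pointer_follows_gaze(head_velocity_dps: float) -> bool:
        # Example policy: couple the pointer to gaze only during head-supported shifts,
        # leaving eyes-only gaze free for hover and visual exploration.
        return classify_gaze_shift(head_velocity_dps) == "head-supported"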

    Spitzer Follow-up of Extremely Cold Brown Dwarfs Discovered by the Backyard Worlds: Planet 9 Citizen Science Project

    We present Spitzer follow-up imaging of 95 candidate extremely cold brown dwarfs discovered by the Backyard Worlds: Planet 9 citizen science project, which uses visually perceived motion in multi-epoch Wide-field Infrared Survey Explorer (WISE) images to identify previously unrecognized substellar neighbors to the Sun. We measure Spitzer [3.6]–[4.5] color to phototype our brown dwarf candidates, with an emphasis on pinpointing the coldest and closest Y dwarfs within our sample. The combination of WISE and Spitzer astrometry provides quantitative confirmation of the transverse motion of 75 of our discoveries. Nine of our motion-confirmed objects have best-fit linear motions larger than 1″ yr⁻¹; our fastest-moving discovery is WISEA J155349.96+693355.2 (μ ≈ 2.15″ yr⁻¹), a possible T-type subdwarf. We also report a newly discovered wide-separation (~400 au) T8 comoving companion to the white dwarf LSPM J0055+5948 (the fourth such system to be found), plus a candidate late-T companion to the white dwarf LSR J0002+6357 at 5.5′ projected separation (~8700 au if associated). Among our motion-confirmed targets, five have Spitzer colors most consistent with spectral type Y. Four of these five have exceptionally red Spitzer colors suggesting types of Y1 or later, adding considerably to the small sample of known objects in this especially valuable low-temperature regime. Our Y dwarf candidates begin bridging the gap between the bulk of the Y dwarf population and the coldest known brown dwarf.
    Comment: accepted for publication in The Astrophysical Journal.
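    A minimal sketch in Python of the color-phototyping step: compute the Spitzer [3.6]–[4.5] color from the two channel magnitudes and flag the reddest objects as Y-dwarf candidates. The 2.4 mag color cut and the example magnitudes are illustrative assumptions, not values from the paper.

    from dataclasses import dataclass

    @dataclass
    class Candidate:
        name: str
        mag_ch1: float  # Spitzer/IRAC [3.6] magnitude
        mag_ch2: float  # Spitzer/IRAC [4.5] magnitude

        @property
        def color(self) -> float:
            # Cooler (later-type) brown dwarfs have redder [3.6]-[4.5] colors.
            return self.mag_ch1 - self.mag_ch2

    Y_CANDIDATE_COLOR_CUT = 2.4  # assumed, illustrative cut for likely Y-type objects

    def flag_y_candidates(candidates):
        return [c for c in candidates if c.color >= Y_CANDIDATE_COLOR_CUT]

    # Hypothetical example values, not measurements from the paper:
    sample = [Candidate("object A", 18.9, 16.2), Candidate("object B", 17.0, 15.9)]
    print([c.name for c in flag_y_candidates(sample)])  # -> ['object A']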
