
    Freeform User Interfaces for Graphical Computing

    Report number: 甲15222; Date of degree conferral: 2000-03-29; Type of degree: Doctorate by coursework; Degree: Doctor of Engineering; Degree register number: 博工第4717号; Graduate school / department: Graduate School of Engineering, Department of Information Engineering

    How do interactive tabletop systems influence collaboration?

    This paper examines the usefulness of interactive tabletop systems and whether and how they influence collaboration. We chose a creative problem-solving task, brainstorming, as the application framework and tested several collaborative media factors: pen-and-paper tools, the "around-the-table" form factor, the digital tabletop interface, and the attractiveness of the interaction styles. Eighty subjects in total (20 groups of four members) participated in the experiments. The evaluation criteria were task performance, collaboration patterns (especially equity of contributions), and users' subjective experience. The "around-the-table" form factor, which is hypothesized to promote social comparison, increased performance and improved collaboration through greater equity. Moreover, the attractiveness of the tabletop device improved subjective experience and increased motivation to engage in the task. However, designing for attractiveness is a challenging issue, since overly attractive interfaces may distract users from the task.
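    The abstract reports "equity of contributions" as an evaluation criterion without stating how it is computed. One common way to operationalize it, shown below as a minimal sketch and not necessarily the measure used in the study, is to derive an equity score from the Gini coefficient of per-member contribution counts.

```python
# Hypothetical sketch: quantifying "equity of contributions" in a 4-person group.
# The study's exact measure is not given in the abstract; a Gini-based index is
# one common choice (equity 1.0 = perfectly equal shares).

def gini(contributions):
    """Gini coefficient of a list of non-negative contribution counts."""
    values = sorted(contributions)
    n = len(values)
    total = sum(values)
    if n == 0 or total == 0:
        return 0.0
    # Standard formula: G = (2 * sum_i i*x_i) / (n * sum_i x_i) - (n + 1) / n
    weighted = sum((i + 1) * x for i, x in enumerate(values))
    return (2 * weighted) / (n * total) - (n + 1) / n

def equity(contributions):
    """Equity score: 1.0 means all members contributed equally."""
    return 1.0 - gini(contributions)

# Example: ideas generated per member in one brainstorming group.
print(equity([12, 11, 13, 12]))  # about 0.97 -> very equitable
print(equity([30, 2, 1, 1]))     # about 0.35 -> one member dominates
```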

    Designing Hybrid Interactions through an Understanding of the Affordances of Physical and Digital Technologies

    Two recent technological advances have extended the diversity of domains and social contexts of Human-Computer Interaction: the embedding of computing capabilities into physical hand-held objects, and the emergence of large interactive surfaces, such as tabletops and wall boards. Both interactive surfaces and small computational devices usually allow for direct and space-multiplex input, i.e., for the spatial coincidence of physical action and digital output at multiple points simultaneously. Such a powerful combination opens novel opportunities for the design of what this work considers hybrid interactions. This thesis explores the affordances of physical interaction as resources for the interface design of such hybrid interactions. The hybrid systems elaborated in this work are envisioned to support specific social and physical contexts, such as collaborative cooking in a domestic kitchen or collaborative creativity in a design process. In particular, different aspects of physicality characteristic of those specific domains are explored, with the aim of promoting skill transfer across domains. First, different approaches to the design of space-multiplex, function-specific interfaces are considered and investigated. These design approaches build on related work on Graspable User Interfaces and extend the design space to direct-touch interfaces such as touch-sensitive surfaces in different sizes and orientations (i.e., tablets, interactive tabletops, and walls). The approaches are instantiated in the design of several experience prototypes, which are evaluated in different settings to assess the contextual implications of integrating aspects of physicality in the design of the interface. Such implications are observed both at the pragmatic level of interaction (i.e., patterns of users' behaviors on first contact with the interface) and at the level of users' subjective responses. The results indicate that the context of interaction affects the perception of the affordances of the system, and that some qualities of physicality, such as the 3D space of manipulation and relative haptic feedback, can affect the feeling of engagement and control. Building on these findings, two controlled studies are conducted to observe more systematically the implications of integrating some of the qualities of physical interaction into the design of hybrid ones. The results indicate that, even though several aspects of physical interaction are mimicked in the interface, the interaction with digital media is quite different and seems to reveal existing mental models and expectations resulting from previous experience with the WIMP paradigm on the desktop PC.
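    The contrast the thesis draws between space-multiplexed, function-specific input and the time-multiplexed WIMP model can be made concrete with a small sketch. The classes and names below are illustrative assumptions, not part of the thesis: one design re-binds a single pointer to the current tool, while the other binds each tracked token or touch point permanently to one function, so several functions can act at once without a mode switch.

```python
# Illustrative sketch (not from the thesis): space-multiplexed vs. time-multiplexed input.

class TimeMultiplexedInput:
    """One cursor whose meaning depends on the currently selected tool (WIMP-style)."""
    def __init__(self):
        self.current_tool = "select"

    def set_tool(self, tool):
        self.current_tool = tool          # mode switch via menu or tool palette

    def on_pointer(self, x, y):
        return f"{self.current_tool} at ({x}, {y})"


class SpaceMultiplexedInput:
    """Each physical token / touch region is permanently bound to one function."""
    def __init__(self):
        self.bindings = {}                # token_id -> function name

    def bind(self, token_id, function):
        self.bindings[token_id] = function

    def on_token(self, token_id, x, y):
        return f"{self.bindings[token_id]} at ({x}, {y})"


# Usage: two tokens can act simultaneously, with no mode switch in between.
surface = SpaceMultiplexedInput()
surface.bind("brush_token", "paint")
surface.bind("eraser_token", "erase")
print(surface.on_token("brush_token", 10, 20))
print(surface.on_token("eraser_token", 40, 5))
```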

    Nested Explorative Maps: A new 3D canvas for conceptual design in architecture

    In this digital age, architects still need to alternate between paper sketches and 3D modeling software for their designs. While 3D models allow different views to be explored, creating them at very early stages may reduce creativity, since they do not allow several tentative designs to be superposed or refined progressively, as sketches do. To enable exploratory design in 3D, we introduce Nested Explorative Maps, a new system dedicated to interactive design in architecture. Our model enables coarse-to-fine sketching of nested architectural structures: a 3D building can be progressively sketched from floor plan to interior design thanks to a series of nested maps that spread out in 3D. Each map allows the visual representation of uncertainty as well as the interactive exploration of alternative, tentative options. We validate the model through a user study conducted with professional architects, which highlights the potential of Nested Explorative Maps for conceptual design in architecture.
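    The abstract describes a hierarchy of maps, each holding tentative alternatives annotated with uncertainty and refinable into finer maps. The data-structure sketch below illustrates one plausible reading of that model; the class names, fields, and uncertainty scale are assumptions for illustration, not the paper's actual representation.

```python
# Hypothetical sketch of a "nested map" data structure: each map holds alternative
# tentative designs plus child maps that refine part of it (coarse-to-fine sketching).

from dataclasses import dataclass, field
from typing import List

@dataclass
class Alternative:
    label: str            # e.g. "L-shaped floor plan"
    uncertainty: float    # 0.0 = committed, 1.0 = very tentative (assumed scale)

@dataclass
class NestedMap:
    name: str                                              # e.g. "ground floor"
    alternatives: List[Alternative] = field(default_factory=list)
    children: List["NestedMap"] = field(default_factory=list)

    def refine(self, child_name: str) -> "NestedMap":
        """Open a finer-grained map nested inside this one."""
        child = NestedMap(child_name)
        self.children.append(child)
        return child

# Coarse-to-fine example: building massing -> floor plan -> interior of one room.
building = NestedMap("building massing")
building.alternatives.append(Alternative("two-storey volume", uncertainty=0.7))
floor = building.refine("ground floor plan")
floor.alternatives.append(Alternative("open-plan layout", uncertainty=0.5))
living_room = floor.refine("living room interior")
living_room.alternatives.append(Alternative("corner fireplace", uncertainty=0.9))
```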

    Phrasing Bimanual Interaction for Visual Design

    Architects and other visual thinkers create external representations of their ideas to support early-stage design. They compose visual imagery with sketching to form abstract diagrams as representations. When working with digital media, they apply various visual operations to transform representations, often engaging in complex sequences. This research investigates how to build interactive capabilities that support designers in putting together, that is, phrasing, sequences of operations using both hands. In particular, we examine how phrasing interactions with pen and multi-touch input can support modal switching among different visual operations that, in many commercial design tools, require menus and tool palettes—techniques originally designed for the mouse, not pen and touch. We develop an interactive bimanual pen+touch diagramming environment and study its use in landscape architecture design studio education. We observe the interesting forms of interaction that emerge and how our bimanual interaction techniques support visual design processes. Based on the needs of architects, we develop LayerFish, a new bimanual technique for layering overlapping content, and conduct a controlled experiment to evaluate its efficacy. We explore the use of wearables to identify which user, and distinguish which hand, is touching, in order to support phrasing together direct-touch interactions on large displays. From the design and development of the environment and both field and controlled studies, we derive a set of methods, based upon human bimanual specialization theory, for phrasing modal operations through bimanual interactions without menus or tool palettes.
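    One way to read the phrasing idea described above is that the non-dominant (touch) hand holds a mode only while its fingers stay down, and the dominant (pen) hand acts within that mode, so no menu round-trip is needed. The event-loop sketch below illustrates that reading; the chord-to-operation mapping and class names are invented for illustration and are not taken from the dissertation.

```python
# Illustrative sketch of bimanual "phrasing": a non-dominant-hand touch chord holds a
# mode for as long as the fingers stay down; pen strokes are interpreted in that mode.

CHORD_TO_MODE = {
    0: "ink",        # no touch held: pen just draws
    1: "pan",        # one finger held: pen strokes pan the canvas
    2: "layer",      # two fingers held: pen strokes reorder layers
}

class BimanualPhrasing:
    def __init__(self):
        self.held_fingers = 0

    def touch_down(self):
        self.held_fingers += 1                                # phrase begins/extends

    def touch_up(self):
        self.held_fingers = max(0, self.held_fingers - 1)     # phrase ends on release

    def pen_stroke(self, points):
        mode = CHORD_TO_MODE.get(self.held_fingers, "ink")
        return f"{mode} operation over {len(points)} points"

# Usage: the mode lives only while the touch chord is held -- no menu round-trip.
session = BimanualPhrasing()
print(session.pen_stroke([(0, 0), (5, 5)]))   # "ink operation ..."
session.touch_down()
print(session.pen_stroke([(0, 0), (5, 5)]))   # "pan operation ..."
session.touch_up()
```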

    Design and semantics of form and movement (DeSForM 2006)

    Design and Semantics of Form and Movement (DeSForM) grew from applied research exploring emerging design methods and practices to support new-generation product and interface design. The products and interfaces concern the context of ubiquitous computing and ambient technologies, and the need for greater empathy in the pre-programmed behaviour of the 'machines' that populate our lives. This explorative research in the CfDR has been led by Young, supported by Kyffin, Visiting Professor from Philips Design, and sponsored by Philips Design over a period of four years (research funding £87k). DeSForM1 was the first of a series of three conferences that enable the presentation and debate of international work within this field:
    • 1st European conference on Design and Semantics of Form and Movement (DeSForM1), Baltic, Gateshead, 2005, Feijs L., Kyffin S. & Young R.A., eds.
    • 2nd European conference on Design and Semantics of Form and Movement (DeSForM2), Evoluon, Eindhoven, 2006, Feijs L., Kyffin S. & Young R.A., eds.
    • 3rd European conference on Design and Semantics of Form and Movement (DeSForM3), New Design School Building, Newcastle, 2007, Feijs L., Kyffin S. & Young R.A., eds.
    Philips' sponsorship of practice-based enquiry led to research by three teams of research students over three years and to on-going sponsorship of research through the Northumbria University Design and Innovation Laboratory (nuDIL). Young has been invited onto the steering panel of the UK Thinking Digital Conference, which concerns the latest developments in digital and media technologies. Informed by this research is the work of PhD student Yukie Nakano, who examines new technologies in relation to eco-design textiles.

    HandPainter – 3D sketching in VR with hand-based physical proxy

    3D sketching in virtual reality (VR) enables users to create 3D virtual objects intuitively and immersively. However, previous studies showed that mid-air drawing may lead to inaccurate sketches. To address this issue, we propose using one hand as a canvas proxy and the index finger of the other hand as a 3D pen. To this end, we first perform a formative study comparing two-handed interaction with tablet-pen interaction for VR sketching. Based on the findings of this study, we design HandPainter, a VR sketching system that focuses on the direct use of two hands for 3D sketching without requiring any tablet, pen, or VR controller. Our implementation is based on a pair of VR gloves, which provide hand tracking and gesture capture. We devise a set of intuitive gestures to control the various functionalities required during 3D sketching, such as canvas panning and drawing positioning. We show the effectiveness of HandPainter by presenting a number of sketching results and discussing the outcomes of a user study comparing it with mid-air drawing and tablet-based sketching tools.
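    The core geometric mapping described in this abstract, treating the non-dominant palm as a drawing plane and the dominant index fingertip as a pen, can be sketched in a few lines. The function below is a minimal illustration under assumed inputs; the glove-tracking interface and the contact threshold are assumptions, not published parameters of HandPainter.

```python
# Hypothetical sketch: project the dominant index fingertip onto the non-dominant
# palm plane, and treat proximity to the plane as "pen down". Poses are assumed to
# come from glove-based hand tracking; all numbers here are made up.

import numpy as np

def project_onto_palm(fingertip, palm_center, palm_normal, contact_threshold=0.015):
    """Return (point_on_palm_plane, is_drawing).

    fingertip, palm_center: 3D positions in metres; palm_normal: unit normal of the
    palm plane. Drawing is active when the fingertip is within ~1.5 cm of the plane
    (an assumed threshold).
    """
    offset = fingertip - palm_center
    distance = float(np.dot(offset, palm_normal))      # signed distance to the plane
    projected = fingertip - distance * palm_normal     # foot of the perpendicular
    return projected, abs(distance) < contact_threshold

# Example frame: fingertip hovering 1 cm above the palm plane -> counts as drawing.
palm_center = np.array([0.0, 1.2, 0.4])
palm_normal = np.array([0.0, 1.0, 0.0])
fingertip = np.array([0.03, 1.21, 0.42])
point, drawing = project_onto_palm(fingertip, palm_center, palm_normal)
print(point, drawing)   # projected point on the palm plane, True
```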