
    Phrasing Bimanual Interaction for Visual Design

    Architects and other visual thinkers create external representations of their ideas to support early-stage design. They compose visual imagery with sketching to form abstract diagrams as representations. When working with digital media, they apply various visual operations to transform representations, often engaging in complex sequences. This research investigates how to build interactive capabilities to support designers in putting together, that is phrasing, sequences of operations using both hands. In particular, we examine how phrasing interactions with pen and multi-touch input can support modal switching among different visual operations that in many commercial design tools require using menus and tool palettes (techniques originally designed for the mouse, not pen and touch). We develop an interactive bimanual pen+touch diagramming environment and study its use in landscape architecture design studio education. We observe interesting forms of interaction that emerge, and how our bimanual interaction techniques support visual design processes. Based on the needs of architects, we develop LayerFish, a new bimanual technique for layering overlapping content. We conduct a controlled experiment to evaluate its efficacy. We explore the use of wearables to identify which user, and distinguish what hand, is touching to support phrasing together direct-touch interactions on large displays. From design and development of the environment and both field and controlled studies, we derive a set of methods, based upon human bimanual specialization theory, for phrasing modal operations through bimanual interactions without menus or tool palettes.
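As a rough illustration of the bimanual phrasing idea described in this abstract, the sketch below models a pen tool whose mode is held only while the non-dominant hand touches a mode region, so that a menu or tool-palette selection is never needed. The event model, the mode names, and the PhrasingController class are assumptions made for illustration, not the thesis' actual implementation.

```typescript
// Hypothetical sketch: the non-dominant hand's touch "phrases" a sequence of
// pen strokes into a mode, replacing a menu or tool-palette selection.
// All names here are illustrative, not the thesis' actual API.

type Mode = "ink" | "erase" | "select" | "layer";

interface TouchDown { kind: "touchDown"; region: Mode }               // non-dominant hand
interface TouchUp   { kind: "touchUp" }
interface PenStroke { kind: "penStroke"; points: [number, number][] } // dominant hand
type InputEvent = TouchDown | TouchUp | PenStroke;

class PhrasingController {
  private mode: Mode = "ink";   // default mode when no touch is held
  private held = false;         // true while the non-dominant hand is down

  handle(ev: InputEvent): void {
    switch (ev.kind) {
      case "touchDown":         // touching a region enters that mode...
        this.mode = ev.region;
        this.held = true;
        break;
      case "touchUp":           // ...and lifting the hand ends the phrase
        this.mode = "ink";
        this.held = false;
        break;
      case "penStroke":         // pen strokes are interpreted in the held mode
        console.log(`stroke of ${ev.points.length} points interpreted as '${this.mode}'` +
                    (this.held ? " (phrased by non-dominant hand)" : " (default)"));
        break;
    }
  }
}

// Usage: hold "select" with the non-dominant hand, draw a stroke, release.
const controller = new PhrasingController();
controller.handle({ kind: "touchDown", region: "select" });
controller.handle({ kind: "penStroke", points: [[0, 0], [10, 10]] });
controller.handle({ kind: "touchUp" });
controller.handle({ kind: "penStroke", points: [[5, 5], [15, 5]] });
```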

    Physical Interaction Concepts for Knowledge Work Practices

    The majority of workplaces in developed countries concern knowledge work. Accordingly, the IT industry and research community have made great efforts for many years to support knowledge workers -- and indeed, computer-based information workplaces have come of age. Nevertheless, knowledge work in the physical world still has quite a number of unique advantages, and the integration of physical and digital knowledge work leaves a lot to be desired. The present thesis aims at reducing these deficiencies; thereby, it leverages recent technology trends, in particular interactive tabletops and resizable hand-held displays. We start from the observation that knowledge workers develop highly efficient practices, skills, and dexterity of working with physical objects in the real world, whether content-unrelated (coffee mugs, stationery etc.) or content-related (books, notepads etc.). Among the latter, paper-based objects -- the notorious analog information bearers -- represent by far the most relevant (super-) category. We discern two kinds of practices: collective practices concern the arrangement of objects with respect to other objects and the desk, while specific practices operate on individual objects and usually alter them. The former are mainly employed for an effective management of the physical desktop workspace -- e.g., everyday objects are frequently moved on tables to optimize the desk as a workplace -- or an effective organization of paper-based documents on the desktop -- e.g., stacking, fanning out, sorting etc. The latter concern the specific manipulation of physical objects related to the task at hand, i.e. knowledge work. Widespread assimilated practices concern not only writing on, annotating, or spatially arranging paper documents but also sophisticated manipulations -- such as flipping, folding, bending, etc. Compared to the wealth of such well-established practices in the real world, those for digital knowledge work are bound by the indirection imposed by mouse and keyboard input, where the mouse provided such a great advancement that researchers were seduced into calling its use "direct manipulation". In this light, the goal of this thesis can be rephrased as exploring novel interaction concepts for knowledge workers that i) exploit the flexible and direct manipulation potential of physical objects (as present in the real world) for more intuitive and expressive interaction with digital content, and ii) improve the integration of the physical and digital knowledge workplace. To this end, two directions of research are pursued. Firstly, the thesis investigates the collective practices executed on the desks of knowledge workers, thereby discerning content-related (more precisely, paper-based documents) and content-unrelated objects -- this part is coined as table-centric approaches and leverages the technology of interactive tabletops. Secondly, the thesis looks at specific practices executed on paper, concentrating on knowledge-related tasks due to the specific role of paper -- this part is coined as paper-centric approaches and leverages the affordances of paper-like displays, more precisely of resizable, i.e. rollable and foldable, displays. The table-centric approach leads to the challenge of blending interactive tabletop technology with the established use of physical desktop workspaces.
We first conduct an exploratory user study to investigate behavioral and usage patterns of interaction with both physical and digital documents on tabletop surfaces while performing tasks such as grouping and browsing. Based on the results of the study, we contribute two sets of interaction and visualization concepts -- coined as PaperTop and ObjecTop -- that concern specific paper-based practices and collective practices, respectively. Their efficiency and effectiveness are evaluated in a series of user studies. As mentioned, the paper-centric perspective leverages recent ultra-thin resizable display technology. We again contribute two sets of novel interaction concepts -- coined as FoldMe and Xpaaand -- that respond to the design spaces of dual-sided foldable and of rollout displays, respectively. In their design, we leverage the physical act of resizing not "just" for adjusting the screen real estate but also for interactively performing operations. Initial user studies show a great potential for interaction with digital content, i.e. for knowledge work.
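To make the idea of using the physical act of resizing for more than adjusting screen real estate concrete, here is a small hedged sketch: a rollout display whose change in width both rescales the viewport and, beyond a threshold, triggers an operation on the displayed document. The thresholds, the column heuristic, and the interpretRollout function are invented for illustration and do not describe the actual FoldMe or Xpaaand implementations.

```typescript
// Illustrative sketch only: a rollout display whose physical width change is
// interpreted both as viewport resizing and as an interactive operation.

interface Viewport { widthMm: number; visibleColumns: number }

type RolloutAction = "resizeOnly" | "revealHistory" | "collapseToSummary";

// Hypothetical policy: small changes just resize; a large extension or
// retraction additionally triggers an operation on the displayed document.
function interpretRollout(previous: Viewport, newWidthMm: number): { viewport: Viewport; action: RolloutAction } {
  const delta = newWidthMm - previous.widthMm;
  const visibleColumns = Math.max(1, Math.round(newWidthMm / 60)); // assume ~60 mm per text column
  const viewport = { widthMm: newWidthMm, visibleColumns };

  if (delta > 80) return { viewport, action: "revealHistory" };      // pulled far out
  if (delta < -80) return { viewport, action: "collapseToSummary" }; // pushed far in
  return { viewport, action: "resizeOnly" };
}

// Usage: extending the display from 120 mm to 230 mm resizes it and reveals history.
console.log(interpretRollout({ widthMm: 120, visibleColumns: 2 }, 230));
```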

    Style Blink: Exploring Digital Inking of Structured Information via Handcrafted Styling as a First-Class Object

    Structured note-taking forms such as sketchnoting, self-tracking journals, and bullet journaling go beyond immediate capture of information scraps. Instead, hand-drawn pride in craftsmanship increases perceived value for sharing and display. But hand-crafting lists, tables, and calendars is tedious and repetitive. To support these practices digitally, Style Blink ("Style-Blocks+Ink") explores handcrafted styling as a first-class object. Style-blocks encapsulate digital ink, enabling people to craft, modify, and reuse embellishments and decorations for larger structures, and apply custom layouts. For example, we provide interaction instruments that style ink for personal expression, inking palettes that afford creative experimentation, fillable pens that can be "loaded" with commands and actions to replace menu selections, and techniques to customize inked structures post-creation by modifying the underlying handcrafted style-blocks and to re-layout the overall structure to match users' preferred template. In effect, any ink stroke, notation, or sketch can be encapsulated as a style-object and re-purposed as a tool. Feedback from 13 users shows the potential of style adaptation and re-use in individual sketching practices.
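To clarify what treating handcrafted styling as a first-class object might look like in code, the following sketch encapsulates ink strokes as a reusable style-block that can be stamped onto the cells of a larger structure. The StyleBlock interface and applyToGrid function are assumptions made for illustration, not Style Blink's actual implementation.

```typescript
// Hedged sketch: digital ink encapsulated as a reusable "style-block" that can
// be re-applied to structured content (here, a simple grid layout).
// Names and shapes are illustrative, not taken from the Style Blink system.

interface Point { x: number; y: number }
interface InkStroke { points: Point[] }

interface StyleBlock {
  name: string;
  strokes: InkStroke[]; // the handcrafted embellishment, e.g. a sketchy cell border
  width: number;        // nominal size the strokes were drawn at
  height: number;
}

// Stamp a style-block into every cell of a rows x cols grid by translating
// (and not otherwise altering) its strokes.
function applyToGrid(block: StyleBlock, rows: number, cols: number): InkStroke[] {
  const placed: InkStroke[] = [];
  for (let r = 0; r < rows; r++) {
    for (let c = 0; c < cols; c++) {
      for (const stroke of block.strokes) {
        placed.push({
          points: stroke.points.map(p => ({ x: p.x + c * block.width, y: p.y + r * block.height })),
        });
      }
    }
  }
  return placed;
}

// Usage: a hand-drawn box reused as the styling for a 3x2 table skeleton.
const handDrawnBox: StyleBlock = {
  name: "sketchy-box",
  width: 100,
  height: 40,
  strokes: [{ points: [{ x: 0, y: 0 }, { x: 100, y: 2 }, { x: 98, y: 40 }, { x: 1, y: 38 }, { x: 0, y: 0 }] }],
};
console.log(`placed ${applyToGrid(handDrawnBox, 3, 2).length} strokes`);
```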

    Remote tactile feedback on interactive surfaces

    Direct touch input on interactive surfaces has become a predominant standard for the manipulation of digital information in our everyday lives. However, compared to our rich interchange with the physical world, the interaction with touch-based systems is limited in terms of flexibility of input and expressiveness of output. In particular, the lack of tactile feedback greatly reduces the general usability of a touch-based system and prevents a productive integration of virtual information with the physical world. This thesis proposes remote tactile feedback as a novel method to provide programmed tactile stimuli supporting direct touch interactions. The overall principle is to spatially decouple the location of touch input (e.g. fingertip or hand) and the location of the tactile sensation on the user's body (e.g. forearm or back). Remote tactile feedback is an alternative concept which avoids particular challenges of existing approaches. Moreover, the principle provides inherent characteristics which can accommodate the requirements of current and future touch interfaces. To define the design space, the thesis provides a structured overview of current forms of touch surfaces and identifies trends towards non-planar and non-rigid forms with more versatile input mechanisms. Furthermore, a classification highlights limitations of the current methods to generate tactile feedback on touch-based systems. The proposed notion of tactile sensory relocation is a form of sensory substitution. Underlying neurological and psychological principles corroborate the approach. Thus, characteristics of the human sense of touch and principles from sensory substitution help to create a technical and conceptual framework for remote tactile feedback. Three consecutive user studies measure and compare the effects of both direct and remote tactile feedback on the performance and the subjective ratings of the user. Furthermore, the experiments investigate different body locations for the application of tactile stimuli. The results show high subjective preferences for tactile feedback, regardless of its type of application. Additionally, the data reveals no significant differences between the effects of direct and remote stimuli. The results support the feasibility of the approach and provide parameters for the design of stimuli and the effective use of the concept. The main part of the thesis describes the systematic exploration and analysis of the inherent characteristics of remote tactile feedback. Four specific features of the principle are identified: (1) the simplification of the integration of cutaneous stimuli, (2) the transmission of proactive, reactive and detached feedback, (3) the increased expressiveness of tactile sensations and (4) the provision of tactile feedback during multi-touch. In each class, several prototypical remote tactile interfaces are used in evaluations to analyze the concept. For example, the PhantomStation utilizes psychophysical phenomena to reduce the number of individual tactile actuators. An evaluation with the prototype compares standard actuator technologies with each other in order to enable simple and scalable implementations. The ThermalTouch prototype creates remote thermal stimuli to reproduce material characteristics on standard touchscreens. The results show a stable rate of virtual object discrimination based on remotely applied temperature profiles.
The AutomotiveRTF system is implemented in a vehicle and supports the driver's input on the in-vehicle infotainment system. A field study with the system focuses on evaluating the effects of proactive and reactive feedback on the user's performance. The main contributions of the dissertation are: First, the thesis introduces the principle of remote tactile feedback and defines a design space for this approach as an alternative method to provide non-visual cues on interactive surfaces. Second, the thesis describes technical examples to rapidly prototype remote tactile feedback systems. Third, these prototypes are deployed in several evaluations which highlight the beneficial subjective and objective effects of the approach. Finally, the thesis presents features and inherent characteristics of remote tactile feedback as a means to support the interaction on today's touchscreens and future interactive surfaces.
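The core principle, decoupling where the touch happens from where the tactile stimulus is felt, can be sketched as a small routing layer. The actuator sites, the proactive versus reactive distinction as modeled here, and the routeFeedback function are hypothetical and only illustrate the concept, not the thesis' prototypes.

```typescript
// Illustrative sketch of remote tactile feedback: touch events on the surface
// are mapped to actuators worn elsewhere on the body. All identifiers are
// assumptions for illustration, not the thesis' actual systems.

type BodySite = "leftForearm" | "rightForearm" | "back";

interface SurfaceTouch { finger: number; x: number; y: number; phase: "approach" | "activate" }
interface TactilePulse { site: BodySite; intensity: number; durationMs: number; kind: "proactive" | "reactive" }

// Hypothetical mapping: each finger is assigned its own actuator site so that
// feedback stays distinguishable during multi-touch.
const fingerToSite: BodySite[] = ["rightForearm", "leftForearm", "back"];

function routeFeedback(ev: SurfaceTouch): TactilePulse {
  const site = fingerToSite[ev.finger % fingerToSite.length];
  // Proactive feedback announces a nearby target during approach;
  // reactive feedback confirms a completed activation.
  return ev.phase === "approach"
    ? { site, intensity: 0.4, durationMs: 30, kind: "proactive" }
    : { site, intensity: 0.9, durationMs: 80, kind: "reactive" };
}

// Usage: two fingers in different phases receive stimuli at different body sites.
console.log(routeFeedback({ finger: 0, x: 120, y: 300, phase: "approach" }));
console.log(routeFeedback({ finger: 1, x: 480, y: 210, phase: "activate" }));
```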

    Digital tabletops and collaborative learning

    People collaborate around tables at home, school and work. Digital tabletop technology presents an opportunity to bring computer support to these traditional face-to-face collaborative settings. This thesis principally addresses the challenge of designing digital tabletop applications for small group learning in the classroom and makes contributions in two distinct, but closely related areas: (i) interaction techniques for digital tabletops; and (ii) the design and evaluation of a digital tabletop-based system for supporting collaborative learning. A review of previous literature combined with a preliminary observational study on collaboration around traditional tables identifies a number of requirements for tabletop interaction. These include the need for fluid interaction techniques that allow control of interface object attributes when these objects are moved between tabletop territories. Attribute gates are proposed as a solution to this problem through a novel crossing-based interaction technique. A recognition of the territorial focus in existing interaction techniques, and their limiting assumption that users work at relatively fixed locations around the table, led to the identification of another challenge: supporting the mobility of users around the shared workspace of the table. TANGISOFT is presented as a hybrid tangible-soft keyboard designed specifically for applications with mobile users and moderate text entry requirements. The investigation of the potential of tabletop technology to support collaborative learning was carried out through the design, development, and evaluation of Digital Mysteries. From an interaction design perspective, the design aimed to utilize the unique affordances of tabletops in terms of combining the benefits of traditional tables and digital technology. From a learning perspective, the design aimed to support higher-level thinking skills, feedback, reflection, and metacognition by focusing on activities that promote these skills and supporting effective collaboration. The evaluation of Digital Mysteries demonstrated that the design was successful in encouraging the targeted learning activities. The design process and validation of Digital Mysteries embody a significant contribution to the development of our understanding of digital tabletop technology at the application level, and collaborative learning applications in particular. This understanding is summarized in the form of general guidelines for designing collaborative learning applications for digital tabletop technology.
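Attribute gates, as described above, change an object's attributes when its drag path crosses a gate placed between tabletop territories. The segment-crossing test below is a generic sketch of that idea; the gate geometry, the ownership attribute, and the dragStep function are invented for illustration rather than taken from the thesis.

```typescript
// Sketch of a crossing-based "attribute gate": dragging an object across the
// gate segment flips an attribute (here, ownership) without opening a menu.
// Geometry and attribute names are illustrative assumptions.

interface Pt { x: number; y: number }
interface Gate { a: Pt; b: Pt; attribute: "owner"; value: string }
interface Item { id: string; owner: string }

// Signed area of the triangle (p, q, r); its sign gives the turn direction.
const cross = (p: Pt, q: Pt, r: Pt) => (q.x - p.x) * (r.y - p.y) - (q.y - p.y) * (r.x - p.x);

// True if segments p1-p2 and q1-q2 properly intersect.
function segmentsCross(p1: Pt, p2: Pt, q1: Pt, q2: Pt): boolean {
  const d1 = cross(q1, q2, p1), d2 = cross(q1, q2, p2);
  const d3 = cross(p1, p2, q1), d4 = cross(p1, p2, q2);
  return d1 * d2 < 0 && d3 * d4 < 0;
}

// Apply the gate's attribute when the drag step from `from` to `to` crosses it.
function dragStep(item: Item, from: Pt, to: Pt, gate: Gate): Item {
  return segmentsCross(from, to, gate.a, gate.b) ? { ...item, owner: gate.value } : item;
}

// Usage: dragging a note across a gate between two territories reassigns it.
const gate: Gate = { a: { x: 500, y: 0 }, b: { x: 500, y: 800 }, attribute: "owner", value: "alice" };
console.log(dragStep({ id: "note-7", owner: "bob" }, { x: 480, y: 100 }, { x: 530, y: 120 }, gate));
```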

    Bringing the Physical to the Digital

    This dissertation describes an exploration of digital tabletop interaction styles, with the ultimate goal of informing the design of a new model for tabletop interaction. In the context of this thesis the term digital tabletop refers to an emerging class of devices that afford many novel ways of interaction with the digital, allowing users to directly touch information presented on large, horizontal displays. As the field is relatively young, many developments are in flux; hardware and software change at a fast pace and many interesting alternative approaches are available at the same time. In our research we are especially interested in systems that are capable of sensing multiple contacts (e.g., fingers) and richer information such as the outline of whole hands or other physical objects. New sensor hardware enables new ways to interact with the digital. When we embarked on the research for this thesis, which interaction styles would be appropriate for this new class of devices was an open question, with many equally promising answers. Many everyday activities rely on our hands' ability to skillfully control and manipulate physical objects. We seek to open up different possibilities to exploit our manual dexterity and provide users with richer interaction possibilities. This could be achieved through the use of physical objects as input mediators or through virtual interfaces that behave in a more realistic fashion. In order to gain a better understanding of the underlying design space we chose an approach organized into two phases. First, two different prototypes, each representing a specific interaction style, namely gesture-based interaction and tangible interaction, were implemented. The flexibility of use afforded by the interface and the level of physicality afforded by the interface elements are introduced as criteria for evaluation. Each approach's suitability to support the highly dynamic and often unstructured interactions typical for digital tabletops is analyzed based on these criteria. In a second stage the lessons from these initial explorations are applied to inform the design of a novel model for digital tabletop interaction. This model is based on the combination of rich multi-touch sensing and a three-dimensional environment enriched by a gaming physics simulation. The proposed approach enables users to interact with the virtual through richer quantities such as collision and friction, enabling a variety of fine-grained interactions using multiple fingers, whole hands and physical objects. Our model makes digital tabletop interaction even more "natural". However, because the interaction (the sensed input and the displayed output) is still bound to the surface, there is a fundamental limitation in manipulating objects using the third dimension. To address this issue, we present a technique that allows users to conceptually pick objects off the surface and control their position in 3D. Our goal has been to define a technique that completes our model for on-surface interaction and allows for "as-direct-as-possible" interactions. We also present two hardware prototypes capable of sensing the users' interactions beyond the table's surface. Finally, we present visual feedback mechanisms to give the users the sense that they are actually lifting the objects off the surface. This thesis contributes on various levels. We present several novel prototypes that we built and evaluated.
We use these prototypes to systematically explore the design space of digital tabletop interaction. The flexibility of use afforded by the interaction style is introduced as a criterion alongside the user interface elements' physicality. Each approach's suitability to support the highly dynamic and often unstructured interactions typical for digital tabletops is analyzed. We present a new model for tabletop interaction that increases the fidelity of interaction possible in such settings. Finally, we extend this model to enable as-direct-as-possible interactions with 3D data from above the table's surface.
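The interaction model described above drives virtual objects through quantities such as collision and friction rather than through abstract pointer commands. The toy simulation step below illustrates that shift; its data structures and constants are invented for illustration and stand in for, rather than reproduce, the thesis' physics-engine-based implementation.

```typescript
// Toy sketch: touch contacts act as rigid proxies that push virtual objects
// through collision and friction, instead of issuing explicit "move" commands.
// All structures and constants are illustrative assumptions.

interface Contact { x: number; y: number; vx: number; vy: number; radius: number }
interface Body { x: number; y: number; vx: number; vy: number; radius: number }

const FRICTION = 0.92; // per-step velocity damping, an arbitrary illustrative value

function step(body: Body, contacts: Contact[], dt: number): Body {
  let { x, y, vx, vy } = body;
  for (const c of contacts) {
    const dx = x - c.x, dy = y - c.y;
    const dist = Math.hypot(dx, dy);
    const minDist = body.radius + c.radius;
    if (dist > 0 && dist < minDist) {
      // Overlapping contact: transfer the contact's speed along the push direction.
      const speed = Math.hypot(c.vx, c.vy);
      vx += (dx / dist) * speed;
      vy += (dy / dist) * speed;
    }
  }
  // Friction gradually brings the body to rest once contacts are lifted.
  vx *= FRICTION; vy *= FRICTION;
  return { ...body, x: x + vx * dt, y: y + vy * dt, vx, vy };
}

// Usage: a finger moving rightwards grazes a disc and nudges it away.
const nudged = step(
  { x: 100, y: 100, vx: 0, vy: 0, radius: 30 },
  [{ x: 75, y: 100, vx: 200, vy: 0, radius: 8 }],
  1 / 60,
);
console.log(nudged);
```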

    Designing Hybrid Interactions through an Understanding of the Affordances of Physical and Digital Technologies

    Two recent technological advances have extended the diversity of domains and social contexts of Human-Computer Interaction: the embedding of computing capabilities into physical hand-held objects, and the emergence of large interactive surfaces, such as tabletops and wall boards. Both interactive surfaces and small computational devices usually allow for direct and space-multiplex input, i.e., for the spatial coincidence of physical action and digital output, in multiple points simultaneously. Such a powerful combination opens novel opportunities for the design of what are considered as hybrid interactions in this work. This thesis explores the affordances of physical interaction as resources for interface design of such hybrid interactions. The hybrid systems that are elaborated in this work are envisioned to support specific social and physical contexts, such as collaborative cooking in a domestic kitchen, or collaborative creativity in a design process. In particular, different aspects of physicality characteristic of those specific domains are explored, with the aim of promoting skill transfer across domains. First, different approaches to the design of space-multiplex, function-specific interfaces are considered and investigated. Such design approaches build on related work on Graspable User Interfaces and extend the design space to direct touch interfaces such as touch-sensitive surfaces, in different sizes and orientations (i.e., tablets, interactive tabletops, and walls). These approaches are instantiated in the design of several experience prototypes, which are evaluated in different settings to assess the contextual implications of integrating aspects of physicality in the design of the interface. Such implications are observed both at the pragmatic level of interaction (i.e., patterns of users' behaviors on first contact with the interface), as well as at the level of users' subjective responses. The results indicate that the context of interaction affects the perception of the affordances of the system, and that some qualities of physicality such as the 3D space of manipulation and relative haptic feedback can affect the feeling of engagement and control. Building on these findings, two controlled studies are conducted to observe more systematically the implications of integrating some of the qualities of physical interaction into the design of hybrid ones. The results indicate that, despite the fact that several aspects of physical interaction are mimicked in the interface, the interaction with digital media is quite different and seems to reveal existing mental models and expectations resulting from previous experience with the WIMP paradigm on the desktop PC.
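The notion of space-multiplexed, function-specific input used in this abstract can be contrasted with a time-multiplexed pointer in a short sketch. The control names and the two small functions below are illustrative assumptions, not a system described in the thesis.

```typescript
// Hedged sketch contrasting time-multiplexed input (one pointer re-bound to
// functions over time) with space-multiplexed input (each function has its own
// physical control, and several can be acted on at once). Names are illustrative.

type FunctionId = "volume" | "zoom" | "rotation";

// Time-multiplexed: a single pointer must first acquire a tool, then act.
interface Pointer { acquired: FunctionId | null }
function pointerAct(p: Pointer, value: number): string {
  return p.acquired ? `${p.acquired} set to ${value}` : "nothing acquired yet";
}

// Space-multiplexed: each tangible control is permanently bound to a function,
// so several can be manipulated simultaneously by different hands or users.
interface TangibleControl { id: string; bound: FunctionId; value: number }
function applyControls(controls: TangibleControl[]): string[] {
  return controls.map(c => `${c.bound} set to ${c.value} via ${c.id}`);
}

// Usage: two controls adjusted at the same time vs. one pointer acting in sequence.
console.log(pointerAct({ acquired: "zoom" }, 1.5));
console.log(applyControls([
  { id: "left-knob", bound: "volume", value: 0.7 },
  { id: "right-puck", bound: "rotation", value: 45 },
]));
```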

    The tool space

    Visions of futuristic desktop computer workspaces have often incorporated large interactive surfaces that either integrate into or replace the prevailing desk setup with displays, keyboard and mouse. Such visions often connote the distinct characteristics of direct touch interaction, e.g. by transforming the desktop into a large touch screen that allows interacting with content using one's bare hands. However, the role of interactive surfaces for desktop computing may not be restricted to enabling direct interaction. Especially for prolonged interaction times, the separation of visual focus and manual input has proven to be ergonomic and is usually supported by vertical monitors and separate, hence indirect, input devices placed on the horizontal desktop. If we want to maintain this ergonomically matured style of computing with the introduction of interactive desktop displays, the following question arises: How can and should this novel input and output modality affect prevailing interaction techniques? While touch input devices have been used for decades in desktop computing as trackpads or graphics tablets, the dynamic rendering of content and increasing physical dimensions of novel interactive surfaces open up new design opportunities for direct, indirect and hybrid touch input techniques. Informed design decisions require a careful consideration of the relationship between input sensing, visual display and applied interaction styles. Previous work in the context of desktop computing has focused on understanding the dual-surface setup as a holistic unit that supports direct touch input and allows the seamless transfer of objects across horizontal and vertical surfaces. In contrast, this thesis assumes separate spaces for input (horizontal input space) and output (vertical display space) and contributes to the understanding of how interactive surfaces can enrich indirect input for complex tasks, such as 3D modeling or audio editing. The contribution of this thesis is threefold: First, we present a set of case studies on user interface design for dual-surface computer workspaces. These case studies cover several application areas such as gaming, music production and analysis or collaborative visual layout and comprise formative evaluations. On the one hand, these case studies highlight the conflict that arises when the direct touch interaction paradigm is applied to dual-surface workspaces. On the other hand, they indicate how the deliberate avoidance of established input devices (i.e. mouse and keyboard) leads to novel design ideas for indirect touch-based input. Second, we introduce our concept of the tool space as an interaction model for dual-surface workspaces, which is derived from a theoretical argument and the previous case studies. The tool space dynamically renders task-specific input areas that enable spatial command activation and increase input bandwidth through leveraging multi-touch and two-handed input. We further present evaluations of two concept implementations in the domains of 3D modeling and audio editing, which demonstrate the high degree of control, precision and sense of directness that can be achieved with our tools. Third, we present experimental results that inform the design of the tool space input areas.
In particular, we contribute a set of design recommendations regarding the understanding of two-handed indirect multi-touch input and the impact of input area form factors on spatial cognition and navigation performance.
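A compact way to picture the tool space concept is as task-specific input regions rendered on the horizontal surface, each translating touches into commands for the application shown on the vertical display. The region layout, command names, and routeTouch function below are hypothetical and only sketch the idea; they are not the thesis' implementation.

```typescript
// Sketch of the tool-space idea: the horizontal surface renders task-specific
// input areas, and touches on them are routed to commands for the vertical display.
// Region layout and command names are illustrative assumptions.

interface Rect { x: number; y: number; w: number; h: number }
interface ToolArea { name: string; bounds: Rect; command: (localX: number, localY: number) => string }

const contains = (r: Rect, x: number, y: number) =>
  x >= r.x && x < r.x + r.w && y >= r.y && y < r.y + r.h;

// A hypothetical layout for a 3D-modeling session: an orbit pad for the
// non-dominant hand and a parameter slider for the dominant hand.
const toolSpace: ToolArea[] = [
  {
    name: "orbit-pad",
    bounds: { x: 0, y: 0, w: 400, h: 400 },
    command: (lx, ly) => `orbit camera yaw=${lx.toFixed(0)} pitch=${ly.toFixed(0)}`,
  },
  {
    name: "extrude-slider",
    bounds: { x: 450, y: 0, w: 600, h: 120 },
    command: lx => `set extrusion depth to ${(lx / 600).toFixed(2)}`,
  },
];

// Route a touch on the horizontal surface to the input area under it, if any.
function routeTouch(x: number, y: number): string {
  const area = toolSpace.find(a => contains(a.bounds, x, y));
  return area ? area.command(x - area.bounds.x, y - area.bounds.y) : "touch outside any tool area";
}

// Usage: two-handed, indirect input on separate areas at the same time.
console.log(routeTouch(120, 200)); // left hand orbits the camera
console.log(routeTouch(750, 60));  // right hand adjusts a parameter
```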