    Creating Your Bubble: Personal Space On and Around Large Public Displays

    We describe an empirical study that explores how users establish and use personal space around large public displays (LPDs). Our study complements field studies in this space by more fully characterizing interpersonal distances based on coupling and confirms the use of on-screen territories on vertical displays. Finally, we discuss implications for future research: limitations of proxemics and territoriality, how user range can augment existing theory, and the influence of display size on personal space.
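
The interpersonal distances this study characterizes build on classic proxemics. As a rough illustration (not the paper's own data), Hall's textbook proxemic zones can be expressed as a simple classifier; the thresholds below are Hall's commonly cited values in metres, not figures from the study:

```python
# Classify an interpersonal distance into Hall's classic proxemic zones.
# Thresholds are Hall's widely cited textbook values (in metres),
# not measurements from this study.
def proxemic_zone(distance_m: float) -> str:
    if distance_m < 0:
        raise ValueError("distance must be non-negative")
    if distance_m < 0.45:
        return "intimate"
    if distance_m < 1.2:
        return "personal"
    if distance_m < 3.6:
        return "social"
    return "public"
```

For example, `proxemic_zone(0.8)` falls in the personal zone; the study's contribution is precisely that coupling between users shifts where such boundaries effectively lie around large displays.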

    Distributed UI on Interactive tabletops: issues and context model

    UI distribution can also be applied to interactive tabletops that are connected and more or less remote from one another. This distribution raises issues concerning collaboration (how should the UI be distributed to support collaboration?) and tangible interaction (what role and appearance, tangible or virtual, should the objects have?). In this chapter we describe an extended context model that accounts both for interactions on a single interactive tabletop and for interactions that are distributed and collaborative. In our view, the proposed model can be used to ensure the usability of the interaction: knowing the interaction configuration is essential to guaranteeing the usability of the system. The model is illustrated in a case study integrating collaboration and UI distribution. A conclusion discusses the limits of the chapter before presenting prospects for future work.

    Blended Interaction Spaces for Distributed Team Collaboration

    Designing for Cross-Device Interactions

    Driven by technological advancements, we now own and operate an ever-growing number of digital devices, leading to an increased amount of digital data we produce, use, and maintain. However, while there is a substantial increase in computing power and availability of devices and data, many tasks we conduct with our devices are not well connected across multiple devices. We conduct our tasks sequentially instead of in parallel, while collaborative work across multiple devices is cumbersome to set up or simply not possible. To address these limitations, this thesis is concerned with cross-device computing. In particular, it aims to conceptualise, prototype, and study interactions in cross-device computing. This thesis contributes to the field of Human-Computer Interaction (HCI), and more specifically to the area of cross-device computing, in three ways: first, this work conceptualises previous work through a taxonomy of cross-device computing, resulting in an in-depth understanding of the field that identifies underexplored research areas and enables the transfer of key insights into the design of interaction techniques. Second, three case studies were conducted that show how cross-device interactions can support curation work as well as augment users' existing devices for individual and collaborative work. These case studies incorporate novel interaction techniques for supporting cross-device work. Third, through studying cross-device interactions and group collaboration, this thesis provides insights into how researchers can understand and evaluate multi- and cross-device interactions for individual and collaborative work. We provide a visualization and querying tool that facilitates interaction analysis of spatial measures and video recordings to support such evaluations of cross-device work.
Overall, the work in this thesis advances the field of cross-device computing with its taxonomy guiding research directions, novel interaction techniques and case studies demonstrating cross-device interactions for curation, and insights into and tools for the effective evaluation of cross-device systems.
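
A spatial measure of the kind such a querying tool might operate on can be sketched as a query over a position log; everything here (function name, log format, and the 1.5 m coupling threshold) is illustrative, not taken from the thesis:

```python
import math

# Hypothetical spatial query over tracked positions: find the timestamps
# at which two participants were within a "closely coupled" distance,
# e.g. to jump to those moments in a synchronized video recording.
def close_moments(log, a, b, threshold=1.5):
    """log: list of (timestamp, {participant_id: (x, y)}) samples in metres.
    Returns the timestamps at which a and b were closer than `threshold`."""
    return [t for t, pos in log
            if a in pos and b in pos
            and math.dist(pos[a], pos[b]) < threshold]
```

A tool built on queries like this lets an analyst filter hours of multi-device session data down to the collaboration episodes worth reviewing on video.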

    BISi: a Blended Interaction Space

    Personalized Interaction with High-Resolution Wall Displays

    An increasing openness for more diverse interaction modalities as well as falling hardware prices have made very large interactive vertical displays more feasible, and consequently, applications in settings such as visualization, education, and meeting support have been demonstrated successfully. Their size makes wall displays inherently usable for multi-user interaction. At the same time, we can assume that access to personal data and settings, and thus personalized interaction, will still be essential in most use cases. In most current desktop and mobile user interfaces, access is regulated via an initial login and the complete user interface is then personalized to this user: access to personal data, configurations, and communications all assumes a single user per screen. In the case of multiple people using one screen, this is not a feasible solution and we must find alternatives. Therefore, this thesis addresses the research question: How can we provide personalized interfaces in the context of multi-user interaction with wall displays? The scope spans personalized interaction both close to the wall (using touch as input modality) and further away (using mobile devices).
Technical solutions that identify users at each interaction can replace logins and enable personalized interaction for multiple users at once. This thesis explores two alternative means of user identification: tracking users with RGB+depth cameras, and leveraging ultrasound positioning of the users' mobile devices. Building on this, techniques that support personalized interaction using personal mobile devices are proposed. In the first contribution on interaction, HyDAP, we examine pointing from the perspective of moving users, and in the second, SleeD, we propose using an arm-worn device to facilitate access to private data and personalized interface elements. Additionally, the work contributes insights on practical implications of personalized interaction at wall displays: we present a qualitative study that analyses interaction using the multi-user cooperative game Miners as application case, finding awareness and occlusion issues. The final contribution is the corresponding analysis toolkit GIAnT, which visualizes users' movements, touch interactions, and gaze points when interacting with wall displays and thus allows fine-grained investigation of the interactions.
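Camera-based identification of the kind described usually reduces to associating each touch event with the nearest tracked user. A minimal sketch of that assignment step (the function name, coordinate convention, and 0.8 m cutoff are illustrative assumptions, not details from the thesis):

```python
import math

# Assign a touch point on the wall to the nearest tracked user.
# Positions are (x, y) in wall-plane metres; max_dist is an
# illustrative cutoff beyond which no user is credited with the touch.
def assign_touch(touch, users, max_dist=0.8):
    """users: dict mapping user id -> (x, y) tracked position.
    Returns the id of the closest user within max_dist, or None."""
    best_id, best_d = None, max_dist
    for uid, pos in users.items():
        d = math.dist(touch, pos)
        if d < best_d:
            best_id, best_d = uid, d
    return best_id
```

With such an assignment in place, each touch can trigger a personalized response (that user's clipboard, permissions, colour coding) without any explicit login step.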

    Investigating Practices When Using an Overview Device in Collaborative Multi-Surface Trip-Planning

    The availability of mobile device ecologies enables new types of ad-hoc co-located decision-making and sensemaking practices in which people find, collect, discuss, and share information. However, little is known about what kinds of device configurations are suitable for these types of tasks. This paper contributes new insights into how people use configurations of devices for one representative example task: collaborative co-located trip-planning. We present an empirical study that explores and compares three strategies for using multiple devices: no-overview, overview on own device, and a separate overview device. The results show that the overview facilitated decision-making and sensemaking during a collaborative trip-planning task by helping groups iterate on their itinerary, organize locations and timings efficiently, and discover new insights. Groups shared and discussed more opinions, resulting in more democratic decision-making. Groups provided with a separate overview device engaged more frequently and spent more time in closely-coupled collaboration.

    Collaborative behavior, performance and engagement with visual analytics tasks using mobile devices

    Interactive visualizations are external tools that can support users' exploratory activities, and collaboration can bring benefits to the exploration of visual representations or visualizations. This research investigates the use of co-located collaborative visualizations on mobile devices: how working with two different modes of interaction and view (Shared or Non-Shared) and being placed in various position arrangements (Corner-to-Corner, Face-to-Face, and Side-by-Side) affect users' knowledge acquisition, engagement level, and learning efficiency. A user study was conducted with 60 participants divided into 6 groups (2 modes × 3 positions) using a tool that we developed to support the exploration of 3D visual structures in a collaborative manner. Our results show that the shared control and view version in the Side-by-Side position is the most favorable and can improve task efficiency. In this paper, we present the results and a set of recommendations derived from them.

    Collaborative Human-Computer Interaction with Big Wall Displays - BigWallHCI 2013 3rd JRC ECML Crisis Management Technology Workshop

    The 3rd JRC ECML Crisis Management Technology Workshop on Human-Computer Interaction with Big Wall Displays in Situation Rooms and Monitoring Centres was co-organised by the European Commission Joint Research Centre and the University of Applied Sciences St. Pölten, Austria. It took place in the European Crisis Management Laboratory (ECML) of the JRC in Ispra, Italy, from 18 to 19 April 2013. 40 participants from stakeholder groups in the EC, civil protection bodies, academia, and industry attended the workshop. The hardware of large display areas has on the one hand been mature for many years and is on the other hand changing rapidly and improving constantly. These fast-paced developments promise impressive new setups with respect to, e.g., pixel density or touch interaction. On the software side, there are two components with room for improvement: 1. the software provided by the display manufacturers to operate their video walls (source selection, windowing system, layout control), and 2. dedicated ICT systems developed for the specific needs of crisis management practitioners and monitoring centre operators. While industry is already starting to focus more on the collaborative aspects of its operating software, the customized and tailored ICT applications that are needed are still missing, unsatisfactory, or very expensive, since they often have to be developed from scratch. The main challenges identified for enhancing big wall display systems in crisis management and situation monitoring contexts include: 1. Interaction: overcoming static layouts and/or passive information consumption. 2. Participatory design and development: software needs to meet users' needs. 3. Development and/or application of Information Visualisation and Visual Analytics principles to support the transition from data to information to knowledge. 4. Information overload: proper methods for attention management, automatic interpretation, incident detection, and alarm triggering are needed to deal with the ever-growing amount of data to be analysed.