Creative idea exploration within the structure of a guiding framework: the card brainstorming game
I present a card brainstorming exercise that transforms a conceptual tangible interaction framework into a tool for creative dialogue, and discuss experiences from using it. Ten sessions with this card game demonstrate the framework's versatility and utility. Observation and participant feedback highlight the value of a provocative question format and of the metaphor of a card game.
Interpretation at the controller's edge: designing graphical user interfaces for the digital publication of the excavations at Gabii (Italy)
This paper discusses the authors’ approach to designing an interface for the Gabii Project’s digital volumes that attempts to fuse elements of traditional synthetic publications and site reports with rich digital datasets. Archaeology, and classical archaeology in particular, has long engaged with questions of the formation and lived experience of towns and cities. Such studies might draw on evidence of local topography, the arrangement of the built environment, and the placement of architectural details, monuments and inscriptions (e.g. Johnson and Millett 2012). Fundamental to the continued development of these studies is the growing body of evidence emerging from new excavations. Digital techniques for recording evidence “on the ground,” notably SFM (structure from motion aka close range photogrammetry) for the creation of detailed 3D models and for scene-level modeling in 3D have advanced rapidly in recent years. These parallel developments have opened the door for approaches to the study of the creation and experience of urban space driven by a combination of scene-level reconstruction models (van Roode et al. 2012, Paliou et al. 2011, Paliou 2013) explicitly combined with detailed SFM or scanning based 3D models representing stratigraphic evidence. It is essential to understand the subtle but crucial impact of the design of the user interface on the interpretation of these models. In this paper we focus on the impact of design choices for the user interface, and make connections between design choices and the broader discourse in archaeological theory surrounding the practice of the creation and consumption of archaeological knowledge. As a case in point we take the prototype interface being developed within the Gabii Project for the publication of the Tincu House. 
In discussing our own evolving practices in engagement with the archaeological record created at Gabii, we highlight some of the challenges of undertaking theoretically situated user interface design, and their implications for the publication and study of archaeological materials.
Tangible user interfaces: past, present and future directions
In the last two decades, Tangible User Interfaces (TUIs) have emerged as a new interface type that interlinks the digital and physical worlds. Drawing upon users' knowledge and skills of interaction with the real non-digital world, TUIs show a potential to enhance the way in which people interact with and leverage digital information. However, TUI research is still in its infancy, and extensive research is required in order to fully understand the implications of tangible user interfaces, to develop technologies that further bridge the digital and the physical, and to guide TUI design with empirical knowledge. This paper examines the existing body of work on Tangible User Interfaces. We start by sketching the history of tangible user interfaces, examining the intellectual origins of this field. We then present TUIs in a broader context, survey application domains, and review frameworks and taxonomies. We also discuss conceptual foundations of TUIs, including perspectives from cognitive science, psychology, and philosophy. Methods and technologies for designing, building, and evaluating TUIs are also addressed. Finally, we discuss the strengths and limitations of TUIs and chart directions for future research.
Disciplining the body? Reflections on the cross-disciplinary import of ‘embodied meaning’ into interaction design
The aim of this paper is above all to critically examine and clarify some of the negative implications that the idea of ‘embodied meaning’ has for the emergent field of interaction design research.
The term ‘embodied meaning’ was originally brought into HCI research from phenomenology and cognitive semantics in order to better understand how users' experience of new technological systems relies to an increasing extent on full-body interaction. Embodied approaches to technology design can thus be found in Winograd & Flores (1986), Dourish (2001), Lund (2003), Klemmer, Hartman & Takayama (2006), Hornecker & Buur (2006), and Hurtienne & Israel (2007), among others.
However, fertile as this cross-disciplinary import may be, design research can generally be criticised as ‘undisciplined’ because of its tendency merely to take over reductionist ideas of embodied meaning from those neighbouring disciplines without questioning the inherent limitations it thereby subscribes to.
In this paper I focus on this reductionism and what it means for interaction design research. I start out by introducing the field of interaction design and two central research questions that it raises. This will serve as a prerequisite for understanding the overall intention of bringing the notion of ‘embodied meaning’ from cognitive semantics into design research. Narrowing my account down to the concepts of ‘image schemas’ and their ‘metaphorical extension’, I then explain in more detail what is reductionist about the notion of embodied meaning. Having done so, I shed light on the consequences this reductionism might have for design research by examining a recently developed framework for intuitive user interaction along with two case examples. In so doing I sketch an alternative view of embodied meaning for interaction design research.
Keywords:
Interaction Design, Embodied Meaning, Tangible User Interaction, Design Theory, Cognitive Semiotics
Collaborative music interaction on tabletops: an HCI approach
With the advent of tabletop interaction, collaborative activities are better supported than they are on single-user PCs, because there exists a physical shareable space and interaction with digital data is more embodied and social. In sound and music computing, collaborative music making has traditionally been done over interconnected networks, but using separate computers. Musical tabletops introduce opportunities for playing in collaboration by physically sharing the same musical interface. However, few tabletop musical interfaces exploit this collaborative potential (e.g. the Reactable). We are interested in looking into how collaboration can be fully supported by means of musical tabletops for music performance, in contrast with more traditional settings. We are also looking at whether collective musical engagement can be enhanced by providing interfaces more suited to collaboration. In HCI and software development, we find an iterative process of design and evaluation, where evaluation allows us to identify key issues that can be addressed in the next design iteration of the system. Using a similar iterative approach, we plan to design and evaluate several tabletop musical interfaces. The aim is to understand which design choices can enhance and enrich collaboration and collective musical engagement on these systems. In this paper, we explain the evaluation methodologies we have undertaken in three preliminary pilot studies, and the lessons we have learned. Initial findings indicate that evaluating tabletop musical interfaces is a complex endeavour which requires an approach as close as possible to a real context, with an interdisciplinary perspective provided by interaction analysis techniques.
Physicality and Cooperative Design
CSCW researchers have increasingly come to realize that the material work setting and its population of artefacts play a crucial part in the coordination of distributed or co-located work. This paper uses the notion of physicality as a basis for understanding cooperative work. Using examples from ongoing fieldwork on cooperative design practices, it provides a conceptual understanding of physicality and shows that material settings and co-workers' working practices play an important role in understanding the physicality of cooperative design.
Meetings and Meeting Modeling in Smart Environments
In this paper we survey our research on smart meeting rooms and its relevance for augmented reality meeting support and virtual reality generation of meetings, in real time or offline. The research reported here forms part of the European 5th and 6th framework programme projects multi-modal meeting manager (M4) and augmented multi-party interaction (AMI). Both projects aim at building a smart meeting environment that is able to collect multimodal captures of the activities and discussions in a meeting room, with the aim of using this information as input to tools that allow real-time support, browsing, retrieval and summarization of meetings. Our aim is to research (semantic) representations of what takes place during meetings in order to allow generation, e.g. in virtual reality, of meeting activities (discussions, presentations, voting, etc.). Being able to do so also allows us to look at tools that provide support during a meeting, and at tools that allow those not able to be physically present during a meeting to take part in a virtual way. This may lead to situations where the differences between real meeting participants, human-controlled virtual participants and (semi-)autonomous virtual participants disappear.