
    Human Computer Interaction and Emerging Technologies

    The INTERACT Conferences are an important platform for researchers and practitioners in the field of human-computer interaction (HCI) to showcase their work. They are organised biennially by the International Federation for Information Processing (IFIP) Technical Committee on Human–Computer Interaction (IFIP TC13), an international committee of 30 member national societies and nine Working Groups. INTERACT is truly international in its spirit and has attracted researchers from several countries and cultures. With an emphasis on inclusiveness, it works to lower the barriers that prevent people in developing countries from participating in conferences. As a multidisciplinary field, HCI requires interaction and discussion among diverse people with different interests and backgrounds. The 17th IFIP TC13 International Conference on Human-Computer Interaction (INTERACT 2019) took place during 2-6 September 2019 in Paphos, Cyprus. The conference was held at the Coral Beach Hotel Resort, and was co-sponsored by the Cyprus University of Technology and Tallinn University, in cooperation with ACM and ACM SIGCHI. This volume contains the Adjunct Proceedings of the 17th INTERACT Conference, comprising a series of selected papers from workshops, the Student Design Consortium and the Doctoral Consortium. The volume follows the INTERACT conference tradition of submitting adjunct papers after the main publication deadline, to be published by a university press with a connection to the conference itself. In this case, both the Adjunct Proceedings Chair of the conference, Dr Usashi Chatterjee, and the lead Editor of this volume, Dr Fernando Loizides, work at Cardiff University, which is the home of Cardiff University Press.

    Accessibility and tangible interaction in distributed workspaces based on multi-touch surfaces

    Traditional interaction mechanisms in distributed digital spaces often fail to consider the intrinsic properties of action, perception, and communication among workgroups, which may affect access to the common resources used to mutually organize information. By developing suitable spatial geometries and natural interaction mechanisms, distributed spaces can become blended, where the physical and virtual boundaries of local and remote spaces merge together to provide the illusion of a single unified space. In this paper, we discuss the importance of blended interaction in distributed spaces and the particular challenges faced when designing accessible technology. We illustrate this discussion through a new tangible interaction mechanism for collaborative spaces based on tabletop system technology implemented with optical frames. Our tangible elements facilitate the exchange of digital information in distributed collaborative settings by providing a physical manifestation of common digital operations. The tangibles are designed as passive elements that do not require the use of any additional hardware or external power while maintaining a high degree of accuracy.
    This work was supported by the Spanish Ministry of Economy and Competitiveness and the European Regional Development Fund, through the ANNOTA Project (Ref. TIN2013-46036-C3-1-R).
    Salvador-Herranz, G.; Camba, J.; Contero, M.; Naya Sanchis, F. (2018). Accessibility and tangible interaction in distributed workspaces based on multi-touch surfaces. Universal Access in the Information Society, 17(2), 247-256. https://doi.org/10.1007/s10209-017-0563-7
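    The abstract does not spell out how the passive tangibles are recognised, but one plausible approach for an optical-frame tabletop is to identify each tangible by the geometric footprint of its contact points: the tangible's feet register as ordinary touches, and the pattern of pairwise distances tells the system which tangible is on the surface. The Python sketch below illustrates that idea only; the tangible names, foot spacings, and tolerance are hypothetical values, not taken from the paper.

```python
import math
from itertools import combinations

# Hypothetical registry of passive tangibles: each tangible is identified by
# the sorted side lengths (in surface units, e.g. mm) of the triangle formed
# by its three feet when they rest on the multi-touch surface.
TANGIBLE_SIGNATURES = {
    "copy_token":  (40.0, 55.0, 70.0),
    "share_token": (45.0, 45.0, 80.0),
}

def side_lengths(points):
    """Sorted pairwise distances of three contact points."""
    return tuple(sorted(math.dist(a, b) for a, b in combinations(points, 2)))

def identify_tangible(touch_points, tolerance=3.0):
    """Test every triple of current touch points against the registered
    signatures; return (name, triple) for the first match, else None."""
    for triple in combinations(touch_points, 3):
        observed = side_lengths(triple)
        for name, signature in TANGIBLE_SIGNATURES.items():
            if all(abs(o - s) <= tolerance for o, s in zip(observed, signature)):
                return name, triple
    return None

# Example: three of the four reported touches belong to the 'copy' tangible.
touches = [(100.0, 100.0), (140.0, 100.0), (96.6, 154.9), (300.0, 300.0)]
print(identify_tangible(touches))
```

    Because the tangibles are passive, this kind of footprint matching would be what keeps them free of extra hardware or power, as the abstract emphasises.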

    Recreating Daily life in Pompeii

    We propose an integrated Mixed Reality methodology for recreating ancient daily life that features realistic simulations of animated virtual human actors (clothes, body, skin, face) who augment real environments and re-enact staged storytelling dramas. We aim to go further than traditional concepts of static cultural artifacts or rigid geometrical and 2D textual augmentations and allow for 3D, interactive, augmented historical character-based event representations in a mobile and wearable setup. This is the main contribution of the described work, as well as the proposed extensions to AR enabling technologies: a VR/AR character simulation kernel framework with real-time, clothed virtual humans that are dynamically superimposed on live camera input, animated and acting based on a predefined, historically correct scenario. We demonstrate such a real-time case study on the actual site of ancient Pompeii.
    The work presented has been supported by the Swiss Federal Office for Education and Science and the EU IST programme, in the frame of the EU IST LIFEPLUS 34545 and EU ICT INTERMEDIA 38417 projects.
    Magnenat-Thalmann, N.; Papagiannakis, G. (2010). Recreating Daily life in Pompeii. Virtual Archaeology Review, 1(2), 19-23. https://doi.org/10.4995/var.2010.4679
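    The abstract highlights virtual actors that are superimposed on live camera input and act out "a predefined, historically correct scenario". As a rough illustration of that scenario-driven part only (not the authors' VHD++ framework), the Python sketch below dispatches time-stamped, scripted events to virtual actors inside a render loop; all class, actor, and animation names are hypothetical.

```python
# Hypothetical sketch of a scenario-driven animation layer: time-stamped,
# scripted events are dispatched to virtual actors inside the render loop.
class VirtualActor:
    def __init__(self, name):
        self.name = name

    def play(self, animation):
        # Placeholder for the actual skeletal/cloth animation update.
        print(f"{self.name} -> {animation}")

class Scenario:
    def __init__(self, events):
        # events: (time_in_seconds, actor_name, animation), played in order
        self._events = sorted(events)
        self._next = 0

    def update(self, t, actors):
        """Dispatch every event scheduled at or before simulation time t."""
        while self._next < len(self._events) and self._events[self._next][0] <= t:
            _, actor_name, animation = self._events[self._next]
            actors[actor_name].play(animation)
            self._next += 1

actors = {"baker": VirtualActor("baker"), "customer": VirtualActor("customer")}
scenario = Scenario([
    (0.0, "baker", "knead_dough"),
    (4.0, "customer", "enter_shop"),
    (6.5, "baker", "greet"),
])

# Simplified render loop: a real AR setup would also grab a camera frame,
# estimate the camera pose, and composite the rendered actors over the frame.
for frame in range(8):
    scenario.update(t=float(frame), actors=actors)
```

    In the actual system, each loop iteration would additionally perform camera tracking and clothed-character rendering so the actors appear anchored in the real Pompeii site.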

    Levitating Particle Displays with Interactive Voxels

    Levitating objects can be used as the primitives in a new type of display. We present levitating particle displays and show how research into object levitation is enabling a new way of presenting and interacting with information. We identify novel properties of levitating particle displays and give examples of the interaction techniques and applications they allow. We then discuss design challenges for these displays, potential solutions, and promising areas for future research.
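    The abstract does not name the levitation mechanism, but displays of this kind are commonly built around ultrasonic phased arrays, where each transducer is driven with a phase offset chosen so the emitted waves converge at the particle's position. Assuming that kind of setup, the sketch below computes per-transducer focusing phases for a target point; moving the levitated "voxel" then amounts to recomputing the phases for a new target. Grid size, pitch, and frequency are illustrative values, and a real trap would additionally need a trapping signature on top of the focus.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air
FREQUENCY = 40_000.0     # 40 kHz transducers, a common choice (assumption)
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def transducer_grid(n=8, pitch=0.01, z=0.0):
    """Positions (metres) of an n x n transducer board lying in the plane z."""
    offset = (n - 1) * pitch / 2
    return [(i * pitch - offset, j * pitch - offset, z)
            for i in range(n) for j in range(n)]

def focus_phases(transducers, target):
    """Per-transducer phase (radians) so all emissions arrive in phase at
    `target`, creating an acoustic focus that can hold a small particle."""
    k = 2 * math.pi / WAVELENGTH
    return [(-k * math.dist(p, target)) % (2 * math.pi) for p in transducers]

# Moving the levitated 'voxel' amounts to recomputing phases for a new target.
board = transducer_grid()
phases = focus_phases(board, target=(0.0, 0.0, 0.08))   # 8 cm above the board
print(len(phases), "phases, first few:", [round(p, 2) for p in phases[:4]])
```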

    Ambient Gestures

    We present Ambient Gestures, a novel gesture-based system designed to support ubiquitous ‘in the environment’ interactions with everyday computing technology. Hand gestures and audio feedback allow users to control computer applications without reliance on a graphical user interface, and without having to switch from the context of a non-computer task to the context of the computer. The Ambient Gestures system is composed of a vision recognition software application, a set of gestures to be processed by a scripting application, and a navigation and selection application that is controlled by the gestures. This system allows us to explore gestures as the primary means of interaction within a multimodal, multimedia environment. In this paper we describe the Ambient Gestures system, define the gestures and the interactions that can be achieved in this environment, and present a formative study of the system. We conclude with a discussion of our findings and future applications of Ambient Gestures in ubiquitous computing.
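    The pipeline described above (vision recogniser, scripting layer, navigation and selection application, with audio feedback) can be pictured as a gesture-to-command dispatch table sitting between the recogniser and the controlled application. The sketch below is a minimal illustration of that middle scripting layer only; the gesture names, bindings, and MediaBrowser application are hypothetical stand-ins, not the actual Ambient Gestures components.

```python
# Hypothetical gesture-to-command bindings (the real gesture set is defined
# in the paper; these names are illustrative only).
GESTURE_BINDINGS = {
    "swipe_left":  "previous_item",
    "swipe_right": "next_item",
    "push":        "select_item",
    "circle":      "cancel",
}

def play_audio_feedback(command):
    # Placeholder: a real system would play an earcon or a speech cue here.
    print(f"[audio] {command}")

class MediaBrowser:
    """Stand-in for the navigation and selection application."""
    def previous_item(self): print("browser: previous item")
    def next_item(self):     print("browser: next item")
    def select_item(self):   print("browser: item selected")
    def cancel(self):        print("browser: cancelled")

def dispatch(gesture, application):
    """Translate a recognised gesture into an application command."""
    command = GESTURE_BINDINGS.get(gesture)
    if command is None:
        return  # unknown gesture: ignore rather than interrupt the user
    getattr(application, command)()
    play_audio_feedback(command)

app = MediaBrowser()
for gesture in ["swipe_right", "swipe_right", "push", "wave"]:  # 'wave' unbound
    dispatch(gesture, app)
```

    Routing everything through a scripting layer like this keeps the gesture vocabulary and the controlled application decoupled, which matches the modular architecture the abstract describes.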

    Applying a User-centred Approach to Interactive Visualization Design

    Analysing users in their context of work and finding out how and why they use different information resources is essential to provide interactive visualisation systems that match their goals and needs. Designers should actively involve the intended users throughout the whole process. This chapter presents a user-centred approach for the design of interactive visualisation systems. We describe three phases of the iterative visualisation design process: the early envisioning phase, the global specification phase, and the detailed specification phase. The whole design cycle is repeated until some criterion of success is reached. We discuss different techniques for the analysis of users, their tasks and domain. Subsequently, the design of prototypes and evaluation methods in visualisation practice are presented. Finally, we discuss the practical challenges in design and evaluation of collaborative visualisation environments. Our own case studies and those of others are used throughout the whole chapter to illustrate various approaches.
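    The iterative cycle described above (envisioning, global specification, detailed specification, repeated until a success criterion is reached) can be summarised as a simple loop. The sketch below only illustrates that process structure; the evaluation callback and iteration cap are hypothetical, not part of the chapter.

```python
# Illustrative only: the three phases come from the abstract; the evaluation
# callback and iteration cap are hypothetical.
PHASES = ["early envisioning", "global specification", "detailed specification"]

def design_cycle(evaluate, max_iterations=5):
    """Repeat the three design phases until the success criterion is met."""
    for iteration in range(1, max_iterations + 1):
        for phase in PHASES:
            print(f"iteration {iteration}: {phase} (involve users, refine prototypes)")
        if evaluate(iteration):
            print(f"success criterion met after iteration {iteration}")
            return True
    return False

# Toy criterion: pretend the user evaluation succeeds on the third iteration.
design_cycle(lambda iteration: iteration >= 3)
```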