Mobility is the Message: Experiments with Mobile Media Sharing
This thesis explores new mobile media sharing applications by building, deploying, and studying their use. While we share media in many different ways, both on the web and on mobile phones, there are few ways of sharing media with people physically near us. Three designed and built systems were studied: Push!Music, Columbus, and Portrait Catalog, along with a fourth, commercially available system, Foursquare. This thesis offers four contributions. First, it explores the design space of co-present media sharing through four test systems. Second, through user studies of these systems, it reports on how they come to be used. Third, it explores new ways of conducting trials as the technical mobile landscape has changed. Last, it looks at how the technical solutions demonstrate lines of thinking different from how similar solutions might look today.
Through a Human-Computer Interaction methodology of design, build, and study, we look at the systems through the lens of embodied interaction and examine how they come to be in use. Using Goffman's understanding of social order, we see how these mobile media sharing systems allow people to actively present themselves through media. In turn, using McLuhan's way of understanding media, we reflect on how these new systems enable a type of medium distinct from web-centric media, and how this relates directly to mobility.
While media sharing takes place everywhere in Western society, it is still tied to the way media is shared through computers. Although these devices are often mobile, current sharing practices do not take the mobile setting into account. The systems in this thesis treat mobility as an opportunity for design. It remains to be seen how mobile media sharing will present itself in people's everyday life, and when it does, how we will come to understand it and how it will transform society as a medium distinct from those before it. This thesis gives a glimpse of what that future may look like.
GazeTouchPass: Multimodal Authentication Using Gaze and Touch on Mobile Devices
We propose a multimodal scheme, GazeTouchPass, that combines gaze and touch for shoulder-surfing resistant user authentication on mobile devices. GazeTouchPass allows passwords with multiple switches between input modalities during authentication. This requires attackers to simultaneously observe the device screen and the user's eyes to find the password. We evaluate the security and usability of GazeTouchPass in two user studies. Our findings show that GazeTouchPass is usable and significantly more secure than single-modal authentication against basic and even advanced shoulder-surfing attacks.
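The intuition behind modality switching can be illustrated with a back-of-the-envelope keyspace calculation. The sketch below is a hypothetical illustration, not the paper's analysis: the alphabet sizes (10 touch digits, 4 gaze gestures) and the example password are assumptions. It counts how many candidate passwords remain for an attacker who observed only one input modality.

```python
# Hypothetical alphabets (assumptions for illustration, not from the paper):
# touch symbols are the digits 0-9, gaze symbols are 4 directions.
TOUCH = [str(d) for d in range(10)]
GAZE = ["L", "R", "U", "D"]
LENGTH = 4

def candidates_for_shoulder_surfer(password, saw_touch=True):
    """Count passwords consistent with observing only one modality.

    A touch-only observer learns each touch symbol and its position,
    but only that the remaining positions were *some* gaze gesture
    (and vice versa for an eyes-only observer).
    """
    n = 1
    for sym in password:
        if (sym in TOUCH) == saw_touch:
            n *= 1  # this symbol was observed exactly
        else:
            n *= len(GAZE) if saw_touch else len(TOUCH)
    return n

# An example password with modality switches: gaze, touch, touch, gaze.
pw = ["L", "7", "3", "U"]
print((len(TOUCH) + len(GAZE)) ** LENGTH)                   # full keyspace: 38416
print(candidates_for_shoulder_surfer(pw, saw_touch=True))   # touch-only observer: 16
print(candidates_for_shoulder_surfer(pw, saw_touch=False))  # gaze-only observer: 100
```

Under these toy assumptions, either single-modality observer is left guessing among many candidates, which matches the abstract's claim that an attacker must watch the screen and the user's eyes simultaneously.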
Resonating Experiences of Self and Others enabled by a Tangible Somaesthetic Design
Digitalization is penetrating every aspect of everyday life, including the beating of the human heart, which can easily be sensed by wearable sensors and displayed for others to see, feel, and potentially "bodily resonate" with. Previous work studying human interactions and interaction designs with physiological data, such as heart rate, has argued that feeding the data back to users may, for example, support their mindfulness and self-awareness during everyday activities and ultimately support their wellbeing. Inspired by Somaesthetics, a discipline that focuses on an appreciation of the living body's role in all our experiences, we designed and explored mobile tangible heartbeat displays that enable rich forms of bodily experiencing oneself and others in social proximity. In this paper, we first report on the design process of the tangible heart displays and then present the results of a field study with 30 pairs of participants. Participants were asked to use the tangible heart displays while watching movies together and to report their experience in three conditions: feeling their own heartbeat, feeling their partner's heartbeat, and watching a movie without a heart display. We found, for example, that participants reported significant effects on sensory immersion when they felt their own heartbeats compared to the condition without any heartbeat display, and that feeling their partner's heartbeats resulted in significant effects on social experience. We refer to resonance theory to discuss the results, highlighting the potential for ubiquitous technology to utilize physiological data to provide resonance in a modern society facing social acceleration.
EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays
While gaze holds a lot of promise for hands-free interaction with public displays, remote eye trackers with their confined tracking box restrict users to a single stationary position in front of the display. We present EyeScout, an active eye tracking system that combines an eye tracker mounted on a rail system with a computational method to automatically detect and align the tracker with the user's lateral movement. EyeScout addresses key limitations of current gaze-enabled large public displays by offering two novel gaze-interaction modes for a single user: in "Walk then Interact" the user can walk up to an arbitrary position in front of the display and interact, while in "Walk and Interact" the user can interact even while on the move. We report on a user study showing that EyeScout is well perceived by users, extends a public display's sweet spot into a sweet line, and reduces gaze interaction kick-off time to 3.5 seconds, a 62% improvement over state-of-the-art solutions. We discuss sample applications that demonstrate how EyeScout can enable position- and movement-independent gaze interaction with large public displays.
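Reading the 62% figure as a relative reduction (our interpretation, not stated in the abstract), the reported 3.5-second kick-off time implies a baseline of roughly 9.2 seconds for prior systems:

```python
# Assumption: "62% improvement" means a 62% relative reduction in kick-off time.
eyescout_kickoff_s = 3.5
improvement = 0.62
baseline_s = eyescout_kickoff_s / (1 - improvement)
print(round(baseline_s, 1))  # roughly 9.2 seconds for prior systems
```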
Anticipatory Mobile Computing: A Survey of the State of the Art and Research Challenges
Today's mobile phones are far from the mere communication devices they were ten years ago. Equipped with sophisticated sensors and advanced computing hardware, phones can be used to infer users' location, activity, social setting, and more. As devices become increasingly intelligent, their capabilities evolve beyond inferring context to predicting it, and then to reasoning and acting upon the predicted context. This article provides an overview of the current state of the art in mobile sensing and context prediction, paving the way for full-fledged anticipatory mobile computing. We present a survey of the phenomena that mobile phones can infer and predict, and offer a description of the machine learning techniques used for such predictions. We then discuss proactive decision making and decision delivery via the user-device feedback loop. Finally, we discuss the challenges and opportunities of anticipatory mobile computing.
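As a toy illustration of the context-inference step such surveys cover, the sketch below classifies a user's activity from accelerometer-style features with a 1-nearest-neighbour rule. The feature values, labels, and thresholds are invented for illustration and are not taken from the survey.

```python
import math

# Invented training examples (illustrative only):
# (mean acceleration magnitude in m/s^2, variance) -> activity label.
TRAINING = [
    ((9.8, 0.1), "still"),
    ((10.5, 2.0), "walking"),
    ((12.0, 8.0), "running"),
]

def predict_activity(sample):
    """1-nearest-neighbour context inference over toy sensor features."""
    nearest = min(TRAINING, key=lambda t: math.dist(t[0], sample))
    return nearest[1]

print(predict_activity((9.9, 0.2)))   # close to the "still" prototype
print(predict_activity((11.8, 7.5)))  # close to the "running" prototype
```

A real anticipatory system would replace this lookup with a trained model and feed its predictions into the proactive decision-making loop the article describes.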