
    Technology for Bonding in Human-Animal Interaction

    This workshop focuses on the use and influence of technology in human-animal bonding, and on how such bonding can be facilitated with technology. We explore the elements and characteristics of human-animal bonding, and how technology is connected to emotions and bonding between the human and the animal. We are particularly interested in the animal's experiences, emotions, and welfare in bonding. The workshop facilitates discussion, creates a framework to support design activities, identifies future research themes, and generates ideas for facilitating mutual bonding in human-animal interaction. The main focus is on dogs, but the workshop aims to pave the way for further investigation and research with other domestic animals, such as cats, horses, and rabbits.

    Haptic feedback in eye typing

    Proper feedback is essential in gaze-based interfaces, where the same modality is used for both perception and control. We measured how vibrotactile feedback, a form of haptic feedback, compares with the commonly used visual and auditory feedback in eye typing. Haptic feedback was found to produce results close to those of auditory feedback; both were easy to perceive, and participants liked both the auditory “click” and the tactile “tap” of the selected key. Implementation details (such as the placement of the haptic actuator) were also found to be important.
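
    As an illustrative aside (not the authors' implementation), the sketch below shows how a dwell-based eye-typing keyboard might trigger a haptic "tap" on key selection; the dwell time, class names, and haptic callback are assumptions.

        # Minimal sketch of dwell-based key selection with a pluggable feedback
        # channel (visual, auditory, or haptic). All names here are hypothetical.
        DWELL_TIME = 0.4  # seconds of sustained gaze needed to select a key (assumed)

        class DwellSelector:
            def __init__(self, on_select):
                self.on_select = on_select   # feedback callback fired on selection
                self.current_key = None
                self.dwell_start = None

            def update(self, gazed_key, now):
                """Call once per gaze sample with the key under the gaze point."""
                if gazed_key != self.current_key:
                    self.current_key = gazed_key   # gaze moved: restart the dwell timer
                    self.dwell_start = now
                    return None
                if gazed_key is not None and now - self.dwell_start >= DWELL_TIME:
                    self.dwell_start = now         # re-arm so the key can repeat
                    self.on_select(gazed_key)      # e.g. drive a vibrotactile "tap"
                    return gazed_key
                return None

        def haptic_tap(key):
            # Placeholder: pulse a vibrotactile actuator for ~50 ms here.
            print(f"tap for {key!r}")

        selector = DwellSelector(on_select=haptic_tap)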

    Symbiotic attention management in the context of internet of things

    In this position paper, we stress the need to consider the nature of human attention when designing future, potentially interruptive IoT devices. We propose letting IoT devices share attention-related data and collaborate on the task of drawing human attention, in order to achieve higher-quality attention management with fewer overall system resources. Finally, we categorize some existing strategies for drawing people's attention according to a simple symbiotic (human-machine) attention management framework.
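
    A minimal sketch of the proposed collaboration, with invented details: devices submit attention requests to a shared coordinator, which releases at most one interruption at a time, and only when the user is available.

        # Hypothetical coordinator arbitrating attention requests across IoT devices.
        from dataclasses import dataclass, field
        import heapq

        @dataclass(order=True)
        class AttentionRequest:
            priority: int                    # lower value = more urgent
            device: str = field(compare=False)
            message: str = field(compare=False)

        class AttentionCoordinator:
            def __init__(self):
                self.queue = []              # min-heap ordered by priority

            def request(self, device, message, priority):
                heapq.heappush(self.queue, AttentionRequest(priority, device, message))

            def dispatch(self, user_interruptible):
                """Release at most one interruption, and only if the user is available."""
                if user_interruptible and self.queue:
                    req = heapq.heappop(self.queue)
                    return f"{req.device}: {req.message}"
                return None                  # hold alerts back; avoid simultaneous interruptions

        coordinator = AttentionCoordinator()
        coordinator.request("doorbell", "visitor at the door", priority=0)
        coordinator.request("dishwasher", "cycle finished", priority=5)
        print(coordinator.dispatch(user_interruptible=True))   # doorbell goes first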

    The Interplay Between Affect, Dog's Physical Activity and Dog-Owner Relationship

    Leaving a dog home alone is part of everyday life for most dog owners. Previous research shows that the dog-owner relationship has multifarious effects on dog behavior. However, little is known about the interplay between the dog-owner relationship, the physical activity of the dog, and affective experiences at the moments of separation (the owner leaving home) and reunion (the owner coming home). In this paper, we explored how the dog's general physical activity (daily, while home alone, and over the 2-week study period) and the owner's perceptions of the dog's affective state were correlated at those particular moments. Nineteen volunteer dog owners had their dogs (N = 19) wear two activity trackers (ActiGraph wGT2X-GT and FitBark2) for 2 weeks, 24 h/day. Prior to the 2-week continuous physical activity measurement period, the owners filled in questionnaires about the dog-owner relationship and the dog's behavior. In daily questionnaires, owners described and assessed their own emotion-related experiences, their perception of their dog's emotion-related experiences, and the dog's behavior at the moments of separation and reunion. The results indicated that the dog-owner relationship has an interplay with the mean daily and weekly physical activity levels of the dog. An indication of a strong emotional dog-owner relationship (especially related to the attentiveness of the dog, continuous companionship, and time spent together when relaxing) correlated positively with the mean daily activity levels of the dog during the first measurement week of the study. Results also suggest that the dog's mean physical activity, both daily and over the 2-week measurement period, correlated with the affective experiences of the dog and the owner, as reported by the owner, when the dog was left home alone. More research is needed to understand the interplay between affect, the physical activity of the dog, and the dog-owner relationship, and the effects of these factors on, and their interplay with, the welfare of dogs.
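
    For readers curious about the kind of analysis involved, the toy sketch below computes a rank correlation between a dog's daily activity and owner-reported affect ratings; the data and variable names are invented, not taken from the study.

        # Hypothetical correlation analysis in the spirit of the study.
        import numpy as np
        from scipy.stats import spearmanr

        # One value per study day for a single dog (toy data).
        daily_activity = np.array([312, 295, 340, 280, 330, 310, 305])  # tracker activity counts
        owner_affect = np.array([4, 3, 5, 2, 4, 4, 3])                  # owner's 1-5 affect rating

        rho, p = spearmanr(daily_activity, owner_affect)
        print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")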

    ReType: Quick Text Editing with Keyboard and Gaze

    When a user needs to reposition the cursor during text editing, this is often done using the mouse. For experienced typists especially, the switch between keyboard and mouse can slow down the keyboard editing workflow considerably. To address this, we propose ReType, a new gaze-assisted positioning technique that combines keyboard and gaze input based on a new ‘patching’ metaphor. ReType allows users to perform some common editing operations while keeping their hands on the keyboard. We present the results of two studies. A free-use study indicated that ReType enhances the user experience of text editing; it was liked by many participants, regardless of their typing skills. A comparative user study showed that ReType is able to match or even beat the speed of mouse-based interaction for small text edits. We conclude that the gaze-augmented user interface can make common interactions more fluent, especially for professional keyboard users.
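
    As a conceptual sketch only (the paper's ‘patching’ technique is more involved), gaze-assisted caret repositioning can be reduced to: on a dedicated key press, move the caret to the character nearest the current gaze point. All names below are hypothetical.

        # Jump the text caret to the character closest to the gaze point.
        def nearest_char_index(gaze_xy, char_positions):
            """char_positions: list of (x, y) screen coordinates, one per character."""
            gx, gy = gaze_xy
            return min(range(len(char_positions)),
                       key=lambda i: (char_positions[i][0] - gx) ** 2 +
                                     (char_positions[i][1] - gy) ** 2)

        def on_retype_key(gaze_xy, char_positions, set_caret):
            # Hands stay on the keyboard; gaze supplies the target position.
            set_caret(nearest_char_index(gaze_xy, char_positions))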

    Dog behaviour classification with movement sensors placed on the harness and the collar

    Dog owners' understanding of the daily behaviour of their dogs may be enhanced by movement measurements that can detect repeatable dog behaviour, such as levels of daily activity and rest, as well as changes in them. The aim of this study was to evaluate the performance of supervised machine learning methods, utilising accelerometer and gyroscope data provided by wearable movement sensors, in the classification of seven typical dog activities in a semi-controlled test situation. Forty-five middle- to large-sized dogs participated in the study. Two sensor devices were attached to each dog: one on the back in a harness and one on the neck collar. Altogether, 54 features were extracted from the acceleration and gyroscope signals, divided into two-second segments. The performance of four classifiers was compared using features derived from both sensor modalities and from the acceleration data only. The results were promising: the movement sensor at the back yielded up to 91% accuracy in classifying the dog activities, and the sensor placed at the collar yielded 75% accuracy at best. Including the gyroscope features improved the classification accuracy by 0.7-2.6%, depending on the classifier and the sensor location. The most distinct activity was sniffing, whereas the static postures (lying on chest, sitting, and standing) were the most challenging behaviours to classify, especially from the data of the neck collar sensor. The data used in this article, as well as the signal processing scripts, are openly available in Mendeley Data: https://doi.org/10.17632/vxhx934tbn.1.
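
    The sketch below illustrates the general shape of such a pipeline: statistical features over two-second windows of accelerometer and gyroscope data feeding a supervised classifier. The sampling rate, feature set, and classifier choice are assumptions, not the paper's exact configuration.

        # Windowed feature extraction for activity classification (illustrative).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        FS = 100           # assumed sampling rate in Hz
        WIN = 2 * FS       # two-second segments, as in the paper

        def window_features(segment):
            """segment: (WIN, 6) array of 3-axis acceleration + 3-axis gyroscope."""
            feats = []
            for axis in segment.T:
                feats += [axis.mean(), axis.std(), axis.min(), axis.max()]
            return feats   # 24 simple features here (the paper used 54)

        def extract(signals):
            """signals: (n_samples, 6) array; returns one feature row per full window."""
            n = len(signals) // WIN
            return np.array([window_features(signals[i*WIN:(i+1)*WIN]) for i in range(n)])

        # With labelled windows, train any off-the-shelf classifier, e.g.:
        # clf = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)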

    Evaluation of Dry Electrodes in Canine Heart Rate Monitoring

    The functionality of three dry electrocardiogram electrode constructions was evaluated by measuring canine heart rate during four different behaviors: standing, sitting, lying, and walking. The testing was repeated (n = 9) in each of the 36 scenarios with three dogs. Two of the electrodes were constructed with spring-loaded test pins, while the third was a molded polymer electrode with Ag/AgCl coating. During the measurements, a specifically designed harness was used to attach the electrodes to the dogs. The performance of the electrodes was evaluated and compared in terms of heartbeat detection coverage: the coverage was computed from the measured electrocardiogram signal using a pattern-matching algorithm to extract the R-peaks and, from these, the beat-to-beat heart rate. The results show that the overall coverage ratios of the electrodes varied between 45% and 95% across the four activity modes. The lowest coverage was observed for lying and walking, and the highest for standing and sitting.
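
    To make the coverage notion concrete, here is a rough sketch assuming an R-peak detector based on simple peak finding; the sampling rate, detector parameters, and the way expected beats are estimated are all assumptions rather than the paper's algorithm.

        # Estimate heartbeat detection coverage from an ECG trace (illustrative).
        import numpy as np
        from scipy.signal import find_peaks

        FS = 250  # assumed ECG sampling rate in Hz

        def beat_coverage(ecg, expected_hr_bpm):
            """Fraction of expected beats actually recovered from the signal."""
            # R-peaks must stand out and be at least ~0.35 s apart (< ~170 bpm).
            peaks, _ = find_peaks(ecg, distance=int(0.35 * FS), prominence=np.std(ecg))
            duration_min = len(ecg) / FS / 60
            expected = expected_hr_bpm * duration_min
            return min(len(peaks) / expected, 1.0) if expected else 0.0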

    Outline Pursuits: Gaze-assisted Selection of Occluded Objects in Virtual Reality

    In 3D environments, objects can be difficult to select when they overlap, as overlap reduces the available target area and increases selection ambiguity. We introduce Outline Pursuits, which extends a primary pointing modality for gaze-assisted selection of occluded objects. Candidate targets within a pointing cone are presented with an outline that is traversed by a moving stimulus. This affords completion of the selection by gaze attention to the intended target's outline motion, detected by matching the user's smooth pursuit eye movement. We demonstrate two techniques implemented based on this concept: one with a controller as the primary pointer, and one in which Outline Pursuits are combined with head pointing for hands-free selection. Compared with conventional raycasting, the techniques require less movement for selection, as users do not need to reposition themselves for a better line of sight, and selection time and accuracy are less affected when targets become highly occluded.
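
    A minimal sketch of the matching step, with an assumed scoring scheme and threshold: the recent gaze trajectory is correlated against each candidate's outline-stimulus trajectory, and the best-matching target above the threshold is selected.

        # Smooth-pursuit matching by trajectory correlation (illustrative).
        import numpy as np

        def pursuit_match(gaze_xy, stimuli_xy, threshold=0.8):
            """gaze_xy: (n, 2) recent gaze samples.
            stimuli_xy: dict mapping each candidate target to its (n, 2) stimulus path.
            Returns the target whose outline motion best matches the eye movement."""
            best, best_score = None, threshold
            for target, path in stimuli_xy.items():
                rx = np.corrcoef(gaze_xy[:, 0], path[:, 0])[0, 1]  # horizontal correlation
                ry = np.corrcoef(gaze_xy[:, 1], path[:, 1])[0, 1]  # vertical correlation
                score = (rx + ry) / 2
                if score > best_score:
                    best, best_score = target, score
            return best   # None if no candidate exceeds the threshold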

    Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection

    Eye gaze involves the coordination of eye and head movement to acquire gaze targets, but existing approaches to gaze pointing are based on eye tracking in abstraction from head motion. We propose to leverage the synergetic movement of eye and head, and identify design principles for Eye&Head gaze interaction. We introduce three novel techniques that build on the distinction between head-supported and eyes-only gaze to enable dynamic coupling of gaze and pointer, hover interaction, visual exploration around pre-selections, and iterative and fast confirmation of targets. We demonstrate Eye&Head interaction in virtual reality applications, and evaluate our techniques against baselines in pointing and confirmation studies. Our results show that Eye&Head techniques enable novel gaze behaviours that give users more control and flexibility in fast gaze pointing and selection.
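
    The head-supported versus eyes-only distinction can be sketched as a simple classification of gaze shifts by concurrent head motion; the velocity threshold below is an assumption for illustration, not a value from the paper.

        # Classify a gaze shift by whether the head participates in it (illustrative).
        HEAD_VELOCITY_THRESHOLD = 10.0  # deg/s, assumed

        def classify_gaze_shift(head_angular_velocity):
            if abs(head_angular_velocity) >= HEAD_VELOCITY_THRESHOLD:
                return "head-supported"   # e.g. couple the pointer to gaze for coarse moves
            return "eyes-only"            # e.g. keep the pointer stable; treat as hover/refine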

    The Role of Eye Gaze in Security and Privacy Applications: Survey and Future HCI Research Directions

    For the past 20 years, researchers have investigated the use of eye tracking in security applications. We present a holistic view of gaze-based security applications. In particular, we canvass the literature and classify the utility of gaze in security applications into a) authentication, b) privacy protection, and c) gaze monitoring during security-critical tasks. This allows us to chart several research directions, most importantly: 1) conducting field studies of implicit and explicit gaze-based authentication, enabled by recent advances in eye tracking; 2) research on gaze-based privacy protection and gaze monitoring in security-critical tasks, which are under-investigated yet very promising areas; and 3) understanding the privacy implications of pervasive eye tracking. We discuss the most promising opportunities and most pressing challenges of eye tracking for security that will shape research in gaze-based security applications for the next decade.