    Effects of different push-to-talk solutions on driving performance

    Police officers have been using the Project54 system in their vehicles for a number of years, and they have recently started using the handheld version of Project54 outside their vehicles as well. There is a need to connect these two instances of the system into a continuous user interface. At the same time, research has shown that the location of the push-to-talk (PTT) button affects driving performance. This thesis investigates the difference between the old, fixed PTT button and a new wireless PTT glove that can be used both inside and outside the car. The thesis describes the design of the glove and the driving simulator experiment that was conducted to investigate the glove's merit. The main results show that the glove allows more freedom of operation, appears to be easier and more efficient to operate, and reduces drivers' visual distraction.

    Exploring the Influence of Light and Cognitive Load on Pupil Diameter in Driving Simulator Studies

    Pupil diameter can be used as a physiological measure of cognitive load in driving simulator studies. However, pupil size depends on both cognitive load and lighting conditions, so these two effects must be separated in order to estimate cognitive load accurately. In our study we introduce illumination-only, cognitive-only, and combined tasks. Based on these we decouple the two effects on pupil diameter and design a predictor of the pupil's reaction to light, which can be used to estimate the changes in pupil diameter that are due to cognitive load.
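
    A minimal illustrative sketch of the kind of decoupling described above, not the authors' actual predictor: it assumes the light-driven and load-driven effects combine additively and approximates the light response with a linear fit to illumination-only data. All function and variable names, and the numbers, are hypothetical.

    import numpy as np

    def fit_light_response(illuminance, pupil_diameter):
        # Simple linear predictor of pupil diameter from illuminance,
        # fitted on data from the illumination-only task.
        slope, intercept = np.polyfit(illuminance, pupil_diameter, deg=1)
        return slope, intercept

    def cognitive_component(illuminance, pupil_diameter, slope, intercept):
        # Residual after removing the predicted light response,
        # attributed here to cognitive load.
        return pupil_diameter - (slope * illuminance + intercept)

    # Synthetic example: calibrate on illumination-only data, then apply
    # the predictor to a combined (light + cognitive) task.
    calib_lux  = np.array([50.0, 100.0, 150.0, 200.0])
    calib_diam = np.array([5.0, 4.4, 3.9, 3.5])            # mm
    s, b = fit_light_response(calib_lux, calib_diam)

    task_lux   = np.array([120.0, 160.0, 80.0])
    task_diam  = np.array([4.5, 4.1, 5.1])                 # mm, measured under load
    load_delta = cognitive_component(task_lux, task_diam, s, b)  # mm due to load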

    Comparison of the Effects of Two Push-to-Talk Button Implementations on Driver Hand Position and Visual Attention

    Buttons built into the steering wheel are used in many vehicles as push-to-talk (PTT) buttons for in-car speech user interfaces. We explore the influence of such a fixed PTT button on driver hand position on the steering wheel and on visual attention while driving. We also explore these variables for a wireless PTT glove, which allows drivers to use the entire surface of the steering wheel to operate the PTT button. Participants in our driving simulator-based study were willing to take advantage of the flexibility in hand position afforded by the glove PTT button. We also found that participants cast glances toward the steering wheel significantly less often when using the PTT glove than when operating the fixed PTT button.

    Gaze Tracking for Human Robot Interaction


    Gaze Contingency in Turn-Taking for Human Robot Interaction: Advantages and Drawbacks

    Palinko O, Sciutti A, Schillingmann L, Rea F, Nagai Y, Sandini G. Gaze Contingency in Turn-Taking for Human Robot Interaction: Advantages and Drawbacks. Presented at the 24th IEEE International Symposium on Robot and Human Interactive Communication.

    A Design Space Exploration of Creative Concepts for Care Robots: Questioning the Differentiation of Social and Physical Assistance

    In an interdisciplinary project, creative concepts for care robotics were developed. To explore the design space that these open up, we discussed them along the common differentiation between physical (effective) and social-emotional assistance. Trying to rate the concepts on these dimensions frequently raised questions about the relation between the social-emotional and the physical, and highlighted gaps and a lack of conceptual clarity. Here we present our design concepts, report on our discussion, and summarize our insights; in particular, we suggest that the social and the physical dimensions of care technologies should always be thought of and designed as interrelated.

    A Robot reading human gaze: Why eye tracking is better than head tracking for human-robot collaboration

    Robots are poised to become our everyday companions in the near future. Still, many hurdles need to be cleared to achieve this goal. One of them is the fact that robots are still unable to perceive some important communication cues naturally used by humans, e.g., gaze. In the recent past, eye gaze in robot perception was substituted by its proxy, head orientation, and this approximation is still adopted in many applications today. In this paper we introduce performance improvements to an eye tracking system we previously developed and use it to explore whether this approximation is appropriate. More precisely, we compare the impact of eye-based and head-based gaze estimation in a human-robot interaction experiment with the iCub robot and naïve subjects. We find that the possibility to exploit the richer information carried by eye gaze has a significant impact on the interaction: our eye tracking system allows for more efficient human-robot collaboration than a comparable head tracking approach, according to both quantitative measures and subjective evaluation by the human participants.

    Eye gaze tracking for a humanoid robot

    Humans use eye gaze in their daily interaction with other humans. Humanoid robots, on the other hand, have not yet taken full advantage of this form of implicit communication. In this paper we present a passive monocular gaze tracking system implemented on the iCub humanoid robot. Validation showed that it is a viable low-cost, calibration-free gaze tracking solution for humanoid platforms, with a mean absolute error of about 5 degrees on horizontal angle estimates. We also demonstrate the applicability of our system to human-robot collaborative tasks, showing that the ability to read eye gaze can enable successful implicit communication between humans and the robot. Finally, we give general guidelines on how to improve our system and discuss potential applications of gaze estimation for humanoid robots.
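
    An illustrative sketch of how an accuracy figure like the one quoted above (mean absolute error of horizontal gaze-angle estimates) can be computed against known target angles. The numbers below are made up for demonstration and are not the paper's measurements.

    import numpy as np

    estimated_deg    = np.array([12.0, -3.5, 7.2, 0.4, -15.1])   # tracker output (deg)
    ground_truth_deg = np.array([10.0, -7.0, 3.0, 4.0, -11.0])   # known target angles (deg)

    # Mean absolute error over all horizontal gaze estimates
    mae = np.mean(np.abs(estimated_deg - ground_truth_deg))
    print(f"Horizontal mean absolute error: {mae:.1f} degrees")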