By Sébastien Carbini, Jean Emmanuel Viallet and Olivier Bernier


Abstract Among the gestures users naturally perform during communication, pointing gestures can be easily recognized and included in more natural new Human-Computer Interfaces. We approximate the eye-finger pointing direction of a user by detecting and tracking, in real time, the 3D positions of the centre of the face and of both hands; the positions are obtained from a stereoscopic device located on top of the display. From the head position and biometric constraints, we define both a rest area and an action area. In the latter area, the hands are searched for and the pointing intention is detected. The first hand the user spontaneously moves forward is defined as the pointing hand, whereas the second detected hand, when it first moves forward, is considered the selection hand. Experiments on spatial precision, carried out with a group of users, show that the minimum size of an object to be easily pointed at is about 1.5 percent of the diagonal of the large display. Young scientist paper for the “Feature tracking in a dynamic 3D reality” special session.
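The geometry described in the abstract (a ray from the head centre through the forward hand, intersected with the display plane) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the coordinate convention (display at z = 0, user at positive z), the `action_threshold` parameter standing in for the biometric rest/action split, and all names are assumptions.

```python
import numpy as np

def pointing_target(head, hand, action_threshold=0.25):
    """Estimate where the eye-finger ray hits the display plane z = 0.

    head, hand: 3D positions (metres) in a frame whose z axis is
    perpendicular to the display, display at z = 0, user at z > 0.
    Returns the (x, y) intersection, or None when the hand has not moved
    far enough in front of the head (i.e. it is still in the rest area).
    The 0.25 m threshold is an illustrative stand-in for the paper's
    biometric constraints, not a value taken from the paper.
    """
    head = np.asarray(head, dtype=float)
    hand = np.asarray(hand, dtype=float)
    # Rest/action split: hand must be sufficiently in front of the head.
    if head[2] - hand[2] < action_threshold:
        return None
    direction = hand - head
    # Parameter t where the ray head + t * direction crosses z = 0.
    t = -head[2] / direction[2]
    if t <= 0:
        return None  # ray points away from the display
    x, y, _ = head + t * direction
    return (x, y)
```

With the head at (0, 0, 1.0) and a hand at (0.1, -0.1, 0.6), the hand is 0.4 m in front of the head, so it counts as pointing and the ray lands at roughly (0.25, -0.25) on the display plane; a hand only 0.1 m forward stays in the rest area and yields None. The same forward-motion test is what would distinguish the first (pointing) hand from the second (selection) hand when each crosses into the action area.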

Topics: non-intrusive Human-Computer Interface, pointing gesture
Year: 2008
OAI identifier: oai:CiteSeerX.psu:
Provided by: CiteSeerX
Full text available at: http://perso.rd.francetelecom.... (external link)