The Evolution of First Person Vision Methods: A Survey
The emergence of new wearable technologies such as action cameras and
smart-glasses has increased the interest of computer vision scientists in the
First Person perspective. Nowadays, this field is attracting attention and
investments of companies aiming to develop commercial devices with First Person
Vision recording capabilities. Due to this interest, an increasing demand for
methods to process these videos, possibly in real time, is expected. Current
approaches present particular combinations of image features and
quantitative methods to accomplish specific objectives such as object detection,
activity recognition, and user–machine interaction. This paper summarizes
the evolution of the state of the art in First Person Vision video analysis
between 1997 and 2014, highlighting, among others, most commonly used features,
methods, challenges and opportunities within the field.
Comment: First Person Vision, Egocentric Vision, Wearable Devices, Smart Glasses, Computer Vision, Video Analytics, Human-machine Interaction
Binocular Eye Movements Are Adapted to the Natural Environment.
Humans and many animals make frequent saccades requiring coordinated movements of the eyes. When landing on the new fixation point, the eyes must converge accurately or double images will be perceived. We asked whether the visual system uses statistical regularities in the natural environment to aid eye alignment at the end of saccades. We measured the distribution of naturally occurring disparities in different parts of the visual field. The central tendency of the distributions was crossed (nearer than fixation) in the lower field and uncrossed (farther) in the upper field in male and female participants. It was uncrossed in the left and right fields. We also measured horizontal vergence after completion of vertical, horizontal, and oblique saccades. When the eyes first landed near the eccentric target, vergence was quite consistent with the natural-disparity distribution. For example, when making an upward saccade, the eyes diverged to be aligned with the most probable uncrossed disparity in that part of the visual field. Likewise, when making a downward saccade, the eyes converged to enable alignment with crossed disparity in that part of the field. Our results show that rapid binocular eye movements are adapted to the statistics of the 3D environment, minimizing the need for large corrective vergence movements at the end of saccades. The results are relevant to the debate about whether eye movements are derived from separate saccadic and vergence neural commands that control both eyes or from separate monocular commands that control the eyes independently. SIGNIFICANCE STATEMENT We show that the human visual system incorporates statistical regularities in the visual environment to enable efficient binocular eye movements. We define the oculomotor horopter: the surface of 3D positions to which the eyes initially move when stimulated by eccentric targets.
The observed movements maximize the probability of accurate fixation as the eyes move from one position to another. This is the first study to show quantitatively that binocular eye movements conform to 3D scene statistics, thereby enabling efficient processing. The results provide greater insight into the neural mechanisms underlying the planning and execution of saccadic eye movements.
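The crossed/uncrossed distinction in the abstract follows standard binocular geometry: a point nearer than fixation carries crossed disparity, a point farther carries uncrossed disparity. A minimal sketch of that relationship, using the textbook small-angle approximation (the function name, the example interocular distance, and the sign convention of crossed-positive are illustrative assumptions, not values from the paper):

```python
import math

def horizontal_disparity_deg(ipd_m, fixation_dist_m, point_dist_m):
    """Approximate horizontal disparity (degrees of visual angle) of a point
    on the midline, via the small-angle formula:
        delta ~ IPD * (1/d_point - 1/d_fixation)
    Sign convention (assumed here): positive = crossed disparity, i.e. the
    point is nearer than fixation; negative = uncrossed (farther)."""
    delta_rad = ipd_m * (1.0 / point_dist_m - 1.0 / fixation_dist_m)
    return math.degrees(delta_rad)

# Example: 6.3 cm interocular distance, fixation at 1 m.
near = horizontal_disparity_deg(0.063, 1.0, 0.8)  # nearer point -> crossed (> 0)
far = horizontal_disparity_deg(0.063, 1.0, 1.5)   # farther point -> uncrossed (< 0)
```

Under this convention, the paper's finding reads as: after upward saccades the eyes settle toward negative (uncrossed) disparities, and after downward saccades toward positive (crossed) ones.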
Eye tracking in virtual reality: Vive pro eye spatial accuracy, precision, and calibration reliability
A growing number of virtual reality devices now include eye tracking technology, which can facilitate oculomotor and cognitive research in VR and enable use cases like foveated rendering. These applications require different tracking performance, often measured as spatial accuracy and precision. While manufacturers report data quality estimates for their devices, these typically represent ideal performance and may not reflect real-world data quality. Additionally, it is unclear how accuracy and precision change across sessions within the same participant or between devices, and how performance is influenced by vision correction. Here, we measured spatial accuracy and precision of the Vive Pro Eye built-in eye tracker across a range of 30 visual degrees horizontally and vertically. Participants completed ten measurement sessions over multiple days, allowing us to evaluate calibration reliability. Accuracy and precision were highest for central gaze and decreased with greater eccentricity in both axes. Calibration was successful in all participants, including those wearing contacts or glasses, but glasses yielded significantly lower performance. We further found differences in accuracy (but not precision) between two Vive Pro Eye headsets, and estimated participants’ inter-pupillary distance. Our metrics suggest high calibration reliability and can serve as a baseline for expected eye tracking performance in VR experiments.
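The accuracy and precision measures named in this abstract are conventionally defined as the mean angular offset from the target (accuracy) and the sample-to-sample dispersion of gaze (precision, often as RMS). A minimal sketch of these two common definitions, with hypothetical sample data; this is not the paper's analysis code, and the exact estimators used in the study may differ:

```python
import math

def accuracy_deg(gaze_points, target):
    """Spatial accuracy: mean angular offset between gaze samples and the
    known target position (both in degrees of visual angle)."""
    offsets = [math.hypot(x - target[0], y - target[1]) for x, y in gaze_points]
    return sum(offsets) / len(offsets)

def precision_rms_deg(gaze_points):
    """Spatial precision: RMS of angular distances between successive
    samples; measures dispersion, independent of the target location."""
    diffs = [math.hypot(x2 - x1, y2 - y1)
             for (x1, y1), (x2, y2) in zip(gaze_points, gaze_points[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical gaze samples (deg) recorded while fixating a target at (0, 0).
samples = [(0.4, 0.1), (0.5, 0.0), (0.45, -0.05), (0.5, 0.1)]
acc = accuracy_deg(samples, (0.0, 0.0))     # mean offset from target
prec = precision_rms_deg(samples)           # sample-to-sample RMS
```

Separating the two metrics matters for the findings reported above: a systematic calibration shift (e.g. from wearing glasses) degrades accuracy while leaving precision, a noise measure, largely unchanged.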