A portable, low-cost eye tracking system for predicting visual attention during meal ingestion

Abstract

Obesity is a pressing public health challenge affecting both physical and mental health. Understanding the cognitive and socio-economic factors that influence eating behaviors is essential for designing effective interventions. Visual attention plays a central role in food choices, making it important to study how individuals engage with food in real-life settings. However, current methodologies often rely on static images, limiting their ecological validity and real-world applicability, or are highly intrusive. This work offers a practical solution for tracking visual attention during real-life eating scenarios. We designed a tray with an integrated camera and a calibration pattern to remotely record volunteers during meal consumption. A convolutional neural network (CNN) was employed for gaze tracking, trained on a custom-built database. Models both with and without user calibration were provided. Results indicate an accuracy error of 5 cm, which improves to 2 cm with calibration, demonstrating our system's effectiveness for measuring visual attention in food psychology experiments.

This research was supported by the Government of Navarra, Departamento de Universidad, Innovación y Transformación Digital (grant PC139-140, PORTIONS-4). EA-R also received funding from the Centre for Nutrition Research, University of Navarra, Spain. The portion-control plate was kindly provided by Precise Portions LLC, Virginia, USA. Additional funding was provided by the Public University of Navarra and the Spanish Ministry of Science and Innovation under the project Challenges of Eye Tracking Off-the-Shelf (ChETOS) (reference PID2020-118014RB-I00).
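The abstract reports that per-user calibration reduces the gaze-estimation error from 5 cm to 2 cm. One common way such a correction works, sketched below as an illustrative assumption rather than the paper's actual implementation, is to show the user a few known points on the tray's calibration pattern, collect the CNN's raw gaze predictions at those points, and fit a 2-D affine transform by least squares that maps raw predictions onto true tray coordinates. The function names and the affine model are hypothetical.

```python
def fit_affine(raw_pts, target_pts):
    """Least-squares 2-D affine fit: target ~ A.raw + b (6 parameters).
    Illustrative calibration sketch; not the paper's published method.
    Each output coordinate is fit independently via the normal equations."""
    # Design matrix rows: [x, y, 1]
    X = [[x, y, 1.0] for x, y in raw_pts]

    def solve_normal(ys):
        # Build the 3x3 normal system (X^T X) p = X^T y
        n = 3
        A = [[sum(X[k][i] * X[k][j] for k in range(len(X)))
              for j in range(n)] for i in range(n)]
        v = [sum(X[k][i] * ys[k] for k in range(len(X))) for i in range(n)]
        # Gaussian elimination with partial pivoting
        for i in range(n):
            p = max(range(i, n), key=lambda r: abs(A[r][i]))
            A[i], A[p] = A[p], A[i]
            v[i], v[p] = v[p], v[i]
            for r in range(i + 1, n):
                f = A[r][i] / A[i][i]
                for c in range(i, n):
                    A[r][c] -= f * A[i][c]
                v[r] -= f * v[i]
        # Back substitution
        p = [0.0] * n
        for i in reversed(range(n)):
            p[i] = (v[i] - sum(A[i][j] * p[j]
                               for j in range(i + 1, n))) / A[i][i]
        return p

    return (solve_normal([t[0] for t in target_pts]),
            solve_normal([t[1] for t in target_pts]))


def apply_affine(params, pt):
    """Apply a fitted affine correction to one raw gaze prediction."""
    (ax, bx, cx), (ay, by, cy) = params
    x, y = pt
    return (ax * x + bx * y + cx, ay * x + by * y + cy)


# Hypothetical calibration session: five known tray points (cm) and the
# systematically offset raw predictions the uncalibrated model might emit.
targets = [(0.0, 0.0), (20.0, 0.0), (0.0, 15.0), (20.0, 15.0), (10.0, 7.0)]
raw = [(x * 0.9 + 1.5, y * 1.05 - 0.8) for x, y in targets]

params = fit_affine(raw, targets)
corrected = [apply_affine(params, p) for p in raw]
```

With an exact affine distortion, as simulated here, the fit recovers the targets almost perfectly; with real noisy gaze data the residual error would instead shrink toward the reported 2 cm.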

This paper was published in Academica-e (Univ. Pública de Navarra).


Licence: info:eu-repo/semantics/openAccess