8 research outputs found

    Development and Calibration of an Eye-Tracking Fixation Identification Algorithm for Immersive Virtual Reality

    [EN] Fixation identification is an essential task in the extraction of relevant information from gaze patterns, and various algorithms are used in the identification process. However, the thresholds used in these algorithms greatly affect their sensitivity. Moreover, the application of these algorithms to eye-tracking technologies integrated into head-mounted displays, where the subject's head position is unrestricted, is still an open issue. Therefore, the adaptation of eye-tracking algorithms and their thresholds to immersive virtual reality frameworks needs to be validated. This study presents the development of a dispersion-threshold identification algorithm applied to data obtained from an eye-tracking system integrated into a head-mounted display. Rule-based criteria are proposed to calibrate the thresholds of the algorithm through different features, such as the number of fixations and the percentage of points that belong to a fixation. The results show that distance-dispersion thresholds between 1 and 1.6 degrees and time windows between 0.25 and 0.4 s are the acceptable parameter ranges, with 1 degree and 0.25 s being the optimum. The work presents a calibrated algorithm to be applied in future experiments with eye tracking integrated into head-mounted displays, as well as guidelines for calibrating fixation identification algorithms.

    We thank Pepe Roda Belles for the development of the virtual reality environment and the integration of the HMD with the Unity platform. We also thank Masoud Moghaddasi for useful discussions and recommendations.

    Llanes-Jurado, J.; Marín-Morales, J.; Guixeres Provinciale, J.; Alcañiz Raya, ML. (2020). Development and Calibration of an Eye-Tracking Fixation Identification Algorithm for Immersive Virtual Reality. Sensors. 20(17):1-15. https://doi.org/10.3390/s20174956
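    For orientation, the sketch below shows the core of a dispersion-threshold (I-DT) identification routine using the optimum parameters reported above (1 degree, 0.25 s). It is a minimal illustration under stated assumptions, not the authors' implementation: the sample format, the dispersion measure (sum of horizontal and vertical spread), and the function name are assumptions.

```python
# Minimal I-DT sketch. Assumptions (not the paper's code): gaze samples are
# (timestamp_s, x_deg, y_deg) tuples in angular coordinates, and dispersion
# is measured as (max(x) - min(x)) + (max(y) - min(y)).
def idt_fixations(samples, dispersion_deg=1.0, window_s=0.25):
    """Return (start_time, end_time, centroid_x, centroid_y) fixations."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        # Find j so that samples[i..j] spans at least window_s seconds.
        j = i
        while j < n and samples[j][0] - samples[i][0] < window_s:
            j += 1
        if j >= n:
            break
        xs = [s[1] for s in samples[i:j + 1]]
        ys = [s[2] for s in samples[i:j + 1]]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= dispersion_deg:
            # Expand the window while dispersion stays under the threshold.
            k = j + 1
            while k < n:
                xs.append(samples[k][1]); ys.append(samples[k][2])
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion_deg:
                    xs.pop(); ys.pop()
                    break
                k += 1
            end = i + len(xs) - 1
            fixations.append((samples[i][0], samples[end][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = end + 1
        else:
            i += 1
    return fixations
```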

    An Immersive Virtual Reality Game for Predicting Risk Taking through the Use of Implicit Measures

    [EN] The tool presented in this article can be applied as an ecological measure for evaluating decision-making processes in risky situations. It can be used in different contexts, both in Occupational Safety and Health practice and for research purposes. Risk taking (RT) measurement constitutes a challenge for researchers and practitioners and has been addressed from different perspectives. Personality traits and temperamental aspects such as sensation seeking and impulsivity influence the individual's approach to RT, prompting risk-seeking or risk-aversion behaviors. Virtual reality has emerged as a suitable tool for RT measurement, since it enables the exposure of a person to realistic risks, allowing embodied interactions, the application of stealth assessment techniques and real-time physiological measurement. In this article, we present the assessment on decision making in risk environments (AEMIN) tool, an enhanced version of the spheres and shield maze task, a previous tool developed by the authors. The main aim of this article is to study whether it is possible to discriminate participants with high versus low scores in the measures of personality, sensation seeking and impulsivity, through their behaviors and physiological responses while playing AEMIN. Applying machine learning methods to the dataset, we explored: (a) whether it is possible to discriminate between the two populations in each variable through these data; and (b) which parameters better discriminate between the two populations in each variable. The results support the use of AEMIN as an ecological assessment tool to measure RT, since it brings to light behaviors that make it possible to classify subjects as high or low on risk-related psychological constructs. Regarding physiological measures, galvanic skin response seems to be less salient in prediction models.

    This research was funded by the Spanish Ministry of Economy and Competitiveness funded project "Assessment and Training on Decision Making in Risk Environments", grant number RTC-2017-6523-6, by the Generalitat Valenciana funded project "Rebrand", grant number PROMETEU/2019/105, and by the European Union ERDF (European Regional Development Fund) program of the Valencian Community 2014-2020 funded project "Interfaces de realidad mixta aplicada a salud y toma de decisiones" (mixed reality interfaces applied to health and decision making), grant number IDIFEDER/2018/029.

    Juan-Ripoll, CD.; Llanes-Jurado, J.; Chicchi-Giglioli, IA.; Marín-Morales, J.; Alcañiz Raya, ML. (2021). An Immersive Virtual Reality Game for Predicting Risk Taking through the Use of Implicit Measures. Applied Sciences. 11(2):1-21. https://doi.org/10.3390/app11020825
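    As a rough illustration of the kind of analysis described (discriminating high from low scorers on a construct from in-game behaviors and physiological signals), the sketch below cross-validates a classifier on synthetic data. It is not the authors' pipeline: the feature names, the random-forest choice, and the data are all hypothetical.

```python
# Illustrative sketch only: feature names and data are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))           # behavioral + physiological features
y = rng.integers(0, 2, size=60)        # high (1) vs. low (0) scorers on a construct
feature_names = ["time_in_risk_zone", "collisions", "gsr_mean"]

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

clf.fit(X, y)
for name, importance in zip(feature_names, clf.feature_importances_):
    print(f"{name}: {importance:.2f}")  # which parameters discriminate best
```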

    Why do we take risks? Perception of the situation and risk proneness predict domain-specific risk taking

    [EN] Risk taking (RT) is a component of the decision-making process in situations that involve uncertainty and in which the probability of each outcome (rewards and/or negative consequences) is already known. The influence of cognitive and emotional processes in decision making may affect how risky situations are addressed. First, inaccurate assessments of situations may constitute a perceptual bias in decision making, which might influence RT. Second, there seems to be consensus that a proneness bias exists, known as risk proneness, which can be defined as the propensity to be attracted to potentially risky activities. In the present study, we take the approach that risk perception and risk proneness affect RT behaviours. The study hypothesises that locus of control, emotion regulation, and executive control act as perceptual biases in RT, and that personality, sensation seeking, and impulsivity traits act as proneness biases in RT. The results suggest that locus of control, emotion regulation and executive control influence certain domains of RT, while personality influences all domains except the recreational one, and sensation seeking and impulsivity are involved in all domains of RT. The results of the study constitute a foundation upon which to build in this research area and can contribute to an increased understanding of human behaviour in risky situations.

    This work was supported by the European Union's Horizon 2020 funded project "Modelling and predicting human decision making using measures of subconscious brain processes through mixed reality interfaces and biometric signals (RHUMBO)" (No 813234), the Spanish Ministry of Economy, Industry and Competitiveness funded project "Assessment and training on decision making in risk environments (ATEMIN)" (RTC-2017-6523-6; MINECO/AEI/FEDER, UE), and by the Generalitat Valenciana funded project "Mixed reality and brain decision (REBRAND)" (PROMETEO/2019/105).

    Juan-Ripoll, CD.; Chicchi-Giglioli, IA.; Llanes-Jurado, J.; Marín-Morales, J.; Alcañiz Raya, ML. (2021). Why do we take risks? Perception of the situation and risk proneness predict domain-specific risk taking. Frontiers in Psychology. 12:1-12. https://doi.org/10.3389/fpsyg.2021.562381
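    A minimal sketch of the kind of domain-specific prediction the study reports (regressing one RT domain on perceptual-bias and proneness-bias measures) is shown below. The variable names and data are hypothetical, and the model choice (ordinary least squares) is an assumption rather than the study's actual analysis.

```python
# Illustrative only: variable names are hypothetical stand-ins for the
# study's measures, and the data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "sensation_seeking": rng.normal(size=100),
    "impulsivity": rng.normal(size=100),
    "locus_of_control": rng.normal(size=100),
})
# Synthetic outcome for one RT domain (e.g. recreational).
df["recreational_rt"] = (0.5 * df["sensation_seeking"]
                         + 0.3 * df["impulsivity"]
                         + rng.normal(scale=0.5, size=100))

model = smf.ols(
    "recreational_rt ~ sensation_seeking + impulsivity + locus_of_control",
    data=df).fit()
print(model.summary())  # which predictors carry weight in this domain
```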

    Prediction models using artificial intelligence and longitudinal data from electronic health records: a systematic methodological review

    Objective: To describe and appraise the use of artificial intelligence (AI) techniques that can cope with longitudinal data from electronic health records (EHRs) to predict health-related outcomes. Methods: This review included studies in any language that used an EHR as at least one of the data sources, collected longitudinal data, used an AI technique capable of handling longitudinal data, and predicted any health-related outcome. We searched MEDLINE, Scopus, Web of Science, and IEEE Xplore from inception to January 3, 2022. Information on the dataset, prediction task, data preprocessing, feature selection, method, validation, performance, and implementation was extracted and summarized using descriptive statistics. Risk of bias and completeness of reporting were assessed using a short form of PROBAST and TRIPOD, respectively. Results: Eighty-one studies were included. Follow-up time and number of records per patient varied greatly, and most studies predicted disease development or the next event based on diagnoses and drug treatments. Architectures were generally based on Recurrent Neural Network-like layers, though in recent years combining different layers or using transformers has become more popular. About half of the included studies performed hyperparameter tuning and used attention mechanisms. Most performed a single train-test partition and could not correctly assess the variability of the model's performance. Reporting quality was poor, and a third of the studies were at high risk of bias. Conclusions: AI models are increasingly using longitudinal data. However, the heterogeneity in reporting methodology and results, and the lack of public EHR datasets and code sharing, complicate replication.

    The project received a research grant from the Carlos III Institute of Health, Ministry of Economy and Competitiveness (Spain), awarded in the 2019 call under the Health Strategy Action 2013-2016, within the National Research Programme oriented to Societal Challenges, within the Technical, Scientific and Innovation Research National Plan 2013-2016 (reference PI19/00535), and the PFIS Grant FI20/00040, cofunded with European Union ERDF (European Regional Development Fund) funds. The project has also been partially funded by the Generalitat de Catalunya through AGAUR (grant numbers 2021-SGR-01033, 2021-SGR-01537).
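    To make the dominant architecture concrete, the sketch below outlines the RNN-style pattern the review reports most often: embed coded events (diagnoses, drug treatments) per visit, run a recurrent layer over the visit sequence, and predict the outcome from the final hidden state. The vocabulary size, dimensions, and framework (PyTorch) are illustrative assumptions, not taken from any reviewed study.

```python
# Minimal sketch of an RNN-based classifier over longitudinal EHR codes.
# Shapes and vocabulary size are illustrative only.
import torch
import torch.nn as nn

class EHRSequenceClassifier(nn.Module):
    def __init__(self, n_codes=5000, emb_dim=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(n_codes, emb_dim, padding_idx=0)
        self.gru = nn.GRU(emb_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, code_seq):
        # code_seq: (batch, visits) integer event codes, 0 = padding
        x = self.embed(code_seq)          # (batch, visits, emb_dim)
        _, h = self.gru(x)                # h: (1, batch, hidden)
        return self.head(h.squeeze(0))    # (batch, 1) logit for the outcome

model = EHRSequenceClassifier()
logits = model(torch.randint(1, 5000, (8, 20)))  # 8 patients, 20 visits each
print(logits.shape)  # torch.Size([8, 1])
```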

    From Jacobi off-shell currents to integral relations

    In this paper, we study off-shell currents built from the Jacobi identity of the kinematic numerators of gg → X with X = ss, qq̄, gg. We find that these currents can be schematically written in terms of three-point interaction Feynman rules. This representation allows for a straightforward understanding of the Colour-Kinematics duality, as well as for the construction of the building blocks for the generation of higher-multiplicity tree-level and multi-loop numerators. We also provide one-loop integral relations through the Loop-Tree duality formalism, with potential applications and advantages for the computation of relevant physical processes at the Large Hadron Collider. We illustrate these integral relations with the explicit examples of QCD one-loop numerators of gg → ss.
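    For readers unfamiliar with the duality, the standard four-point colour-kinematics (BCJ) statement is sketched below; it is background notation only and does not reproduce the paper's off-shell currents or its Loop-Tree duality relations.

```latex
% Standard four-point colour-kinematics statement, for orientation only.
\begin{align}
  \mathcal{A}_4^{\text{tree}} &= \frac{c_s\, n_s}{s} + \frac{c_t\, n_t}{t} + \frac{c_u\, n_u}{u}, \\
  c_s + c_t + c_u &= 0 \quad \text{(Jacobi identity of the colour factors)}, \\
  n_s + n_t + n_u &= 0 \quad \text{(dual Jacobi relation imposed on the kinematic numerators)}.
\end{align}
```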

    Association of General and Abdominal Obesity With Hypertension, Dyslipidemia and Prediabetes in the PREDAPS Study

