    Eye tracking and artificial intelligence for competency assessment in engineering education: a review

    In recent years, eye-tracking (ET) methods have gained increasing interest in STEM education research. When applied to engineering education, ET is particularly relevant for understanding aspects of student behavior, especially student competency and its assessment. However, from the instructor’s perspective, little is known about how ET can be used to provide new insights into, and ease the process of, instructor assessment. Traditionally, engineering education is assessed through time-consuming and labor-intensive screening of student materials and learning outcomes. Coupled with, for instance, the subjective, open-ended dimensions of engineering design, this approach to assessing competency has shown some limitations. To address such issues, alternative technologies such as artificial intelligence (AI), which has the potential to automate and replicate instructors’ tasks at scale with higher accuracy, have been suggested. To date, little is known about the effects of combining AI and ET (AIET) techniques to gain new insights from the instructor’s perspective. We conducted a review of engineering education research over the last decade (2013–2022) to study the latest work focusing on this combination to improve engineering assessment. The review covered four databases (Web of Science, IEEE Xplore, EBSCOhost, and Google Scholar) and included specific terms associated with the topic of AIET in engineering education. It identified two types of AIET applications, both focused mostly on student learning: (1) eye-tracking devices that rely on AI to enhance the gaze-tracking process (improvement of the technology), and (2) the use of AI to analyze, predict, and assess eye-tracking analytics (application of the technology). We end the review by discussing future perspectives and potential contributions to the assessment of engineering learning.
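    The second application type identified by the review, using AI to analyze eye-tracking analytics, typically begins by segmenting raw gaze samples into fixations. As a hedged illustration (this is the standard dispersion-threshold I-DT analytic, not an algorithm taken from the review itself, and all parameter values are hypothetical):

    ```python
    def idt_fixations(points, dispersion_threshold, min_samples):
        """Dispersion-threshold (I-DT) fixation detection.

        points: list of (x, y) gaze samples at a fixed sampling rate.
        A window is a fixation when (max x - min x) + (max y - min y)
        stays below dispersion_threshold; the window is grown until the
        threshold is exceeded. Returns (centroid_x, centroid_y, n_samples).
        """
        def dispersion(w):
            xs = [p[0] for p in w]
            ys = [p[1] for p in w]
            return (max(xs) - min(xs)) + (max(ys) - min(ys))

        fixations = []
        i, n = 0, len(points)
        while i + min_samples <= n:
            j = i + min_samples
            if dispersion(points[i:j]) <= dispersion_threshold:
                # Grow the window while it still qualifies as a fixation.
                while j < n and dispersion(points[i:j + 1]) <= dispersion_threshold:
                    j += 1
                xs = [p[0] for p in points[i:j]]
                ys = [p[1] for p in points[i:j]]
                fixations.append((sum(xs) / len(xs), sum(ys) / len(ys), j - i))
                i = j
            else:
                i += 1
        return fixations
    ```

    Features derived from such fixations (count, duration, dispersion) are what an AI model would then consume for prediction or assessment.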

    Towards Everyday Virtual Reality through Eye Tracking

    With developments in computer graphics, hardware technology, perception engineering, and human-computer interaction, virtual reality and virtual environments are becoming more integrated into our daily lives. Head-mounted displays, however, are still not used as frequently as other mobile devices such as smartphones and smartwatches.
    With increased usage of this technology and the acclimation of humans to virtual application scenarios, it is possible that in the near future an everyday virtual reality paradigm will be realized. When considering the marriage of everyday virtual reality and head-mounted displays, eye tracking is an emerging technology that helps to assess human behaviors in a real-time and non-intrusive way. Still, multiple aspects need to be researched before these technologies become widely available in daily life. Firstly, attention and cognition models in everyday scenarios should be thoroughly understood. Secondly, as the eyes are linked to visual biometrics, privacy-preserving methodologies are necessary. Lastly, rather than studies and applications that rely on small numbers of participants with relatively homogeneous characteristics, protocols and use cases that make such technology more accessible are essential. In this work, taking the aforementioned points into account, a significant scientific push towards everyday virtual reality has been made through three main research contributions. Human visual attention and cognition have been researched in virtual reality in two different domains: education and driving. Research in the education domain has focused on the effects of different classroom manipulations on human visual behaviors, whereas research in the driving domain has targeted safety-related issues and gaze guidance. The user studies in both domains show that eye movements offer significant implications for these everyday setups. The second substantial contribution focuses on privacy-preserving eye tracking for the eye movement data gathered from head-mounted displays. This includes differential privacy that takes temporal correlations of eye movement signals into account, and a privacy-preserving gaze estimation task built on a randomized encoding-based framework that uses eye landmarks.
    The results of both works indicate that privacy can be preserved while keeping utility in a reasonable range. Even though few works have focused on this aspect of eye tracking until now, more research is necessary to support everyday virtual reality. As a final significant contribution, a blockchain- and smart contract-based eye tracking data collection protocol for virtual reality is proposed to make virtual reality more accessible. The findings present valuable insights for everyday virtual reality and advance the state of the art in several directions.
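    The dissertation's differential-privacy contribution specifically accounts for temporal correlations in gaze signals; the abstract gives no implementation details, so the sketch below shows only a naive per-sample baseline for contrast: Laplace noise calibrated to a privacy budget epsilon, added independently to each gaze coordinate. Function names and parameters are assumptions, not the author's method.

    ```python
    import math
    import random

    def laplace_noise(scale: float) -> float:
        # Inverse-CDF sampling from Laplace(0, scale):
        # draw u in (-0.5, 0.5), return -scale * sgn(u) * ln(1 - 2|u|).
        u = random.random() - 0.5
        return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

    def privatize_gaze(samples, epsilon, sensitivity):
        """Add Laplace noise to each (x, y) gaze sample independently.

        Caution: per-sample noise ignores the temporal correlations of
        eye movement signals that the dissertation addresses, so the
        actual privacy guarantee of this baseline degrades over a
        correlated gaze sequence. Shown only as the textbook starting point.
        """
        scale = sensitivity / epsilon
        return [(x + laplace_noise(scale), y + laplace_noise(scale))
                for (x, y) in samples]
    ```

    Smaller epsilon values mean larger noise (stronger privacy, lower utility), which is the privacy-utility trade-off the abstract refers to.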

    Social Comparisons in the Classroom Revisited: Insights Into Underlying Processes Using Immersive Virtual Reality as a Research Tool

    Social comparisons are commonplace in every classroom and widely acknowledged as central determinants of students’ academic self-evaluations (see, e.g., Dijkstra et al., 2008; Trautwein & Möller, 2016). Most prominently, in educational psychology research, social comparisons have been assumed to be the cause behind the well-known Big-Fish-Little-Pond effect (BFLPE; Marsh, 1987), which suggests negative effects of higher class-average (or school-average) achievement on students’ academic self-concept while controlling for individual achievement. Whereas existing research has provided compelling evidence of the effects of certain reference groups on students’ self-evaluations (Marsh et al., 2017; Marsh & Seaton, 2015), the actual mechanisms behind the proposed effects, and how students process social information while learning, are still a black box. The present dissertation was aimed at gaining insights into the respective underlying processes (i.e., the “inner workings” of this black box) by using immersive virtual reality (IVR) as a research tool. IVR technology provides an unprecedented opportunity for educational psychology research to integrate ecological validity and experimental control in research designs, enabling authentic yet standardized insights into classroom processes such as social comparisons and beyond (see, e.g., Blascovich et al., 2002). To this end, the present dissertation was aimed at both a theoretical and a methodological advancement of research on social comparisons in the classroom. To address these objectives, the dissertation drew on three empirical studies with an IVR classroom that included an experimental manipulation of classmates’ performance-related behavior.
    First, pursuing a more in-depth theoretical understanding of social comparisons and the respective processing of social information in the classroom, the dissertation aimed to identify covert and overt social comparison behaviors that (a) reflect students’ cognitive and behavioral responses to social comparison information in an IVR classroom and (b) ultimately explain individual differences in students’ self-concepts. Studies 1 and 2 used students’ self-reports (of their interpretation of classmates’ performance-related behavior) and eye movement data (e.g., visual attention on classmates) to identify different social comparison processes in the IVR classroom and to provide insights into the mechanisms that underlie the BFLPE. Second, aiming to provide insights into how IVR classrooms can be used as an experimental tool in educational psychology research, Study 3 focused on the configuration of an IVR classroom to authentically simulate and control a (social) classroom environment. The study provides insights into how different fields of view, virtual avatar visualization styles, and virtual classmates’ performance-related behaviors affect students’ processing of social information in the IVR classroom. Taken together, by using an IVR classroom as an experimentally controlled yet authentic research setting, the present dissertation advances the theoretical understanding of social comparisons and the respective processing of social information in the classroom that ultimately explain individual differences in students’ self-concepts. Moreover, it demonstrates how IVR classrooms and the corresponding standardized process data can be used to gain insights into classroom processes such as social comparisons. The dissertation thereby provides implications for research on both social comparisons in the classroom and the use of IVR as an experimental tool in educational and social psychology research in general.
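    An eye movement measure such as "visual attention on classmates" is commonly operationalized as dwell time within areas of interest (AOIs). A minimal sketch, assuming a fixed-rate gaze stream and hypothetical rectangular AOIs (not the dissertation's actual setup or AOI layout):

    ```python
    def dwell_times(gaze, aois, dt):
        """Accumulate per-AOI dwell time from a fixed-rate gaze stream.

        gaze: list of (x, y) gaze points sampled every dt seconds.
        aois: dict mapping AOI name -> (xmin, ymin, xmax, ymax) rectangle.
        Returns a dict mapping AOI name -> total seconds of gaze inside it.
        """
        totals = {name: 0.0 for name in aois}
        for (x, y) in gaze:
            for name, (x0, y0, x1, y1) in aois.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    totals[name] += dt
        return totals
    ```

    Per-AOI dwell times like these can then be related to self-report measures, e.g., to test whether attention to classmates predicts self-concept differences.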