5 research outputs found

    Effects Of Gaze Distribution On Woodworking Knowledge And Skills


    Learning Analytics for the Formative Assessment of New Media Skills

    Recent theories of education have shifted learning environments towards student-centred education. In addition, the advancement of technology and the need for skilled individuals in different areas have led to the introduction of new media skills. Along with new pedagogies and content, these changes require new forms of assessment. However, assessment, as the core of learning, has not been modified as much as other aspects of education. Hence, much attention is required to develop assessment methods based on current educational requirements. To address this gap, we conducted two data-driven systematic literature reviews to characterize the current state of the field. Chapter four of this thesis focuses on a literature review of automatic assessment, known as learning analytics, and investigates the topics and challenges in developing new learning analytics tools. Chapter five studies all assessment types, both traditional and automatic, in computational thinking education. Computational thinking education, which refers to the teaching of problem-solving skills, is one of the new media skills introduced in the 21st century. The findings from these two literature reviews categorize the assessment methods and identify the key topics in the literature on learning analytics and computational thinking assessment. By studying the identified topics, their relations, and related studies, we pinpoint the challenges, requirements, and opportunities of using automatic assessment in education. The findings from these studies can serve as a guideline for future work aiming to enhance assessment methods in education, and the literature review strategy in this thesis can be reused by other researchers to develop systematic data-driven literature reviews.

    Gaze insights into debugging behavior using learner-centred analysis

    The presented study tackles the question of how user-generated data from current technologies can be used to reinforce learners' reflections, improve teaching practices, and close the learning analytics loop. In particular, the aim of the study is to use participants' gaze to examine the role of a mirroring tool (i.e. the Exercise View in Eclipse) in orchestrating basic behavioral regulation during a debugging task. The results demonstrated that students who processed the information presented in the Exercise View and acted upon it improved their performance and achieved a higher level of success than those who did not. The findings shed light on how to capture what constitutes relevant data within a particular context using gaze patterns, which could guide the collection of essential learner-centred analytics for designing usable and modular learning environments based on data-driven approaches.

    Assessing the Effectiveness of Personalized Computer-Administered Feedback in an Introductory Biology Course

    Though often held in high regard as a pedagogical tool, the role of feedback within the learning process remains poorly understood. The prevailing feedback literature reveals a history of inconsistent, if not contradictory, findings. This already complicated state is made worse by the recent introduction of learning analytic tools capable of providing students with ongoing personalized computer-generated feedback, the effectiveness of which remains unknown. The present study contributed to this new domain of knowledge by evaluating one such case, in which a learning analytic feedback intervention was implemented in an introductory biology course at the University of Saskatchewan. The system provided personalized feedback to half of the enrolled students, differentiated according to their individual characteristics; the remaining students received generic feedback common to all students in that condition. The effectiveness of personalized feedback was evaluated with respect to academic achievement (i.e., final grade) and feedback satisfaction. Results of the treatment effect analyses showed no significant difference in academic achievement but a small significant difference in feedback satisfaction. Follow-up analyses revealed that these differences in feedback satisfaction were not consistent from one iteration of the course to the next and that mean feedback satisfaction had been in steady decline since the system's implementation. The lack of improvement in academic achievement is suspected to stem from the system's poor adherence to the theoretical underpinnings of good feedback practice. Limitations of the study and future directions are discussed.

    Potential impacts of the use of data analytics to improve the student experience

    Student experience and learning analytics have been growing areas of interest in higher education practice and debate, yet little research has focused on their intersection: the use of analytics to improve the student experience. To support further investigation in this area, this study adopted an exploratory design research approach to identify potential benefits and concerns related to the use of analytics to enhance the student experience in higher education. To achieve this, a prototype was designed and evaluated through tests and discussions with academics and student representatives from nine Scottish universities. The exploratory results suggest four main potential benefits and nine possible problems and issues. A theoretical and critical analysis offers further interpretation of the possible implications of these impacts, and important areas for future research are suggested.