
    “Stickiness”: Gauging students’ attention to online learning activities

    Purpose: Online content developers use the term “stickiness” to refer to the ability of their online service or game to attract and hold the attention of users and create a compelling and magnetic reason for them to return repeatedly (examples include virtual pets and social media). In business circles, the same term connotes the level of consumer loyalty to a particular brand. This paper aims to extend the concept of “stickiness” not only to describe repeat visits and commitment to the learning “product”, but also as a measure of the extent to which students are engaged in online learning opportunities. Design/methodology/approach: This paper explores the efficacy of several approaches to the monitoring and measuring of online learning environments, and proposes a framework for assessing the extent to which these environments are compelling, engaging and “sticky”. Findings: In particular, the exploration so far has highlighted the difference between how lecturers have monitored the engagement of students in a face-to-face setting versus the online teaching environment. Practical implications: In the higher education environment, where students are increasingly asked to access learning in the online space, it is vital for teachers to be in a position to monitor and guide students in their engagement with online materials. Originality/value: The mere presence of learning materials online is not sufficient evidence of engagement. This paper offers options for testing specific attention to online materials, allowing greater assurance around engagement with relevant and effective online learning activities.

    Student-Centered Learning: Functional Requirements for Integrated Systems to Optimize Learning

    The realities of the 21st-century learner require that schools and educators fundamentally change their practice. "Educators must produce college- and career-ready graduates that reflect the future these students will face. And, they must facilitate learning through means that align with the defining attributes of this generation of learners." Today, we know more than ever about how students learn, acknowledging that the process isn't the same for every student and doesn't remain the same for each individual, depending upon maturation and the content being learned. We know that students want to progress at a pace that allows them to master new concepts and skills, to access a variety of resources, to receive timely feedback on their progress, to demonstrate their knowledge in multiple ways and to get direction, support and feedback from—as well as collaborate with—experts, teachers, tutors and other students. The result is a growing demand for student-centered, transformative digital learning using competency education as an underpinning. iNACOL released this paper to illustrate the technical requirements and functionalities that learning management systems need to shift toward student-centered instructional models. This comprehensive framework will help districts and schools determine what systems to use and integrate as they begin their journey toward student-centered learning, as well as how systems integration aligns with their organizational vision, educational goals and strategic plans. Educators can use this report to optimize student learning and promote innovation in their own student-centered learning environments. The report will help school leaders understand the complex technologies needed to optimize personalized learning and how to use data and analytics to improve practices, and can assist technology leaders in re-engineering systems to support the key nuances of student-centered learning.

    Immersive Telepresence: A framework for training and rehearsal in a postdigital age


    The relationship of (perceived) epistemic cognition to interaction with resources on the internet

    Information seeking and processing are key literacy practices. However, they are activities that students, across a range of ages, struggle with. These information seeking processes can be viewed through the lens of epistemic cognition: beliefs regarding the source, justification, complexity, and certainty of knowledge. In the research reported in this article we build on established research in this area, which has typically used self-report psychometric and behavior data, and information seeking tasks involving closed-document sets. We take a novel approach in applying established self-report measures to a large-scale, naturalistic study environment, pointing to the potential of analysis of dialogue, web navigation (including sites visited) and other trace data to support more traditional self-report mechanisms. Our analysis suggests that prior work demonstrating relationships between self-report indicators is not paralleled in investigation of the hypothesized relationships between self-report and trace indicators. However, there are clear epistemic features of this trace data. The article thus demonstrates the potential of behavioral learning analytic data in understanding how epistemic cognition is brought to bear in rich information seeking and processing tasks.

    Towards investigating the validity of measurement of self-regulated learning based on trace data

    Contemporary research that looks at self-regulated learning (SRL) as processes of learning events derived from trace data has attracted increasing interest over the past decade. However, limited research has been conducted that looks into the validity of trace-based measurement protocols. In order to fill this gap in the literature, we propose a novel validation approach that combines theory-driven and data-driven perspectives to increase the validity of interpretations of SRL processes extracted from trace data. The main contribution of this approach consists of three alignments between trace data and think aloud data to improve measurement validity. In addition, we define the match rate between SRL processes extracted from trace data and think aloud as a quantitative indicator, together with three other indicators (sensitivity, specificity and trace coverage), to evaluate the "degree" of validity. We tested this validation approach in a laboratory study that involved 44 learners who learned individually about the topic of artificial intelligence in education with the use of a technology-enhanced learning environment for 45 minutes. Following this new validation approach, we achieved an improved match rate between SRL processes extracted from trace data and think aloud data (training set: 54.24%; testing set: 55.09%) compared to the match rate before applying the validation approach (training set: 38.97%; test set: 34.54%). By considering think aloud data as the "reference point", this improvement of the match rate quantified the extent to which validity can be improved by using our validation approach. In conclusion, the novel validation approach presented in this study used both empirical evidence from think aloud data and rationale from our theoretical framework of SRL, which now allows testing and improvement of the validity of trace-based SRL measurements.
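The indicators this abstract names can be made concrete with a small sketch. The following is a minimal, hypothetical illustration of computing a match rate, plus sensitivity and specificity for a single SRL process, over aligned label sequences; the label names, the one-to-one segment alignment, and the function names are assumptions for illustration, not the authors' actual protocol.

```python
# Hypothetical sketch: comparing SRL process labels extracted from trace data
# against think-aloud codes used as the reference point. Assumes the two
# sources have already been aligned segment by segment.

def match_rate(trace_labels, thinkaloud_labels):
    """Share of aligned segments where both sources yield the same SRL process."""
    pairs = list(zip(trace_labels, thinkaloud_labels))
    return sum(t == a for t, a in pairs) / len(pairs)

def sensitivity_specificity(trace_labels, thinkaloud_labels, process):
    """Treat think-aloud codes as ground truth for one SRL process."""
    tp = fn = tn = fp = 0
    for t, a in zip(trace_labels, thinkaloud_labels):
        if a == process:
            tp += (t == process)  # trace also detected the process
            fn += (t != process)  # trace missed it
        else:
            fp += (t == process)  # trace detected it spuriously
            tn += (t != process)  # trace correctly did not detect it
    return tp / (tp + fn), tn / (tn + fp)

# Invented example sequences (SRL process names are illustrative only)
trace      = ["orient", "monitor", "read", "monitor", "evaluate"]
thinkaloud = ["orient", "read",    "read", "monitor", "evaluate"]

print(match_rate(trace, thinkaloud))                        # → 0.8
print(sensitivity_specificity(trace, thinkaloud, "monitor"))  # → (1.0, 0.75)
```

The study's actual alignments are richer than this one-to-one comparison, but the sketch shows why a match rate alone is insufficient and why the per-process sensitivity and specificity indicators are reported alongside it.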

    When, how and for whom changes in engagement happen: A transition analysis of instructional variables

    Our knowledge of online engagement has not kept pace with the need to understand its temporal dynamics, the transitions between engagement states, and the factors that influence whether a student stays persistently engaged, transitions to disengagement, or catches up and transitions to an engaged state. Our study addresses this gap and investigates how engagement evolves or changes over time, using a person-centered approach to identify for whom the changes happen and when. We take advantage of a novel and innovative multistate Markov model to identify what variables influence such transitions and with what magnitude, i.e., to answer the why. We use a large data set of 1428 enrollments in six courses (238 students). The findings show that online engagement changes differently across students, and at different magnitudes, according to different instructional variables and previous engagement states. Cognitively engaging instructions helped cognitively engaged students stay engaged while negatively affecting disengaged students. Lectures, a resource that requires less mental energy, helped improve disengaged students. Such differential effects point to the different ways interventions can be applied to different groups, and how different groups may be supported. A balanced, carefully tailored approach is needed to design, intervene, or support students' engagement that takes into account the diversity of engagement states as well as the varied response magnitudes that an intervention may incur across diverse student profiles.
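The core object behind the transition analysis described above can be sketched simply. The following is an illustrative estimate of a first-order transition matrix over engagement states from per-enrollment state sequences; the state names and data are invented for the example, and the actual study fits a multistate Markov model with instructional covariates, which goes well beyond this counting step.

```python
from collections import Counter, defaultdict

# Illustrative sketch only: estimating empirical transition probabilities
# between engagement states observed across consecutive course units.
# State names ("engaged", "average", "disengaged") are assumptions.

def transition_matrix(sequences):
    """Row-normalized counts of state-to-state transitions."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    return {state: {nxt: n / sum(c.values()) for nxt, n in c.items()}
            for state, c in counts.items()}

# Two invented enrollment trajectories across four course units
sequences = [
    ["engaged", "engaged", "average", "engaged"],
    ["disengaged", "disengaged", "average", "engaged"],
]
probs = transition_matrix(sequences)
print(probs["engaged"])  # → {'engaged': 0.5, 'average': 0.5}
```

A multistate Markov model then asks how covariates (e.g. the type of instructional resource in a unit) shift these transition probabilities, which is what lets the study attach magnitudes to the "why" of engagement change.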

    Towards efficient provision of feedback supported by learning analytics

    Proceedings of: 2012 12th IEEE International Conference on Advanced Learning Technologies (ICALT 2012). Rome, Italy, 04-06 July, 2012. Problem-based learning lab sessions shape a demanding environment, both for students and for the teaching staff, due to the additional support required. This applies particularly to overcrowded classes. Under these conditions, some aspects do not perform well, such as the efficiency of the provision of feedback and the orchestration of the session, jeopardizing the effectiveness of the learning activity. Based on empirical observation, a characterization of lab sessions has been carried out, integrating both qualitative and quantitative parameters describing the interactions that take place. Based on such characterization, a supporting tool is proposed that makes use of the students' logs, learning analytics and visualization techniques to provide monitoring and awareness mechanisms for addressing the detected problems and thus improving the learning and assessment processes. Research partially supported by the Spanish Plan Nacional de I+D+I projects “Learn3: Towards Learning of the Third Kind” (TIN2008-05163/TSI) and “EEE” (TIN2011-28308-C03-01), and the Madrid regional project “eMadrid: Investigación y desarrollo de tecnologías para el e-learning en la Comunidad de Madrid” (S2009/TIC-1650).

    Affordances and limitations of learning analytics for computer-assisted language learning: a case study of the VITAL project

    Learning analytics (LA) has emerged as a field that offers promising new ways to support failing or weaker students, prevent drop-out and aid retention. In addition, research suggests that large datasets of learner activity can be used to understand online learning behaviour and improve pedagogy. While the use of LA in language learning has received little attention to date, available research suggests that understanding language learner behaviour could provide valuable insights into task design for instructors and materials designers, as well as help students with effective learning strategies and personalised learning pathways. This paper first discusses previous research in the field of language learning and teaching based on learner tracking, and the specific affordances of LA for CALL, as well as its inherent limitations and challenges. The second part of the paper analyses data arising from the European Commission (EC) funded VITAL project, which adopted a bottom-up pedagogical approach to LA and implemented learner activity tracking in different blended or distance learning settings. Drawing on data from 285 undergraduate students on a Business French course at Hasselt University which used a flipped classroom design, statistical and process-mining techniques were applied to map and visualise actual uses of online learning resources over the course of one semester. Results suggested that most students planned their self-study sessions in accordance with the flipped classroom design, both in terms of their timing of online activity and selection of contents. Other metrics measuring active online engagement, a crucial component of successful flipped learning, indicated significant differences between successful and non-successful students. Meaningful learner patterns were revealed in the data, visualising students' paths through the online learning environment and uses of the different activity types. The research implied that valuable insights for instructors, course designers and students can be acquired based on the tracking and analysis of language learner data and the use of visualisation and process-mining tools.
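The process-mining step mentioned above typically starts by deriving a directly-follows graph from event logs. The following is a hedged, minimal sketch of that counting step; the activity names and logs are invented for illustration, and real process-discovery tools go much further (filtering, concurrency detection, visual maps).

```python
from collections import Counter

# Minimal sketch, under assumed data: each student trace is an ordered list
# of online activities; we count how often one activity immediately follows
# another across all traces. This is the raw input to a process map.

def directly_follows(logs):
    """Count directly-follows pairs (a, b) over all student traces."""
    edges = Counter()
    for trace in logs:
        for a, b in zip(trace, trace[1:]):
            edges[(a, b)] += 1
    return edges

# Invented flipped-classroom traces (activity names are illustrative only)
logs = [
    ["video", "vocab_drill", "quiz"],
    ["video", "quiz"],
    ["video", "vocab_drill", "quiz"],
]
edges = directly_follows(logs)
print(edges.most_common(2))
```

Edge frequencies like these are what a visualisation layer turns into the learner-path maps the abstract describes, e.g. revealing whether students follow the intended video-then-practice sequence before class.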
