
    Student-Centered Learning: Functional Requirements for Integrated Systems to Optimize Learning

    The realities of the 21st-century learner require that schools and educators fundamentally change their practice. "Educators must produce college- and career-ready graduates that reflect the future these students will face. And, they must facilitate learning through means that align with the defining attributes of this generation of learners."

    Today, we know more than ever about how students learn, acknowledging that the process isn't the same for every student and doesn't remain the same for each individual, depending upon maturation and the content being learned. We know that students want to progress at a pace that allows them to master new concepts and skills, to access a variety of resources, to receive timely feedback on their progress, to demonstrate their knowledge in multiple ways, and to get direction, support and feedback from—as well as collaborate with—experts, teachers, tutors and other students. The result is a growing demand for student-centered, transformative digital learning using competency education as an underpinning.

    iNACOL released this paper to illustrate the technical requirements and functionalities that learning management systems need in order to shift toward student-centered instructional models. This comprehensive framework will help districts and schools determine which systems to use and integrate as they begin their journey toward student-centered learning, as well as how systems integration aligns with their organizational vision, educational goals and strategic plans.

    Educators can use this report to optimize student learning and promote innovation in their own student-centered learning environments. The report will help school leaders understand the complex technologies needed to optimize personalized learning and how to use data and analytics to improve practices, and it can assist technology leaders in re-engineering systems to support the key nuances of student-centered learning.

    Using Learning Analytics to Assess Student Learning in Online Courses

    Learning analytics can be used to enhance student engagement and performance in online courses. Using learning analytics, instructors can collect and analyze data about students and improve the design and delivery of instruction to make it more meaningful for them. In this paper, the authors review different categories of online assessments and identify data sets that can be collected and analyzed for each of them. Two different data analytics and visualization tools were used: Tableau for quantitative data and Many Eyes for qualitative data. This paper has implications for instructors, instructional designers, administrators, and educational researchers who use online assessments.
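The kind of per-assessment aggregation the abstract describes, collecting response data for each item and summarising it so an instructor can spot where students struggle, can be sketched as follows. This is a minimal illustration only; the data layout and the `question_difficulty` helper are assumptions for the sketch, not part of the paper or of the Tableau/Many Eyes tooling it used.

```python
def question_difficulty(responses):
    """Summarise quiz responses per question.

    responses: one dict per student, mapping question_id -> bool
               (True if that student answered correctly).
    Returns a dict mapping question_id -> fraction of students
    who answered correctly (lower = harder item).
    """
    totals, correct = {}, {}
    for student in responses:
        for qid, ok in student.items():
            totals[qid] = totals.get(qid, 0) + 1
            correct[qid] = correct.get(qid, 0) + (1 if ok else 0)
    return {qid: correct[qid] / totals[qid] for qid in totals}

# Two students: both got q1 right, only one got q2 right.
print(question_difficulty([{"q1": True, "q2": False},
                           {"q1": True, "q2": True}]))
```

A table like this is exactly the sort of quantitative summary that could then be fed into a visualization tool for the instructor.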

    Challenges for IT-Enabled Formative Assessment of Complex 21st Century Skills

    In this article, we identify and examine opportunities for formative assessment provided by information technologies (IT) and the challenges which these opportunities present. We address some of these challenges by examining key aspects of assessment processes that can be facilitated by IT: datafication of learning; feedback and scaffolding; and peer assessment and peer feedback. We then consider how these processes may be applied in relation to the assessment of horizontal, general complex 21st century skills (21st CS), which are still proving challenging to incorporate into curricula as well as to assess. 21st CS such as creativity, complex problem solving, communication, collaboration and self-regulated learning contain complex constructs incorporating motivational and affective components. Our analysis has enabled us to make recommendations for policy, practice and further research. While there is currently much interest in and some progress towards the development of learning/assessment analytics for assessing 21st CS, the complexity of assessing such skills, together with the need to include affective aspects, means that IT-enabled techniques will need to be combined with more traditional methods of teacher assessment, as well as peer assessment, for some time to come. Therefore, learners, teachers and school leaders must learn how to manage a greater variety of sorts and sources of feedback, including resolving the tensions of inconsistent feedback from different sources.

    Report on the Evaluation of EVS Usage and Trends at the University of Hertfordshire : February to June 2014

    The Electronic Voting Systems (EVS) evaluation project for iTEAM has investigated the current level of engagement in the use of EVS across the institution in 2014. It built on the work and outputs of the JISC-supported Evaluating Electronic Voting Systems (EEVS) project in 2011-12 and the work of the iTEAM project through 2011-2013. It offers an up-to-date examination of the trends in EVS adoption and the breadth and nature of EVS use across the different academic schools.

    The project adopted a mixed-methods approach to evaluate usage and engagement. The starting point was a desk study to examine the existing data on the numbers of EVS handsets purchased by academic schools in 2011, 2012 and 2013 and registered across the University, and to explore the details from the School reports previously submitted to iTEAM. Sources of data included Information Hertfordshire and the iTEAM archive. Quantitative surveys were drawn up and information requests for student numbers were made to Senior Administrative Managers (SAMs). A series of interviews was held with School-based academics, including EVS Champions and Associate Deans for Learning and Teaching.

    Three purchasing trends for EVS handsets by different Schools were found: slow decrease in HUM, LAW and PAM; moderate increase in BS, EDU and HSK; and rapid increase in CS, ET and LMS. In terms of levels of EVS usage in 2013-14, four different patterns emerged among the schools: slow increase (CS, LMS and PAM), slow decrease (BS, ET, EDU and HUM), rapid decrease (LAW) and no change (CA and HSK). The EVS purchasing and usage trends are consistent with the figures given by Rogers for his technology adoption model. Some schools are characterised by successful ongoing EVS use over several years, while other schools' strategies for EVS, which had shown promise early on, have faltered.

    There was some evidence that academics in STEMM subjects are more likely to engage willingly with EVS use where larger groups are taught, but this is not yet in evidence across all the STEMM groups at this university. Furthermore, good practice exists and flourishes across non-STEMM subjects as well. The strategies for successful School-based EVS embedding and continued use share three hallmarks:

    • Top-down management support for the purchase of handsets, including training for academics and administrators, and alignment with the School teaching and learning strategy.
    • The existence of a core of innovators and early adopters of technology, including the local EVS champions, who are willing to actively engage with their fellow colleagues in sharing the potential of EVS technology.
    • An engagement with the pedagogical implications for changing and developing practice that the greater use of formative or summative polling and questioning requires.

    The immediate future of classroom technologies such as EVS offers two main directions. Firstly, there is the continuation of adopting 'institutionally provided' handheld devices: a low-cost method that can be used easily and flexibly. The other options for classroom polling rely on sufficient Wi-Fi availability in the teaching rooms and/or mobile phone signal strength, network availability and capacity. It is anticipated that the capacity issue will present fewer barriers to adoption in future, and that the future of classroom response systems is inevitably linked to the widespread use of mobile technologies by students.

    Learning Outcomes Assessment: A Practitioner's Handbook

    Ontario’s colleges and universities have made strides in developing learning outcomes, yet effective assessment remains a challenge. Learning Outcomes Assessment: A Practitioner's Handbook is a step-by-step resource to help faculty, staff, academic leaders and educational developers design, review and assess program-level learning outcomes. The handbook explores the theory, principles, reasons for and methods behind developing program-level learning outcomes; emerging developments in assessment; and tips and techniques to build institutional culture, increase faculty involvement and examine curriculum-embedded assessment. It also includes definitions, examples, case studies and recommendations that can be tailored to specific institutional cultures.

    Teachers Know Best: Making Data Work For Teachers and Students

    The Teachers Know Best research project seeks to encourage innovation in K-12 education by helping product developers, and those who procure resources for teachers, better understand teachers' views. The intent of Making Data Work is to drill down to help educators, school leaders, and product developers better understand the challenges teachers face when working with this critical segment of digital instructional tools. More than 4,600 teachers from a nationally representative sample were surveyed about their use of data to drive instruction and their use of these tools. This study focuses on the potential of a specific subset of digital instructional tools: those that help teachers collect and make use of student data to tailor and improve instruction for individual students. The use of data is a crucial component in personalized learning, which ensures that student learning experiences -- what they learn and how, when, and where they learn it -- are tailored to their individual needs, skills, and interests, and enable them to take ownership of their learning. Personalized learning is critical to meeting all students where they are, so they are neither bored with assignments that are too easy nor overwhelmed by work that is too hard.

    The Validity, Generalizability and Feasibility of Summative Evaluation Methods in Visual Analytics

    Many evaluation methods have been used to assess the usefulness of Visual Analytics (VA) solutions. These methods stem from a variety of origins with different assumptions and goals, which causes confusion about their proofing capabilities. Moreover, the lack of discussion about the evaluation processes may limit our potential to develop new evaluation methods specialized for VA. In this paper, we present an analysis of evaluation methods that have been used to summatively evaluate VA solutions. We provide a survey and taxonomy of the evaluation methods that have appeared in the VAST literature in the past two years. We then analyze these methods in terms of the validity and generalizability of their findings, as well as the feasibility of using them. We propose a new metric called summative quality to compare evaluation methods according to their ability to prove usefulness, and make recommendations for selecting evaluation methods based on their summative quality in the VA domain.
    Comment: IEEE VIS (VAST) 201

    Approaches to Measuring Attendance and Engagement

    In this paper, we argue that measuring student attendance creates an extrinsic motivator in the form of a reward for (apparent) engagement and can thus lead to undesirable behaviour and outcomes. We go on to consider a number of other mechanisms to assess or encourage student engagement, such as interactions with a learning environment, and whether these are more benign in their impact on student behaviour, i.e., they encourage the desired behaviour because they are not considered threatening, unlike the penalties associated with non-attendance. We consider a case study in Computer Science to investigate student behaviour, assessing different metrics for student engagement, such as the use of source control commits, and how this measure of engagement differs from attendance.
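The source-control metric mentioned above can be sketched simply: count commits per author from a repository's log. This is a hypothetical illustration of the general idea, not the paper's actual method; the author emails and the `engagement_from_log` helper are assumptions for the sketch.

```python
from collections import Counter

def engagement_from_log(author_lines):
    """Count commits per author.

    author_lines: one author identifier per commit, e.g. the lines
    produced by `git log --pretty=format:%ae`. Blank lines are skipped.
    Returns a Counter mapping author -> number of commits, which can
    serve as a crude engagement proxy for each student.
    """
    return Counter(line.strip() for line in author_lines if line.strip())

# Hypothetical log output for a two-student project repository.
sample = ["alice@uni.ac.uk", "bob@uni.ac.uk", "alice@uni.ac.uk"]
print(engagement_from_log(sample))  # alice: 2 commits, bob: 1 commit
```

In practice the author lines would come from running `git log` against each student repository; commit counts are of course gameable in the same way attendance is, which is part of what such a case study would probe.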