Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates
Many researchers who study the impact of computer-based assessment (CBA) focus on the affordances or complexities of CBA approaches in comparison to traditional assessment methods. This study examines how CBA approaches were configured within and between modules, and the impact of assessment design on students' engagement, satisfaction, and pass rates. The analysis was conducted using a combination of longitudinal visualisations, correlational analysis, and fixed-effect models on 74 undergraduate modules and their 72,377 students. Our findings indicate that educators designed very different assessment strategies, which significantly influenced student engagement as measured by time spent in the virtual learning environment (VLE). Weekly analyses indicated that assessment activities were balanced with other learning activities, which suggests that educators tended to aim for a consistent workload when designing assessment strategies. Since most of the assessments were computer-based, students spent more time on the VLE during assessment weeks. By controlling for heterogeneity within and between modules, learning design could explain up to 69% of the variability in students' time spent on the VLE. Furthermore, assessment activities were significantly related to pass rates, but no clear relation with satisfaction was found. Our findings highlight the importance of CBA and learning design to how students learn online.
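The fixed-effect approach this abstract describes, controlling for heterogeneity between modules before relating assessment design to engagement, can be sketched with the standard "within" transformation: demean every variable within its module, then run ordinary least squares on the demeaned data. Everything below (variable names, simulated values, the `assessment_hours`/`vle_hours` setup) is an illustrative assumption, not taken from the study's data or code.

```python
import numpy as np

# Simulate students nested in modules, where each module has an unobserved
# effect (e.g. subject-specific workload norms) that would bias naive OLS.
rng = np.random.default_rng(0)

n_modules, per_module = 20, 50
module = np.repeat(np.arange(n_modules), per_module)
module_effect = rng.normal(0, 5, n_modules)[module]  # unobserved heterogeneity

assessment_hours = rng.uniform(0, 10, module.size)   # hypothetical design variable
true_beta = 1.5
vle_hours = true_beta * assessment_hours + module_effect + rng.normal(0, 1, module.size)

def within_demean(x, groups):
    """Subtract each group's mean from its members (the 'within' transformation)."""
    totals = np.zeros(groups.max() + 1)
    np.add.at(totals, groups, x)
    counts = np.bincount(groups)
    return x - (totals / counts)[groups]

# Demeaning removes the module fixed effects, so OLS on the transformed
# data recovers the within-module association.
x_w = within_demean(assessment_hours, module)
y_w = within_demean(vle_hours, module)
beta_hat = (x_w @ y_w) / (x_w @ x_w)  # one-regressor OLS slope
```

With 1,000 simulated students, `beta_hat` lands close to the true slope of 1.5 even though the module effects are much larger than the residual noise, which is the point of the fixed-effect design the study relies on.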
Learning design – making practice explicit
New technologies have immense potential for learning, but the sheer variety possible also creates challenges for learners in terms of navigating through an increasingly complex digital landscape and for teachers in terms of how to design and support learning interventions. How can learners and teachers make informed decisions about what technologies to use in the design and support of learning activities? This presentation will consider this question and present a new methodology for design – 'learning design' – which aims to shift the creation and support of learning from what has traditionally been an implicit, belief-based practice to one that is explicit and design based. Learning design research at the Open University, UK has included the development of a set of conceptual design views, a tool for visualising designs (CompendiumLD) and a social networking site for sharing and discussing learning and teaching ideas and designs (Cloudworks). An overview of this work will be provided, along with a discussion of the perceived benefits of this new approach to educational design.
What would learning in an open world look like? A vision for the future
The pace of current technological advancement is phenomenal. In the last few years we have seen the emergence of ever more sophisticated gaming technologies, rich, immersive virtual worlds and new social networking services that enable learners and teachers to connect and communicate in new ways. The pace of change looks set to continue, as annual Horizon reports testify (http://www.nmc.org/horizon). Clearly new technologies offer much in an educational context, with the promise of flexible, personalised and student-centred learning. Indeed research over the past few years, looking at learners' use of technologies, has given us a rich picture of how learners of all ages are appropriating new tools within their own context, mixing different applications for finding/managing information and for communicating with others (Sharpe and Beetham, forthcoming).
A Heuristic Approach for Dual Expert/End-User Evaluation of Guidance in Visual Analytics
Guidance can support users during the exploration and analysis of complex data. Previous research focused on characterizing the theoretical aspects of guidance in visual analytics and implementing guidance in different scenarios. However, the evaluation of guidance-enhanced visual analytics solutions remains an open research question. We tackle this question by introducing and validating a practical evaluation methodology for guidance in visual analytics. We identify eight quality criteria to be fulfilled and collect expert feedback on their validity. To facilitate actual evaluation studies, we derive two sets of heuristics. The first set targets heuristic evaluations conducted by expert evaluators. The second set facilitates end-user studies where participants actually use a guidance-enhanced system. By following such a dual approach, the different quality criteria of guidance can be examined from two different perspectives, enhancing the overall value of evaluation studies. To test the practical utility of our methodology, we employ it in two studies to gain insight into the quality of two guidance-enhanced visual analytics solutions, one being a work-in-progress research prototype, and the other being a publicly available visualization recommender system. Based on these two evaluations, we derive good practices for conducting evaluations of guidance in visual analytics and identify pitfalls to be avoided during such studies.
Comment: Accepted to IEEE VIS 202
Current Issues in Emerging eLearning, Volume 7, Issue 1: APLU Special Issue on Implementing Adaptive Learning At Scale
The second of two Special Issues of the CIEE journal to have been produced and guest edited by the Personalized Learning Consortium (PLC) of the Association of Public and Land-grant Universities (APLU), featuring important research resulting from university initiatives to launch, implement and scale up the use of adaptive courseware and the strategies of adaptive learning.
Decoding learning: the proof, promise and potential of digital education
With hundreds of millions of pounds spent on digital technology for education every year – from interactive whiteboards to the rise of one-to-one tablet computers – every new technology seems to offer unlimited promise to learning. Many sectors have benefitted immensely from harnessing innovative uses of technology. Cloud computing, mobile communications and internet applications have changed the way manufacturing, finance, business services, the media and retailers operate.
But key questions remain in education: has the range of technologies helped improve learners' experiences and the standards they achieve? Or is this investment just languishing as kit in the cupboard? And what more can decision makers, schools, teachers, parents and the technology industry do to ensure the full potential of innovative technology is exploited? There is no doubt that digital technologies have had a profound impact upon the management of learning. Institutions can now recruit, register, monitor, and report on students with a new economy, efficiency, and (sometimes) creativity. Yet evidence of digital technologies producing real transformation in learning and teaching remains elusive. The education sector has invested heavily in digital technology, but this investment has not yet resulted in radical improvements to learning experiences and educational attainment. In 2011, the Review of Education Capital found that maintained schools spent £487 million on ICT equipment and services in 2009-2010.
Since then, the education system has entered a state of flux with changes to the curriculum, shifts in funding, and increasing school autonomy. While ring-fenced funding for ICT equipment and services has since ceased, a survey of 1,317 schools in July 2012 by the British Educational Suppliers Association found they were assigning an increasing amount of their budget to technology. With greater freedom and enthusiasm towards technology in education, schools and teachers have become more discerning and are beginning to demand more evidence to justify their spending and strategies. This is both a challenge and an opportunity, as it puts schools in greater charge of their spending and use of technology.