
    Immersive Telepresence: A framework for training and rehearsal in a postdigital age


    Squaring the circle: a new alternative to alternative-assessment

    Many quality assurance systems rely on high-stakes assessment for course certification. Such methods are not as objective as they might appear; they can have detrimental effects on student motivation and may lack relevance to the needs of degree courses increasingly oriented to vocational utility. Alternative assessment methods can show greater formative and motivational value for students but are not well suited to the demands of course certification. The widespread use of virtual learning environments and electronic portfolios generates substantial learner activity data, enabling new ways of monitoring and assessing students through Learning Analytics. These emerging practices have the potential to square the circle by generating objective, summative reports for course certification while at the same time providing formative assessment to personalise the student experience. This paper introduces conceptual models of assessment to explore how traditional reliance on numbers and grades might be displaced by new forms of evidence-intensive student profiling and engagement.

    Assessing collaborative learning: big data, analytics and university futures

    Traditionally, assessment in higher education has focused on the performance of individual students. This focus has been a practical as well as an epistemic one: methods of assessment are constrained by the technology of the day, and in the past they required individuals to complete set-piece academic exercises under controlled conditions. Recent advances in learning analytics, drawing upon vast sets of digitally stored student activity data, open new practical and epistemic possibilities for assessment and carry the potential to transform higher education. It is becoming practicable to assess the individual and collective performance of team members working on complex projects that closely simulate the professional contexts that graduates will encounter. In addition to academic knowledge, this authentic assessment can include a diverse range of personal qualities and dispositions that are key to the computer-supported cooperative working of professionals in the knowledge economy. This paper explores the implications of such opportunities for the purpose and practices of assessment in higher education, as universities adapt their institutional missions to address 21st-century needs. The paper concludes with a strong recommendation for university leaders to deploy analytics to support and evaluate the collaborative learning of students working in realistic contexts.

    Big data for monitoring educational systems

    This report considers “how advances in big data are likely to transform the context and methodology of monitoring educational systems within a long-term perspective (10-30 years) and impact the evidence based policy development in the sector”, where big data are defined as “large amounts of different types of data produced with high velocity from a high number of various types of sources.” Five independent experts were commissioned by Ecorys to respond to the themes of students' privacy, educational equity and efficiency, student tracking, assessment and skills. The experts were asked to consider the “macro perspective on governance on educational systems at all levels from primary, secondary education and tertiary – the latter covering all aspects of tertiary from further, to higher, and to VET”, prioritising primary and secondary levels of education.

    Improvement Research Carried Out Through Networked Communities: Accelerating Learning about Practices that Support More Productive Student Mindsets

    The research on academic mindsets shows significant promise for addressing important problems facing educators. However, the history of educational reform is replete with good ideas for improvement that fail to realize the promises that accompany their introduction. As a field, we are quick to implement new ideas but slow to learn how to execute well on them. If we continue to implement reform as we always have, we will continue to get what we have always gotten. Accelerating the field's capacity to learn in and through practice to improve is one key to transforming the good ideas discussed at the White House meeting into tools, interventions, and professional development initiatives that achieve effectiveness reliably at scale. Toward this end, this paper discusses the function of networked communities engaged in improvement research and illustrates the application of these ideas in promoting greater student success in community colleges. Specifically, this white paper:
    * Introduces improvement research and networked communities as ideas that we believe can enhance educators' capacities to advance positive change.
    * Explains why improvement research requires a different kind of measures -- what we call practical measurement -- that are distinct from those commonly used by schools for accountability or by researchers for theory development.
    * Illustrates through a case study how systematic improvement work to promote student mindsets can be carried out. The case is based on the Carnegie Foundation's effort to address the poor success rates for students in developmental math at community colleges. Specifically, this case details:
    - How a practical theory and set of practical measures were created to assess the causes of "productive persistence" -- the set of "non-cognitive factors" thought to powerfully affect community college student success. In doing this work, a broad set of potential factors was distilled into a digestible framework that was useful to practitioners working with researchers, and a large set of potential measures was reduced to a practical (3-minute) set of assessments.
    - How these measures were used by researchers and practitioners for practical purposes -- specifically, to assess changes, predict which students were at risk of course failure, and set priorities for improvement work.
    - How we organized researchers to work with practitioners to accelerate field-based experimentation on everyday practices that promote academic mindsets (what we call alpha labs), and how we organized practitioners to work with researchers to test, revise, refine, and iteratively improve their everyday practices (using plan-do-study-act cycles).
    While significant progress has already occurred, robust, practical, reliable efforts to improve students' mindsets remain at an early formative stage. We hope the ideas presented here are an instructive starting point for new efforts that might attempt to address other problems facing educators, most notably issues of inequality and underperformance in K-12 settings.

    The Evidence Hub: harnessing the collective intelligence of communities to build evidence-based knowledge

    Conventional document and discussion websites provide users with no help in assessing the quality or quantity of evidence behind any given idea. Moreover, the very meaning of what evidence is may not be unequivocally defined within a community, and may require deep understanding, common ground and debate. An Evidence Hub is a tool to pool a community's collective intelligence on what counts as evidence for an idea. It provides an infrastructure for debating and building evidence-based knowledge and practice. An Evidence Hub is best thought of as a filter onto other websites — a map that distills the most important issues, ideas and evidence from the noise by making clear why ideas and web resources may be worth further investigation. This paper describes the Evidence Hub concept and rationale, the breadth of user engagement and the evolution of specific features, derived from our work with different community groups in the healthcare and education sectors.