80 research outputs found

    Marker effects and examination reliability: a comparative exploration from the perspectives of generalizability theory, Rasch modelling and multilevel modelling

    This study explored how three different analysis methods can help us to understand rater effects on examination reliability: generalizability theory (G-theory); item response theory (IRT), in particular the Many-Facets Partial Credit Rasch Model (MFRM); and multilevel modelling (MLM). We used data from AS component papers in geography and psychology for 2009, 2010 and 2011 from Edexcel.
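    As an illustration of the first of these techniques, the one-facet person × marker design from G-theory decomposes score variance into candidate, marker and residual components and yields a generalizability coefficient for the mean of several markers. The sketch below uses simulated scores, not the study's Edexcel data, and the variance magnitudes are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 100 candidates each scored by the same 4 markers.
# True model: score = mu + candidate effect + marker effect + error.
n_p, n_r = 100, 4
person = rng.normal(0, 2.0, size=(n_p, 1))   # candidate ability spread
rater = rng.normal(0, 0.5, size=(1, n_r))    # marker severity/leniency
scores = 50 + person + rater + rng.normal(0, 1.0, size=(n_p, n_r))

# One-facet crossed (p x r) design: ANOVA-based variance component estimates.
grand = scores.mean()
p_means = scores.mean(axis=1, keepdims=True)
r_means = scores.mean(axis=0, keepdims=True)

ms_p = n_r * ((p_means - grand) ** 2).sum() / (n_p - 1)
ms_r = n_p * ((r_means - grand) ** 2).sum() / (n_r - 1)
resid = scores - p_means - r_means + grand
ms_pr = (resid ** 2).sum() / ((n_p - 1) * (n_r - 1))

var_pr = ms_pr                      # residual (interaction + error)
var_p = (ms_p - ms_pr) / n_r        # candidate variance
var_r = (ms_r - ms_pr) / n_p        # marker variance

# Generalizability (relative) and dependability (absolute) coefficients
# for a candidate score averaged over n_r markers.
g_rel = var_p / (var_p + var_pr / n_r)
g_abs = var_p / (var_p + (var_r + var_pr) / n_r)

print(f"var(person)={var_p:.2f}  var(rater)={var_r:.2f}  var(resid)={var_pr:.2f}")
print(f"relative G = {g_rel:.3f}, absolute Phi = {g_abs:.3f}")
```

    The absolute coefficient penalises marker severity differences while the relative coefficient does not, which is one reason the three modelling frameworks can reach different conclusions about the same markers.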

    On the supranational spell of PISA in policy

    Background: PISA results appear to have a large impact upon government policy. The phenomenon is growing, with more countries taking part in PISA testing and politicians pointing to PISA results as reasons for their reforms. Purpose: The aims of this research were to depict the policy reactions to PISA across a number of jurisdictions, to see whether they exhibited similar patterns and whether the same reforms were evident. Sources of evidence: We investigated policy and media reactions to the 2009 and 2012 PISA results in six cases: Canada, China (Shanghai), England, France, Norway and Switzerland. Cases were selected to contrast high-performing jurisdictions (Canada, China) with average performers (England, France, Norway and Switzerland). Countries that had already been well reported on in the literature were excluded (Finland, Germany). Design and methods: Policy documents, media reports and academic articles in English, French, Mandarin and Norwegian relating to each of the cases were critically evaluated. Results: A policy reaction of ‘scandalisation’ was evident in four of the six cases; a technique used to motivate change. Five of the six cases showed ‘standards-based reforms’ and two had reforms in line with the ‘ideal-governance’ model. However, these are categorisations: the actual reforms had significant differences across countries. There are chronological problems with the notion that PISA results were causal with regard to policy in some instances. Countries with similar PISA results responded with different policies, reflecting their differing cultural and historical education system trajectories. Conclusions: The connection between PISA results and policy is not always obvious. The supranational spell of PISA in policy is in the way that PISA results are used as a magic wand in political rhetoric, as though they conjure particular policy choices. This serves as a distraction from the ideological basis for reforms. 
The same PISA results could motivate a range of different policy solutions.

    Framework for the development and evaluation of complex interventions: gap analysis, workshop and consultation-informed update.

    BACKGROUND: The Medical Research Council published the second edition of its framework in 2006 on developing and evaluating complex interventions. Since then, there have been considerable developments in the field of complex intervention research. The objective of this project was to update the framework in the light of these developments. The framework aims to help research teams prioritise research questions and design, and conduct research with an appropriate choice of methods, rather than to provide detailed guidance on the use of specific methods. METHODS: There were four stages to the update: (1) gap analysis to identify developments in the methods and practice since the previous framework was published; (2) an expert workshop of 36 participants to discuss the topics identified in the gap analysis; (3) an open consultation process to seek comments on a first draft of the new framework; and (4) findings from the previous stages were used to redraft the framework, and final expert review was obtained. The process was overseen by a Scientific Advisory Group representing the range of relevant National Institute for Health Research and Medical Research Council research investments. 
RESULTS: Key changes to the previous framework include (1) an updated definition of complex interventions, highlighting the dynamic relationship between the intervention and its context; (2) an emphasis on the use of diverse research perspectives: efficacy, effectiveness, theory-based and systems perspectives; (3) a focus on the usefulness of evidence as the basis for determining research perspective and questions; (4) an increased focus on interventions developed outside research teams, for example changes in policy or health services delivery; and (5) the identification of six 'core elements' that should guide all phases of complex intervention research: consider context; develop, refine and test programme theory; engage stakeholders; identify key uncertainties; refine the intervention; and economic considerations. We divide the research process into four phases: development, feasibility, evaluation and implementation. For each phase we provide a concise summary of recent developments, key points to address and signposts to further reading. We also present case studies to illustrate the points being made throughout. LIMITATIONS: The framework aims to help research teams prioritise research questions and design and conduct research with an appropriate choice of methods, rather than to provide detailed guidance on the use of specific methods. In many of the areas of innovation that we highlight, such as the use of systems approaches, there are still only a few practical examples. We refer to more specific and detailed guidance where available and note where promising approaches require further development. CONCLUSIONS: This new framework incorporates developments in complex intervention research published since the previous edition was written in 2006. As well as taking account of established practice and recent refinements, we draw attention to new approaches and place greater emphasis on economic considerations in complex intervention research. 
We have introduced a new emphasis on the importance of context and the value of understanding interventions as 'events in systems' that produce effects through interactions with features of the contexts in which they are implemented. The framework adopts a pluralist approach, encouraging researchers and research funders to adopt diverse research perspectives and to select research questions and methods pragmatically, with the aim of providing evidence that is useful to decision-makers. FUTURE WORK: We call for further work to develop relevant methods and provide examples in practice. The use of this framework should be monitored, and it should evolve into a more fluid resource in the future, for example a web-based format that can be updated frequently to incorporate new material and links to emerging resources. FUNDING: This project was jointly funded by the Medical Research Council (MRC) and the National Institute for Health Research (Department of Health and Social Care 73514).

    Reprint of: 'A new framework for developing and evaluating complex interventions: update of Medical Research Council guidance'

    The UK Medical Research Council's widely used guidance for developing and evaluating complex interventions has been replaced by a new framework, commissioned jointly by the Medical Research Council and the National Institute for Health Research, which takes account of recent developments in theory and methods and the need to maximise the efficiency, use, and impact of research. Complex interventions are commonly used in the health and social care services, public health practice, and other areas of social and economic policy that have consequences for health. Such interventions are delivered and evaluated at different levels, from individual to societal levels. Examples include a new surgical procedure, the redesign of a healthcare programme, and a change in welfare policy. The UK Medical Research Council (MRC) published a framework for researchers and research funders on developing and evaluating complex interventions in 2000 and revised guidance in 2006 (Campbell et al., 2000; Craig et al., 2008; Craig et al., 2006). Although these documents continue to be widely used and are now accompanied by a range of more detailed guidance on specific aspects of the research process (Craig et al., 2012; Craig et al., 2018; Moore et al., 2015; Moore et al., 2021; O'Cathain et al., 2019), several important conceptual, methodological and theoretical developments have taken place since 2006. These developments have been included in a new framework commissioned by the National Institute for Health Research (NIHR) and the MRC (Skivington et al., 2021). The framework aims to help researchers work with other stakeholders to identify the key questions about complex interventions, and to design and conduct research with a diversity of perspectives and an appropriate choice of methods.

    A new framework for developing and evaluating complex interventions: update of Medical Research Council guidance

    The UK Medical Research Council’s widely used guidance for developing and evaluating complex interventions has been replaced by a new framework, commissioned jointly by the Medical Research Council and the National Institute for Health Research, which takes account of recent developments in theory and methods and the need to maximise the efficiency, use, and impact of research.

    Predictability in high-stakes examinations: students’ perspectives on a perennial assessment dilemma

    Key debates within educational assessment continuously encourage us to reflect on the design, delivery and implementation of examination systems, as well as their relevance to students. In more recent times, such reflections have also required a rethinking of who is authoritative about assessment issues and whose views we seek in order to better understand these perennial assessment dilemmas. This paper considers one such dilemma, predictability in high-stakes assessment, and presents students’ perspectives on this issue. The context is the Irish Leaving Certificate (LC), taken by upper secondary students (aged between 16 and 18) mainly in order to enter tertiary-level education. The data come from 13 group interviews with 81 students across a range of schools in Ireland. Listening to students about complex, high-stakes examining problems has a limited history within the educational assessment literature. The findings from the study address this shortcoming and show how students’ insightful reflections can improve our understanding of these dilemmas. Further, students are well able to reflect on their own situations with regard to high-stakes examining contexts, and they have important contributions to make to our fuller understanding of the elements that will promote high-quality and fair assessment.