    Tracking learning outcomes: evaluation of the impact of Ufi


    Computer Testing to Document Student Achievement of Learning Outcomes

    In lieu of an abstract, here is the letter's first paragraph: To the Editor. Every course should have published learning outcomes describing what students should be able to do upon successful completion of the course.[1] Assessing how well the students have achieved the learning outcomes for the course is very important, as this can provide evidence of learning for the student and a measure of the effectiveness of the course. Evidence of learning is important for accreditation purposes and can provide data that can be used to improve the course.[2] Documenting student achievement of outcomes can be done in many ways, such as having students complete an objective structured clinical evaluation (OSCE), paper, presentation, or project. Other common methods to provide evidence of learning are to give assessments in the form of quizzes and examinations.

    Service-learning @ Lingnan: facts & figures

    This booklet summarizes the results and findings from the ongoing research and evaluation studies of Service-Learning. It provides supporting evidence that Service-Learning enhances students’ development in seven learning outcomes: Subject-Related Knowledge, Communication Skills, Social Competence, Problem-Solving Skills, Research Skills, Organization Skills and Civic Orientation.

    Between analysis and transformation: technology, methodology and evaluation on the SPLICE project

    This paper concerns the ways in which technological change may entail methodological development in e-learning research. The focus of our argument centres on the subject of evaluation in e-learning and how technology can contribute to consensus-building on the value of project outcomes, and to the identification of the mechanisms behind those outcomes. We argue that a critical approach to the methodology of evaluation which harnesses technology in this way is vital to agile and effective policy and strategy-making as institutions grapple with the challenges of transformation in a rapidly changing educational and technological environment. With its focus on mechanisms and multiple stakeholder perspectives, we identify Pawson and Tilley’s ‘Realistic Evaluation’ as an appropriate methodological approach for this purpose, and we report on its use within a JISC-funded project on social software, SPLICE (Social Practices, Learning and Interoperability in Connected Environments). The project created new tools to assist the identification of mechanisms responsible for change to personal and institutional technological practice. These tools included collaborative mind-mapping and focused questioning, and tools for the animated modelling of complex mechanisms. Using these tools, large numbers of project stakeholders could engage in a process in which they were encouraged to articulate and share their theories and ideas as to why project outcomes occurred. This process led towards the identification and agreement of common mechanisms which had explanatory power for all stakeholders. In conclusion, we argue that SPLICE has shown the potential of technologically mediated Realistic Evaluation. Given the technologies we now have, a methodology based on the mass accumulation of stakeholder theories and ideas about mechanisms is feasible. Furthermore, the summative outcomes of such a process are rich in explanatory and predictive power, and therefore useful to the immediate and strategic problems of the sector. Finally, we argue that as well as generating better explanations for phenomena, the evaluation process can itself become transformative for stakeholders.

    Firelight Foundation: An interim evaluation report of the Early Learning Innovation Fund

    In 2014, the Hewlett Foundation selected Management Systems International (MSI) to implement a midterm evaluation of the Early Learning Innovation Fund. This evaluation explores the concept and design of the Fund; progress in achieving the Hewlett Foundation's four intermediary outcomes; and Firelight's implementation of the innovation fund, with a focus on its approach to capacity building and expanding innovative programs. The evaluation also reviews the quality of the sub-grantees' monitoring and evaluation (M&E) systems and explores the potential of conducting an impact evaluation of sub-grantee activities.

    CAL evaluation: Future directions

    Formal, experimental methods have proved increasingly difficult to implement, and lack the capacity to generate detailed results when evaluating the impact of CAL on teaching and learning. The rigid nature of experimental design restricts the scope of investigations and the conditions in which studies can be conducted. It has also consistently failed to account for all influences on learning. In innovative CAL environments, practical and theoretical development depends on the ability to investigate the wide range of such influences fully. Over the past five years, a customizable evaluation framework has been developed specifically for CAL research. The conceptual approach is defined as Situated Evaluation of CAL (SECAL), and the primary focus is on the quality of learning outcomes. Two important principles underpin this development. First, the widely accepted need to evaluate in authentic contexts includes examination of the combined effects of CAL with other resources and influential aspects of the learning environment. Second, evaluation design is based on a critical approach and qualitative, case-based research. Positive outcomes from applications of SECAL include the easy satisfaction of practical and situation-specific requirements and the relatively low cost of evaluation studies. Although there is little scope to produce generalizable results in the short term, the difficulty of doing so in experimental studies suggests that this objective is hard to achieve in educational research more generally. A more realistic, longer-term aim is the development of grounded theory based on common findings from individual cases.

    Using formal game design methods to embed learning outcomes into game mechanics and avoid emergent behaviour

    This paper offers an approach to designing game-based learning experiences inspired by the Mechanics-Dynamics-Aesthetics (MDA) model (Hunicke et al., 2004) and the elemental tetrad model (Schell, 2008) for game design. A case for game-based learning as an active and social learning experience is presented, including arguments from both teachers and game designers concerning the value of games as learning tools. The MDA model is introduced with a classic game-based example and a non-game-based observation of human behaviour demonstrating a negative effect of extrinsic motivators (Pink, 2011) and the need to closely align or embed learning outcomes in game mechanics in order to deliver an effective learning experience. The MDA model is then applied to create a game-based learning experience with the goal of teaching some aspects of using source code control to groups of Computer Science students. First, clear aims in terms of learning outcomes for the game are set out. Following the learning outcomes, the iterative design process is explained, with careful consideration of the impact of specific design decisions on the potential learning experience, the reasons those decisions were made, and where mechanics that contribute to learning may conflict with mechanics included for gameplay. The paper concludes with an evaluation of results from a trial with Computer Science students and staff, including the perceived effectiveness of the game at delivering specific learning outcomes, and an assessment of the game design approach.

    Assessment and learning outcomes: the evaluation of deep learning in an on-line course

    Using an online learning environment, students from European countries collaborated and communicated to carry out problem-based learning in occupational therapy. The effectiveness of this approach was evaluated by means of the final assessments and published learning outcomes. In particular, transcripts from peer-to-peer sessions of synchronous communication were analysed. The SOLO taxonomy was used and the development of deep learning was studied week by week. This allowed the quality of the course to be appraised and showed, to a certain extent, the impact of this online international course on the learning strategies of the students. Results indicate that deep learning can be supported by synchronous communication and online meetings between course participants.