
    Investigating the Essential of Meaningful Automated Formative Feedback for Programming Assignments

    This study investigated the essentials of meaningful automated feedback for programming assignments. Three types of feedback were tested: (a) What's wrong - what the test cases were checking and which ones failed, (b) Gap - comparisons between expected and actual outputs, and (c) Hint - hints on how to fix the problems when test cases failed. Forty-six students taking a CS2 course participated in the study. They were divided into three groups, each with a different feedback configuration: (1) Group One - What's wrong, (2) Group Two - What's wrong + Gap, (3) Group Three - What's wrong + Gap + Hint. The study found that simply knowing what failed did not help students sufficiently and could encourage system-gaming behavior. Hints were not found to have a measurable impact on student performance or on students' use of the automated feedback. Based on these findings, the study provides practical guidance on the design of automated feedback.

    Mining Student Submission Information to Refine Plagiarism Detection

    Plagiarism is becoming an increasingly important issue in introductory programming courses. There are several tools to assist with plagiarism detection, but they are not effective for more basic programming assignments, like those in introductory courses. The proliferation of auto-grading platforms creates an opportunity to capture additional information about how students develop the solutions to their programming assignments. In this research, we identify how to extract information from an online auto-grading platform, Mimir Classroom, that can be useful in revealing patterns in solution development. We explore how and to what extent this additional information can be used to better support instructors when identifying cases of probable plagiarism. We have developed a tool that takes the raw student assignment submissions from Mimir, analyzes them, and produces data sets and visualizations that help instructors refine the information extracted by existing plagiarism detection platforms. Instructors can then use this information to further investigate any probable cases of plagiarism found by the tool. Our main goal is to give insight into student behaviors and identify signals that can be effective indicators of plagiarism. Furthermore, the framework can enable the analysis of other aspects of students' solution development processes that may be useful when reasoning about their learning. As an initial exploration scenario of the framework developed in this work, we used student code submissions from the CSCE 121: Introduction to Program Design and Concepts course at Texas A&M University, experimenting with submissions from the Fall 2018 and Fall 2019 offerings of the course.
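
    To make the kind of signal such a tool might look for concrete, the sketch below combines two pieces of information an auto-grader can export: how similar two students' final submissions are, and how much incremental development history each student has. This is not the authors' actual implementation; the data layout, student identifiers, and thresholds are illustrative assumptions. A near-identical final solution that appears with almost no intermediate submissions is the sort of pattern an instructor might investigate further.

    import difflib
    from itertools import combinations

    # Hypothetical export of per-student submission histories from an
    # auto-grader: student ID -> ordered list of submitted source versions.
    submissions = {
        "student_a": ["int main(){}", "int main(){ return 0; }"],
        "student_b": ["int main(){ return 0; }"],
        "student_c": ["int x;", "int x = 1;", "int main(){ int x = 1; return x; }"],
    }

    def final_similarity(history_a, history_b):
        """Character-level similarity (0.0-1.0) of two students' final submissions."""
        return difflib.SequenceMatcher(None, history_a[-1], history_b[-1]).ratio()

    def flag_pairs(subs, sim_threshold=0.9, min_history=2):
        """Flag pairs whose final code is very similar and where at least one
        student shows almost no incremental development history."""
        flagged = []
        for (id_a, hist_a), (id_b, hist_b) in combinations(subs.items(), 2):
            score = final_similarity(hist_a, hist_b)
            short_history = min(len(hist_a), len(hist_b)) < min_history
            if score >= sim_threshold and short_history:
                flagged.append((id_a, id_b, round(score, 2)))
        return flagged

    print(flag_pairs(submissions))

    Real submission data from a platform such as Mimir would be far richer (timestamps, per-submission test results), and a production tool would likely use a token-based similarity measure rather than raw character matching; the sketch only illustrates the idea of pairing final-code similarity with development-history signals.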

    The relationship between online tutorials and academic performance in distance education: a predictive framework for Open University, Indonesia

    This study applied different machine learning approaches to predict students' performance by analysing and identifying the e-learning features that most strongly affect it, with the aim of uncovering new patterns and supporting meaningful innovation. Moreover, the resulting prediction framework can be implemented in many other fields.
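
    The abstract does not specify which models or features were used, so the following is only a minimal sketch of the general approach it describes: training a classifier on engagement features from an e-learning platform to predict a pass/fail outcome. The feature names, the synthetic data, and the choice of logistic regression are assumptions made for illustration, not the study's actual setup.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    n = 200
    # Synthetic engagement features: [tutorial sessions, forum posts, quiz average].
    X = np.column_stack([
        rng.integers(0, 30, n),    # online tutorial sessions attended
        rng.integers(0, 50, n),    # forum posts written
        rng.uniform(0, 100, n),    # average quiz score
    ])
    # Synthetic pass/fail label loosely tied to engagement, for illustration only.
    y = (0.5 * X[:, 0] + 0.2 * X[:, 1] + 0.3 * X[:, 2]
         + rng.normal(0, 5, n) > 30).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))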

    Student Perceptions of Academic Integrity in an Online Psychiatric Nurse Education

    This study explored student perceptions of academic integrity at an online nursing college in Western Canada with a high rate of plagiarism. Social cognitive theory and the theory of student cheating and plagiarism were used as the conceptual framework for this case study. Participants from a second-year cohort were sent a survey gauging their interest in participation. Ten students were then interviewed to better understand the decision making they used to ensure academic integrity during their program of study. A qualitative exploratory case study approach was employed; each interview was recorded and transcribed, and the interview data were coded with numeric identifiers to ensure confidentiality. Themes that emerged from the study included (a) deficits in APA knowledge, (b) assignment instructions and academic writing in the curriculum, (c) frustrations unique to learning online, and (d) institutional issues, including teacher tactics to reduce plagiarism and the need for additional composition-skill development resources. Student perceptions of academic integrity informed their decision to plagiarize, and the investigation suggested the need for a more holistic orientation experience aimed at decreasing incidents of academic integrity violations. Academic leadership at the college in Western Canada may benefit from this study, as its insights led to a project to construct a virtual writing lab with resources for faculty and students. Increasing student awareness of the importance of academic integrity in decision making has positive social change implications for ensuring the quality of online psychiatric nurse education for new nurses who support a vulnerable and marginalized population.

    The Future of Information Sciences: INFuture2009: Digital Resources and Knowledge Sharing


    Annual report on research activities 2005-2006
