Customizable and scalable automated assessment of C/C++ programming assignments

Abstract

The correction of exercises in programming courses is a laborious task that has traditionally been performed manually. This, in turn, delays students' access to feedback that can contribute significantly to their training as future professionals. Over the years, several approaches have been proposed to automate the assessment of students' programs. Static analysis is a well-known technique that can partially simulate the manual code review performed by lecturers. As such, it is a plausible option for assessing whether students' solutions meet the requirements imposed on the assignments. However, implementing a personalized analysis beyond the rules included in existing tools can be a complex task for the lecturer without a mechanism that guides the work. In this paper, we present a method to provide automated, specific feedback that immediately informs students about their mistakes in programming courses. To that end, we developed the CAC++ library, which enables the construction of tailored static analysis programs for C/C++ assignments. The library allows great flexibility and personalization of verifications, so they can be adjusted to each particular task, overcoming the limitations of most existing assessment tools. Our approach to providing specific feedback was evaluated over three academic years in a course on object-oriented programming. The library allowed lecturers to reduce the size of the static analysis programs developed for this course. During this period, academic results improved and undergraduates positively valued the aid offered when undertaking the implementation of assignments.

Funding: Universidad de Cádiz, Grant/Award Numbers: sol-201500054192-tra, sol-201600064680-tra; Ministerio de Ciencia, Innovación y Universidades, Grant/Award Number: RTI2018-093608-B-C33; European Regional Development Fund.
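To illustrate the kind of per-assignment verification the abstract refers to, the sketch below shows a minimal, hand-rolled static check a lecturer might write for one task. It is an assumption-laden illustration only: it does not use the CAC++ API (not shown here), and the file name, the required class name Inventory, and the forbidden-call rule are hypothetical examples of assignment-specific requirements.

```cpp
// Minimal sketch of a tailored static check for one C++ assignment.
// Hypothetical rules: the solution must declare a class named Inventory
// and must not call system(). This is NOT the CAC++ library's interface.
#include <fstream>
#include <iostream>
#include <regex>
#include <sstream>
#include <string>

int main(int argc, char* argv[]) {
    if (argc < 2) {
        std::cerr << "usage: check <student_source.cpp>\n";
        return 2;
    }
    std::ifstream in(argv[1]);
    std::stringstream buffer;
    buffer << in.rdbuf();                    // load the student's source file
    const std::string code = buffer.str();

    int errors = 0;
    // Rule 1 (hypothetical): the assignment requires a class named Inventory.
    if (!std::regex_search(code, std::regex(R"(\bclass\s+Inventory\b)"))) {
        std::cout << "Missing required class 'Inventory'\n";
        ++errors;
    }
    // Rule 2 (hypothetical): direct calls to system() are not allowed.
    if (std::regex_search(code, std::regex(R"(\bsystem\s*\()"))) {
        std::cout << "Forbidden call to system() found\n";
        ++errors;
    }
    std::cout << (errors == 0 ? "All checks passed\n" : "Checks failed\n");
    return errors == 0 ? 0 : 1;
}
```

A real tool would report such findings as immediate feedback to the student; the point of a dedicated library such as CAC++ is to express checks like these more concisely than ad hoc programs of this kind.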
