
    Facilitating Teaching And Research On Open Ended Problem Solving Through The Development Of A Dynamic Computer Tool

    Model Eliciting Activities (MEAs) are realistic open-ended problems set in engineering contexts; student teams draw on their diverse experiences both in and out of the classroom to develop a mathematical model explicated in a memo to the client. These activities have been implemented in a required first-year engineering course with enrollments of as many as 1700 students in a given semester. The earliest MEA implementations had student teams write a single solution to a problem in the form of a memo to the client and receive feedback from their TA. For research purposes, a simple static online submission form, a static feedback form, and a single database table were quickly developed. Over time, research revealed that students need multiple feedback, revision, and reflection points to address misconceptions and achieve high-quality solutions. As a result, the toolset has been expanded, patched, and re-patched by multiple developers to increase both the functionality and the security of the system. Because the class is so large and the implementation sequence is not trivial, the technology has become necessary to successfully manage the implementation of MEAs in the course. The resulting system has become a kludge of bloated, inflexible code that now requires a part-time graduate student to manage the deployment of 2-4 MEAs per semester. New functions are desired but are either not compatible with, or too cumbersome to implement under, the existing architecture. Based on this, a new system is currently being developed to allow for greater flexibility, easier expandability, and expanded functionality. The largest feature set being developed for the new system is the collection of administrative tools to ease the deployment process. Other planned features include the ability for students to upload files and images as part of their solutions.
This paper will describe the history of the MEA Learning System (MEALS) and the lessons learned about developing custom teaching and research software, and will explore how custom software tools can facilitate the dual roles of teaching and educational research.
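The single-table design described in the abstract can be pictured with a minimal sketch. The table and column names below are assumptions for illustration only, not the actual MEALS schema.

```python
import sqlite3

# Hypothetical sketch of the original single-table submission store;
# names and fields are invented, not the real MEALS database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE mea_submissions (
        id INTEGER PRIMARY KEY,
        team_id INTEGER NOT NULL,
        mea_number INTEGER NOT NULL,      -- which of the 2-4 MEAs that semester
        memo_text TEXT NOT NULL,          -- the team's memo to the client
        ta_feedback TEXT                  -- filled in after TA review
    )
""")
conn.execute(
    "INSERT INTO mea_submissions (team_id, mea_number, memo_text) VALUES (?, ?, ?)",
    (42, 1, "Dear client: our model ranks the options by..."),
)
row = conn.execute(
    "SELECT team_id, ta_feedback FROM mea_submissions WHERE mea_number = 1"
).fetchone()
print(row)  # (42, None) -- no TA feedback recorded yet
```

A design like this has no room for multiple feedback/revision cycles per team, which is consistent with the abstract's account of why the system had to be repeatedly patched.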

    Comparing students' solutions to an open-ended problem in an introductory programming course with and without explicit modeling interventions

    Engineers must understand how to build, apply, and adapt various types of models in order to be successful. Throughout undergraduate engineering education, modeling is fundamental for many core concepts, though it is rarely explicitly taught. There are many benefits to explicitly teaching modeling, particularly in the first years of an engineering program. The research questions that drove this study are: (1) How do students' solutions to a complex, open-ended problem (both written and coded solutions) develop over the course of multiple submissions? and (2) How do these developments compare across groups of students that did and did not participate in a course centered around modeling? Students' solutions to an open-ended problem across multiple sections of an introductory programming course were explored. These sections were divided into two groups: (1) an experimental group, whose sections discussed and utilized mathematical and computational models explicitly throughout the course, and (2) a comparison group, whose sections focused on developing algorithms and writing code with a more traditional approach. All sections required students to complete a common open-ended problem that consisted of two versions (the first with a smaller data set, the second with a larger one). Each version had two submissions: (1) a mathematical model or algorithm (i.e., students' written solution, potentially with tables and figures) and (2) a computational model or program (i.e., students' MATLAB code). The students' solutions were graded by student graders who had completed two required training sessions, in which they assessed multiple sample student solutions using the rubrics to ensure consistency across grading. The resulting rubric-based assessments of students' work were analyzed to identify patterns in students' submissions and to compare across sections.
The results identified differences in mathematical and computational model development between students from the experimental and comparison groups. The students in the experimental group were better able to address the complexity of the problem. Most groups demonstrated similar levels and types of change across the submissions for the other dimensions related to the purpose of model components, addressing the users' anticipated needs, and communicating their solutions. These findings help inform other researchers and instructors how to help students develop mathematical and computational modeling skills, especially in a programming course. This work is part of a larger NSF study about the impact of varying levels of modeling interventions related to different types of models on students' awareness of different types of models and their applications, as well as their ability to apply and develop different types of models.
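As a rough illustration of the kind of between-group comparison the study performed, the sketch below computes the mean rubric-score change from first to final submission for each group. The scores, scale, and group sizes are invented for the example and are not the study's data.

```python
# Illustrative only: invented (first, final) rubric scores per team on a 0-10
# scale, split by instructional condition as described in the abstract.
def mean_change(teams):
    """Average (final - first) rubric score across teams."""
    deltas = [final - first for first, final in teams]
    return sum(deltas) / len(deltas)

experimental = [(4, 8), (5, 9), (3, 7), (6, 9)]   # modeling-focused sections
comparison   = [(4, 6), (5, 7), (3, 5), (6, 7)]   # traditional sections

exp_gain = mean_change(experimental)
cmp_gain = mean_change(comparison)
print(exp_gain, cmp_gain)  # 3.75 1.75
```

In the actual study the comparison was made per rubric dimension (model purpose, users' needs, communication), not on a single aggregate score.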

    Change in student understanding of modeling during first year engineering courses

    All engineers must be able to apply and create models to be effective problem solvers, critical thinkers, and innovative designers. To be more successful in their studies and careers, students need a foundational knowledge about models. An adaptable approach can help students develop their modeling skills across a variety of modeling types, including physical models, mathematical models, logical models, and computational models. Physical models (e.g., prototypes) are the most common type of model that engineering students identify and discuss during the design process. There is a need to explicitly focus on varying types of models, model application, and model development in the engineering curriculum, especially on mathematical and computational models. This NSF project proposes two approaches to creating a holistic modeling environment for learning at two universities, which require different levels of revision to their existing first-year engineering courses or programs. Both approaches shift to a unified language and discussion around modeling, with the intent of contextualizing modeling as a fundamental tool within engineering. To evaluate student learning on modeling in engineering, we conducted pre- and post-surveys across three different first-year engineering courses at these two universities with different student demographics. The comparison between the pre- and post-surveys highlighted student learning on engineering modeling based on the different teaching and curriculum change approaches.

    Selecting Effective Examples to Train Students for Peer Review of Open‐Ended Problem Solutions

    Background Students conducting peer review on authentic artifacts require training. In the training studied here, individual students reviewed (scored and provided feedback on) a randomly selected prototypical solution to a problem. Afterwards, they were shown a side-by-side comparison of their review and an expert's review, along with prompts to reflect on the differences and similarities. Individuals were then assigned a peer team's solution to review. Purpose This paper explores how the characteristics of five different prototypical solutions used in training (and their associated expert evaluations) impacted students' abilities to score peer teams' solutions. Design/Method An expert rater scored the prototypical solutions and the 147 student teams' solutions that were peer reviewed, using an eight-item rubric. Differences between the scores assigned by the expert and a student to a prototypical solution and to an actual team solution were used to compute a measure of the student's improvement as a peer reviewer from training to actual peer review. ANOVA testing with Tukey's post-hoc analysis was conducted to identify statistical differences in improvement based on the prototypical solutions students saw during the training phase. Results Statistically significant differences in the amount of error a student made during peer review were found between high- and low-quality prototypical solutions seen during training. Specifically, a lower-quality training solution (and its associated expert evaluation) resulted in more accurate scoring during peer review. Conclusions While students typically ask to see exemplars of “good solutions”, this research suggests that, for the purpose of preparing students to score peers' solutions, there is likely greater value in students seeing a low-quality solution and its corresponding expert review.
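The improvement measure and the ANOVA step can be sketched in pure Python. The improvement values below are invented, and only two training conditions are shown; the real analysis covered five prototypical solutions, 147 team solutions, and Tukey's post-hoc comparisons.

```python
# Sketch of a one-way ANOVA F statistic over per-student improvement scores
# (|training scoring error| - |peer-review scoring error|), grouped by which
# prototypical solution the student saw in training. Data are invented.
def f_statistic(groups):
    """One-way ANOVA F: between-group variance over within-group variance."""
    all_vals = [x for g in groups for x in g]
    grand = sum(all_vals) / len(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

low_quality_proto  = [3.0, 2.5, 3.5, 2.0]   # larger improvement after training
high_quality_proto = [1.0, 0.5, 1.5, 1.0]

F = f_statistic([low_quality_proto, high_quality_proto])
print(round(F, 2))
```

A large F relative to the F distribution's critical value (here with 1 and 6 degrees of freedom) would indicate that training-solution quality affects reviewer accuracy, which is the pattern the paper reports.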

    First Year Engineering Students’ Identification of Models in Engineering

    Background To succeed in engineering careers, students must be able to create and apply models to certain problems. The different types of models include physical, mathematical, computational, graphical, and financial, and they are used in academics, research, and industry. However, many students struggle to define, create, and apply relevant models in their engineering courses. Purpose (Research Questions) The research questions investigated in this study are: (1) What types of models do engineering students identify before and after completing a first-year engineering course? (2) How do students' responses compare across different courses (a graphical communications course, EGR 120, and a programming course, EGR 115) and sections? Design/Methods The data used for this study were collected in two introductory first-year engineering courses offered during Fall 2019, EGR 115 and EGR 120. Students' responses to a survey about modeling were qualitatively analyzed. The survey was given at the beginning and the end of the courses. The data analyzed consisted of 560 pre and post surveys for EGR 115 and 384 pre and post surveys for EGR 120. Results Once the analysis is complete, we hope to find that the students can better define and apply models in their engineering courses after they have completed the EGR 115 and/or EGR 120 courses.

    Student Awareness of Models in First-Year Engineering Courses

    Contribution: This study assesses more than 800 students' awareness of engineering model types before and after taking two first-year engineering courses across two semesters and evaluates the effect of each course. Background: All engineers must be able to apply and create models to be effective problem solvers, critical thinkers, and innovative designers. To help them develop these skills, as a first step, it is essential to assess how to increase students' awareness of engineering models. According to Bloom's taxonomy, the lower remember and understand levels, which encompass awareness, are necessary for achieving the higher levels, such as apply, analyze, evaluate, and create. Research Questions: To what extent did student awareness of model types change after taking introductory engineering courses? To what extent did student awareness of model types differ by course or semester? Methodology: In this study, a survey was designed and administered at the beginning and end of the semester in two first-year engineering courses during two semesters at a mid-sized private school. The survey asked students questions about their definition of engineering modeling and different types of models. Findings: Overall, student awareness of model types increased from the beginning of the semester to the end of the semester, across both semesters and courses. There were some differences between course sections; however, the students' awareness of models at the end of the academic year was similar for both groups.

    Types of Models Identified by First-Year Engineering Students

    This is a Complete Research paper. Understanding models is important for engineering students, but modeling is not often taught explicitly in first-year courses. Although there are many types of models in engineering, studies have shown that engineering students most commonly identify prototyping or physical models when asked about modeling. In order to evaluate students' understanding of different types of models used in engineering and the effectiveness of interventions designed to teach modeling, a survey was developed. This paper describes the development of a framework to categorize the types of engineering models that first-year engineering students discuss, based on both previous literature and students' responses to survey questions about models. In Fall 2019, the survey was administered to first-year engineering students to investigate their awareness of types of models and their understanding of how to apply different types of models in solving engineering problems. Students' responses to three questions from the survey were analyzed in this study: (1) What is a model in science, technology, engineering, and mathematics (STEM) fields? (2) List different types of models that you can think of. (3) Describe each different type of model you listed. Responses were categorized by model type, and the framework was updated through an iterative coding process. After four rounds of analysis of 30 different students' responses, an acceptable percentage agreement was reached between the independent researchers coding the data. Resulting frequencies of the various model types identified by students are presented along with representative student responses to provide insight into students' understanding of models in STEM. This study is part of a larger project to understand the impact of modeling interventions on students' awareness of models and their ability to build and apply models.
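The inter-rater percentage-agreement check described above can be sketched as follows. The model-type codes and responses are invented, and the study's actual agreement threshold and framework categories are not reproduced here.

```python
# Minimal sketch of percentage agreement between two independent coders
# assigning model-type categories to the same student responses. Invented data.
def percent_agreement(coder_a, coder_b):
    """Fraction of responses assigned the same code by both coders."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# codes assigned to the same 10 student responses by two researchers
coder_a = ["physical", "math", "computational", "physical", "graphical",
           "math", "physical", "computational", "math", "physical"]
coder_b = ["physical", "math", "computational", "prototype", "graphical",
           "math", "physical", "computational", "math", "physical"]

agreement = percent_agreement(coder_a, coder_b)
print(agreement)  # 0.9
```

In practice, disagreements like the "physical" vs. "prototype" response above are what drive the iterative refinement of the coding framework between rounds.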