Gender differences in self reported long term outcomes following moderate to severe traumatic brain injury
Background: The majority of research on health outcomes after traumatic brain injury has focused on male participants, and information on gender differences in health outcomes after traumatic brain injury is limited. The purpose of this study was to investigate gender differences in symptoms reported after a traumatic brain injury and to examine the degree to which these symptoms interfere with daily functioning. Methods: This is a secondary analysis of data from a retrospective cohort study of 306 individuals who sustained a moderate to severe traumatic brain injury 8 to 24 years earlier. Data were collected using the Problem Checklist (PCL) from the Head Injury Family Interview (HIFI). Group differences between women and men were explored using chi-square and Wilcoxon analyses with Bonferroni correction. Results: Chi-square analysis by gender revealed that significantly more men reported difficulty setting realistic goals and restlessness, whereas significantly more women reported headaches, dizziness, and loss of confidence. Wilcoxon analysis by gender revealed that men rated sensitivity to noise and sleep disturbances as significantly more problematic than women did, whereas women rated lack of initiative and needing supervision as significantly more problematic in daily functioning. Conclusion: This study provides insight into gender differences in outcomes after traumatic brain injury. There are significant differences between the problems reported by men and women, which may assist health service planners and clinicians in developing programs for individuals with brain injury.
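The group-comparison approach described above, chi-square tests per symptom with a Bonferroni-adjusted significance threshold, can be sketched as follows. The symptom tables, counts, and group sizes here are invented for illustration and are not the study's data.

```python
# Bonferroni-corrected chi-square screening of symptom reports,
# loosely following the analysis described in the abstract.
# All counts below are hypothetical, not the study's data.
from scipy.stats import chi2_contingency

# Each 2x2 table: rows = [reported, did not report], cols = [women, men]
symptoms = {
    "headaches":    [[40, 20], [60, 80]],
    "restlessness": [[25, 45], [75, 55]],
}

alpha = 0.05 / len(symptoms)  # Bonferroni-adjusted per-test threshold

def significant_symptoms(tables, alpha):
    """Return {symptom: p-value} for tests clearing the adjusted alpha."""
    flagged = {}
    for name, table in tables.items():
        _stat, p, _dof, _expected = chi2_contingency(table)
        if p < alpha:
            flagged[name] = p
    return flagged

print(sorted(significant_symptoms(symptoms, alpha)))
```

Dividing alpha by the number of tests is the standard Bonferroni adjustment; with many symptoms on a checklist, the per-test threshold becomes correspondingly strict.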
Improving Conceptual Learning in Introductory Astronomy through Mental Model Building
Misconceptions about the cause and process of the lunar phases persist for many students. In this project, the authors worked with over 350 students in introductory astronomy and physical science classes. Students completed an observational project based on the Mental Model Building methodology: they recorded their preconceptions, then used their own 3-D observations of the Moon to build a revised, complete spatial model describing the lunar phases. Project evaluation was done using an established instrument, the Lunar Phases Concept Inventory (LPCI). Detailed analysis of pre- and post-project scores shows significant gains in student learning. Exploratory factor analysis of the LPCI questions indicates two to three themes that can guide project improvements. Item analysis of the LPCI results indicates that most questions discriminate clearly between the highest- and lowest-performing students on the post-project test, suggesting that the LPCI is a reliable tool for project evaluation.
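The item analysis described above is commonly done with a classical discrimination index: compare how often the top- and bottom-scoring groups answer each question correctly. A minimal sketch, with a made-up 0/1 response matrix standing in for the LPCI data:

```python
# Classical item-discrimination index: D = p(correct | top group)
# minus p(correct | bottom group). A high D means the item separates
# strong from weak performers. The response matrix is hypothetical.

def discrimination_index(responses, item, frac=0.27):
    """responses: list of per-student lists of 0/1 item scores.
    frac: fraction of students in each extreme group (27% is conventional)."""
    ranked = sorted(responses, key=sum)            # lowest total scores first
    k = max(1, int(len(ranked) * frac))
    bottom, top = ranked[:k], ranked[-k:]
    p_top = sum(s[item] for s in top) / k
    p_bottom = sum(s[item] for s in bottom) / k
    return p_top - p_bottom

# Hypothetical 0/1 scores for 10 students on 3 items
scores = [
    [1, 1, 1], [1, 1, 0], [1, 0, 1], [1, 1, 1], [0, 1, 1],
    [0, 0, 1], [0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 0],
]
print([discrimination_index(scores, i) for i in range(3)])
```

In this toy data every item earns D = 1.0 because the top group answers everything correctly and the bottom group nothing; real inventories show a spread of D values, and items near zero (or negative) are the ones flagged as poor discriminators.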
The Lunar Phases Project: A Mental Model-Based Observational Project for Undergraduate Nonscience Majors
We present our Lunar Phases Project, an ongoing effort that uses students' actual observations within a mental model building framework to improve student understanding of the causes and process of the lunar phases. We implement this project with a sample of undergraduate, nonscience-major students enrolled at a midsized public university in the southeastern United States. To quantitatively assess the activity, we use the Lunar Phases Concept Inventory, a research-validated assessment instrument. We observe significant gains in understanding of the lunar phases for students who complete the Lunar Phases Project.
Assessing Students’ Earth History Misconceptions Within an Introductory Geology Course
Assessing misconceptions within introductory geology courses can be difficult because they often go unidentified. Philips (1991) suggested that commonly held geology misconceptions should be identified before instruction begins. In this study we gave students a 15-item pre- and post-test, generated from the Geoscience Concept Inventory (GCI; Libarkin & Anderson, 2005), to assess the Earth history misconceptions they bring to an introductory geology course at a mid-size public university in the southeastern United States. The test is also used to measure students' learning gains in geology; those results will be presented at another time. Initial results for 51 students are below. GCI question 7 assessed what students believed the Earth looked like when it first formed. On the pre-test, 20% (d value = 0.29) of students answered this question correctly, while 55% thought Pangaea was correct. After instruction, 39% (d value = 0.51) answered correctly, yet 45% still thought Pangaea was correct. GCI question 28 assessed students' misconceptions about how life on Earth has changed over time. On the pre-test, 74% of students correctly identified the order of the changes of life on Earth, but only 29% (d value = 0.48) could place that order in a correct time sequence; 26% thought humans and dinosaurs lived at the same time. After instruction, 41% (d value = 0.50) answered this question correctly.
From our initial assessment we conclude that: 1) students do not have a strong comprehension of the Earth's history when they enter an introductory geology course; 2) even after instruction, students hold the misconception that Pangaea represents what the Earth looked like when it formed; and 3) students understand the sequence of changes in life on Earth but do not understand this sequence with respect to geologic time. Additionally, 15 students in a 3000/5000-level geology course were given the same 15-item GCI test; ≥60% of these students answered both Earth history questions correctly, suggesting that their initial misconceptions may have been resolved by further instruction.
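Concept-inventory studies like the two above summarize pre/post change with effect-size measures; the abstract reports "d values" without giving the formula, so as one plausible, widely used summary, here is Hake's normalized gain applied to the percentages reported for GCI question 7. The choice of metric is an assumption for illustration, not necessarily the authors' calculation.

```python
# Hake's normalized gain: the fraction of the possible improvement
# (from pre-test score up to 100%) that students actually realized.
# Input percentages are the abstract's reported values for GCI q7.

def normalized_gain(pre_pct, post_pct):
    """Normalized gain g = (post - pre) / (100 - pre)."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

g_q7 = normalized_gain(20, 39)   # pre: 20% correct, post: 39% correct
print(round(g_q7, 3))
```

By convention g below 0.3 is a "low" gain, 0.3 to 0.7 "medium", and above 0.7 "high"; question 7's roughly 0.24 would sit in the low range, consistent with the persistence of the Pangaea misconception the abstract describes.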
Validation of a Novel Statistical Method to Identify Aberrant Patient Logging: A Multi-Institutional Study
INTRODUCTION: Student patient encounter logging informs the quality of supervised clinical practice experiences (SCPEs). Yet it is unknown whether logs accurately reflect patient encounters, and the faculty resources needed to review logs for potentially aberrant entries are significant. The purpose of this study was to validate a statistical method for identifying aberrant logging. METHODS: A multi-institutional (n = 6) study examined a statistical method for identifying potentially aberrant logging behavior. An automated Mahalanobis Distance (MD) measurement was used to categorize student logs as aberrant if they were identified as probable multivariate outliers. This approach was validated against a gold standard for aberrant logging behavior, established by manual review by 4 experienced faculty ("faculty consensus"), by comparing interrater agreement between faculty and MD-based categorization. In secondary analyses, we compared the relative accuracy of MD-based categorization to individual faculty categorizing data from their own program ("own program categorization"). RESULTS: 323 student logging records from 6 physician assistant (PA) programs were included. Compared to faculty consensus (the gold standard), MD-based categorization was highly sensitive (0.846, 95% CI: 0.650, 1.000) and specific (0.766, 95% CI: 0.645, 0.887). Additionally, there was no significant difference in sensitivity, specificity, positive predictive value, or negative predictive value between MD-based categorization and own program categorization. DISCUSSION: The MD-based method of identifying aberrant and nonaberrant student logging compared favorably to the more traditional, faculty-intensive approach of reviewing individual student logging records. This supports MD-based screening as a less labor-intensive alternative to individual faculty review for identifying aberrant logging. Identification of aberrant logging may enable early intervention with students to improve clinical exposure logging during their SCPEs.
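The MD-based screening described above can be sketched as follows: compute each student's Mahalanobis distance from the cohort mean of a logging-feature vector and flag probable multivariate outliers with a chi-square cutoff. The features, threshold, and data here are illustrative assumptions, not the study's actual variables or method details.

```python
# Mahalanobis-distance screening for aberrant logging records.
# Squared MDs of multivariate-normal data follow approximately a
# chi-square distribution with p (number of features) degrees of
# freedom, which gives a natural outlier cutoff.
import numpy as np
from scipy.stats import chi2

def flag_aberrant(X, alpha=0.01):
    """Boolean mask: rows whose squared MD exceeds the chi-square
    critical value at 1 - alpha."""
    X = np.asarray(X, dtype=float)
    diff = X - X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
    # squared Mahalanobis distance of each row from the mean
    md2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)
    cutoff = chi2.ppf(1 - alpha, df=X.shape[1])
    return md2 > cutoff

# Hypothetical per-student features: [logs per day, mean minutes per log]
rng = np.random.default_rng(0)
X = rng.normal(loc=[8.0, 6.0], scale=[1.0, 1.5], size=(50, 2))
X[0] = [25.0, 0.5]   # an implausible logging pattern

print(flag_aberrant(X).nonzero()[0])
```

One design note: using the plain sample mean and covariance (as here) lets extreme outliers inflate the covariance and partially mask themselves; robust estimators such as the minimum covariance determinant are a common refinement when many records may be aberrant.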