
    Comparing Categorical and Probabilistic Fingerprint Evidence

    Fingerprint examiners traditionally express conclusions in categorical terms, opining that impressions do or do not originate from the same source. Recently, probabilistic conclusions have been proposed, with examiners estimating the probability of a match between recovered and known prints. This study presented a nationally representative sample of jury‐eligible adults with a hypothetical robbery case in which an examiner opined on the likelihood that a defendant's fingerprints matched latent fingerprints in categorical or probabilistic terms. We studied model language developed by the U.S. Defense Forensic Science Center to summarize results of statistical analysis of the similarity between prints. Participant ratings of the likelihood that the defendant left prints at the crime scene and committed the crime were similar when exposed to categorical and strong probabilistic match evidence. Participants reduced these likelihoods when exposed to the weaker probabilistic evidence, but did not otherwise discriminate among the prints assigned different match probabilities.

    How Cross‐Examination on Subjectivity and Bias Affects Jurors’ Evaluations of Forensic Science Evidence

    Contextual bias has been widely discussed as a possible problem in forensic science. The trial simulation experiment reported here examined reactions of jurors at a county courthouse to cross‐examination and arguments about contextual bias in a hypothetical case. We varied whether the key prosecution witness (a forensic odontologist) was cross‐examined about the subjectivity of his interpretations and about his exposure to potentially biasing task‐irrelevant information. Jurors found the expert less credible and were less likely to convict when the expert admitted that his interpretation rested on subjective judgment, and when he admitted having been exposed to potentially biasing task‐irrelevant contextual information (relative to when these issues were not raised by the lawyers). The findings suggest, however, that forensic scientists can immunize themselves against such challenges and maximize the weight jurors give their evidence by adopting context management procedures that blind them to task‐irrelevant information.

    Creating an Instrument to Measure Student Response to Instructional Practices

    Background: Calls for the reform of education in science, technology, engineering, and mathematics (STEM) have inspired many instructional innovations, some research‐based. Yet adoption of such instruction has been slow. Research has suggested that students' response may significantly affect an instructor's willingness to adopt different types of instruction.
    Purpose: We created the Student Response to Instructional Practices (StRIP) instrument to measure the effects of several variables on student response to instructional practices. We discuss the step‐by‐step process for creating this instrument.
    Design/Method: The development process had six steps: item generation and construct development, validity testing, implementation, exploratory factor analysis, confirmatory factor analysis, and instrument modification and replication. We discuss pilot testing of the initial instrument, construct development, and validation using exploratory and confirmatory factor analyses.
    Results: This process produced 47 items measuring three parts of our framework. Types of instruction separated into four factors (interactive, constructive, active, and passive); strategies for using in‐class activities into two factors (explanation and facilitation); and student responses to instruction into five factors (value, positivity, participation, distraction, and evaluation).
    Conclusions: We describe the design process and final results for our instrument, a useful tool for understanding the relationship between type of instruction and students' response.
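    For readers unfamiliar with the exploratory-factor-analysis step named in the Design/Method section, the following is a minimal sketch in Python using the pandas and factor_analyzer packages. The file name, item columns, and five-factor choice are hypothetical placeholders for illustration, not the actual StRIP data or analysis code.

    ```python
    # Minimal sketch of an exploratory factor analysis (EFA) step, as used
    # when validating a survey instrument. Assumes hypothetical pilot data:
    # rows = respondents, columns = numeric Likert-scale item responses.
    import pandas as pd
    from factor_analyzer import FactorAnalyzer
    from factor_analyzer.factor_analyzer import calculate_kmo

    # Hypothetical file name; not the actual StRIP dataset.
    responses = pd.read_csv("strip_pilot_responses.csv")

    # Check sampling adequacy before factoring (KMO > 0.6 is a common rule of thumb).
    kmo_per_item, kmo_total = calculate_kmo(responses)
    print(f"KMO measure of sampling adequacy: {kmo_total:.2f}")

    # Extract five factors with an oblique (promax) rotation; five mirrors the
    # student-response factor count reported in the abstract, but the number
    # would normally be chosen from eigenvalues or parallel analysis.
    fa = FactorAnalyzer(n_factors=5, rotation="promax")
    fa.fit(responses)

    # Inspect item loadings to decide which items to keep, drop, or reword.
    loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
    print(loadings.round(2))
    ```

    An oblique rotation such as promax is a common choice here because survey constructs like value, positivity, and participation are typically correlated; an orthogonal rotation (e.g., varimax) would force the factors to be independent. The retained factor structure would then be tested on a fresh sample with a confirmatory factor analysis, matching the replication step the abstract describes.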