The relationship between childhood maltreatment and learning engagement of high school students: the role of growth mindset and beliefs about adversity
Objective: To explore the relationship between childhood maltreatment, growth mindset, beliefs about adversity, and learning engagement among high school students.
Methods: Research participants were selected by random cluster sampling. 652 high school students (50.2% male, 49.8% female) from five high schools were surveyed using paper-and-pencil versions of the Childhood Trauma Questionnaire, the Utrecht Work Engagement Scale-Student, the Growth Mindset Scale, and the Beliefs About Adversity Scale.
Results: Childhood maltreatment had a significant negative effect on high school students' learning engagement. Childhood maltreatment directly predicted high school students' learning engagement and also had an indirect negative predictive effect on learning engagement via growth mindset.
Conclusion: Growth mindset plays a mediating role between childhood maltreatment and learning engagement. Beliefs about adversity moderated the relationship between childhood maltreatment and growth mindset, as well as the relationship between childhood maltreatment and learning engagement. This study has empirical implications for helping high school students who have experienced childhood maltreatment to develop a growth mindset, and for teaching students to adopt positive adversity beliefs in response to trauma during psychological interventions, thereby increasing high school students' engagement in learning.
Characterizing Semantic Ambiguity of the Materials Science Ontologies
Growth in computational materials science and initiatives such as the
Materials Genome Initiative (MGI) and the European Materials Modelling Council
(EMMC) has motivated the development and application of ontologies. A key
factor has been increased adoption of the FAIR principles, making research data
findable, accessible, interoperable, and reusable (Wilkinson et al. 2016). This
paper characterizes semantic interoperability among a subset of materials
science ontologies in the MatPortal repository. Background context covers
semantic interoperability, ontological commitment, and the materials science
ontology landscape. The research focused on MatPortal's two interoperability
protocols: LOOM term matching and URI matching. Results report the degree of
overlap and demonstrate the different types of ambiguity among ontologies. The
discussion considers implications for FAIR and AI, and the conclusion highlights
key findings and next steps.
Comment: 12 pages, International Society for Knowledge Organization (ISKO) 202
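As a rough illustration of the two protocols the paper studies (not MatPortal's actual implementation, and the example labels and URIs below are invented): LOOM-style term matching compares labels after aggressive lexical normalization, while URI matching simply tests identifier equality. The same normalization that enables matches is also a source of the semantic ambiguity the paper characterizes, since distinct concepts can collapse to one normalized string.

```python
import re

def loom_normalize(label: str) -> str:
    # LOOM-style normalization: lowercase the label and strip every
    # non-alphanumeric character before comparison.
    return re.sub(r"[^a-z0-9]", "", label.lower())

def loom_term_match(label_a: str, label_b: str) -> bool:
    # Two terms match if their normalized labels are identical.
    return loom_normalize(label_a) == loom_normalize(label_b)

def uri_match(uri_a: str, uri_b: str) -> bool:
    # URI matching: both ontologies reuse exactly the same identifier.
    return uri_a == uri_b

print(loom_term_match("Grain_Boundary", "grain boundary"))  # True
print(uri_match("http://example.org/mat#Alloy",
                "http://example.org/mat#Alloy"))            # True
```

Note that `loom_term_match("Phase", "phase")` also returns True even if one ontology means a thermodynamic phase and the other a processing step, which is precisely the kind of ambiguity term-level matching cannot resolve.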
Geometry-Aware Adaptation for Pretrained Models
Machine learning models -- including prominent zero-shot models -- are often
trained on datasets whose labels are only a small proportion of a larger label
space. Such spaces are commonly equipped with a metric that relates the labels
via distances between them. We propose a simple approach to exploit this
information to adapt the trained model to reliably predict new classes -- or,
in the case of zero-shot prediction, to improve its performance -- without any
additional training. Our technique is a drop-in replacement of the standard
prediction rule, swapping argmax with the Fréchet mean. We provide a
comprehensive theoretical analysis for this approach, studying (i)
learning-theoretic results trading off label space diameter, sample complexity,
and model dimension, (ii) characterizations of the full range of scenarios in
which it is possible to predict any unobserved class, and (iii) an optimal
active learning-like next class selection procedure to obtain optimal training
classes for when it is not possible to predict the entire range of unobserved
classes. Empirically, using easily-available external metrics, our proposed
approach, Loki, gains up to 29.7% relative improvement over SimCLR on ImageNet
and scales to hundreds of thousands of classes. When no such metric is
available, Loki can use self-derived metrics from class embeddings and obtains
a 10.5% improvement on pretrained zero-shot models such as CLIP.
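The swapped-in prediction rule can be sketched in a few lines. This is a minimal illustration under assumed inputs (the 5-label distance matrix and observed-class indices below are invented for the example, not the paper's data): instead of taking argmax over the observed classes, pick the label in the full space minimizing the probability-weighted sum of squared distances to the observed classes.

```python
import numpy as np

# Hypothetical label space of 5 classes on a line, with pairwise distances.
D = np.array([
    [0, 1, 2, 3, 4],
    [1, 0, 1, 2, 3],
    [2, 1, 0, 1, 2],
    [3, 2, 1, 0, 1],
    [4, 3, 2, 1, 0],
], dtype=float)

observed = [0, 2, 4]  # indices (in the full space) the model was trained on

def frechet_predict(probs: np.ndarray, observed: list, D: np.ndarray) -> int:
    """Fréchet-mean prediction rule: return the label in the FULL space
    minimizing the probability-weighted sum of squared distances to the
    observed classes, rather than argmax over observed classes only."""
    costs = (D[:, observed] ** 2) @ probs  # one cost per candidate label
    return int(np.argmin(costs))

# Model mass split between observed classes 2 and 4: the Fréchet mean lands
# on the unobserved class 3 that lies between them.
probs = np.array([0.0, 0.5, 0.5])
print(frechet_predict(probs, observed, D))  # 3
```

When the model is confident in a single observed class, the rule reduces to the usual argmax; the metric only changes the answer when probability mass is spread across classes that surround an unobserved label.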