Does learning more about others impact liking them? Replication and extension Registered Report of Norton et al.’s (2007) lure of ambiguity (Registered report)
Norton et al. (2007) demonstrated a counterintuitive phenomenon: knowing other people better, or simply having more information about them, is associated with decreased liking. They summarized it as ambiguity leads to liking, whereas familiarity can breed contempt. In a Registered Report with a US Prolific undergraduate student sample (N = 801), we directly replicated Studies 1a, 1b and 2 and conceptually replicated Studies 3 and 4 from Norton et al. (2007). Extending their research, we also proposed that curiosity provides an alternative path to liking, hypothesizing that curiosity mediates the relationship between knowledge and liking. Overall, we found weak support for the original findings. Consistent with the original article, participants believed they would like someone they knew more about (original: h = 0.52–0.70; replication: h = 0.55–0.75) and that knowledge positively predicts liking (original: h = 0.21–0.45; replication: h = 0.57–0.76). However, we found no indication that the number of traits known influenced liking (original: r = −0.43 to −0.005; replication: r = −0.05 to 0.06) or perceived similarity to the target (d = 0.00), of a mediating effect of perceived similarity, of a dissimilarity cascade effect, or of changes in liking or perceived similarity as a function of learning more about the target. In our extensions, we found support for a positive relationship between curiosity and liking (r = 0.62–0.70), but not between knowledge and curiosity (r = −0.06 to 0.05). Overall, our findings suggest that learning more about others may not influence perceptions of liking, similarity or curiosity towards them. Materials, data and code are available at https://osf.io/j6tqr/. This Registered Report has been officially endorsed by Peer Community in Registered Reports: https://doi.org/10.24072/pci.rr.100947
Boosting global time series forecasting models: a two-stage modelling framework
A time series forecasting model is typically built on a single time series; such a model is known as a local time series model (tsLM). In contrast, a forecasting model trained on multiple time series is referred to as a global time series model (tsGM). tsGMs can enhance forecasting accuracy and improve generalisation by learning cross-series information. As such, developing tsGMs has become a prominent research focus within the time series forecasting community. However, the benefits of tsGMs may not always be realised if the given set of time series is heterogeneous. While increasing model complexity can help tsGMs adapt to such a set of data, it can also increase the risk of overfitting and forecasting error. Additionally, the definition of homogeneity remains ambiguous in the literature. To address these challenges, this paper explores how to define data heterogeneity and proposes a two-stage modelling framework: at stage one, a tsGM is learnt to identify homogeneous patterns; at stage two, tsLMs (e.g., ARIMA) or sub-tsGMs tailored to different groups are learnt to capture the heterogeneity. Numerical experiments on four open datasets demonstrate that the proposed approach significantly outperforms six state-of-the-art models. These results highlight its effectiveness in unlocking the full potential of global forecasting models for heterogeneous datasets.
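The global-then-local idea can be illustrated with a minimal, numpy-only sketch. The AR(1) series, pooled least-squares fits and threshold-based grouping below are illustrative assumptions, not the paper's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(phi, n=200):
    """Generate one AR(1) series y_t = phi * y_{t-1} + noise."""
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + rng.normal(scale=0.1)
    return y

def pooled_ar1_fit(series_list):
    """Least-squares AR(1) coefficient pooled across many series."""
    num = sum((s[1:] * s[:-1]).sum() for s in series_list)
    den = sum((s[:-1] ** 2).sum() for s in series_list)
    return num / den

# A heterogeneous set: half the series follow phi = 0.9, half phi = -0.5.
series = [simulate_ar1(0.9) for _ in range(10)] + \
         [simulate_ar1(-0.5) for _ in range(10)]

# Stage one: a single global model over all series.
phi_global = pooled_ar1_fit(series)

# Grouping: assign each series by whether its own local fit sits above
# or below the global coefficient (a crude one-dimensional clustering).
local = [pooled_ar1_fit([s]) for s in series]
group_a = [s for s, p in zip(series, local) if p >= phi_global]
group_b = [s for s, p in zip(series, local) if p < phi_global]

# Stage two: sub-global models, one per (now homogeneous) group.
phi_a, phi_b = pooled_ar1_fit(group_a), pooled_ar1_fit(group_b)
print(round(phi_global, 2), round(phi_a, 2), round(phi_b, 2))
```

In this toy setting the single global coefficient splits the difference between the two regimes, while the stage-two sub-global fits recover each group's own dynamics.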
The Oxford Handbook of Shakespeare and Early Modern Authorship
The Oxford Handbook of Shakespeare and Early Modern Authorship draws together leading and emerging scholars of Shakespeare and early modern literature to consider anew how authorship worked in the time in which Shakespeare wrote, and to interrogate the construction of the Shakespeare-as-author figure. Composed of four main sections, it offers fresh analysis of the literary and cultural influences and forces that 'formed' authors in the period; the 'mechanics' of early modern authorship; the 'mediation' of Shakespeare and others' works in performance, manuscript, and print; and the critical and popular reimagining across times of Shakespeare as an author figure.
Diving into modern debates about early modern authorship, authority, and identity politics, contributors supply rich new accounts of the wider scene of professional authorship in early modern England, of how Shakespeare's writings contributed to it, and of what made him distinctive within it. Looking beyond Shakespeare, the Handbook seeks to provide a vital testing ground for new research into early modern literature and culture more broadly.
Pricing VXX options with observable volatility dynamics from high-frequency VIX index
This paper develops a discrete-time joint analytical framework for pricing volatility index (VIX) and VXX options consistently. We show that our framework is more flexible than continuous-time VXX models, as it allows the information contained in the high-frequency VIX index to be incorporated into the joint pricing of VIX and VXX options, and we derive the joint pricing formula. Our empirical analysis shows that the model that utilizes the realized variance (RV) computed from high-frequency VIX index data significantly outperforms the model that does not rely on the VIX RV, both in-sample and out-of-sample, reinforcing the belief that high-frequency data are informative for derivatives pricing.
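For reference, realized variance is conventionally computed as the sum of squared intraday log-returns. A minimal sketch follows; the five-minute VIX levels are made-up numbers, and the paper's exact sampling scheme may differ:

```python
import numpy as np

def realized_variance(prices):
    """Realized variance over one day: the sum of squared
    intraday log-returns of the sampled index levels."""
    log_returns = np.diff(np.log(prices))
    return float(np.sum(log_returns ** 2))

# e.g. intraday VIX levels across one trading day (illustrative values)
vix_levels = np.array([15.0, 15.2, 15.1, 15.4, 15.3, 15.6])
rv = realized_variance(vix_levels)
```

A constant price path gives zero realized variance, and finer sampling grids simply add more squared-return terms to the sum.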
Tackling health inequalities: what exactly do we mean? Evidence from health policy in England
Objectives
To develop a model to support health systems in clarifying how they might target action to reduce health inequalities, and to use it to understand current policy on health inequalities in England.
Methods
We used the wider literature on definitions of health inequalities to construct a schematic model that links the different conceptualisations of health inequalities found in the literature with potential sites of action for local health systems. We then undertook a document analysis of the policy documents and programmes underlying the recent reorganisation of the NHS in England.
Results
The need to tackle health inequalities is cited as one of the main rationales underlying the changes. However, there is a lack of clarity within the documents around: the type of inequality being addressed; the identification of the group(s) suffering from inequalities; and the ways in which the assumed ameliorative mechanisms will work in practice. The documents place considerable emphasis on the assumption that closer partnership working will address inequalities, although the mechanisms by which this will be achieved are not specified and previous research demonstrates how difficult this can be.
Conclusions
The aspiration to tackle health inequalities through newly constituted Integrated Care Systems and Boards is welcome. However, it is well known that the contribution that health care services can make to addressing inequalities is relatively limited. Greater clarity is required in policy and local strategy if efforts are to be appropriately targeted.
Linear or Cyclic? Theoretical Investigation of Astrophysically Relevant Magnesium-Bearing MgCnH Carbon Chains and Related Isomers
Magnesium-containing molecules, including MgC2H, MgC4H, and MgC6H, have been detected in the interstellar medium, largely facilitated by their high dipole moments. However, despite great efforts, MgC2m+1H species remain elusive. Given the challenges in obtaining experimental data for these molecules, theoretical studies play a crucial role in guiding their detection. In this work, we present a theoretical analysis of MgCnH isomers (n = 4-7) using density functional theory and coupled-cluster methods to identify low-lying isomers and characterize their structural and electronic properties. Our findings reveal that across the entire series investigated, the global minimum geometry is linear for even values of n, whereas for odd values, a cyclic geometry is favored. Additionally, our calculations highlight the enhanced stability of anionic MgCnH- systems, providing insights that could aid future astronomical detections in the interstellar medium.
Is there a reproducibility crisis? On the need for evidence-based approaches
The UK House of Commons Science, Innovation and Technology Committee's ‘Sixth Report—Reproducibility and Research Integrity’ (2023; ‘The Report’) recommends measures designed to tackle an alleged ‘reproducibility crisis’ in scientific research. Our systematic analysis of the content of this report revealed that its findings and recommendations are consistent with the scientific literature, including the acknowledgement that conclusive evidence demonstrating the existence of a ‘reproducibility crisis’ is lacking. Though conceding that there is currently no way to determine the size of the crisis, or whether it even exists, The Report nevertheless proposes actions to tackle it. However, without a quantitative understanding, the efficacy of the proposed measures cannot be verified. Hence, the current approach towards the alleged reproducibility crisis, here exemplified by The Report, does not adhere to the standards that would normally be applied to the scientific method. An evidence-based approach requires the establishment of a quantitative understanding of whether data variability in the research literature exceeds technically achievable levels of reproducibility. If it does, the resulting understanding will enable the design of actions whose success can be monitored. Our findings emphasise that the research environment requires the same level of rigour and scrutiny as the scientific experiments themselves.
A mixed method evaluation of a novel targeted health messaging intervention to promote COVID-19 protective behaviours and vaccination among black and South Asian communities living in the UK (The COBHAM study)
Aim: To evaluate an intervention (a film and electronic leaflet) disseminated via text message by general practices to promote COVID-19 preventative behaviours in Black and South Asian communities.
Methods: We carried out a before-and-after questionnaire study of attitudes to and implementation of COVID-19 preventative behaviours and qualitative interviews about the intervention with people registered with 26 general practices in England who identified as Black or South Asian.  
Results: Among the 108 people who completed both questionnaires, we found no significant change in attitudes to or implementation of COVID-19 preventative behaviours, although statistical power was too low to detect small effects. A key qualitative finding was that participants felt they did not ‘belong’ to the group targeted by the intervention.
Conclusion: Interventions targeting ethnic minorities in the UK need to acknowledge the heterogeneity of experience and circumstances of the target group so that people feel that the intervention is relevant to them.
Balanced-detection visible optical coherence tomography with a low-noise supercontinuum laser
This paper comprehensively demonstrates the effectiveness of balanced detection in a visible optical coherence tomography instrument employing a low-noise supercontinuum laser. Using an innovative technique for digitally aligning camera pixels, we achieved a noise-floor reduction of up to 12.8 dB across the entire imaging depth range, particularly near zero optical path difference between the interferometer arms. The instrument presented here operates at a central wavelength of 590 nm. It delivers high-resolution images with a sensitivity of up to 74 dB in a single-spectrometer configuration and 92.8 dB in a balanced configuration. The enhancement in image contrast is exemplified through images of an optical phantom and in-vivo images of a human thumb and nail.
Foundations of Expected Points in Rugby Union: A Methodological Approach
This study explores the feasibility of an Expected Points metric for rugby union, aiming to shift performance analysis from descriptive indicators to a predictive metric of possession quality. Notational analysis was conducted on 132 Premiership Rugby matches, producing a dataset of 35,199 unique phases of play containing variables such as team in possession, pitch location, type of play, score differences, time remaining, cards and the next scoring outcome. Four machine learning algorithms were explored to predict scoring outcomes: multinomial logistic regression, random forest, support vector machine and k-nearest neighbours. After extensive feature engineering and hyperparameter optimisation, the best-performing model (a random forest classifier) achieved 39.7% ± 2.8 ppts accuracy. However, this did not meet a literature-derived baseline for practical usability (44.3%), so the model was not suitable for applied contexts. A key challenge was predicting minority scoring outcomes due to severe class imbalance. SMOTE was explored to address this imbalance, resulting in lower accuracy (35.7%) but an improved F1-score of 34.4%. This study highlights the inherent limitations of modelling scoring outcomes in dynamic, open-play team sports, challenging the predominant positivist paradigm in sports performance analysis. The methodology provides critical foundational groundwork and a benchmark for future research to build upon. It recommends exploring advanced samplers for minority classes, expanded feature sets and alternative modelling techniques, such as recurrent neural networks.
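The oversampling step can be sketched with a numpy-only stand-in for SMOTE: each synthetic point interpolates between a minority-class sample and one of its nearest same-class neighbours. In practice a library such as imbalanced-learn would be used; the toy three-class data below (loosely echoing rare versus common scoring outcomes) is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(42)

def smote_like(X, y, k=3):
    """Minimal SMOTE-style oversampler: each synthetic point is a
    random interpolation between a minority-class sample and one of
    its k nearest neighbours from the same class, until every class
    matches the majority-class count."""
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    X_new, y_new = [X], [y]
    for c, n in zip(classes, counts):
        Xc = X[y == c]
        for _ in range(target - n):
            i = rng.integers(len(Xc))
            dists = np.linalg.norm(Xc - Xc[i], axis=1)
            neighbours = np.argsort(dists)[1:k + 1]  # skip self at index 0
            j = rng.choice(neighbours)
            lam = rng.random()                       # interpolation weight
            X_new.append((Xc[i] + lam * (Xc[j] - Xc[i]))[None, :])
            y_new.append(np.array([c]))
    return np.vstack(X_new), np.concatenate(y_new)

# Imbalanced toy data: one common outcome class and two rare ones.
X = np.vstack([rng.normal(m, 1.0, size=(n, 2))
               for m, n in [(0.0, 500), (3.0, 60), (6.0, 15)]])
y = np.repeat([0, 1, 2], [500, 60, 15])

X_bal, y_bal = smote_like(X, y)
print(np.bincount(y_bal))  # classes now equally represented
```

Because the synthetic points lie on segments between existing minority samples, the classifier sees a balanced training set without simply duplicating rows, which is what drives the F1 improvement reported for minority outcomes.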