
    Bayesian model averaging: improved variable selection for matched case-control studies

    Background: The problem of variable selection for risk factor modeling is an ongoing challenge in statistical practice. Classical methods that select a single subset of explanatory risk factors dominate the medical research field. However, this approach has been criticized for not taking into account the uncertainty of the model selection process itself. This limitation can be addressed by a Bayesian model averaging approach: instead of focusing on a single model and a few factors, Bayesian model averaging considers all models with non-negligible probabilities when making inferences. Methods: This paper reports on a simulation study designed to emulate a matched case-control study and compares classical versus Bayesian model averaging selection methods. We used the Matthews correlation coefficient to measure the quality of binary classifications. Both classical and Bayesian model averaging approaches were also applied and compared in the analysis of a matched case-control study of patients with methicillin-resistant Staphylococcus aureus infections after hospital discharge during 2011-2013. Results: Bayesian model averaging outperformed the classical approach, with much lower false positive rates and higher Matthews correlation scores. Bayesian model averaging also produced more reliable and robust effect estimates. Conclusion: Bayesian model averaging is a conceptually simple, unified approach that produces robust results. It can be used to replace controversial P-values for case-control studies in medical research.
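
    As a concrete illustration of the scoring metric mentioned in the Methods, the sketch below computes the Matthews correlation coefficient from confusion-matrix counts. It is a minimal, generic example; the counts used are hypothetical and are not taken from the study.

```python
import math

def matthews_corrcoef(tp: int, tn: int, fp: int, fn: int) -> float:
    """Matthews correlation coefficient from confusion-matrix counts.

    Returns 0.0 when any marginal total is zero (the coefficient is undefined there).
    """
    numerator = tp * tn - fp * fn
    denominator = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return numerator / denominator if denominator else 0.0

# Hypothetical variable-selection outcome: 8 true risk factors kept, 2 missed,
# 3 noise variables wrongly selected, 17 correctly dropped.
print(round(matthews_corrcoef(tp=8, tn=17, fp=3, fn=2), 3))
```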

    Cognition-Enhancing Drugs: Can We Say No?

    Normative analysis of cognition-enhancing drugs frequently weighs the liberty interests of drug users against egalitarian commitments to a level playing field. Yet those who would refuse to engage in neuroenhancement may well find their liberty to do so limited in a society where such drugs are widespread. To the extent that unvarnished emotional responses are world-disclosive, neurocosmetic practices also threaten to provide a form of faulty data to their users. This essay examines underappreciated liberty-based and epistemic rationales for regulating cognition-enhancing drugs.

    Response properties in a model for granular matter

    We investigate the response properties of granular media in the framework of the so-called Random Tetris Model. We monitor, for different driving procedures, several quantities: the evolution of the density and of the density profiles, the ageing properties through the two-time correlation functions and the two-time mean-square distance between the potential energies, and the response function, defined in terms of the difference in the potential energies of two replicas driven in two slightly different ways. We focus in particular on the role played by the spatial inhomogeneities (structures) spontaneously emerging during the compaction process, the history of the sample, and the driving procedure. It turns out that none of these ingredients can be neglected for the correct interpretation of the experimental or numerical data. We discuss the problem of the optimization of the compaction process and we comment on the validity of our results for the description of granular materials in a thermodynamic framework. Comment: 22 pages, 35 EPS files (21 figures).
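
    The response protocol described above can be made concrete with a small, hedged sketch: given potential-energy time series for a reference replica and a slightly differently driven replica, one can estimate the two-time mean-square distance and a finite-difference response. The array layout (samples by times) and the perturbation strength delta are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def two_time_msd(energy: np.ndarray, t_w: int, t: int) -> float:
    """Two-time mean-square distance of the potential energy, <(E(t) - E(t_w))^2>,
    averaged over independent samples (rows of the array)."""
    return float(np.mean((energy[:, t] - energy[:, t_w]) ** 2))

def energy_response(e_ref: np.ndarray, e_pert: np.ndarray, t: int, delta: float) -> float:
    """Finite-difference response at time t: mean energy difference between the
    perturbed and reference replicas, per unit perturbation strength delta."""
    return float(np.mean(e_pert[:, t] - e_ref[:, t]) / delta)
```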

    Numerical study of a non-equilibrium interface model

    We have carried out extensive computer simulations of one-dimensional models related to the low-noise (solid-on-solid) non-equilibrium interface of a two-dimensional anchored Toom model with unbiased and biased noise. For the unbiased case, the computed fluctuations of the interface in this limit provide new numerical evidence for the logarithmic correction to the subnormal L^(1/2) variance which was predicted by dynamic renormalization group calculations on the modified Edwards-Wilkinson equation. In the biased case, the simulations are in close quantitative agreement with the predictions of the Collective Variable Approximation (CVA), which gives the same L^(2/3) behavior of the variance as the KPZ equation. Comment: 15 pages, RevTeX, 4 PostScript figures.
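
    A minimal way to check the two scaling behaviours quoted above is to fit the slope of log(variance) against log(L); a slope near 1/2 or 2/3 points to the Edwards-Wilkinson-like or KPZ-like regime respectively (the logarithmic correction would need a more careful fit). The sketch assumes measured variances are already available; the commented-out numbers are placeholders, not data from the paper.

```python
import numpy as np

def scaling_exponent(sizes, variances) -> float:
    """Least-squares slope of log(variance) versus log(system size L)."""
    slope, _intercept = np.polyfit(np.log(np.asarray(sizes, dtype=float)),
                                   np.log(np.asarray(variances, dtype=float)), 1)
    return float(slope)

# Usage with hypothetical measurements:
# print(scaling_exponent([64, 128, 256, 512], [3.1, 4.4, 6.3, 9.0]))
```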

    Order in glassy systems

    A directly measurable correlation length may be defined for systems having a two-step relaxation, based on the geometric properties of the density profile that remains after averaging out the fast motion. We argue that the length diverges if and when the slow timescale diverges, whatever the microscopic mechanism at the origin of the slowing down. Measuring the length amounts to determining explicitly the complexity from the observed particle configurations. One may compute in the same way the Renyi complexities K_q, whose relative behavior for different q characterizes the mechanism underlying the transition. In particular, the 'Random First Order' scenario predicts that in the glass phase K_q=0 for q>x, and K_q>0 for q<x, with x the Parisi parameter. The hypothesis of a nonequilibrium effective temperature may also be tested directly from the configurations. Comment: Typos corrected, clarifications added.
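
    For reference, the order-q Renyi entropy that underlies the complexities K_q mentioned above can be computed from a probability distribution over coarse-grained configurations, as in the sketch below; treating K_q as this entropy (up to the normalisation used in the paper) is an assumption made only for illustration.

```python
import numpy as np

def renyi_entropy(p, q: float) -> float:
    """Renyi entropy S_q = log(sum_i p_i^q) / (1 - q); reduces to the Shannon
    entropy in the limit q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]            # drop empty states
    p = p / p.sum()         # normalise, in case raw counts were passed
    if np.isclose(q, 1.0):
        return float(-np.sum(p * np.log(p)))
    return float(np.log(np.sum(p ** q)) / (1.0 - q))
```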

    Octet baryon electromagnetic form factors in nuclear medium

    We study the octet baryon electromagnetic form factors in nuclear matter using the covariant spectator quark model extended to the nuclear matter regime. The parameters of the model in vacuum are fixed by the study of the octet baryon electromagnetic form factors. In nuclear matter the changes in hadron properties are calculated by including the relevant hadron masses and the modified pion-baryon coupling constants obtained from the quark-meson coupling model. In nuclear matter the magnetic form factors of the octet baryons are enhanced in the low Q^2 region, while the electric form factors show a more rapid variation with Q^2. The results are compared with the modification of the bound proton electromagnetic form factors observed at Jefferson Lab. In addition, the corresponding changes for the bound neutron are predicted. Comment: Version accepted for publication in J. Phys. G. Few changes. 40 pages, 14 figures and 8 tables.

    Telehealth for patients at high risk of cardiovascular disease: pragmatic randomised controlled trial

    Objective: To assess whether non-clinical staff can effectively manage people at high risk of cardiovascular disease using digital health technologies. Design: Pragmatic, multicentre, randomised controlled trial. Setting: 42 general practices in three areas of England. Participants: Between 3 December 2012 and 23 July 2013 we recruited 641 adults aged 40 to 74 years with a 10-year cardiovascular disease risk of 20% or more, no previous cardiovascular event, at least one modifiable risk factor (systolic blood pressure ≥140 mm Hg, body mass index ≥30, current smoker), and access to a telephone, the internet, and email. Participants were individually allocated to intervention (n=325) or control (n=316) groups using automated randomisation stratified by site and minimised by practice and baseline risk score. Interventions: The intervention was the Healthlines service (alongside usual care), comprising regular telephone calls from trained lay health advisors following scripts generated by interactive software. Advisors facilitated self-management by supporting participants to use online resources to reduce risk factors, and sought to optimise drug use, improve treatment adherence, and encourage healthier lifestyles. The control group received usual care alone. Main outcome measures: The primary outcome was the proportion of participants responding to treatment, defined as maintaining or reducing their cardiovascular risk after 12 months. Outcomes were collected six and 12 months after randomisation and analysed masked; participants were not masked. Results: 50% (148/295) of participants in the intervention group responded to treatment compared with 43% (124/291) in the control group (adjusted odds ratio 1.3, 95% confidence interval 1.0 to 1.9; number needed to treat=13), a difference possibly due to chance (P=0.08). The intervention was associated with reductions in blood pressure (difference in mean systolic −2.7 mm Hg, 95% confidence interval −4.7 to −0.6 mm Hg; mean diastolic −2.8 mm Hg, −4.0 to −1.6 mm Hg), weight (−1.0 kg, −1.8 to −0.3 kg), and body mass index (−0.4, −0.6 to −0.1), but not with changes in cholesterol (−0.1, −0.2 to 0.0), smoking status (adjusted odds ratio 0.4, 0.2 to 1.0), or overall cardiovascular risk as a continuous measure (−0.4, −1.2 to 0.3). The intervention was associated with improvements in diet, physical activity, drug adherence, and satisfaction with access to care, treatment received, and care coordination. One serious related adverse event occurred, when a participant was admitted to hospital with low blood pressure. Conclusions: This evidence-based telehealth approach was associated with small clinical benefits for a minority of people with high cardiovascular risk, and there was no overall improvement in average risk. The Healthlines service was, however, associated with improvements in some risk behaviours and in perceptions of support and access to care.
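
    As a quick arithmetic check of the primary outcome reported above, the unadjusted risk difference and number needed to treat can be recovered from the response counts; the trial's own estimate is adjusted for the randomisation factors, so this crude calculation is only illustrative.

```python
responders_int, n_int = 148, 295   # intervention group: responders / analysed
responders_ctl, n_ctl = 124, 291   # control group: responders / analysed

risk_int = responders_int / n_int          # ~0.50
risk_ctl = responders_ctl / n_ctl          # ~0.43
risk_difference = risk_int - risk_ctl      # ~0.076
nnt = 1 / risk_difference                  # ~13, consistent with the reported NNT of 13
print(f"risk difference = {risk_difference:.3f}, NNT = {nnt:.1f}")
```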