3,158 research outputs found

    Jet substructure as a new Higgs search channel at the LHC

    Get PDF
    It is widely considered that, for Higgs boson searches at the Large Hadron Collider, WH and ZH production where the Higgs boson decays to b anti-b are poor search channels due to large backgrounds. We show that at high transverse momenta, employing state-of-the-art jet reconstruction and decomposition techniques, these processes can be recovered as promising search channels for the standard model Higgs boson around 120 GeV in mass. Comment: 4 pages, 3 figures.
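    Broadly, the decomposition technique in question undoes the clustering history of a single fat jet, looking for a splitting with a significant mass drop and a fairly symmetric momentum sharing; the tagged pair is then filtered at a smaller radius. The following is a minimal Python sketch of just the mass-drop step, not the authors' code: the Subjet tree is a hypothetical stand-in for a real Cambridge/Aachen clustering sequence (as provided by, e.g., FastJet), and the cut values mu = 0.67 and y_cut = 0.09 are typical choices for this kind of tagger.

        import math
        from dataclasses import dataclass
        from typing import Optional, Tuple

        @dataclass
        class Subjet:
            # Hypothetical stand-in for a clustered (sub)jet: transverse momentum,
            # invariant mass, rapidity/azimuth, and the two parents of its last
            # clustering step (None for an original particle).
            pt: float
            mass: float
            eta: float
            phi: float
            parents: Optional[Tuple["Subjet", "Subjet"]] = None

        def delta_r(a: Subjet, b: Subjet) -> float:
            dphi = abs(a.phi - b.phi)
            if dphi > math.pi:
                dphi = 2.0 * math.pi - dphi
            return math.hypot(a.eta - b.eta, dphi)

        def mass_drop_tag(jet: Subjet, mu: float = 0.67, ycut: float = 0.09):
            # Undo the clustering of the fat jet step by step.  Stop at the first
            # splitting with a significant mass drop (heavier parent well below the
            # jet mass) and a reasonably symmetric momentum sharing; otherwise
            # follow the heavier parent.  Returns the two candidate b subjets,
            # or None if no such splitting is found.
            while jet.parents is not None:
                j1, j2 = sorted(jet.parents, key=lambda j: j.mass, reverse=True)
                y = min(j1.pt, j2.pt) ** 2 * delta_r(j1, j2) ** 2 / jet.mass ** 2
                if j1.mass < mu * jet.mass and y > ycut:
                    return j1, j2
                jet = j1
            return None

    In a full analysis the two tagged subjets would then be filtered (reclustered at a smaller radius, keeping the hardest subjets) before forming the candidate mass.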

    Jet substructure as a new Higgs search channel at the Large Hadron Collider

    Get PDF
    We show that WH and ZH production where the Higgs boson decays to bbbar can be recovered as good search channels for the Standard Model Higgs at the Large Hadron Collider. This is done by requiring the Higgs to have high transverse momentum, and employing state-of-the-art jet reconstruction and decomposition techniques. Comment: Talk presented by J.M.Butterworth at the 34th International Conference on High Energy Physics, ICHEP08, Philadelphia, July 2008.

    MediChainTM: A Secure Decentralized Medical Data Asset Management System

    Full text link
    The set of distributed ledger architectures known as blockchain is best known for cryptocurrency applications such as Bitcoin and Ethereum. These permissionless blockchains are showing the potential to be disruptive to the financial services industry. Their broader adoption is likely to be limited by the maximum block size, the cost of the Proof of Work consensus mechanism, and the increasing size of any given chain overwhelming most of the participating nodes. These factors have led many cryptocurrency blockchains to become centralized in the nodes with enough computing power and storage to be dominant miners and validators. Permissioned chains operate in trusted environments and can, therefore, avoid the computationally expensive consensus mechanisms. Permissioned chains are, however, still subject to asset storage demands and non-standard user interfaces that impede their adoption. This paper describes an approach to addressing these limitations: a permissioned blockchain that uses off-chain storage of the data assets, accessed through a standard browser and mobile app. The implementation in the Hyperledger framework is described, as is an example use in patient-centered health data management. Comment: 2018 IEEE Confs on Internet of Things, Green Computing and Communications, Cyber, Physical and Social Computing, Smart Data, Blockchain, Computer and Information Technology, Congress on Cybermatics.
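    A common way to realise the off-chain storage described above, sketched below purely for illustration, is to keep the data asset in external storage and anchor only its cryptographic hash and minimal metadata on the permissioned ledger. The OffChainStore and LedgerClient classes are hypothetical stand-ins, not the MediChain implementation or a Hyperledger SDK.

        import hashlib
        import time

        class OffChainStore:
            # Stand-in for any external document store (database, object storage).
            def __init__(self):
                self._docs = {}

            def put(self, doc_id: str, payload: bytes) -> None:
                self._docs[doc_id] = payload

            def get(self, doc_id: str) -> bytes:
                return self._docs[doc_id]

        class LedgerClient:
            # Hypothetical permissioned-ledger client.  A real system would submit
            # a transaction to chaincode / a smart contract rather than append to
            # a local list.
            def __init__(self):
                self.transactions = []

            def submit(self, record: dict) -> None:
                self.transactions.append(record)

        def store_medical_record(patient_id: str, payload: bytes,
                                 store: OffChainStore, ledger: LedgerClient) -> str:
            # The asset itself stays off-chain; only its hash and minimal metadata
            # are recorded on the ledger.
            doc_id = hashlib.sha256(payload).hexdigest()
            store.put(doc_id, payload)
            ledger.submit({"patient": patient_id,
                           "doc_hash": doc_id,
                           "timestamp": time.time()})
            return doc_id

        def verify_medical_record(doc_id: str, store: OffChainStore) -> bool:
            # Integrity check: the off-chain payload must hash back to the on-chain id.
            return hashlib.sha256(store.get(doc_id)).hexdigest() == doc_id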

    Network Mendelian randomization: using genetic variants as instrumental variables to investigate mediation in causal pathways.

    Get PDF
    BACKGROUND: Mendelian randomization uses genetic variants, assumed to be instrumental variables for a particular exposure, to estimate the causal effect of that exposure on an outcome. If the instrumental variable criteria are satisfied, the resulting estimator is consistent even in the presence of unmeasured confounding and reverse causation. METHODS: We extend the Mendelian randomization paradigm to investigate more complex networks of relationships between variables, in particular where some of the effect of an exposure on the outcome may operate through an intermediate variable (a mediator). If instrumental variables for the exposure and mediator are available, direct and indirect effects of the exposure on the outcome can be estimated, for example using either a regression-based method or structural equation models. The direction of effect between the exposure and a possible mediator can also be assessed. Methods are illustrated in an applied example considering causal relationships between body mass index, C-reactive protein and uric acid. RESULTS: These estimators are consistent in the presence of unmeasured confounding if, in addition to the instrumental variable assumptions, the effects of both the exposure on the mediator and the mediator on the outcome are homogeneous across individuals and linear without interactions. Nevertheless, a simulation study demonstrates that even considerable heterogeneity in these effects does not lead to bias in the estimates. CONCLUSIONS: These methods can be used to estimate direct and indirect causal effects in a mediation setting, and have potential for the investigation of more complex networks between multiple interrelated exposures and disease outcomes.
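    As a schematic of the regression-based estimation described above (not the paper's exact analysis), direct and indirect effects can be obtained from ratio (Wald) instrumental-variable estimates combined by the product and difference methods. The per-allele association values and variable names below are made up for illustration.

        def ratio(beta_outcome_gene: float, beta_exposure_gene: float) -> float:
            # Ratio (Wald) instrumental-variable estimate from summary associations.
            return beta_outcome_gene / beta_exposure_gene

        # Hypothetical per-allele associations of instrument G1 (for the exposure X)
        # and instrument G2 (for the mediator M) with X, M and the outcome Y.
        beta_XG1, beta_MG1, beta_YG1 = 0.30, 0.12, 0.09
        beta_MG2, beta_YG2 = 0.25, 0.10

        total_effect  = ratio(beta_YG1, beta_XG1)   # X -> Y, using G1
        x_on_mediator = ratio(beta_MG1, beta_XG1)   # X -> M, using G1
        mediator_on_y = ratio(beta_YG2, beta_MG2)   # M -> Y, using G2

        indirect_effect = x_on_mediator * mediator_on_y   # product method
        direct_effect   = total_effect - indirect_effect  # difference method

        print(f"total={total_effect:.3f}  indirect={indirect_effect:.3f}  "
              f"direct={direct_effect:.3f}")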

    High-throughput multivariable Mendelian randomization analysis prioritizes apolipoprotein B as key lipid risk factor for coronary artery disease.

    Get PDF
    BACKGROUND: Genetic variants can be used to prioritize risk factors as potential therapeutic targets via Mendelian randomization (MR). An agnostic statistical framework using Bayesian model averaging (MR-BMA) can disentangle the causal role of correlated risk factors with shared genetic predictors. Here, our objective is to identify lipoprotein measures as mediators between lipid-associated genetic variants and coronary artery disease (CAD) for the purpose of detecting therapeutic targets for CAD. METHODS: As risk factors we consider 30 lipoprotein measures and metabolites derived from a high-throughput metabolomics study including 24 925 participants. We fit multivariable MR models of genetic associations with CAD, estimated in 453 595 participants (including 113 937 cases), regressed on genetic associations with the risk factors. MR-BMA assigns to each combination of risk factors a model score quantifying how well the genetic associations with CAD are explained. Risk factors are ranked by their marginal score and selected using false-discovery rate (FDR) criteria. We perform supplementary and sensitivity analyses varying the dataset for genetic associations with CAD. RESULTS: In the main analysis, the top combination of risk factors ranked by the model score contains apolipoprotein B (ApoB) only. ApoB is also the highest ranked risk factor with respect to the marginal score (FDR < 0.005). Additionally, ApoB is selected in all sensitivity analyses, and no other cholesterol or triglyceride measure is consistently selected. CONCLUSIONS: Our agnostic genetic investigation prioritizes ApoB across all datasets considered, suggesting that ApoB, representing the total number of hepatic-derived lipoprotein particles, is the primary lipid determinant of CAD.
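    As a rough illustration of the model-averaging idea (not the MR-BMA implementation), every subset of candidate risk factors can be scored by how well genetic associations with the outcome are explained by genetic associations with that subset, and each factor is then ranked by the summed weight of the models that include it. The sketch below uses simulated associations and a BIC-based weight as a stand-in for the exact Bayesian model score; the factor labels are illustrative.

        from itertools import combinations
        import numpy as np

        rng = np.random.default_rng(0)
        factors = ["ApoB", "LDL-C", "HDL-C", "Triglycerides"]   # illustrative labels
        n_snps = 120

        # Simulated summary statistics: SNP associations with the risk factors (X)
        # and with the outcome (y), where only the first factor is truly causal.
        X = rng.normal(size=(n_snps, len(factors)))
        y = X @ np.array([0.5, 0.0, 0.0, 0.0]) + rng.normal(scale=0.1, size=n_snps)

        def model_bic(idx):
            # Regress SNP-outcome associations on the chosen factors' associations
            # and return the BIC, used here as a stand-in for a Bayesian model score.
            Xm = X[:, idx]
            beta, rss, *_ = np.linalg.lstsq(Xm, y, rcond=None)
            rss = float(rss[0]) if rss.size else float(np.sum((y - Xm @ beta) ** 2))
            return n_snps * np.log(rss / n_snps) + len(idx) * np.log(n_snps)

        models = [list(c) for k in (1, 2, 3)
                  for c in combinations(range(len(factors)), k)]
        bics = np.array([model_bic(m) for m in models])
        weights = np.exp(-0.5 * (bics - bics.min()))   # stabilised model weights
        weights /= weights.sum()

        # Marginal score of a factor = summed weight of every model that includes it.
        marginal = {name: sum(w for m, w in zip(models, weights) if i in m)
                    for i, name in enumerate(factors)}
        for name, score in sorted(marginal.items(), key=lambda kv: -kv[1]):
            print(f"{name:14s} marginal inclusion score = {score:.3f}")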

    Leucocyte telomere length and risk of cardiovascular disease: systematic review and meta-analysis.

    Get PDF
    OBJECTIVE: To assess the association between leucocyte telomere length and risk of cardiovascular disease. DESIGN: Systematic review and meta-analysis. DATA SOURCES: Studies published up to March 2014 identified through searches of Medline, Web of Science, and Embase. ELIGIBILITY CRITERIA: Prospective and retrospective studies that reported on associations between leucocyte telomere length and coronary heart disease (defined as non-fatal myocardial infarction, coronary heart disease death, or coronary revascularisation) or cerebrovascular disease (defined as non-fatal stroke or death from cerebrovascular disease) and were broadly representative of general populations; that is, they did not select cohort or control participants on the basis of pre-existing cardiovascular disease or diabetes. RESULTS: Twenty-four studies involving 43,725 participants and 8400 patients with cardiovascular disease (5566 with coronary heart disease and 2834 with cerebrovascular disease) were found to be eligible. In a comparison of the shortest versus longest third of leucocyte telomere length, the pooled relative risk for coronary heart disease was 1.54 (95% confidence interval 1.30 to 1.83) in all studies, 1.40 (1.15 to 1.70) in prospective studies, and 1.80 (1.32 to 2.44) in retrospective studies. Heterogeneity between studies was moderate (I² = 64%, 95% CI 41% to 77%; P_het < 0.001) and was not significantly explained by mean age of participants (P = 0.23), the proportion of male participants (P = 0.45), or distinction between retrospective versus prospective studies (P = 0.32). Findings for coronary heart disease were similar in meta-analyses restricted to studies that adjusted for conventional vascular risk factors (relative risk 1.42, 95% confidence interval 1.17 to 1.73); studies with ≥ 200 cases (1.44, 1.20 to 1.74); studies with a high quality score (1.53, 1.22 to 1.92); and in analyses that corrected for publication bias (1.34, 1.12 to 1.60). The pooled relative risk for cerebrovascular disease was 1.42 (1.11 to 1.81), with no significant heterogeneity between studies (I² = 41%, 95% CI 0% to 72%; P_het = 0.08). Shorter telomeres were not significantly associated with cerebrovascular disease risk in prospective studies (1.14, 0.85 to 1.54) or in studies with a high quality score (1.21, 0.83 to 1.76). CONCLUSION: Available observational data show an inverse association between leucocyte telomere length and risk of coronary heart disease independent of conventional vascular risk factors. The association with cerebrovascular disease is less certain.
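    For readers unfamiliar with how pooled relative risks and the I² statistic quoted above are computed, the sketch below shows a generic inverse-variance, DerSimonian-Laird random-effects pooling on the log relative-risk scale. The study-level values are invented and the review's exact model may differ.

        import numpy as np

        # Illustrative study-level relative risks and upper 95% confidence limits.
        rr       = np.array([1.8, 1.3, 1.6, 1.2, 2.0])
        ci_upper = np.array([2.9, 1.9, 2.6, 1.8, 3.4])

        log_rr = np.log(rr)
        se = (np.log(ci_upper) - log_rr) / 1.96     # SE of log RR from the CI width
        w_fixed = 1.0 / se ** 2

        # Cochran's Q, I^2, and the DerSimonian-Laird between-study variance tau^2.
        mean_fixed = np.sum(w_fixed * log_rr) / w_fixed.sum()
        q = float(np.sum(w_fixed * (log_rr - mean_fixed) ** 2))
        df = len(rr) - 1
        i2 = max(0.0, (q - df) / q) * 100.0
        tau2 = max(0.0, (q - df) /
                   (w_fixed.sum() - np.sum(w_fixed ** 2) / w_fixed.sum()))

        # Random-effects pooling on the log scale, then back-transform.
        w = 1.0 / (se ** 2 + tau2)
        pooled = np.sum(w * log_rr) / w.sum()
        pooled_se = np.sqrt(1.0 / w.sum())
        lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
        print(f"pooled RR = {np.exp(pooled):.2f} "
              f"(95% CI {lo:.2f} to {hi:.2f}), I^2 = {i2:.0f}%")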

    A Light Scalar in Low-Scale Technicolor

    Full text link
    In addition to the narrow spin-one resonances ρ_T, ω_T and a_T occurring in low-scale technicolor, there will be relatively narrow scalars in the mass range 200 to 600--700 GeV. We study the lightest isoscalar state, σ_T. In several important respects it is like a heavy Higgs boson with a small vev. It may be discovered with high luminosity at the LHC, where it is produced via weak boson fusion and likely has substantial W+W- and Z0Z0 decay modes. Comment: 10 pages, 5 figures. References added and minor changes made. To appear in Physics Letters B.

    Beyond Mendelian randomization: how to interpret evidence of shared genetic predictors.

    Get PDF
    OBJECTIVE: Mendelian randomization is a popular technique for assessing and estimating the causal effects of risk factors. If genetic variants which are instrumental variables for a risk factor are shown to be additionally associated with a disease outcome, then the risk factor is a cause of the disease. However, in many cases, the instrumental variable assumptions are not plausible, or are in doubt. In this paper, we provide a theoretical classification of scenarios in which a causal conclusion is justified or not justified, and discuss the interpretation of causal effect estimates. RESULTS: A list of guidelines based on the 'Bradford Hill criteria' for judging the plausibility of a causal finding from an applied Mendelian randomization study is provided. We also give a framework for performing and interpreting investigations performed in the style of Mendelian randomization, but where the choice of genetic variants is statistically, rather than biologically motivated. Such analyses should not be assigned the same evidential weight as a Mendelian randomization investigation. CONCLUSION: We discuss the role of such investigations (in the style of Mendelian randomization), and what they add to our understanding of potential causal mechanisms. If the genetic variants are selected solely according to statistical criteria, and the biological roles of genetic variants are not investigated, this may be little more than what can be learned from a well-designed classical observational study. Stephen Burgess is supported by the Wellcome Trust (grant number 100114). This is the final version of the article. It first appeared from Elsevier via http://dx.doi.org/10.1016/j.jclinepi.2015.08.00