101 research outputs found

    Detector Description and Performance for the First Coincidence Observations between LIGO and GEO

    For 17 days in August and September 2002, the LIGO and GEO interferometer gravitational wave detectors were operated in coincidence to produce their first data for scientific analysis. Although the detectors were still far from their design sensitivity levels, the data can be used to place better upper limits on the flux of gravitational waves incident on the earth than previous direct measurements. This paper describes the instruments and the data in some detail, as a companion to analysis papers based on the first data. Comment: 41 pages, 9 figures. 17 Sept 2003: author list amended, minor editorial change.

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
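    The per-pixel induced-current computation described above is naturally parallel: every (pixel, charge) pair can be evaluated independently, which is why a one-thread-per-pair CUDA kernel gives such a large speed-up. A minimal toy sketch of that structure, written with NumPy vectorization standing in for the GPU kernel (the charge-sharing formula, function name, and parameters here are illustrative, not the paper's actual microphysics model):

    ```python
    import numpy as np

    def induced_current(pixel_xy: np.ndarray, charge_xy: np.ndarray,
                        charge_q: np.ndarray, sigma: float = 1.0) -> np.ndarray:
        """Toy total current induced on each pixel by a set of drifting charges.

        pixel_xy  : (P, 2) pixel centre coordinates
        charge_xy : (C, 2) transverse positions of charge clouds
        charge_q  : (C,)   charge carried by each cloud
        """
        # (P, C) squared transverse distances: this pixel-times-charge grid is
        # exactly the loop a Numba @cuda.jit kernel would parallelize, one
        # thread per (pixel, charge) pair.
        d2 = ((pixel_xy[:, None, :] - charge_xy[None, :, :]) ** 2).sum(axis=-1)
        weight = np.exp(-d2 / (2.0 * sigma ** 2))  # Gaussian charge-sharing weight
        return weight @ charge_q                   # (P,) current, summed over charges

    # ~10^3 pixels on a 32x32 grid, with 100 charge clouds
    rng = np.random.default_rng(0)
    pixels = np.stack(np.meshgrid(np.arange(32), np.arange(32)), -1)
    pixels = pixels.reshape(-1, 2).astype(float)          # (1024, 2)
    charges = rng.uniform(0.0, 32.0, size=(100, 2))
    q = rng.uniform(0.5, 1.5, size=100)

    currents = induced_current(pixels, charges, q)
    print(currents.shape)  # (1024,)
    ```

    In the real simulator this inner computation is compiled to a CUDA kernel with Numba's just-in-time compiler, so the same Python source runs as native GPU code; the vectorized form above is only meant to show why the workload maps so cleanly onto many independent threads.
    
    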

    Whole-genome sequencing reveals host factors underlying critical COVID-19

    Critical COVID-19 is caused by immune-mediated inflammatory lung injury. Host genetic variation influences the development of illness requiring critical care1 or hospitalization2,3,4 after infection with SARS-CoV-2. The GenOMICC (Genetics of Mortality in Critical Care) study enables the comparison of genomes from individuals who are critically ill with those of population controls to find underlying disease mechanisms. Here we use whole-genome sequencing in 7,491 critically ill individuals compared with 48,400 controls to discover and replicate 23 independent variants that significantly predispose to critical COVID-19. We identify 16 new independent associations, including variants within genes that are involved in interferon signalling (IL10RB and PLSCR1), leucocyte differentiation (BCL11A) and blood-type antigen secretor status (FUT2). Using transcriptome-wide association and colocalization to infer the effect of gene expression on disease severity, we find evidence that implicates multiple genes, including reduced expression of a membrane flippase (ATP11A) and increased expression of a mucin (MUC1), in critical disease. Mendelian randomization provides evidence in support of causal roles for myeloid cell adhesion molecules (SELE, ICAM5 and CD209) and the coagulation factor F8, all of which are potentially druggable targets. Our results are broadly consistent with a multi-component model of COVID-19 pathophysiology, in which at least two distinct mechanisms can predispose to life-threatening disease: failure to control viral replication, or an enhanced tendency towards pulmonary inflammation and intravascular coagulation. We show that comparison between cases of critical illness and population controls is highly efficient for the detection of therapeutically relevant mechanisms of disease.

    Reinterpretation and Long-Term Preservation of Data and Code

    Careful preservation of experimental data, simulations, analysis products, and theoretical work maximizes their long-term scientific return on investment by enabling new analyses and reinterpretation of the results in the future. Key infrastructure and technical developments needed for some high-value science targets are not in scope for the operations programs of the large experiments and are often not effectively funded. Increasingly, the science goals of our projects require contributions that span the boundaries between individual experiments and surveys, and between the theoretical and experimental communities. Furthermore, the computational requirements and technical sophistication of this work are increasing. As a result, it is imperative that the funding agencies create programs that can devote significant resources to these efforts outside of the context of the operations of individual major experiments, including smaller experiments and theory/simulation work. In this Snowmass 2021 Computational Frontier topical group report (CompF7: Reinterpretation and long-term preservation of data and code), we summarize the current state of the field and make recommendations for the future.

    Apoptosis and proliferation of acinar and islet cells in chronic pancreatitis: evidence for differential cell loss mediating preservation of islet function

    BACKGROUND: Chronic pancreatitis is characterised clinically by early exocrine insufficiency, with diabetes mellitus occurring as a late phenomenon. This is mirrored pathologically by extensive acinar cell destruction and islet preservation. The mechanisms underlying this differential rate of cellular destruction are unknown. AIMS: To test the hypothesis that acinar loss and islet preservation in chronic pancreatitis occur due to differential epithelial kinetics, and to investigate the role of inflammatory cells and cell cycle associated molecules. METHODS: Archival tissue from six chronic pancreatitis cases was compared with six normal controls using TUNEL and immunohistochemistry for CD3, CD20, CD68, MIB-1, Bcl-2, Bax, Fas, Fas ligand, retinoblastoma protein (Rb), and tissue inhibitor of metalloproteinases 1 (TIMP-1) and 2 (TIMP-2). RESULTS: The acinar cell apoptotic index (AI) and proliferation index were higher in chronic pancreatitis than in controls. T lymphocytes diffusely infiltrated fibrous bands and acini but rarely islets. Acinar Bcl-2 expression exceeded islet expression in chronic pancreatitis and controls, while Bax was strongly expressed by a subset of islet cells and weakly by centroacinar cells. Islet Fas and Fas ligand expression exceeded acinar expression in chronic pancreatitis and controls. Acinar Rb expression was higher in chronic pancreatitis than in controls. Islets in chronic pancreatitis and controls showed intense TIMP-1 and TIMP-2 expression. CONCLUSION: Apoptosis plays a significant role in acinar loss in chronic pancreatitis. Acinar Bcl-2 and islet Bax expression indicates complex AI control. Increased acinar Rb expression in chronic pancreatitis may differentially promote acinar loss. Fas ligand expression may be restricted to islet cell membranes through TIMP-1 expression and inhibit islet damage by promoting apoptosis of cytotoxic T lymphocytes.