132 research outputs found

    Observing Supermassive Black Holes across cosmic time: from phenomenology to physics

    In the last decade, a combination of high-sensitivity, high-spatial-resolution observations and coordinated multi-wavelength surveys has revolutionized our view of extragalactic black hole (BH) astrophysics. We now know that supermassive black holes reside in the nuclei of almost every galaxy, grow over cosmological times by accreting matter, interact and merge with each other, and in the process liberate enormous amounts of energy that dramatically influence the evolution of the surrounding gas and stars, providing a powerful self-regulatory mechanism for galaxy formation. The different energetic phenomena associated with growing black holes and Active Galactic Nuclei (AGN), their cosmological evolution, and the observational techniques used to unveil them are the subject of this chapter. In particular, I will focus on the connection between the theory of the high-energy astrophysical processes giving rise to the observed emission in AGN, the observable imprints they leave at different wavelengths, and the methods used to uncover them in a statistically robust way. I will show how this combined effort of theorists and observers has led us to unveil most of the SMBH growth over a large fraction of the age of the Universe, but also that nagging uncertainties remain, preventing us from fully understanding the exact role of black holes in the complex process of galaxy and large-scale-structure formation, assembly and evolution.
    Comment: 46 pages, 21 figures. This review article appears as a chapter in the book "Astrophysical Black Holes", Haardt, F., Gorini, V., Moschella, U. and Treves, A. (Eds.), 2015, Springer International Publishing AG.

    Alirocumab Reduces Total Nonfatal Cardiovascular and Fatal Events : The ODYSSEY OUTCOMES Trial

    The ODYSSEY OUTCOMES (Evaluation of Cardiovascular Outcomes After an Acute Coronary Syndrome During Treatment With Alirocumab) trial compared alirocumab with placebo, added to high-intensity or maximum-tolerated statin treatment, after acute coronary syndrome (ACS) in 18,924 patients. Alirocumab reduced the first occurrence of the primary composite endpoint and was associated with fewer all-cause deaths. This pre-specified analysis determined the extent to which alirocumab reduced total (first and subsequent) nonfatal cardiovascular events and all-cause deaths in ODYSSEY OUTCOMES. Hazard functions for total nonfatal cardiovascular events (myocardial infarction, stroke, ischemia-driven coronary revascularization, and hospitalization for unstable angina or heart failure) and death were jointly estimated, linked by a shared frailty accounting for patient risk heterogeneity and correlated within-patient nonfatal events. An association parameter also quantified the strength of the linkage between risk of nonfatal events and death. The model provides accurate relative estimates of nonfatal event risk if nonfatal events are associated with increased risk for death. With 3,064 first and 5,425 total events, 190 fewer first and 385 fewer total nonfatal cardiovascular events or deaths were observed with alirocumab compared with placebo. Alirocumab reduced total nonfatal cardiovascular events (hazard ratio: 0.87; 95% confidence interval: 0.82 to 0.93) and death (hazard ratio: 0.83; 95% confidence interval: 0.71 to 0.97) in the presence of a strong association between nonfatal and fatal event risk. In patients with ACS, the total number of nonfatal cardiovascular events and deaths prevented with alirocumab was twice the number of first events prevented. Consequently, total event reduction is a more comprehensive metric to capture the totality of alirocumab clinical efficacy after ACS.
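    The joint frailty model described above can be sketched as follows; the notation (frailty u_i, baselines lambda_0 and h_0, association parameter alpha) is assumed for illustration and not taken from the paper:

    ```latex
    % Intensity of recurrent nonfatal events for patient i, with shared frailty u_i:
    \lambda_i(t \mid u_i) = u_i \, \lambda_0(t) \exp(\beta^\top Z_i)
    % Hazard of death, linked to the same frailty via the association parameter \alpha:
    h_i(t \mid u_i) = u_i^{\alpha} \, h_0(t) \exp(\gamma^\top Z_i)
    ```

    In a model of this form, alpha > 0 means that patients at higher risk of nonfatal events are also at higher risk of death, which is the "strong association" the abstract reports.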

    Materiality, health informatics and the limits of knowledge production

    © IFIP International Federation for Information Processing 2014. Contemporary societies increasingly rely on complex and sophisticated information systems for a wide variety of tasks and, ultimately, for knowledge about the world in which we live. These systems are central to the kinds of problems societies and their sub-systems face, such as health and medical diagnosis, treatment and care. While health information systems represent a continuously expanding field of knowledge production, we suggest that they carry forward significant limitations, particularly in their claims to represent human beings as living creatures and in their capacity to critically reflect on the social, cultural and political origins of many forms of data ‘representation’. In this paper we take these ideas and explore them in relation to the way we see healthcare information systems currently functioning. We offer some examples from our own experience in healthcare settings to illustrate how unexamined ideas about individuals, groups and social categories of people continue to influence health information systems and practices, as well as their resulting knowledge production. We suggest some ideas for better understanding how and why this still happens, and look to a future where the reflexivity of healthcare administration, the healthcare professions and the information sciences might better engage with these issues. There is no denying the role of health informatics in contemporary healthcare systems, but its capacity to represent people in those datascapes has a long way to go if the categories it uses to describe and analyse human beings are to produce meaningful knowledge about the social world and not simply to replicate past ideologies of those same categories.

    Cost-effectiveness of alirocumab in patients with acute coronary syndromes the ODYSSEY OUTCOMES trial

    BACKGROUND Cholesterol reduction with proprotein convertase subtilisin-kexin type 9 inhibitors reduces ischemic events; however, the cost-effectiveness in statin-treated patients with recent acute coronary syndrome remains uncertain. OBJECTIVES This study sought to determine whether further cholesterol reduction with alirocumab would be cost-effective in patients with a recent acute coronary syndrome on optimal statin therapy. METHODS A cost-effectiveness model leveraging patient-level data from ODYSSEY OUTCOMES (Evaluation of Cardiovascular Outcomes After an Acute Coronary Syndrome During Treatment With Alirocumab) was developed to estimate costs and outcomes over a lifetime horizon. Patients (n = 18,924) had a recent acute coronary syndrome and were on high-intensity or maximum-tolerated statin therapy, with a baseline low-density lipoprotein cholesterol (LDL-C) level >= 70 mg/dl, non-high-density lipoprotein cholesterol >= 100 mg/dl, or apolipoprotein B >= 80 mg/dl. Alirocumab 75 mg or placebo was administered subcutaneously every 2 weeks. Alirocumab was blindly titrated to 150 mg if LDL-C remained >= 50 mg/dl, or switched to placebo if 2 consecutive LDL-C levels were below 15 mg/dl. RESULTS Across the overall population recruited to the ODYSSEY OUTCOMES trial, using an annual treatment cost of US$5,850, the mean overall incremental cost-effectiveness ratio was US$92,200 per QALY (base case). The cost was US$41,800 per QALY in patients with baseline LDL-C >= 100 mg/dl, whereas in those with LDL-C < 100 mg/dl the cost per QALY was US$299,400. Among patients with LDL-C >= 100 mg/dl, incremental cost-effectiveness ratios remained below US$100,000 per QALY across a wide variety of sensitivity analyses. CONCLUSIONS In patients with a recent acute coronary syndrome on optimal statin therapy, alirocumab improves cardiovascular outcomes at costs considered intermediate value, with good value in patients with baseline LDL-C >= 100 mg/dl but less economic value in those with LDL-C < 100 mg/dl. (Evaluation of Cardiovascular Outcomes After an Acute Coronary Syndrome During Treatment With Alirocumab [ODYSSEY OUTCOMES]; NCT01663402) (J Am Coll Cardiol 2020;75:2297-308) (C) 2020 The Authors. Published by Elsevier on behalf of the American College of Cardiology Foundation.
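    The headline metric here, the incremental cost-effectiveness ratio (ICER), is the extra lifetime cost of treatment divided by the extra quality-adjusted life-years (QALYs) gained. A minimal sketch of the calculation; the input costs and QALYs below are purely hypothetical illustrations, not trial data:

    ```python
    def icer(cost_treatment: float, cost_comparator: float,
             qaly_treatment: float, qaly_comparator: float) -> float:
        """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
        delta_cost = cost_treatment - cost_comparator
        delta_qaly = qaly_treatment - qaly_comparator
        if delta_qaly == 0:
            raise ValueError("no QALY difference; ICER is undefined")
        return delta_cost / delta_qaly

    # Hypothetical lifetime-horizon values: treatment costs $46,100 more
    # and yields 0.5 extra QALYs, giving $92,200 per QALY.
    print(icer(120_000, 73_900, 9.5, 9.0))  # 92200.0
    ```

    An ICER below a willingness-to-pay threshold (often US$100,000 per QALY in the US literature) is what the sensitivity analyses above are checking for.
    
    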

    School-based prevention for adolescent Internet addiction: prevention is the key. A systematic literature review

    Adolescents’ media use represents a normative need for information, communication, recreation and functionality, yet problematic Internet use has increased. Given the arguably alarming prevalence rates worldwide and the increasingly problematic use of gaming and social media, the need for an integration of prevention efforts appears to be timely. The aim of this systematic literature review is (i) to identify school-based prevention programmes or protocols for Internet Addiction targeting adolescents within the school context and to examine the programmes’ effectiveness, and (ii) to highlight strengths, limitations, and best practices to inform the design of new initiatives, by capitalizing on these studies’ recommendations. The findings of the reviewed studies to date present mixed outcomes and are in need of further empirical evidence. The current review identified the following needs to be addressed in future designs: (i) defining the clinical status of Internet Addiction more precisely, (ii) using more current, psychometrically robust assessment tools for the measurement of effectiveness (based on the most recent empirical developments), (iii) reconsidering the main outcome of Internet time reduction, as it appears to be problematic, (iv) building methodologically sound, evidence-based prevention programmes, (v) focusing on skill enhancement and the use of protective and harm-reducing factors, and (vi) including IA as one of the risk behaviours in multi-risk behaviour interventions. These appear to be crucial factors in addressing future research designs and the formulation of new prevention initiatives. Validated findings could then inform promising strategies for IA and gaming prevention in public policy and education.

    Search for dark matter produced in association with a hadronically decaying vector boson in pp collisions at √s = 13 TeV with the ATLAS detector

    A search is presented for dark matter produced in association with a hadronically decaying W or Z boson using 3.2 fb−1 of pp collisions at √s = 13 TeV recorded by the ATLAS detector at the Large Hadron Collider. Events with a hadronic jet compatible with a W or Z boson and with large missing transverse momentum are analysed. The data are consistent with the Standard Model predictions and are interpreted in terms of both an effective field theory and a simplified model containing dark matter.

    5-Lipoxygenase Metabolic Contributions to NSAID-Induced Organ Toxicity


    Search for new phenomena in monophoton final states in proton-proton collisions at √s = 8 TeV

    Peer reviewed

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version: the simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
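    The Python-to-CUDA pattern the abstract describes, one thread per readout channel, compiled with Numba, can be sketched as below. This is not the actual DUNE simulator code: the function name, the 1-D geometry and the simple Gaussian charge-spread response are all illustrative assumptions, and the built-in CUDA simulator is enabled so the sketch runs without a GPU.

    ```python
    import os
    os.environ["NUMBA_ENABLE_CUDASIM"] = "1"  # use Numba's CUDA simulator; remove on real GPU hardware

    import math
    import numpy as np
    from numba import cuda

    @cuda.jit
    def induce_charge(pixel_x, cluster_x, cluster_q, sigma, out):
        # One thread per pixel: accumulate Gaussian-smeared charge
        # from every drifted electron cluster (toy response model).
        i = cuda.grid(1)
        if i < pixel_x.size:
            total = 0.0
            for j in range(cluster_x.size):
                d = pixel_x[i] - cluster_x[j]
                total += cluster_q[j] * math.exp(-0.5 * (d / sigma) ** 2)
            out[i] = total

    pixels = np.arange(10.0)      # pixel centre positions (arbitrary units)
    clusters = np.array([4.4])    # one electron cluster, nearest to pixel 4
    charges = np.array([1000.0])  # electrons in the cluster
    out = np.zeros_like(pixels)

    threads_per_block = 32
    blocks = (pixels.size + threads_per_block - 1) // threads_per_block
    induce_charge[blocks, threads_per_block](pixels, clusters, charges, 0.5, out)
    print(int(out.argmax()))  # → 4: the pixel nearest the cluster collects the most charge
    ```

    Because each pixel is computed independently, the same kernel scales to the many thousands of channels of a pixelated readout, which is what makes the GPU port pay off.
    
    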
