
    Venous thromboembolism research priorities: A scientific statement from the American Heart Association and the International Society on Thrombosis and Haemostasis

    Venous thromboembolism (VTE) is a major cause of morbidity and mortality. The impact of the Surgeon General’s Call to Action in 2008 has been lower than expected given the public health impact of this disease. This scientific statement highlights future research priorities in VTE, developed by experts and a crowdsourcing survey across 16 scientific organizations. At the fundamental research level (T0), researchers need to identify pathobiologic causative mechanisms for the 50% of patients with unprovoked VTE and better understand mechanisms that differentiate hemostasis from thrombosis. At the human level (T1), new methods for diagnosing, treating, and preventing VTE will allow tailoring of diagnostic and therapeutic approaches to individuals. At the patient level (T2), research efforts are required to understand how foundational evidence impacts care of patients (eg, biomarkers). New treatments, such as catheter-based therapies, require further testing to identify which patients are most likely to experience benefit. At the practice level (T3), translating evidence into practice remains challenging. Areas of overuse and underuse will require evidence-based tools to improve care delivery. At the community and population level (T4), public awareness campaigns need thorough impact assessment. Large population-based cohort studies can elucidate the biologic and environmental underpinnings of VTE and its complications. To achieve these goals, funding agencies and training programs must support a new generation of scientists and clinicians who work in multidisciplinary teams to solve the pressing public health problem of VTE.

    Effect of angiotensin receptor blockade on insulin sensitivity and endothelial function in abdominally obese hypertensive patients with impaired fasting glucose

    AngII (angiotensin II) may contribute to cardiovascular risk in obesity via adverse effects on insulin sensitivity and endothelial function. In the present study, we examined the effects of ARB (angiotensin receptor blocker) therapy (losartan, 100 mg/day) on insulin sensitivity and endothelial function in 53 subjects with stage I hypertension, abdominal obesity and impaired fasting glucose. The study was a randomized, double-blind, parallel-design, placebo-controlled, multi-centre trial of 8 weeks' duration. We used the hyperinsulinaemic-euglycaemic clamp technique to measure insulin sensitivity (expressed as the 'M/I' value) and RH-PAT (reactive hyperaemia-peripheral arterial tonometry) to measure endothelial function. Additional measures included HOMA (homoeostasis model assessment)-B, an index of pancreatic β-cell function, and markers of inflammation [e.g. CRP (C-reactive protein)] and oxidative stress (e.g. F2-isoprostanes). ARB therapy did not alter insulin sensitivity [5.2 (2.7) pre-treatment and 4.6 (1.6) post-treatment] compared with placebo therapy [6.1 (2.9) pre-treatment and 5.3 (2.7) post-treatment; P value not significant], but did improve the HOMA-B compared with placebo therapy (P=0.05). ARB therapy also did not change endothelial function [RH-PAT, 2.15 (0.7) pre-treatment and 2.11 (0.7) post-treatment] compared with placebo therapy [RH-PAT, 1.81 (0.5) pre-treatment and 1.76 (0.7) post-treatment; P value not significant]. Markers of inflammation and oxidative stress were not significantly changed by ARB therapy. In conclusion, ARB therapy did not alter peripheral insulin sensitivity or endothelial function in this cohort of patients with essential hypertension, abdominal obesity and impaired fasting glucose, but did improve pancreatic β-cell function.
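    The abstract does not spell out how the HOMA-B index is computed. As a minimal sketch, assuming the standard Matthews et al. homeostasis model assessment formulas from fasting glucose and fasting insulin (the example input values are illustrative, not taken from the trial):

```python
def homa_b(glucose_mmol_l: float, insulin_uu_ml: float) -> float:
    """Beta-cell function index: HOMA-B = 20 * fasting insulin [uU/mL]
    / (fasting glucose [mmol/L] - 3.5). Valid only for glucose > 3.5."""
    return 20.0 * insulin_uu_ml / (glucose_mmol_l - 3.5)

def homa_ir(glucose_mmol_l: float, insulin_uu_ml: float) -> float:
    """Companion insulin-resistance index: HOMA-IR = glucose * insulin / 22.5."""
    return glucose_mmol_l * insulin_uu_ml / 22.5

# Illustrative subject: impaired fasting glucose (6.0 mmol/L), insulin 10 uU/mL
print(round(homa_b(6.0, 10.0), 1))   # 80.0
print(round(homa_ir(6.0, 10.0), 2))  # 2.67
```

    A higher HOMA-B indicates greater modeled beta-cell secretory function, which is the quantity the trial reports as improved under losartan.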

    Measurement of the cross-section and charge asymmetry of W bosons produced in proton-proton collisions at √s = 8 TeV with the ATLAS detector

    This paper presents measurements of the W⁺ → μ⁺ν and W⁻ → μ⁻ν cross-sections and the associated charge asymmetry as a function of the absolute pseudorapidity of the decay muon. The data were collected in proton-proton collisions at a centre-of-mass energy of 8 TeV with the ATLAS experiment at the LHC and correspond to a total integrated luminosity of 20.2 fb⁻¹. The precision of the cross-section measurements varies between 0.8% and 1.5% as a function of the pseudorapidity, excluding the 1.9% uncertainty on the integrated luminosity. The charge asymmetry is measured with an uncertainty between 0.002 and 0.003. The results are compared with predictions based on next-to-next-to-leading-order calculations with various parton distribution functions and have the sensitivity to discriminate between them.
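    The charge asymmetry quoted above is the standard ratio A = (σ⁺ − σ⁻)/(σ⁺ + σ⁻). A minimal sketch of its computation with uncorrelated error propagation follows; the cross-section values and uncertainties in the example are illustrative placeholders, not the paper's results:

```python
import math

def charge_asymmetry(sigma_plus, sigma_minus, err_plus=0.0, err_minus=0.0):
    """A = (s+ - s-)/(s+ + s-); for uncorrelated errors,
    dA = 2*sqrt((s- * e+)^2 + (s+ * e-)^2) / (s+ + s-)^2."""
    total = sigma_plus + sigma_minus
    a = (sigma_plus - sigma_minus) / total
    da = 2.0 * math.sqrt((sigma_minus * err_plus) ** 2
                         + (sigma_plus * err_minus) ** 2) / total ** 2
    return a, da

# Illustrative numbers only (pb): sigma(W+) > sigma(W-) as expected in pp collisions
a, da = charge_asymmetry(350.0, 250.0, err_plus=3.0, err_minus=2.5)
print(f"A = {a:.3f} +/- {da:.3f}")
```

    Because the luminosity uncertainty is fully correlated between the two cross-sections, it largely cancels in the ratio, which is why the asymmetry can be measured to a few parts per thousand.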

    Search for chargino-neutralino production with mass splittings near the electroweak scale in three-lepton final states in √s=13 TeV pp collisions with the ATLAS detector

    A search for supersymmetry through the pair production of electroweakinos with mass splittings near the electroweak scale and decaying via on-shell W and Z bosons is presented for a three-lepton final state. The analyzed proton-proton collision data taken at a center-of-mass energy of √s = 13 TeV were collected between 2015 and 2018 by the ATLAS experiment at the Large Hadron Collider, corresponding to an integrated luminosity of 139 fb⁻¹. A search, emulating the recursive jigsaw reconstruction technique with easily reproducible laboratory-frame variables, is performed. The two excesses observed in the 2015–2016 data recursive jigsaw analysis in the low-mass three-lepton phase space are reproduced. Results with the full data set are in agreement with the Standard Model expectations. They are interpreted to set exclusion limits at the 95% confidence level on simplified models of chargino-neutralino pair production for masses up to 345 GeV.

    THE COMMUNITY LEVERAGED UNIFIED ENSEMBLE (CLUE) IN THE 2016 NOAA/HAZARDOUS WEATHER TESTBED SPRING FORECASTING EXPERIMENT

    One primary goal of annual Spring Forecasting Experiments (SFEs), which are co-organized by NOAA’s National Severe Storms Laboratory and Storm Prediction Center and conducted in the National Oceanic and Atmospheric Administration’s (NOAA) Hazardous Weather Testbed, is documenting performance characteristics of experimental, convection-allowing modeling systems (CAMs). Since 2007, the number of CAMs (including CAM ensembles) examined in the SFEs has increased dramatically, peaking at six different CAM ensembles in 2015. Meanwhile, major advances have been made in creating, importing, processing, verifying, and developing tools for analyzing and visualizing these large and complex datasets. However, progress toward identifying optimal CAM ensemble configurations has been inhibited because the different CAM systems have been independently designed, making it difficult to attribute differences in performance characteristics. Thus, for the 2016 SFE, a much more coordinated effort among many collaborators was made by agreeing on a set of model specifications (e.g., model version, grid spacing, domain size, and physics) so that the simulations contributed by each collaborator could be combined to form one large, carefully designed ensemble known as the Community Leveraged Unified Ensemble (CLUE). The 2016 CLUE was composed of 65 members contributed by five research institutions and represents an unprecedented effort to enable an evidence-driven decision process to help guide NOAA’s operational modeling efforts. Eight unique experiments were designed within the CLUE framework to examine issues directly relevant to the design of NOAA’s future operational CAM-based ensembles. This article will highlight the CLUE design and present results from one of the experiments examining the impact of single versus multicore CAM ensemble configurations.

    Masonry dams : analysis of the historical profiles of Sazilly, Delocre and Rankine

    The significant advances in masonry dam design that took place in the second half of the 19th century are analyzed and discussed within the context of the historical development of dam construction. Particular reference is made to the gravity dam profiles proposed by Sazilly, Delocre and Rankine, who pioneered the application of engineering concepts to dam design, basing the dam profile on the allowable stresses for the conditions of empty and full reservoir. These historical profiles are analyzed taking into consideration present safety assessment procedures, by means of a numerical application developed for this purpose, based on limit equilibrium methods, which considers sliding failure mechanisms, the most critical for these structures. The study underlines the key role of uplift pressures, which was only addressed by Lévy after the accident of Bouzey dam, and provides a critical understanding of the original design concepts, which is essential for the rehabilitation of these historical structures. This work has been funded by FCT (Portuguese Foundation for Science and Technology) through PhD grant SFRH/BD/43585/2008, for which the first author is grateful.
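    The limit-equilibrium sliding check mentioned above can be sketched in a few lines. This is a generic textbook formulation, not the paper's numerical application, and all input values are illustrative; it does show why uplift, the effect Lévy identified after Bouzey, directly erodes the sliding safety margin:

```python
import math

def sliding_safety_factor(weight, uplift, horizontal_thrust,
                          friction_angle_deg, cohesion=0.0, base_area=0.0):
    """Sliding factor of safety for a gravity dam section (per unit length):
    FS = (c*A + (W - U)*tan(phi)) / H,
    where W is the dam weight, U the uplift resultant on the base,
    H the horizontal water thrust, phi the base friction angle."""
    resisting = cohesion * base_area + (weight - uplift) * math.tan(
        math.radians(friction_angle_deg))
    return resisting / horizontal_thrust

# Illustrative section, forces in kN per metre of dam length, friction only
fs_no_uplift = sliding_safety_factor(2000.0, 0.0, 800.0, 45.0)
fs_uplift = sliding_safety_factor(2000.0, 600.0, 800.0, 45.0)
print(round(fs_no_uplift, 2), round(fs_uplift, 2))  # 2.5 1.75
```

    Ignoring uplift, as the pre-Lévy profiles did, overstates the sliding safety factor here by roughly 40%, which is the kind of deficit a modern reassessment of these historical profiles has to quantify.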