
    Ten-year follow-up results from the Goettingen Risk, Incidence and Prevalence Study (GRIPS), I: risk factors for myocardial infarction in a cohort of 5790 men. Atherosclerosis

    Besides the accepted major risk factors for myocardial infarction (MI), namely cholesterol, hypertension and smoking, several other variables, such as lipoproteins, apolipoproteins, fibrinogen and family history of MI, have been considered, but their usefulness as predictors of MI remains controversial. The Göttingen Risk, Incidence and Prevalence Study (GRIPS) aimed to evaluate the independent impact of the latter in comparison with the established risk factors. GRIPS is a prospective cohort study which included 5790 men, aged 40-59.9 years, without cardiovascular disease at baseline. Multivariate logistic regression models for the estimation of MI risk, based on the 10-year follow-up data from 97.4% of the study participants, established LDL cholesterol as the strongest predictor of MI. It was followed by family history of MI, Lp(a), age, smoking, systolic blood pressure, HDL cholesterol (inversely related) and plasma glucose (P < 0.001). Apolipoprotein B, as well as the ratios total/HDL cholesterol, LDL/HDL cholesterol and Apo B/AI, were less effective predictors than LDL cholesterol and did not contribute independently to the estimation of MI risk. Similarly, apolipoprotein AI was a weaker predictor of MI risk than HDL cholesterol. GRIPS is the first prospective cohort study that clearly justifies the key role of LDL cholesterol in preventive strategies. However, the data also give strong support for the additional consideration of other risk factors for a valid estimation of the MI risk for an individual subject. © 1997 Elsevier Science Ireland Ltd.
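    The kind of multivariate logistic regression risk model described in the abstract can be illustrated with a minimal sketch. All data below are synthetic and the effect sizes invented; the predictor names are placeholders, not the GRIPS dataset or its fitted model:

```python
# Illustrative sketch only: multivariate logistic regression for event
# risk, in the spirit of the GRIPS analysis. Synthetic data throughout.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Hypothetical predictors (units in comments are assumptions):
X = np.column_stack([
    rng.normal(150, 35, n),               # LDL cholesterol (mg/dl)
    rng.normal(50, 12, n),                # HDL cholesterol (mg/dl)
    rng.normal(135, 18, n),               # systolic blood pressure (mmHg)
    rng.uniform(40, 60, n),               # age (years)
    rng.integers(0, 2, n).astype(float),  # smoking status (0/1)
])
# Synthetic outcome: LDL and smoking raise risk, HDL lowers it
true_logit = (-7.0 + 0.02 * X[:, 0] - 0.03 * X[:, 1]
              + 0.01 * X[:, 2] + 0.03 * X[:, 3] + 0.7 * X[:, 4])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# Fit by gradient descent on the (convex) logistic log-loss
Xs = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize for stable steps
A = np.column_stack([np.ones(n), Xs])
w = np.zeros(A.shape[1])
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-A @ w))
    w -= 0.1 * A.T @ (p - y) / n

risk = 1.0 / (1.0 + np.exp(-A @ w))  # estimated per-subject event risk
```

    The sign and relative magnitude of each fitted coefficient is what lets such a study rank predictors (e.g. LDL positively, HDL inversely associated) and report their independent contributions.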

    Prüfung des Frost-Tausalz-Widerstandes von Beton für die Expositionsklasse XF2 (Testing the salt frost scaling resistance of concrete for exposure class XF2)

    In cold and moderate climates, concrete can be subjected to a combined salt frost attack, which can cause scaling damage. Consequently, numerous test procedures were developed to determine the resistance of concrete compositions against this kind of attack. These tests typically mimic a severe attack with high levels of saturation, e.g. as for concrete pavements. Very few approaches exist for testing the salt frost scaling resistance of concretes that are subjected only to medium levels of saturation, as such concrete elements typically do not show notable scaling damage. However, the increasing use of low-carbon cements with high clinker substitution rates might affect the salt frost scaling resistance of such concrete elements to some extent. To ensure adequate durability of such concretes, it is therefore desirable to determine their performance in an actual test procedure instead of relying on past experience. Thus, less severe test methods were developed, based on the Slab test and the CDF test, respectively.

    First narrow-band search for continuous gravitational waves from known pulsars in advanced detector data

    Spinning neutron stars asymmetric with respect to their rotation axis are potential sources of continuous gravitational waves for ground-based interferometric detectors. In the case of known pulsars, a fully coherent search based on matched filtering, which uses the position and rotational parameters obtained from electromagnetic observations, can be carried out. Matched filtering maximizes the signal-to-noise ratio (SNR), but a large sensitivity loss is expected in the case of even a very small mismatch between the assumed and the true signal parameters. For this reason, narrow-band analysis methods have been developed, allowing a fully coherent search for gravitational waves from known pulsars over a fraction of a hertz and several spin-down values. In this paper we describe a narrow-band search of 11 pulsars using data from Advanced LIGO’s first observing run. Although we have found several initial outliers, further studies show no significant evidence for the presence of a gravitational wave signal. Finally, we have placed upper limits on the signal strain amplitude lower than the spin-down limit for 5 of the 11 targets over the bands searched; in the case of J1813-1749 the spin-down limit has been beaten for the first time. For an additional 3 targets, the median upper limit across the search bands is below the spin-down limit. This is the most sensitive narrow-band search for continuous gravitational waves carried out so far.
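    The sensitivity loss from a small parameter mismatch in a coherent search can be demonstrated with a toy sketch. The monochromatic signal model and the numbers below are illustrative assumptions, not the search pipeline itself:

```python
# Toy sketch: coherent matched filtering and the sensitivity loss caused
# by a tiny parameter mismatch over a long coherent integration.
import numpy as np

fs = 1024.0                       # sample rate (Hz), assumed
T = 128.0                         # coherent observation time (s), assumed
t = np.arange(0.0, T, 1.0 / fs)
f_true = 100.0                    # true signal frequency (Hz)
signal = np.cos(2.0 * np.pi * f_true * t)

def overlap(f_template):
    """Normalized overlap (fraction of optimal SNR) with a template."""
    h = np.cos(2.0 * np.pi * f_template * t)
    return abs(np.dot(signal, h)) / (np.linalg.norm(signal) * np.linalg.norm(h))

match = overlap(100.0)       # template exactly on the true parameters
mismatch = overlap(100.01)   # only 10 mHz off, but 128 s of coherence
```

    Even a 10 mHz offset destroys most of the coherent overlap over this integration time, which is why narrow-band methods scan a small band of frequencies and spin-down values around the electromagnetically inferred parameters rather than trusting a single template.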

    A novel category of antigens enabling CTL immunity to tumor escape variants: Cinderella antigens

    Deficiencies in MHC class I antigen presentation are a common feature of tumors and allow escape from cytotoxic T lymphocyte (CTL)-mediated killing. It is crucial to take this capacity of tumors into account for the development of T-cell-based immunotherapy, as it may strongly impair their effectiveness. A variety of escape mechanisms has been described thus far, but progress in counteracting them is poor. Here we review a novel strategy to target malignancies with defects in the antigen processing machinery (APM). The concept is based on a unique category of CD8+ T-cell epitopes that is associated with impaired peptide processing, which we named TEIPP. We characterized this alternative peptide repertoire emerging in MHC-I on tumors lacking classical antigen processing due to defects in the peptide transporter TAP (transporter associated with antigen processing). These TEIPPs exemplify interesting parallels with the folktale figure Cinderella: they are oppressed and neglected by a stepmother (just as functional TAP prevents TEIPP presentation) until the suppression is released and Cinderella/TEIPP achieves unexpected recognition. TEIPP-specific CTLs and their cognate peptide epitopes provide a new strategy to counteract immune evasion by APM defects and bear the potential to target escape variants observed in a wide range of cancers.

    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer volumes of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.
    Peer reviewed

    First measurement of the Hubble Constant from a Dark Standard Siren using the Dark Energy Survey Galaxies and the LIGO/Virgo Binary–Black-hole Merger GW170814

    We present a multi-messenger measurement of the Hubble constant H0 using the binary black-hole merger GW170814 as a standard siren, combined with a photometric redshift catalog from the Dark Energy Survey (DES). The luminosity distance is obtained from the gravitational wave signal detected by the Laser Interferometer Gravitational-Wave Observatory (LIGO)/Virgo Collaboration (LVC) on 2017 August 14, and the redshift information is provided by the DES Year 3 data. Black hole mergers such as GW170814 are expected to lack bright electromagnetic emission to uniquely identify their host galaxies and build an object-by-object Hubble diagram. However, they are suitable for a statistical measurement, provided that a galaxy catalog of adequate depth and redshift completeness is available. Here we present the first Hubble parameter measurement using a black hole merger. The resulting estimate of H0 is consistent with both SN Ia and cosmic microwave background measurements of the Hubble constant. The quoted 68% credible region comprises 60% of the uniform prior range [20, 140] km s−1 Mpc−1, and it depends on the assumed prior range. With a broader prior of [10, 220] km s−1 Mpc−1, the 68% credible region comprises 57% of the prior range. Although a weak constraint on the Hubble constant from a single event is expected using the dark siren method, a multifold increase in the LVC event rate is anticipated in the coming years, and combinations of many sirens will lead to improved constraints on H0.
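    The way a broad single-event posterior is summarized as a credible region relative to a uniform prior range can be sketched numerically. The likelihood shape and width below are invented for illustration and are not the DES/LVC measurement:

```python
# Toy sketch: 68% highest-density credible region of an H0 posterior,
# expressed as a fraction of the uniform prior range.
import numpy as np

H0 = np.linspace(20.0, 140.0, 2401)              # uniform prior range (km/s/Mpc)
dx = H0[1] - H0[0]
like = np.exp(-0.5 * ((H0 - 75.0) / 30.0) ** 2)  # assumed broad likelihood
post = like / (like.sum() * dx)                  # normalized posterior on the grid

# Keep grid points in order of decreasing posterior density
# until 68% of the probability is enclosed
order = np.argsort(post)[::-1]
cdf = np.cumsum(post[order]) * dx
k = np.searchsorted(cdf, 0.68) + 1
width = k * dx                                   # credible-region width
fraction = width / (H0[-1] - H0[0])              # fraction of the prior range
```

    A fraction close to 1 would mean the event barely constrains H0 beyond the prior; the smaller the fraction, the more informative the siren, and combining many events shrinks it further.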

    Differential cross section measurements for the production of a W boson in association with jets in proton–proton collisions at √s = 7 TeV

    Measurements are reported of differential cross sections for the production of a W boson, which decays into a muon and a neutrino, in association with jets, as a function of several variables, including the transverse momenta (pT) and pseudorapidities of the four leading jets, the scalar sum of jet transverse momenta (HT), and the difference in azimuthal angle between the directions of each jet and the muon. The data sample of pp collisions at a centre-of-mass energy of 7 TeV was collected with the CMS detector at the LHC and corresponds to an integrated luminosity of 5.0 fb−1. The measured cross sections are compared to predictions from Monte Carlo generators, MadGraph + pythia and sherpa, and to next-to-leading-order calculations from BlackHat + sherpa. The differential cross sections are found to be in agreement with the predictions, apart from the pT distributions of the leading jets at high pT values, the distributions of HT at high HT and low jet multiplicity, and the distribution of the difference in azimuthal angle between the leading jet and the muon at low values.
    United States Department of Energy; National Science Foundation (U.S.); Alfred P. Sloan Foundation

    Search for dark matter produced in association with bottom or top quarks in √s = 13 TeV pp collisions with the ATLAS detector

    A search for weakly interacting massive particle dark matter produced in association with bottom or top quarks is presented. Final states containing third-generation quarks and missing transverse momentum are considered. The analysis uses 36.1 fb−1 of proton–proton collision data recorded by the ATLAS experiment at √s = 13 TeV in 2015 and 2016. No significant excess of events above the estimated backgrounds is observed. The results are interpreted in the framework of simplified models of spin-0 dark-matter mediators. For colour-neutral spin-0 mediators produced in association with top quarks and decaying into a pair of dark-matter particles, mediator masses below 50 GeV are excluded assuming a dark-matter candidate mass of 1 GeV and unitary couplings. For scalar and pseudoscalar mediators produced in association with bottom quarks, the search sets limits on the production cross-section of 300 times the predicted rate for mediators with masses between 10 and 50 GeV and assuming a dark-matter mass of 1 GeV and unitary coupling. Constraints on colour-charged scalar simplified models are also presented. Assuming a dark-matter particle mass of 35 GeV, mediator particles with mass below 1.1 TeV are excluded for couplings yielding a dark-matter relic density consistent with measurements.

    Juxtaposing BTE and ATE – on the role of the European insurance industry in funding civil litigation

    One of the ways in which legal services are financed, and indeed shaped, is through private insurance arrangements. Two contrasting types of legal expenses insurance (LEI) contracts seem to dominate in Europe: before-the-event (BTE) and after-the-event (ATE) legal expenses insurance. Notwithstanding institutional differences between legal systems, BTE and ATE insurance arrangements may be instrumental if government policy is geared towards strengthening a market-oriented system of financing access to justice for individuals and businesses. At the same time, emphasizing the role of a private industry as a keeper of the gates to justice raises issues of accountability and transparency that are not readily reconcilable with demands of competition. Moreover, multiple actors (clients, lawyers, courts, insurers) are involved, causing behavioural dynamics that are not easily predicted or influenced. Against this background, this paper looks into BTE and ATE arrangements by analysing the particularities of those currently available in some European jurisdictions and by painting a picture of their respective markets and legal contexts. This allows for some reflection on the performance of BTE and ATE providers as both financiers and keepers. Two issues emerge from the analysis that are worthy of further reflection. Firstly, there is the problematic long-term sustainability of some ATE products. Secondly, there are the challenges faced by policymakers who would like to nudge consumers into voluntarily taking out BTE LEI.