
    Sliding charge density wave in manganites

    The so-called stripe phase of the manganites is an important example of the complex behaviour of metal oxides, and has long been interpreted as the localisation of charge at atomic sites. Here, we demonstrate via resistance measurements on La_{0.50}Ca_{0.50}MnO_3 that this state is in fact a prototypical charge density wave (CDW) which undergoes collective transport. Dramatic resistance hysteresis effects and broadband noise properties are observed, both of which are typical of sliding CDW systems. Moreover, the high levels of disorder typical of manganites result in behaviour similar to that of well-known disordered CDW materials. Our discovery that the manganite superstructure is a CDW shows that unusual transport and structural properties do not require exotic physics, but can emerge when a well-understood phase (the CDW) coexists with disorder. (Comment: 13 pages, 4 figures.)

    On the Interface Formation Model for Dynamic Triple Lines

    This paper revisits the theory of Y. Shikhmurzaev on forming interfaces as a continuum thermodynamical model for dynamic triple lines. We start with the derivation of the balances for mass, momentum, energy and entropy in a three-phase fluid system with full interfacial physics, including a brief review of the relevant transport theorems on interfaces and triple lines. Employing the entropy principle in the form given in [Bothe & Dreyer, Acta Mechanica, doi:10.1007/s00707-014-1275-1], extended to this more general case, we derive the entropy production and perform a linear closure, except for a nonlinear closure for the sorption processes. Specialized to the isothermal case, we obtain a thermodynamically consistent mathematical model for dynamic triple lines and show that the total available energy is a strict Lyapunov function for this system.
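    In schematic form only (placeholder notation, not the paper's own symbols), the two structural properties obtained from the closure are a non-negative entropy production and a decaying available energy:

```latex
% Schematic only: \zeta denotes the entropy production and E_{\mathrm{avail}}
% the total available energy of the isothermal three-phase system.
\[
  \zeta \;\ge\; 0 ,
  \qquad
  \frac{\mathrm{d}}{\mathrm{d}t}\, E_{\mathrm{avail}}(t) \;\le\; 0 ,
\]
\[
  \frac{\mathrm{d}}{\mathrm{d}t}\, E_{\mathrm{avail}}(t) = 0
  \quad\Longleftrightarrow\quad \text{the system is in equilibrium},
\]
% i.e. E_{\mathrm{avail}} acts as a strict Lyapunov function for the closed model.
```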

    Looking inside the black box: a theory-based process evaluation alongside a randomised controlled trial of printed educational materials (the Ontario printed educational message, OPEM) to improve referral and prescribing practices in primary care in Ontario, Canada

    Background: Randomised controlled trials of implementation strategies tell us whether (or not) an intervention results in changes in professional behaviour, but little about the causal mechanisms that produce any change. Theory-based process evaluations collect data on theoretical constructs alongside randomised trials to explore possible causal mechanisms and effect modifiers. This is similar to measuring intermediate endpoints in clinical trials to further understand the biological basis of any observed effects (for example, measuring lipid profiles alongside trials of lipid-lowering drugs where the primary endpoint could be reduction in vascular-related deaths). This study protocol describes a theory-based process evaluation alongside the Ontario Printed Educational Message (OPEM) trial. We hypothesize that the OPEM interventions are most likely to operate through changes in physicians' behavioural intentions due to improved attitudes or subjective norms, with little or no change in perceived behavioural control. We will test this hypothesis using a well-validated social cognition model, the theory of planned behaviour (TPB), which incorporates these constructs. Methods/design: We will develop theory-based surveys using standard methods based upon the TPB for the second and third replications, and survey a subsample of Ontario family physicians from each arm of the trial two months before and six months after the dissemination of the index edition of informed, the evidence-based newsletter used for the interventions. In the third replication, our study will converge with the "TRY-ME" protocol (a second study conducted alongside the OPEM trial), in which the content of educational messages was constructed using both standard methods and methods informed by psychological theory. We will modify Dillman's total design method to maximise response rates. Preliminary analyses will assess the internal reliability of the measures and use regression to explore the relationships between the predictors and the dependent variables (intention to advise diabetic patients to have annual retinopathy screening and to prescribe thiazide diuretics for first-line treatment of uncomplicated hypertension); an illustrative sketch of this regression step follows the abstract. We will then compare groups using methods appropriate for comparing independent samples to determine whether there have been changes in the predicted constructs (attitudes, subjective norms, or intentions) across the study groups as hypothesised, and will assess the convergence between the process evaluation results and the main trial results. The OPEM trial and OPEM process evaluation are funded by the Canadian Institutes of Health Research (CIHR). The OPEM process evaluation study was developed as part of the CIHR-funded interdisciplinary capacity enhancement team KT-ICEBeRG. Gaston Godin, Jeremy Grimshaw and France Légaré hold Canada Research Chairs. Louise Lemyre holds an R.S. McLaughlin Research Chair.
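    Purely as an illustration of the regression step described in this protocol (not the OPEM analysis plan itself), the sketch below regresses a TPB intention score on attitude, subjective norm and perceived behavioural control scores; the variable names and the synthetic data are assumptions.

```python
# Illustrative TPB regression: intention ~ attitude + subjective norm + PBC,
# fitted on synthetic survey scores (all names and data are assumptions).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 150
df = pd.DataFrame({
    "attitude": rng.normal(0, 1, n),
    "subjective_norm": rng.normal(0, 1, n),
    "pbc": rng.normal(0, 1, n),            # perceived behavioural control
})
# hypothesised pattern: attitudes and norms drive intention, PBC adds little
df["intention"] = (0.6 * df["attitude"] + 0.4 * df["subjective_norm"]
                   + rng.normal(0, 1, n))

model = smf.ols("intention ~ attitude + subjective_norm + pbc", data=df).fit()
print(model.summary())   # coefficients map onto the TPB constructs
```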

    Calculating confidence intervals for impact numbers

    BACKGROUND: Standard effect measures such as risk difference and attributable risk are frequently used in epidemiological studies and public health research to describe the effect of exposures. Recently, so-called impact numbers have been proposed, which express the population impact of exposures in the form of specific person or case numbers. To describe estimation uncertainty, it is necessary to calculate confidence intervals for these new effect measures. In this paper, we present methods to calculate confidence intervals for the new impact numbers in the situation of cohort studies. METHODS: Besides the exposure impact number (EIN), which is equivalent to the well-known number needed to treat (NNT), two other impact numbers are considered: the case impact number (CIN) and the exposed cases impact number (ECIN), which describe the number of cases (CIN) and the number of exposed cases (ECIN) with an outcome among whom one case is attributable to the exposure. The CIN and ECIN are the reciprocals of the population attributable risk (PAR) and the attributable fraction among the exposed (AF(e)), respectively. Thus, confidence intervals for these impact numbers can be calculated by inverting and exchanging the confidence limits of the PAR and AF(e). EXAMPLES: We considered a British and a Japanese cohort study that investigated the association between smoking and death from coronary heart disease (CHD) and between smoking and stroke, respectively. We used the reported death and disease rates and calculated impact numbers with corresponding 95% confidence intervals. In the British study, the CIN was 6.46, i.e. on average, of any 6 to 7 persons who died of CHD, one case was attributable to smoking, with a corresponding 95% confidence interval of [3.84, 20.36]. For the exposed cases, we obtained ECIN = 2.64 with 95% confidence interval [1.76, 5.29]. In the Japanese study, the CIN was 6.67, i.e. on average, of any 6 to 7 persons who had a stroke, one case was attributable to smoking, with a corresponding 95% confidence interval of [3.80, 27.27]. For the exposed cases, we obtained ECIN = 4.89 with 95% confidence interval of [2.86, 16.67]. CONCLUSION: The consideration of impact numbers in epidemiological analyses provides additional information and aids the interpretation of study results, e.g. in public health research. In practical applications, it is necessary to describe estimation uncertainty. We have shown that the calculation of confidence intervals for the new impact numbers is possible by means of known methods for attributable risk measures. Therefore, estimated impact numbers should always be complemented by appropriate confidence intervals.
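    As a minimal illustration of the inversion rule described in this abstract, the sketch below turns a PAR (or AF(e)) estimate and its confidence limits into the corresponding impact number and interval. The helper name and the numerical inputs are illustrative assumptions, not values from the two cohort studies.

```python
# Minimal sketch of the inversion rule: CIN = 1/PAR, ECIN = 1/AF_e, and the
# confidence limits are obtained by inverting and exchanging the limits of the
# underlying attributable-risk measure. Inputs below are illustrative only.
def impact_number_ci(estimate, lower, upper):
    """Invert a PAR or AF_e point estimate and its confidence limits."""
    return 1.0 / estimate, (1.0 / upper, 1.0 / lower)

# e.g. a hypothetical PAR of 0.15 with 95% CI [0.05, 0.26]
cin, (cin_lo, cin_hi) = impact_number_ci(0.15, 0.05, 0.26)
print(f"CIN = {cin:.2f}, 95% CI [{cin_lo:.2f}, {cin_hi:.2f}]")
```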

    Results and harmonization guidelines from two large-scale international Elispot proficiency panels conducted by the Cancer Vaccine Consortium (CVC/SVI)

    The Cancer Vaccine Consortium of the Sabin Vaccine Institute (CVC/SVI) is conducting an ongoing large-scale immune monitoring harmonization program through its members and affiliated associations. This effort was brought to life as an external validation program by conducting an international Elispot proficiency panel with 36 laboratories in 2005, followed by a second panel with 29 participating laboratories in 2006, which allowed the lessons from the first panel to be applied. Critical protocol choices, as well as standardization and validation practices among laboratories, were assessed through detailed surveys. Although panel participants had to follow general guidelines in order to allow comparison of results, each laboratory was able to use its own protocols, materials and reagents. The second panel recorded an overall significantly improved performance, as measured by the ability to detect all predefined responses correctly. Protocol choices and laboratory practices that can have a dramatic effect on the overall assay outcome were identified and led to the following recommendations: (A) Establish a laboratory SOP for Elispot testing procedures, including (A1) a counting method for apoptotic cells for determining adequate cell dilution for plating, and (A2) overnight rest of cells prior to plating and incubation; (B) Use only pre-tested serum optimized for a low-background, high-signal ratio; (C) Establish a laboratory SOP for plate reading, including (C1) human auditing during the reading process and (C2) adequate adjustments for technical artifacts; and (D) Only allow trained personnel who are certified per laboratory SOPs to conduct assays. The recommendations described under (A) were found to make a statistically significant difference in assay performance, while the remaining recommendations are based on practical experiences confirmed by the panel results, which could not be statistically tested. These results provide the immunotherapy community with initial harmonization guidelines to optimize Elispot assay performance. Further optimization is in progress with ongoing panels.

    A common missense variant in BRCA2 predisposes to early onset breast cancer

    INTRODUCTION: Mutations in the BRCA2 gene are one of the two major causes of hereditary breast cancer. Protein-truncating mutations of BRCA2 are usually deleterious and increase the risk of breast cancer up to 80% over a lifetime. A few missense mutations in BRCA2 are believed to have a similarly high penetrance, apart from more common neutral polymorphisms. It is often difficult to classify a particular sequence variant as a mutation or a polymorphism. For a deleterious variant, one would expect a greater allele frequency in breast cancer cases than in ethnically matched controls. In contrast, neutral polymorphic variants should be equally frequent in the two groups. METHODS: We genotyped 3,241 cases of breast cancer diagnosed at under 51 years of age, unselected for family history, from 18 hospitals throughout Poland, and 2,791 ethnically matched controls for a single BRCA2 C5972T variant. RESULTS: The variant was present in approximately 6% of the Polish population. In the study, 13 women were homozygous for the variant allele (11 cases and two controls; OR = 4.7; p = 0.02). The overall odds ratio for breast cancer in women with a single copy of the BRCA2 C5972T variant was 1.1 (p = 0.7); however, the effect was significant for patients diagnosed at or before age 40 (OR = 1.4; p = 0.04). We examined the association of the BRCA2 variant with different histologic subgroups and found the effect most pronounced in women who had ductal carcinoma in situ (DCIS) with micro-invasion (OR = 2.8; p < 0.0001). CONCLUSION: The BRCA2 C5972T allele is a common variant in Poland that increases the risk of DCIS with micro-invasion. The homozygous state is rare but increases the risk of breast cancer five-fold.
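    As a quick arithmetic check of the homozygote result, the sketch below rebuilds the 2x2 table under the assumption that the 11 and 2 homozygous carriers are counted against the full 3,241 cases and 2,791 controls (the denominators are not stated explicitly in the abstract), and recovers an odds ratio close to the reported 4.7.

```python
# Rebuild the homozygote 2x2 table, assuming the denominators are the full
# case and control groups reported in the abstract (an illustrative assumption).
from scipy.stats import fisher_exact

table = [[11, 3241 - 11],   # homozygous vs. non-homozygous cases
         [2, 2791 - 2]]     # homozygous vs. non-homozygous controls
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.1f}, p = {p_value:.3f}")  # OR comes out near the reported 4.7
```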

    Do changes in traditional coronary heart disease risk factors over time explain the association between socio-economic status and coronary heart disease?

    Background: Socioeconomic status (SES) predicts coronary heart disease independently of the traditional risk factors included in the Framingham risk score. However, it is unknown whether changes in Framingham risk score variables over time explain the association between SES and coronary heart disease. We examined this question given its relevance to risk assessment in clinical decision making. Methods: The Atherosclerosis Risk in Communities study data (initiated in 1987 with 10 years of follow-up of 15,495 adults aged 45-64 years in four Southern and Mid-Western communities) were used. SES was assessed at baseline and dichotomized as low SES (defined as low education and/or low income) or not. The time-dependent variables - smoking, total and high-density lipoprotein cholesterol, systolic blood pressure and use of blood pressure lowering medication - were assessed every three years. Ten-year incidence of coronary heart disease was based on EKG and cardiac enzyme criteria, or adjudicated death certificate data. Cox survival analyses examined the contribution of SES to heart disease risk independent of baseline Framingham risk score, without and with further adjustment for the time-dependent variables (see the illustrative sketch below). Results: Adjusting for baseline Framingham risk score, low SES was associated with an increased coronary heart disease risk (hazard ratio [HR] = 1.53; 95% confidence interval [CI], 1.27 to 1.85). After further adjustment for the time-dependent variables, the SES effect remained significant (HR = 1.44; 95% CI, 1.19 to 1.74). Conclusion: Using the Framingham risk score alone underestimated the coronary heart disease risk in low-SES persons. This bias was not eliminated by subsequent changes in Framingham risk score variables.
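    The following is a minimal sketch, not the ARIC analysis itself, of a Cox model with a fixed baseline exposure and a time-dependent covariate, fitted with the lifelines library on synthetic data; the column names, the single time-varying covariate and the data-generating assumptions are all illustrative.

```python
# Minimal sketch of a Cox model with time-dependent covariates (lifelines),
# on synthetic long-format data: one row per subject per assessment interval.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(0)
rows = []
for i in range(300):
    low_ses = int(rng.integers(0, 2))                  # baseline exposure, fixed over time
    for start in (0, 3, 6):                            # re-assessment every three years
        sbp = 120 + 10 * low_ses + rng.normal(0, 12)   # time-dependent covariate
        hazard = 0.03 + 0.02 * low_ses + 0.001 * max(sbp - 120, 0)
        event = int(rng.random() < hazard)
        rows.append(dict(id=i, start=start, stop=start + 3,
                         low_ses=low_ses, sbp=sbp, event=event))
        if event:                                      # stop follow-up after an event
            break

df = pd.DataFrame(rows)
ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # compare the low_ses hazard ratio with and without 'sbp'
```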

    Asteroseismology

    Asteroseismology is the determination of the interior structures of stars by using their oscillations as seismic waves. Simple explanations of the astrophysical background and some basic theoretical considerations needed in this rapidly evolving field are followed by introductions to the most important concepts and methods on the basis of examples. Previous and potential applications of asteroseismology are reviewed, and an attempt is made to foresee future trends. (Comment: 38 pages, 13 figures, to appear in: "Planets, Stars and Stellar Systems", eds. T. D. Oswalt et al., Springer Verlag.)