
    High-mobility solution-processed copper phthalocyanine-based organic field-effect transistors

    © 2011 National Institute for Materials Science. Solution-processed films of 1,4,8,11,15,18,22,25-octakis(hexyl) copper phthalocyanine (CuPc6) were utilized as the active semiconducting layer in organic field-effect transistors (OFETs) fabricated in the bottom-gate configuration, with chemical-vapour-deposited silicon dioxide (SiO2) as the gate dielectric. Treating the gate-dielectric surface with a self-assembled monolayer of octadecyltrichlorosilane (OTS) yielded a saturation mobility of 4×10^-2 cm^2 V^-1 s^-1 and an on/off current ratio of 10^6. This improvement was accompanied by a shift in the threshold voltage from 3 V for untreated devices to −2 V for OTS-treated devices. The trap density at the interface between the gate dielectric and the semiconductor decreased by about one order of magnitude after the surface treatment. Transistors with OTS-treated gate dielectrics were also more stable in air over a 30-day period than untreated ones. Funding: Technology Strategy Board, UK (Project No: TP/6/EPH/6/S/K2536J).
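    The saturation mobility and threshold voltage quoted above are conventionally extracted from the square-law transfer characteristic I_D = (W C_i mu_sat / 2L)(V_GS − V_T)^2 by fitting the linear portion of sqrt(I_D) versus V_GS. The sketch below shows that extraction in outline; the channel geometry, dielectric capacitance and transfer-curve data are hypothetical illustration values, not the devices reported in the paper.

        import numpy as np

        # Hypothetical device parameters (illustrative only, not the paper's values)
        W = 1e-3      # channel width, m
        L = 20e-6     # channel length, m
        C_i = 1.1e-4  # gate-dielectric capacitance per unit area, F/m^2

        # Example saturation-regime transfer data: |I_D| (A) versus V_GS (V), p-type device
        V_GS = np.array([-20.0, -16.0, -12.0, -8.0, -4.0, 0.0])
        I_D = np.array([4.0e-6, 2.6e-6, 1.5e-6, 6.5e-7, 1.7e-7, 5.0e-9])

        # Square law: I_D = (W*C_i*mu_sat/(2*L)) * (V_GS - V_T)^2, so sqrt(I_D) is
        # linear in V_GS; the slope magnitude equals sqrt(W*C_i*mu_sat/(2*L)).
        slope, intercept = np.polyfit(V_GS, np.sqrt(I_D), 1)

        mu_sat = 2 * L * slope**2 / (W * C_i)   # saturation mobility, m^2 V^-1 s^-1
        V_T = -intercept / slope                # threshold voltage, V

        print(f"mu_sat = {mu_sat * 1e4:.2e} cm^2 V^-1 s^-1, V_T = {V_T:.1f} V")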

    Analysis of time-to-event for observational studies: Guidance to the use of intensity models

    This paper provides guidance for researchers with some mathematical background on the conduct of time-to-event analysis in observational studies based on intensity (hazard) models. Discussions of basic concepts such as the time axis, event definition and censoring are given. Hazard models are introduced, with special emphasis on the Cox proportional hazards regression model. We provide checklists that may be useful both when fitting the model and assessing its goodness of fit, and when interpreting the results. Special attention is paid to how to avoid problems with immortal time bias by introducing time-dependent covariates. We discuss prediction based on hazard models and the difficulties that arise when attempting to draw proper causal conclusions from such models. Finally, we present a series of examples where the methods and checklists are exemplified. Computational details and implementation using the freely available R software are documented in the Supplementary Material. The paper was prepared as part of the STRATOS initiative. Comment: 28 pages, 12 figures. For associated Supplementary Material, see http://publicifsv.sund.ku.dk/~pka/STRATOSTG8
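    As background for the Cox proportional hazards model emphasized above, the standard form of the model (a textbook formulation, not an equation quoted from the paper) writes the hazard for a subject with possibly time-dependent covariate vector x(t) as

        \lambda(t \mid x(t)) = \lambda_0(t)\,\exp\{\beta^\top x(t)\},

    where \lambda_0(t) is an unspecified baseline hazard and the regression coefficients \beta are estimated by maximizing Cox's partial likelihood; allowing x(t) to change during follow-up (e.g. exposure status switching on at the time it actually starts) is what avoids immortal time bias.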

    Common Genetic Variant Association with Altered HLA Expression, Synergy with Pyrethroid Exposure, and Risk for Parkinson's Disease: An Observational and Case-Control Study.

    Background/Objectives: The common non-coding single nucleotide polymorphism (SNP) rs3129882 in HLA-DRA is associated with risk for idiopathic Parkinson's disease (PD). The location of the SNP in the major histocompatibility complex class II (MHC-II) locus implicates regulation of antigen presentation as a potential mechanism by which immune responses link genetic susceptibility to environmental factors in conferring lifetime risk for PD. Methods: For immunophenotyping, blood cells from 81 subjects were analyzed by qRT-PCR and flow cytometry. A case-control study was performed on a separate cohort of 962 subjects to determine the association of pesticide exposure and the SNP with risk of PD. Results: Homozygosity for G at this SNP was associated with heightened baseline expression and inducibility of MHC class II molecules in B cells and monocytes from the peripheral blood of healthy controls and PD patients. In addition, exposure to a commonly used class of insecticide, pyrethroids, synergized with the risk conferred by this SNP (OR = 2.48, p = 0.007), identifying a novel gene-environment interaction that promotes risk for PD via alterations in immune responses. Conclusions: In sum, these findings suggest that the MHC-II locus may increase susceptibility to PD through presentation of pathogenic, immunodominant antigens and/or a shift toward a more pro-inflammatory CD4+ T cell response following specific environmental exposures, such as pyrethroids, through genetic or epigenetic mechanisms that modulate MHC-II gene expression.

    Improved maximum likelihood estimators in a heteroskedastic errors-in-variables model

    This paper develops a bias correction scheme for a multivariate heteroskedastic errors-in-variables model. The applicability of this model is justified in areas such as astrophysics, epidemiology and analytical chemistry, where the variables are subject to measurement errors and the variances vary with the observations. We conduct Monte Carlo simulations to investigate the performance of the corrected estimators. The numerical results show that the bias correction scheme yields nearly unbiased estimates. We also give an application to a real data set. Comment: 12 pages. Statistical Paper
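    The abstract does not give the correction formulas, but the basic problem it addresses can be illustrated with a toy Monte Carlo: classical measurement error in a regressor attenuates the naive least-squares slope, and rescaling by the reliability ratio (assumed known here) removes most of the bias. The sketch below is a simplified univariate, homoskedastic illustration, not the authors' multivariate heteroskedastic scheme.

        import numpy as np

        rng = np.random.default_rng(0)
        beta_true = 2.0
        sigma_x, sigma_u, sigma_eps = 1.0, 0.5, 0.3   # true x, measurement error, model error
        n, n_rep = 200, 2000

        naive, corrected = [], []
        for _ in range(n_rep):
            x = rng.normal(0.0, sigma_x, n)           # unobserved true regressor
            w = x + rng.normal(0.0, sigma_u, n)       # observed, error-contaminated regressor
            y = beta_true * x + rng.normal(0.0, sigma_eps, n)

            b_naive = np.cov(w, y)[0, 1] / np.var(w, ddof=1)    # attenuated OLS slope
            reliability = sigma_x**2 / (sigma_x**2 + sigma_u**2)
            naive.append(b_naive)
            corrected.append(b_naive / reliability)             # classical correction

        print(f"true slope     : {beta_true}")
        print(f"naive mean     : {np.mean(naive):.3f}")      # ~ beta_true * reliability = 1.6
        print(f"corrected mean : {np.mean(corrected):.3f}")  # ~ beta_true = 2.0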

    Ocean and land forcing of the record-breaking Dust Bowl heat waves across central United States

    The severe drought of the 1930s Dust Bowl decade coincided with record-breaking summer heatwaves that contributed to the socioeconomic and ecological disaster over North America's Great Plains. It remains unresolved to what extent these exceptional heatwaves, hotter than in historically forced coupled climate model simulations, were forced by sea surface temperatures (SSTs) and exacerbated through human-induced deterioration of land cover. Here we show, using an atmosphere-only model, that anomalously warm North Atlantic SSTs enhance heatwave activity through an association with drier spring conditions resulting from weaker moisture transport. Model devegetation simulations, which represent the widespread exposure of bare soil in the 1930s, suggest that human activity fueled stronger and more frequent heatwaves through greater evaporative drying in the warmer months. This study highlights the potential for naturally occurring extreme events such as droughts to be amplified by vegetation feedbacks, creating more extreme heatwaves in a warmer world.

    Present day greenhouse gases could cause more frequent and longer Dust Bowl heatwaves

    Substantial warming occurred across North America, Europe and the Arctic over the early twentieth century [1], including an increase in global drought [2], and was partially forced by rising greenhouse gases (GHGs) [3]. The period included the 1930s Dust Bowl drought [4-7] across North America's Great Plains, which caused widespread crop failures [4,8], large dust storms [9] and considerable out-migration [10]. This coincided with the central United States experiencing its hottest summers of the twentieth century [11,12] in 1934 and 1936, with over 40 heatwave days and maximum temperatures surpassing 44 °C at some locations [13,14]. Here we use a large-ensemble regional modelling framework to show that GHG increases caused slightly enhanced heatwave activity over the eastern United States during 1934 and 1936. Instead of asking how a present-day heatwave would behave in a world without climate warming, we ask how these 1930s heatwaves would behave with present-day GHGs. Heatwave activity in similarly rare events would be much larger under today's atmospheric GHG forcing, and the return period of a 1-in-100-year heatwave summer (as observed in 1936) would be reduced to about 1 in 40 years. A key driver of the increased heatwave activity and intensity is reduced evaporative cooling and increased sensible heating during dry springs and summers.
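    As a point of reference for the return-period statement above (standard definitions, not formulas taken from the paper): a 1-in-T-year event has annual exceedance probability p = 1/T, so

        T = \frac{1}{p}, \qquad p_{1/100} = 0.01 \;\rightarrow\; p_{1/40} = 0.025,

    i.e. the chance of such a heatwave summer in any given year rises from about 1% to about 2.5%, and the chance of experiencing at least one in a 30-year span, 1 - (1 - p)^{30}, roughly doubles from about 26% to about 53%.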

    A review of RCTs in four medical journals to assess the use of imputation to overcome missing data in quality of life outcomes

    Background: Randomised controlled trials (RCTs) are perceived as the gold-standard method for evaluating healthcare interventions, and increasingly include quality of life (QoL) measures. The observed results are susceptible to bias if a substantial proportion of outcome data are missing. The review aimed to determine whether imputation was used to deal with missing QoL outcomes. Methods: A random selection of 285 RCTs published during 2005/6 in the British Medical Journal, Lancet, New England Journal of Medicine and Journal of the American Medical Association was identified. Results: QoL outcomes were reported in 61 (21%) trials. Six (10%) reported having no missing data, 20 (33%) reported ≤10% missing, eleven (18%) reported 11%–20% missing, and eleven (18%) reported >20% missing. Missingness was unclear in 13 (21%). Missing data were imputed in 19 (31%) of the 61 trials. Imputation was part of the primary analysis in 13 trials, but a sensitivity analysis in six. Last value carried forward was used in 12 trials and multiple imputation in two. Following imputation, the most common analysis method was analysis of covariance (10 trials). Conclusion: The majority of studies did not impute missing data and carried out a complete-case analysis. For those studies that did impute missing data, researchers tended to prefer simpler methods of imputation, despite more sophisticated methods being available. Funding: The Health Services Research Unit is funded by the Chief Scientist Office of the Scottish Government Health Directorate. Shona Fielding is also currently funded by the Chief Scientist Office on a Research Training Fellowship (CZF/1/31).
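    To make the two imputation approaches named above concrete, the sketch below contrasts last value carried forward (propagating a participant's most recent observed QoL score to later visits) with a multiple-imputation-style approach that draws several completed datasets from a chained-equations model. The data frame, column names and scores are hypothetical, and this is not how any of the reviewed trials implemented their analyses.

        import numpy as np
        import pandas as pd
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer

        # Hypothetical QoL scores at three visits (NaN = missing outcome)
        df = pd.DataFrame({
            "qol_baseline": [55.0, 62.0, 48.0, 70.0, 58.0],
            "qol_month3":   [60.0, np.nan, 50.0, 72.0, np.nan],
            "qol_month6":   [np.nan, np.nan, 53.0, np.nan, 61.0],
        })

        # 1) Last value carried forward: a single, simple imputation that copies each
        #    participant's latest observed score forward across visits.
        locf = df.ffill(axis=1)

        # 2) Multiple-imputation-style approach: draw several completed datasets from a
        #    chained-equations model and analyse each one, then pool the results.
        completed = [
            pd.DataFrame(
                IterativeImputer(sample_posterior=True, random_state=m).fit_transform(df),
                columns=df.columns,
            )
            for m in range(5)
        ]

        print("LOCF-completed data:\n", locf)
        print("Mean 6-month QoL across 5 imputations:",
              round(np.mean([d["qol_month6"].mean() for d in completed]), 1))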
