3,617 research outputs found

    Systematic review and meta-analysis of the sero-epidemiological association between Epstein-Barr virus and rheumatoid arthritis

    Acknowledgements The authors would like to thank Cynthia Fraser for helping run the literature search, Dr Neil Basu for providing advice on search terms for rheumatoid arthritis, and Xueli Jia, Katie Bannister and Kubra Boza for their help with foreign language papers. The authors would also like to thank the University of Aberdeen librarians at the Foresterhill medical library for their help in locating articles used for this systematic review and meta-analysis.

    EC1567 You can Control Corn Rootworms

    Extension Circular 1567 discusses how you can control corn rootworms.

    Matrix isolation as a tool for studying interstellar chemical reactions

    Since the identification of the OH radical as an interstellar species, over 50 molecular species have been identified as interstellar denizens. While the identification of new species appears straightforward, an explanation of their mechanisms of formation is not. Most astronomers concede that large bodies like interstellar dust grains are necessary to adsorb molecules and accommodate their energies of reaction, but many of the mechanistic steps are unknown and speculative. It is proposed that data from matrix isolation experiments involving the reactions of refractory materials (especially C, Si, and Fe atoms and clusters) with small molecules (mainly H2, H2O, CO, CO2) are particularly applicable to explaining the mechanistic details of likely interstellar chemical reactions. In many cases, matrix isolation techniques are the sole method of studying such reactions; also in many cases, complexations and bond rearrangements yield molecules never before observed. The study of these reactions thus provides a logical basis for the mechanisms of interstellar reactions. A list of reactions that would simulate interstellar chemical reactions is presented; these reactions were studied using FTIR matrix isolation techniques.

    Relaxation energies and excited state structures of poly(para-phenylene)

    We investigate the relaxation energies and excited-state geometries of the light-emitting polymer poly(para-phenylene). We solve the Pariser-Parr-Pople-Peierls model using the density matrix renormalization group method. We find that the lattice relaxation of the dipole-active $1^1B_{1u}^-$ state is quite different from that of the $1^3B_{1u}^+$ state and the dipole-inactive $2^1A_g^+$ state. In particular, the $1^1B_{1u}^-$ state is rather weakly coupled to the lattice and has a rather small relaxation energy of ca. 0.1 eV. In contrast, the $1^3B_{1u}^+$ and $2^1A_g^+$ states are strongly coupled, with relaxation energies of ca. 0.5 and ca. 1.0 eV, respectively. By analogy to linear polyenes, we argue that this difference can be understood by the different kinds of solitons present in the $1^1B_{1u}^-$, $1^3B_{1u}^+$ and $2^1A_g^+$ states. The difference in relaxation energies of the $1^1B_{1u}^-$ and $1^3B_{1u}^+$ states accounts for approximately one-third of the exchange gap in light-emitting polymers.
    Comment: Submitted to Physical Review
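The "relaxation energy" quoted for each state is, in the usual convention (assumed here, not spelled out in the abstract), the energy gained when the lattice distorts from the ground-state geometry to the excited state's own equilibrium geometry:

```latex
E_{\mathrm{relax}}(n) \;=\; E_n(Q_{\mathrm{GS}}) \;-\; E_n(Q_n),
```

where $E_n(Q_{\mathrm{GS}})$ is the energy of state $n$ at the ground-state (vertical) geometry and $E_n(Q_n)$ its energy at its relaxed geometry; on this convention the abstract's figures read $E_{\mathrm{relax}}(1^1B_{1u}^-) \approx 0.1$ eV versus $\approx 0.5$ and $\approx 1.0$ eV for $1^3B_{1u}^+$ and $2^1A_g^+$.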

    Shale oil : potential economies of large-scale production, preliminary phase

    Producing shale oil on a large scale is one of the possible alternatives for reducing dependence of the United States on imported petroleum. Industry is not producing shale oil on a commercial scale now because costs are too high, even though industry dissatisfaction is most frequently expressed about "non-economic" barriers: innumerable permits, changing environmental regulations, lease limitations, water rights conflicts, legal challenges, and so on. The overall purpose of this study is to estimate whether improved technology might significantly reduce unit costs for production of shale oil in a planned large-scale industry, as contrasted to the case usually contemplated: a small industry evolving slowly on a project-by-project basis. In this preliminary phase of the study, we collected published data on the costs of present shale oil technology and adjusted them to common conditions; these data were assembled to help identify the best targets for cost reduction through improved large-scale technology. They show that the total cost of producing upgraded shale oil (i.e., shale oil acceptable as a feed to a petroleum refinery) by surface retorting ranges from about $18 to $28/barrel in late-'78 dollars, with a 20% chance that costs would fall below that range and a 20% chance that they would fall above it. The probability distribution reflects our assumptions about ranges of shale richness, process performance, rate of return, and other factors that seem likely in a total industry portfolio of projects. About 40% of the total median cost is attributable to retorting, 20% to upgrading, and the remaining 40% to resource acquisition, mining, crushing, and spent shale disposal and revegetation. Capital charges account for about 70% of the median total cost and operating costs for the other 30%.
    There is a reasonable chance that modified in-situ processes (like Occidental's) may be able to produce shale oil more cheaply than surface retorting, but no reliable cost data have been published; in 1978, DOE estimated a saving of roughly $5/B for in-situ. Because the total costs of shale oil are spread over many steps in the production process, improvements in most or all of those steps are required if we seek a significant reduction in total cost. A June 1979 workshop of industry experts was held to help us identify possible cost-reduction technologies. Examples of the improved large-scale technologies proposed (for further evaluation) to the workshop were:
    - Instead of hydrotreating raw shale oil to make syncrude capable of being refined conventionally, rebalance all of a refinery's processes (or develop new catalysts/processes less sensitive to feed nitrogen) to accommodate shale oil feed -- a change analogous to a shift from sweet crude to sour crude.
    - Instead of refining at or near the retort site, use heated pipelines to move raw shale oil to existing major refining areas.
    - Instead of operating individual mines, open-pit mine all or much of the Piceance Creek Basin.
    - Instead of building individual retorts, develop new methods for mass production of hundreds of retorts.
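The cost shares quoted above (40%/20%/40% by process step; 70%/30% capital versus operating) can be turned into per-barrel component costs by simple arithmetic. A minimal illustrative sketch, not from the report itself: the $23/bbl median below is merely an assumed midpoint of the quoted $18-$28/bbl range.

```python
def cost_breakdown(total, shares):
    """Split a total $/bbl cost into components by fractional share."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9  # shares must sum to 100%
    return {name: round(total * frac, 2) for name, frac in shares.items()}

# Assumed median: midpoint of the $18-$28/bbl range (late-1978 dollars).
median = 23.0

# Shares by process step, as quoted in the abstract.
by_step = cost_breakdown(median, {
    "retorting": 0.40,
    "upgrading": 0.20,
    "resource, mining, crushing, disposal": 0.40,
})

# Shares by type of charge, as quoted in the abstract.
by_charge = cost_breakdown(median, {"capital": 0.70, "operating": 0.30})
```

At the assumed $23/bbl median this puts retorting near $9.2/bbl and capital charges near $16.1/bbl, which is why the report argues improvements are needed across most steps rather than in any single one.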

    The deterministic Kermack-McKendrick model bounds the general stochastic epidemic

    We prove that, for Poisson transmission and recovery processes, the classic Susceptible → Infected → Recovered (SIR) epidemic model of Kermack and McKendrick provides, for any given time t > 0, a strict lower bound on the expected number of susceptibles and a strict upper bound on the expected number of recoveries in the general stochastic SIR epidemic. The proof is based on the recent message passing representation of SIR epidemics applied to a complete graph.
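The claimed bounds can be observed numerically. A minimal stdlib-only sketch, not taken from the paper: it Euler-integrates the Kermack-McKendrick ODEs and runs exact Gillespie simulations of the stochastic SIR epidemic on a complete graph, then compares the deterministic S and R at time t with the stochastic means. All parameter values (beta = 2, gamma = 1, N = 100, I0 = 5, t = 2) are arbitrary choices for illustration.

```python
import random

def sir_deterministic(beta, gamma, N, I0, t_end, dt=1e-3):
    """Forward-Euler integration of the Kermack-McKendrick ODEs;
    returns (S, I, R) at time t_end."""
    S, I, R = float(N - I0), float(I0), 0.0
    for _ in range(int(t_end / dt)):
        inf = beta * S * I / N * dt   # new infections during dt
        rec = gamma * I * dt          # new recoveries during dt
        S, I, R = S - inf, I + inf - rec, R + rec
    return S, I, R

def sir_stochastic(beta, gamma, N, I0, t_end, rng):
    """One exact (Gillespie) trajectory of the general stochastic SIR
    epidemic on a complete graph; returns (S, I, R) at time t_end."""
    S, I, R, t = N - I0, I0, 0, 0.0
    while I > 0:
        rate_inf = beta * S * I / N          # Poisson transmission
        rate_rec = gamma * I                 # Poisson recovery
        total = rate_inf + rate_rec
        t += rng.expovariate(total)          # time to next event
        if t > t_end:
            break
        if rng.random() * total < rate_inf:  # infection event
            S, I = S - 1, I + 1
        else:                                # recovery event
            I, R = I - 1, R + 1
    return S, I, R

# Compare the deterministic trajectory with the stochastic mean at t_end.
beta, gamma, N, I0, t_end = 2.0, 1.0, 100, 5, 2.0
rng = random.Random(0)
runs = [sir_stochastic(beta, gamma, N, I0, t_end, rng) for _ in range(2000)]
mean_S = sum(r[0] for r in runs) / len(runs)
mean_R = sum(r[2] for r in runs) / len(runs)
S_det, I_det, R_det = sir_deterministic(beta, gamma, N, I0, t_end)
```

In line with the theorem, S_det sits below the stochastic mean of S and R_det above the stochastic mean of R; much of the gap comes from the stochastic runs in which the epidemic dies out early.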

    The Indiana Hatchery Industry


    Robust Machine Learning Applied to Astronomical Datasets I: Star-Galaxy Classification of the SDSS DR3 Using Decision Trees

    We provide classifications for all 143 million non-repeat photometric objects in the Third Data Release of the Sloan Digital Sky Survey (SDSS) using decision trees trained on 477,068 objects with SDSS spectroscopic data. We demonstrate that these star/galaxy classifications are expected to be reliable for approximately 22 million objects with r < ~20. The general machine learning environment Data-to-Knowledge and supercomputing resources enabled extensive investigation of the decision tree parameter space. This work presents the first public release of objects classified in this way for an entire SDSS data release. The objects are classified as either galaxy, star, or nsng (neither star nor galaxy), with an associated probability for each class. To demonstrate how to effectively make use of these classifications, we perform several important tests. First, we detail selection criteria within the probability space defined by the three classes to extract samples of stars and galaxies to a given completeness and efficiency. Second, we investigate the efficacy of the classifications and the effect of extrapolating from the spectroscopic regime by performing blind tests on objects in the SDSS, 2dF Galaxy Redshift and 2dF QSO Redshift (2QZ) surveys. Given the photometric limits of our spectroscopic training data, we effectively begin to extrapolate past our star-galaxy training set at r ~ 18. By comparing the number counts of our training sample with the classified sources, however, we find that our efficiencies appear to remain robust to r ~ 20. As a result, we expect our classifications to be accurate for 900,000 galaxies and 6.7 million stars, and remain robust via extrapolation for a total of 8.0 million galaxies and 13.9 million stars. [Abridged]
    Comment: 27 pages, 12 figures, to be published in ApJ, uses emulateapj.cl
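The "selection criteria within the probability space" step can be sketched in a few lines. A minimal illustration, not the paper's actual pipeline: given per-object class probabilities (galaxy/star/nsng), select a galaxy sample with a probability cut, then score it by completeness (fraction of true galaxies recovered) and efficiency (purity of the selected sample). The toy probabilities and labels below are invented for the example.

```python
def select_by_probability(probs, cls="galaxy", cut=0.7):
    """Indices of objects whose probability for `cls` exceeds the cut."""
    return [i for i, p in enumerate(probs) if p[cls] > cut]

def completeness_efficiency(selected, truth, cls="galaxy"):
    """Completeness: fraction of all true `cls` objects recovered.
    Efficiency: fraction of the selected sample that is truly `cls`."""
    n_true = sum(1 for t in truth if t == cls)
    n_hit = sum(1 for i in selected if truth[i] == cls)
    return (n_hit / n_true if n_true else 0.0,
            n_hit / len(selected) if selected else 0.0)

# Toy classifier outputs: galaxy/star/nsng probabilities sum to 1 per object.
probs = [
    {"galaxy": 0.90, "star": 0.05, "nsng": 0.05},
    {"galaxy": 0.60, "star": 0.35, "nsng": 0.05},
    {"galaxy": 0.20, "star": 0.75, "nsng": 0.05},
    {"galaxy": 0.80, "star": 0.15, "nsng": 0.05},
]
truth = ["galaxy", "star", "star", "galaxy"]

tight = select_by_probability(probs, cut=0.7)  # high-purity sample: [0, 3]
loose = select_by_probability(probs, cut=0.5)  # higher-yield sample: [0, 1, 3]
```

Raising the cut trades sample size for purity: here the tight cut keeps both true galaxies at 100% efficiency, while the loose cut still recovers both but admits one star, dropping efficiency to 2/3.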