348 research outputs found

    Efficient eucalypt cell wall deconstruction and conversion for sustainable lignocellulosic biofuels

    In order to meet the world’s growing energy demand and reduce the impact of greenhouse gas emissions resulting from fossil fuel combustion, renewable plant-based feedstocks for biofuel production must be considered. First-generation biofuels, derived from the starches of edible feedstocks such as corn, create competition between food and fuel, both for the crop itself and for the land on which it is grown. Biofuel synthesized from non-edible plant biomass (lignocellulose) grown on marginal agricultural land will help to alleviate this competition. Eucalypts, the broadly defined group encompassing over 900 species of Eucalyptus, Corymbia, and Angophora, are the most widely planted hardwood trees in the world, harvested mainly for timber, pulp and paper, and biomaterial products. More recently, due to their exceptional growth rate and their amenability to a wide range of environmental conditions, eucalypts have become a leading option for the development of sustainable lignocellulosic biofuels. However, efficient conversion of woody biomass into fermentable monomeric sugars depends largely on pretreatment of the cell wall, whose formation and complexity make it naturally recalcitrant to efficient deconstruction. A greater understanding of this complexity, within the context of various pretreatments, will allow the design of new and effective deconstruction processes for bioenergy production. In this review, we present the various pretreatment options for eucalypts, together with research into the structure and formation of the eucalypt cell wall.

    Do (and say) as I say: Linguistic adaptation in human-computer dialogs

    There is strong research evidence showing that people naturally align to each other’s vocabulary, sentence structure, and acoustic features in dialog, yet little is known about how the alignment mechanism operates in the interaction between users and computer systems, let alone how it may be exploited to improve the efficiency of the interaction. This article provides an account of lexical alignment in human–computer dialogs, based on empirical data collected in a simulated human–computer interaction scenario. The results indicate that alignment is present, resulting in the gradual reduction and stabilization of the vocabulary-in-use, and that it is reciprocal. Further, the results suggest that when system and user errors occur, the development of alignment is temporarily disrupted and users tend to introduce novel words into the dialog. The results also indicate that alignment in human–computer interaction may have a strong strategic component and is used as a resource to compensate for less optimal (visually impoverished) interaction conditions. Moreover, lower alignment is associated with less successful interaction, as measured by user perceptions. The article distills the results of the study into design recommendations for human–computer dialog systems and uses them to outline a model of dialog management that supports and exploits alignment through mechanisms for in-use adaptation of the system’s grammar and lexicon.

    In defense of the epistemic view of quantum states: a toy theory

    We present a toy theory that is based on a simple principle: the number of questions about the physical state of a system that are answered must always be equal to the number that are unanswered in a state of maximal knowledge. A wide variety of quantum phenomena are found to have analogues within this toy theory. Such phenomena include: the noncommutativity of measurements, interference, the multiplicity of convex decompositions of a mixed state, the impossibility of discriminating nonorthogonal states, the impossibility of a universal state inverter, the distinction between bi-partite and tri-partite entanglement, the monogamy of pure entanglement, no cloning, no broadcasting, remote steering, teleportation, dense coding, mutually unbiased bases, and many others. The diversity and quality of these analogies is taken as evidence for the view that quantum states are states of incomplete knowledge rather than states of reality. A consideration of the phenomena that the toy theory fails to reproduce, notably violations of Bell inequalities and the existence of a Kochen-Specker theorem, provides clues for how to proceed with this research program.
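    A worked instance of the knowledge-balance principle for the simplest possible system may help fix ideas; the counting below is an illustrative gloss on the stated principle, not a quotation from the paper.

```latex
% Simplest system: the ontic state is fixed by answering two yes/no questions,
% giving 2^2 = 4 ontic states. The knowledge-balance principle then says a state
% of maximal knowledge answers exactly one of the two questions, so each such
% epistemic state is a two-element subset of the ontic states:
\[
  \text{answered} = \text{unanswered} = 1
  \quad\Longrightarrow\quad
  \binom{4}{2} = 6 \ \text{states of maximal knowledge},
\]
% the toy-theory analogues of the six eigenstates of the qubit Pauli observables.
```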

    The Australia Telescope 20 GHz Survey: The Source Catalogue

    We present the full source catalogue from the Australia Telescope 20 GHz (AT20G) Survey. The AT20G is a blind radio survey carried out at 20 GHz with the Australia Telescope Compact Array (ATCA) from 2004 to 2008, and covers the whole sky south of declination 0 deg. The AT20G source catalogue presented here is an order of magnitude larger than any previous catalogue of high-frequency radio sources, and includes 5890 sources above a 20 GHz flux-density limit of 40 mJy. All AT20G sources have total intensity and polarisation measured at 20 GHz, and most sources south of declination -15 deg also have near-simultaneous flux-density measurements at 5 and 8 GHz. A total of 1559 sources were detected in polarised total intensity at one or more of the three frequencies. We detect a small but significant population of non-thermal sources that are either undetected or have only weak detections in low-frequency catalogues. We introduce the term Ultra-Inverted Spectrum (UIS) to describe these radio sources, which have a spectral index alpha(5, 20) > +0.7 and which constitute roughly 1.2 per cent of the AT20G sample. The 20 GHz flux densities measured for the strongest AT20G sources are in excellent agreement with the WMAP 5-year source catalogue of Wright et al. (2009), and we find that the WMAP source catalogue is close to complete for sources stronger than 1.5 Jy at 23 GHz.
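    As a concrete illustration of the Ultra-Inverted Spectrum criterion, the sketch below computes the two-point spectral index alpha(5, 20) under the usual convention S proportional to nu^alpha and flags sources with alpha(5, 20) > +0.7; the source names and flux densities are invented for the example and are not AT20G measurements.

```python
import math

def spectral_index(s_low, s_high, nu_low=5.0, nu_high=20.0):
    """Two-point spectral index alpha, using the convention S proportional to nu**alpha."""
    return math.log(s_high / s_low) / math.log(nu_high / nu_low)

# Hypothetical 5 and 20 GHz flux densities in mJy (illustrative only).
sources = {
    "J0000-0000": (60.0, 180.0),   # flux density rises steeply with frequency
    "J0101-0101": (250.0, 120.0),  # flux density falls with frequency
}

for name, (s5, s20) in sources.items():
    alpha = spectral_index(s5, s20)
    label = "UIS" if alpha > 0.7 else "not UIS"
    print(f"{name}: alpha(5,20) = {alpha:+.2f} -> {label}")
```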

    Einstein, incompleteness, and the epistemic view of quantum states

    Does the quantum state represent reality or our knowledge of reality? In making this distinction precise, we are led to a novel classification of hidden variable models of quantum theory. Indeed, representatives of each class can be found among existing constructions for two-dimensional Hilbert spaces. Our approach also provides a fruitful new perspective on arguments for the nonlocality and incompleteness of quantum theory. Specifically, we show that for models wherein the quantum state has the status of something real, the failure of locality can be established through an argument considerably more straightforward than Bell's theorem. The historical significance of this result becomes evident when one recognizes that the same reasoning is present in Einstein's preferred argument for incompleteness, which dates back to 1935. This fact suggests that Einstein was seeking not just any completion of quantum theory, but one wherein quantum states are solely representative of our knowledge. Our hypothesis is supported by an analysis of Einstein's attempts to clarify his views on quantum theory and the circumstance of his otherwise puzzling abandonment of an even simpler argument for incompleteness from 1927.

    Common variants in the ATM, BRCA1, BRCA2, CHEK2 and TP53 cancer susceptibility genes are unlikely to increase breast cancer risk

    Introduction: Certain rare, familial mutations in the ATM, BRCA1, BRCA2, CHEK2 or TP53 genes increase susceptibility to breast cancer, but it has not, until now, been clear whether common polymorphic variants in the same genes also increase risk. Methods: We have attempted a comprehensive, single nucleotide polymorphism (SNP)- and haplotype-tagging association study on each of these five genes in up to 4,474 breast cancer cases from the British, East Anglian SEARCH study and 4,560 controls from the EPIC-Norfolk study, using a two-stage study design. Nine tag SNPs were genotyped in ATM, together with five in BRCA1, sixteen in BRCA2, ten in CHEK2 and five in TP53, with the aim of tagging all other known, common variants. SNPs generating the common amino acid substitutions were specifically forced into the tagging set for each gene. Results: No significant breast cancer associations were detected with any individual or combination of tag SNPs. Conclusion: It is unlikely that there are any other common variants in these genes conferring measurably increased risks of breast cancer in our study population.

    An Intraocular Pressure Polygenic Risk Score Stratifies Multiple Primary Open-Angle Glaucoma Parameters Including Treatment Intensity

    Purpose: To examine the combined effects of common genetic variants associated with intraocular pressure (IOP) on primary open-angle glaucoma (POAG) phenotype using a polygenic risk score (PRS) stratification. Design: Cross-sectional study. Participants: For the primary analysis, we examined the glaucoma phenotype of 2154 POAG patients enrolled in the Australian and New Zealand Registry of Advanced Glaucoma, including patients recruited from the United Kingdom. For replication, we examined an independent cohort of 624 early POAG patients. Methods: Using IOP genome-wide association study summary statistics, we developed a PRS derived solely from IOP-associated variants and stratified POAG patients into 3 risk tiers. The lowest and highest quintiles of the score were set as the low- and high-risk groups, respectively, and the other quintiles were set as the intermediate risk group. Main Outcome Measures: Clinical glaucoma phenotype including maximum recorded IOP, age at diagnosis, number of family members affected by glaucoma, cup-to-disc ratio, visual field mean deviation, and treatment intensity. Results: A dose–response relationship was found between the IOP PRS and the maximum recorded IOP, with the high genetic risk group having a higher maximum IOP by 1.7 mmHg (standard deviation [SD], 0.62 mmHg) than the low genetic risk group (P = 0.006). Compared with the low genetic risk group, the high genetic risk group had a younger age of diagnosis by 3.7 years (SD, 1.0 years; P < 0.001), more family members affected by 0.46 members (SD, 0.11 members; P < 0.001), and higher rates of incisional surgery (odds ratio, 1.5; 95% confidence interval, 1.1–2.0; P = 0.007). No statistically significant difference was found in mean deviation. We further replicated the maximum IOP, number of family members affected by glaucoma, and treatment intensity (number of medications) results in the early POAG cohort (P ≤ 0.01). Conclusions: The IOP PRS was correlated positively with maximum IOP, disease severity, need for surgery, and number of affected family members. Genes acting via IOP-mediated pathways, when considered in aggregate, have clinically important and reproducible implications for glaucoma patients and their close family members.
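    The quintile-based stratification described in the Methods can be sketched in a few lines; the scores below are simulated stand-ins rather than ANZRAG data, and the cohort size is reused purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated, standardized IOP polygenic risk scores (illustrative stand-ins).
prs = rng.normal(size=2154)

# Lowest quintile -> low risk, highest quintile -> high risk,
# middle three quintiles -> intermediate risk, as described in the abstract.
q20, q80 = np.quantile(prs, [0.2, 0.8])
tier = np.where(prs <= q20, "low",
                np.where(prs >= q80, "high", "intermediate"))

for group in ("low", "intermediate", "high"):
    print(f"{group}: n = {(tier == group).sum()}")
```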