
    Which lipid measurement should we monitor? An analysis of the LIPID study

    OBJECTIVES: To evaluate the optimal lipid to measure in monitoring patients, we assessed three factors that influence the choice of monitoring tests: (1) clinical validity; (2) responsiveness to therapy changes; and (3) the size of the long-term ‘signal-to-noise’ ratio. DESIGN: Longitudinal analyses of repeated lipid measurements over 5 years. SETTING: Subsidiary analysis of the Long-Term Intervention with Pravastatin in Ischaemic Disease (LIPID) study, a clinical trial in Australia, New Zealand and Finland. PARTICIPANTS: 9014 patients aged 31–75 years with previous acute coronary syndromes. INTERVENTIONS: Patients were randomly assigned to 40 mg daily pravastatin or placebo. PRIMARY AND SECONDARY OUTCOME MEASURES: We used data on serial lipid measurements (at randomisation, at 6 and 12 months, and then annually to 5 years) of total cholesterol; low-density lipoprotein (LDL) cholesterol, high-density lipoprotein (HDL) cholesterol and their ratios; triglycerides; and apolipoproteins A and B and their ratio, and assessed their ability to predict coronary events. RESULTS: All the lipid measures were statistically significantly associated with future coronary events, but the associations between each of the three ratio measures (total or LDL cholesterol to HDL cholesterol, and apolipoprotein B to apolipoprotein A1) and the time to a coronary event were stronger than those for any of the single lipid measures. The two cholesterol ratios also ranked highly for the long-term signal-to-noise ratios. However, LDL cholesterol and non-HDL cholesterol showed the most responsiveness to treatment change. CONCLUSIONS: Lipid monitoring is increasingly common, but current guidelines vary. No single measure was best on all three criteria. Total cholesterol did not rank highly on any single criterion. However, measurements based on cholesterol subfractions, namely non-HDL cholesterol (total cholesterol minus HDL cholesterol) and the two ratios, appeared superior to total cholesterol or any of the apolipoprotein options. Guidelines should consider using non-HDL cholesterol or a ratio measure for initial treatment decisions and subsequent monitoring.
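    The composite measures compared in this study are simple arithmetic combinations of a standard lipid panel. The short Python sketch below illustrates how they are derived; the function name, units (mmol/L) and example values are illustrative assumptions, not figures from the LIPID data.

```python
# Minimal sketch: computing the derived lipid measures discussed in the abstract
# from a standard lipid panel. Values and units (mmol/L) are illustrative only.

def derived_lipid_measures(total_chol, hdl, ldl, apo_b=None, apo_a1=None):
    """Return the composite measures compared in the study.

    non-HDL cholesterol = total cholesterol - HDL cholesterol;
    each ratio divides an atherogenic fraction by a protective one.
    """
    measures = {
        "non_hdl": total_chol - hdl,
        "total_to_hdl_ratio": total_chol / hdl,
        "ldl_to_hdl_ratio": ldl / hdl,
    }
    if apo_b is not None and apo_a1 is not None:
        measures["apob_to_apoa1_ratio"] = apo_b / apo_a1
    return measures


# Example panel (made-up numbers).
print(derived_lipid_measures(total_chol=5.5, hdl=1.1, ldl=3.4, apo_b=1.05, apo_a1=1.40))
```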

    The brain plus the cultural transmission mechanism determine the nature of language

    We agree that language adapts to the brain, but we note that language also has to adapt to brain-external constraints, such as those arising from properties of the cultural transmission medium. The hypothesis that Christiansen & Chater (C&C) raise in the target article has profound consequences not only for our understanding of language, but also for our understanding of the biological evolution of the language faculty.

    Pliocene–Pleistocene basin evolution along the Garlock fault zone, Pilot Knob Valley, California

    Exposed Pliocene–Pleistocene terrestrial strata provide an archive of the spatial and temporal development of a basin astride the sinistral Garlock fault in California. In the southern Slate Range and Pilot Knob Valley, an ∼2000-m-thick package of Late Cenozoic strata has been uplifted and tilted to the northeast. We name this succession the formation of Pilot Knob Valley and provide new chronologic, stratigraphic, and provenance data for these rocks. The unit is divided into five members that record different source areas and depositional patterns: (1) the lowest exposed strata are conglomeratic rocks derived from the Miocene Eagle Crags volcanic field to the south and east across the Garlock fault; (2) the second member consists mostly of fine-grained rocks with coarser material derived from both southern and northern sources; and (3) the upper three members are primarily coarse-grained conglomerates and sandstones derived from the adjacent Slate Range to the north. Tephrochronologic data from four ash samples bracket deposition of the second member to 3.6–3.3 Ma and the fourth member to between 1.1 and 0.6 Ma. A fifth tephrochronologic sample from rocks south of the Garlock fault near Christmas Canyon brackets deposition of a possible equivalent to the second member of the formation of Pilot Knob Valley at ca. 3.1 Ma. Although the age of the base of the lowest member is not directly dated, regional stratigraphic and tectonic associations suggest that the basin started forming ca. 4–5 Ma. By ca. 3.6 Ma, northward progradation of the fanglomerate sourced in the Eagle Crags region had waned, and subsequent deposition occurred in shallow lacustrine systems. At ca. 3.3 Ma, southward progradation of conglomerates derived from the Slate Range began. Circa 1.1 Ma, continued southward progradation of fanglomerate with Slate Range sources is characterized by a shift to coarser grain sizes, interpreted to reflect uplift of the Slate Range. Overall, basin architecture and the temporal evolution of different source regions were controlled by activity on three regionally important faults: the Garlock, Marine Gate, and Searles Valley faults. The timing and style of motions on these faults appear to be directly linked to patterns of basin development.

    Autosomal Monoallelic Expression in the Mouse

    Background: Random monoallelic expression defines an unusual class of genes displaying random choice for expression between the maternal and paternal alleles. Once established, the allele-specific expression pattern is stably maintained and mitotically inherited. Examples of random monoallelic genes include those found on the X-chromosome and a subset of autosomal genes, which have been most extensively studied in humans. Here, we report a genome-wide analysis of random monoallelic expression in the mouse. We used high-density mouse genome polymorphism mapping arrays to assess allele-specific expression in clonal cell lines derived from heterozygous mouse strains. Results: Over 1,300 autosomal genes were assessed for allele-specific expression, and greater than 10% of them showed random monoallelic expression. When comparing mouse and human, the number of autosomal orthologs demonstrating random monoallelic expression in both organisms was greater than would be expected by chance. Random monoallelic expression on the mouse autosomes is broadly similar to that in human cells: it is widespread throughout the genome, lacks chromosome-wide coordination, and varies between cell types. However, for some mouse genes, there appears to be skewing, in some ways resembling skewed X-inactivation, wherein one allele is more frequently active. Conclusions: These data suggest that autosomal random monoallelic expression was present at least as far back as the last common ancestor of rodents and primates. Random monoallelic expression can lead to phenotypic variation beyond that dictated by genotypic variation. Thus, it is important to take random monoallelic expression into account when examining genotype-phenotype correlations.
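    The cross-species comparison hinges on whether the mouse/human overlap in randomly monoallelically expressed orthologs is larger than expected by chance. The abstract does not state which statistical test was used; the sketch below shows one common way to frame such an overlap question, as a hypergeometric enrichment test, with placeholder counts rather than the study's actual numbers.

```python
# Hedged sketch: testing whether the mouse/human overlap in randomly
# monoallelically expressed (RME) orthologs exceeds chance, modelled as a
# hypergeometric draw. All counts below are placeholders, not study data.
from scipy.stats import hypergeom

n_orthologs_assessed = 1300   # ortholog pairs assessed (placeholder)
n_rme_mouse = 150             # RME in mouse (placeholder)
n_rme_human = 200             # RME in human (placeholder)
n_overlap = 40                # RME in both species (placeholder)

# P(overlap >= n_overlap) when n_rme_mouse genes are drawn at random from
# n_orthologs_assessed genes, of which n_rme_human are "successes".
p_value = hypergeom.sf(n_overlap - 1, n_orthologs_assessed,
                       n_rme_human, n_rme_mouse)
print(f"P(overlap >= {n_overlap} by chance) = {p_value:.3g}")
```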

    Spontaneous excretion of a pseudomembranous intestinal cast in an infant with an acute diarrhoeal illness – a case report and literature review

    We present a case of an 8-week-old infant with acute bloody diarrhoea and subsequent passage of an intestinal cast. An extensive immune and infection work-up did not reveal a causative aetiology. Histopathology indicated the cast represented an intestinal pseudomembrane. 16S bacterial PCR of the pathology specimen was negative. The infant required a period of parenteral nutrition due to diarrhoeal losses but made a full recovery and had no sequelae from this illness. Intestinal casts are a rare occurrence, particularly in paediatrics. Their presence prompts a wide differential diagnosis, which includes acute infection, immunodeficiency and ischaemia. Accurate quantification of stool losses, appropriate nutrition support and liaison with microbiology colleagues were essential in this case.

    A Stabilizer Framework for the Contextual Subspace Variational Quantum Eigensolver and the Noncontextual Projection Ansatz

    Quantum chemistry is a promising application for noisy intermediate-scale quantum (NISQ) devices. However, quantum computers have thus far not succeeded in providing solutions to problems of real scientific significance, with algorithmic advances being necessary to fully utilize even the modest NISQ machines available today. We discuss a method of ground state energy estimation predicated on a partitioning of the molecular Hamiltonian into two parts: one that is noncontextual and can be solved classically, supplemented by a contextual component that yields quantum corrections obtained via a Variational Quantum Eigensolver (VQE) routine. This approach has been termed Contextual Subspace VQE (CS-VQE); however, there are obstacles to overcome before it can be deployed on NISQ devices. The problem we address here is that of the ansatz, a parametrized quantum state over which we optimize during VQE; it is not initially clear how a splitting of the Hamiltonian should be reflected in the CS-VQE ansätze. We propose a "noncontextual projection" approach that is illuminated by a reformulation of CS-VQE in the stabilizer formalism. This defines an ansatz restriction from the full electronic structure problem to the contextual subspace and facilitates an implementation of CS-VQE that may be deployed on NISQ devices. We validate the noncontextual projection ansatz using a quantum simulator and demonstrate chemically precise ground state energy calculations for a suite of small molecules with a significant reduction in the required qubit count and circuit depth.
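    The core idea is to split the molecular Hamiltonian into a piece that can be handled classically and a remainder that receives quantum corrections. The toy Python sketch below illustrates only that general splitting idea, using "diagonal in the computational basis" as a stand-in for the classically solvable part of a made-up two-qubit Pauli-sum Hamiltonian; it is not the paper's noncontextual/contextual criterion, its stabilizer-formalism reformulation, or the noncontextual projection ansatz.

```python
# Loose illustration (not the paper's noncontextual/contextual criterion):
# split a qubit Hamiltonian, given as a weighted sum of Pauli strings, into
# a Z/I-only part that is diagonal in the computational basis (solvable by
# classical enumeration) and a remainder left for a quantum routine.
from itertools import product

hamiltonian = {          # toy 2-qubit Hamiltonian; coefficients are made up
    "ZZ": -1.0,
    "ZI": 0.5,
    "IZ": 0.5,
    "XX": 0.25,          # off-diagonal term, deferred to the "quantum" part
}

classical_part = {p: c for p, c in hamiltonian.items() if set(p) <= {"Z", "I"}}
quantum_part = {p: c for p, c in hamiltonian.items() if p not in classical_part}

def diagonal_energy(bits, terms):
    """Energy of the computational-basis state |bits> under Z/I Pauli terms."""
    energy = 0.0
    for pauli, coeff in terms.items():
        sign = 1
        for bit, op in zip(bits, pauli):
            if op == "Z" and bit == 1:
                sign = -sign
        energy += coeff * sign
    return energy

best = min(product([0, 1], repeat=2), key=lambda b: diagonal_energy(b, classical_part))
print("classical part minimised by", best,
      "with energy", diagonal_energy(best, classical_part))
print("terms deferred to a VQE-style routine:", quantum_part)
```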

    Incorporating Learning Analytics into Basic Course Administration: How to Embrace the Opportunity to Identify Inconsistencies and Inform Responses

    Consistency is imperative to the success of a multi-section basic course. However, establishing consistent practices is a difficult task, especially when coupled with maintaining instructor autonomy. Learning analytics tools, designed to improve learning and teaching by collecting and analyzing pertinent information through interactive databases, can be used by basic course administrators to improve consistency. Using a reflective case study methodology, we share our experience incorporating a learning analytics platform into our basic course. In doing so, we highlight the role this technology can play in identifying areas of inconsistency as well as informing ways to improve overall course delivery. Three major areas of inconsistency were uncovered: (1) the use of online grade books; (2) the utilization of course-wide rubrics; and (3) instances of grade inflation. Stemming from these findings is a set of practical implications regarding the coupling of learning analytics and basic course administration. These include clarifying the two-step process of identifying inconsistencies and informing solutions, as well as introducing the concept of collaborative consistency, the term we use to describe the co-construction of course materials (e.g., rubrics, schedules) and activities (e.g., norming). The case ultimately provides the opportunity for basic course directors to embrace the role of learning analytics technology.
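    Of the inconsistencies identified, grade inflation is the most readily expressed as a simple analytics check. The sketch below is a hypothetical illustration of that kind of check, comparing each section's mean score with the course-wide mean; the data, threshold and flagging rule are assumptions, not features of the platform the authors used.

```python
# Illustrative sketch: flag course sections whose mean score sits well above
# the course-wide mean, one simple signal of possible grade inflation.
# The data, 5-point threshold, and layout are hypothetical.
from statistics import mean

section_scores = {            # section -> list of final percentage scores
    "SEC-01": [78, 82, 85, 74, 90],
    "SEC-02": [91, 95, 93, 96, 92],   # high relative to peer sections
    "SEC-03": [80, 77, 84, 79, 83],
}

course_mean = mean(score for scores in section_scores.values() for score in scores)
THRESHOLD = 5.0               # flag sections more than 5 points above the course mean

for section, scores in section_scores.items():
    gap = mean(scores) - course_mean
    if gap > THRESHOLD:
        print(f"{section}: mean {mean(scores):.1f} is {gap:.1f} points above "
              f"the course mean ({course_mean:.1f}); review for grade inflation")
```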