
    Analysis of Population Structure: A Unifying Framework and Novel Methods Based on Sparse Factor Analysis

    We consider the statistical analysis of population structure using genetic data. We show how the two most widely used approaches to modeling population structure, admixture-based models and principal components analysis (PCA), can be viewed within a single unifying framework of matrix factorization. Specifically, they can both be interpreted as approximating an observed genotype matrix by a product of two lower-rank matrices, but with different constraints or prior distributions on these lower-rank matrices. This opens the door to a large range of possible approaches to analyzing population structure, by considering other constraints or priors. In this paper, we introduce one such novel approach, based on sparse factor analysis (SFA). We investigate the effects of the different types of constraint in several real and simulated data sets. We find that SFA produces similar results to admixture-based models when the samples are descended from a few well-differentiated ancestral populations and can recapitulate the results of PCA when the population structure is more “continuous,” as in isolation-by-distance models
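
    As a rough sketch of the unifying view described in this abstract (the notation below is editorial, not necessarily the paper's): the genotype matrix is approximated by a product of two lower-rank matrices, and the three methods differ only in the constraints or priors placed on those factors.

    \[
    G_{N \times P} \;\approx\; L_{N \times K}\, F_{K \times P},
    \qquad
    \begin{cases}
    \text{admixture models:} & L_{ik} \ge 0,\ \sum_{k} L_{ik} = 1 \ \text{(rows of } L \text{ are ancestry proportions)},\\
    \text{PCA:} & \text{no sign or sparsity constraints; } L \text{ taken with orthogonal columns},\\
    \text{SFA:} & \text{sparsity-inducing prior on the entries of } L.
    \end{cases}
    \]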

    Michael Gove’s war on professional historical expertise: conservative curriculum reform, extreme Whig history and the place of imperial heroes in modern multicultural Britain

    Six years of continuously baiting his opponents within the history profession eventually amounted to little where it mattered most. UK Secretary of State for Education, Michael Gove, finally backtracked in 2013 on his plans to impose a curriculum for English schools based on a linear chronology of the achievements of British national heroes. His ‘history as celebration’ curriculum was designed to instil pride amongst students in a supposedly shared national past, but would merely have accentuated how many students in modern multicultural Britain fail to recognise themselves in what is taught in school history lessons. Now that the dust has settled on Gove’s tenure as Secretary of State, the time is right for retrospective analysis of how his plans for the history curriculum made it quite so far. How did he construct an ‘ideological’ conception of expertise which allowed him to go toe-to-toe for so long with the ‘professional’ expertise of academic historians and history teachers? What does the content of this ideological expertise tell us about the politics of race within Conservative Party curriculum reforms? This article answers these questions to characterise Gove as a ‘whig historian’ of a wilfully extreme nature in his attachment to imperial heroes as the best way to teach national history in modern multicultural Britain

    Assessing and mapping language, attention and executive multidimensional deficits in stroke aphasia.

    There is growing awareness that aphasia following a stroke can include deficits in other cognitive functions and that these are predictive of certain aspects of language function, recovery and rehabilitation. However, data on attentional and executive (dys)functions in individuals with stroke aphasia are still scarce and the relationship to underlying lesions is rarely explored. Accordingly, in this investigation, an extensive selection of standardized non-verbal neuropsychological tests was administered to 38 individuals with chronic post-stroke aphasia, in addition to detailed language testing and MRI. To establish the core components underlying the patients' variable performance, behavioural data were explored with rotated principal component analyses, first separately for the non-verbal and language tests, then in a combined analysis including all tests. Three orthogonal components were extracted for the non-verbal tests, which were interpreted as shift-update, inhibit-generate and speed. Three components were also extracted for the language tests, representing phonology, semantics and speech quanta. Individual continuous scores on each component were then included in a voxel-based correlational methodology analysis, yielding significant clusters for all components. The shift-update component was associated with a posterior left temporo-occipital and bilateral medial parietal cluster, the inhibit-generate component was mainly associated with left frontal and bilateral medial frontal regions, and the speed component with several small right-sided fronto-parieto-occipital clusters. Two complementary multivariate brain-behaviour mapping methods were also used, which showed converging results. Together, the results suggest that a range of brain regions are involved in attention and executive functioning, and that these non-language domains play a role in the abilities of patients with chronic aphasia. In conclusion, our findings confirm and extend our understanding of the multidimensionality of stroke aphasia, emphasize the importance of assessing non-verbal cognition in this patient group and provide directions for future research and clinical practice. We also briefly compare and discuss univariate and multivariate methods for brain-behaviour mapping.
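
    As an illustrative sketch only (the data, test battery and dimensions below are hypothetical, and this varimax-rotated PCA is a generic implementation rather than the authors' pipeline), the component-extraction step could look like this in Python:

    import numpy as np
    from sklearn.decomposition import PCA

    def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
        # Orthogonal varimax rotation of a (tests x components) loadings matrix
        p, k = loadings.shape
        R = np.eye(k)
        d = 0.0
        for _ in range(max_iter):
            L = loadings @ R
            u, s, vt = np.linalg.svd(
                loadings.T @ (L**3 - (gamma / p) * L @ np.diag(np.sum(L**2, axis=0)))
            )
            R = u @ vt
            d_new = np.sum(s)
            if d_new < d * (1 + tol):
                break
            d = d_new
        return loadings @ R

    # Hypothetical standardized scores: 38 patients x 12 non-verbal tests
    scores = np.random.default_rng(0).standard_normal((38, 12))
    pca = PCA(n_components=3).fit(scores)
    rotated_loadings = varimax(pca.components_.T)   # 12 tests x 3 rotated components
    component_scores = scores @ rotated_loadings    # crude per-patient component scores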

    Use of Repeated Blood Pressure and Cholesterol Measurements to Improve Cardiovascular Disease Risk Prediction: An Individual-Participant-Data Meta-Analysis

    The added value of incorporating information from repeated blood pressure and cholesterol measurements to predict cardiovascular disease (CVD) risk has not been rigorously assessed. We used data on 191,445 adults from the Emerging Risk Factors Collaboration (38 cohorts from 17 countries with data encompassing 1962-2014) with more than 1 million measurements of systolic blood pressure, total cholesterol, and high-density lipoprotein cholesterol. Over a median 12 years of follow-up, 21,170 CVD events occurred. Risk prediction models using cumulative mean values of repeated measurements and summary measures from longitudinal modeling of the repeated measurements were compared with models using measurements from a single time point. Risk discrimination (C-index) and net reclassification were calculated, and changes in C-indices were meta-analyzed across studies. Compared with the single-time-point model, the cumulative-means and longitudinal models increased the C-index by 0.0040 (95% confidence interval (CI): 0.0023, 0.0057) and 0.0023 (95% CI: 0.0005, 0.0042), respectively. Reclassification was also improved in both models; compared with the single-time-point model, overall net reclassification improvements were 0.0369 (95% CI: 0.0303, 0.0436) for the cumulative-means model and 0.0177 (95% CI: 0.0110, 0.0243) for the longitudinal model. In conclusion, incorporating repeated measurements of blood pressure and cholesterol into CVD risk prediction models slightly improves risk prediction.
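
    A minimal sketch of how the cumulative-means predictor described above could be derived from long-format repeat measurements (the file name, column names and final modelling step are hypothetical):

    import pandas as pd

    # Hypothetical long-format data: one row per participant per examination,
    # with columns id, exam_date, sbp, total_chol, hdl_chol
    df = pd.read_csv("repeat_measures.csv", parse_dates=["exam_date"])
    df = df.sort_values(["id", "exam_date"])

    # Cumulative mean of each risk factor up to and including the current exam,
    # mirroring the cumulative-means predictor compared in the abstract
    for col in ["sbp", "total_chol", "hdl_chol"]:
        df[col + "_cummean"] = (
            df.groupby("id")[col].expanding().mean().reset_index(level=0, drop=True)
        )

    # The last exam per participant now carries both the single-time-point value
    # and the cumulative-mean value; each can be entered into its own Cox model
    # and the models compared via the C-index and net reclassification.
    latest_exam = df.groupby("id").tail(1)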

    An in vivo platform for identifying inhibitors of protein aggregation

    Protein aggregation underlies an array of human diseases, yet only one small-molecule therapeutic has been successfully developed to date. Here, we introduce an in vivo system, based on a β-lactamase tripartite fusion construct, capable of identifying aggregation-prone sequences in the periplasm of Escherichia coli and inhibitors that prevent their aberrant self-assembly. We demonstrate the power of the system using a range of proteins, from small unstructured peptides (islet amyloid polypeptide [IAPP] and amyloid β) to larger, folded immunoglobulin domains. Configured in a 48-well format, the split β-lactamase sensor readily differentiates between aggregation-prone and soluble sequences. Performing the assay in the presence of 109 compounds enabled a rank ordering of inhibition and revealed a new inhibitor of IAPP aggregation. This platform can be applied to both amyloidogenic and other aggregation-prone systems, independent of sequence or size, and can identify small molecules or other factors able to ameliorate or inhibit protein aggregation.

    A new era for understanding amyloid structures and disease

    The aggregation of proteins into amyloid fibrils and their deposition into plaques and intracellular inclusions is the hallmark of amyloid disease. The accumulation and deposition of amyloid fibrils, collectively known as amyloidosis, is associated with many pathological conditions, several of them linked to ageing, such as Alzheimer disease, Parkinson disease, type II diabetes and dialysis-related amyloidosis. However, elucidation of the atomic structure of amyloid fibrils formed from their intact protein precursors, and of how fibril formation relates to disease, has remained elusive. Recent advances in structural biology techniques, including cryo-electron microscopy and solid-state NMR spectroscopy, have finally broken this impasse. The first near-atomic-resolution structures of amyloid fibrils formed in vitro, seeded from plaque material and analysed directly ex vivo are now available. The results reveal cross-β structures that are far more intricate than anticipated. Here, we describe these structures, highlighting their similarities and differences, and the basis for their toxicity. We discuss how amyloid structure may affect the ability of fibrils to spread to different sites in the cell and between organisms in a prion-like manner, along with their roles in disease. These molecular insights will aid in understanding the development and spread of amyloid diseases and are inspiring new strategies for therapeutic intervention.

    Equalization of four cardiovascular risk algorithms after systematic recalibration: individual-participant meta-analysis of 86 prospective studies

    Aims: There is debate about the optimum algorithm for cardiovascular disease (CVD) risk estimation. We conducted head-to-head comparisons of four algorithms recommended by primary prevention guidelines, before and after ‘recalibration’, a method that adapts risk algorithms to take account of differences in the risk characteristics of the populations being studied. Methods & Results: Using individual-participant data on 360,737 participants without CVD at baseline in 86 prospective studies from 22 countries, we compared the Framingham risk score (FRS), Systematic COronary Risk Evaluation (SCORE), pooled cohort equations (PCE), and Reynolds risk score (RRS). We calculated measures of risk discrimination and calibration, and modelled clinical implications of initiating statin therapy in people judged to be at ‘high’ 10-year CVD risk. Original risk algorithms were recalibrated using the risk factor profile and CVD incidence of target populations. The four algorithms had similar risk discrimination. Before recalibration, FRS, SCORE, and PCE overpredicted CVD risk on average by 10%, 52%, and 41%, respectively, whereas RRS under-predicted by 10%. Original versions of the algorithms classified 29–39% of individuals aged ≥40 years as high risk. By contrast, recalibration reduced this proportion to 22–24% for every algorithm. We estimated that to prevent one CVD event, it would be necessary to initiate statin therapy in 44–51 such individuals using original algorithms, in contrast to 37–39 individuals with recalibrated algorithms. Conclusions: Before recalibration, the clinical performance of four widely used CVD risk algorithms varied substantially. By contrast, simple recalibration nearly equalized their performance and improved modelled targeting of preventive action to clinical need.
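
    The snippet below is a deliberately simplified sketch of the recalibration idea (keep an algorithm's hazard ratios but re-anchor it to the target population's observed incidence); the paper's actual procedure, which uses age- and sex-specific risk factor profiles and CVD incidence, is more involved:

    import numpy as np

    def recalibrate(linear_predictor, observed_10yr_risk):
        # Keep the original algorithm's weights (the linear predictor) but reset
        # the baseline so that a person at the cohort-average linear predictor
        # is assigned the target population's observed 10-year risk.
        lp_centered = linear_predictor - linear_predictor.mean()
        s0 = 1.0 - observed_10yr_risk          # re-derived baseline survival
        return 1.0 - s0 ** np.exp(lp_centered)

    # Hypothetical usage: linear predictors from an original algorithm applied
    # to a target cohort whose observed 10-year CVD incidence is 7%
    lp = np.random.default_rng(1).normal(loc=0.0, scale=0.5, size=1000)
    recalibrated_risk = recalibrate(lp, observed_10yr_risk=0.07)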

    Effect of remote ischaemic conditioning on clinical outcomes in patients with acute myocardial infarction (CONDI-2/ERIC-PPCI): a single-blind randomised controlled trial.

    BACKGROUND: Remote ischaemic conditioning with transient ischaemia and reperfusion applied to the arm has been shown to reduce myocardial infarct size in patients with ST-elevation myocardial infarction (STEMI) undergoing primary percutaneous coronary intervention (PPCI). We investigated whether remote ischaemic conditioning could reduce the incidence of cardiac death and hospitalisation for heart failure at 12 months. METHODS: We did an international investigator-initiated, prospective, single-blind, randomised controlled trial (CONDI-2/ERIC-PPCI) at 33 centres across the UK, Denmark, Spain, and Serbia. Patients (age >18 years) with suspected STEMI and who were eligible for PPCI were randomly allocated (1:1, stratified by centre with a permuted block method) to receive standard treatment (including a sham simulated remote ischaemic conditioning intervention at UK sites only) or remote ischaemic conditioning treatment (intermittent ischaemia and reperfusion applied to the arm through four cycles of 5-min inflation and 5-min deflation of an automated cuff device) before PPCI. Investigators responsible for data collection and outcome assessment were masked to treatment allocation. The primary combined endpoint was cardiac death or hospitalisation for heart failure at 12 months in the intention-to-treat population. This trial is registered with ClinicalTrials.gov (NCT02342522) and is completed. FINDINGS: Between Nov 6, 2013, and March 31, 2018, 5401 patients were randomly allocated to either the control group (n=2701) or the remote ischaemic conditioning group (n=2700). After exclusion of patients upon hospital arrival or loss to follow-up, 2569 patients in the control group and 2546 in the intervention group were included in the intention-to-treat analysis. At 12 months post-PPCI, the Kaplan-Meier-estimated frequencies of cardiac death or hospitalisation for heart failure (the primary endpoint) were 220 (8·6%) patients in the control group and 239 (9·4%) in the remote ischaemic conditioning group (hazard ratio 1·10 [95% CI 0·91-1·32], p=0·32 for intervention versus control). No important unexpected adverse events or side effects of remote ischaemic conditioning were observed. INTERPRETATION: Remote ischaemic conditioning does not improve clinical outcomes (cardiac death or hospitalisation for heart failure) at 12 months in patients with STEMI undergoing PPCI. FUNDING: British Heart Foundation, University College London Hospitals/University College London Biomedical Research Centre, Danish Innovation Foundation, Novo Nordisk Foundation, TrygFonden
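
    For illustration only (the file and column names are hypothetical, not the trial's actual dataset), the hazard ratio for the primary endpoint could be estimated from patient-level data along these lines:

    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical patient-level file: arm (1 = remote ischaemic conditioning,
    # 0 = control), time_months (follow-up time), event (cardiac death or
    # heart-failure hospitalisation within 12 months)
    df = pd.read_csv("condi2_eric_ppci.csv")

    cph = CoxPHFitter()
    cph.fit(df[["time_months", "event", "arm"]],
            duration_col="time_months", event_col="event")
    hr_arm = cph.hazard_ratios_["arm"]   # abstract reports HR 1.10 (95% CI 0.91-1.32)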

    Quantification of natural DOM from UV absorption at two wavelengths

    The precise simulation of ultraviolet absorption by 23 contrasting surface-water dissolved organic matter (DOM) samples was achieved with a model based on two components, one absorbing light strongly (A) and the other weakly (B). The parameterised model can be used to predict the dissolved organic carbon concentration, [DOC], in water samples simply from absorbance values at two wavelengths, while information on DOM quality is provided by the calculated fractionation into A and B. The model was tested by predicting [DOC] for a separate dataset obtained by combining results for 12 samples each from surface waters in the UK, Canada and the USA, and from UK groundwaters. A close correlation (R² = 0.997) was obtained, with only slight underestimation of the true [DOC]. The proportions of components A and B varied considerably among the sites, which explains why precise prediction of [DOC] from absorbance data at a single wavelength was not possible. When the model was applied to samples collected from river locations in a heterogeneous UK catchment with areas of industry and high human population, [DOC] was underestimated in many cases, which may indicate the presence of non-absorbing pollutant DOM.
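
    A minimal sketch of how such a two-component, two-wavelength model can be inverted to estimate [DOC] (the absorption coefficients below are placeholders, not the paper's fitted values):

    import numpy as np

    # Placeholder specific absorption coefficients (L per mg-C per cm) for the
    # strongly absorbing component A and weakly absorbing component B at the
    # two measurement wavelengths
    EPS = np.array([
        [0.060, 0.004],   # wavelength 1: [eps_A, eps_B]
        [0.020, 0.001],   # wavelength 2: [eps_A, eps_B]
    ])

    def doc_from_two_wavelengths(abs_w1, abs_w2, pathlength_cm=1.0):
        # Solve Abs = EPS @ [c_A, c_B] for the component concentrations, then
        # report total [DOC] = c_A + c_B and the fraction held in component A
        absorbances = np.array([abs_w1, abs_w2]) / pathlength_cm
        c_a, c_b = np.linalg.solve(EPS, absorbances)
        return c_a + c_b, c_a / (c_a + c_b)

    doc_mg_per_l, fraction_a = doc_from_two_wavelengths(abs_w1=0.25, abs_w2=0.08)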