
    Predictability of evolutionary trajectories in fitness landscapes

    Experimental studies on enzyme evolution show that only a small fraction of all possible mutation trajectories are accessible to evolution. However, these experiments deal with individual enzymes and explore a tiny part of the fitness landscape. We report an exhaustive analysis of fitness landscapes constructed with an off-lattice model of protein folding where fitness is equated with robustness to misfolding. This model mimics the essential features of the interactions between amino acids, is consistent with the key paradigms of protein folding and reproduces the universal distribution of evolutionary rates among orthologous proteins. We introduce mean path divergence as a quantitative measure of the degree to which the starting and ending points determine the path of evolution in fitness landscapes. Global measures of landscape roughness are good predictors of path divergence in all studied landscapes: the mean path divergence is greater in smooth landscapes than in rough ones. The model-derived and experimental landscapes are significantly smoother than random landscapes and resemble additive landscapes perturbed with moderate amounts of noise; thus, these landscapes are substantially robust to mutation. The model landscapes show a deficit of suboptimal peaks even compared with noisy additive landscapes with similar overall roughness. We suggest that smoothness and the substantial deficit of peaks in the fitness landscapes of protein evolution are fundamental consequences of the physics of protein folding. Comment: 14 pages, 7 figures.
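The notions of accessible trajectories and path divergence can be illustrated with a toy example. The sketch below uses hypothetical fitness values and a simplified reading of "mean path divergence" (the average per-step Hamming distance between pairs of accessible paths); the paper's model-derived landscapes and exact definition differ.

```python
from itertools import permutations

# Toy 3-locus fitness landscape: genotype (bit tuple) -> fitness.
# Values are invented for illustration; the paper derives fitness
# from a protein-folding model.
fitness = {
    (0, 0, 0): 0.0, (1, 0, 0): 0.3, (0, 1, 0): 0.2, (0, 0, 1): 0.1,
    (1, 1, 0): 0.5, (1, 0, 1): 0.4, (0, 1, 1): 0.35, (1, 1, 1): 1.0,
}

def accessible_paths(n=3):
    """All mutational orderings from the all-0 to the all-1 genotype
    along which fitness increases at every step."""
    paths = []
    for order in permutations(range(n)):      # order in which loci mutate
        g = [0] * n
        states = [tuple(g)]
        for locus in order:
            g[locus] = 1
            states.append(tuple(g))
        if all(fitness[a] < fitness[b] for a, b in zip(states, states[1:])):
            paths.append(states)
    return paths

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def mean_path_divergence(paths):
    """Average over all pairs of accessible paths of the mean Hamming
    distance between states at corresponding steps."""
    pairs = [(p, q) for i, p in enumerate(paths) for q in paths[i + 1:]]
    if not pairs:
        return 0.0
    div = [sum(hamming(a, b) for a, b in zip(p, q)) / len(p)
           for p, q in pairs]
    return sum(div) / len(div)

paths = accessible_paths()
```

On this smooth, near-additive toy landscape all six mutational orderings are accessible, the regime the abstract associates with high path divergence; making the landscape rougher (e.g. lowering the fitness of intermediates) prunes paths and shrinks the divergence.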

    Emplacement of inflated Pāhoehoe flows in the Naude’s Nek Pass, Lesotho remnant, Karoo continental flood basalt province: use of flow-lobe tumuli in understanding flood basalt emplacement

    Physical volcanological features are presented for a 710-m-thick section of the Naude’s Nek Pass, within the lower part of the Lesotho remnant of the Karoo Large Igneous Province. The section consists of inflated pāhoehoe lava with thin, impersistent sedimentary interbeds towards the base. There are seven discrete packages of compound and hummocky pāhoehoe lobes containing flow-lobe tumuli, making up approximately 50% of the section. Approximately 45% of the sequence consists of 14 sheet lobes, between 10 and 52 m thick. The majority of the sheet lobes occur in two packages, indicating prolonged periods of lava supply capable of producing thick sheet lobes. The other sheet lobes occur as individual lobes or pairs within compound flows, suggesting brief increases in lava supply rate. We suggest, contrary to current belief, that there is no evidence that compound flows are proximal to source and sheet lobes (simple flows) distal to source, and we propose that the presence of flow-lobe tumuli in compound flows could indicate that a flow is distal to source. We use detailed, previously published studies of the Thakurvadi Formation (Deccan Traps) as an example. We show that the length of a lobe, and therefore the sections that are ‘medial or distal to source’, is specific to each individual lobe and dependent on the lava supply of each eruptive event; as such, flow-lobe tumuli can be used as an indicator of relative distance from source.

    Corruption and bicameral reforms

    During the last decade, unicameral proposals have been put forward in fourteen US states. In this paper we analyze the effects of the proposed constitutional reforms in a setting where decision making is subject to ‘hard time constraints’ and lawmakers face the opposing interests of a lobby and the electorate. We show that bicameralism might lead to a decline in the lawmakers’ bargaining power vis-à-vis the lobby, thus compromising their accountability to voters. Hence, bicameralism is not a panacea against the abuse of power by elected legislators, and the proposed unicameral reforms could be effective in reducing corruption among elected representatives.

    Proceedings of a Sickle Cell Disease Ontology workshop - Towards the first comprehensive ontology for Sickle Cell Disease

    Sickle cell disease (SCD) is a debilitating single-gene disorder caused by a single point mutation that results in physical deformation (i.e. sickling) of erythrocytes at reduced oxygen tensions. Up to 75% of SCD in newborns worldwide occurs in sub-Saharan Africa, where neonatal and childhood mortality from sickle cell related complications is high. While SCD research across the globe is tackling the disease on multiple fronts, advances have yet to significantly impact the health and quality of life of SCD patients, owing to the lack of coordination of these disparate efforts. Ensuring that data across studies are directly comparable through standardization is a necessary step towards realizing this goal. Such standardization requires the development and implementation of a disease-specific ontology for SCD that is applicable globally. Ontology development is best achieved by bringing together experts in the domain to contribute their knowledge. The SCD community and H3ABioNet members joined forces at a recent SCD Ontology workshop to develop an ontology covering aspects of SCD under the classes: phenotype, diagnostics, therapeutics, quality of life, disease modifiers and disease stage. The aim of the workshop was for participants, drawn from around the world, to contribute their expertise to the development of the structure and contents of the SCD ontology. Here we describe the proceedings of the Sickle Cell Disease Ontology Workshop held in Cape Town, South Africa in February 2016 and its outcomes.

    Chondrogenic and Gliogenic Subpopulations of Neural Crest Play Distinct Roles during the Assembly of Epibranchial Ganglia

    In vertebrates, the sensory neurons of the epibranchial (EB) ganglia transmit somatosensory signals from the periphery to the CNS. These ganglia are formed during embryogenesis by the convergence and condensation of two distinct populations of precursors: placode-derived neuroblasts and neural crest (NC)-derived glial precursors. In addition to the gliogenic crest, chondrogenic NC migrates into the pharyngeal arches, which lie in close proximity to the EB placodes and ganglia. Here, we examine the respective roles of these two distinct NC-derived populations during development of the EB ganglia using zebrafish morphants and mutants that lack one or both of these NC populations. Our analyses of mutant and morphant zebrafish that exhibit deficiencies in chondrogenic NC at early stages reveal a distinct requirement for this NC subpopulation during early EB ganglion assembly and segmentation. Furthermore, restoration of wildtype chondrogenic NC in one of these mutants, prdm1a, is sufficient to restore ganglion formation, indicating a specific requirement of the chondrogenic NC for EB ganglia assembly. By contrast, analysis of the sox10 mutant, which lacks gliogenic NC, reveals that the initial assembly of ganglia is not affected. However, during later stages of development, EB ganglia are dispersed in the sox10 mutant, suggesting that glia are required to maintain normal EB ganglion morphology. These results highlight novel roles for two subpopulations of NC cells in the formation and maintenance of EB ganglia: chondrogenic NC promotes the early-stage formation of the developing EB ganglia, while glial NC is required for the late-stage maintenance of ganglion morphology.

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall’s tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
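The AUROC figures reported above can be reproduced in spirit with a rank-based estimator: the area under the ROC curve equals the probability that a randomly chosen positive case has a higher score than a randomly chosen negative one, with ties counting one half. The sketch below uses invented grade/outcome data, not the CholeS figures.

```python
def auroc(scores, labels):
    """Probability that a randomly chosen positive case outranks a
    randomly chosen negative one (ties count 1/2) -- equivalent to
    the area under the ROC curve."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    if not pos or not neg:
        raise ValueError("need both positive and negative cases")
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical data: Nassar difficulty grades (1-5) and whether each
# case converted to open surgery (1 = converted). Invented for
# illustration only.
grades    = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
converted = [0, 0, 0, 0, 0, 1, 0, 1, 1, 1]
```

With only five distinct grades the score is heavily tied, which is why the pairwise estimator (rather than a trapezoidal ROC integration) is the simplest correct formulation here.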

    Effects of etizolam and ethyl loflazepate on the P300 event-related potential in healthy subjects

    Background: Benzodiazepines carry the risk of inducing cognitive impairments, which may go unnoticed while profoundly disturbing social activity. These impairments are partly associated with the elimination half-life (EH) of the substance from the body. The objective of the present study was to examine the effects of etizolam and ethyl loflazepate, with EHs of 6 h and 122 h, respectively, on information processing in healthy subjects. Methods: Healthy subjects were administered etizolam and ethyl loflazepate acutely and subchronically (14 days). The auditory P300 event-related potential and neuropsychological batteries were employed to assess the effects of the drugs on cognition. The P300 event-related potential was recorded before and after drug treatment. The digit symbol test, trail making test, digit span test and verbal paired associates test were administered to examine mental slowing and memory functioning. Results: Acute administration of the drugs caused prolongation of P300 latency and reduction of P300 amplitude. Etizolam caused a statistically significant prolongation of P300 latency compared with ethyl loflazepate. Furthermore, subchronic administration of etizolam, but not ethyl loflazepate, still caused a weak prolongation of P300 latency. In contrast, the neuropsychological tests showed no difference. Conclusions: The results indicate that acute administration of ethyl loflazepate affects P300 latency less than etizolam.
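As a rough illustration of how P300 latency and amplitude are typically quantified (a standard peak-picking approach, not necessarily this study's exact procedure), the sketch below finds the maximum positive deflection of an averaged waveform in a 250-500 ms post-stimulus window. The sampling rate, window bounds, and waveform are all assumptions.

```python
SRATE = 1000  # samples per second (assumed)

def p300_peak(waveform_uv, lo_ms=250, hi_ms=500, srate=SRATE):
    """Return (latency in ms, amplitude in µV) of the largest positive
    deflection in the given post-stimulus window of an averaged ERP."""
    lo = lo_ms * srate // 1000
    hi = hi_ms * srate // 1000
    window = waveform_uv[lo:hi]
    amp = max(window)
    latency_ms = (lo + window.index(amp)) * 1000 // srate
    return latency_ms, amp

# Synthetic averaged waveform: 600 ms of post-stimulus EEG in µV,
# flat except for an artificial P300-like peak at 350 ms.
waveform = [0.0] * 600
waveform[350] = 12.0
```

A drug-induced "prolongation in P300 latency" then corresponds simply to the first element of this tuple shifting later between pre- and post-treatment recordings.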

    Data analysis issues for allele-specific expression using Illumina's GoldenGate assay.

    BACKGROUND: High-throughput measurement of allele-specific expression (ASE) is a relatively new and exciting application area for array-based technologies. In this paper, we explore several data sets which make use of Illumina's GoldenGate BeadArray technology to measure ASE. This platform exploits coding SNPs to obtain relative expression measurements for alleles at approximately 1500 positions in the genome. RESULTS: We analyze data from a mixture experiment where genomic DNA samples from pairs of individuals of known genotypes are pooled to create allelic imbalances at varying levels for the majority of SNPs on the array. We observe that GoldenGate has less sensitivity at detecting subtle allelic imbalances (around 1.3 fold) compared to extreme imbalances, and note the benefit of applying local background correction to the data. Analysis of data from a dye-swap control experiment allowed us to quantify dye-bias, which can be reduced considerably by careful normalization. The need to filter the data before carrying out further downstream analysis to remove non-responding probes, which show either weak, or non-specific signal for each allele, was also demonstrated. Throughout this paper, we find that a linear model analysis of the data from each SNP is a flexible modelling strategy that allows for testing of allelic imbalances in each sample when replicate hybridizations are available. CONCLUSIONS: Our analysis shows that local background correction carried out by Illumina's software, together with quantile normalization of the red and green channels within each array, provides optimal performance in terms of false positive rates. In addition, we strongly encourage intensity-based filtering to remove SNPs which only measure non-specific signal. 
    We anticipate that a similar analysis strategy will prove useful when quantifying ASE on Illumina's higher-density Infinium BeadChips.
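The quantile-normalization step recommended in the conclusions can be sketched in a few lines: the red and green channels are forced to share a common distribution by replacing each intensity with the mean of the equally ranked red and green values. This is a minimal pure-Python illustration (naive about ties); a real analysis would use an established implementation such as limma's normalizeQuantiles.

```python
def quantile_normalize(red, green):
    """Replace each value by the mean of the red and green values of
    the same rank, so both channels end up with identical
    distributions. Ties are broken arbitrarily by position."""
    n = len(red)
    mean_by_rank = [(r + g) / 2
                    for r, g in zip(sorted(red), sorted(green))]

    def renorm(channel):
        order = sorted(range(n), key=channel.__getitem__)
        out = [0.0] * n
        for rank, idx in enumerate(order):
            out[idx] = mean_by_rank[rank]
        return out

    return renorm(red), renorm(green)
```

After this step, any remaining red/green imbalance at a SNP reflects allelic signal rather than dye bias, which is what makes the per-SNP linear model described above interpretable.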

    Opposing effects of final population density and stress on Escherichia coli mutation rate

    Evolution depends on mutations. For an individual genotype, the rate at which mutations arise is known to increase with various stressors (stress-induced mutagenesis, SIM) and to decrease at high final population density (density-associated mutation-rate plasticity, DAMP). We hypothesised that these two forms of mutation-rate plasticity would have opposing effects across a nutrient gradient. Here we test this hypothesis, culturing Escherichia coli in increasingly rich media. We distinguish an increase in mutation rate with added nutrients through SIM (dependent on the error-prone polymerases Pol IV and Pol V) and an opposing effect of DAMP (dependent on MutT, which removes oxidised G nucleotides). The combination of DAMP and SIM results in a mutation rate minimum at intermediate nutrient levels (which can support 7 × 10⁸ cells ml⁻¹). These findings demonstrate a strikingly close and nuanced relationship of the ecological factors stress and population density with mutation, the fuel of all evolution.
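The opposing plasticities can be captured qualitatively in a toy model (entirely illustrative, not the paper's fitted model): a SIM term that rises with nutrient availability plus a DAMP term that falls with the density those nutrients support produces a mutation-rate minimum at intermediate nutrient levels.

```python
def mutation_rate(nutrients, a=1.0, b=1.0):
    """Toy combination of the two plasticities: a SIM component rising
    with nutrient level (assumed linear) and a DAMP component falling
    with it (assumed 1/x). Functional forms and coefficients are
    invented for illustration."""
    sim = a * nutrients
    damp = b / nutrients
    return sim + damp

levels = [0.1 * k for k in range(1, 51)]   # nutrient gradient
rates = [mutation_rate(x) for x in levels]
argmin = levels[rates.index(min(rates))]   # minimum at intermediate level
```

With a = b = 1 the minimum of x + 1/x sits at x = 1, in the interior of the gradient: either plasticity alone would make the mutation rate monotone in nutrients, so the observed minimum is the signature of their combination.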

    Benchmarking natural-language parsers for biological applications using dependency graphs

    BACKGROUND: Interest is growing in the application of syntactic parsers to natural language processing problems in biology, but assessing their performance is difficult because differences in linguistic convention can falsely appear to be errors. We present a method for evaluating their accuracy using an intermediate representation based on dependency graphs, in which the semantic relationships important in most information extraction tasks are closer to the surface. We also demonstrate how this method can be easily tailored to various application-driven criteria. RESULTS: Using the GENIA corpus as a gold standard, we tested four open-source parsers that have been used in bioinformatics projects. We first present overall performance measures, and then test the two leading tools, the Charniak-Lease and Bikel parsers, on subtasks tailored to reflect the requirements of a system for extracting gene expression relationships. These two tools clearly outperform the other parsers in the evaluation, and achieve accuracy levels comparable to or exceeding those of native dependency parsers on similar tasks in previous biological evaluations. CONCLUSION: Evaluation using dependency graphs allows parsers to be tested easily on criteria chosen according to the semantics of particular biological applications, drawing attention to important mistakes and soaking up many insignificant differences that would otherwise be reported as errors. Generating high-accuracy dependency graphs from the output of phrase-structure parsers also provides access to the more detailed syntax trees that are used in several natural-language processing techniques.
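Once parses are converted to dependency graphs, the evaluation reduces to set comparison: represent each parse as a set of (head, relation, dependent) triples and score precision, recall and F1 on the overlap with the gold standard. The triples below are invented for illustration; the GENIA-based evaluation in the paper is more involved.

```python
def edge_prf(gold, predicted):
    """Precision, recall and F1 over dependency edges, where each parse
    is a set of (head, relation, dependent) triples."""
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical gold and parser-output edges for a toy biomedical clause.
gold = {("activates", "nsubj", "IL-2"), ("activates", "dobj", "NF-kB")}
pred = {("activates", "nsubj", "IL-2"), ("activates", "prep", "cells")}
```

Application-driven tailoring then amounts to filtering which relation types enter the sets (e.g. scoring only subject and object edges when the target task is gene-expression relationship extraction).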