19 research outputs found

    Combining Experiments and Simulations Using the Maximum Entropy Principle

    A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed to a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative agreement with experiments. A common solution to this problem is to ensure agreement explicitly by perturbing the potential energy function towards the experimental data. So far, a general consensus on how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing new theoretical and practical insights into the problem. We highlight each of these contributions in turn and conclude with a discussion of the remaining challenges.
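    For readers unfamiliar with the procedure, the following is a minimal sketch of maximum-entropy reweighting of simulation frames against a single experimental average. The function name, the single-observable setup, and the toy data are illustrative assumptions, not the formulation used in the papers discussed above, which treat multiple observables and experimental errors more carefully.

        # Minimal maximum-entropy reweighting sketch: find frame weights
        # w_i proportional to prior_i * exp(-lam * obs_i) whose weighted average of the
        # observable matches the experimental value (the max-ent solution
        # for a single linear constraint). All inputs are hypothetical.
        import numpy as np
        from scipy.optimize import brentq

        def maxent_reweight(obs, obs_exp, prior=None):
            obs = np.asarray(obs, dtype=float)
            if prior is None:
                prior = np.full(obs.shape, 1.0 / obs.size)  # uniform prior weights

            def residual(lam):
                # Shift by the mean only for numerical stability; it cancels on normalisation.
                w = prior * np.exp(-lam * (obs - obs.mean()))
                w /= w.sum()
                return np.dot(w, obs) - obs_exp

            # Solve for the Lagrange multiplier; assumes obs_exp lies inside the sampled range.
            lam = brentq(residual, -50.0, 50.0)
            w = prior * np.exp(-lam * (obs - obs.mean()))
            return w / w.sum()

        # Toy usage: pull a simulated average of ~3.5 down to an assumed target of 3.2.
        frames = np.random.default_rng(0).normal(3.5, 0.4, size=1000)
        weights = maxent_reweight(frames, 3.2)
        print(np.dot(weights, frames))  # ~3.2

    In this one-constraint case the reweighting is the minimal perturbation of the original ensemble, in the relative-entropy sense, that reproduces the experimental average; the papers highlighted here discuss how the same idea translates into perturbations of the potential energy function itself.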

    Fludarabine, cytarabine, granulocyte colony-stimulating factor, and idarubicin with gemtuzumab ozogamicin improves event-free survival in younger patients with newly diagnosed AML and overall survival in patients with NPM1 and FLT3 mutations

    Purpose To determine the optimal induction chemotherapy regimen for younger adults with newly diagnosed AML without known adverse risk cytogenetics. Patients and Methods One thousand thirty-three patients were randomly assigned to intensified (fludarabine, cytarabine, granulocyte colony-stimulating factor, and idarubicin [FLAG-Ida]) or standard (daunorubicin and Ara-C [DA]) induction chemotherapy, with one or two doses of gemtuzumab ozogamicin (GO). The primary end point was overall survival (OS). Results There was no difference in remission rate after two courses between FLAG-Ida + GO and DA + GO (complete remission [CR] + CR with incomplete hematologic recovery 93% v 91%) or in day 60 mortality (4.3% v 4.6%). There was no difference in OS (66% v 63%; P = .41); however, the risk of relapse was lower with FLAG-Ida + GO (24% v 41%; P < .001) and 3-year event-free survival was higher (57% v 45%; P < .001). In patients with an NPM1 mutation (30%), 3-year OS was significantly higher with FLAG-Ida + GO (82% v 64%; P = .005). NPM1 measurable residual disease (MRD) clearance was also greater, with 88% versus 77% becoming MRD-negative in peripheral blood after cycle 2 (P = .02). Three-year OS was also higher in patients with a FLT3 mutation (64% v 54%; P = .047). Fewer transplants were performed in patients receiving FLAG-Ida + GO (238 v 278; P = .02). There was no difference in outcome according to the number of GO doses, although NPM1 MRD clearance was higher with two doses in the DA arm. Patients with core binding factor AML treated with DA and one dose of GO had a 3-year OS of 96% with no survival benefit from FLAG-Ida + GO. Conclusion Overall, FLAG-Ida + GO significantly reduced relapse without improving OS. However, exploratory analyses show that patients with NPM1 and FLT3 mutations had substantial improvements in OS. By contrast, in patients with core binding factor AML, outcomes were excellent with DA + GO, with no benefit from FLAG-Ida.

    Surgery and risk for multiple sclerosis: a systematic review and meta-analysis of case–control studies

    The neoclassical Trojan horse of steady-state economics

    The vision of a steady-state economy elaborated by Herman Daly describes an economy that uses materials and energy within the regenerative and assimilative limits of the planet's ecosystems. Sustainable scale, just distribution, and efficient allocation are its constitutive theoretical goals. This paper is a critique of the theoretical foundations of steady-state economics. It argues that steady-state economics amounts to an attempt to squeeze neoclassical economics into a biophysical and ethical corset, so that many fundamental flaws of, and criticisms levelled at, neoclassical economics remain. As a consequence, steady-state economics does not lead to a radical departure from, or improvement upon, neoclassical theory but rather to fundamental internal inconsistencies between the ‘old’ economics paradigm and ‘new’ progressive ecological economic thinking. Contradictions appear at various levels, ranging from ontology and methodology to theory and values. As Daly's thinking has pioneered the foundations of ecological economics, these ambiguities are problematic not only for steady-state economics but for ecological economics as a field more generally. The paper concludes that ecological economics has to let go of its neoclassical foundations, as they contradict its core values and ambitions. A new and consistent theory of the political economy of the environment along heterodox lines is needed.