10,010 research outputs found

    A Workflow for Software Development within Computational Epidemiology

    A critical investigation into computational models developed for studying the spread of communicable disease is presented. The case in point is a spatially explicit micro-meso-macro model for the entire Swedish population built on registry data, thus far used for smallpox and for influenza-like illnesses. The lessons learned from a software development project of more than 100 person-months are collected into a check list. The list is intended for use by computational epidemiologists and policy makers, and the workflow incorporating these two roles is described in detail.
    NOTICE: This is the author's version of a work that was accepted for publication in the Journal of Computational Science. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in the Journal of Computational Science, vol. 2, issue 3, 6 June 2011, DOI 10.1016/j.jocs.2011.05.004.

    On Understanding Catastrophe — The Case of Highly Severe Influenza-Like Illness

    Computational epidemiology is a form of spatiotemporal reasoning in which social link structures are employed and spatially explicit models are specified and executed. We point to issues thus far addressed by neither engineers nor scientists, in light of a use case focusing on catastrophic scenarios that assume the emergence of a highly unlikely but lethal and contagious strain of influenza. Our conclusion is that important perspectives are missing when dealing with policy issues resulting from scenario execution and analyses in computational epidemiology.
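    The kind of model the abstract alludes to can be pictured with a small, purely illustrative sketch: a stochastic SIR process running on a synthetic small-world contact network that stands in for the social link structure. The network, the seeding, and both probabilities below are assumptions made for the example, not parameters from the paper.

# A minimal sketch (not the authors' model): stochastic SIR spread on a
# synthetic small-world contact network. All parameter values are
# illustrative assumptions.
import random
import networkx as nx

random.seed(1)
G = nx.watts_strogatz_graph(n=1000, k=6, p=0.1)   # synthetic contact network
state = {v: "S" for v in G}                       # S / I / R per individual
for v in random.sample(list(G), 5):               # seed a few infections
    state[v] = "I"

P_TRANSMIT = 0.08   # per-contact, per-day transmission probability (assumed)
P_RECOVER = 0.15    # daily recovery probability (assumed)

for day in range(120):
    infected = [v for v in G if state[v] == "I"]
    if not infected:
        break
    for v in infected:
        for u in G.neighbors(v):                  # spread along social links
            if state[u] == "S" and random.random() < P_TRANSMIT:
                state[u] = "I"
        if random.random() < P_RECOVER:
            state[v] = "R"
    print(day, sum(s == "I" for s in state.values()))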

    A Bayesian Framework for Parameter Estimation in Dynamical Models with Applications to Forecasting

    Mathematical models in biology are powerful tools for the study and exploration of complex dynamics. Nevertheless, bringing theoretical results into agreement with experimental observations involves acknowledging a great deal of uncertainty intrinsic to our theoretical representation of a real system. Proper handling of such uncertainties is key to the successful use of models to predict experimental or field observations. This problem has been addressed over the years by many tools for model calibration and parameter estimation. In this article we present a general framework for uncertainty analysis and parameter estimation which is designed to handle uncertainties associated with the modeling of dynamic biological systems while remaining agnostic as to the type of model used. We apply the framework to two influenza transmission models: one deterministic and the other stochastic. The results show that the framework can be applied without modifications to the two types of models and that it performs equally well on both. We also discuss the application of the framework to calibrate models for forecasting purposes.
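    As a concrete, heavily simplified illustration of the kind of calibration problem the abstract describes, the sketch below fits the transmission rate of a deterministic SIR model to synthetic noisy prevalence data with a random-walk Metropolis sampler. The model, the flat prior, the noise level, and all numerical settings are assumptions made for the example; they are not taken from the article's framework.

# A minimal sketch (not the paper's framework): random-walk Metropolis
# calibration of the transmission rate "beta" of a deterministic SIR model
# against synthetic noisy prevalence data.
import numpy as np

rng = np.random.default_rng(0)

def sir_prevalence(beta, gamma=0.1, n_days=60, N=1.0, i0=0.001):
    """Deterministic SIR with daily Euler steps; returns infectious fraction."""
    S, I = N - i0, i0
    out = []
    for _ in range(n_days):
        new_inf = min(beta * S * I / N, S)   # clipped so S stays non-negative
        new_rec = gamma * I
        S, I = S - new_inf, I + new_inf - new_rec
        out.append(I)
    return np.array(out)

# Synthetic "observations": truth beta = 0.3 plus Gaussian noise (assumed).
obs = sir_prevalence(0.3) + rng.normal(0, 0.005, 60)
SIGMA = 0.005                       # assumed observation noise

def log_post(beta):
    if not 0.0 < beta < 2.0:        # flat prior on (0, 2), an assumption
        return -np.inf
    resid = obs - sir_prevalence(beta)
    return -0.5 * np.sum((resid / SIGMA) ** 2)

# Random-walk Metropolis over beta.
beta, lp = 0.5, log_post(0.5)
samples = []
for _ in range(5000):
    prop = beta + rng.normal(0, 0.02)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    samples.append(beta)

post = np.array(samples[1000:])     # drop burn-in
print(f"posterior mean beta = {post.mean():.3f} +/- {post.std():.3f}")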

    Toward a Standardized Strategy of Clinical Metabolomics for the Advancement of Precision Medicine

    Despite its tremendous success, pitfalls have been observed in every step of a clinical metabolomics workflow, which impede the internal validity of a study. Furthermore, the demand for logistics, instrumentation, and computational resources for metabolic phenotyping studies has far exceeded expectations. In this conceptual review, we will cover the full range of barriers to a metabolomics-based clinical study and suggest potential solutions in the hope of enhancing study robustness, usability, and transferability. The importance of quality assurance and quality control procedures is discussed, followed by a practical rule containing five phases, including two additional "pre-pre-" and "post-post-" analytical steps. In addition, we will elucidate the potential involvement of machine learning and demonstrate that the need for automated data mining algorithms to improve the quality of future research is undeniable. Consequently, we propose a comprehensive metabolomics framework, along with an appropriate checklist refined from current guidelines and our previously published assessment, in an attempt to accurately translate achievements in metabolomics into clinical and epidemiological research. Furthermore, the integration of multifaceted multi-omics approaches with metabolomics as the pillar member is urgently needed. When combined with other social or nutritional factors, this allows complete omics profiles to be gathered for a particular disease. Our discussion reflects the current obstacles and potential solutions toward the progressing trend of utilizing metabolomics in clinical research to create the next-generation healthcare system.
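    To make one of the quality-control ideas above concrete, the sketch below shows a routinely used (but here purely illustrative) QC step: dropping metabolite features whose relative standard deviation across pooled QC injections exceeds a threshold. The 30% cut-off and the "QC_*" column naming convention are assumptions of the example, not recommendations from the review.

# A minimal sketch of one common metabolomics QC filter: keep only features
# whose relative standard deviation (RSD) in pooled QC samples is acceptable.
import pandas as pd

def filter_by_qc_rsd(intensities: pd.DataFrame, rsd_threshold: float = 30.0) -> pd.DataFrame:
    """Keep features (rows) whose RSD across QC columns is below the threshold.

    intensities : features x samples table of peak intensities; pooled QC
                  injections are assumed to be named 'QC_1', 'QC_2', ...
    """
    qc = intensities.filter(regex=r"^QC_")
    rsd = 100.0 * qc.std(axis=1) / qc.mean(axis=1)
    return intensities.loc[rsd < rsd_threshold]

# Tiny synthetic example: feature F2 is unstable in the QCs and is dropped.
data = pd.DataFrame(
    {"QC_1": [100, 50, 200], "QC_2": [105, 90, 210], "QC_3": [98, 20, 195],
     "S_1": [120, 60, 250], "S_2": [80, 70, 180]},
    index=["F1", "F2", "F3"],
)
print(filter_by_qc_rsd(data))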

    Essential guidelines for computational method benchmarking

    In computational biology and other sciences, researchers are frequently faced with a choice between several computational methods for performing data analyses. Benchmarking studies aim to rigorously compare the performance of different methods using well-characterized benchmark datasets, to determine the strengths of each method or to provide recommendations regarding suitable choices of methods for an analysis. However, benchmarking studies must be carefully designed and implemented to provide accurate, unbiased, and informative results. Here, we summarize key practical guidelines and recommendations for performing high-quality benchmarking analyses, based on our experiences in computational biology.
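    A toy harness illustrates the basic discipline such guidelines concern: every method is run on every benchmark dataset under identical conditions and scored with the same metric. The two "methods", the synthetic datasets, and the accuracy metric below are invented for the example and are not part of the article.

# A minimal sketch of a benchmarking harness: all methods, all datasets,
# one shared metric. Everything here is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(0)

def make_dataset(n=200, noise=0.5):
    """Synthetic binary-classification benchmark with known ground truth."""
    x = rng.normal(size=n)
    y = (x + rng.normal(scale=noise, size=n) > 0).astype(int)
    return x, y

def method_threshold(x):              # "method A": sign of the feature
    return (x > 0).astype(int)

def method_majority(x):               # "method B": naive baseline
    return np.ones_like(x, dtype=int)

datasets = {f"noise={s}": make_dataset(noise=s) for s in (0.2, 0.5, 1.0)}
methods = {"threshold": method_threshold, "majority": method_majority}

# Score every method on every dataset with the same metric (accuracy).
for ds_name, (x, y) in datasets.items():
    for m_name, method in methods.items():
        acc = float(np.mean(method(x) == y))
        print(f"{ds_name:11s} {m_name:10s} accuracy={acc:.3f}")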