
    Ensemble evaluation of hydrological model hypotheses

    It is demonstrated for the first time how model parameter, structural and data uncertainties can be accounted for explicitly and simultaneously within the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. As an example application, 72 variants of a single soil moisture accounting store are tested as simplified hypotheses of runoff generation at six experimental grassland field-scale lysimeters, through model rejection and a novel diagnostic scheme. The fields, designed as replicates, exhibit different hydrological behaviors, which yield different model performances. For fields with low initial discharge levels at the beginning of events, the conceptual stores considered reach their limit of applicability. Conversely, one of the fields, which yields more discharge than the others but has larger data gaps, allows for greater flexibility in the choice of model structures. As a model learning exercise, the study points to a "leaking" of the fields that was not evident from previous field experiments. It is discussed how understanding observational uncertainties, and incorporating these into model diagnostics, can help appreciate the scale of model structural error.
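The rejection step at the heart of GLUE can be sketched as a simple Monte Carlo filter. The toy single-store model, the Nash-Sutcliffe likelihood measure, and the behavioral threshold below are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(42)

def linear_store(k, rain):
    """Toy single-reservoir runoff model: storage drains at rate 1/k per step
    (a hypothetical stand-in for the soil moisture accounting stores in the paper)."""
    s, q = 0.0, []
    for p in rain:
        s += p
        out = s / k
        s -= out
        q.append(out)
    return np.array(q)

# Synthetic "observations" generated from a known parameter (k = 5) plus noise
rain = rng.exponential(2.0, size=100)
q_obs = linear_store(5.0, rain) + rng.normal(0.0, 0.05, size=100)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# GLUE: sample parameter sets and keep the "behavioral" ones above a threshold
samples = rng.uniform(1.0, 20.0, size=2000)
likelihoods = np.array([nse(linear_store(k, rain), q_obs) for k in samples])
behavioral = samples[likelihoods > 0.7]  # rejection threshold (assumed)
```

The spread of the behavioral set, rather than any single best parameter, is what GLUE propagates forward as predictive uncertainty.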

    Basement membrane and vascular remodelling in smokers and chronic obstructive pulmonary disease: a cross-sectional study

    Background: Little is known about airway remodelling in bronchial biopsies (BB) from smokers and patients with chronic obstructive pulmonary disease (COPD). An initial pilot study comparing BB from COPD patients with nonsmoking controls suggested the presence of reticular basement membrane (Rbm) fragmentation and altered vessel distribution in COPD.
    Methods: To determine whether Rbm fragmentation and altered vessel distribution in BB were specific to COPD, we designed a cross-sectional study and stained BB from 19 current smokers and 14 ex-smokers with mild to moderate COPD, and compared these with 15 current smokers with normal lung function and 17 healthy nonsmoking subjects.
    Results: Rbm thickness did not differ significantly between groups, although in COPD this parameter was quite variable. The Rbm showed fragmentation and splitting in both current-smoking groups and in ex-smokers with COPD compared with healthy nonsmokers (p < 0.02); smoking and COPD appeared to have additive effects. Rbm fragmentation correlated with smoking history in COPD but not with age. There were more vessels in the Rbm and fewer vessels in the lamina propria in current smokers than in healthy nonsmokers (p < 0.05). The number of vessels staining for vascular endothelial growth factor (VEGF) in the Rbm was higher in both current-smoker groups and in ex-smokers with COPD than in healthy nonsmokers (p < 0.004). In current smokers with COPD, VEGF vessel staining correlated with FEV1 % predicted (r = 0.61, p < 0.02).
    Conclusions: Airway remodelling in smokers and in mild to moderate COPD is associated with fragmentation of the Rbm and an altered distribution of vessels in the airway wall. Rbm fragmentation was present to as great an extent in ex-smokers with COPD. These characteristics may have physiological consequences.

    The Cellular Phenotype of Roberts Syndrome Fibroblasts as Revealed by Ectopic Expression of ESCO2

    Cohesion between sister chromatids is essential for faithful chromosome segregation. In budding yeast, the acetyltransferase Eco1/Ctf7 establishes cohesion during DNA replication in S phase and in response to DNA double-strand breaks in G2/M phase. In humans, two Eco1 orthologs exist: ESCO1 and ESCO2. Both proteins are required for proper sister chromatid cohesion, but their exact function is at present unclear. Since ESCO2 has been identified as the gene defective in the rare autosomal recessive cohesinopathy Roberts syndrome (RBS), cells from RBS patients can be used to elucidate the role of ESCO2. We investigated, for the first time, RBS cells in comparison with isogenic controls that stably express V5- or GFP-tagged ESCO2. We show that the sister chromatid cohesion defect is rescued in the transfected cell lines, and suggest that ESCO2 is regulated by proteasomal degradation in a cell cycle-dependent manner. Compared with the corrected cells, RBS cells were hypersensitive to the DNA-damaging agents mitomycin C, camptothecin and etoposide, while no particular sensitivity to UV, ionizing radiation, hydroxyurea or aphidicolin was found. The cohesion defect of RBS cells and their hypersensitivity to DNA-damaging agents were not corrected by a patient-derived ESCO2 acetyltransferase mutant (W539G), indicating that the acetyltransferase activity of ESCO2 is essential for its function. In contrast to a previous study on cells from patients with Cornelia de Lange syndrome, another cohesinopathy, RBS cells failed to exhibit excessive chromosome aberrations after irradiation in the G2 phase of the cell cycle. Our results point to an S phase-specific role for ESCO2 in the maintenance of genome stability.

    Using Evolutionary Algorithms for Fitting High-Dimensional Models to Neuronal Data

    In neuroscience, and in the study of complex biological systems generally, there is frequently a need to fit mathematical models with large numbers of parameters to highly complex datasets. Here we consider algorithms of two different classes, gradient-following (GF) methods and evolutionary algorithms (EA), and examine their performance in fitting a 9-parameter model of a filter-based visual neuron to real data recorded from a sample of 107 neurons in macaque primary visual cortex (V1). Although the GF method converged very rapidly on a solution, it was highly susceptible to local minima in the error surface and produced relatively poor fits unless the initial parameter estimates were already very good. Conversely, although the EA required many more iterations of evaluating the model neuron's response to a series of stimuli, it ultimately found better solutions in nearly all cases, and its performance was independent of the model's starting parameters. Thus, although the fitting process was lengthy in terms of processing time, the relative lack of human intervention in the evolutionary algorithm, and its ability ultimately to generate model fits that can be trusted as close to optimal, made it far superior to the gradient-following methods in this particular application. This is likely to hold for many other complex systems, as are often found in neuroscience.
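A minimal (mu + lambda) evolutionary algorithm of the kind described can be sketched as follows; the toy objective (recovering three polynomial coefficients rather than the 9-parameter neuron model) and the Gaussian mutation scheme are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: recover the coefficients of y = a*x^2 + b*x + c from noisy samples
true_params = np.array([1.5, -2.0, 0.5])
x = np.linspace(-3.0, 3.0, 50)
y_obs = np.polyval(true_params, x) + rng.normal(0.0, 0.1, x.size)

def fitness(p):
    """Negative mean squared error: higher is better."""
    return -np.mean((np.polyval(p, x) - y_obs) ** 2)

# (mu + lambda) strategy: 10 parents survive, 30 mutated children per generation
pop = rng.uniform(-5.0, 5.0, size=(40, 3))
for generation in range(200):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]                      # keep the 10 best
    children = np.repeat(parents, 3, axis=0)
    children = children + rng.normal(0.0, 0.1, children.shape)   # Gaussian mutation
    pop = np.vstack([parents, children])                         # elitism: parents survive

best = pop[np.argmax([fitness(p) for p in pop])]
```

Because parents survive unchanged, the best fitness never decreases between generations; and unlike a gradient-following start, the initial population is drawn blindly from a wide range, mirroring the independence from starting parameters noted above.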

    Modeling causes of death: an integrated approach using CODEm

    Background: Data on causes of death by age and sex are a critical input into health decision-making. Priority setting in public health should be informed not only by the current magnitude of health problems but also by trends in them. However, cause of death data are often unavailable or subject to substantial problems of comparability. We propose five general principles for cause of death model development, validation, and reporting.
    Methods: We detail a specific implementation of these principles, embodied in an analytical tool, the Cause of Death Ensemble model (CODEm), which explores a large variety of possible models to estimate trends in causes of death. Possible models are identified using a covariate selection algorithm that yields many plausible combinations of covariates, which are then run through four model classes. The model classes include mixed effects linear models and spatial-temporal Gaussian Process Regression models for cause fractions and death rates. All models for each cause of death are then assessed using out-of-sample predictive validity and combined into an ensemble with optimal out-of-sample predictive performance.
    Results: Ensemble models for cause of death estimation outperform any single component model in tests of root mean square error, frequency of predicting correct temporal trends, and achieving 95% coverage of the prediction interval. We present detailed results for CODEm applied to maternal mortality and summary results for several other causes of death, including cardiovascular disease and several cancers.
    Conclusions: CODEm produces better estimates of cause of death trends than previous methods and is less susceptible to bias in model specification. We demonstrate the utility of CODEm for the estimation of several major causes of death.
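The ensemble step can be illustrated with a much-simplified sketch: fit several component models, score each on held-out data, and weight by out-of-sample error. The synthetic series, the three polynomial components, and the inverse-RMSE weighting below are illustrative assumptions, not CODEm's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "death rate" time series: trend plus cycle plus noise (illustrative)
t = np.arange(30)
y = 0.05 * t + np.sin(t / 4.0) + rng.normal(0.0, 0.2, t.size)

train, test = slice(0, 24), slice(24, 30)  # hold out the last 6 points

def poly_model(deg):
    """Fit a polynomial of the given degree on the training window,
    then predict over the full series."""
    coeffs = np.polyfit(t[train], y[train], deg)
    return np.polyval(coeffs, t)

preds = [poly_model(d) for d in (0, 1, 2)]  # three component models

# Weight components by inverse out-of-sample RMSE (a simplified stand-in for
# CODEm's out-of-sample predictive-validity weighting)
rmse = np.array([np.sqrt(np.mean((p[test] - y[test]) ** 2)) for p in preds])
weights = (1.0 / rmse) / np.sum(1.0 / rmse)
ensemble = sum(w * p for w, p in zip(weights, preds))
```

Since the ensemble is a convex combination of the component predictions, its out-of-sample RMSE can never exceed that of the worst component, which is one reason ensembles are robust to misspecification of any single model.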

    Computer-Based Intensity Measurement Assists Pathologists in Scoring Phosphatase and Tensin Homolog Immunohistochemistry - Clinical Associations in NSCLC Patients of the European Thoracic Oncology Platform Lungscape Cohort.

    Phosphatase and tensin homolog (PTEN) loss is frequently observed in NSCLC and is associated with both phosphoinositide 3-kinase activation and tumoral immunosuppression. PTEN immunohistochemistry is a valuable readout but lacks a standardized staining protocol and cutoff value. After an external quality assessment using the SP218, 138G6 and 6H2.1 anti-PTEN antibodies, scored on webbook and tissue microarray, the European Thoracic Oncology Platform cohort samples (n = 2245 NSCLC patients, 8980 tissue microarray cores) were stained with SP218. All cores were H-scored both by pathologists and by computerized pixel-based intensity measurements calibrated by pathologists. All three antibodies differentiated six PTEN+ versus six PTEN- cases in the external quality assessment. For 138G6 and SP218, high sensitivity and specificity were found for all H-score threshold values, including the prospectively defined 0, the calculated 8 (pathologists), and the calculated 5 (computer). High concordance among pathologists in setting computer-based intensities, and between pathologists and computer in H-scoring, was observed. Because of over-integration by the human eye, pixel-based computer H-scores were overall 54% lower. For all cutoff values, PTEN- was associated with smoking history, squamous cell histology, and higher tumor stage (p < 0.001). In adenocarcinomas, PTEN- was associated with poor survival. Calibration of immunoreactivity intensities by pathologists following computerized H-score measurements has the potential to improve the reproducibility and homogeneity of biomarker detection with regard to epitope validation in multicenter studies.
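The H-score used above is a standard immunohistochemistry readout: the sum, over staining-intensity levels 1-3, of the level times the percentage of cells at that level, giving a range of 0-300. A minimal sketch, assuming per-cell (or per-pixel) intensity labels are already available, which is the part the pixel-based calibration described above automates:

```python
import numpy as np

def h_score(intensities):
    """intensities: staining intensity per cell (or per pixel), each in {0, 1, 2, 3}.
    Returns the sum over levels of level * percent of cells at that level (0-300)."""
    intensities = np.asarray(intensities)
    return sum(level * 100.0 * np.mean(intensities == level) for level in (1, 2, 3))

# Example: a quarter of cells at each intensity level 0-3
score = h_score([0, 1, 2, 3])  # 25 + 50 + 75 = 150.0
```

A cutoff such as the prospectively defined 0 mentioned above then classifies a sample as PTEN-negative when its H-score does not exceed the threshold.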

    Modelling the potential impacts of climate change on the hydrology of the Aipe river basin in Huila, Colombia

    The dynamics of a globalized world, with humans acting as a new geological force, demand robust decisions and deliberate strategies for managing and adapting to climate change. This study investigates the potential impact of climate change on the hydrology of the Aipe river basin (688.9 km²) in Huila, Colombia, under climate change scenarios for 2011-2040. The four-parameter abcd model of Thomas was calibrated and validated against observed monthly stream flows at the basin outlet (Puente Carretera gauging station) for 1992-2012. The sensitivity and identifiability of the parameters were evaluated with the Monte Carlo Analysis Toolbox (MCAT). The results show that the model adequately reproduces the observed monthly stream flow (Nash-Sutcliffe efficiency of 0.89). The most influential parameters are a (soil water storage) and c (contribution to the aquifer). Relative to the baseline (1992-2012) mean flow of 15.44 m³/s, the trend-extrapolation scenario yields 13.79 m³/s (-10.64%), the multi-model ensemble scenario 9.34 m³/s (-39.47%), and the A2 scenario 5.74 m³/s (-62.60%). Lastly, we propose a set of climate change adaptation strategies aimed at the integrated management of water resources.
    Romero-Cuellar, J.; Buitrago-Vargas, A.; Quintero-Ruiz, T.; Francés, F. (2018). Simulación hidrológica de los impactos potenciales del cambio climático en la cuenca hidrográfica del río Aipe, en Huila, Colombia. RIBAGUA - Revista Iberoamericana del Agua, 5(1), 63-78. https://doi.org/10.1080/23863781.2018.1454574
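The four-parameter abcd model of Thomas (1981) used above can be sketched as a monthly water-balance loop; the parameter values, initial storages, and constant forcing below are illustrative assumptions, not the calibrated Aipe values:

```python
import numpy as np

def abcd(P, PET, a=0.98, b=400.0, c=0.3, d=0.1, S0=100.0, G0=50.0):
    """Monthly abcd water-balance model (Thomas, 1981).
    a  : propensity of runoff before soil saturation (0 < a <= 1)
    b  : upper limit on soil moisture plus evapotranspiration (mm)
    c  : fraction of available water recharging the aquifer
    d  : groundwater discharge rate
    Returns simulated streamflow in mm/month."""
    S, G, Q = S0, G0, []
    for p, pet in zip(P, PET):
        W = p + S                              # available water
        x = (W + b) / (2.0 * a)
        Y = x - np.sqrt(x**2 - W * b / a)      # "evapotranspiration opportunity"
        S = Y * np.exp(-pet / b)               # soil moisture carried over
        avail = W - Y                          # water leaving the soil column
        G = (G + c * avail) / (1.0 + d)        # aquifer storage
        Q.append((1.0 - c) * avail + d * G)    # direct runoff + baseflow
    return np.array(Q)

# Illustrative forcing: two years of constant monthly rain and PET (mm/month)
Q_sim = abcd(np.full(24, 100.0), np.full(24, 80.0))
```

The MCAT sensitivity analysis mentioned above amounts to running such a model over Monte Carlo samples of (a, b, c, d) and inspecting how an objective function such as the Nash-Sutcliffe efficiency varies with each parameter.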