
    Urinary Bisphenol A and Type-2 Diabetes in U.S. Adults: Data from NHANES 2003-2008

    Bisphenol A (BPA) is found in plastics and other consumer products; exposure may lead to insulin resistance and the development of type-2 diabetes mellitus (T2DM) through over-activation of pancreatic β-cells. Previous studies using data from the National Health and Nutrition Examination Survey (NHANES) showed an inconsistent association between the prevalence of self-reported T2DM and urinary BPA. We used a different method of diagnosing T2DM (hemoglobin A1c (HbA1c)) with a larger subset of NHANES. We analyzed data from 4,389 adult participants who were part of a sub-study of environmental phenol measurements in urine from three NHANES cycles from 2003 to 2008. T2DM was defined as having an HbA1c ≥6.5% or use of diabetes medication. The weighted prevalence of T2DM was 9.2%. Analysis of the total sample revealed that a two-fold increase in urinary BPA was associated with an odds ratio (OR) of 1.08 for T2DM (95% confidence interval (CI), 1.02 to 1.16), after controlling for potential confounders. However, when we examined each NHANES cycle individually, we found a statistically significant association only in the 2003/04 cycle (n = 1,364, OR = 1.23 (95% CI, 1.07 to 1.42) for each doubling in urinary BPA). We found no association in either the 2005/06 cycle (n = 1,363, OR = 1.05 (95% CI, 0.94 to 1.18)) or the 2007/08 cycle (n = 1,662, OR = 1.06 (95% CI, 0.91 to 1.23)). Similar patterns of association between BPA and continuous HbA1c were also observed. Although higher urinary BPA was associated with elevated HbA1c and T2DM in the pooled analysis, the association was driven by data from only one NHANES cycle. Additional studies, especially of longitudinal design with repeated BPA measurements, are needed to further elucidate the association between BPA and T2DM.
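    The key analytical choice here is entering urinary BPA on a log2 scale, so that the regression coefficient corresponds to a doubling of exposure. Below is a minimal sketch of that idea, not the authors' code: the data frame and column names (t2dm, bpa_ugL, age, bmi, sex) are invented, and NHANES's complex survey design (weights, strata, PSUs) is ignored here even though the published analysis must account for it.

```python
# Minimal sketch (not the study's code): odds ratio of T2DM per doubling of
# urinary BPA via logistic regression, assuming a hypothetical DataFrame
# with columns "t2dm" (0/1), "bpa_ugL", "age", "bmi", "sex".
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def doubling_or(df):
    df = df.assign(log2_bpa=np.log2(df["bpa_ugL"]))    # log2 => coefficient is per doubling
    model = smf.logit("t2dm ~ log2_bpa + age + bmi + C(sex)", data=df).fit(disp=False)
    or_est = np.exp(model.params["log2_bpa"])          # odds ratio per doubling of BPA
    lo, hi = np.exp(model.conf_int().loc["log2_bpa"])  # 95% confidence interval
    return or_est, lo, hi
```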

    Prediction of Cellular Burden with Host-Circuit Models

    Heterologous gene expression draws resources from host cells. These resources include vital components to sustain growth and replication, and the resulting cellular burden is a widely recognised bottleneck in the design of robust circuits. In this tutorial we discuss the use of computational models that integrate gene circuits and the physiology of host cells. Through various use cases, we illustrate the power of host-circuit models to predict the impact of design parameters on both burden and circuit functionality. Our approach relies on a new generation of computational models for microbial growth that can flexibly accommodate resource bottlenecks encountered in gene circuit design. Adoption of this modelling paradigm can facilitate fast and robust design cycles in synthetic biology.
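    To make the burden idea concrete, the sketch below is a deliberately coarse toy model, not one of the host-circuit models discussed in the tutorial: a heterologous gene claims a fraction phi of the shared ribosome pool, which lowers the growth rate and, through growth dilution, limits its own product. All parameter values are assumptions for illustration.

```python
# Toy host-circuit trade-off (illustration only): ribosome allocation phi to a
# heterologous gene reduces growth and dilutes the heterologous protein.
import numpy as np
from scipy.integrate import solve_ivp

LAMBDA_MAX = 2.0   # 1/h, maximal growth rate with no heterologous load (assumed)
K_EXPR = 100.0     # a.u./h, expression rate per unit ribosome allocation (assumed)

def heterologous_protein(t, p, phi):
    growth = LAMBDA_MAX * (1.0 - phi)   # burden: growth falls as allocation phi rises
    production = K_EXPR * phi           # more allocation -> more product
    dilution = growth * p               # growth dilutes the intracellular protein
    return production - dilution

for phi in (0.1, 0.3, 0.5):
    sol = solve_ivp(heterologous_protein, (0.0, 10.0), [0.0], args=(phi,))
    print(f"phi={phi:.1f}: protein ≈ {sol.y[0, -1]:.1f} a.u., "
          f"growth ≈ {LAMBDA_MAX * (1 - phi):.2f} 1/h")
```

Even this caricature reproduces the qualitative trade-off the tutorial addresses: pushing expression harder raises product levels but taxes growth, and the full host-circuit models resolve where the optimum lies.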

    Validation of diabetes mellitus and hypertension diagnosis in computerized medical records in primary health care

    Background: Computerized clinical records, which are incorporated into primary health care practice, have great potential for research. In order to use this information, data quality and reliability must be assessed to prevent compromising the validity of the results. The aim of this study is to validate the diagnoses of hypertension and diabetes mellitus in the computerized clinical records of primary health care, taking the diagnostic criteria established in the most prominently used clinical guidelines as the gold standard against which to measure sensitivity and specificity and to determine the predictive values. The gold standard for diabetes mellitus was the diagnostic criteria for diabetic subjects established in the 2003 American Diabetes Association Consensus Statement; the gold standard for hypertension was the diagnostic criteria established in the Joint National Committee report published in 2003. Methods: A cross-sectional multicentre validation study of diabetes mellitus and hypertension diagnoses in computerized clinical records of primary health care was carried out. Diagnostic criteria from the most prominent clinical practice guidelines were taken as the reference standard. Sensitivity, specificity, positive and negative predictive values, and global agreement (kappa index) were calculated. Results are shown overall and stratified by sex and age group. Results: The agreement for diabetes mellitus with the guideline-based reference standard was almost perfect (κ = 0.990), with a sensitivity of 99.53%, a specificity of 99.49%, a positive predictive value of 91.23% and a negative predictive value of 99.98%. The hypertension diagnosis showed substantial agreement with the guideline-based reference standard (κ = 0.778); the sensitivity was 85.22%, the specificity 96.95%, the positive predictive value 85.24%, and the negative predictive value 96.95%. Sensitivity was worse in patients who also had diabetes and in those aged 70 years or over. Conclusions: Our results substantiate the validity of using the diagnoses of diabetes and hypertension found within computerized clinical records for epidemiologic studies.
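    For reference, all of the validity metrics quoted above can be derived from a single 2x2 table of record-based versus gold-standard diagnoses. The sketch below shows the standard formulas; the counts passed in are placeholders, not the study's data.

```python
# Validity metrics from a 2x2 table (record diagnosis vs. gold standard).
# The example counts are hypothetical, not taken from the study.
def validity_metrics(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)            # record positive among true cases
    specificity = tn / (tn + fp)            # record negative among non-cases
    ppv = tp / (tp + fp)                    # true cases among record positives
    npv = tn / (tn + fn)                    # non-cases among record negatives
    p_obs = (tp + tn) / n                   # observed agreement
    p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
    kappa = (p_obs - p_exp) / (1 - p_exp)   # Cohen's kappa
    return dict(sensitivity=sensitivity, specificity=specificity,
                ppv=ppv, npv=npv, kappa=kappa)

print(validity_metrics(tp=95, fp=9, fn=1, tn=1900))   # hypothetical counts
```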

    Synthetic lethality: a framework for the development of wiser cancer therapeutics

    The challenge in medical oncology has always been to identify compounds that will kill, or at least tame, cancer cells while leaving normal cells unscathed. Most chemotherapeutic agents in use today were selected primarily for their ability to kill rapidly dividing cancer cells grown in cell culture and in mice, with their selectivity determined empirically during subsequent animal and human testing. Unfortunately, most of the drugs developed in this way have relatively low therapeutic indices (the dose that causes toxicity is not much higher than the dose needed for therapeutic effect). Recent advances in genomics are leading to a more complete picture of the range of mutations, both driver and passenger, present in human cancers. Synthetic lethality provides a conceptual framework for using this information to arrive at drugs that will preferentially kill cancer cells relative to normal cells. It also provides a possible way to tackle 'undruggable' targets. Two genes are synthetically lethal if mutation of either gene alone is compatible with viability but simultaneous mutation of both genes leads to death. If one is a cancer-relevant gene, the task is to discover its synthetic lethal interactors, because targeting these would theoretically kill cancer cells mutant in the cancer-relevant gene while sparing cells with a normal copy of that gene. All cancer drugs in use today, including conventional cytotoxic agents and newer 'targeted' agents, target molecules that are present in both normal cells and cancer cells. Their therapeutic indices almost certainly relate to synthetic lethal interactions, even if those interactions are often poorly understood. Recent technical advances enable unbiased screens for synthetic lethal interactors to be undertaken in human cancer cells. These approaches will hopefully facilitate the discovery of safer, more efficacious anticancer drugs that exploit vulnerabilities that are unique to cancer cells by virtue of the mutations they have accrued during tumor progression.
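    The definition above reduces to a simple predicate over viability calls from a knockout screen; the toy snippet below spells it out with invented gene names and outcomes, purely as an illustration of the logic.

```python
# Toy illustration of the synthetic-lethality definition: each single
# knockout is viable, but the double knockout is not. Gene names and
# viability calls are invented.
viability = {
    frozenset(): True,                        # wild type grows
    frozenset({"geneA"}): True,               # single mutants are viable
    frozenset({"geneB"}): True,
    frozenset({"geneA", "geneB"}): False,     # double mutant dies
}

def synthetically_lethal(gene1, gene2, viability):
    return (viability[frozenset({gene1})]
            and viability[frozenset({gene2})]
            and not viability[frozenset({gene1, gene2})])

print(synthetically_lethal("geneA", "geneB", viability))   # True
```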

    To be objective in Experimental Phenomenology: a Psychophysics application

    Background: Several scientific psychologists consider the Experimental Phenomenology approach to the study of perceptual problems to be problematic, arguing that phenomenological demonstrations are subjectively based and do not produce quantifiable results. Aim: The aim of this study is to show that Experimental Phenomenology can lead to objective and quantifiable conclusions, and to propose a procedure for obtaining objective measurements using the Rasch mathematical model, which can describe the experimental data gathered with Experimental Phenomenology procedures. Method: To demonstrate this, a simulated psychophysics study is presented. Results/conclusions: It is possible to carry out fundamental measurement starting from Experimental Phenomenology by way of the Theory of Conjoint Measurement.
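    As a hedged illustration of the kind of model invoked here, the snippet below simulates dichotomous judgments under the dichotomous Rasch model, in which the probability of endorsing an item depends only on the difference between a person parameter and an item parameter. The simulated observers and stimulus intensities are placeholders, not the paper's materials.

```python
# Dichotomous Rasch model: P(response = 1) depends only on (theta - b).
# Simulated observers and stimuli stand in for a psychophysics task.
import numpy as np

rng = np.random.default_rng(0)

def rasch_prob(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))     # logistic in (theta - b)

thetas = rng.normal(0.0, 1.0, size=50)            # 50 simulated observers
bs = np.linspace(-2.0, 2.0, 7)                    # 7 simulated stimulus intensities

probs = rasch_prob(thetas[:, None], bs[None, :])  # 50 x 7 response probabilities
responses = rng.binomial(1, probs)                # simulated dichotomous judgments
print(responses.mean(axis=0))                     # "easier" items endorsed more often
```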

    The subjective index for physical and social outcome (SIPSO) in stroke: investigation of its subscale structure

    Background: Short and valid measures of the impact of a stroke on integration are required in health and social settings. The Subjective Index of Physical and Social Outcome (SIPSO) is one such measure. However, there are questions whether scores can be summed into a total score or whether subscale scores should be calculated. This paper aims to provide clarity on the internal construct validity of the subscales and the total scale. Methods: SIPSO data were collected as part of two parallel surveys of the met and unmet needs of 445 younger people (aged 18-65) with non-recent stroke (at least one year) and living at home. Factor, Mokken and Rasch analysis were used. Results: Factor analysis supported a two-factor structure (explaining 68% of the variance), as did the Mokken analysis (overall Loevinger coefficient 0.77 for the Physical Integration subscale; 0.51 for the Social Integration subscale). Both subscales fitted the Rasch model (P > 0.01) after adjusting for some observed differential item functioning. The 10 items together did not fit the Rasch model. Conclusions: The SIPSO subscales are valid for use with stroke patients of working age but the total SIPSO is not. The conversion table can be used by clinicians and researchers to convert ordinal data to interval level prior to mathematical operations and other parametric procedures. Further work is required to explore the occurrence of bias by gender for some of the items.
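    The conversion-table idea in the conclusion rests on the Rasch model's ability to map ordinal raw scores onto interval-level (logit) measures. The sketch below shows the principle with hypothetical dichotomous items and invented difficulties, not the SIPSO calibration: for each raw score, it solves for the person measure whose expected score equals the observed score.

```python
# Ordinal-to-interval conversion under the Rasch model (illustration only):
# item difficulties are hypothetical, not the SIPSO calibration.
import numpy as np
from scipy.optimize import brentq

ITEM_DIFFICULTIES = np.linspace(-2.0, 2.0, 5)    # 5 hypothetical dichotomous items

def expected_score(theta, b=ITEM_DIFFICULTIES):
    return np.sum(1.0 / (1.0 + np.exp(-(theta - b))))

def raw_to_logit(raw_score):
    # Extreme scores (0 or the maximum) have no finite estimate; skip them here.
    return brentq(lambda t: expected_score(t) - raw_score, -10.0, 10.0)

for raw in range(1, 5):
    print(raw, round(raw_to_logit(raw), 2))       # interval-level measure per raw score
```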