Trend tests for the evaluation of exposure-response relationships in epidemiological exposure studies
One possibility for the statistical evaluation of trends in epidemiological exposure studies is the use of a trend test for data organized in a 2 × k contingency table. Commonly, the exposure data are naturally grouped, or continuous exposure data are appropriately categorized. The trend test should be sensitive to any shape of the exposure-response relationship. A global trend test commonly determines only whether a trend exists. Once a trend is seen, it is important to identify the likely shape of the exposure-response relationship. This paper introduces a best contrast approach and an alternative approach based on order-restricted information criteria for the model selection of a particular exposure-response relationship. For the simple change point alternative H1: p1 = ... = pq < pq+1 = ... = pk, an appropriate approach for the identification of a global trend, as well as of the most likely shape of the exposure-response relationship, is characterized by simulation and demonstrated on real data examples. Power and simultaneous confidence intervals can be estimated as well. If the conditions are fulfilled to transform the exposure-response data into a 2 × k table, a simple approach for identification of a global trend and its elementary shape is available to epidemiologists.
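The kind of global trend test the abstract starts from can be illustrated with a Cochran-Armitage-style statistic for a 2 × k table; the counts and ordinal scores below are hypothetical, and the best-contrast and order-restricted approaches the paper introduces go beyond this basic test.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical 2 x k table: cases and non-cases across k = 4 exposure groups.
cases    = np.array([10, 15, 22, 30])
controls = np.array([90, 85, 78, 70])
scores   = np.array([0, 1, 2, 3])   # ordinal exposure scores

n = cases + controls                # group totals
N = n.sum()
p = cases.sum() / N                 # overall case proportion

# Cochran-Armitage trend statistic: covariance of case status with the scores,
# standardized by its variance under the null hypothesis of no trend.
num = np.sum(scores * (cases - n * p))
s_bar = np.sum(scores * n) / N
var = p * (1 - p) * np.sum(n * (scores - s_bar) ** 2)
z = num / np.sqrt(var)
p_value = 2 * norm.sf(abs(z))       # two-sided p-value
```

A significant z here only establishes that some monotone trend is present; identifying its shape (e.g., a change point) requires the contrast-based machinery described in the abstract.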
The role of intention and self-efficacy on the association between breastfeeding of first and second child, a Danish cohort study
Abstract Background The impact of parity on breastfeeding duration may be explained by physiological as well as psychosocial factors. The aim of the present study was to investigate the mediating influence of intention and self-efficacy on the association between the breastfeeding durations of the first and the following child. Methods A 5-year Danish cohort study with data from online questionnaires was used. Data came from 1162 women who participated in the “Ready for child” trial in 2006–2007 and gave birth to their second child within 5 years, in 2011–2013. Analysis included multiple regression models with exclusive/any breastfeeding duration of the first child as the exposure variables, intention and self-efficacy as mediators, and exclusive/any breastfeeding duration of the second child as the outcome variables. Results Duration of exclusive breastfeeding of the first child was significantly associated with exclusive breastfeeding duration of the second child (p < 0.001) and with the self-reported intention and self-efficacy regarding the ability to breastfeed the second child (p < 0.001). The exclusive breastfeeding period was slightly longer for the second child. Self-efficacy and intention mediated the association between breastfeeding duration of the first and second child. Together the two factors explained 48% of the association in exclusive breastfeeding and 27% of the association in any breastfeeding between the first and second child. Conclusion Due to a reinforcing effect of intention and self-efficacy, breastfeeding support should focus on helping first-time mothers to succeed as well as on identifying second-time mothers with low self-efficacy and an additional need for support.
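The 48%/27% figures above are the proportions of the exposure-outcome association explained by the mediators. A minimal sketch of how such a proportion can be computed with the difference method follows; the data are simulated (not the cohort data) and the coefficients are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulated illustration: breastfeeding duration for the first child (x)
# influences self-efficacy (m), and both influence the duration for the
# second child (y). All effect sizes are invented.
x = rng.normal(size=n)
m = 0.6 * x + rng.normal(size=n)             # mediator: self-efficacy
y = 0.3 * x + 0.5 * m + rng.normal(size=n)   # outcome: second-child duration

def coefs(y, cols):
    """OLS coefficients of y on the given columns (intercept included)."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    return np.linalg.lstsq(X, y, rcond=None)[0]

c = coefs(y, [x])[1]            # total effect of x on y
c_prime = coefs(y, [x, m])[1]   # direct effect, adjusting for the mediator

# Difference method: share of the total association explained by the mediator.
prop_mediated = (c - c_prime) / c
```

With the invented coefficients the true proportion mediated is 0.5 (indirect effect 0.6 × 0.5 over total effect 0.6), so the estimate lands near 50%.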
Mortality in infants of obese mothers: is risk modified by mode of delivery?
The effect of assessing genetic risk of prostate cancer on the use of PSA tests in primary care: a cluster randomized controlled trial
Background Assessing genetic lifetime risk for prostate cancer has been proposed as a means of risk stratification to identify those for whom prostate-specific antigen (PSA) testing is likely to be most valuable. This project aimed to test the effect of introducing a genetic test for lifetime risk of prostate cancer in general practice on future PSA testing. Methods and findings We performed a cluster randomized controlled trial with randomization at the level of general practices (73 in each of two arms) in the Central Region (Region Midtjylland) of Denmark. In intervention practices, men were offered, in addition to the standard PSA test, a genetic test (based on genotyping of 33 risk-associated single nucleotide polymorphisms) that informed them about their lifetime genetic risk of prostate cancer and distinguished between “normal” and “high” risk. The primary outcome was the proportion of men having a repeated PSA test within 2 years. A multilevel logistic regression model was used to test the association. After applying the exclusion criteria, 3,558 men were recruited in intervention practices, with 1,235 (34.7%) receiving the genetic test, and 4,242 men were recruited in control practices. Men with high genetic risk had a higher propensity for repeated PSA testing within 2 years than men with normal genetic risk (odds ratio [OR] = 8.94, p < 0.01). The study was conducted in routine practice and had some selection bias, as evidenced by the relatively large proportion of younger and higher-income participants taking the genetic test. Conclusions Providing general practitioners (GPs) with access to a genetic test to assess lifetime risk of prostate cancer did not reduce the overall number of future PSA tests. However, among men who had a genetic test, knowledge of genetic risk significantly influenced future PSA testing.
Factors predicting treatment of World Trade Center-related lung injury: a longitudinal cohort study
The factors that predict treatment of lung injury in occupational cohorts are poorly defined. We aimed to identify patient characteristics associated with initiation of treatment with an inhaled corticosteroid/long-acting beta-agonist (ICS/LABA) for >2 years among World Trade Center (WTC)-exposed firefighters. The study population included 8530 WTC-exposed firefighters. Multivariable logistic regression assessed the association of patient characteristics with ICS/LABA treatment for >2 years over two-year intervals from 11 September 2001 to 10 September 2017. Cox proportional hazards models measured the association of a high probability of ICS/LABA initiation with actual ICS/LABA initiation in subsequent intervals. Between 11 September 2001 and 1 July 2018, 1629/8530 (19.1%) firefighters initiated ICS/LABA treatment for >2 years. Forced expiratory volume in 1 s (FEV1), wheeze, and dyspnea were consistently and independently associated with ICS/LABA treatment. High-intensity WTC exposure was associated with ICS/LABA between 11 September 2001 and 10 September 2003. The 10th percentile of risk for ICS/LABA between 11 September 2005 and 10 September 2007 was associated with a 3.32-fold increased hazard of actual ICS/LABA initiation in the subsequent 4 years. In firefighters with WTC exposure, FEV1, wheeze, and dyspnea were independently associated with prolonged ICS/LABA treatment. A high risk for treatment was identifiable from routine monitoring exam results years before treatment initiation.
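A Cox proportional hazards model, as used above, estimates how a covariate multiplies the hazard of an event such as treatment initiation. A minimal sketch on simulated data follows; the binary exposure variable and the hazard ratio of 2 are invented, and this hand-rolled Breslow partial likelihood for a single covariate stands in for a full survival analysis package.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
n = 2000
x = rng.integers(0, 2, size=n).astype(float)  # hypothetical binary exposure
true_beta = np.log(2.0)                       # true hazard ratio of 2

# Exponential event times with hazard exp(true_beta * x), plus random censoring.
t = rng.exponential(np.exp(-true_beta * x))
c = rng.exponential(2.0, size=n)
time = np.minimum(t, c)
event = (t <= c).astype(float)

# Sort by time so risk sets are suffixes of the sorted arrays.
order = np.argsort(time)
time, event, x = time[order], event[order], x[order]

def neg_log_partial_likelihood(beta):
    eta = beta * x
    # Risk-set sums: reverse cumulative sum of exp(eta) over sorted times.
    risk = np.cumsum(np.exp(eta)[::-1])[::-1]
    return -np.sum(event * (eta - np.log(risk)))

beta_hat = minimize_scalar(neg_log_partial_likelihood).x
hazard_ratio = np.exp(beta_hat)   # should recover roughly 2
```

The partial likelihood compares, at each event time, the exposed subject's relative hazard against everyone still at risk, which is why no baseline hazard needs to be specified.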
Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study
Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1 associated morbidity and mortality. However, HIV-1 infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1 infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs to predict mortality in HIV-1 infected ART-treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART-treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular, and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6, and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular, or malignancy mortality. The levels of three miRNAs (miRs-21, -122, and -200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miRs-31, -150, and -223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found for any circulating miRNA studied. These results cast doubt on the effectiveness of circulating miRNAs as early predictors of mortality or of the major underlying diseases that contribute to mortality in participants treated for HIV-1 infection.
Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study
Ristola M. is a member of the D:A:D Study Group, the Royal Free Hospital Clinical Cohort, the INSIGHT Study Group, the SMART Study Group, and the ESPRIT Study Group working groups. Background Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (>3 mo apart) eGFR <= 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with progressively higher risk through the medium and high risk groups (risk score >= 5, 505 events).
Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians in weighing the benefits of certain antiretrovirals against the risk of CKD and in identifying those at greatest risk of CKD.
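NNTH figures like those above follow from absolute risk differences: NNTH is the reciprocal of the absolute risk increase attributable to the drug. A toy calculation follows, using the abstract's 1:393 low-risk 5-y CKD risk together with an invented rate ratio (not a D:A:D estimate).

```python
# Illustrative NNTH calculation. The baseline risk comes from the abstract's
# low risk group; the rate ratio is hypothetical, purely for illustration.
baseline_risk = 1 / 393          # 5-y CKD risk in the low risk group
rate_ratio = 1.9                 # hypothetical relative risk on the drug

risk_on_drug = baseline_risk * rate_ratio
absolute_risk_increase = risk_on_drug - baseline_risk
nnth = 1 / absolute_risk_increase   # ~437: treat this many to cause one extra case
```

Because the baseline risk in the low risk group is so small, even a near-doubling of relative risk yields a large NNTH, which is why the abstract reports far smaller NNTH values only for the high risk group.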