
    Maternal dietary fatty acid intake during pregnancy and the risk of preclinical and clinical type 1 diabetes in the offspring.

    The aim of the present study was to examine the associations between the maternal intake of fatty acids during pregnancy and the risk of preclinical and clinical type 1 diabetes in the offspring. The study included 4887 children with human leucocyte antigen (HLA)-conferred type 1 diabetes susceptibility born during the years 1997-2004 from the Finnish Type 1 Diabetes Prediction and Prevention Study. Maternal diet was assessed with a validated FFQ. The offspring were observed at 3- to 12-month intervals for the appearance of type 1 diabetes-associated autoantibodies and the development of clinical type 1 diabetes (average follow-up period: 4·6 years (range 0·5-11·5 years)). Altogether, 240 children developed preclinical type 1 diabetes and 112 children developed clinical type 1 diabetes. A piecewise linear log-hazard survival model and Cox proportional-hazards regression were used for the statistical analyses. The maternal intake of palmitic acid (hazard ratio (HR) 0·82, 95 % CI 0·67, 0·99) and high consumption of cheese during pregnancy (highest quarter v. intermediate half: HR 0·52, 95 % CI 0·31, 0·87) were associated with a decreased risk of clinical type 1 diabetes. The consumption of sour milk products (HR 1·14, 95 % CI 1·02, 1·28), intake of protein from sour milk (HR 1·15, 95 % CI 1·02, 1·29) and intake of fat from fresh milk (HR 1·43, 95 % CI 1·04, 1·96) were associated with an increased risk of preclinical type 1 diabetes, and the intake of low-fat margarines (HR 0·67, 95 % CI 0·49, 0·92) was associated with a decreased risk. No conclusive associations between maternal fatty acid intake or food consumption during pregnancy and the development of type 1 diabetes in the offspring were detected.
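Hazard ratios like those above are reported on the exponentiated scale of a Cox model's log-hazard coefficient. As a minimal sketch (not the study's code, and with a hypothetical coefficient and standard error chosen only to roughly reproduce the reported palmitic-acid result), the HR and its Wald 95 % CI can be recovered from the log-hazard scale like this:

```python
import math

def hr_ci(beta, se, z=1.96):
    """Hazard ratio and Wald 95% CI from a Cox model
    log-hazard coefficient `beta` with standard error `se`."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical values, chosen to approximate HR 0.82 (95% CI 0.67, 0.99)
hr, lo, hi = hr_ci(-0.198, 0.100)
```

Note the CI is symmetric on the log scale but asymmetric on the HR scale, which is why published intervals around an HR below 1 look skewed.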

    Sensitivity analysis for clinical trials with missing continuous outcome data using controlled multiple imputation: a practical guide

    Missing data due to loss to follow‐up or intercurrent events are unintended, but unfortunately inevitable, in clinical trials. Since the true values of missing data are never known, it is necessary to assess the impact of untestable and unavoidable assumptions about any unobserved data in sensitivity analysis. This tutorial provides an overview of controlled multiple imputation (MI) techniques and a practical guide to their use for sensitivity analysis of trials with missing continuous outcome data. These include δ‐based and reference‐based MI procedures. In δ‐based imputation, an offset term, δ, is typically added to the expected value of the missing data to assess the impact of unobserved participants having a worse or better response than those observed. Reference‐based imputation draws imputed values with some reference to observed data in other groups of the trial, typically in other treatment arms. We illustrate the accessibility of these methods using data from a pediatric eczema trial and a chronic headache trial, and provide Stata code to facilitate adoption. We discuss issues surrounding the choice of δ in δ‐based sensitivity analysis. We also review the debate on variance estimation within reference‐based analysis and justify the use of Rubin's variance estimator in this setting since, as we elaborate on further within, it provides information-anchored inference.
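The δ‐based procedure described above can be sketched in a few lines. This is a simplified illustration, not the tutorial's Stata code: missing outcomes are drawn from the observed-data distribution and then shifted by an offset δ, with the analysis repeated over a range of δ values to probe departures from the missing-at-random assumption. The function name and the example data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def delta_impute(observed_mean, observed_sd, n_missing, delta, rng):
    """Draw imputed outcomes from the observed-data distribution,
    then shift them by an offset delta (delta-based sensitivity
    analysis). delta > 0 assumes dropouts did worse than observed."""
    draws = rng.normal(observed_mean, observed_sd, size=n_missing)
    return draws + delta

# Hypothetical observed outcome scores, with 3 participants missing
observed = np.array([4.1, 3.8, 5.0, 4.4, 3.9])
for delta in (0.0, 0.5, 1.0):  # range of assumed departures from MAR
    imputed = delta_impute(observed.mean(), observed.std(ddof=1),
                           3, delta, rng)
    completed = np.concatenate([observed, imputed])
    # analyse the completed dataset here; comparing the treatment
    # estimate across deltas shows how sensitive conclusions are
    # to the assumed offset
```

In practice each δ would be used within a full MI procedure (many imputed datasets, pooled with Rubin's rules) rather than a single completed dataset as shown here.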

    Combining estimates of interest in prognostic modelling studies after multiple imputation: current practice and guidelines

    Background: Multiple imputation (MI) provides an effective approach to handling missing covariate data within prognostic modelling studies, as it can properly account for the missing-data uncertainty. The multiply imputed datasets are each analysed using standard prognostic modelling techniques to obtain the estimates of interest. The estimates from each imputed dataset are then combined into one overall estimate and variance, incorporating both the within- and between-imputation variability. Rubin's rules for combining these multiply imputed estimates are based on asymptotic theory. The resulting combined estimates may be more accurate if the posterior distribution of the population parameter of interest is better approximated by the normal distribution. However, the normality assumption may not be appropriate for all the parameters of interest when analysing prognostic modelling studies, such as predicted survival probabilities and model performance measures. Methods: Guidelines for combining the estimates of interest when analysing prognostic modelling studies are provided. A literature review is performed to identify current practice for combining such estimates in prognostic modelling studies. Results: Methods for combining all reported estimates after MI were not well reported in the current literature. Rubin's rules without applying any transformations were the standard approach used, when any method was stated. Conclusion: The proposed simple guidelines for combining estimates after MI may lead to a wider and more appropriate use of MI in future prognostic modelling studies.
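Rubin's rules, as referred to above, pool the per-dataset estimates into one overall estimate and a total variance combining the within- and between-imputation components. A minimal sketch (the example numbers are hypothetical):

```python
import numpy as np

def rubins_rules(estimates, variances):
    """Pool point estimates and their variances from m imputed
    datasets using Rubin's rules."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = len(estimates)
    q_bar = estimates.mean()         # pooled point estimate
    w = variances.mean()             # within-imputation variance
    b = estimates.var(ddof=1)        # between-imputation variance
    t = w + (1 + 1 / m) * b          # total variance
    return q_bar, t

# Hypothetical estimates (e.g. log hazard ratios) from m = 5 datasets
est = [0.21, 0.25, 0.19, 0.23, 0.22]
var = [0.010, 0.011, 0.009, 0.010, 0.010]
q, t = rubins_rules(est, var)
```

This pooling on the raw scale is exactly the untransformed use of Rubin's rules that the review found to be standard practice; for bounded quantities such as predicted survival probabilities, the abstract's point is that pooling after a normalising transformation may be more appropriate.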

    A review of RCTs in four medical journals to assess the use of imputation to overcome missing data in quality of life outcomes

    Background: Randomised controlled trials (RCTs) are perceived as the gold-standard method for evaluating healthcare interventions, and increasingly include quality of life (QoL) measures. The observed results are susceptible to bias if a substantial proportion of outcome data are missing. The review aimed to determine whether imputation was used to deal with missing QoL outcomes. Methods: A random selection of 285 RCTs published during 2005/6 in the British Medical Journal, Lancet, New England Journal of Medicine and Journal of the American Medical Association were identified. Results: QoL outcomes were reported in 61 (21%) trials. Six (10%) reported having no missing data, 20 (33%) reported ≤ 10% missing, eleven (18%) 11%–20% missing, and eleven (18%) reported >20% missing. Missingness was unclear in 13 (21%). Missing data were imputed in 19 (31%) of the 61 trials. Imputation was part of the primary analysis in 13 trials, but a sensitivity analysis in six. Last value carried forward was used in 12 trials and multiple imputation in two. Following imputation, the most common analysis method was analysis of covariance (10 trials). Conclusion: The majority of studies did not impute missing data and carried out a complete-case analysis. For those studies that did impute missing data, researchers tended to prefer simpler methods of imputation, despite more sophisticated methods being available. The Health Services Research Unit is funded by the Chief Scientist Office of the Scottish Government Health Directorate. Shona Fielding is also currently funded by the Chief Scientist Office on a Research Training Fellowship (CZF/1/31).

    When the Transmission of Culture Is Child's Play

    Background: Humans frequently engage in arbitrary, conventional behavior whose primary purpose is to identify with cultural in-groups. The propensity for doing so is established early in human ontogeny as children become progressively enmeshed in their own cultural milieu. This is exemplified by their habitual replication of causally redundant actions shown to them by adults. Yet children seemingly ignore such actions shown to them by peers. How then does culture get transmitted intra-generationally? Here we suggest the answer might be 'in play'. Principal Findings: Using a diffusion chain design, preschoolers first watched an adult retrieve a toy from a novel apparatus using a series of actions, some of which were obviously redundant. These children could then show another child how to open the apparatus, who in turn could show a third child. When the adult modeled the actions in a playful manner, they were retained down to the third child at higher rates than when the adult seeded them in a functionally oriented way. Conclusions: Our results draw attention to the possibility that play might serve a critical function in the transmission of human culture by providing a mechanism for arbitrary ideas to spread between children.

    Assessment of low-dose cisplatin as a model of nausea and emesis in beagle dogs, potential for repeated administration

    Cisplatin is a highly emetogenic cancer chemotherapy agent, which is often used to induce nausea and emesis in animal models. The cytotoxic properties of cisplatin also cause adverse events that negatively impact animal welfare, preventing repeated administration of cisplatin. In this study, we assessed whether a low (subclinical) dose of cisplatin could be utilized as a model of nausea and emesis in the dog while decreasing the severity of adverse events to allow repeated administration. The emetic, nausea-like behavior and potential biomarker responses to both the clinical dose (70 mg/m2) and a low dose (15 mg/m2) of cisplatin were assessed. Plasma creatinine concentrations and granulocyte counts were used to assess adverse effects on the kidneys and bone marrow, respectively. Nausea-like behavior and emesis were induced by both doses of cisplatin, but the latency to onset was greater in the low-dose group. No significant change in plasma creatinine was detected for either dose group. Granulocytes were significantly reduced compared with baseline (P < 0.001) following the clinical dose, but not in the low-dose cisplatin group. Tolerability of repeated administration was assessed with 4 administrations of an 18 mg/m2 dose of cisplatin. Plasma creatinine did not change significantly. Cumulative effects on the granulocytes occurred: counts were significantly decreased from baseline (P = 0.03) at 3 weeks following cisplatin for the 4th administration only. Our results suggest that subclinical doses (15 and 18 mg/m2) of cisplatin induce nausea-like behavior and emesis but have reduced adverse effects compared with the clinical dose, allowing for repeated administration in crossover studies.