
    The pharmaceutical death-ride of dihydroartemisinin

    In the 2010 second edition of WHO's guidelines for the treatment of malaria, the relatively new fixed-dose combination dihydroartemisinin-piperaquine is included as one of the recommended artemisinin combination therapies. However, experimental testing demonstrates that, owing to its intrinsic chemical instability, dihydroartemisinin is not suitable for use in pharmaceutical formulations. In addition, data show that the currently available dihydroartemisinin preparations fail to meet the internationally accepted stability requirements. At a time when many efforts aim to ban counterfeit and substandard drugs from the malaria market, the obvious question arises of how WHO and public-private partnerships, such as the Medicines for Malaria Venture (MMV), can support the production and marketing of anti-malarial drugs that do not even meet the International Pharmacopoeia requirements.

    Psychometric properties of questionnaires evaluating health-related quality of life and functional status in polytrauma patients with lower extremity injury

    BACKGROUND: Long-term disability is common among polytrauma patients. However, as yet little information exists on how to adequately measure functional status and health-related quality of life following polytrauma. AIMS: To establish the unidimensionality, internal consistency and validity of two health-related quality of life measures and one functional status questionnaire among polytrauma patients. METHODS: 186 patients with severe polytrauma including lower extremity injury completed the Sickness Impact Profile-136 (SIP-136), the Medical Outcomes Study 36-Item Short Form Health Survey (SF-36) and the Groningen Activity Restriction Scale (GARS) 15 months after injury. Unidimensionality and internal consistency were assessed by principal components analysis and Cronbach's alpha. To test the construct validity of the questionnaires, predetermined hypotheses were tested. RESULTS: The unidimensionality and internal consistency of the GARS and the SF-36, but not the SIP-136, were supported. The construct validity of the SF-36 and GARS, and to a lesser extent the SIP-136, was confirmed. CONCLUSION: The SF-36 and the GARS appear preferable to the SIP-136 for use in polytrauma patients.
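Internal consistency in this study was summarised with Cronbach's alpha. As a minimal generic sketch of the statistic itself (not the authors' analysis code), alpha can be computed from an item-score matrix as follows:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for `items`, a list of respondents' item-score lists.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    k = len(items[0])  # number of items in the questionnaire

    def var(xs):  # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_var = var([sum(row) for row in items])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

Perfectly correlated items yield alpha = 1; values of roughly 0.7-0.9 are conventionally read as acceptable to good internal consistency.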

    Quantitative trait loci for bone traits segregating independently of those for growth in an F-2 broiler X layer cross

    An F2 broiler-layer cross was phenotyped for 18 skeletal traits at 6, 7 and 9 weeks of age and genotyped with 120 microsatellite markers. Interval mapping identified 61 suggestive and significant QTL on 16 of the 25 linkage groups for 16 traits. Thirty-six additional QTL were identified when the assumption that QTL were fixed in the grandparent lines was relaxed. QTL with large effects on the lengths of the tarsometatarsus, tibia and femur, and the weights of the tibia and femur, were identified on GGA4 between 217 and 249 cM. Six QTL for skeletal traits were identified that did not co-locate with genome-wide significant QTL for body weight, and two body weight QTL did not coincide with skeletal trait QTL. Significant evidence of imprinting was found for ten of the QTL, and QTL × sex interactions were identified for 22 traits. Six alleles from the broiler line for weight- and size-related skeletal QTL were positive. Negative alleles for bone quality traits such as tibial dyschondroplasia, leg bowing and tibia twisting generally originated from the layer line, suggesting that the allele inherited from the broiler is more protective than the allele originating from the layer.

    Prospective associations between early childhood parental feeding practices and eating disorder symptoms and disordered eating behaviors in adolescence

    OBJECTIVE: Nonresponsive parental feeding practices are associated with poorer appetite self-regulation in children. It is unknown whether this relationship extends beyond childhood to be prospectively associated with the onset of eating disorder (ED) symptoms in adolescence. This exploratory study therefore investigated prospective associations between early childhood parental feeding practices and adolescent ED symptoms and disordered eating behaviors. METHODS: Data were from two population-based cohorts with harmonized measures: Generation R (Netherlands; n = 4900) and Gemini (UK; n = 2094). Parents self-reported their pressure to eat, restriction and instrumental feeding (i.e., using food as a reward) at child age 4-5 years. Adolescents self-reported their compensatory behaviors (e.g., fasting, purging), binge-eating symptoms, restrained eating, uncontrolled eating, and emotional eating at 12-14 years. Associations between feeding practices and ED symptoms were examined separately in each cohort using generalized linear models. RESULTS: In Gemini, pressure to eat in early childhood was associated with adolescents engaging in compensatory behaviors. In Generation R, parental restriction was associated with adolescents engaging in compensatory behaviors, restrained eating, uncontrolled eating, and emotional eating. Instrumental feeding was associated with uncontrolled eating and emotional eating in Generation R. DISCUSSION: Nonresponsive parental feeding practices were associated with a greater frequency of specific ED symptoms and disordered eating in adolescence, although effect sizes were small and findings were inconsistent between cohorts. Potentially, the cultural and developmental context in which child-parent feeding interactions occur is important for ED symptoms. Further replication studies are required to better understand parents' role in the development and maintenance of ED-related symptoms. 
PUBLIC SIGNIFICANCE: Prospective research examining how early childhood parental feeding practices might contribute to adolescent ED symptoms is limited. In two population-based cohorts, nonresponsive feeding practices (restriction, instrumental feeding, pressure to eat) predicted increased frequency of some ED symptoms and disordered eating behaviors in adolescence, although associations were small and further replication is required. Findings support the promotion of responsive feeding practices, which may benefit young children's developing relationship with food.

    Early childhood appetitive traits and eating disorder symptoms in adolescence: a 10-year longitudinal follow-up study in the Netherlands and the UK

    BACKGROUND: Obesity and eating disorders commonly co-occur and might share common risk factors. Appetite avidity is an established neurobehavioural risk factor for obesity from early life, but the role of appetite in eating disorder susceptibility is unclear. We aimed to examine longitudinal associations between appetitive traits in early childhood and eating disorder symptoms in adolescence. METHODS: In this longitudinal cohort study, we used data from Generation R (based in Rotterdam, the Netherlands) and Gemini (based in England and Wales). Appetitive traits at age 4-5 years were measured using the parent-reported Child Eating Behaviour Questionnaire. At age 12-14 years, adolescents self-reported on overeating eating disorder symptoms (binge eating symptoms, uncontrolled eating, and emotional eating) and restrictive eating disorder symptoms (compensatory behaviours and restrained eating). Missing data on covariates were imputed using Multivariate Imputation via Chained Equations. Ordinal and binary logistic regressions were performed in each cohort separately and adjusted for confounders. Pooled results were obtained by meta-analyses. Sensitivity analyses were performed on complete cases using inverse probability weighting. FINDINGS: The final study sample included 2801 participants from Generation R and 869 participants from Gemini. Pooled findings after meta-analyses showed that higher food responsiveness in early childhood increased the odds of binge eating symptoms (odds ratio [OR]pooled 1·47, 95% CI 1·26-1·72), uncontrolled eating (1·33, 1·21-1·46), emotional eating (1·26, 1·13-1·41), restrained eating (1·16, 1·06-1·27), and compensatory behaviours (1·18, 1·08-1·30) in adolescence. Greater emotional overeating in early childhood increased the odds of compensatory behaviours (1·18, 1·06-1·33). 
By contrast, greater satiety responsiveness in early childhood decreased the odds of compensatory behaviours (0·89, 0·81-0·99) and uncontrolled eating (0·86, 0·78-0·95) in adolescence. Slower eating in early childhood decreased the odds of compensatory behaviours (0·91, 0·84-0·99) and restrained eating (0·90, 0·83-0·98) in adolescence. No other associations were observed. INTERPRETATION: In this study, higher food responsiveness in early childhood was associated with a higher likelihood of self-reported eating disorder symptoms in adolescence, whereas greater satiety sensitivity and slower eating were associated with a lower likelihood of some eating disorder symptoms. Appetitive traits in children might be early neurobehavioural risk factors for, or markers of, subsequent eating disorder symptoms. FUNDING: MQ Mental Health Research, Rosetrees Trust, ZonMw.
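The cohort-specific estimates above were combined by meta-analysis. The abstract does not state the exact model; one common fixed-effect approach (an assumption here, not necessarily the authors' method) pools log odds ratios with inverse-variance weights, recovering each standard error from the reported 95% CI:

```python
import math

def pool_odds_ratios(estimates):
    """Fixed-effect inverse-variance pooling of (OR, ci_low, ci_high) tuples."""
    z = 1.959964  # two-sided 95% normal quantile
    num = den = 0.0
    for or_, lo, hi in estimates:
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # SE on the log-OR scale
        w = 1.0 / se ** 2                             # inverse-variance weight
        num += w * math.log(or_)
        den += w
    log_pooled, se_pooled = num / den, math.sqrt(1.0 / den)
    return (math.exp(log_pooled),
            math.exp(log_pooled - z * se_pooled),
            math.exp(log_pooled + z * se_pooled))
```

Pooling identical estimates returns the same OR with a narrower CI, which is why pooled intervals such as 1·47 (1·26-1·72) can be tighter than either cohort's alone.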

    Comparative institutional analysis for public health: governing voluntary collaborative agreements for public health in England and the Netherlands.

    Democratic institutions and state-society relations shape governance arrangements and expectations between public and private stakeholders about public health impact. We illustrate this with a comparison between the English Public Health Responsibility Deal (RD) and the Dutch 'All About Health…' (AaH) programme. As manifestations of a Whole-of-Society approach, in which governments, civil society and business take responsibility for the co-production of economic utility and good health, these programmes are two recent collaborative platforms based on voluntary agreements to improve public health. Using a 'most similar cases' design, we conducted a comparative secondary analysis of data from the evaluations of the two programmes. The underlying rationale of both programmes was that voluntary agreements would be better suited than regulation to encourage business and civil society to take more responsibility for improving health. Differences between the two included: expectations of an enforcing versus facilitative role for government; hierarchical versus horizontal coordination; big business versus civil society participants; top-down versus bottom-up formulation of voluntary pledges; and progress monitoring for accountability versus for learning and adaptation. Despite the attempt in both programmes to base voluntary commitments on trust, the English 'shadow of hierarchy' and adversarial state-society relationships conditioned non-governmental parties to see the pledges as controlling, quasi-contractual agreements that were only partially lived up to. The Dutch consensual political tradition enabled a civil society-based understanding and gradual acceptance of the pledges as the internalization by partner organizations of public health values within their operations. We conclude that there are institutional limitations to the implementation of generic trust-building and learning-based models of change in 'Whole-of-Society' approaches.

    Method for evaluating prediction models that apply the results of randomized trials to individual patients

    INTRODUCTION: The clinical significance of a treatment effect demonstrated in a randomized trial is typically assessed by reference to differences in event rates at the group level. An alternative is to make individualized predictions for each patient based on a prediction model. This approach is growing in popularity, particularly for cancer. Despite its intuitive advantages, it remains plausible that some prediction models may do more harm than good. Here we present a novel method for determining whether predictions from a model should be used to apply the results of a randomized trial to individual patients, as opposed to using group-level results. METHODS: We propose applying the prediction model to a data set from a randomized trial and examining the results of patients for whom the treatment arm recommended by the prediction model is congruent with allocation. These results are compared with the strategy of treating all patients through use of a net benefit function that incorporates both the number of patients treated and the outcome. We examined models developed using data sets regarding adjuvant chemotherapy for colorectal cancer and dutasteride for benign prostatic hypertrophy. RESULTS: For adjuvant chemotherapy, we found that patients who would opt for chemotherapy even for small risk reductions, and, conversely, those who would require a very large risk reduction, would on average be harmed by using a prediction model; those with intermediate preferences would on average benefit by allowing such information to help their decision making. Use of prediction could, at worst, lead to the equivalent of an additional death or recurrence per 143 patients; at best it could lead to the equivalent of a reduction in the number of treatments of 25% without an increase in event rates.
In the dutasteride case, where the average benefit of treatment is more modest, there is a small benefit of prediction modelling, equivalent to a reduction of one event for every 100 patients given an individualized prediction. CONCLUSION: The size of the benefit associated with appropriate clinical implementation of a good prediction model is sufficient to warrant development of further models. However, care is advised in the implementation of prediction modelling, especially for patients who would opt for treatment even if it was of relatively little benefit.
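The evaluation above rests on a net benefit function trading off outcomes against the number of patients treated. The abstract does not give its exact form; one widely used formulation from decision-curve analysis (an assumption here, with `pt` denoting the patient's threshold probability for opting into treatment) is:

```python
def net_benefit(true_pos, false_pos, n, pt):
    """Decision-curve-style net benefit at threshold probability pt.

    True positives are credited one unit per patient; false positives
    are debited at the odds pt/(1 - pt) implied by the threshold at
    which a patient would accept treatment.
    """
    return true_pos / n - (false_pos / n) * (pt / (1 - pt))
```

A strategy (such as model-guided treatment) is preferred at a given threshold if its net benefit exceeds both "treat all" and "treat none" (net benefit 0).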

    Does codon bias have an evolutionary origin?

    BACKGROUND: There is a 3-fold redundancy in the Genetic Code; most amino acids are encoded by more than one codon. These synonymous codons are not used equally; there is a Codon Usage Bias (CUB). This article provides novel information about the origin and evolution of this bias. RESULTS: Codon Usage Bias (CUB, defined here as deviation from equal usage of synonymous codons) was studied in 113 species. The average CUB was 29.3 ± 1.1% (S.E.M., n = 113) of the theoretical maximum and declined progressively with evolution and increasing genome complexity. A Pan-Genomic Codon Usage Frequency (CUF) Table was constructed to describe genome-wide relationships among codons. Significant correlations were found between the number of synonymous codons and (i) the frequency of the respective amino acids and (ii) the size of the CUB. Numerous, statistically highly significant, internal correlations were found among codons and the nucleic acids they comprise. These strong correlations made it possible to predict missing synonymous codons (wobble bases) reliably from the remaining codons or codon residues. CONCLUSION: The results put the concept of "codon bias" into a novel perspective. The internal connectivity of codons indicates that all synonymous codons might be integrated parts of the Genetic Code with equal importance in maintaining its functional integrity.
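The abstract defines CUB as deviation from equal usage of synonymous codons, expressed as a fraction of the theoretical maximum. The paper's exact normalisation is not given, so the sketch below uses one plausible scaling (an assumption, not the authors' formula): 0 for perfectly uniform usage within a synonymous family and 1 when a single codon is used exclusively.

```python
def codon_usage_bias(counts):
    """CUB for one synonymous-codon family given {codon: count}.

    Sums absolute deviations from equal usage and divides by the
    deviation obtained when all usage falls on a single codon.
    """
    values = list(counts.values())
    total, k = sum(values), len(values)
    if k < 2 or total == 0:
        return 0.0  # no bias is definable for a single codon or empty family
    expected = total / k
    deviation = sum(abs(c - expected) for c in values)
    max_deviation = 2 * total * (k - 1) / k  # one codon carries all usage
    return deviation / max_deviation
```

For example, a two-codon family used 50/50 scores 0, while 100/0 scores 1; a genome-wide average of such scores corresponds to the "percent of theoretical maximum" reported above.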

    Intracranial bleeding due to vitamin K deficiency: advantages of using a pediatric intensive care registry

    AIM: To determine the incidence of late intracranial vitamin K deficiency bleeding (VKDB) in the Netherlands using the Dutch Pediatric Intensive Care Evaluation (PICE) registry. METHODS: The PICE registry was used to identify all infants who were admitted to a Dutch pediatric intensive care unit (PICU) with intracranial bleeding between 1 January 2004 and 31 December 2007. Cases of confirmed late intracranial VKDB were used to calculate the incidence for each year. To estimate the completeness of ascertainment of the PICE registry, data from 2005 were compared with general surveillance data from that year. RESULTS: In the 4-year study period, 16/64 (25%) of the infants admitted with intracranial bleeding had late intracranial VKDB, resulting in an overall incidence of 2.1/100,000 live births (95% confidence interval 1.2-3.5). The single-year incidence varied markedly between 0.5 and 3.3 per 100,000 live births. All five ascertained cases in 2005 were identified using the PICE registry, while general surveillance identified only three. CONCLUSIONS: The PICE registry allows ongoing monitoring of the incidence of late intracranial VKDB and appears to be associated with a higher rate of completeness than general surveillance. We propose the use of pediatric intensive care registries to assess the efficacy of national vitamin K prophylactic regimens.

    Phage typing or CRISPR typing for epidemiological surveillance of Salmonella Typhimurium?

    Objective: Salmonella Typhimurium is the most dominant Salmonella serovar around the world. It is associated with foodborne gastroenteritis outbreaks but has recently also been associated with invasive illness and deaths. Characterization of S. Typhimurium is therefore crucial for epidemiological surveillance. Phage typing has been used for decades to subtype S. Typhimurium and determine the epidemiological relationships among isolates. Recent studies, however, have suggested that high-throughput clustered regularly interspaced short palindromic repeats (CRISPR) typing has the potential to replace phage typing. This study aimed to determine the efficacy of high-throughput CRISPR typing over conventional phage typing in the epidemiological surveillance and outbreak investigation of S. Typhimurium. Results: In silico analysis of whole genome sequences (WGS) of well-documented phage types of S. Typhimurium reveals different CRISPR types among strains belonging to the same phage type. Furthermore, different phage types of S. Typhimurium can share an identical CRISPR type. Interestingly, identical spacers were detected among outbreak-associated and non-outbreak-associated DT8 strains of S. Typhimurium. Therefore, CRISPR typing is not useful for the epidemiological surveillance and outbreak investigation of S. Typhimurium; until it is replaced by WGS, phage typing remains the gold standard method for epidemiological surveillance of S. Typhimurium.