
    A nonlinear dosimetric model for hemoglobin adduct formation by the neurotoxic agent acrylamide and its genotoxic metabolite glycidamide.

    Hemoglobin (Hb) adducts, formed by the neurotoxic agent acrylamide (AA) and its genotoxic metabolite glycidamide (GA), were measured in the rat by means of a method for simultaneous determination of the adducts formed to cysteine. A novel, nonlinear dosimetric model was developed to describe Hb adduct formation. This model incorporates the saturable kinetics of the in vivo metabolic conversion of AA to GA. The pharmacokinetic parameters Vmax and Km, together with the first-order rates of elimination k1 and k2 for AA and GA from all processes except the conversion of AA to GA, were estimated directly from Hb adduct data as 19 microM hr-1, 66 microM, 0.21 hr-1, and 0.48 hr-1, respectively. At low concentrations, approximately 60% of AA was metabolized to GA. The nonlinear dosimetric model for adduct formation has potential general applicability in high-to-low-dose extrapolation of genotoxic effects.
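The saturable conversion step in the abstract is Michaelis-Menten kinetics. The sketch below uses the parameter values reported in the abstract; the fraction-metabolized formula is an assumption about how the saturable pathway competes with the first-order elimination k1, not a reproduction of the authors' full model.

```python
# Sketch of the saturable (Michaelis-Menten) conversion of acrylamide (AA)
# to glycidamide (GA). Parameter values are taken from the abstract; the
# fraction-metabolized formula is an assumed simplification.

VMAX = 19.0   # microM/hr, maximal rate of AA -> GA conversion
KM = 66.0     # microM, Michaelis constant
K1 = 0.21     # 1/hr, first-order elimination of AA (all routes except AA -> GA)

def conversion_rate(aa_conc):
    """Rate of GA formation (microM/hr) at AA concentration aa_conc (microM)."""
    return VMAX * aa_conc / (KM + aa_conc)

def fraction_to_ga(aa_conc):
    """Fraction of AA cleared via conversion to GA at a given concentration."""
    v = conversion_rate(aa_conc)
    return v / (v + K1 * aa_conc)

# At low concentration the saturable pathway behaves first-order with rate
# Vmax/Km, so (Vmax/Km) / (Vmax/Km + k1) of AA goes to GA.
print(round(fraction_to_ga(0.01), 2))  # ~0.58, consistent with the ~60% reported
```

With the reported parameters, the low-dose fraction works out to about 0.58, which agrees with the "approximately 60%" stated in the abstract; at high concentrations the conversion saturates at Vmax and the fraction falls.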

    Impact of Sample Type and DNA Isolation Procedure on Genomic Inference of Microbiome Composition

    Explorations of complex microbiomes using genomics greatly enhance our understanding about their diversity, biogeography, and function. The isolation of DNA from microbiome specimens is a key prerequisite for such examinations, but challenges remain in obtaining sufficient DNA quantities required for certain sequencing approaches, achieving accurate genomic inference of microbiome composition, and facilitating comparability of findings across specimen types and sequencing projects. These aspects are particularly relevant for the genomics-based global surveillance of infectious agents and antimicrobial resistance from different reservoirs. Here, we compare in a stepwise approach a total of eight commercially available DNA extraction kits and 16 procedures based on these for three specimen types (human feces, pig feces, and hospital sewage). We assess DNA extraction using spike-in controls and different types of beads for bead beating, facilitating cell lysis. We evaluate DNA concentration, purity, and stability and microbial community composition using 16S rRNA gene sequencing and for selected samples using shotgun metagenomic sequencing. Our results suggest that inferred community composition was dependent on inherent specimen properties as well as DNA extraction method. We further show that bead beating or enzymatic treatment can increase the extraction of DNA from Gram-positive bacteria. Final DNA quantities could be increased by isolating DNA from a larger volume of cell lysate than that in standard protocols. Based on this insight, we designed an improved DNA isolation procedure optimized for microbiome genomics that can be used for the three examined specimen types and potentially also for other biological specimens. A standard operating procedure is available from https://dx.doi.org/10.6084/m9.figshare.3475406. 
IMPORTANCE Sequencing-based analyses of microbiomes may lead to a breakthrough in our understanding of the microbial worlds associated with humans, animals, and the environment. Such insight could further the development of innovative ecosystem management approaches for the protection of our natural resources and the design of more effective and sustainable solutions to prevent and control infectious diseases. Genome sequence information is an organism (pathogen)-independent language that can be used across sectors, space, and time. Harmonized standards, protocols, and workflows for sample processing and analysis can facilitate the generation of such actionable information. In this study, we assessed several procedures for the isolation of DNA for next-generation sequencing. Our study highlights several important aspects to consider in the design and conduct of sequence-based analysis of microbiomes. We provide a standard operating procedure for the isolation of DNA from a range of biological specimens particularly relevant in clinical diagnostics and epidemiology

    Patient-Reported Morbidity Instruments: A Systematic Review

    Objectives: Although comorbidities play an essential role in risk adjustment and outcomes measurement, there is little consensus regarding the best source of these data. The aim of this study was to identify general patient-reported morbidity instruments and their measurement properties. Methods: A systematic review was conducted using multiple electronic databases (Embase, Medline, Cochrane Central, and Web of Science) from inception to March 2018. Articles focusing primarily on the development or subsequent validation of a patient-reported morbidity instrument were included. After relevant articles were included, the measurement properties of each morbidity instrument were extracted by 2 investigators for narrative synthesis. Results: A total of 1005 articles were screened, of which 34 eligible articles were ultimately included. The most widely assessed instruments were the Self-Reported Charlson Comorbidity Index (n = 7), the Self-Administered Comorbidity Questionnaire (n = 3), and the Disease Burden Morbidity Assessment (n = 3). The most commonly included conditions were diabetes, hypertension, and myocardial infarction. Studies demonstrated substantial variability in item-level reliability versus the gold standard of medical record review (κ range 0.66-0.86), meaning that the accuracy of self-reported comorbidity data depends on the selected morbidity. Conclusions: The Self-Reported Charlson Comorbidity Index and the Self-Administered Comorbidity Questionnaire were the most frequently cited instruments. Significant variability was observed in the reliability of patient-reported morbidity questionnaires per comorbid condition. Further research is needed to determine whether patient-reported morbidity data should be used to bolster medical record data or serve as a stand-alone entity when risk-adjusting observational outcomes data.
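The agreement statistic quoted in the review (κ range 0.66-0.86) is Cohen's kappa. The sketch below shows how it is computed for a single condition from a 2x2 agreement table; the counts used are hypothetical, not taken from any of the included studies.

```python
# Illustrative computation of Cohen's kappa for agreement between
# self-reported and medical-record comorbidity status on one condition.
# The 2x2 counts below are hypothetical.

def cohens_kappa(both_yes, self_only, record_only, both_no):
    """Kappa for binary agreement between self-report and medical record."""
    n = both_yes + self_only + record_only + both_no
    observed = (both_yes + both_no) / n
    # Expected chance agreement from the marginal "yes" rates of each source.
    p_self_yes = (both_yes + self_only) / n
    p_record_yes = (both_yes + record_only) / n
    expected = p_self_yes * p_record_yes + (1 - p_self_yes) * (1 - p_record_yes)
    return (observed - expected) / (1 - expected)

# Example: 80 patients agree "yes", 10 self-report only, 5 record only,
# 105 agree "no".
print(round(cohens_kappa(80, 10, 5, 105), 2))  # 0.85
```

Because kappa corrects observed agreement for the agreement expected by chance, two conditions with the same raw agreement can yield different kappas, which is one reason reliability varies per comorbid condition.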

    Evidence in action: a Thompsonian perspective on evidence-based decision-making in social work

    Evidence-based practice presupposes evidence-based decision-making. In the debate it is argued that a social work fashioned after evidence would be more rational, less authoritarian, and built on scientific knowledge, respect and ethics. Yet the empirical evidence that this idea works is weak. In fact, the difficulties encountered during efforts to implement evidence may be a sound, defensive organizational reaction to a new and disturbing technology. In this article, James D. Thompson's classical study Organizations in Action from 1967 is applied to evidence-based decision-making in social work. It shows that, to date, many problems have been given at best tenuous attention. It is argued that a focus on evidence will raise levels of ambiguity and complexity within organizations and that new professional specialists will emerge. Further, new constellations of power will appear, leading to a change of balance within the domains of social work.

    Late symptoms in long-term gynaecological cancer survivors after radiation therapy: a population-based cohort study.

    BACKGROUND: We surveyed the occurrence of physical symptoms among long-term gynaecological cancer survivors after pelvic radiation therapy, and compared with population-based control women. METHODS: We identified a cohort of 789 eligible gynaecological cancer survivors treated with pelvic radiation therapy alone or combined with surgery in Stockholm or Gothenburg, Sweden. A control group of 478 women was randomly sampled from the Swedish Population Registry. Data were collected through a study-specific validated postal questionnaire with 351 questions concerning gastrointestinal and urinary tract function, lymph oedema, pelvic bones and sexuality. Clinical characteristics and treatment details were retrieved from medical records. RESULTS: Participation rate was 78% for gynaecological cancer survivors and 72% for control women. Median follow-up time after treatment was 74 months. Cancer survivors reported a higher occurrence of symptoms from all organs studied. The highest age-adjusted relative risk (RR) was found for emptying of all stools into clothing without forewarning (RR 12.7), defaecation urgency (RR 5.7), difficulty feeling the need to empty the bladder (RR 2.8), protracted genital pain (RR 5.0), pubic pain when walking indoors (RR 4.9) and erysipelas on abdomen or legs at least once during the past 6 months (RR 3.6). Survivors treated with radiation therapy alone showed in general higher rates of symptoms. CONCLUSION: Gynaecological cancer survivors previously treated with pelvic radiation report a higher occurrence of symptoms from the urinary and gastrointestinal tract as well as lymph oedema, sexual dysfunction and pelvic pain compared with non-irradiated control women. Health-care providers need to actively ask patients about specific symptoms in order to provide proper diagnostic investigations and management
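The relative risks quoted above compare symptom occurrence in survivors with that in population controls. A crude (unadjusted) RR is just a ratio of the two proportions, as in the sketch below; the counts are hypothetical, and the abstract's age-adjusted RRs additionally require stratification by age group.

```python
# Crude relative risk (RR) for a symptom in cancer survivors vs population
# controls. Counts are hypothetical, for illustration only; the study reports
# age-adjusted RRs.

def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Risk ratio: symptom occurrence in survivors relative to controls."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Example: 120 of 600 survivors vs 14 of 350 controls report a given symptom.
print(round(relative_risk(120, 600, 14, 350), 1))  # 5.0
```

An RR of 5.0 means the symptom is five times as common among survivors as among controls; an RR near 1 would indicate no excess occurrence.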

    Scholars’ open debate paper on the World Health Organization ICD-11 gaming disorder proposal

    Concerns about problematic gaming behaviors deserve our full attention. However, we claim that it is far from clear that these problems can or should be attributed to a new disorder. The empirical basis for a Gaming Disorder proposal, such as in the new ICD-11, suffers from fundamental issues. Our main concerns are the low quality of the research base, the fact that the current operationalization leans too heavily on substance use and gambling criteria, and the lack of consensus on symptomatology and assessment of problematic gaming. The act of formalizing this disorder, even as a proposal, has negative medical, scientific, public-health, societal, and human rights fallout that should be considered. Of particular concern are moral panics around the harm of video gaming. They might result in premature application of diagnosis in the medical community and the treatment of abundant false-positive cases, especially for children and adolescents. Second, research will be locked into a confirmatory approach, rather than an exploration of the boundaries of normal versus pathological. Third, the healthy majority of gamers will be affected negatively. We expect that the premature inclusion of Gaming Disorder as a diagnosis in ICD-11 will cause significant stigma to the millions of children who play video games as a part of a normal, healthy life. At this point, suggesting formal diagnoses and categories is premature: the ICD-11 proposal for Gaming Disorder should be removed to avoid a waste of public health resources as well as to avoid causing harm to healthy video gamers around the world

    A weak scientific basis for gaming disorder: let us err on the side of caution

    We greatly appreciate the care and thought that is evident in the 10 commentaries that discuss our debate paper, the majority of which argued in favor of a formalized ICD-11 gaming disorder. We agree that there are some people whose play of video games is related to life problems. We believe that understanding this population and the nature and severity of the problems they experience should be a focus area for future research. However, moving from research construct to formal disorder requires a much stronger evidence base than we currently have. The burden of evidence and the clinical utility should be extremely high, because there is a genuine risk of abuse of diagnoses. We provide suggestions about the level of evidence that might be required: transparent and preregistered studies, a better demarcation of the subject area that includes a rationale for focusing on gaming particularly versus a more general behavioral addictions concept, the exploration of non-addiction approaches, and the unbiased exploration of clinical approaches that treat potentially underlying issues, such as depressive mood or social anxiety first. We acknowledge there could be benefits to formalizing gaming disorder, many of which were highlighted by colleagues in their commentaries, but we think they do not yet outweigh the wider societal and public health risks involved. Given the gravity of diagnostic classification and its wider societal impact, we urge our colleagues at the WHO to err on the side of caution for now and postpone the formalization

    The impact of radiotherapy late effects on quality of life in gynaecological cancer patients

    The aims of this study were to assess changes in quality of life (QoL) scores in relation to radical radiotherapy for gynaecological cancer (before and after treatment, up to 3 years), and to identify the effect that late treatment effects have on QoL. This was a prospective study involving 225 gynaecological cancer patients. A QoL instrument (European Organisation for the Research and Treatment of Cancer QLQ-C30) and a late treatment effect questionnaire (Late Effects Normal Tissues – Subjective Objective Management Analysis) were completed before and after treatment (immediately after radiotherapy, and 6 weeks, 12, 24 and 36 months after treatment). Most patients had acute physical symptoms and impaired functioning immediately after treatment. Levels of fatigue and diarrhoea returned to pre-treatment levels only after 6 weeks. Patients with high treatment toxicity scores had lower global QoL scores. In conclusion, treatment with radiotherapy for gynaecological cancer has a negative effect on QoL, most apparent immediately after treatment. Certain late treatment effects have a negative effect on QoL for at least 2 years after radiotherapy. These treatment effects centre on symptoms relating to the rectum and bowel, for example diarrhoea, tenesmus and urgency. Future research will identify the specific symptoms resulting from late treatment toxicity that have the greatest effect on QoL, thereby allowing effective management plans to be developed to reduce these symptoms and improve QoL in gynaecological cancer patients.