
    A dynamic network approach for the study of human phenotypes

    The use of networks to integrate different genetic, proteomic, and metabolic datasets has been proposed as a viable path toward elucidating the origins of specific diseases. Here we introduce a new phenotypic database summarizing correlations obtained from the disease history of more than 30 million patients in a Phenotypic Disease Network (PDN). We present evidence that the structure of the PDN is relevant to the understanding of illness progression by showing that (1) patients develop diseases close in the network to those they already have; (2) the progression of disease along the links of the network is different for patients of different genders and ethnicities; (3) patients diagnosed with diseases which are more highly connected in the PDN tend to die sooner than those affected by less connected diseases; and (4) diseases that tend to be preceded by others in the PDN tend to be more connected than diseases that precede other illnesses, and are associated with higher degrees of mortality. Our findings show that disease progression can be represented and studied using network methods, offering the potential to enhance our understanding of the origin and evolution of human diseases. The dataset introduced here, released concurrently with this publication, represents the largest relational phenotypic resource publicly available to the research community. Comment: 28 pages (double-spaced), 6 figures.
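    As a rough sketch of the kind of analysis described above, the snippet below builds a toy disease co-occurrence network with networkx and reads off each diagnosis's connectivity, the quantity the abstract relates to mortality. The patient histories, the co-occurrence cutoff, and the disease labels are invented placeholders, not data from the published PDN.

```python
# Minimal sketch of a Phenotypic Disease Network (PDN): nodes are diagnoses,
# edges link diseases whose co-occurrence across patient histories exceeds a
# threshold. All data below are invented placeholders, not the published PDN.
from itertools import combinations
from collections import Counter
import networkx as nx

# Hypothetical patient disease histories (labels are placeholders).
histories = [
    {"hypertension", "diabetes", "renal_failure"},
    {"hypertension", "diabetes"},
    {"asthma", "allergic_rhinitis"},
    {"hypertension", "stroke", "renal_failure"},
]

# Count how often each pair of diagnoses co-occurs in the same patient.
pair_counts = Counter()
for history in histories:
    for a, b in combinations(sorted(history), 2):
        pair_counts[(a, b)] += 1

# Keep only pairs that co-occur at least twice (stand-in for a correlation cutoff).
G = nx.Graph()
for (a, b), count in pair_counts.items():
    if count >= 2:
        G.add_edge(a, b, weight=count)

# Connectivity (degree) per diagnosis; the paper relates this to mortality.
for disease, degree in sorted(G.degree, key=lambda x: -x[1]):
    print(f"{disease}: degree {degree}")
```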

    In vitro mutation artifacts after formalin fixation and error prone translesion synthesis during PCR

    BACKGROUND: Clinical specimens are routinely fixed in 10% buffered formalin and paraffin-embedded. Although DNA is commonly extracted from fixed tissues and amplified by PCR, the effects of formalin fixation are relatively unknown. Formalin fixation is known to impair PCR, presumably through damage that blocks polymerase elongation, but an insidious possibility is error-prone translesion synthesis across sites of damage, producing in vitro artifactual mutations during PCR. METHODS: To better understand the consequences of fixation, DNA specimens extracted from fresh or fixed tissues were amplified with Taq DNA polymerase, and their PCR products were cloned and sequenced. RESULTS: Significantly more (3- to 4-fold) mutations were observed with fixed DNA specimens. The majority of mutations were transitions, predominantly at A:T base pairs, randomly distributed along the template. CONCLUSIONS: Formalin fixation appears to cause random base damage, which can be bridged during PCR by Taq DNA polymerase through error-prone translesion synthesis. Fixed DNA is a damaged but "readable" template.

    The pediatric glucocorticoid toxicity index

    Objectives: To develop a pediatric glucocorticoid toxicity index (pGTI), a standardized, weighted clinical outcome assessment that measures change in glucocorticoid (GC) toxicity over time. Methods: Fourteen physician experts participated, representing seven subspecialties in which GCs play a major role in the treatment of inflammatory disease: nephrology, rheumatology, oncology, endocrinology, genetics, psychiatry, and maternal-fetal medicine. Nine investigators were from Canada, Europe, or New Zealand, and five were from the United States. Group consensus methods and multi-criteria decision analysis were used. The pGTI is an aggregate assessment of GC toxicities that are common, important, and dynamic. These toxicities are organized into health domains graded as minor, moderate, or major and are weighted according to severity. The relative weights were derived by group consensus and multi-criteria decision analysis using the 1000Minds™ software platform. The overall toxicity profile derived from pGTI data comprises two quantitative scores: (1) the Cumulative Worsening Score; and (2) the Aggregate Improvement Score. The pGTI also includes a qualitative, unweighted record of GC side effects known as the Damage Checklist, which documents less common toxicities that, although potentially severe, are unlikely to change with varying GC dosing. Results: One hundred seven (107) toxicity items were included in the pGTI and thirty-two (32) in the Damage Checklist. To assess the degree to which the pGTI corresponds to expert clinical judgement, the investigators ranked 15 cases by clinical judgement from highest to lowest GC toxicity. Expert rankings were then compared to case rankings by the pGTI, yielding excellent agreement (weighted kappa 0.86). The pGTI was migrated to a digital environment following its development and initial validation. The digital platform is designed to ensure ease of use in the clinic, rigor in application, and accuracy of scoring. Clinic staff enter vital signs, laboratory results, and medication changes relevant to pGTI scoring. Clinicians record findings for GC myopathy, skin toxicity, mood dysfunction, and infection. The pGTI algorithms then apply the weights to these raw data and calculate scores. Embedded logic accounts for the impact of age- and sex-related reference ranges on several health domains: blood pressure, lipid metabolism, and bone mineral density. Other algorithms account for anticipated changes in the height Z-scores used in the growth domain, thereby addressing a concern unique to GC toxicity in children. The Damage Checklist ensures comprehensive measurement of GC toxicity but does not contribute to pGTI scoring, because the scored domains emphasize manifestations of GC toxicity that are likely to change over the course of a trial. Conclusions: We describe the development and initial evaluation of a weighted, composite toxicity index for the assessment of morbidity related to GC use in children and adolescents. Developing the pGTI digital platform was essential for performing the nuanced calculations necessary to ensure rigor, accuracy, and ease of use in both clinic and research settings.
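    To make the scoring idea concrete, here is a minimal sketch of a weighted composite toxicity score in the spirit of the pGTI's Cumulative Worsening Score and Aggregate Improvement Score. The domain names, severity weights, and scoring rules are invented for illustration; they are not the published pGTI weights or algorithms.

```python
# Illustrative sketch of a weighted composite toxicity score. Domain names,
# weights, and rules below are invented placeholders, not the actual pGTI.

# Hypothetical severity weights per (health domain, grade) finding.
WEIGHTS = {
    ("blood_pressure", "moderate"): 20,
    ("blood_pressure", "major"): 40,
    ("bone_mineral_density", "moderate"): 25,
    ("growth", "major"): 50,
}

def domain_deltas(baseline, follow_up):
    """Yield the weighted change for each domain between two visits."""
    for domain in set(baseline) | set(follow_up):
        before = WEIGHTS.get((domain, baseline.get(domain, "none")), 0)
        after = WEIGHTS.get((domain, follow_up.get(domain, "none")), 0)
        yield after - before

def cumulative_worsening_score(baseline, follow_up):
    """Count only domains that got worse (analogue of the CWS)."""
    return sum(delta for delta in domain_deltas(baseline, follow_up) if delta > 0)

def aggregate_improvement_score(baseline, follow_up):
    """Net change, letting improvements offset worsening (analogue of the AIS)."""
    return sum(domain_deltas(baseline, follow_up))

baseline = {"blood_pressure": "moderate", "growth": "major"}
follow_up = {"blood_pressure": "major", "growth": "none"}
print(cumulative_worsening_score(baseline, follow_up))   # 20: blood pressure worsened
print(aggregate_improvement_score(baseline, follow_up))  # -30: growth recovery offsets it
```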

    The history and evolution of the clinical effectiveness of haemophilia type A treatment: a systematic review.

    The first evidence of haemophilia dates from ancient Egypt, but it was only when Queen Victoria of England transmitted the illness to her descendants in the 19th century that it became known as the "royal disease". The last decades of the 20th century saw major discoveries that improved the life expectancy and quality of life of these patients. The history of haemophilia healthcare has had its ups and downs. The introduction of prophylactic schemes during the 1970s proved more effective than the classic on-demand replacement of clotting factors; nevertheless, many patients managed with frequent plasma transfusions or plasma-derived products became infected with the Human Immunodeficiency Virus (HIV) and hepatitis C virus during the 1980s and 1990s. The introduction of recombinant factor VIII has decreased the risk of blood-borne infections and restored longer life expectancies. The main concerns of haemophilia healthcare are shifting from purely clinical aspects to the economic considerations of long-term replacement therapy, and researchers' attention is now turning to the future costs and cost-effectiveness of costly long-term treatment. Equity considerations are relevant as well, and alternative options for less affluent countries are a subject of further research. The aim of this review was to assess the evidence on different treatment options for haemophilia type A over the past four decades, focusing on the most important technological advances that have influenced the natural course of this "royal disease".

    Interventions aimed at increasing research use in nursing: a systematic review

    Background: There has been considerable interest recently in developing and evaluating interventions to increase research use by clinicians. However, most work has focused on medical practices, and nursing is not well represented in existing systematic reviews. The purpose of this article is to report findings from a systematic review of interventions aimed at increasing research use in nursing. Objective: To assess the evidence on interventions aimed at increasing research use in nursing. Methods: A systematic review of research use in nursing was conducted using databases (Medline, CINAHL, Healthstar, ERIC, Cochrane Central Register of Controlled Trials, and Psychinfo), grey literature, ancestry searching (Cochrane Database of Systematic Reviews), key informants, and manual searching of journals. Randomized controlled trials and controlled before-and-after studies were included if they included nurses, if the intervention was explicitly aimed at increasing research use or evidence-based practice, and if research use was an explicit outcome. Methodological quality was assessed using pre-existing tools. Data on interventions and outcomes were extracted and categorized using a pre-established taxonomy. Results: Over 8,000 titles were screened. Three randomized controlled trials and one controlled before-and-after study met the inclusion criteria. The methodological quality of included studies was generally low. Three investigators evaluated single interventions. The most common intervention was education. Investigators measured research use using a combination of surveys (three studies) and compliance with guidelines (one study). Researcher-led educational meetings were ineffective in two studies. Educational meetings led by a local opinion leader (one study) and the formation of multidisciplinary committees (one study) were both effective at increasing research use. Conclusion: Little is known about how to increase research use in nursing, and the evidence to support or refute specific interventions is inconclusive. To advance the field, we recommend that investigators: (1) use theoretically informed interventions to increase research use, (2) measure research use longitudinally using theoretically informed and psychometrically sound measures, as well as measuring patient outcomes relevant to the intervention, and (3) use more robust and methodologically sound study designs to evaluate interventions. If investigators aim to establish a link between using research and improved patient outcomes, they must first identify those interventions that are effective at increasing research use.

    Vitamin D in the general population of young adults with autism in the Faroe Islands

    Vitamin D deficiency has been proposed as a possible risk factor for developing autism spectrum disorder (ASD). 25-Hydroxyvitamin D3 (25(OH)D3) levels were examined in a cross-sectional population-based study in the Faroe Islands. The case group, a total population cohort of 40 individuals with ASD (aged 15–24 years), had significantly lower 25(OH)D3 than their 62 typically developing siblings and their 77 parents, and also significantly lower levels than 40 healthy age- and gender-matched comparison subjects. There was a trend for males to have lower 25(OH)D3 than females. Effects of age, month/season of birth, IQ, various subcategories of ASD, and Autism Diagnostic Observation Schedule score were also investigated; however, no association was found. The very low 25(OH)D3 in the ASD group suggests some underlying pathogenic mechanism.

    Visual Exploration and Object Recognition by Lattice Deformation

    Mechanisms of explicit object recognition are often difficult to investigate and require stimuli with controlled features whose expression can be manipulated in a precise quantitative fashion. Here, we developed a novel method (called "Dots") for generating visual stimuli, based on the progressive deformation of a regular lattice of dots, driven by local contour information from images of objects. As progressively larger deformations are applied, the lattice conveys progressively more information about the target object. Stimuli generated with the presented method enable precise control of object-related information content while preserving low-level image statistics globally and affecting them only slightly locally. We show that such stimuli are useful for investigating object recognition in a naturalistic setting (free visual exploration), enabling a clear dissociation between object detection and explicit recognition. Using the introduced stimuli, we show that top-down modulation induced by previous exposure to target objects can greatly influence perceptual decisions, lowering perceptual thresholds not only for object recognition but also for object detection (visual hysteresis). Visual hysteresis is target-specific, with its expression and magnitude depending on the identity of individual objects. Relying on the particular features of dot stimuli and on eye-tracking measurements, we further demonstrate that top-down processes guide visual exploration, controlling how visual information is integrated across successive fixations. Prior knowledge about objects can guide saccades and fixations to sample locations that are expected to be highly informative, even when the actual information is missing from those locations in the stimulus. The duration of individual fixations is modulated by the novelty and difficulty of the stimulus, likely reflecting cognitive demand.
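    The following sketch illustrates the lattice-deformation idea in its simplest form: a regular grid of dots is displaced toward the nearest point on an object contour, with a single parameter controlling how much object information the dot pattern conveys. The circular contour, lattice spacing, and deformation levels are placeholders rather than the stimulus parameters used in the paper.

```python
# Minimal sketch of the lattice-deformation idea: a regular grid of dots is
# pulled toward the nearest point of an object contour; one parameter sets
# how much object information the lattice carries. The circular contour and
# all numeric settings are invented placeholders, not the paper's stimuli.
import numpy as np

def dot_lattice(width, height, spacing):
    """Return an (N, 2) array of regularly spaced dot positions."""
    xs = np.arange(0, width, spacing, dtype=float)
    ys = np.arange(0, height, spacing, dtype=float)
    gx, gy = np.meshgrid(xs, ys)
    return np.column_stack([gx.ravel(), gy.ravel()])

def deform_toward_contour(dots, contour, amount):
    """Shift each dot a fraction `amount` (0..1) of the way to its nearest contour point."""
    # Pairwise distances between every dot and every contour sample.
    diff = contour[None, :, :] - dots[:, None, :]
    nearest = contour[np.argmin(np.linalg.norm(diff, axis=2), axis=1)]
    return dots + amount * (nearest - dots)

# Object contour: a circle sampled at 200 points (stand-in for image contours).
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
contour = np.column_stack([50 + 30 * np.cos(theta), 50 + 30 * np.sin(theta)])

lattice = dot_lattice(100, 100, spacing=5)
# Larger `amount` -> the dot pattern conveys more information about the object.
weak = deform_toward_contour(lattice, contour, amount=0.2)
strong = deform_toward_contour(lattice, contour, amount=0.8)
```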