55 research outputs found

    A not so isolated fringe: Dutch later prehistoric (c. 2200 BCE–AD 0) bronze alloy networks from compositional analyses on metals and corrosion layers

    Using a corpus of over 370 compositional analyses of Dutch Bronze Age and Iron Age (c. 2000 BCE–AD 0) copper alloy artefacts, long-term patterns in the types of alloys used for specific bronze objects are identified. As the Low Countries are devoid of copper ores and alloying elements, a combination of typo(chrono)logical and compositional analysis is used to identify through which European contact networks (such as Atlantic, Central European or Nordic exchange networks) these alloys were obtained. Following Bray et al. (2015), we employ a methodology that defines alloy groups by the presence of As, Sb, Ag and Ni above 0.1 wt%, but we expanded this classification to include Pb and to track high-impurity (>1 wt%) alloys. Because interfering soil-derived iron hydroxides and preferential dissolution of copper from the objects' surface cause p-XRF to overestimate tin content in most cases, Sn was not systematically reviewed. Objects were assigned a calendar age in years BCE to facilitate chronological sorting. Using this classification, we show how different alloys (based on different ores) were used in different periods and in different combinations. Moreover, particular alloys were used for particular groups of functional object types. We also show diachronic differences in the influx of new (or less frequently mixed) alloys, chronological trends in the substitution of As by Sn as the main alloying element in the Early Bronze Age, and the rise of leaded alloys at the close of the Bronze Age. Combining information on the composition of the objects with their typological traits allowed us to reconstruct the scales and geographic scopes of the European contact networks through which the copper alloys used throughout later prehistory were obtained.
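
    As a concrete illustration of the thresholding scheme described above, the following minimal sketch classifies a single composition; the function and field names are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of the alloy-group classification described above
# (after Bray et al., 2015, extended with Pb). An element counts as
# "present" above 0.1 wt%, and an alloy is flagged high-impurity when
# any tracked element exceeds 1 wt%. Field names are assumptions.

TRACE_ELEMENTS = ("As", "Sb", "Ag", "Ni", "Pb")
PRESENCE_THRESHOLD = 0.1       # wt%
HIGH_IMPURITY_THRESHOLD = 1.0  # wt%

def classify_alloy(composition: dict) -> dict:
    """Classify one composition given as a mapping of element -> wt%."""
    present = sorted(el for el in TRACE_ELEMENTS
                     if composition.get(el, 0.0) > PRESENCE_THRESHOLD)
    high_impurity = any(composition.get(el, 0.0) > HIGH_IMPURITY_THRESHOLD
                        for el in TRACE_ELEMENTS)
    # Sn is deliberately ignored: p-XRF tends to overestimate tin on
    # corroded surfaces (see the caveat in the abstract).
    return {"group": "+".join(present) or "clean Cu",
            "high_impurity": high_impurity}

print(classify_alloy({"Cu": 88.0, "Sn": 9.5, "As": 0.4, "Ni": 1.3}))
# -> {'group': 'As+Ni', 'high_impurity': True}
```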

    Ovarian stimulation for IVF and risk of primary breast cancer in BRCA1/2 mutation carriers

    Background: The effect of in vitro fertilisation (IVF) on breast cancer risk for BRCA1/2 mutation carriers has rarely been examined. As carriers may increasingly undergo IVF as part of preimplantation genetic diagnosis (PGD), we examined the impact of ovarian stimulation for IVF on breast cancer risk in BRCA1/2 mutation carriers. Methods: The study population consisted of 1550 BRCA1 and 964 BRCA2 mutation carriers, derived from the nationwide HEBON study and the nationwide PGD registry. Questionnaires, clinical records and linkages with the Netherlands Cancer Registry were used to collect data on IVF exposure, risk-reducing surgeries and cancer diagnosis, respectively. Time-dependent Cox regression analyses were conducted, stratified for birth cohort and adjusted for subfertility. Results: Of the 2514 BRCA1/2 mutation carriers, 3% (n = 76) were exposed to ovarian stimulation for IVF. In total, 938 BRCA1/2 mutation carriers (37.3%) were diagnosed with breast cancer. IVF exposure was not associated with risk of breast cancer (HR: 0.79, 95% CI: 0.46–1.36). Similar results were found for the subgroups of subfertile women (n = 232; HR: 0.73, 95% CI: 0.39–1.37) and BRCA1 mutation carriers (HR: 1.12, 95% CI: 0.60–2.09). In addition, age at and recency of first IVF treatment were not associated with breast cancer risk. Conclusion: No evidence was found for an association between ovarian stimulation for IVF and breast cancer risk in BRCA1/2 mutation carriers.
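
    A time-dependent Cox model of this kind can be set up, for example, with the lifelines library. The sketch below is a hypothetical reconstruction: the data file, column names, and long-format layout are assumptions, not the study's actual code.

```python
# Illustrative sketch of the time-dependent Cox regression described in
# the Methods (stratified by birth cohort, adjusted for subfertility).
# The data file, column names, and coding are hypothetical assumptions.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Long format: one row per age interval during which covariates are
# constant; ivf_exposed switches 0 -> 1 at first ovarian stimulation.
df = pd.read_csv("carriers_long_format.csv")  # hypothetical file
cols = ["carrier_id", "start_age", "stop_age", "breast_cancer",
        "ivf_exposed", "subfertile", "birth_cohort"]

ctv = CoxTimeVaryingFitter()
ctv.fit(df[cols],
        id_col="carrier_id",
        start_col="start_age",
        stop_col="stop_age",
        event_col="breast_cancer",
        strata="birth_cohort")   # stratified for birth cohort
ctv.print_summary()  # hazard ratios with 95% confidence intervals
```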

    Analysis of shared heritability in common disorders of the brain

    INTRODUCTION Brain disorders may exhibit shared symptoms and substantial epidemiological comorbidity, inciting debate about their etiologic overlap. However, detailed study of phenotypes with different ages of onset, severity, and presentation poses a considerable challenge. Recently developed heritability methods allow us to accurately measure the correlation of genome-wide common variant risk between two phenotypes from pools of different individuals and to assess how connected they, or at least their genetic risks, are at the genomic level. We used genome-wide association data for 265,218 patients and 784,643 control participants, as well as 17 phenotypes from a total of 1,191,588 individuals, to quantify the degree of overlap for genetic risk factors of 25 common brain disorders.

    RATIONALE Over the past century, the classification of brain disorders has evolved to reflect the medical and scientific communities' assessments of the presumed root causes of clinical phenomena such as behavioral change, loss of motor function, or alterations of consciousness. Directly observable phenomena (such as the presence of emboli, protein tangles, or unusual electrical activity patterns) generally define and separate neurological disorders from psychiatric disorders. Understanding the genetic underpinnings and categorical distinctions for brain disorders and related phenotypes may inform the search for their biological mechanisms.

    RESULTS Common variant risk for psychiatric disorders was shown to correlate significantly, especially among attention deficit hyperactivity disorder (ADHD), bipolar disorder, major depressive disorder (MDD), and schizophrenia. By contrast, neurological disorders appear more distinct from one another and from the psychiatric disorders, except for migraine, which was significantly correlated with ADHD, MDD, and Tourette syndrome. We demonstrate that, in the general population, the personality trait neuroticism is significantly correlated with almost every psychiatric disorder and with migraine. We also identify significant genetic sharing between disorders and early-life cognitive measures (e.g., years of education and college attainment) in the general population, demonstrating positive correlation with several psychiatric disorders (e.g., anorexia nervosa and bipolar disorder) and negative correlation with several neurological phenotypes (e.g., Alzheimer's disease and ischemic stroke), even though the latter are considered to result from specific processes that occur later in life. Extensive simulations were also performed to assess how statistical power, diagnostic misclassification, and phenotypic heterogeneity influence genetic correlations.

    CONCLUSION The high degree of genetic correlation among many of the psychiatric disorders adds further evidence that their current clinical boundaries do not reflect distinct underlying pathogenic processes, at least at the genetic level. This suggests a deeply interconnected nature for psychiatric disorders, in contrast to neurological disorders, and underscores the need to refine psychiatric diagnostics. Genetically informed analyses may provide important "scaffolding" to support such restructuring of psychiatric nosology, which likely requires incorporating many levels of information. By contrast, we find limited evidence for widespread common genetic risk sharing among neurological disorders or across neurological and psychiatric disorders. We show that both psychiatric and neurological disorders have robust correlations with cognitive and personality measures. Further study is needed to evaluate whether overlapping genetic contributions to psychiatric pathology may influence treatment choices. Ultimately, such developments may pave the way toward reduced heterogeneity and improved diagnosis and treatment of psychiatric disorders.
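
    For reference, the quantity being estimated here, the genetic correlation between two phenotypes, is conventionally defined as the genetic covariance normalised by the two SNP heritabilities. This standard definition (shown below) underlies cross-trait heritability methods of the kind used in the study; it is not a formula quoted from the paper itself.

```latex
% Genetic correlation between phenotypes 1 and 2: the genetic covariance
% \rho_g normalised by the SNP heritabilities h_1^2 and h_2^2.
r_g = \frac{\rho_g}{\sqrt{h_1^2 \, h_2^2}}, \qquad -1 \le r_g \le 1
```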

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
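
    As a rough illustration of the "multivariable logistic regression and bootstrapped simulation" approach, the sketch below fits a logistic model and bootstraps a confidence interval for the adjusted odds ratio of checklist use; the data file and variable names are hypothetical, and the covariate set is a placeholder for the study's actual adjustment factors.

```python
# Illustrative sketch: multivariable logistic regression with a
# bootstrapped 95% CI for the adjusted odds ratio of 30-day mortality
# given checklist use. File, columns, and covariates are assumptions;
# checklist_used is assumed to be coded 0/1.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("laparotomy_cohort.csv")  # hypothetical file
FORMULA = "died_30d ~ checklist_used + age + asa_grade + hdi_tertile"

def adjusted_or(data: pd.DataFrame) -> float:
    fit = smf.logit(FORMULA, data=data).fit(disp=0)
    return float(np.exp(fit.params["checklist_used"]))

rng = np.random.default_rng(42)
boot = [adjusted_or(df.sample(frac=1.0, replace=True, random_state=rng))
        for _ in range(1000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"OR {adjusted_or(df):.2f} (95% CI {lo:.2f} to {hi:.2f})")
```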

    Abstracts from the Food Allergy and Anaphylaxis Meeting 2016


    A Solve-RD ClinVar-based reanalysis of 1522 index cases from ERN-ITHACA reveals common pitfalls and misinterpretations in exome sequencing

    Purpose Within the Solve-RD project (https://solve-rd.eu/), the European Reference Network for Intellectual disability, TeleHealth, Autism and Congenital Anomalies (ERN-ITHACA) aimed to investigate whether a reanalysis of exomes from unsolved cases based on ClinVar annotations could establish additional diagnoses. We present the results of the “ClinVar low-hanging fruit” reanalysis, reasons for the failure of previous analyses, and lessons learned. Methods Data from the first 3576 exomes (1522 probands and 2054 relatives) collected from ERN-ITHACA were reanalyzed by the Solve-RD consortium by evaluating the presence of single-nucleotide variants and small insertions and deletions already reported as (likely) pathogenic in ClinVar. Variants were filtered according to frequency, genotype, and mode of inheritance, and then reinterpreted. Results We identified causal variants in 59 cases (3.9%), 50 of which were also identified by other approaches and 9 of which led to new diagnoses, highlighting interpretation challenges: variants in genes not known to be involved in human disease at the time of the first analysis, misleading genotypes, or variants undetected by local pipelines (variants in off-target regions, removed by low-quality filters, with low allelic balance, or at high frequency). Conclusion The “ClinVar low-hanging fruit” analysis represents an effective, fast, and easy approach to recovering causal variants from exome sequencing data, thereby contributing to the reduction of the diagnostic deadlock.
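
    The filtering step described in the Methods (ClinVar status, frequency, genotype, mode of inheritance) can be pictured as a simple per-variant predicate. The sketch below is an illustrative reconstruction with assumed field names and cut-offs, not the Solve-RD pipeline.

```python
# Illustrative reconstruction of the "ClinVar low-hanging fruit" filter:
# keep SNVs/indels reported (likely) pathogenic in ClinVar, then filter
# on population frequency and on a genotype consistent with the gene's
# mode of inheritance. Field names and cut-offs are assumptions.

PATHOGENIC = {"Pathogenic", "Likely_pathogenic"}
MAX_AF_DOMINANT = 0.0001   # assumed frequency cut-off for AD genes
MAX_AF_RECESSIVE = 0.01    # assumed frequency cut-off for AR genes

def passes_filter(variant: dict) -> bool:
    """Per-variant predicate; compound-heterozygote pairing is omitted."""
    if variant["clinvar_significance"] not in PATHOGENIC:
        return False
    af = variant["population_af"]
    if variant["mode_of_inheritance"] == "AD":
        return af <= MAX_AF_DOMINANT
    if variant["mode_of_inheritance"] == "AR":
        return af <= MAX_AF_RECESSIVE and variant["genotype"] == "hom"
    return False

example = {"clinvar_significance": "Pathogenic",
           "mode_of_inheritance": "AD",
           "population_af": 2e-05,
           "genotype": "het"}
print(passes_filter(example))  # True
```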

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.
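
    The risk-adjustment step can be illustrated with an ordinary logistic model. Note that the study used a multilevel model with clustering by hospital and country, which this single-level sketch (with assumed column names) deliberately omits for brevity.

```python
# Single-level approximation of the study's risk-adjusted model for end
# colostomy vs primary anastomosis; the actual analysis was a multilevel
# model with hospital/country random effects. Column names are assumed,
# with end_colostomy coded 0/1 and hdi_tertile as 'high'/'middle'/'low'.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("globalsurg_left_resection.csv")  # hypothetical file
fit = smf.logit(
    "end_colostomy ~ C(hdi_tertile, Treatment('high')) + malignant"
    " + emergency + delay_over_48h + perforation",
    data=df,
).fit(disp=0)

print(np.exp(fit.params))      # adjusted odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```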

    Charred bone: Physical and chemical changes during laboratory-simulated heating under reducing conditions and its relevance for the study of fire use in archaeology

    In order to gain insight into the timing and nature of hominin fire use, the effect of heat on the physical and chemical properties of the materials entering the archaeological record needs to be understood. The present study concerns heated bone as a fire proxy. Two types of heating can be distinguished: combustion (or burning, with oxygen) and charring (without oxygen), for both of which the formation of char is the first step. We performed a series of controlled laboratory-based heating experiments under reducing conditions (i.e. charring), covering a broad temperature range (20–900 °C), and applied a variety of analytical techniques. Results indicate that charred bone shows a distinctly different thermal alteration trajectory from combusted bone, which has implications for the suitability of the different analytical techniques for identifying and determining past heating conditions (charring vs. combustion; temperature) of heated bone from archaeological contexts. Combined, the reference data and techniques presented in this study can be used as a robust toolkit for the characterisation of archaeological charred bone from various ages and contexts.