
    Monazite trumps zircon: applying SHRIMP U–Pb geochronology to systematically evaluate emplacement ages of leucocratic, low-temperature granites in a complex Precambrian orogen

    Although zircon is the most widely used geochronometer for determining the crystallisation ages of granites, it can be unreliable for low-temperature melts because they may not crystallise new zircon. For leucocratic granites, U–Pb zircon dates may therefore reflect the ages of the source rocks rather than the igneous crystallisation age. In the Proterozoic Capricorn Orogen of Western Australia, leucocratic granites are associated with several pulses of intracontinental magmatism spanning ~800 million years. In several instances, SHRIMP U–Pb zircon dating of these leucocratic granites yielded ages that were either inconclusive (e.g., multiple concordant ages) or incompatible with other geochronological data. To overcome this, we used SHRIMP U–Th–Pb monazite geochronology to obtain igneous crystallisation ages that are consistent with the geological and geochronological framework of the orogen. The U–Th–Pb monazite geochronology has resolved the time interval over which two granitic supersuites were emplaced: a Paleoproterozoic supersuite thought to span ~80 million years was emplaced in less than half that time (1688–1659 Ma), and a small Meso- to Neoproterozoic supersuite considered to have been intruded over ~70 million years was instead assembled over ~130 million years and outlasted associated regional metamorphism by ~100 million years. Both findings have consequences for the duration of the associated orogenic events and for any estimates of magma generation rates. The monazite geochronology has contributed to a more reliable tectonic history for a complex, long-lived orogen. Our results emphasise the benefit of monazite as a geochronometer for leucocratic granites derived by low-temperature crustal melting and are relevant to other orogens worldwide.

    What scans we will read: imaging instrumentation trends in clinical oncology

    Oncological diseases account for a significant portion of the burden on public healthcare systems, with associated costs driven primarily by complex and long-lasting therapies. Through the visualization of patient-specific morphology and functional-molecular pathways, cancerous tissue can be detected and characterized non-invasively, so as to provide referring oncologists with essential information to support therapy management decisions. Following the onset of stand-alone anatomical and functional imaging, we witness a push towards integrating molecular image information through various methods, including anato-metabolic imaging (e.g., PET/CT), advanced MRI, optical or ultrasound imaging. This perspective paper highlights a number of key technological and methodological advances in imaging instrumentation related to anatomical, functional, molecular and hybrid imaging, the latter understood as the hardware-based combination of complementary anatomical and molecular imaging. These include novel detector technologies for ionizing radiation used in CT and nuclear medicine imaging, and novel system developments in MRI and optical as well as opto-acoustic imaging. We also highlight new data processing methods for improved non-invasive tissue characterization. Following a general introduction to the role of imaging in oncology patient management, we introduce imaging methods with well-defined clinical applications and potential for clinical translation. For each modality, we report first on the status quo and then point to perceived technological and methodological advances in a subsequent status go section. Considering the breadth and dynamics of these developments, this perspective ends with a critical reflection on where the authors, the majority of them imaging experts with a background in physics and engineering, believe imaging methods will be a few years from now.
Overall, methodological and technological medical imaging advances are geared towards increased image contrast, the derivation of reproducible quantitative parameters, an increase in volume sensitivity and a reduction in overall examination time. To ensure full translation to the clinic, this progress in technologies and instrumentation is complemented by progress in relevant acquisition and image-processing protocols and improved data analysis. To this end, we should accept diagnostic images as “data”, and – through the wider adoption of advanced analysis, including machine learning approaches and a “big data” concept – move to the next stage of non-invasive tumor phenotyping. The scans we will be reading 10 years from now will likely be composed of highly diverse multi-dimensional data from multiple sources, which mandate the use of advanced and interactive visualization and analysis platforms powered by Artificial Intelligence (AI) for real-time data handling by cross-specialty clinical experts with a domain knowledge that will need to go beyond that of plain imaging.

    Identifying associations between diabetes and acute respiratory distress syndrome in patients with acute hypoxemic respiratory failure: an analysis of the LUNG SAFE database

    Background: Diabetes mellitus is a common co-existing disease in the critically ill. Diabetes mellitus may reduce the risk of acute respiratory distress syndrome (ARDS), but data from previous studies are conflicting. The objective of this study was to evaluate associations between pre-existing diabetes mellitus and ARDS in critically ill patients with acute hypoxemic respiratory failure (AHRF). Methods: An ancillary analysis of a global, multi-centre prospective observational study (LUNG SAFE) was undertaken. LUNG SAFE evaluated all patients admitted to an intensive care unit (ICU) over a 4-week period that required mechanical ventilation and met AHRF criteria. Patients whose AHRF was fully explained by cardiac failure were excluded. Important clinical characteristics were included in a stepwise selection approach (forward and backward selection combined with a significance level of 0.05) to identify a set of independent variables associated with having ARDS at any time, developing ARDS (defined as ARDS occurring after day 2 from meeting AHRF criteria) and with hospital mortality. Furthermore, propensity score analysis was undertaken to account for the differences in baseline characteristics between patients with and without diabetes mellitus, and the association between diabetes mellitus and outcomes of interest was assessed on matched samples. Results: Of the 4107 patients with AHRF included in this study, 3022 (73.6%) fulfilled ARDS criteria at admission or developed ARDS during their ICU stay. Diabetes mellitus was a pre-existing co-morbidity in 913 patients (22.2% of patients with AHRF). In multivariable analysis, there was no association between diabetes mellitus and having ARDS (OR 0.93 (0.78-1.11); p = 0.39), developing ARDS late (OR 0.79 (0.54-1.15); p = 0.22), or hospital mortality in patients with ARDS (OR 1.15 (0.93-1.42); p = 0.19).
In a matched sample of patients, there was no association between diabetes mellitus and the outcomes of interest. Conclusions: In a large, global observational study of patients with AHRF, no association was found between diabetes mellitus and having ARDS, developing ARDS, or outcomes from ARDS. Trial registration: NCT02010073. Registered on 12 December 2013.
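    The propensity score analysis mentioned above pairs each diabetic patient with a non-diabetic patient of similar baseline risk before comparing outcomes. A minimal sketch of greedy 1:1 nearest-neighbour matching on precomputed propensity scores; the matching rule, caliper and data below are illustrative assumptions, not the study's actual procedure:

```python
def match_by_propensity(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity score.

    treated, controls: lists of (patient_id, propensity_score) tuples.
    Returns a list of (treated_id, control_id) pairs whose score
    difference falls within the caliper; each control is used once.
    """
    pairs = []
    available = dict(controls)  # control id -> propensity score
    for pid, score in sorted(treated, key=lambda t: t[1]):
        if not available:
            break
        # nearest remaining control by absolute score distance
        cid = min(available, key=lambda c: abs(available[c] - score))
        if abs(available[cid] - score) <= caliper:
            pairs.append((pid, cid))
            del available[cid]
    return pairs

treated = [("T1", 0.32), ("T2", 0.58)]
controls = [("C1", 0.30), ("C2", 0.55), ("C3", 0.90)]
print(match_by_propensity(treated, controls))  # [('T1', 'C1'), ('T2', 'C2')]
```

    Outcome comparisons (e.g., ARDS incidence, mortality) would then be made only within the matched pairs, which is what makes the groups comparable at baseline.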

    Spontaneous Breathing in Early Acute Respiratory Distress Syndrome: Insights From the Large Observational Study to UNderstand the Global Impact of Severe Acute Respiratory FailurE Study

    OBJECTIVES: To describe the characteristics and outcomes of patients with acute respiratory distress syndrome with or without spontaneous breathing and to investigate whether the effects of spontaneous breathing on outcome depend on acute respiratory distress syndrome severity. DESIGN: Planned secondary analysis of a prospective, observational, multicentre cohort study. SETTING: International sample of 459 ICUs from 50 countries. PATIENTS: Patients with acute respiratory distress syndrome and at least 2 days of invasive mechanical ventilation and available data for the mode of mechanical ventilation and respiratory rate for the first 2 days. INTERVENTIONS: Analysis of patients with and without spontaneous breathing, defined by the mode of mechanical ventilation and by actual respiratory rate compared with set respiratory rate during the first 48 hours of mechanical ventilation. MEASUREMENTS AND MAIN RESULTS: Spontaneous breathing was present in 67% of patients with mild acute respiratory distress syndrome, 58% of patients with moderate acute respiratory distress syndrome, and 46% of patients with severe acute respiratory distress syndrome. Patients with spontaneous breathing were older; had lower acute respiratory distress syndrome severity, Sequential Organ Failure Assessment scores, and ICU and hospital mortality; and were less likely to be diagnosed with acute respiratory distress syndrome by clinicians. In adjusted analysis, spontaneous breathing during the first 2 days was not associated with an effect on ICU or hospital mortality (33% vs 37%; odds ratio, 1.18 [0.92-1.51]; p = 0.19 and 37% vs 41%; odds ratio, 1.18 [0.93-1.50]; p = 0.196, respectively). Spontaneous breathing was associated with increased ventilator-free days (13 [0-22] vs 8 [0-20]; p = 0.014) and shorter duration of ICU stay (11 [6-20] vs 12 [7-22]; p = 0.04).
CONCLUSIONS: Spontaneous breathing is common in patients with acute respiratory distress syndrome during the first 48 hours of mechanical ventilation. Spontaneous breathing is not associated with worse outcomes and may hasten liberation from the ventilator and from the ICU. Although these results support the use of spontaneous breathing in patients with acute respiratory distress syndrome independent of acute respiratory distress syndrome severity, the use of controlled ventilation indicates a bias toward use in patients with higher disease severity. In addition, because of the lack of reliable data on inspiratory effort in our study, prospective studies incorporating the magnitude of inspiratory effort and adjusting for all potential severity confounders are required.
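    The classification rule described above flags spontaneous breathing from the ventilation mode or from the actual respiratory rate exceeding the set rate. A minimal sketch of that rule; the mode labels and example values are assumptions for illustration, not the study's exact definitions:

```python
# Assumed (illustrative) set of ventilator modes that permit patient-triggered
# breaths; the study's own mode classification may differ.
SPONTANEOUS_MODES = {"PSV", "SIMV", "APRV"}

def is_spontaneous(mode, actual_rate, set_rate):
    """Flag spontaneous breathing during the observation window.

    True when the mode itself allows spontaneous breaths, or when, in a
    controlled mode, the measured rate exceeds the set rate (extra breaths
    imply patient respiratory effort).
    """
    if mode in SPONTANEOUS_MODES:
        return True
    return actual_rate > set_rate

print(is_spontaneous("VCV", actual_rate=22, set_rate=16))  # True
print(is_spontaneous("VCV", actual_rate=16, set_rate=16))  # False
```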

    Epidemiology and patterns of tracheostomy practice in patients with acute respiratory distress syndrome in ICUs across 50 countries

    Background: To better understand the epidemiology and patterns of tracheostomy practice for patients with acute respiratory distress syndrome (ARDS), we investigated the current usage of tracheostomy in patients with ARDS recruited into the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG-SAFE) study. Methods: This is a secondary analysis of LUNG-SAFE, an international, multicenter, prospective cohort study of patients receiving invasive or noninvasive ventilation in 50 countries spanning 5 continents. The study was carried out over 4 consecutive weeks in the winter of 2014, and 459 ICUs participated. We evaluated the clinical characteristics, management and outcomes of patients that received tracheostomy in the cohort of patients that developed ARDS on day 1-2 of acute hypoxemic respiratory failure, and in a subsequent propensity-matched cohort. Results: Of the 2377 patients with ARDS that fulfilled the inclusion criteria, 309 (13.0%) underwent tracheostomy during their ICU stay. Patients from high-income European countries (n = 198/1263) more frequently underwent tracheostomy compared to patients from non-European high-income countries (n = 63/649) or patients from middle-income countries (n = 48/465). Only 86/309 (27.8%) underwent tracheostomy on or before day 7, while the median timing of tracheostomy was 14 (Q1-Q3, 7-21) days after onset of ARDS. In the subsample matched by propensity score, ICU and hospital stay were longer in patients with tracheostomy. While patients with tracheostomy had the highest survival probability, there was no difference in 60-day or 90-day mortality in either the patient subgroup that survived for at least 5 days in ICU, or in the propensity-matched subsample. Conclusions: Most patients that receive tracheostomy do so after the first week of critical illness. Tracheostomy may prolong patient survival but does not reduce 60-day or 90-day mortality.
Trial registration: ClinicalTrials.gov, NCT02010073. Registered on 12 December 2013.

    Trends in the detection of aquatic non-indigenous species across global marine, estuarine and freshwater ecosystems: A 50-year perspective

    Aim: The introduction of aquatic non-indigenous species (ANS) has become a major driver for global changes in species biogeography. We examined spatial patterns and temporal trends of ANS detections since 1965 to inform conservation policy and management. Location: Global. Methods: We assembled an extensive dataset of first records of detection of ANS (1965–2015) across 49 aquatic ecosystems, including the (a) year of first collection, (b) population status and (c) potential pathway(s) of introduction. Data were analysed at global and regional levels to assess patterns of detection rate, richness and transport pathways. Results: An annual mean of 43 (±16 SD) primary detections of ANS occurred—one new detection every 8.4 days for 50 years. The global rate of detections was relatively stable during 1965–1995, but increased rapidly after this time, peaking at roughly 66 primary detections per year during 2005–2010 and then declining marginally. Detection rates were variable within and across regions through time. Arthropods, molluscs and fishes were the most frequently reported ANS. Most ANS were likely introduced as stowaways in ships’ ballast water or biofouling, although direct evidence is typically absent. Main conclusions: This synthesis highlights the magnitude of recent ANS detections, yet almost certainly represents an underestimate as many ANS go unreported due to limited search effort and diminishing taxonomic expertise. Temporal rates of detection are also confounded by reporting lags, likely contributing to the lower detection rate observed in recent years. There is a critical need to implement standardized, repeated methods across regions and taxa to improve the quality of global-scale comparisons and sustain core measures over longer time-scales. 
Filling these knowledge gaps will be fundamental, given that invasion data representing broad regions of the world's oceans are not yet readily available, as will maintaining knowledge pipelines for adaptive management.
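    The detection rates reported above are derived by tallying first-record years across the dataset. A minimal sketch of that tally; the record years below are invented for illustration and are not from the study's dataset:

```python
from collections import Counter

def annual_detection_counts(first_record_years):
    """Count new ANS first detections per calendar year.

    first_record_years: iterable of years in which each species was
    first collected in a given ecosystem (one entry per first record).
    """
    return Counter(first_record_years)

# Invented example records, not real data.
records = [1966, 1966, 1971, 1998, 1998, 1998, 2007]
counts = annual_detection_counts(records)
print(counts[1998])  # 3
```

    Regional or five-year binned rates, as in the analysis above, follow by grouping the same first-record years by region or period before counting.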

    Deep learning based identification of bone scintigraphies containing metastatic bone disease foci

    Abstract Purpose Metastatic bone disease (MBD) is the most common form of metastasis, most frequently deriving from prostate cancer. MBD is screened with bone scintigraphy (BS), which has high sensitivity but low specificity for the diagnosis of MBD, often requiring further investigations. Deep learning (DL), a machine learning technique designed to mimic human neuronal interactions, has shown promise in the field of medical image analysis for different purposes, including segmentation and classification of lesions. In this study, we aimed to develop a DL algorithm that can classify areas of increased uptake on bone scintigraphy scans. Methods We collected 2365 BS scans from three European medical centres. The model was trained and validated on 1203 and 164 BS scans, respectively. Furthermore, we evaluated its performance on an external testing set composed of 998 BS scans. We further aimed to enhance the explainability of our developed algorithm using activation maps. We compared the performance of our algorithm to that of 6 nuclear medicine physicians. Results The developed DL-based algorithm is able to detect MBD on BS scans with high specificity and sensitivity (0.80 and 0.82, respectively, on the external test set), in a shorter time compared to the nuclear medicine physicians (2.5 min for the AI versus 30 min for the nuclear medicine physicians to classify 134 BS scans). Further prospective validation is required before the algorithm can be used in the clinic.
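    The reported specificity (0.80) and sensitivity (0.82) summarise scan-level binary predictions against expert labels. A minimal sketch of how such metrics are computed; the labels and predictions below are illustrative, not the study's data:

```python
def sensitivity_specificity(y_true, y_pred):
    """Compute (sensitivity, specificity) from binary labels.

    y_true, y_pred: sequences of 0/1, where 1 = MBD present.
    Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn), tn / (tn + fp)

# Invented example labels, not study data.
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 1, 0, 0, 0, 1, 0, 1]
sens, spec = sensitivity_specificity(y_true, y_pred)
print(sens, spec)  # 0.75 0.75
```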