
    Parapneumonic empyema diagnosed by chest radiograph and computed tomography

    Pleural effusion is commonly seen in association with pneumonia. When an effusion progresses to empyema, directed therapy is frequently required. Chest radiographic and computed tomography findings can help distinguish empyema from a transudative pleural effusion.

    High elevation of the ‘Nevadaplano’ during the Late Cretaceous

    During the Late Cretaceous, central Nevada may have been a high elevation plateau, the Nevadaplano; some geodynamic models of the western US require thickened crust and high elevations during the Mesozoic to drive the subsequent tectonic events of the Cenozoic, while other models do not. To test the hypothesis of high elevations during the late Mesozoic, we used carbonate clumped isotope thermometry to determine the temperature contrast between Late Cretaceous to Paleocene carbonates atop the putative plateau in Nevada versus carbonates from a relatively low paleoelevation site in central Utah. Lacustrine carbonates from the Nevada site preserve summer temperatures ∼13 °C cooler than summer temperatures from paleosol carbonates from the Utah site, after correcting for ∼1.2 °C of secular climatic cooling between the times of carbonate deposition at the two sites. This ∼13 °C temperature difference implies an elevation difference between the two sites of ∼2.2–3.1 km; including uncertainties from age estimation and climate change broadens this estimate to ≥2 km. Our findings support crustal thickness estimates and Cenozoic tectonic models that imply thickened crust and high elevation in Nevada during the Mesozoic.
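    The elevation estimate above rests on dividing a temperature contrast by a terrestrial lapse rate. As a back-of-the-envelope sketch (the bracketing lapse rates below are assumptions for illustration, not values from the abstract):

    ```python
    # Illustrative only: elevation contrast implied by a surface
    # temperature difference, given an assumed terrestrial lapse rate.

    def elevation_difference_km(delta_t_c: float, lapse_rate_c_per_km: float) -> float:
        """Elevation contrast (km) implied by a temperature difference (degC)."""
        return delta_t_c / lapse_rate_c_per_km

    delta_t = 13.0  # degC contrast between the Nevada and Utah sites (abstract)
    for lapse in (4.2, 5.9):  # assumed bracketing lapse rates, degC/km
        print(f"{lapse} C/km -> {elevation_difference_km(delta_t, lapse):.1f} km")
    ```

    With these assumed lapse rates the 13 °C contrast brackets roughly 2.2–3.1 km, consistent with the range quoted in the abstract.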

    Hot summers in the Western United States during the Late Cretaceous and Early Cenozoic

    Understanding how seasonal temperatures on land respond to global greenhouse climate conditions is important for predicting effects of climate change on ecosystem structure, agriculture and distributions of natural resources. Fossil floral and faunal assemblages suggest winter temperatures in middle and high latitude continental interiors during the Cretaceous and early Cenozoic were at or above freezing, whereas terrestrial summer temperature estimates are uncertain. Carbonate clumped isotope (Δ47) temperature estimates from lacustrine and paleosol carbonates appear to be generally biased toward summer temperatures in middle and high latitudes. Though problematic for reconstructing mean annual temperature (MAT), this bias presents an opportunity to reconstruct terrestrial summer temperatures and, through comparison with paleobotanical data, estimate past terrestrial seasonality.

    Influenza Vaccine Effectiveness against Hospitalisation with Confirmed Influenza in the 2010-11 Seasons: A Test-negative Observational Study

    Immunisation programs are designed to reduce serious morbidity and mortality from influenza, but most evidence supporting the effectiveness of this intervention has focused on disease in the community or in primary care settings. We aimed to examine the effectiveness of influenza vaccination against hospitalisation with confirmed influenza. We compared influenza vaccination status in patients hospitalised with PCR-confirmed influenza with patients hospitalised with influenza-negative respiratory infections in an Australian sentinel surveillance system. Vaccine effectiveness was estimated from the odds ratio of vaccination in cases and controls. We performed both simple multivariate regression and a stratified analysis based on propensity score of vaccination. Vaccination status was ascertained in 333 of 598 patients with confirmed influenza and 785 of 1384 test-negative patients. Overall estimated crude vaccine effectiveness was 57% (41%, 68%). After adjusting for age, chronic comorbidities and pregnancy status, the estimated vaccine effectiveness was 37% (95% CI: 12%, 55%). In an analysis accounting for a propensity score for vaccination, the estimated vaccine effectiveness was 48.3% (95% CI: 30.0, 61.8%). Influenza vaccination was moderately protective against hospitalisation with influenza in the 2010 and 2011 seasons.
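    In a test-negative design, crude vaccine effectiveness is one minus the odds ratio of vaccination in cases versus test-negative controls. A minimal sketch (the 2×2 counts are hypothetical, chosen only so the crude estimate lands near the study's 57%):

    ```python
    # Hypothetical example of the test-negative VE calculation:
    # VE (%) = (1 - odds ratio of vaccination, cases vs. controls) x 100.

    def vaccine_effectiveness_pct(case_vacc: int, case_unvacc: int,
                                  ctrl_vacc: int, ctrl_unvacc: int) -> float:
        """Crude vaccine effectiveness (%) from a 2x2 vaccination table."""
        odds_ratio = (case_vacc / case_unvacc) / (ctrl_vacc / ctrl_unvacc)
        return (1.0 - odds_ratio) * 100.0

    # Hypothetical counts; not the study's data.
    print(f"VE = {vaccine_effectiveness_pct(43, 100, 100, 100):.0f}%")  # VE = 57%
    ```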

    Weekly vs. Every-3-Week Paclitaxel and Carboplatin for Ovarian Cancer

    BACKGROUND A dose-dense weekly schedule of paclitaxel (resulting in a greater frequency of drug delivery) plus carboplatin every 3 weeks or the addition of bevacizumab to paclitaxel and carboplatin administered every 3 weeks has shown efficacy in ovarian cancer. We proposed to determine whether dose-dense weekly paclitaxel and carboplatin would prolong progression-free survival as compared with paclitaxel and carboplatin administered every 3 weeks among patients receiving and those not receiving bevacizumab. METHODS We prospectively stratified patients according to whether they elected to receive bevacizumab and then randomly assigned them to receive either paclitaxel, administered intravenously at a dose of 175 mg per square meter of body-surface area every 3 weeks, plus carboplatin (dose equivalent to an area under the curve [AUC] of 6) for six cycles or paclitaxel, administered weekly at a dose of 80 mg per square meter, plus carboplatin (AUC, 6) for six cycles. The primary end point was progression-free survival. RESULTS A total of 692 patients were enrolled, 84% of whom opted to receive bevacizumab. In the intention-to-treat analysis, weekly paclitaxel was not associated with longer progression-free survival than paclitaxel administered every 3 weeks (14.7 months and 14.0 months, respectively; hazard ratio for disease progression or death, 0.89; 95% confidence interval [CI], 0.74 to 1.06; P=0.18). Among patients who did not receive bevacizumab, weekly paclitaxel was associated with progression-free survival that was 3.9 months longer than that observed with paclitaxel administered every 3 weeks (14.2 vs. 10.3 months; hazard ratio, 0.62; 95% CI, 0.40 to 0.95; P=0.03). However, among patients who received bevacizumab, weekly paclitaxel did not significantly prolong progression-free survival, as compared with paclitaxel administered every 3 weeks (14.9 months and 14.7 months, respectively; hazard ratio, 0.99; 95% CI, 0.83 to 1.20; P=0.60). 
    A test for interaction that assessed homogeneity of the treatment effect showed a significant difference between treatment with and without bevacizumab (P=0.047). Patients who received weekly paclitaxel had a higher rate of grade 3 or 4 anemia than did those who received paclitaxel every 3 weeks (36% vs. 16%), as well as a higher rate of grade 2 to 4 sensory neuropathy (26% vs. 18%); however, they had a lower rate of grade 3 or 4 neutropenia (72% vs. 83%). CONCLUSIONS Overall, weekly paclitaxel, as compared with paclitaxel administered every 3 weeks, did not prolong progression-free survival among patients with ovarian cancer.

    Cancer risk in 680 000 people exposed to computed tomography scans in childhood or adolescence: Data linkage study of 11 million Australians

    Objective To assess the cancer risk in children and adolescents following exposure to low dose ionising radiation from diagnostic computed tomography (CT) scans. Design Population based, cohort, data linkage study in Australia. Cohort members 10.9 million people identified from Australian Medicare records, aged 0-19 years on 1 January 1985 or born between 1 January 1985 and 31 December 2005; all exposures to CT scans funded by Medicare during 1985-2005 were identified for this cohort. Cancers diagnosed in cohort members up to 31 December 2007 were obtained through linkage to national cancer records. Main outcome Cancer incidence rates in individuals exposed to a CT scan more than one year before any cancer diagnosis, compared with cancer incidence rates in unexposed individuals. Results 60 674 cancers were recorded, including 3150 in 680 211 people exposed to a CT scan at least one year before any cancer diagnosis. The mean duration of follow-up after exposure was 9.5 years. Overall cancer incidence was 24% greater for exposed than for unexposed people, after accounting for age, sex, and year of birth (incidence rate ratio (IRR) 1.24 (95% confidence interval 1.20 to 1.29); P<0.001). We saw a dose-response relation, and the IRR increased by 0.16 (0.13 to 0.19) for each additional CT scan. The IRR was greater after exposure at younger ages (P<0.001 for trend). At 1-4, 5-9, 10-14, and 15 or more years since first exposure, IRRs were 1.35 (1.25 to 1.45), 1.25 (1.17 to 1.34), 1.14 (1.06 to 1.22), and 1.24 (1.14 to 1.34), respectively. The IRR increased significantly for many types of solid cancer (digestive organs, melanoma, soft tissue, female genital, urinary tract, brain, and thyroid), as well as for leukaemia, myelodysplasia, and some other lymphoid cancers. There was an excess of 608 cancers in people exposed to CT scans (147 brain, 356 other solid, 48 leukaemia or myelodysplasia, and 57 other lymphoid).
The absolute excess incidence rate for all cancers combined was 9.38 per 100 000 person years at risk, as of 31 December 2007. The average effective radiation dose per scan was estimated as 4.5 mSv. Conclusions The increased incidence of cancer after CT scan exposure in this cohort was mostly due to irradiation. Because the cancer excess was still continuing at the end of follow-up, the eventual lifetime risk from CT scans cannot yet be determined. Radiation doses from contemporary CT scans are likely to be lower than those in 1985-2005, but some increase in cancer risk is still likely from current scans. Future CT scans should be limited to situations where there is a definite clinical indication, with every scan optimised to provide a diagnostic CT image at the lowest possible radiation dose.
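    As a rough consistency check of the figures above (assumption: person-years at risk are approximated as the number of exposed people times the mean follow-up, since the abstract does not report the exact person-year total):

    ```python
    # Rough reconstruction of the absolute excess rate; approximate only.
    excess_cancers = 608          # excess cancers in the exposed group (abstract)
    exposed_people = 680_211      # exposed >=1 year before any diagnosis (abstract)
    mean_followup_years = 9.5     # mean follow-up after exposure (abstract)

    person_years = exposed_people * mean_followup_years
    excess_rate = excess_cancers / person_years * 100_000
    print(f"~{excess_rate:.1f} excess cancers per 100 000 person-years")
    ```

    This crude product gives ~9.4 per 100 000 person-years, close to the reported 9.38; the small gap reflects that the study's true person-year total differs slightly from people × mean follow-up.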

    Ornithological expeditions to Sarawak, Malaysian Borneo, 2007-2017

    Louisiana State University, the University of Kansas, and the Universiti Malaysia Sarawak undertook collaborative research on the evolution and ecology of Bornean birds starting in 2005. This collaboration included a series of expeditions from 2007–2017 to collect and study birds at >30 sites in Sarawak, Malaysian Borneo. Here we provide information on the study sites and summarize the main discoveries resulting from the collaboration.

    Neurobiological Mechanisms That Contribute to Stress-related Cocaine Use

    The ability of stressful life events to trigger drug use is particularly problematic for the management of cocaine addiction due to the unpredictable and often uncontrollable nature of stress. For this reason, understanding the neurobiological processes that contribute to stress-related drug use is important for the development of new and more effective treatment strategies aimed at minimizing the role of stress in the addiction cycle. In this review we discuss the neurocircuitry that has been implicated in stress-induced drug use, with an emphasis on corticotropin releasing factor actions in the ventral tegmental area (VTA) and an important pathway from the bed nucleus of the stria terminalis to the VTA that is regulated by norepinephrine via actions at beta adrenergic receptors. In addition to the neurobiological mechanisms that underlie stress-induced cocaine seeking, we review findings suggesting that the ability of stressful stimuli to trigger cocaine use emerges and intensifies in an intake-dependent manner with repeated cocaine self-administration. Further, we discuss evidence that the drug-induced neuroadaptations that are necessary for heightened susceptibility to stress-induced drug use are reliant on elevated levels of glucocorticoid hormones at the time of cocaine use. Finally, the potential ability of stress to function as a “stage setter” for drug use – increasing sensitivity to cocaine and drug-associated cues – under conditions where it does not directly trigger cocaine seeking is discussed. As our understanding of the mechanisms through which stress promotes drug use advances, the hope is that so too will the available tools for effectively managing addiction, particularly in cocaine addicts whose drug use is stress-driven.