
    Seasonal nutritional status in Norway lobsters, Nephrops norvegicus (L.): Are females nutritionally compromised over the winter? (Forthcoming)

    Norway lobsters, Nephrops norvegicus, are sediment-dwelling decapod crustaceans that excavate burrows from which they make short excursions to feed by predation and scavenging. The females of this species are known to reside within their burrows for an extended period over the winter while brooding their eggs. The aim of this study was to assess the likelihood of these females being able to feed during this brooding period. Biophysical and biochemical measurements that had previously been shown to change with starvation under laboratory conditions in male N. norvegicus were taken for female N. norvegicus under similar conditions. These measurements were also compared in both sexes obtained from monthly trawl samples from the Clyde Sea Area, Scotland, UK, together with trawl composition data. The laboratory study showed that the hepatosomatic index and the copper, lipid and water content of the hepatopancreas can be used as indicators of the state of starvation in females, as in males. In the wild, both sexes have reduced nutritional status during the winter, but not to the degree seen in animals starved for 20 weeks in aquarium trials. This study does not support the hypothesis that females cease feeding over winter during their brooding period. Firstly, some females were unable to sustain ovary development during starvation under controlled conditions, contrary to field observations. Secondly, the field data suggest that there is no sex-specific reduction in nutritional status.

    The Health Impact of Symptomatic Adult Spinal Deformity: Comparison of Deformity Types to United States Population Norms and Chronic Diseases.

    Study design: A retrospective analysis of a prospective, multicenter database. Objective: The aim of this study was to evaluate the health impact of symptomatic adult spinal deformity (SASD) by comparing Short Form 36 version 2 (SF-36) scores for SASD with United States normative and chronic disease values. Summary of background data: Recent data have identified radiographic parameters correlating with poor health-related quality of life for SASD. Disability comparisons between SASD patients and patients with chronic diseases may provide further insight into the disease burden caused by SASD. Methods: Consecutive SASD patients with no history of spine surgery were enrolled into a multicenter database and evaluated for type and severity of spinal deformity. Baseline SF-36 physical component summary (PCS) and mental component summary (MCS) values for SASD patients were compared with reported U.S. normative and chronic disease SF-36 scores. SF-36 scores were reported as norm-based scores (NBS) and evaluated against the minimal clinically important difference (MCID). Results: Between 2008 and 2011, 497 SASD patients were prospectively enrolled and evaluated. Mean PCS for all SASD patients was lower than for the U.S. total population (SASD = 40.9; US = 50; P < 0.05). Generational decline in PCS for SASD patients with no other reported comorbidities was more rapid than U.S. norms (P < 0.05). PCS worsened with lumbar scoliosis and increasing sagittal vertical axis (SVA). PCS scores for patients with isolated thoracic scoliosis were similar to values reported by individuals with chronic back pain (45.5 vs 45.7, respectively; P > 0.05), whereas patients with lumbar scoliosis combined with severe sagittal malalignment (SVA >10 cm) demonstrated worse PCS scores than values reported by patients with limited use of arms and legs (24.7 vs 29.1, respectively; P < 0.05). Conclusions: SASD is a heterogeneous condition that, depending upon the type and severity of the deformity, can have a debilitating impact on health, often exceeding the disability of more recognized chronic diseases. Health care providers must be aware of the types of SASD that correlate with disability to facilitate appropriate diagnosis, treatment, and research efforts. Level of evidence: 3.
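
    For context on the norm-based scoring mentioned above, the short Python sketch below shows the standard linear T-score transformation generally used to put SF-36 component scores on a scale with a reference-population mean of 50 and standard deviation of 10. The reference mean, standard deviation and raw score are placeholders, not values from this study.

        # Illustrative norm-based scoring: rescale a raw component score so the
        # reference population has mean 50 and SD 10. All inputs are placeholders.

        def norm_based_score(raw: float, pop_mean: float, pop_sd: float) -> float:
            """Linear T-score transformation onto a mean-50, SD-10 scale."""
            return 50.0 + 10.0 * (raw - pop_mean) / pop_sd

        if __name__ == "__main__":
            # Hypothetical raw PCS score and reference-population parameters.
            print(round(norm_based_score(raw=38.0, pop_mean=50.0, pop_sd=9.5), 1))  # ~37.4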

    High-dose therapy and blood or marrow transplantation for non-Hodgkin lymphoma with central nervous system involvement

    The role of autologous or allogeneic blood or marrow transplantation (BMT) remains undefined in patients with central nervous system (CNS) involvement by lymphoma. The records of all adult and pediatric non-Hodgkin lymphoma patients receiving BMT at Johns Hopkins from 1980 to 2003 were reviewed, and 37 patients were identified who had CNS involvement that was treated into remission by the time of BMT. The chief histologies were diffuse large B-cell lymphoma and T-cell lymphoblastic lymphoma/leukemia. Twenty-four percent received intrathecal chemotherapy alone, and 70% received intrathecal chemotherapy and CNS irradiation before BMT. The main preparative regimens were cyclophosphamide/total body irradiation and busulfan/cyclophosphamide. Forty-one percent received an allogeneic transplant. Lymphoma relapsed after BMT in 14 patients (38%), and at least 5 had documented or suspected CNS relapse. In multivariate models, age ≥18 years at diagnosis, resistant systemic disease, busulfan/cyclophosphamide conditioning, and lack of intrathecal consolidation after BMT were statistically significant predictors of inferior survival. The 5-year actuarial event-free survival was 36%, and overall survival was 39%. After BMT, long-term survival is thus achievable in a subset of patients with a history of treated CNS involvement by non-Hodgkin lymphoma. The survival rates are not dissimilar to those typically seen in other high-risk lymphoma patients undergoing BMT. These data suggest that patients with lymphomatous involvement of the CNS who achieve CNS remission should be offered BMT if it is otherwise indicated.

    A crossover randomised controlled trial of oral mandibular advancement devices for obstructive sleep apnoea-hypopnoea (TOMADO)

    Rationale: Mandibular advancement devices (MADs) are used to treat obstructive sleep apnoea-hypopnoea syndrome (OSAHS), but evidence is lacking regarding their clinical and cost-effectiveness in less severe disease. Objectives: To compare the clinical and cost-effectiveness of a range of MADs against no treatment in mild to moderate OSAHS. Measurements and methods: This open-label, randomised, controlled, crossover trial was undertaken at a UK sleep centre. Adults with an Apnoea-Hypopnoea Index (AHI) of 5–<30/h and an Epworth Sleepiness Scale (ESS) score ≥9 underwent 6 weeks of treatment with each of three non-adjustable MADs: self-moulded (SleepPro 1; SP1), semi-bespoke (SleepPro 2; SP2) and fully bespoke (bMAD), plus 4 weeks of no treatment. The primary outcome was AHI scored by a polysomnographer blinded to treatment. Secondary outcomes included ESS, quality of life, resource use and cost. Main results: 90 patients were randomised and 83 were analysed. All devices reduced AHI compared with no treatment: by 26% (95% CI 11% to 38%, p=0.001) for SP1, 33% (95% CI 24% to 41%) for SP2 and 36% (95% CI 24% to 45%, p<0.001) for bMAD. ESS was 1.51 (95% CI 0.73 to 2.29, SP1) to 2.37 (95% CI 1.53 to 3.22, bMAD) points lower than with no treatment (p<0.001 for all). Compliance was lower for SP1, which was the least preferred treatment at trial exit. All devices were cost-effective compared with no treatment at a £20 000/quality-adjusted life year (QALY) threshold; SP2 was the most cost-effective up to £39 800/QALY. Conclusions: Non-adjustable MADs achieve clinically important improvements in mild to moderate OSAHS and are cost-effective.
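
    To make the cost-effectiveness statement concrete, the short Python sketch below illustrates the generic willingness-to-pay threshold logic (incremental cost-effectiveness ratio and net monetary benefit). It is not the trial's health-economic model, and the cost and QALY increments are hypothetical placeholders; only the £20 000 and £39 800 per QALY thresholds come from the abstract.

        # Hypothetical illustration of willingness-to-pay threshold logic; the
        # incremental cost and QALY values are invented, not TOMADO results.

        def icer(delta_cost: float, delta_qaly: float) -> float:
            """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
            return delta_cost / delta_qaly

        def net_monetary_benefit(delta_cost: float, delta_qaly: float, threshold: float) -> float:
            """NMB > 0 means the option is cost-effective at the given threshold."""
            return delta_qaly * threshold - delta_cost

        if __name__ == "__main__":
            d_cost, d_qaly = 300.0, 0.02            # hypothetical increments vs no treatment
            for wtp in (20_000, 39_800):            # GBP/QALY thresholds cited in the abstract
                print(wtp, round(icer(d_cost, d_qaly)),
                      net_monetary_benefit(d_cost, d_qaly, wtp) > 0)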

    A self-controlled case series study to measure the risk of SARS-CoV-2 infection associated with attendance at sporting and cultural events: the UK Events Research Programme events.

    BACKGROUND: In 2021, whilst societies were emerging from major social restrictions during the SARS-CoV-2 pandemic, the UK government instigated an Events Research Programme to examine the risk of COVID-19 transmission from attendance at cultural events and to explore ways of enabling people to attend a range of events whilst minimising the risk of transmission. We aimed to measure any impact on the risk of COVID-19 transmission from attendance at events held at or close to commercially viable capacity, using routinely collected data. METHODS: Data were obtained on attendees at Phase 3 Events Research Programme events, for which some infection risk mitigation measures were in place (i.e. evidence of vaccination or a negative lateral flow test). Attendance data were linked with COVID-19 test result data from the UK Test and Trace system. Using a self-controlled case series design, we measured the within-person incidence rate ratio for testing positive for COVID-19, comparing the rate in days 3 to 9 following event attendance (high-risk period) with that in days 1-2 and 10-16 (baseline period). Rate ratios were adjusted for estimates of underlying regional COVID-19 prevalence to account for population-level fluctuations in infection risk, and events were grouped into broadly similar types. RESULTS: From attendance data available for 188,851 attendees, 3357 people tested positive for COVID-19 during the observation period. After accounting for total testing trends over the period, incidence rate ratios (95% confidence intervals) for positive tests were 1.16 (0.53-2.57) for indoor seated events, 1.12 (0.95-1.30) for mainly outdoor seated events, 0.65 (0.51-0.83) for mainly outdoor partially seated events, and 1.70 (1.52-1.89) for mainly outdoor unseated multi-day events. CONCLUSIONS: For the majority of event types studied in the third phase of the UK Events Research Programme, we found no evidence of an increased risk of COVID-19 transmission associated with event attendance. However, we found a 70% increased risk of infection associated with attendance at mainly outdoor unseated multi-day events. We have also demonstrated a novel use of self-controlled case series methodology in monitoring infection risk associated with event attendance.
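
    As a rough illustration of the within-person rate comparison that underlies a self-controlled case series, the Python sketch below computes an unadjusted incidence rate ratio and a large-sample 95% confidence interval from aggregate case counts and person-days in the high-risk (days 3-9) and baseline (days 1-2 and 10-16) windows. The counts are invented, and the study's adjustment for background prevalence is deliberately omitted.

        # Unadjusted sketch of a self-controlled case series rate comparison:
        # positive tests per person-day in the high-risk window versus the
        # baseline window. All counts below are hypothetical.
        import math

        def incidence_rate_ratio(cases_risk, days_risk, cases_base, days_base):
            irr = (cases_risk / days_risk) / (cases_base / days_base)
            se = math.sqrt(1 / cases_risk + 1 / cases_base)   # log-scale standard error
            lo, hi = (math.exp(math.log(irr) + s * 1.96 * se) for s in (-1, 1))
            return irr, lo, hi

        if __name__ == "__main__":
            irr, lo, hi = incidence_rate_ratio(cases_risk=120, days_risk=7 * 10_000,
                                               cases_base=90, days_base=9 * 10_000)
            print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")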

    The American Association for the Surgery of Trauma renal injury grading scale: Implications of the 2018 revisions for injury reclassification and predicting bleeding interventions.

    Background: In 2018, the American Association for the Surgery of Trauma (AAST) published revisions to the renal injury grading system to reflect the increased reliance on computed tomography scans and non-operative management of high-grade renal trauma (HGRT). We aimed to evaluate how these revisions will change the grading of HGRT and whether the revised system outperforms the original 1989 grading in predicting bleeding control interventions. Methods: Data on HGRT were collected from 14 Level 1 trauma centers from 2014 to 2017. Patients with initial computed tomography scans were included. Two radiologists reviewed the scans to regrade the injuries according to the 1989 and 2018 AAST grading systems. Descriptive statistics were used to assess grade reclassifications. Mixed-effect multivariable logistic regression was used to measure the predictive ability of each grading system, and the areas under the curves were compared. Results: Of the 322 injuries included, 27.0% were upgraded, 3.4% were downgraded, and 69.5% remained unchanged. Of the injuries graded as III or lower using the 1989 AAST system, 33.5% were upgraded to grade IV using the 2018 AAST system. Of the grade V injuries, 58.8% were downgraded using the 2018 AAST system. There was no statistically significant difference in the overall areas under the curves between the 2018 and 1989 AAST grading systems for predicting bleeding interventions (0.72 vs. 0.68, p = 0.34). Conclusion: About one third of the injuries previously classified as grade III will be upgraded to grade IV using the 2018 AAST system, which adds to the heterogeneity of grade IV injuries. Although the 2018 AAST grading provides more anatomic detail on injury patterns and includes important radiologic findings, it did not outperform the 1989 AAST grading in predicting bleeding interventions. Level of evidence: Prognostic and epidemiological study, level III.
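
    For readers unfamiliar with the discrimination comparison reported above, the Python sketch below shows the general idea of scoring two grading variables against a binary bleeding-intervention outcome with ROC AUC, using scikit-learn on synthetic data. It ignores the study's mixed-effects modelling and the formal test for an AUC difference, and every value in it is made up.

        # Synthetic sketch: compare how well two injury-grade variables
        # discriminate a binary bleeding-intervention outcome via ROC AUC.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 322
        grade_1989 = rng.integers(1, 6, size=n)                   # grades I-V, 1989 scale
        grade_2018 = np.clip(grade_1989 + rng.integers(-1, 2, size=n), 1, 5)  # toy regrading
        p = 1 / (1 + np.exp(-(grade_1989 - 3.5)))                 # outcome loosely tied to grade
        intervention = rng.binomial(1, p)

        print("AUC, 1989 grading:", round(roc_auc_score(intervention, grade_1989), 2))
        print("AUC, 2018 grading:", round(roc_auc_score(intervention, grade_2018), 2))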

    A novel sub-seabed CO2 release experiment informing monitoring and impact assessment for geological carbon storage

    Carbon capture and storage is a mitigation strategy that can be used to aid the reduction of anthropogenic CO2 emissions. This process aims to capture CO2 from large point-source emitters and transport it to a long-term storage site. For much of Europe, these deep storage sites are anticipated to be sited below the sea bed on continental shelves. A key operational requirement is an understanding of best practice for monitoring potential leakage and of the environmental impact that could result from a diffusive leak from a storage complex. Here we describe a controlled CO2 release experiment beneath the seabed, which overcomes the limitations of laboratory simulations and natural analogues. The complex processes involved in setting up the experimental facility and ensuring its successful operation are discussed, including site selection, permissions, communications and facility construction. The experimental design and observational strategy are reviewed with respect to scientific outcomes, along with lessons learnt, in order to facilitate any similar future experiments.

    Chlamydia trachomatis from Australian Aboriginal people with trachoma are polyphyletic, composed of multiple distinctive lineages.

    Chlamydia trachomatis causes sexually transmitted infections and the blinding disease trachoma. Current data on C. trachomatis phylogeny show that there is only a single trachoma-causing clade, which is distinct from the lineages causing urogenital tract (UGT) and lymphogranuloma venereum diseases. Here we report the whole-genome sequences of ocular C. trachomatis isolates obtained from young children with clinical signs of trachoma in a trachoma-endemic region of northern Australia. The isolates form two lineages that fall outside the classical trachoma lineage, instead being placed within UGT clades of the C. trachomatis phylogenetic tree. The Australian trachoma isolates appear to be recombinants with UGT C. trachomatis genome backbones, in which loci that encode immunodominant surface proteins (ompA and pmpEFGH) have been replaced by those characteristic of classical ocular isolates. This suggests that ocular tropism and association with trachoma are functionally associated with some sequence variants of ompA and pmpEFGH.