
    Regional modelling of permafrost thicknesses over the past 130 ka: implications for permafrost development in Great Britain

    The greatest thicknesses of permafrost in Great Britain most likely occurred during the last glacial–interglacial cycle, as this is when some of the coldest conditions of the last 1 000 000 years occurred. The regional development of permafrost across Great Britain during the last glacial–interglacial cycle was modelled from a ground surface temperature history based on mean annual temperatures and the presence of glacier ice. To quantify the growth and decay of permafrost, modelling was undertaken at six locations across Great Britain representing upland glaciated, lowland glaciated, upland unglaciated and lowland unglaciated conditions. Maximum predicted permafrost depths derived in this study range from several tens of metres to over 100 m, depending upon factors including elevation, glacier ice cover, geothermal heat flux and air temperature. In general, the greatest maximum permafrost thicknesses occur at upland glaciated locations, with the smallest at lowland sites. Current direct geological evidence for permafrost comes from surface or shallow processes, mainly associated with the active layer. Further research is recommended to identify the imprint of freeze/thaw conditions in permanently frozen porous rocks from beneath the active layer.
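    The study's thermal model is not reproduced in the abstract; as a rough illustration of how a sub-zero ground surface temperature and the geothermal heat flux bound permafrost thickness, the sketch below computes the depth of the 0°C isotherm under steady-state conduction. All numerical values (surface temperatures, heat flux, conductivity) are illustrative assumptions rather than parameters from the study, and a transient model such as the one described would generally predict thinner permafrost than this equilibrium bound.

```python
# Minimal sketch (not the study's model): equilibrium permafrost thickness from a
# sub-zero ground surface temperature and geothermal heat flux, assuming steady-state
# conduction (Fourier's law). All parameter values are illustrative assumptions.

def equilibrium_permafrost_depth_m(surface_temp_c, heat_flux_w_m2, conductivity_w_mk):
    """Depth (m) of the 0 degC isotherm for a surface held below freezing.

    Under steady-state conduction the gradient is dT/dz = q / k, so the base of
    permafrost sits at z = -T_surface / (q / k).
    """
    if surface_temp_c >= 0.0:
        return 0.0  # no permafrost under a non-freezing surface
    gradient_k_per_m = heat_flux_w_m2 / conductivity_w_mk
    return -surface_temp_c / gradient_k_per_m

# Assumed values: heat flux 60 mW/m^2, bedrock conductivity 2.5 W/(m K).
# A -8 degC glacial-stage surface gives ~330 m at equilibrium; -2 degC gives ~80 m.
for ts_c in (-8.0, -2.0):
    print(ts_c, round(equilibrium_permafrost_depth_m(ts_c, 0.060, 2.5), 1))
```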

    Geothermal exploration in the Fell Sandstone Formation (Mississippian) beneath the city centre of Newcastle upon Tyne, UK: the Newcastle Science Central Deep Geothermal Borehole

    The postulate that geothermal energy might be recoverable from strata laterally equivalent to the Fell Sandstone Formation (Carboniferous: Mississippian) beneath Newcastle upon Tyne has been examined by the drilling and testing of the 1821 m deep Newcastle Science Central Deep Geothermal Borehole. This proved 376.5 m of Fell Sandstone Formation below 1400 m, much of which resembled braided river deposits found at outcrop, although some lower portions were reddened and yielded grains of aeolian affinity. Downhole logging after attainment of thermal equilibrium proved a temperature of 73°C at 1740 m, and allowed estimation of heat flow at about 88 mW m⁻². This relatively high value probably reflects deep convective transfer of heat over a distance of >8 km from the North Pennine Batholith, along the Ninety Fathom Fault. The Fell Sandstone traversed by the borehole proved to be of low hydraulic conductivity (c. 7×10⁻⁵ m d⁻¹). The water that entered the well was highly saline, with a Na–(Ca)–Cl signature similar to other warm waters encountered in the region. It remains for future directional drilling to establish whether sufficient natural fracture permeability can be encountered, or wells stimulated, to support commercial heat production.
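    The reported heat flow can be sanity-checked with Fourier's law, q = k·dT/dz. In the sketch below, the 73°C at 1740 m figure is taken from the abstract, while the mean annual surface temperature (~10°C) and the effective thermal conductivity (~2.4 W m⁻¹ K⁻¹) are illustrative assumptions rather than values reported by the study; with those assumptions the estimate lands close to the quoted ~88 mW m⁻².

```python
# Back-of-envelope check of the reported heat flow using Fourier's law, q = k * dT/dz.
# The borehole temperature is from the abstract; the surface temperature and conductivity
# are illustrative assumptions, not values reported by the study.
depth_m = 1740.0
temp_at_depth_c = 73.0
surface_temp_c = 10.0        # assumed mean annual surface temperature
conductivity_w_mk = 2.4      # assumed effective conductivity of the drilled succession

gradient_k_per_m = (temp_at_depth_c - surface_temp_c) / depth_m  # ~0.036 K/m
heat_flow_mw_m2 = 1000.0 * conductivity_w_mk * gradient_k_per_m  # ~87 mW/m^2
print(f"gradient ~ {1000 * gradient_k_per_m:.1f} degC/km, heat flow ~ {heat_flow_mw_m2:.0f} mW/m^2")
```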

    Unlocking the potential of geothermal energy in the UK

    This report is intended to provide technical information that complements the BGS Science Briefing Note: Deep impact: Unlocking the potential of geothermal energy for affordable low-carbon heating in the UK [1]. It gives a general overview of the deep geothermal opportunities that exist in the UK (although regional geothermal potential is not discussed here), as well as of the financial, policy and regulatory actions needed to support the effective development and exploitation of deep geothermal resources in the UK. The recommendations are applicable to the UK government and its departments, as well as to the devolved administrations in Scotland, Northern Ireland and Wales and to devolved policy areas, such as heat policy and planning, in the respective nations. Following the introduction, the report is organised in three sections. Section 1 details the UK’s deep geothermal resources and how and where they could be utilised. Section 2 focuses on the experiences of continental Europe and the policies that have enabled the growth of a geothermal industry. Section 3 considers the key policy and regulatory actions identified as necessary to drive the development of the UK geothermal sector from its current state of infancy to a mature technology that is universally recognised, utilised by a wide range of stakeholders and end-users, and supported by investors.

    Critical perspectives on ‘consumer involvement’ in health research: epistemological dissonance and the know-do gap

    Researchers in the area of health and social care (both in Australia and internationally) are encouraged to involve consumers throughout the research process, often on ethical, political and methodological grounds, or simply as ‘good practice’. This paper presents findings from a qualitative study in the UK of researchers’ experiences and views of consumer involvement in health research. Two main themes are presented in the paper. Firstly, we explore the ‘know-do gap’, which relates to the tensions between researchers’ perceptions of the potential benefits of, and their actual practices in relation to, consumer involvement. Secondly, we focus on one of the reasons for this ‘know-do gap’, namely epistemological dissonance. Findings are linked to issues around consumerism in research, lay/professional knowledges, the (re)production of professional and consumer identities, and the maintenance of boundaries between consumers and researchers.

    Simultaneous generation of many RNA-seq libraries in a single reaction

    Although RNA-seq is a powerful tool, the considerable time and cost associated with library construction have limited its utilization for various applications. RNAtag-Seq, an approach to generate multiple RNA-seq libraries in a single reaction, lowers time and cost per sample, and it produces data on prokaryotic and eukaryotic samples that are comparable to those generated by traditional strand-specific RNA-seq approaches.

    Improving the diagnosis and treatment of urinary tract infection in young children in primary care: results from the ‘DUTY’ prospective diagnostic cohort study

    PURPOSE Up to 50% of urinary tract infections (UTIs) in young children are missed in primary care. Urine culture is essential for diagnosis, but urine collection is often difficult. Our aim was to derive and internally validate a 2-step clinical rule using (1) symptoms and signs to select children for urine collection; and (2) symptoms, signs, and dipstick testing to guide antibiotic treatment. METHODS We recruited acutely unwell children aged under 5 years from 233 primary care sites across England and Wales. Index tests were parent-reported symptoms, clinician-reported signs, urine dipstick results, and clinician opinion of UTI likelihood (clinical diagnosis before dipstick and culture). The reference standard was microbiologically confirmed UTI cultured from a clean-catch urine sample. We calculated sensitivity, specificity, and area under the receiver operating characteristic (AUROC) curve of coefficient-based (graded severity) and points-based (dichotomized) symptom/sign logistic regression models, and we then internally validated the AUROC using bootstrapping. RESULTS Three thousand thirty-six children provided urine samples, and culture results were available for 2,740 (90%). Of these results, 60 (2.2%) were positive: the clinical diagnosis was 46.6% sensitive, with an AUROC of 0.77. Previous UTI, increasing pain/crying on passing urine, increasingly smelly urine, absence of severe cough, increasing clinician impression of severe illness, abdominal tenderness on examination, and normal findings on ear examination were associated with UTI. The validated coefficient- and points-based model AUROCs were 0.87 and 0.86, respectively, increasing to 0.90 and 0.90, respectively, with the addition of dipstick nitrites, leukocytes, and blood. CONCLUSIONS A clinical rule based on symptoms and signs is superior to clinician diagnosis and performs well for identifying young children for noninvasive urine sampling. Dipstick results add further diagnostic value for empiric antibiotic treatment.
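    The derivation-and-internal-validation workflow described (logistic regression on symptoms and signs, with the AUROC assessed by bootstrapping) can be sketched in a few lines. The example below uses synthetic data and generic predictors purely to show the shape of that workflow, including a bootstrap optimism correction of the apparent AUROC; it is not the DUTY model, its variables, or its data.

```python
# Sketch of the described workflow on synthetic data: fit a symptom/sign logistic
# regression, then internally validate the AUROC with a bootstrap optimism estimate.
# Predictors, outcome and sample size here are synthetic stand-ins, not DUTY data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 5))                                  # stand-in symptoms/signs
true_logit = -3.5 + X @ np.array([0.8, 0.6, 0.4, 0.0, -0.3])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))       # stand-in UTI outcome (rare)

model = LogisticRegression().fit(X, y)
apparent_auc = roc_auc_score(y, model.predict_proba(X)[:, 1])

# Bootstrap optimism: refit on resamples and compare in-resample vs original-sample AUC.
optimism = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    m = LogisticRegression().fit(X[idx], y[idx])
    optimism.append(roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
                    - roc_auc_score(y, m.predict_proba(X)[:, 1]))

print(f"apparent AUROC = {apparent_auc:.3f}, "
      f"optimism-corrected AUROC = {apparent_auc - float(np.mean(optimism)):.3f}")
```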

    Longitudinal evaluation of aflatoxin exposure in two cohorts in south-western Uganda

    Aflatoxins (AF) are a group of mycotoxins. AF exposure causes acute and chronic adverse health effects, such as aflatoxicosis and hepatocellular carcinoma, in human populations, especially in the developing world. In this study, AF exposure was evaluated using archived serum samples from human immunodeficiency virus (HIV)-seronegative participants from two cohort studies in south-western Uganda. AFB1-lysine (AFB-Lys) adduct levels were determined via HPLC fluorescence in a total of 713 serum samples from the General Population Cohort (GPC), covering eight time periods between 1989 and 2010. Overall, 90% (642/713) of the samples were positive for AFB-Lys and the median level was 1.58 pg mg⁻¹ albumin (range = 0.40–168 pg mg⁻¹ albumin). AFB-Lys adduct levels were also measured in a total of 374 serum samples from the Rakai Community Cohort Study (RCCS), across four time periods between 1999 and 2003. The averaged detection rate was 92.5% (346/374) and the median level was 1.18 pg mg⁻¹ albumin (range = 0.40–122.5 pg mg⁻¹ albumin). In the GPC study there were no statistically significant associations between demographic parameters, such as age, sex and level of education, and serum AFB-Lys adduct levels. In the RCCS study, longitudinal analysis using generalised estimating equations revealed significant associations between adduct levels and residential area (p = 0.05) and occupation (p = 0.02). This study indicates that AF exposure in two populations in south-western Uganda is persistent and has not changed significantly over time. Data from one study, but not the other, indicated that agricultural workers and rural residents had greater AF exposure than non-agricultural workers and non-rural residents. These results suggest the need for further study of AF-induced adverse health effects in humans, especially the predominant diseases in the region.
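    The longitudinal analysis mentioned, generalised estimating equations relating adduct levels to residential area and occupation with repeated measurements per participant, can be illustrated with a minimal sketch. The data, variable names and effect sizes below are synthetic and exist only to show the model form; they are not the RCCS data or results.

```python
# Minimal GEE sketch of the kind of longitudinal analysis described: log adduct level
# modelled against residence and occupation, with repeated samples clustered by person.
# All data and coefficients are synthetic; this is not the RCCS analysis.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_people, n_visits = 150, 4
df = pd.DataFrame({
    "person": np.repeat(np.arange(n_people), n_visits),
    "rural": np.repeat(rng.integers(0, 2, n_people), n_visits),
    "farmer": np.repeat(rng.integers(0, 2, n_people), n_visits),
})
person_effect = np.repeat(rng.normal(0.0, 0.4, n_people), n_visits)
df["log_afb_lys"] = (0.2 + 0.3 * df["rural"] + 0.4 * df["farmer"]
                     + person_effect + rng.normal(0.0, 0.5, len(df)))

# Exchangeable working correlation accounts for repeated samples from the same person.
model = smf.gee("log_afb_lys ~ rural + farmer", groups="person", data=df,
                cov_struct=sm.cov_struct.Exchangeable(),
                family=sm.families.Gaussian())
print(model.fit().summary())
```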

    Reduced fire severity offers near-term buffer to climate-driven declines in conifer resilience across the western United States

    Increasing fire severity and warmer, drier postfire conditions are making forests in the western United States (West) vulnerable to ecological transformation. Yet, the relative importance of and interactions between these drivers of forest change remain unresolved, particularly over upcoming decades. Here, we assess how the interactive impacts of changing climate and wildfire activity influenced conifer regeneration after 334 wildfires, using a dataset of postfire conifer regeneration from 10,230 field plots. Our findings highlight declining regeneration capacity across the West over the past four decades for the eight dominant conifer species studied. Postfire regeneration is sensitive to high-severity fire, which limits seed availability, and to postfire climate, which influences seedling establishment. In the near term, projected differences in recruitment probability between low- and high-severity fire scenarios were larger than projected climate change impacts for most species, suggesting that reductions in fire severity, and the resultant impacts on seed availability, could partially offset expected climate-driven declines in postfire regeneration. Across 40 to 42% of the study area, we project postfire conifer regeneration to be likely following low-severity but not high-severity fire under future climate scenarios (2031 to 2050). However, increasingly warm, dry climate conditions are projected to eventually outweigh the influence of fire severity and seed availability. The percentage of the study area considered unlikely to experience conifer regeneration, regardless of fire severity, increases from 5% (1981 to 2000) to between 26% and 31% by mid-century, highlighting a limited time window over which management actions that reduce fire severity may effectively support postfire conifer regeneration.

    Multiorgan MRI findings after hospitalisation with COVID-19 in the UK (C-MORE): a prospective, multicentre, observational cohort study

    Introduction: The multiorgan impact of moderate to severe coronavirus infections in the post-acute phase is still poorly understood. We aimed to evaluate the excess burden of multiorgan abnormalities after hospitalisation with COVID-19, evaluate their determinants, and explore associations with patient-related outcome measures. Methods: In a prospective, UK-wide, multicentre MRI follow-up study (C-MORE), adults (aged ≥18 years) discharged from hospital following COVID-19 who were included in Tier 2 of the Post-hospitalisation COVID-19 study (PHOSP-COVID) and contemporary controls with no evidence of previous COVID-19 (SARS-CoV-2 nucleocapsid antibody negative) underwent multiorgan MRI (lungs, heart, brain, liver, and kidneys) with quantitative and qualitative assessment of images and clinical adjudication when relevant. Individuals with end-stage renal failure or contraindications to MRI were excluded. Participants also underwent detailed recording of symptoms, and physiological and biochemical tests. The primary outcome was the excess burden of multiorgan abnormalities (two or more organs) relative to controls, with further adjustments for potential confounders. The C-MORE study is ongoing and is registered with ClinicalTrials.gov, NCT04510025. Findings: Of 2710 participants in Tier 2 of PHOSP-COVID, 531 were recruited across 13 UK-wide C-MORE sites. After exclusions, 259 C-MORE patients (mean age 57 years [SD 12]; 158 [61%] male and 101 [39%] female) who were discharged from hospital with PCR-confirmed or clinically diagnosed COVID-19 between March 1, 2020, and Nov 1, 2021, and 52 non-COVID-19 controls from the community (mean age 49 years [SD 14]; 30 [58%] male and 22 [42%] female) were included in the analysis. Patients were assessed at a median of 5·0 months (IQR 4·2–6·3) after hospital discharge. Compared with non-COVID-19 controls, patients were older, more likely to be living with obesity, and had more comorbidities. Multiorgan abnormalities on MRI were more frequent in patients than in controls (157 [61%] of 259 vs 14 [27%] of 52; p<0·0001) and independently associated with COVID-19 status (odds ratio [OR] 2·9 [95% CI 1·5–5·8]; adjusted p=0·0023) after adjusting for relevant confounders. Compared with controls, patients were more likely to have MRI evidence of lung abnormalities (p=0·0001; parenchymal abnormalities), brain abnormalities (p<0·0001; more white matter hyperintensities and regional brain volume reduction), and kidney abnormalities (p=0·014; lower medullary T1 and loss of corticomedullary differentiation), whereas cardiac and liver MRI abnormalities were similar between patients and controls. Patients with multiorgan abnormalities were older (difference in mean age 7 years [95% CI 4–10]; mean age of 59·8 years [SD 11·7] with multiorgan abnormalities vs mean age of 52·8 years [11·9] without multiorgan abnormalities; p<0·0001), more likely to have three or more comorbidities (OR 2·47 [1·32–4·82]; adjusted p=0·0059), and more likely to have had a more severe acute infection (acute CRP >5 mg/L, OR 3·55 [1·23–11·88]; adjusted p=0·025) than those without multiorgan abnormalities. Presence of lung MRI abnormalities was associated with a two-fold higher risk of chest tightness, and multiorgan MRI abnormalities were associated with severe and very severe persistent physical and mental health impairment (PHOSP-COVID symptom clusters) after hospitalisation. Interpretation: After hospitalisation for COVID-19, people are at risk of multiorgan abnormalities in the medium term. Our findings emphasise the need for proactive multidisciplinary care pathways, with the potential for imaging to guide surveillance frequency and therapeutic stratification.