
    Particulate emissions from a steel works: a quantitative ecological assessment

    The research presented here was the response to an improvement condition issued by the Environment Agency. The aim of the present study was to examine the potential effects of the particulate emissions from an integrated iron and steel works on an adjacent sand dune ecosystem, identified as a Site of Special Scientific Interest (SSSI) for its flora, fauna and bird life. Monitoring and assessment of the deposition and flux of particulates was undertaken from April 2006 to September 2007 at monitoring sites located on the iron and steel works and in the surrounding area. A passive particulate deposition and flux monitoring study was undertaken at six sites on and surrounding the integrated iron and steel works using Frisbee deposit gauges and sticky pads. The deposition and flux of particulates was significantly higher at the monitoring sites located on or close to the works, and decreased with distance up to 3 km from the works. The particles found on and near the works were predominantly iron-rich, and most likely the result of emissions from the works. The chemical characteristics of the particles identified further away from the works were more diverse, comprising a combination of marine, soil, combustion or industrially derived particles. A desk-top review and development of the model scenario was undertaken to assess the relevance of the modelled scenarios of PM₁₀ emissions to the total dust deposited to the passive deposition gauges. The model scenario was found to be an important qualitative tool, but could not be used to predict quantitative measurements of particulate deposition owing to the limitations and uncertainty of the modelling. The deposition of particulates to the SSSI was significantly higher at sites located closer to the works, and increased significantly with exposure time. The iron concentration of the soil was found to be significantly higher on the SSSI than at a sand dune ecosystem 3.5 km away. Cleaned leaves of Leymus arenarius and Plantago lanceolata had a significantly higher rate of photosynthesis compared with untreated leaves growing on the SSSI. Therefore, at sites where the rate of particulate deposition was relatively higher and the soil iron concentration was enhanced in comparison with similar sites, the rate of photosynthesis was significantly reduced in leaves of Leymus and Plantago on the SSSI.
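
    To make the deposition measurements concrete, the sketch below shows how a deposition flux is conventionally derived from a passive deposit-gauge sample (collected mass divided by collecting area and exposure time). It is an illustration only: the gauge diameter and the example masses and exposure period are assumed values, not figures from the study.

```python
# Minimal sketch: deposition flux from a passive deposit-gauge sample.
# Assumptions (not taken from the study): the gauge diameter and the example
# masses/exposure times below are illustrative values only.

import math

def deposition_flux(mass_mg: float, exposure_days: float,
                    gauge_diameter_m: float = 0.24) -> float:
    """Return deposition flux in mg m^-2 day^-1.

    mass_mg          -- dry mass of particulate collected in the gauge
    exposure_days    -- length of the exposure period
    gauge_diameter_m -- collecting-surface diameter (illustrative default)
    """
    area_m2 = math.pi * (gauge_diameter_m / 2) ** 2
    return mass_mg / (area_m2 * exposure_days)

# Illustrative comparison of an on-works site with a site ~3 km away
on_works = deposition_flux(mass_mg=450.0, exposure_days=30)
distant = deposition_flux(mass_mg=60.0, exposure_days=30)
print(f"on works: {on_works:.0f} mg m-2 day-1, 3 km away: {distant:.0f} mg m-2 day-1")
```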

    Impact of the radiation balance on snowmelt in a sparse deciduous birch forest

    The representation of high-latitude surface processes and the quantification of surface–climate feedbacks are among the most serious shortcomings of present-day Arctic land surface modelling. The energy balance of seasonally snow-covered sparse deciduous forests at high latitudes is poorly understood and inaccurately represented within hydrological and climate models. Snow cover plays an important role in wintertime fluxes of energy, water and carbon, controlling the length of the active growing season and hence the overall carbon balance of Arctic ecosystems. Snow cover is non-uniform and spatially variable, as wind redistributes snow from areas of exposed open tundra to sheltered areas within the forest, where a deeper snowpack develops. The low sun angles, coupled with the sparse, leafless deciduous trees, cast shadows across the snow surface. The spatial distribution of canopy gaps determines the timing of direct radiation penetrating through the canopy to the snow surface. The forest canopy intercepts part of the incoming longwave radiation from the sky, but also emits longwave radiation to the snow surface. Consequently, the forest canopy plays a key role in the radiation balance of sparse forests. To improve our knowledge of these complex processes, meteorological and field observations were taken in an area of highly heterogeneous birch (Betula pubescens ssp. czerepanovii) forest in Abisko, Sweden, during the springs of 2008 and 2009. Detailed measurements of short- and longwave radiation above and below the canopy, hemispherical photographs, tree temperatures and snow surveys were used to quantify the radiation balance of the sparse deciduous forest. An array of below-canopy pyranometers gave a mean canopy transmissivity of 74 % in 2008 and 76 % in 2009. Hemispherical photographs taken at the pyranometer locations and analysed with Gap Light Analyzer (GLA) showed reasonable agreement, with a mean canopy transmissivity of 75 % in 2008 and 74 % in 2009. The canopy transmissivity was found to be independent of the diffuse fraction of radiation, as the canopy is very sparse. A series of survey grids and transects was established to scale up from the below-canopy pyranometers to the landscape scale. Hemispherical photographs analysed with GLA showed the sparse forest canopy had a mean transmissivity of 78 % and a mean LAI of 0.25, whereas the open tundra had a mean transmissivity of 97 % and a mean LAI of < 0.01. Snow surveys showed the sparse forest snow depth to vary between 0.34 and 0.55 m, whereas the snow depth in the open tundra varied between 0.12 and 0.18 m. Observations of canopy temperatures showed a strong influence of incident shortwave radiation, warming tree branches to temperatures up to 15 °C above ambient air temperature on the south-facing sides of the trees, and up to 6 °C above ambient on the north-facing sides. To reproduce the observed radiation balance, two canopy models (Homogeneous and Clumped) were developed. The Homogeneous canopy model assumes a single tree tile with a uniform sparse canopy. The Clumped canopy model assumes a tree and a grass tile, where the tree tile is permanently in shade from the canopy and the grass tile receives all the incoming radiation. These canopy models identified the need for a parameter that accounts for the spatial and temporal variation of the shaded gaps within the sparse forest. JULES (Joint UK Land Environment Simulator) is the community land surface model used in the UK Hadley Centre GCM suite. Modifications of the land-surface interactions were included in JULES to represent the shaded gaps within the sparse deciduous forest. New parameterisations were developed for the time-varying sunlit fraction of the gaps (f_lit), the sky-view fraction (f_v), and the longwave radiation emitted from the canopy (LW_tree). These model developments were informed by field observations of the forest canopy and evaluated against the observed below-canopy short- and longwave radiation data sets. The JULES Shaded gap model output showed a strong positive relationship with the observations of below-canopy shortwave and longwave radiation. The JULES Shaded gap model improves the ratio of observed to modelled short- and longwave radiation on sunny days compared with the standard JULES model. The JULES Shaded gap model also reduces the time to snowmelt by 2 to 4 days compared with the standard JULES model, bringing the model output closer to in-situ observations. This shortening of the modelled snow season directly impacts the simulated regional carbon and water balance and has wider relevance at the pan-Arctic scale. When JULES Shaded gap was evaluated on the global scale, it improved the modelled snow mass across large areas of sparse forest in northern Canada, Scandinavia and northern Russia with respect to GlobSnow. The performance of the land surface–snow–vegetation interactions in JULES was improved by using the Shaded gap model to represent the radiation balance of sparse forests in climate-sensitive Arctic regions. Furthermore, these observational data can be used to develop and evaluate high-latitude land-surface processes and biogeochemical feedbacks in other Earth system models.
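
    As an illustration of how the three quantities named above (f_lit, f_v and LW_tree) enter a below-canopy radiation balance, the sketch below uses plausible textbook expressions. It is not the actual JULES Shaded gap parameterisation; the functional forms, emissivity and all numerical values are assumptions.

```python
# Illustrative sketch of a below-canopy radiation balance using the three
# quantities named in the abstract: sunlit gap fraction (f_lit), sky-view
# fraction (f_v) and canopy-emitted longwave (LW_tree). The forms below are
# plausible textbook expressions, NOT the JULES Shaded gap parameterisation.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m-2 K-4

def below_canopy_sw(sw_direct: float, sw_diffuse: float,
                    f_lit: float, transmissivity: float) -> float:
    """Direct beam reaches the snow only through sunlit gaps; diffuse
    radiation is scaled by the bulk canopy transmissivity."""
    return f_lit * sw_direct + transmissivity * sw_diffuse

def below_canopy_lw(lw_sky: float, t_canopy_k: float,
                    f_v: float, emissivity: float = 0.98) -> float:
    """Sky longwave seen through the canopy gaps plus emission from the
    (often sunlit-warmed) canopy elements."""
    lw_tree = emissivity * SIGMA * t_canopy_k ** 4
    return f_v * lw_sky + (1.0 - f_v) * lw_tree

# Example: a sparse canopy (transmissivity ~0.75) with branches 10 K warmer
# than the air; all values are illustrative only.
sw = below_canopy_sw(sw_direct=400.0, sw_diffuse=100.0, f_lit=0.7, transmissivity=0.75)
lw = below_canopy_lw(lw_sky=250.0, t_canopy_k=273.0 + 10.0, f_v=0.8)
print(f"below-canopy SW ~ {sw:.0f} W m-2, LW ~ {lw:.0f} W m-2")
```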

    Explicitly modelling microtopography in permafrost landscapes in a land surface model (JULES vn5.4_microtopography)

    Microtopography can be a key driver of heterogeneity in the ground thermal and hydrological regime of permafrost landscapes. In turn, this heterogeneity can influence plant communities, methane fluxes, and the initiation of abrupt thaw processes. Here we have implemented a two-tile representation of microtopography in JULES (the Joint UK Land Environment Simulator), where tiles represent repeating patterns of elevation difference. Tiles are coupled by lateral flows of water and heat and by redistribution of snow, and a surface water store is added to represent ponding. Simulations are performed for two Siberian polygon sites (Samoylov and Kytalyk) and two Scandinavian palsa sites (Stordalen and Iškoras). The model represents well the observed differences in snow depth between hollows and raised areas. The model also improves soil moisture for hollows compared with the non-tiled configuration (“standard JULES”), though the raised tile remains drier than observed. The modelled differences in snow depth and soil moisture between tiles result in warmer soil temperatures in the lower tile at the palsa sites, as in reality. However, when comparing the soil temperatures for July at 20 cm depth, the difference in temperature between tiles, or “temperature splitting”, is smaller than observed (3.2 vs 5.5 °C). Polygons display small (0.2 °C) to zero temperature splitting, in agreement with observations. Consequently, methane fluxes are near-identical (+0 % to 9 %) to those for standard JULES for polygons, although they can be greater than standard JULES for palsa sites (+10 % to 49 %). Through a sensitivity analysis we quantify the relative importance of model processes with respect to soil moisture and temperatures, identifying which parameters result in the greatest uncertainty in modelled temperature. Varying the palsa elevation between 0.5 and 3 m has little effect on modelled soil temperatures, showing that using only two tiles can still be a valid representation of sites with a range of palsa elevations. Mire saturation is heavily dependent on landscape-scale drainage. Lateral conductive fluxes, while small, reduce the temperature splitting by ~1 °C and are consistent in magnitude with observed lateral degradation rates in peat plateau regions, indicating possible application in an area-based thaw model.
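
    The sketch below illustrates the general idea of a two-tile scheme coupled by lateral fluxes: a raised and a hollow tile relax toward each other in water table height and soil temperature each timestep. The exchange coefficients, variables and timestep are illustrative assumptions and do not reproduce the JULES vn5.4_microtopography implementation.

```python
# Illustrative two-tile coupling sketch in the spirit of the microtopography
# scheme described above: two columns ("raised" and "hollow") exchange water
# by lateral drainage and heat by lateral conduction each timestep.
# Parameter values and flux forms are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class Tile:
    water_table_m: float   # water table height above a common datum (m)
    soil_temp_c: float     # soil temperature at a reference depth (°C)

def lateral_exchange(raised: Tile, hollow: Tile, dt_s: float,
                     k_hyd: float = 1e-6,     # hydraulic exchange coefficient (s-1)
                     k_therm: float = 5e-8):  # thermal exchange coefficient (s-1)
    """Relax the two tiles toward each other; positive flux moves from the
    raised tile to the hollow."""
    dw = k_hyd * (raised.water_table_m - hollow.water_table_m) * dt_s
    dT = k_therm * (raised.soil_temp_c - hollow.soil_temp_c) * dt_s
    raised.water_table_m -= dw
    hollow.water_table_m += dw
    raised.soil_temp_c -= dT
    hollow.soil_temp_c += dT

raised, hollow = Tile(0.5, 6.0), Tile(0.0, 2.0)
for _ in range(24):  # one day of hourly steps
    lateral_exchange(raised, hollow, dt_s=3600.0)
print(f"temperature split after coupling: {raised.soil_temp_c - hollow.soil_temp_c:.2f} °C")
```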

    Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study

    Purpose: Delirium is a neuropsychiatric disorder delineated by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits conferring an increased risk of adverse outcomes. We set out to determine how the severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and delirium recognition in hospitalised older people in the United Kingdom. Methods: Adults over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. CFS, delirium status, and 30-day outcomes were recorded. Results: The overall prevalence of delirium was 16.3% (483 patients). Patients with delirium were more frail than patients without delirium (median CFS 6 vs 4). The risk of delirium was greater with increasing frailty [OR 2.9 (1.8–4.6) for CFS 4 vs 1–3; OR 12.4 (6.2–24.5) for CFS 8 vs 1–3]. Higher CFS was associated with reduced recognition of delirium [OR 0.7 (0.3–1.9) for CFS 4 compared with 0.2 (0.1–0.7) for CFS 8]. Both associations were independent of age and dementia. Conclusion: We have demonstrated an incremental increase in the risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the most frail patients are the least likely to have their delirium diagnosed, and there is a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.
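
    The reported odds ratios are the output of the kind of adjusted logistic regression sketched below: delirium modelled on CFS category with age and dementia as covariates. The data frame is synthetic and the variable names are illustrative, not the audit's actual dataset or code.

```python
# Sketch of the analysis behind the reported odds ratios: logistic regression
# of delirium on frailty category, adjusted for age and dementia.
# Synthetic data; variable names are illustrative, not the study's dataset.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 3000
df = pd.DataFrame({
    "cfs_group": rng.choice(["1-3", "4-6", "7-8"], size=n, p=[0.4, 0.4, 0.2]),
    "age": rng.normal(80, 7, size=n).round(),
    "dementia": rng.integers(0, 2, size=n),
})
# Synthetic outcome with higher delirium risk at higher frailty
logit = (-3.0 + df["cfs_group"].map({"1-3": 0.0, "4-6": 1.0, "7-8": 2.3})
         + 0.02 * (df["age"] - 80) + 0.8 * df["dementia"])
df["delirium"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = smf.logit("delirium ~ C(cfs_group, Treatment('1-3')) + age + dementia",
                  data=df).fit(disp=False)
print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```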

    Multiorgan MRI findings after hospitalisation with COVID-19 in the UK (C-MORE): a prospective, multicentre, observational cohort study

    Introduction: The multiorgan impact of moderate to severe coronavirus infections in the post-acute phase is still poorly understood. We aimed to evaluate the excess burden of multiorgan abnormalities after hospitalisation with COVID-19, evaluate their determinants, and explore associations with patient-related outcome measures. Methods: In a prospective, UK-wide, multicentre MRI follow-up study (C-MORE), adults (aged ≥18 years) discharged from hospital following COVID-19 who were included in Tier 2 of the Post-hospitalisation COVID-19 study (PHOSP-COVID) and contemporary controls with no evidence of previous COVID-19 (SARS-CoV-2 nucleocapsid antibody negative) underwent multiorgan MRI (lungs, heart, brain, liver, and kidneys) with quantitative and qualitative assessment of images and clinical adjudication when relevant. Individuals with end-stage renal failure or contraindications to MRI were excluded. Participants also underwent detailed recording of symptoms, and physiological and biochemical tests. The primary outcome was the excess burden of multiorgan abnormalities (two or more organs) relative to controls, with further adjustments for potential confounders. The C-MORE study is ongoing and is registered with ClinicalTrials.gov, NCT04510025. Findings: Of 2710 participants in Tier 2 of PHOSP-COVID, 531 were recruited across 13 UK-wide C-MORE sites. After exclusions, 259 C-MORE patients (mean age 57 years [SD 12]; 158 [61%] male and 101 [39%] female) who were discharged from hospital with PCR-confirmed or clinically diagnosed COVID-19 between March 1, 2020, and Nov 1, 2021, and 52 non-COVID-19 controls from the community (mean age 49 years [SD 14]; 30 [58%] male and 22 [42%] female) were included in the analysis. Patients were assessed at a median of 5·0 months (IQR 4·2–6·3) after hospital discharge. Compared with non-COVID-19 controls, patients were older, more likely to be living with obesity, and had more comorbidities. Multiorgan abnormalities on MRI were more frequent in patients than in controls (157 [61%] of 259 vs 14 [27%] of 52; p<0·0001) and independently associated with COVID-19 status (odds ratio [OR] 2·9 [95% CI 1·5–5·8]; adjusted p=0·0023) after adjusting for relevant confounders. Compared with controls, patients were more likely to have MRI evidence of lung abnormalities (p=0·0001; parenchymal abnormalities), brain abnormalities (p<0·0001; more white matter hyperintensities and regional brain volume reduction), and kidney abnormalities (p=0·014; lower medullary T1 and loss of corticomedullary differentiation), whereas cardiac and liver MRI abnormalities were similar between patients and controls. Patients with multiorgan abnormalities were older (difference in mean age 7 years [95% CI 4–10]; mean age of 59·8 years [SD 11·7] with multiorgan abnormalities vs mean age of 52·8 years [11·9] without multiorgan abnormalities; p<0·0001), more likely to have three or more comorbidities (OR 2·47 [1·32–4·82]; adjusted p=0·0059), and more likely to have had a more severe acute infection (acute CRP >5 mg/L, OR 3·55 [1·23–11·88]; adjusted p=0·025) than those without multiorgan abnormalities. Presence of lung MRI abnormalities was associated with a two-fold higher risk of chest tightness, and multiorgan MRI abnormalities were associated with severe and very severe persistent physical and mental health impairment (PHOSP-COVID symptom clusters) after hospitalisation. Interpretation: After hospitalisation for COVID-19, people are at risk of multiorgan abnormalities in the medium term. Our findings emphasise the need for proactive multidisciplinary care pathways, with the potential for imaging to guide surveillance frequency and therapeutic stratification.

    Cognitive and psychiatric symptom trajectories 2–3 years after hospital admission for COVID-19: a longitudinal, prospective cohort study in the UK

    Background: COVID-19 is known to be associated with increased risks of cognitive and psychiatric outcomes after the acute phase of disease. We aimed to assess whether these symptoms can emerge or persist more than 1 year after hospitalisation for COVID-19, to identify which early aspects of COVID-19 illness predict longer-term symptoms, and to establish how these symptoms relate to occupational functioning. Methods: The Post-hospitalisation COVID-19 study (PHOSP-COVID) is a prospective, longitudinal cohort study of adults (aged ≥18 years) who were hospitalised with a clinical diagnosis of COVID-19 at participating National Health Service hospitals across the UK. In the C-Fog study, a subset of PHOSP-COVID participants who consented to be recontacted for other research were invited to complete a computerised cognitive assessment and clinical scales between 2 years and 3 years after hospital admission. Participants completed eight cognitive tasks, covering eight cognitive domains, from the Cognitron battery, in addition to the 9-item Patient Health Questionnaire for depression, the Generalised Anxiety Disorder 7-item scale, the Functional Assessment of Chronic Illness Therapy Fatigue Scale, and the 20-item Cognitive Change Index (CCI-20) questionnaire to assess subjective cognitive decline. We evaluated how the absolute risks of symptoms evolved between follow-ups at 6 months, 12 months, and 2–3 years, and whether symptoms at 2–3 years were predicted by earlier aspects of COVID-19 illness. Participants completed an occupation change questionnaire to establish whether their occupation or working status had changed and, if so, why. We assessed which symptoms at 2–3 years were associated with occupation change. People with lived experience were involved in the study. Findings: 2469 PHOSP-COVID participants were invited to participate in the C-Fog study, and 475 participants (191 [40·2%] females and 284 [59·8%] males; mean age 58·26 [SD 11·13] years) who were discharged from one of 83 hospitals provided data at the 2–3-year follow-up. Participants had worse cognitive scores than would be expected on the basis of their sociodemographic characteristics across all cognitive domains tested (average score 0·71 SD below the mean [IQR 0·16–1·04]; p<0·0001). Most participants reported at least mild depression (263 [74·5%] of 353), anxiety (189 [53·5%] of 353), fatigue (220 [62·3%] of 353), or subjective cognitive decline (184 [52·1%] of 353), and more than a fifth reported severe depression (79 [22·4%] of 353), fatigue (87 [24·6%] of 353), or subjective cognitive decline (88 [24·9%] of 353). Depression, anxiety, and fatigue were worse at 2–3 years than at 6 months or 12 months, with evidence of both worsening of existing symptoms and emergence of new symptoms. Symptoms at 2–3 years were not predicted by the severity of acute COVID-19 illness, but were strongly predicted by the degree of recovery at 6 months (explaining 35·0–48·8% of the variance in anxiety, depression, fatigue, and subjective cognitive decline); by a biocognitive profile linking acutely raised D-dimer relative to C-reactive protein with subjective cognitive deficits at 6 months (explaining 7·0–17·2% of the variance in anxiety, depression, fatigue, and subjective cognitive decline); and by anxiety, depression, fatigue, and subjective cognitive deficit at 6 months. Objective cognitive deficits at 2–3 years were not predicted by any of the factors tested, except for cognitive deficits at 6 months, explaining 10·6% of their variance. 
95 of 353 participants (26·9% [95% CI 22·6–31·8]) reported occupational change, with poor health being the most common reason for this change. Occupation change was strongly and specifically associated with objective cognitive deficits (odds ratio [OR] 1·51 [95% CI 1·04–2·22] for every SD decrease in overall cognitive score) and subjective cognitive decline (OR 1·54 [1·21–1·98] for every point increase in CCI-20). Interpretation: Psychiatric and cognitive symptoms appear to increase over the first 2–3 years post-hospitalisation due to both worsening of symptoms already present at 6 months and emergence of new symptoms. New symptoms occur mostly in people with other symptoms already present at 6 months. Early identification and management of symptoms might therefore be an effective strategy to prevent later onset of a complex syndrome. Occupation change is common and associated mainly with objective and subjective cognitive deficits. Interventions to promote cognitive recovery or to prevent cognitive decline are therefore needed to limit the functional and economic impacts of COVID-19. Funding: National Institute for Health and Care Research Oxford Health Biomedical Research Centre, Wolfson Foundation, MQ Mental Health Research, MRC-UK Research and Innovation, and National Institute for Health and Care Research.
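
    The "variance explained" figures above come from regression modelling of 2–3-year symptoms on earlier predictors. The sketch below shows the general form of such an analysis, comparing the R² of a model containing acute severity alone with one that adds a 6-month recovery measure; all data and variable names are synthetic illustrations, not the study's models.

```python
# Sketch of a "variance explained" analysis like the one summarised above:
# how much of the variance in a 2-3-year symptom score is accounted for by a
# 6-month recovery measure over and above acute-illness severity.
# Synthetic data and variable names are illustrative only.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "acute_severity": rng.normal(size=n),
    "recovery_6m": rng.normal(size=n),
})
# In this toy data the 2-3-year depression score is driven mainly by
# 6-month recovery rather than acute severity.
df["depression_2y"] = (0.1 * df["acute_severity"] - 0.6 * df["recovery_6m"]
                       + rng.normal(scale=0.8, size=n))

base = smf.ols("depression_2y ~ acute_severity", data=df).fit()
full = smf.ols("depression_2y ~ acute_severity + recovery_6m", data=df).fit()
print(f"R² acute severity only: {base.rsquared:.3f}")
print(f"R² adding 6-month recovery: {full.rsquared:.3f} "
      f"(additional variance explained: {full.rsquared - base.rsquared:.3f})")
```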

    Prospective observational cohort study on grading the severity of postoperative complications in global surgery research

    Background: The Clavien–Dindo classification is perhaps the most widely used approach for reporting postoperative complications in clinical trials. This system classifies complication severity by the treatment provided. However, it is unclear whether the Clavien–Dindo system can be used internationally in studies across differing healthcare systems in high-income (HICs) and low- and middle-income countries (LMICs). Methods: This was a secondary analysis of the International Surgical Outcomes Study (ISOS), a prospective observational cohort study of elective surgery in adults. Data collection occurred over a 7-day period. Severity of complications was graded using Clavien–Dindo and the simpler ISOS grading (mild, moderate or severe, based on guided investigator judgement). Severity grading was compared using the intraclass correlation coefficient (ICC). Data are presented as frequencies and ICC values (with 95 per cent c.i.). The analysis was stratified by income status of the country, comparing HICs with LMICs. Results: A total of 44 814 patients were recruited from 474 hospitals in 27 countries (19 HICs and 8 LMICs). Some 7508 patients (16·8 per cent) experienced at least one postoperative complication, equivalent to 11 664 complications in total. Using the ISOS classification, 5504 of 11 664 complications (47·2 per cent) were graded as mild, 4244 (36·4 per cent) as moderate and 1916 (16·4 per cent) as severe. Using Clavien–Dindo, 6781 of 11 664 complications (58·1 per cent) were graded as I or II, 1740 (14·9 per cent) as III, 2408 (20·6 per cent) as IV and 735 (6·3 per cent) as V. Agreement between classification systems was poor overall (ICC 0·41, 95 per cent c.i. 0·20 to 0·55), and in LMICs (ICC 0·23, 0·05 to 0·38) and HICs (ICC 0·46, 0·25 to 0·59). Conclusion: Caution is recommended when using a treatment-based approach to grade complications in global surgery studies, as this may unintentionally introduce bias.
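
    Agreement between the two grading systems was summarised with the intraclass correlation coefficient. The sketch below computes a one-way random-effects ICC for two severity ratings of the same complications; the mapping of grades to a 1–3 ordinal score and the synthetic data are illustrative assumptions, not the ISOS analysis.

```python
# Sketch: one-way random-effects ICC(1,1) between two severity gradings of
# the same complications. The 1-3 ordinal mapping and the synthetic ratings
# are illustrative assumptions, not the ISOS dataset or its exact method.

import numpy as np

def icc_oneway(ratings: np.ndarray) -> float:
    """One-way random-effects ICC(1,1) for an (n_subjects, k_raters) array."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    ss_between = k * ((row_means - grand) ** 2).sum()
    ss_within = ((ratings - row_means[:, None]) ** 2).sum()
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

rng = np.random.default_rng(2)
n = 500
# Hypothetical ordinal severity scores for the same complications under the
# two systems (1 = mild ... 3 = severe); imperfectly correlated on purpose.
isos = rng.integers(1, 4, size=n)
clavien = np.clip(isos + rng.integers(-1, 2, size=n), 1, 3)
print(f"ICC between the two gradings: {icc_oneway(np.column_stack([isos, clavien])):.2f}")
```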

    Critical care admission following elective surgery was not associated with survival benefit: prospective analysis of data from 27 countries

    Purpose: As global initiatives increase patient access to surgical treatments, there is a need to define optimal levels of perioperative care. Our aim was to describe the relationship between the provision and use of critical care resources and postoperative mortality. Methods: This was a planned analysis of data collected during an international 7-day cohort study of adults undergoing elective in-patient surgery. We used risk-adjusted mixed-effects logistic regression models to evaluate the association between admission to critical care immediately after surgery and in-hospital mortality. We evaluated hospital-level associations between mortality and critical care admission immediately after surgery, critical care admission to treat life-threatening complications, and hospital provision of critical care beds. We evaluated the effect of national income using interaction tests. Results: 44,814 patients from 474 hospitals in 27 countries were available for analysis. Death was more frequent amongst patients admitted directly to critical care after surgery (critical care: 103/4317 patients [2%]; standard ward: 99/39,566 patients [0.3%]; adjusted OR 3.01 [2.10–5.21]; p < 0.001). This association may differ with national income (high-income countries OR 2.50 vs low- and middle-income countries OR 4.68; p = 0.07). At the hospital level, there was no association between mortality and critical care admission directly after surgery (p = 0.26), critical care admission to treat complications (p = 0.33), or provision of critical care beds (p = 0.70). Findings of the hospital-level analyses were not affected by national income status. A sensitivity analysis including only high-risk patients yielded similar findings. Conclusions: We did not identify any survival benefit from critical care admission following surgery.
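
    The headline adjusted odds ratio comes from risk-adjusted mixed-effects logistic regression with patients nested in hospitals. The sketch below approximates that analysis with a covariate-adjusted logistic model using hospital-clustered standard errors; the synthetic data, covariates and coefficients are illustrative only.

```python
# Sketch of a risk-adjusted analysis in the spirit of the study above:
# in-hospital death modelled on direct critical care admission, adjusted for
# patient risk factors, with standard errors clustered by hospital as a crude
# stand-in for the study's mixed-effects (random-intercept) model.
# All data below are synthetic and the covariates are illustrative.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 20000
df = pd.DataFrame({
    "hospital": rng.integers(0, 200, size=n),
    "age": rng.normal(60, 15, size=n),
    "high_risk": rng.integers(0, 2, size=n),
})
# Sicker patients are more likely to be admitted to critical care...
p_cc = 1 / (1 + np.exp(-(-3.0 + 1.5 * df["high_risk"] + 0.02 * (df["age"] - 60))))
df["critcare"] = (rng.random(n) < p_cc).astype(int)
# ...and mortality depends on risk factors, not on critical care itself.
p_die = 1 / (1 + np.exp(-(-6.0 + 2.0 * df["high_risk"] + 0.04 * (df["age"] - 60))))
df["died"] = (rng.random(n) < p_die).astype(int)

fit = smf.logit("died ~ critcare + age + high_risk", data=df).fit(
    disp=False, cov_type="cluster", cov_kwds={"groups": df["hospital"].values})
print(np.exp(fit.params["critcare"]))  # adjusted OR for critical care admission
```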