
    N, P and K budgets for crop rotations on nine organic farms in the UK

    On organic farms, where the importation of materials to build or maintain soil fertility is restricted, it is important that a balance between inputs and outputs of nutrients is achieved to ensure both short-term productivity and long-term sustainability. This paper considers different approaches to nutrient budgeting on organic farms and evaluates the sources of bias in the measurements and/or estimates of the nutrient inputs and outputs. The paper collates 88 nutrient budgets compiled at the farm scale in 9 temperate countries. All the nitrogen (N) budgets showed an N surplus (average 83.2 kg N ha⁻¹ year⁻¹). The efficiency of N use, defined as outputs/inputs, was highest (0.9) in arable systems and lowest (0.2) in beef systems. The phosphorus (P) and potassium (K) budgets showed both surpluses and deficits (average 3.6 kg P ha⁻¹ year⁻¹ and 14.2 kg K ha⁻¹ year⁻¹), with horticultural systems showing large surpluses resulting from purchased manure. The estimation of N fixation and of the quantities of nutrients in purchased manures may introduce significant errors into nutrient budgets. Overall, the data illustrate the diversity of management systems in place on organic farms and suggest that, used together with soil analysis, nutrient budgets are a useful tool for improving the long-term sustainability of organic systems.
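The farm-gate budget arithmetic used above (surplus = inputs − outputs; N use efficiency = outputs/inputs) can be sketched as follows. All figures and input/output categories here are hypothetical illustrations, not data from the paper.

```python
# Minimal sketch of a farm-gate nutrient budget. The input/output
# categories and amounts below are invented for illustration only.

def nutrient_budget(inputs_kg_ha, outputs_kg_ha):
    """Return (surplus, efficiency) for one nutrient.

    surplus    = total inputs - total outputs (kg/ha per year)
    efficiency = total outputs / total inputs (the paper's outputs/inputs ratio)
    """
    total_in = sum(inputs_kg_ha.values())
    total_out = sum(outputs_kg_ha.values())
    return total_in - total_out, total_out / total_in

# Hypothetical N budget for a mixed farm (kg N/ha per year)
surplus, efficiency = nutrient_budget(
    inputs_kg_ha={"N fixation": 120, "purchased feed": 30, "deposition": 20},
    outputs_kg_ha={"milk": 40, "crops sold": 25},
)
print(surplus, round(efficiency, 2))  # surplus 105 kg N/ha, efficiency ~0.38
```

An arable system with efficiency near 0.9 would return most of its N inputs as outputs; a beef system near 0.2 retains (or loses) most of them, which is why the surpluses differ so much between system types.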

    Effects of sleep disturbance on dyspnoea and impaired lung function following hospital admission due to COVID-19 in the UK: a prospective multicentre cohort study

    Background: Sleep disturbance is common following hospital admission both for COVID-19 and other causes. Its clinical implications for recovery after hospital admission are poorly understood, despite sleep disturbance contributing to morbidity in other settings. We aimed to investigate the prevalence and nature of sleep disturbance after discharge following hospital admission for COVID-19 and to assess whether this was associated with dyspnoea. Methods: CircCOVID was a prospective multicentre cohort substudy designed to investigate the effects of circadian disruption and sleep disturbance on recovery after COVID-19 in a cohort of participants aged 18 years or older, admitted to hospital for COVID-19 in the UK, and discharged between March, 2020, and October, 2021. Participants were recruited from the Post-hospitalisation COVID-19 study (PHOSP-COVID). Follow-up data were collected at two time points: an early time point 2–7 months after hospital discharge and a later time point 10–14 months after discharge. Sleep quality was assessed subjectively using the Pittsburgh Sleep Quality Index questionnaire and a numerical rating scale, and also with an accelerometer worn on the wrist (actigraphy) for 14 days. Participants were also clinically phenotyped at the early time point after discharge, including assessment of symptoms (ie, anxiety [Generalised Anxiety Disorder 7-item scale questionnaire], muscle function [SARC-F questionnaire], and dyspnoea [Dyspnoea-12 questionnaire]) and measurement of lung function. Actigraphy results were compared with matched UK Biobank cohorts (non-hospitalised individuals and recently hospitalised individuals). Multivariable linear regression was used to define associations of sleep disturbance with the primary outcome of breathlessness and the other clinical symptoms. PHOSP-COVID is registered on the ISRCTN Registry (ISRCTN10980107).
Findings: 2320 of 2468 participants in the PHOSP-COVID study attended an early time point research visit a median of 5 months (IQR 4–6) after discharge from 83 hospitals in the UK. Sleep quality was assessed by subjective measures (the Pittsburgh Sleep Quality Index questionnaire and the numerical rating scale) for 638 participants at the early time point, and by device-based measures (actigraphy) a median of 7 months (IQR 5–8) after discharge for 729 participants. After discharge from hospital, the majority (396 [62%] of 638) of participants who had been admitted to hospital for COVID-19 reported poor sleep quality in response to the Pittsburgh Sleep Quality Index questionnaire. A comparable proportion (338 [53%] of 638) of participants felt their sleep quality had deteriorated after discharge, as assessed by the numerical rating scale. Device-based measurements were compared with an age-matched, sex-matched, BMI-matched, and time-from-discharge-matched UK Biobank cohort who had recently been admitted to hospital. Compared with this recently hospitalised matched UK Biobank cohort, participants in our study slept on average 65 min (95% CI 59 to 71) longer, had a lower sleep regularity index (–19%; 95% CI –20 to –16), and had lower sleep efficiency (by 3·83 percentage points; 95% CI 3·40 to 4·26). Similar results were obtained when comparisons were made with the non-hospitalised UK Biobank cohort. Overall sleep quality (unadjusted effect estimate 3·94; 95% CI 2·78 to 5·10), deterioration in sleep quality following hospital admission (3·00; 1·82 to 4·28), and sleep regularity (4·38; 2·10 to 6·65) were associated with higher dyspnoea scores. Poor sleep quality, deterioration in sleep quality, and sleep regularity were also associated with impaired lung function, as assessed by forced vital capacity.
Depending on the sleep metric, anxiety mediated 18–39% of the effect of sleep disturbance on dyspnoea, while muscle weakness mediated 27–41% of this effect. Interpretation: Sleep disturbance following hospital admission for COVID-19 is associated with dyspnoea, anxiety, and muscle weakness. Because of its association with multiple symptoms, targeting sleep disturbance might be beneficial in treating the post-COVID-19 condition. Funding: UK Research and Innovation, National Institute for Health Research, and Engineering and Physical Sciences Research Council.
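The mediation percentages quoted above correspond to the usual proportion-mediated quantity: the indirect effect (transmitted through the mediator) divided by the total effect. A minimal sketch, using the study's reported total effect of 3·94 but a hypothetical direct-effect coefficient (the study reports only the resulting percentages):

```python
# Proportion of an exposure-outcome effect transmitted through a mediator,
# computed as indirect / total effect (the quantity reported as "18-39%"
# above). The direct-effect value below is hypothetical, not from the study.

def proportion_mediated(total_effect, direct_effect):
    """Fraction of the total effect carried by the mediator."""
    indirect = total_effect - direct_effect
    return indirect / total_effect

# Total effect of sleep quality on dyspnoea = 3.94 (the study's unadjusted
# estimate); direct effect after adjusting for anxiety assumed to be 2.80.
print(round(proportion_mediated(3.94, 2.80), 2))  # ~0.29, i.e. ~29% mediated
```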

    Post-acute COVID-19 neuropsychiatric symptoms are not associated with ongoing nervous system injury

    A proportion of patients infected with severe acute respiratory syndrome coronavirus 2 experience a range of neuropsychiatric symptoms months after infection, including cognitive deficits, depression and anxiety. The mechanisms underpinning such symptoms remain elusive. Recent research has demonstrated that nervous system injury can occur during COVID-19. Whether ongoing neural injury in the months after COVID-19 accounts for the ongoing or emergent neuropsychiatric symptoms is unclear. Within a large prospective cohort study of adult survivors who were hospitalized for severe acute respiratory syndrome coronavirus 2 infection, we analysed plasma markers of nervous system injury and astrocytic activation, measured 6 months post-infection: neurofilament light, glial fibrillary acidic protein and total tau protein. We assessed whether these markers were associated with the severity of the acute COVID-19 illness and with post-acute neuropsychiatric symptoms (as measured by the Patient Health Questionnaire for depression, the General Anxiety Disorder assessment for anxiety, the Montreal Cognitive Assessment for objective cognitive deficit and the cognitive items of the Patient Symptom Questionnaire for subjective cognitive deficit) at 6 months and 1 year post-hospital discharge from COVID-19. No robust associations were found between markers of nervous system injury and the severity of acute COVID-19 (except for an association of small effect size between duration of admission and neurofilament light), or between these markers and post-acute neuropsychiatric symptoms. These results suggest that ongoing neuropsychiatric symptoms are not due to ongoing neural injury.

    SARS-CoV-2-specific nasal IgA wanes 9 months after hospitalisation with COVID-19 and is not induced by subsequent vaccination

    BACKGROUND: Most studies of immunity to SARS-CoV-2 focus on circulating antibody, giving limited insights into mucosal defences that prevent viral replication and onward transmission. We studied nasal and plasma antibody responses one year after hospitalisation for COVID-19, including a period when SARS-CoV-2 vaccination was introduced. METHODS: In this follow-up study, plasma and nasosorption samples were prospectively collected from 446 adults hospitalised for COVID-19 between February 2020 and March 2021 via the ISARIC4C and PHOSP-COVID consortia. IgA and IgG responses to the nucleocapsid protein (NP) and spike protein (S) of ancestral SARS-CoV-2 and the Delta and Omicron (BA.1) variants were measured by electrochemiluminescence and compared with plasma neutralisation data. FINDINGS: Strong and consistent nasal anti-NP and anti-S IgA responses were demonstrated, which remained elevated for nine months (p < 0.0001). Nasal and plasma anti-S IgG remained elevated for at least 12 months (p < 0.0001), with plasma neutralising titres that were raised against all variants compared to controls (p < 0.0001). Of 323 participants with complete data, 307 were vaccinated between 6 and 12 months, coinciding with rises in nasal and plasma IgA and IgG anti-S titres for all SARS-CoV-2 variants, although the change in nasal IgA was minimal (1.46-fold change after 10 months, p = 0.011) and the median remained below the positive threshold determined by pre-pandemic controls. Samples taken 12 months after admission showed no association between nasal IgA and plasma IgG anti-S responses (R = 0.05, p = 0.18), indicating that nasal IgA responses are distinct from those in plasma and minimally boosted by vaccination. INTERPRETATION: The decline in nasal IgA responses 9 months after infection and the minimal impact of subsequent vaccination may explain the lack of long-lasting nasal defence against reinfection and the limited effects of vaccination on transmission. These findings highlight the need to develop vaccines that enhance nasal immunity.
FUNDING: This study has been supported by the ISARIC4C and PHOSP-COVID consortia. ISARIC4C is supported by grants from the National Institute for Health and Care Research and the Medical Research Council. Liverpool Experimental Cancer Medicine Centre provided infrastructure support for this research. The PHOSP-COVID study is jointly funded by UK Research and Innovation and the National Institute for Health and Care Research. The funders were not involved in the study design, interpretation of data or the writing of this manuscript.

    Large-scale phenotyping of patients with long COVID post-hospitalization reveals mechanistic subtypes of disease

    One in ten severe acute respiratory syndrome coronavirus 2 infections results in prolonged symptoms termed long coronavirus disease (COVID), yet disease phenotypes and mechanisms are poorly understood [1]. Here we profiled 368 plasma proteins in 657 participants ≥3 months following hospitalization. Of these, 426 had at least one long COVID symptom and 233 had fully recovered. Elevated markers of myeloid inflammation and complement activation were associated with long COVID. IL-1R2, MATN2 and COLEC12 were associated with cardiorespiratory symptoms, fatigue and anxiety/depression; MATN2, CSF3 and C1QA were elevated in gastrointestinal symptoms, and C1QA was elevated in cognitive impairment. Additional markers of alterations in nerve tissue repair (SPON-1 and NFASC) were elevated in those with cognitive impairment, and SCG3, suggestive of brain–gut axis disturbance, was elevated in gastrointestinal symptoms. Severe acute respiratory syndrome coronavirus 2-specific immunoglobulin G (IgG) was persistently elevated in some individuals with long COVID, but virus was not detected in sputum. Analysis of inflammatory markers in nasal fluids showed no association with symptoms. Our study aimed to understand inflammatory processes that underlie long COVID and was not designed for biomarker discovery. Our findings suggest that specific inflammatory pathways related to tissue damage are implicated in subtypes of long COVID, which might be targeted in future therapeutic trials.

    Selenium levels in cows fed pasture and concentrates or a total mixed ration and supplemented with selenized yeast to produce milk with supra-nutritional selenium concentrations

    Seventy multiparous Holstein-Friesian cows were fed different amounts of pasture and concentrates, or a total mixed ration (TMR), for 42 d in mid-lactation to test the hypothesis that the concentration of Se in milk would depend on the amount of Se consumed, when the Se is primarily organic in nature, regardless of the diet of the cows. Of the 70 cows, 60 grazed irrigated perennial pasture at daily allowances of either 20 or 40 kg of dry matter (DM)/cow. These cows received 1 of 3 amounts of concentrates, either 1, 3, or 6 kg of DM/cow per day of pellets, and at each level of concentrate feeding, the pellets were formulated to provide 1 of 2 quantities of Se from Se yeast, either about 16 or 32 mg of Se/d. The other 10 cows were included in 2 additional treatments where a TMR diet was supplemented with 1 kg of DM/d of pellets formulated to include 1 of the 2 quantities of supplemental Se. Total Se intakes ranged from 14.5 to 35.9 mg/d, and of this, the Se-enriched pellets provided 93, 91, and 72% of the Se for cows allocated 20 and 40 kg of pasture DM/d or the TMR, respectively. No effects of the amount of Se consumed on any milk production variable, or on somatic cell count, body weight, and body condition score, were found for either the pasture-fed or TMR-fed cows. Milk Se concentrations responded quickly to the commencement of Se supplementation, reaching 89% of steady-state levels at d 5. When milk Se concentrations were at steady state (d 12 to 40), each 1 mg of Se eaten increased the Se concentration of milk by 5.0 μg/kg (R² = 0.97), and this response did not seem to be affected by the diet of the cows or their milk production. The concentration of Se in whole blood was more variable than that in milk, and took much longer to respond to changes in Se status, but it was not affected by diet at any time either. For the on-farm production of Se-enriched milk, it is important to be able to predict milk Se concentration from Se input. In our study, type of diet did not affect this relationship.
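The linear response reported above (each 1 mg of Se eaten raising steady-state milk Se by 5.0 μg/kg, independent of diet) can be turned into a simple prediction. The baseline (unsupplemented) milk Se concentration below is a hypothetical assumption, as the abstract does not report one.

```python
# Sketch of predicting steady-state milk Se from daily Se intake, using the
# 5.0 μg/kg-per-mg response reported in the study. The baseline milk Se
# concentration is a hypothetical placeholder, not a value from the paper.

SLOPE_UG_PER_KG_PER_MG = 5.0  # rise in milk Se (μg/kg) per mg of Se eaten/day

def predicted_milk_se(se_intake_mg_per_day, baseline_ug_per_kg=10.0):
    """Predicted steady-state milk Se concentration (μg/kg)."""
    return baseline_ug_per_kg + SLOPE_UG_PER_KG_PER_MG * se_intake_mg_per_day

# Intakes spanning the study's 14.5-35.9 mg/d range
for intake in (14.5, 25.0, 35.9):
    print(intake, predicted_milk_se(intake))
```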

    Producing milk with uniform high selenium concentrations on commercial dairy farms

    Six herds on five commercial dairy farms were involved in the production of high-selenium (Se) milk. The farms had a range of herd sizes, herd structures, feeding systems and milk production per cow. On all farms, pelleted concentrate supplements containing Se yeast were fed twice daily in the dairy for 16 days. The objectives were to: (1) produce milk with Se concentrations exceeding 225 μg/kg on the five farms for pilot-scale production of a high-protein milk powder; (2) validate a predictive relationship between Se intake and milk Se concentration developed in research; and (3) examine the time taken from the introduction of Se yeast to steady-state concentrations of Se in milk under a range of commercial farming conditions. We hypothesised that the relationship between Se intake and its concentration in milk found in research would apply on commercial farms. Daily Se intake, which was primarily from Se yeast in the pelleted concentrates, varied from 35 to 51 mg Se/cow. Grazed pasture and conserved forage contributed less than 1 mg Se/cow on all farms. The time taken from the introduction of pellets containing Se yeast to steady-state milk Se concentrations was 4-7 days. The steady-state Se concentrations in milk varied from 166 to 247 μg/kg, but these concentrations were only 55-72% of predicted values. All the milk produced from the five farms on the last 2 days of feeding of Se-enriched pellets was used to produce a milk protein concentrate with a Se concentration of 5.4 mg/kg. Factors that might have affected Se incorporation into milk and the implications of these results for commercial production of high-Se milk or milk products are discussed.

    Investigating tinnitus subgroups based on hearing-related difficulties

    Purpose: Meaningfully grouping individuals with tinnitus who share common characteristics (ie, subgrouping, or phenotyping) may help tailor interventions to particular tinnitus subgroups and hence reduce outcome variability. The purpose of this study was to test whether tinnitus subgroups are discernible based on hearing-related comorbidities, and to identify predictors of tinnitus severity for each subgroup identified. Methods: An exploratory cross-sectional design was used. The study was nested within an online survey distributed worldwide to investigate tinnitus experiences during the COVID-19 pandemic. The main outcome measure was the Tinnitus Handicap Inventory-Screening Version. Results: Of the 3400 respondents, 2980 were eligible adults with tinnitus, with an average age of 58 years (SD = 14.7); 49% (n = 1457) were female. A three-cluster solution identified distinct subgroups: those with tinnitus only (n = 1306; 44%), those with tinnitus, hyperacusis, hearing loss and/or misophonia (n = 795; 27%), and those with tinnitus and hearing loss (n = 879; 29%). Those with tinnitus and hyperacusis reported the highest tinnitus severity (M = 20.3; SD = 10.5), and those with tinnitus and no hearing loss had the lowest (M = 15.7; SD = 10.4). Younger age and the presence of mental health problems predicted greater tinnitus severity in all groups (β ≤ -0.1, P ≤ .016). Conclusion: Further exploration of these potential subtypes is needed in both research and clinical practice, for example by triaging tinnitus patients before their clinical appointments based on the presence of hearing-related comorbidities. Unique management pathways and interventions could then be tailored to each tinnitus subgroup.