Amyloid-PET imaging predicts functional decline in clinically normal individuals
Background:
There is good evidence that elevated amyloid-β (Aβ) positron emission tomography (PET) signal is associated with cognitive decline in clinically normal (CN) individuals. However, it is less well established whether Aβ burden is also associated with decline in activities of daily living in this population. Moreover, Aβ-PET Centiloid (CL) thresholds that optimally predict functional decline have not yet been established.
Methods:
Cross-sectional and longitudinal analyses over a mean three-year follow-up were performed on the European amyloid-PET AMYPAD-PNHS dataset, which phenotypes 1260 individuals: 1032 CN individuals and 228 participants with questionable functional impairment. Amyloid burden was assessed continuously on the Centiloid (CL) scale and categorically using Aβ groups (CL > 50 = Aβ+). Functional abilities were assessed longitudinally using the Clinical Dementia Rating (Global-CDR, CDR-SOB) and the Amsterdam Instrumental Activities of Daily Living Questionnaire (A-IADL-Q). The Global-CDR was available for all 1260 participants at baseline, while baseline CDR-SOB and A-IADL-Q scores and longitudinal functional data were available for subsamples with characteristics similar to those of the full sample.
Results:
Participants included 765 Aβ- (61%, Mdnage = 66.0, IQRage = 61.0–71.0; 59% women), 301 Aβ± (24%; Mdnage = 69.0, IQRage = 64.0–75.0; 53% women) and 194 Aβ+ individuals (15%, Mdnage = 73.0, IQRage = 68.0–78.0; 53% women). Cross-sectionally, CL values were associated with CDR outcomes. Longitudinally, baseline CL values predicted prospective changes in the CDR-SOB (bCL*Time = 0.001/CL/year, 95% CI [0.0005,0.0024], p = .003) and A-IADL-Q (bCL*Time = -0.010/CL/year, 95% CI [-0.016,-0.004], p = .002) scores in initially CN participants. Increased clinical progression (Global-CDR > 0) was mainly observed in Aβ+ CN individuals (HRAβ+ vs Aβ- = 2.55, 95% CI [1.16,5.60], p = .020). Optimal thresholds for predicting decline were found at 41 CL using the CDR-SOB (bAβ+ vs Aβ- = 0.137/year, 95% CI [0.069,0.206], p < .001) and 28 CL using the A-IADL-Q (bAβ+ vs Aβ- = -0.693/year, 95% CI [-1.179,-0.208], p = .005).
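As a rough illustration of how the reported coefficients translate into expected change, the sketch below applies the fixed-effect CL×Time slopes and the optimal thresholds quoted above. It is a deliberate simplification: intercepts, covariates and random effects are omitted, and whether the cut-offs act as ≥ or > at the boundary is an assumption.

```python
# Optimal Centiloid (CL) cut-offs for predicting functional decline,
# as reported in the abstract.
CDR_SOB_CL_THRESHOLD = 41  # CDR-SOB outcome
A_IADL_CL_THRESHOLD = 28   # A-IADL-Q outcome

def decline_risk(cl: float) -> dict:
    """Flag whether a baseline CL value exceeds the optimal
    decline-prediction threshold for each functional outcome.
    Treating the cut-off as inclusive (>=) is an assumption."""
    return {
        "cdr_sob": cl >= CDR_SOB_CL_THRESHOLD,
        "a_iadl": cl >= A_IADL_CL_THRESHOLD,
    }

def predicted_annual_change(cl: float) -> dict:
    """Per-year change in each score implied by the fixed-effect
    CL x Time slopes (0.001 and -0.010 per CL per year); no
    intercepts, covariates or random effects are modelled."""
    return {
        "cdr_sob_per_year": 0.001 * cl,    # higher CDR-SOB = worse
        "a_iadl_q_per_year": -0.010 * cl,  # lower A-IADL-Q = worse
    }
```

For example, at a baseline of 50 CL the slopes imply roughly +0.05 CDR-SOB points and -0.5 A-IADL-Q points per year, which conveys the small but measurable effect sizes involved.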
Conclusions:
Amyloid-PET quantification supports the identification of CN individuals at risk of functional decline.
Trial registration: The AMYPAD PNHS is registered at www.clinicaltrialsregister.eu under EudraCT Number 2018-002277-22.
Low-input soil management increases yield and decreases CO2-emissions but aggravates risk of nitrate leaching and diseases in winter wheat cropping systems under climate change
Understanding how climate change will affect crop performance is critical to ensuring global food security and sustainability. Empirical data are key to anticipating the impact of climate change on cropping systems, but multifactorial climate change experiments remain scarce. In this study, the growth of winter wheat was examined in two agricultural soil management systems: one with long-term low organic inputs and the other with high organic inputs. The wheat was grown in these differentially managed soils in an Ecotron, where the plant-soil mesocosms were subjected to three different climatic conditions. These conditions represent a gradient of ongoing climate change, simulating the weather patterns of the years 2013, 2068 and 2085, respectively. This approach makes it possible to study the combined effects of projected increases in temperature, atmospheric CO2 concentration, solar irradiation and altered precipitation patterns on the cropping system (wheat growth, grain yield, rhizosphere processes, greenhouse gases, disease dynamics). The low-input system outperformed the high-input system, with higher yields and lower CO2 emissions under the future climates. On the other hand, the risks of plant disease and nitrate leaching were also increased in the low-input system. To reduce the environmental impact of high-yielding cropping systems in the future, it is therefore essential to identify management practices, such as fertigation or biological nitrification inhibition, that allow fertiliser application and nutrient buffering without necessarily increasing organic inputs. Under both soil management systems studied here, the wheat plants developed natural coping mechanisms, such as enhanced root growth and increased levels of proline and silicon, to mitigate the adverse effects of environmental and biotic stresses. Unravelling the molecular mechanisms that trigger such inherent plant defences is a further interesting target for breeding future crops. Adapting crop rotations and cover crops to the shorter wheat cycle expected in the future is also an opportunity to break disease cycles.
Tolerance of a wheat crop [Triticum aestivum] to the heterogeneity of nitrogen fertilizer spreading

The homogeneity of fertilizer application depends mainly on the regularity of the transverse distribution. Over the last few years, we have tested more than 300 spreaders on farms. The results show large differences between machines (CV from 5% to more than 50%), linked to spreader features, fertilizer characteristics and user skill. Moreover, crop yield is a function of fertilizer availability to the plants; in the case of wheat, nitrogen is particularly important. Using on-farm application data, a simulation of wheat yield was built as a function of increasing levels of typical distribution errors. It is thus possible to assess an acceptable level of heterogeneity for the crop.
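The transverse-distribution CV that such spreader tests report can be computed from the fertilizer mass collected in a row of trays placed across the working width. A minimal sketch (the helper name is illustrative, and the population form of the standard deviation is assumed; the survey may have used the sample form):

```python
import statistics

def spread_cv(tray_masses: list[float]) -> float:
    """Coefficient of variation (%) of the transverse distribution:
    relative spread of fertilizer mass across a row of collection
    trays spanning the spreader's working width."""
    mean = statistics.fmean(tray_masses)
    # Population standard deviation relative to the mean, in percent.
    return 100 * statistics.pstdev(tray_masses) / mean
```

A perfectly uniform tray row gives 0%; masses of 8 g and 12 g give a CV of 20%, already well above the 5% achieved by the best machines in the survey.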
Safety culture among primary care health professionals: French-language adaptation of the MOSPSC questionnaire ("Medical Office Survey on Patient Safety Culture")
Patient safety culture (PSC) encompasses a number of individual and organizational factors. Evaluation of PSC with the participation of primary health care professionals can be carried out through self-administered surveys such as the AHRQ's Medical Office Survey on Patient Safety Culture (MOSPSC) questionnaire.
Validation of a safety culture questionnaire and its indicators: the CLARTE experiment in 91 health care facilities
French national survey of inpatient adverse events prospectively assessed with ward staff
OBJECTIVES: To estimate the incidence of adverse events in medical and surgical activity in public and private hospitals, and to assess the clinical situation of patients and the active errors. DESIGN: Prospective assessment of adverse events by external senior nurse and doctor investigators together with ward staff. SETTING: Random three‐stage stratified cluster sampling of stays or fractions of stay during a 7‐day observation period for each ward. PARTICIPANTS: 8754 patients observed in 292 wards in 71 hospitals, over 35 234 hospitalisation days. MAIN OUTCOME MEASURES: Number of adverse events in relation to the number of days of hospitalisation. RESULTS: The incidence density of adverse events was 6.6 per 1000 days of hospitalisation (95% CI 5.7 to 7.5), of which 35% were preventable. Invasive procedures were the source of half the adverse events, of which 20% were preventable. Adverse events related to the psychological sphere and to pain were mostly considered preventable. Ward staff found it difficult to assess the role of care management in the occurrence of adverse events: 41% of adverse events were expected because of the disease itself and could have occurred in the absence of the related medical management. CONCLUSION: At the national level in France, 120 000–190 000 adverse events during hospitalisation can be considered preventable every year. Areas such as the perioperative period and geriatric units should receive closer attention. As adverse events occurred more commonly in vulnerable patients, who are not specifically targeted by clinical guidance, practising evidence‐based medicine is not likely to prevent all cases. Clinical risk management should therefore prioritise empowerment of local staff, provision of favourable conditions within the organisation, and staff training based on simple tools appropriate for ward‐level identification and analysis of adverse events.
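The incidence-density measure used here is the event count divided by total days of hospitalisation, scaled per 1000 days. A minimal sketch with a normal-approximation Poisson confidence interval follows; note the study used survey-weighted estimates from cluster sampling, so this simple interval would not reproduce the reported CI exactly.

```python
import math

def incidence_density(events: int, person_days: int, per: int = 1000):
    """Incidence density per `per` days of hospitalisation, with a
    normal-approximation 95% CI for a Poisson count (a sketch; it
    ignores the survey's stratified cluster design)."""
    rate = events / person_days * per
    # Poisson variance equals the count, hence SE of the count is sqrt(n).
    se = math.sqrt(events) / person_days * per
    return rate, rate - 1.96 * se, rate + 1.96 * se
```

For instance, 100 events over 10 000 hospitalisation days gives 10.0 per 1000 days with an approximate 95% CI of 8.0 to 12.0.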
