Identifying the barriers and enablers for a triage, treatment, and transfer clinical intervention to manage acute stroke patients in the emergency department: A systematic review using the theoretical domains framework (TDF)
Background
Clinical guidelines recommend that assessment and management of patients with stroke commence early, including in the emergency department (ED). To inform the development of an implementation intervention targeted at the ED, we conducted a systematic review of qualitative and quantitative studies to identify barriers and enablers relevant to six key clinical behaviours in acute stroke care: appropriate triage, thrombolysis administration, monitoring and management of temperature, blood glucose levels, and swallowing difficulties, and transfer of stroke patients from the ED.
Methods
We included studies of any design, conducted in the ED, in which barriers or enablers based on primary data were identified for one or more of these six clinical behaviours. Major biomedical databases (CINAHL, OVID SP EMBASE, OVID SP MEDLINE) were searched using comprehensive search strategies. The barriers and enablers were categorised using the theoretical domains framework (TDF). For each reported enabler, the behaviour change technique (BCT) that best aligned with the strategy the enabler represented was selected using a standard taxonomy.
Results
Five qualitative studies and four surveys out of the 44 studies identified met the selection criteria. The majority of barriers reported corresponded with the TDF domains of “environmental context and resources” (such as stressful working conditions or lack of resources) and “knowledge” (such as lack of guideline awareness or familiarity). The majority of enablers corresponded with the domains of “knowledge” (such as education for physicians on the calculated risk of haemorrhage following intravenous thrombolysis [tPA]) and “skills” (such as providing opportunity to treat stroke cases of varying complexity). In total, 18 BCTs were assigned. The BCTs most frequently assigned to the reported enablers were “focus on past success” and “information about health consequences.”
Conclusions
Barriers and enablers for the delivery of key evidence-based protocols in an emergency setting have been identified and interpreted within a relevant theoretical framework. This new knowledge has since been used to select specific BCTs to implement evidence-based care in an ED setting. It is recommended that similar future reviews adopt the same theoretical approach, in particular the use of existing matrices to assist the selection of relevant BCTs.
Can a Satellite-Derived Estimate of the Fraction of PAR Absorbed by Chlorophyll (FAPAR(sub chl)) Improve Predictions of Light-Use Efficiency and Ecosystem Photosynthesis for a Boreal Aspen Forest?
Gross primary production (GPP) is a key terrestrial ecophysiological process that links atmospheric composition and vegetation processes. Study of GPP is important to global carbon cycles and global warming. One of the most important of these processes, plant photosynthesis, requires solar radiation in the 0.4-0.7 micron range (also known as photosynthetically active radiation or PAR), water, carbon dioxide (CO2), and nutrients. A vegetation canopy is composed primarily of photosynthetically active vegetation (PAV) and non-photosynthetic vegetation (NPV; e.g., senescent foliage, branches and stems). A green leaf is composed of chlorophyll and various proportions of nonphotosynthetic components (e.g., other pigments in the leaf, primary/secondary/tertiary veins, and cell walls). The fraction of PAR absorbed by the whole vegetation canopy (FAPAR(sub canopy)) has been widely used in satellite-based Production Efficiency Models to estimate GPP (as a product of FAPAR(sub canopy) x PAR x LUE(sub canopy), where LUE(sub canopy) is light use efficiency at canopy level). However, only the PAR absorbed by chlorophyll (a product of FAPAR(sub chl) x PAR) is used for photosynthesis. Therefore, remote sensing driven biogeochemical models that use FAPAR(sub chl) in estimating GPP (as a product of FAPAR(sub chl) x PAR x LUE(sub chl)) are more likely to be consistent with plant photosynthesis processes.
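The two formulations can be contrasted in a short numerical sketch. All values below are hypothetical illustrations, not measurements from the study; the point is only that the chlorophyll-based formulation pairs a smaller absorbed-PAR fraction with a correspondingly different light-use efficiency:

```python
def gpp(fapar: float, par: float, lue: float) -> float:
    """Gross primary production as absorbed PAR times light-use efficiency."""
    return fapar * par * lue

# Hypothetical values for a forest canopy (illustrative only).
par = 40.0            # incident PAR, mol photons m^-2 day^-1
fapar_canopy = 0.80   # fraction of PAR absorbed by the whole canopy
fapar_chl = 0.55      # fraction of PAR absorbed by chlorophyll only

lue_canopy = 0.020    # mol C per mol quanta absorbed by the canopy
lue_chl = 0.029       # mol C per mol quanta absorbed by chlorophyll

print(gpp(fapar_canopy, par, lue_canopy))  # canopy-based GPP estimate
print(gpp(fapar_chl, par, lue_chl))        # chlorophyll-based GPP estimate
```

Because only the chlorophyll-absorbed portion of PAR drives photosynthesis, the two GPP estimates agree only if LUE(sub canopy) is retuned to absorb the difference, which is the inconsistency the FAPAR(sub chl) approach avoids.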
The Photochemical Reflectance Index from Directional Cornfield Reflectances: Observations and Simulations
The two-layer Markov chain Analytical Canopy Reflectance Model (ACRM) was linked with in situ hyperspectral leaf optical properties to simulate the Photochemical Reflectance Index (PRI) for a corn crop canopy at three different growth stages. This is an extended study after a successful demonstration of PRI simulations for a cornfield previously conducted at an early vegetative growth stage. Consistent with previous in situ studies, sunlit leaves exhibited lower PRI values than shaded leaves. Since sunlit (shaded) foliage dominates the canopy in the reflectance hotspot (coldspot), the canopy PRI derived from field hyperspectral observations displayed sensitivity to both view zenith angle and relative azimuth angle at all growth stages. Consequently, sunlit and shaded canopy sectors were most differentiated when viewed along the azimuth matching the solar principal plane. These directional PRI responses associated with sunlit/shaded foliage were successfully reproduced by the ACRM. As before, the simulated PRI values from the current study were closer to in situ values when both sunlit and shaded leaves were utilized as model input data in a two-layer mode, instead of a one-layer mode with sunlit leaves only. Model performance as judged by correlation between in situ and simulated values was strongest for the mature corn crop (r = 0.87, RMSE = 0.0048), followed by the early vegetative stage (r = 0.78; RMSE = 0.0051) and the early senescent stage (r = 0.65; RMSE = 0.0104). Since the benefit of including shaded leaves in the scheme varied across different growth stages, a further analysis was conducted to investigate how variable fractions of sunlit/shaded leaves affect the canopy PRI values expected for a cornfield, with implications for remote sensing monitoring options. Simulations with sunlit/shaded canopy ratios near 50/50 +/- 10 (e.g., 60/40), matching field observations at all growth stages, were examined.
Our results suggest the importance of the sunlit/shaded fraction and canopy structure in understanding and interpreting PRI.
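The PRI itself is commonly computed as a normalized difference of reflectance at 531 nm and a 570 nm reference band. A minimal sketch (the reflectance values below are hypothetical, chosen only to mirror the sunlit-lower-than-shaded pattern reported above):

```python
def pri(r531: float, r570: float) -> float:
    """Photochemical Reflectance Index: (R531 - R570) / (R531 + R570)."""
    return (r531 - r570) / (r531 + r570)

# Hypothetical leaf reflectances (not study data): sunlit leaves tend to
# show lower (more negative) PRI than shaded leaves.
print(pri(0.048, 0.050))  # sunlit-like leaf, slightly negative
print(pri(0.050, 0.049))  # shaded-like leaf, slightly positive
```

Since a canopy-level measurement mixes sunlit and shaded sectors in view-angle-dependent proportions, this definition makes the directional sensitivity described above unavoidable.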
Canopy Level Chlorophyll Fluorescence and the PRI in a Cornfield
Two bio-indicators, the Photochemical Reflectance Index (PRI) and solar-induced red and far-red Chlorophyll Fluorescence (SIF), were derived from directional hyperspectral observations and studied in a cornfield on two contrasting days in the growing season. Both red and far-red SIF exhibited higher values on the day when the canopy was in the early senescent stage, but only the far-red SIF showed sensitivity to viewing geometry. Consequently, the red/far-red SIF ratio varied greatly among azimuth positions, while the largest values were obtained for the "hotspot" at both growth stages. This ratio was lower (approx. 0.88 +/- 0.4) in early July than in August, when the ratio approached equivalence (near approx. 1). In concert, the PRI exhibited stronger responses to both zenith and azimuth angles and different values at the two growth stages. The potential of using these indices to monitor photosynthetic activities needs further investigation.
Arctic Tundra Vegetation Functional Types Based on Photosynthetic Physiology and Optical Properties
Non-vascular plants (lichens and mosses) are significant components of tundra landscapes and may respond to climate change differently from vascular plants, affecting ecosystem carbon balance. Remote sensing provides critical tools for monitoring plant cover types, as optical signals provide a way to scale from plot measurements to regional estimates of biophysical properties, for which spatial-temporal patterns may be analyzed. Gas exchange measurements were collected for pure patches of key vegetation functional types (lichens, mosses, and vascular plants) in sedge tundra at Barrow, AK. These functional types were found to have three significantly different values of light use efficiency (LUE), with values of 0.013 plus or minus 0.0002, 0.0018 plus or minus 0.0002, and 0.0012 plus or minus 0.0001 mol C mol(exp -1) absorbed quanta for vascular plants, mosses and lichens, respectively. Discriminant analysis of the spectral reflectance of these patches identified five spectral bands that separated each of these vegetation functional types as well as non-green material (bare soil, standing water, and dead leaves). These results were tested along a 100 m transect where midsummer spectral reflectance and vegetation coverage were measured at one meter intervals. Along the transect, area-averaged canopy LUE estimated from coverage fractions of the three functional types varied widely, even over short distances. The patch-level statistical discriminant functions applied to in situ hyperspectral reflectance data collected along the transect successfully unmixed cover fractions of the vegetation functional types. The unmixing functions, developed from the transect data, were applied to 30 m spatial resolution Earth Observing-1 Hyperion imaging spectrometer data to examine variability in distribution of the vegetation functional types for an area near Barrow, AK. Spatial variability of LUE was derived from the observed functional type distributions.
Across this landscape, a fivefold variation in tundra LUE was observed. LUE calculated from the functional type cover fractions was also correlated with a spectral vegetation index developed to detect vegetation chlorophyll content. The concurrence of these alternate methods suggests that hyperspectral remote sensing can distinguish functionally distinct vegetation types and can be used to develop regional estimates of photosynthetic LUE in tundra landscapes.
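The unmixing step described above can be sketched as a linear mixture inversion. The endmember spectra and cover fractions below are hypothetical stand-ins (the study used statistical discriminant functions derived from its own patch spectra, not these numbers); least squares is used here as a simple illustrative solver:

```python
import numpy as np

# Hypothetical endmember reflectances in 5 bands (rows) for 4 cover types
# (columns): vascular plants, mosses, lichens, non-green material.
E = np.array([
    [0.04, 0.06, 0.10, 0.20],
    [0.08, 0.07, 0.12, 0.22],
    [0.35, 0.25, 0.18, 0.24],
    [0.40, 0.30, 0.20, 0.25],
    [0.42, 0.32, 0.22, 0.26],
])

# Build a synthetic mixed-pixel spectrum from known cover fractions.
f_true = np.array([0.5, 0.2, 0.2, 0.1])
mixed = E @ f_true

# Invert the linear mixture by least squares to recover the fractions.
f_est, *_ = np.linalg.lstsq(E, mixed, rcond=None)
print(np.round(f_est, 3))  # recovered fractions, ~[0.5, 0.2, 0.2, 0.1]
```

With recovered fractions in hand, an area-averaged LUE follows as the fraction-weighted sum of the per-type LUE values reported above.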
Hydration and nutrition care practices in stroke: findings from the UK and Australia
BACKGROUND: Dehydration and malnutrition are common in hospitalised patients following stroke, leading to poor outcomes including increased mortality. Little is known about the hydration and nutrition care practices used in hospital to avoid dehydration or malnutrition, and how these practices vary between countries. This study sought to capture how the hydration and nutrition needs of patients post-stroke are assessed and managed in the United Kingdom (UK) and Australia (AUS). AIM: To examine and compare current in-hospital hydration and nutrition care practice for patients with stroke in the UK and Australia. METHODS: A cross-sectional survey was conducted between April and November 2019. Questionnaires were mailed to stroke specialist nurses in UK and Australian hospitals providing post-stroke inpatient acute care or rehabilitation. Non-respondents were contacted up to five times. RESULTS: We received 150/174 (86%) completed surveys from hospitals in the UK, and 120/162 (74%) in Australia. Of the 270 responding hospitals, 96% reported undertaking assessment of hydration status during an admission, with nurses most likely to complete assessments (85%). The most common methods of admission assessment were visual assessment of the patient (UK 62%; AUS 58%), weight (UK 52%; AUS 52%), and body mass index (UK 47%; AUS 42%). Almost all (99%) sites reported that nutrition status was assessed at some point during admission, and these assessments were mainly completed by nurses (91%). Use of standardised nutrition screening tools was more common in the UK (91%) than in Australia (60%). Similar proportions of hydration management decisions were made by physicians (UK 84%; AUS 83%), and of nutrition management decisions by dietitians (UK 98%; AUS 97%). CONCLUSION: Despite broadly similar hydration and nutrition care practices after stroke in the UK and Australia, some variability was identified.
Although nutrition assessment was more often informed by structured screening tools, the routine assessment of hydration generally was not. Nurses were responsible for assessment and monitoring, while dietitians and physicians undertook decision-making regarding management. Hydration care could be improved through the development of standardised assessment tools. This study highlights the need for increased implementation and use of evidence-based protocols in stroke hydration and nutrition care to improve patient outcomes.
Comparison of Measurements and FluorMOD Simulations for Solar Induced Chlorophyll Fluorescence and Reflectance of a Corn Crop under Nitrogen Treatments [SIF and Reflectance for Corn]
The FLuorescence Explorer (FLEX) satellite concept is one of six semifinalist mission proposals selected in 2006 for pre-Phase studies by the European Space Agency (ESA). The FLEX concept proposes to measure passive solar induced chlorophyll fluorescence (SIF) of terrestrial ecosystems. A new spectral vegetation Fluorescence Model (FluorMOD) was developed to include the effects of steady state SIF on canopy reflectance. We used our laboratory and field measurements previously acquired from foliage and canopies of corn (Zea mays L.) under controlled nitrogen (N) fertilization to parameterize and evaluate FluorMOD. Our data included biophysical properties, fluorescence (F) and reflectance spectra for leaves; reflectance spectra of canopies and soil; solar irradiance; plot-level leaf area index; and canopy SIF emissions determined using the Fraunhofer Line Depth principle for the atmospheric telluric oxygen absorption features at 688 nm (O2-beta) and 760 nm (O2-alpha). FluorMOD simulations implemented in the default "look-up-table" mode did not reproduce the observed magnitudes of leaf F, canopy SIF, or canopy reflectance. However, simulations for all of these parameters agreed with observations when the default FluorMOD information was replaced with measurements, although N treatment responses were underestimated. Recommendations were provided to enhance FluorMOD's potential utility in support of SIF field experiments and studies of agriculture and ecosystems.
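The Fraunhofer Line Depth retrieval mentioned above estimates fluorescence by comparing irradiance and radiance inside and outside a narrow absorption feature. A minimal sketch of the standard two-band FLD formula, with purely illustrative numbers (not the study's measurements):

```python
def fld_sif(e_in: float, e_out: float, l_in: float, l_out: float) -> float:
    """Two-band Fraunhofer Line Depth retrieval of solar-induced fluorescence.

    e_in/e_out: solar irradiance inside/outside the absorption feature;
    l_in/l_out: canopy radiance inside/outside the feature (same units).
    """
    return (e_out * l_in - e_in * l_out) / (e_out - e_in)

# Hypothetical values for the O2-alpha (760 nm) feature, arbitrary units.
# The absorption line darkens the irradiance far more than the radiance,
# and the residual in-line radiance is attributed to fluorescence.
print(fld_sif(e_in=20.0, e_out=100.0, l_in=5.0, l_out=20.0))  # -> 1.25
```

The method works because fluorescence "fills in" the telluric absorption line: without SIF, the in-line radiance would drop in the same proportion as the in-line irradiance.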
Impact assessment of the Centre for Research Excellence in Stroke Rehabilitation and Brain Recovery
Background: Research impact is an emerging measure of research achievement alongside traditional academic outputs such as publications. We present the results of applying the Framework to Assess the Impact from Translational health research (FAIT) to the Centre for Research Excellence (CRE) in Stroke Rehabilitation and Brain Recovery (CRE-Stroke, 2014–2019) and report on the feasibility and lessons from the application of FAIT to a CRE rather than a discrete research project.
Methods: Data were gathered via online surveys, in-depth interviews, document analysis and review of relevant websites/databases to report on the three major FAIT methods: the modified Payback Framework, an assessment of costs against monetized consequences, and a narrative account of the impact generated from CRE-Stroke activities. FAIT was applied during the last 4 years of CRE-Stroke operation.
Results: CRE-Stroke generated AU$18.8 million in leveraged grants, fellowships and consultancies. Collectively, CRE-Stroke members produced 354 publications that were accessed 470,000 times and cited over 7220 times. CRE-Stroke supported 26 PhDs, 39 postdocs and seven novice clinician researchers. There were 59 capacity-building events benefiting 744 individuals, including policy-makers and consumers. CRE-Stroke created research infrastructure (including a research register of stroke survivors and a brain biobank), and its global leadership produced international consensus recommendations to influence the stroke research landscape worldwide. Members contributed to the Australian Living Stroke Guidelines: four researchers' outputs were directly referenced. Based only on the consequences that could be monetized, CRE-Stroke returned AU$4.82 for every dollar invested in the CRE.
Conclusion: This case example in the developing field of impact assessment illustrates how researchers can use evidence to demonstrate and report the impact of, and returns on, research investment. The prospective application of FAIT by a dedicated research impact team demonstrated impact in the broad categories of knowledge-gain, capacity-building, new infrastructure, input to policy and economic benefits. The methods can be used by other research teams to provide comprehensive evidence to governments and other research funders about what has been generated from their research investment, but they require dedicated resources to complete.
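The "AU$4.82 per dollar invested" figure above is a benefit-cost ratio over the monetized consequences. A trivial sketch with hypothetical totals (the CRE's actual investment and monetized-benefit figures are not given in this summary):

```python
def benefit_cost_ratio(monetized_benefits: float, investment: float) -> float:
    """Dollars of monetized benefit returned per dollar invested."""
    return monetized_benefits / investment

# Hypothetical totals chosen only to reproduce the reported ratio:
# a ratio of 4.82 means AU$4.82 returned per AU$1 invested.
print(round(benefit_cost_ratio(48.2, 10.0), 2))  # -> 4.82
```

As the abstract notes, this understates the full return, since only consequences that could be monetized enter the numerator.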
How registry data are used to inform activities for stroke care quality improvement across 55 countries: A cross-sectional survey of Registry of Stroke Care Quality (RES-Q) hospitals
Background and purpose
The Registry of Stroke Care Quality (RES-Q) is a worldwide quality improvement data platform that captures performance and quality measures, enabling standardized comparisons of hospital care. The aim of this study was to determine if, and how, RES-Q data are used to influence stroke quality improvement and identify the support and educational needs of clinicians using RES-Q data to improve stroke care.
Methods
A cross-sectional self-administered online survey was administered (October 2021–February 2022). Participants were RES-Q hospital local coordinators responsible for stroke data collection. Descriptive statistics are presented.
Results
Surveys were sent to 1463 hospitals in 74 countries; responses were received from 358 hospitals in 55 countries (response rate 25%). RES-Q data were used “always” or “often” to: develop quality improvement initiatives (n = 213, 60%); track stroke care quality over time (n = 207, 58%); improve local practice (n = 191, 53%); and benchmark against evidence-based policies, procedures and/or guidelines to identify practice gaps (n = 179, 50%). Formal training in the use of RES-Q tools and data was the support need most frequently identified by respondents (n = 165, 46%). Over half “strongly agreed” or “agreed” that to support clinical practice change, education is needed on: (i) using data to identify evidence–practice gaps (n = 259, 72%) and change clinical practice (n = 263, 74%), and (ii) quality improvement science and methods (n = 255, 71%).
Conclusion
RES-Q data are used for monitoring stroke care performance. However, to facilitate their optimal use, effective quality improvement methods are needed. Educating staff in quality improvement science may develop competency and improve the use of data in practice.
Levonorgestrel-releasing intrauterine system vs. usual medical treatment for menorrhagia: An economic evaluation alongside a randomised controlled trial
Objective: To undertake an economic evaluation alongside the largest randomised controlled trial comparing the Levonorgestrel-releasing intrauterine system ('LNG-IUS') and usual medical treatment for women with menorrhagia in primary care; and to compare the cost-effectiveness findings using two alternative measures of quality of life. Methods: 571 women with menorrhagia from 63 UK centres were randomised between February 2005 and July 2009. Women were randomised to having a LNG-IUS fitted, or usual medical treatment, after discussing with their general practitioner their contraceptive needs or desire to avoid hormonal treatment. The treatment was specified prior to randomisation. For the economic evaluation we developed a state transition (Markov) model with a 24 month follow-up. The model structure was informed by the trial women's pathway and clinical experts. The economic evaluation adopted a UK National Health Service perspective and was based on an outcome of incremental cost per Quality Adjusted Life Year (QALY) estimated using both EQ-5D and SF-6D. Results: Using EQ-5D, LNG-IUS was the most cost-effective treatment for menorrhagia. LNG-IUS costs £100 more than usual medical treatment but generated 0.07 more QALYs. The incremental cost-effectiveness ratio for LNG-IUS compared to usual medical treatment was £1600 per additional QALY. Using SF-6D, usual medical treatment was the most cost-effective treatment. Usual medical treatment was both less costly (£100) and generated 0.002 more QALYs. Conclusion: Impact on quality of life is the primary indicator of treatment success in menorrhagia. However, the most cost-effective treatment differs depending on the quality of life measure used to estimate the QALY. Under UK guidelines LNG-IUS would be the recommended treatment for menorrhagia. This study demonstrates that the appropriate valuation of outcomes in menorrhagia is crucial. Copyright: © 2014 Sanghera et al.
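The incremental cost-effectiveness ratio used above is simply the incremental cost divided by the incremental QALYs. A minimal sketch using the rounded figures quoted in the abstract (note the reported £1600 per QALY comes from the unrounded trial values, so these rounded inputs give a slightly different number):

```python
def icer(cost_new: float, cost_old: float,
         qaly_new: float, qaly_old: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Rounded EQ-5D figures from the abstract: LNG-IUS costs ~100 GBP more
# and yields ~0.07 more QALYs than usual medical treatment.
print(icer(100.0, 0.0, 0.07, 0.0))  # roughly 1430 GBP per QALY
```

The sign matters: under SF-6D, usual medical treatment was both cheaper and yielded more QALYs, so it dominates and no ICER threshold comparison is needed.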