Longitudinal Impact of Vision Impairment on Concern About Falling in People With Age-Related Macular Degeneration
Purpose: To explore the longitudinal impact of central vision loss on concern about falling (CF), over a 12-month period, in people with age-related macular degeneration (AMD). Methods: Participants included 60 community-dwelling older people (age, 79.7 ± 6.4 years) with central vision impairment due to AMD. Binocular high-contrast visual acuity, contrast sensitivity, and visual fields were assessed at baseline and at 12 months. CF was assessed at both time points using the Falls Efficacy Scale–International (FES-I). Sensorimotor function (sit to stand, knee extension, postural sway, and walking speed) and neuropsychological function (reaction time, symptoms of anxiety and depression) were also assessed at both time points using validated instruments. Falls data were collected using monthly diaries during the 12 months. Results: CF increased by a small but significant amount over the 12-month follow-up (2.1 units; P = 0.01), with the prevalence of high levels of CF (FES-I score ≥ 23) increasing from 48% at baseline to 65% at 12 months. Linear mixed models showed that reduced contrast sensitivity was significantly associated with increased CF (P = 0.004), whereas declines in both visual acuity and contrast sensitivity during the follow-up period were associated with increases in CF over the 12-month follow-up (P = 0.041 and P = 0.054, respectively), independent of age, gender, falls history, and number of comorbidities. Conclusions: Higher levels of CF are common in older people with AMD, and levels increase over time; this increase is associated with declines in both visual acuity and contrast sensitivity. These findings highlight the need for regular assessment of both visual acuity and contrast sensitivity to identify those at greatest risk of developing higher CF. Translational Relevance: Routine assessment of visual acuity and contrast sensitivity in older people with AMD will assist in identifying those at risk of developing high CF.
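The adjusted associations reported above come from linear mixed models of repeated FES-I scores. As a purely illustrative sketch (not the authors' analysis code), the snippet below shows how such a model might be specified in Python with statsmodels, assuming a hypothetical long-format dataset; the file name and all column names are invented.

```python
# Purely illustrative sketch of a linear mixed model for concern about falling
# (FES-I); the file and column names below are invented, not the study's data.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per participant per visit, with columns
# participant_id, visit (0 or 12 months), fes_i, contrast_sensitivity,
# visual_acuity, age, gender, falls_history, comorbidities.
df = pd.read_csv("amd_cf_long.csv")

# Random intercept per participant; fixed effects for time, vision measures, and
# the kinds of covariates the abstract describes adjusting for.
model = smf.mixedlm(
    "fes_i ~ visit + contrast_sensitivity + visual_acuity"
    " + age + gender + falls_history + comorbidities",
    data=df,
    groups=df["participant_id"],
)
result = model.fit()
print(result.summary())
```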
Visual Predictors of Postural Sway in Older Adults
Purpose: Accurate perception of body position relative to the environment through visual cues provides sensory input for the control of postural stability. This study explored which vision measures are most important for the control of postural sway in older adults with a range of visual characteristics. Methods: Participants included 421 older adults (mean age, 72.6 ± 6.1 years), 220 with vision impairment associated with a range of eye diseases and 201 with normal vision. Participants completed a series of vision, cognitive, and physical function tests. Postural sway was measured using an electronic forceplate (HUR Labs) on a foam surface with eyes open. Linear regression analysis identified the strongest visual predictors of postural sway, controlling for potential confounding factors, including cognitive and physical function. Results: In univariate regression models, both unadjusted and adjusted for age, all of the vision tests were significantly associated with postural sway (P < 0.05), with the strongest predictor being visual motion sensitivity (standardized regression coefficient, β = 0.340; age-adjusted β = 0.253). In multiple regression models adjusted for confounding factors, motion sensitivity (β = 0.187), integrated binocular visual fields (β = −0.109), and age (β = 0.234) were the only significant predictors of sway, together explaining 23% of the variance in postural sway. Conclusions: Of the vision tests, visual motion perception and binocular visual fields were most strongly associated with postural stability in older adults with and without vision impairment. Translational Relevance: Findings provide insight into the visual contributions to postural stability in older adults and have implications for falls risk assessment.
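The standardized regression coefficients (β) and the 23% explained variance reported above correspond to a multiple regression fitted to z-scored variables. The following sketch is illustrative only, with invented file and column names and only three of the study's predictors, and shows one way such standardized coefficients might be obtained in Python.

```python
# Purely illustrative sketch of a standardized multiple regression for postural
# sway; file and column names are invented, not the study's data.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("postural_sway.csv")
cols = ["sway_path", "motion_sensitivity", "binocular_visual_field", "age"]

# z-score the outcome and predictors so fitted coefficients are standardized betas.
z = (df[cols] - df[cols].mean()) / df[cols].std()

X = sm.add_constant(z[["motion_sensitivity", "binocular_visual_field", "age"]])
fit = sm.OLS(z["sway_path"], X).fit()

print(fit.params)     # standardized regression coefficients (beta)
print(fit.rsquared)   # proportion of variance in postural sway explained
```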
Portfolio-based appraisal: superficial or useful?
This paper outlines the growing role played by performance appraisal, supported by learning portfolios, within medical regulation. It investigates whether these are superficial or useful tools. In doing so, it argues that caution must be exercised in promoting such tools to help modernise medical regulatory frameworks.
The use of driver screening tools to predict self-reported crashes and incidents in older drivers
There is a clear need to identify older drivers at increased crash risk without placing additional burden on the individual or the licensing system. Brief off-road screening tools have been used to identify unsafe drivers and drivers at risk of losing their license. The aim of the current study was to evaluate and compare driver screening tools in predicting prospective self-reported crashes and incidents over 24 months in drivers aged 60 years and older. A total of 525 drivers aged 63–96 years participated in the prospective Driving Aging Safety and Health (DASH) study, completing an on-road driving assessment and seven off-road screening tools (Multi-D battery, Useful Field of View, 14-Item Road Law, Drive Safe, Drive Safe Intersection, Maze Test, and the Hazard Perception Test (HPT)), along with monthly self-report diaries on crashes and incidents over a 24-month period. Over the 24 months, 22% of older drivers reported at least one crash, while 42% reported at least one significant incident (e.g., near miss). As expected, passing the on-road driving assessment was associated with a 55% reduction in self-reported crash rate, adjusting for exposure [IRR 0.45, 95% CI 0.29–0.71], but was not associated with a reduced rate of significant incidents. Of the off-road screening tools, poorer performance on the Multi-D test battery was associated with a 22% increase in crash rate over 24 months [IRR 1.22, 95% CI 1.08–1.37]; none of the other off-road screening tools was predictive of the rates of crashes or incidents reported prospectively. The finding that only the Multi-D battery was predictive of increased crash rate highlights the importance of accounting for age-related changes in vision, sensorimotor skills, and cognition, as well as driving exposure, when using off-road screening tools to assess future crash risk in older drivers.
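The rate comparisons above (IRRs for crashes over 24 months, adjusted for exposure) are the kind of output produced by a count-regression model with an exposure offset. The sketch below is purely illustrative, uses invented file and column names rather than the DASH data, and shows how such a model might be fitted in Python with statsmodels.

```python
# Purely illustrative sketch of a crash-rate model with an exposure offset,
# reporting incidence rate ratios (IRRs); file and column names are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical columns: crashes (count over 24 months), km_driven (exposure),
# multi_d_score, passed_on_road (0/1), age.
df = pd.read_csv("driver_screening.csv")

# Poisson regression with log(exposure) as an offset, so coefficients describe rates.
model = smf.glm(
    "crashes ~ multi_d_score + passed_on_road + age",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["km_driven"]),
).fit()

print(np.exp(model.params))      # incidence rate ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals on the IRR scale
```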
Exploring perceptions of Advanced Driver Assistance Systems (ADAS) in older drivers with age-related declines
Perceptions of Advanced Driver Assistance Systems (ADAS) were explored in two semi-structured, face-to-face focus group studies of 42 older drivers (aged 65 years and older) with and without age-related declines. Study 1 explored perceptions regarding ADAS, focusing on visual, auditory, physical, and cognitive factors. Study 2 extended this by additionally exploring perceptions following exposure to videos and stationary vehicle demonstrations of an ADAS. Participants had a range of visual, hearing, memory, and health characteristics that affected their daily lives. In both studies, some participants had prior insights regarding various ADAS technologies, but many were unfamiliar with these systems. Nevertheless, overall, participants reported that ADAS would assist them to drive as they age and would increase their mobility and independence. There were comments regarding the benefits of warning alerts, although their potential to be distracting was also highlighted. Participants with vision impairment preferred audio alerts, and participants with hearing impairment preferred visual display alerts. Findings highlighted the potential for ADAS to assist those with age-related declines and the need to increase the flexibility of warning system alerts to suit the varying requirements of older drivers, as well as to reduce the complexity of vehicle interfaces. Collectively, these strategies would maximize the benefits of these vehicles and increase the mobility, independence, and quality of life of older drivers with and without age-related declines.
Is carbon dioxide pricing a driver in concrete mix design?
The global cement industry is responsible for 7% of anthropogenic carbon dioxide emissions and, as such, has a vital role to play in the transition to a low carbon dioxide economy. In recent years, this has been achieved by technological advances and the increased use of supplementary cementitious materials, but the authors have recently shown that there are other means of achieving comparable carbon dioxide savings, for example, by reducing workability. However, price remains a considerable barrier to the widespread implementation of low carbon dioxide concrete. Using the same model for concrete mix design as was used to determine embodied carbon dioxide (ECD), variations in the cost of the components of concrete have now been considered. Across 24 different mix designs, each spanning a range of characteristic strengths from 20 to 100 MPa, measures to reduce the carbon dioxide footprint were also found to reduce the material cost of the concrete. As such, the construction industry may be considered to be already encouraged to reduce its ‘carbon footprint’. However, the concept of the carbon footprint was then examined in a more nuanced fashion, in terms of the ECD per unit strength. On that basis, the cheapest mixes did not have the lowest ECD. Therefore, the impact of levying a charge on the carbon footprint was considered. To ensure that low carbon dioxide concrete is also the cheapest, carbon dioxide emissions would have to be priced approximately one to two orders of magnitude higher than the current market value. This would become the dominant factor in construction costs, with serious consequences for the industry. Furthermore, such charges may pose ethical problems, being viewed as a ‘licence to pollute’ and therefore undermining society's efforts to reduce the carbon dioxide emissions of the construction industry.
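The claim that carbon dioxide would need to be priced one to two orders of magnitude above current market value rests on a simple break-even calculation: total cost = material cost + carbon price × ECD, with the break-even price being the point at which the lower-ECD mix becomes the cheaper option. The sketch below illustrates that arithmetic with invented figures only; it does not use the paper's mix designs or cost data.

```python
# Purely illustrative break-even calculation for a carbon charge on concrete;
# the mix costs and embodied CO2 (ECD) figures below are invented, not the paper's.
def total_cost(material_cost, ecd_tonnes, carbon_price):
    """Total cost per m^3 = material cost + carbon price x embodied CO2."""
    return material_cost + carbon_price * ecd_tonnes

cheap_mix = {"material_cost": 80.0, "ecd_tonnes": 0.30}    # cheaper, higher ECD
low_co2_mix = {"material_cost": 95.0, "ecd_tonnes": 0.20}  # dearer, lower ECD

# Carbon price at which the low-CO2 mix becomes the cheaper option:
# material_cheap + p * ecd_cheap = material_low + p * ecd_low  =>  solve for p.
break_even = (low_co2_mix["material_cost"] - cheap_mix["material_cost"]) / (
    cheap_mix["ecd_tonnes"] - low_co2_mix["ecd_tonnes"]
)
print(f"Break-even carbon price: {break_even:.0f} per tonne of CO2")

for price in (25, 150, 1500):  # roughly market-level vs. much higher charges
    c1 = total_cost(cheap_mix["material_cost"], cheap_mix["ecd_tonnes"], price)
    c2 = total_cost(low_co2_mix["material_cost"], low_co2_mix["ecd_tonnes"], price)
    print(f"price={price}: cheap mix={c1:.0f}, low-CO2 mix={c2:.0f}")
```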
Contact force sensing in ablation of ventricular arrhythmias using a 56-hole open-irrigation catheter: a propensity-matched analysis.
PURPOSE: The effect of adding contact force (CF) sensing to 56-hole tip irrigation in ventricular arrhythmia (VA) ablation has not been previously studied. We aimed to compare outcomes with and without CF sensing in VA ablation using a 56-hole radiofrequency (RF) catheter. METHODS: A total of 164 patients who underwent first-time VA ablation using the Thermocool SmartTouch Surround Flow (TC-STSF) catheter (Biosense Webster, Diamond Bar, CA, USA) were propensity-matched in a 1:1 fashion to 164 patients who had first-time ablation using the Thermocool Surround Flow (TC-SF) catheter. Patients were matched for age, gender, cardiac aetiology, ejection fraction and approach. Acute success, complications and long-term follow-up were compared. RESULTS: There was no difference between procedures using TC-SF and TC-STSF in acute success (TC-SF: 134/164 (82%), TC-STSF: 141/164 (86%), p = 0.3), complications (TC-SF: 11/164 (6.7%), TC-STSF: 11/164 (6.7%), p = 1.0) or VA-free survival (TC-SF: mean arrhythmia-free survival time = 5.9 years, 95% CI = 5.4-6.4; TC-STSF: mean = 3.2 years, 95% CI = 3-3.5; log-rank p = 0.74). Fluoroscopy time was longer in normal hearts with TC-SF (19 min, IQR: 14-30) than with TC-STSF (14 min, IQR: 8-25; p = 0.04). CONCLUSION: Both TC-SF and TC-STSF catheters are safe and effective in treating VAs. The use of CF-sensing catheters did not improve safety or acute and long-term outcomes, but it did reduce fluoroscopy time in VA ablation in normal hearts.
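The 1:1 propensity matching described above pairs each TC-STSF patient with a TC-SF patient who has a similar predicted probability of receiving the CF-sensing catheter, given the matching covariates. The following sketch is illustrative only (invented file and column names, matching with replacement for simplicity) and is not the authors' matching procedure.

```python
# Purely illustrative sketch of 1:1 nearest-neighbour propensity-score matching;
# the file and column names are invented and this is not the authors' procedure.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Hypothetical columns: tc_stsf (1 = CF-sensing catheter, 0 = TC-SF), age,
# gender (0/1), aetiology_code, ejection_fraction, approach_code.
df = pd.read_csv("va_ablation.csv")
covariates = ["age", "gender", "aetiology_code", "ejection_fraction", "approach_code"]

# Propensity score: predicted probability of receiving the CF-sensing catheter.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["tc_stsf"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

treated = df[df["tc_stsf"] == 1]
control = df[df["tc_stsf"] == 0]

# Match each treated patient to the control with the closest propensity score
# (with replacement, for simplicity of the sketch).
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])

print(matched.groupby("tc_stsf")[covariates].mean())  # crude balance check
```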
Patient-reported outcomes: pathways to better health, better services, and better societies
While the use of PROs in research is well established, many challenges lie ahead as their use is extended to other applications. There is consensus that health outcome evaluations that include PROs along with clinician-reported outcomes and administrative data are necessary to inform clinical and policy decisions. The initiatives presented in this paper underline evolving recognition that PROs play a unique role in adding the patient perspective alongside clinical (e.g., blood pressure) and organizational (e.g., admission rates) indicators for evaluating the effects of new products, selecting treatments, evaluating quality of care, and monitoring the health of the population. In this paper, we first explore the use of PRO measures to support drug approval and labeling claims. We critically evaluate the evidence and challenges associated with using PRO measures to improve healthcare delivery at individual and population levels. We further discuss the challenges associated with selecting from the abundance of measures available, the opportunities afforded by agreeing on common metrics for constructs of interest, and the importance of establishing an evidence base that supports integrating PRO measures across the healthcare system to improve outcomes. We conclude that the integration of PROs as a key end point within individual patient care, healthcare organization and program performance evaluations, and population surveillance will be essential for evaluating whether increased healthcare expenditure is translating into better health outcomes. Jose M. Valderas was supported by an NIHR Clinician Scientist Award (NIHR/CS/010/024).
The epiphyseal scar: changing perceptions in relation to skeletal age estimation.
BACKGROUND: It is imperative that all methods applied in skeletal age estimation, and the criteria on which they are based, have a strong evidential basis. The relationship between the persistence of epiphyseal scars and chronological age, however, has remained largely untested. AIMS: To assess the relationships between the level of persistence of the epiphyseal scar and chronological age, biological sex, and side of the body, in relation to the interpretation of epiphyseal scars in methods of skeletal age estimation. SUBJECTS AND METHODS: A sample of radiographic images was obtained from the Tayside NHS Trust, Ninewells Hospital, Dundee, UK. This included images of four anatomical regions from living female and male individuals aged between 20 and 50 years. RESULTS: Some remnant of an epiphyseal scar was found in 78-99% of the individuals examined in this study. The level of persistence of epiphyseal scars was also found to vary between anatomical regions. CONCLUSION: The overall relationship between chronological age and the level of persistence or obliteration of the epiphyseal scar was found to be of insufficient strength to support a causative link. Caution is therefore necessary when interpreting epiphyseal scars in skeletal age estimation practices.