
    Traumatic rhabdomyolysis (crush syndrome) in the rural setting

    Background. Patients with traumatic rhabdomyolysis (crush syndrome, CS) secondary to community beatings commonly present to rural emergency departments with limited access to dialysis services. We describe a retrospective study of patients admitted with a diagnosis of CS to the emergency department of a government hospital in rural KwaZulu-Natal between November 2008 and June 2009.

    Objectives. We assessed the identification and management of these patients, considering: (i) early adverse parameters used to identify poor prognosis; (ii) the importance of early recognition; and (iii) appropriate management with aggressive fluid therapy and alkaline diuresis to prevent progression to renal failure.

    Methods. Diagnosis was based on clinical suspicion and haematuria. Exclusion criteria included a blood creatine kinase level <1 000 U/l on admission. Data captured included demographics, the offending weapon, times of injury and of presentation to hospital, and admission laboratory results. Outcome measures included length of time in the resuscitation unit, subsequent movement to the main ward or dialysis unit, discharge from hospital, or death.

    Results. Forty-four patients were included in the study (41 male, 3 female), all presenting within 24 hours of injury: 27 were assaulted with sjamboks or sticks, 43 were discharged to the ward with normal or improving renal function, and 1 patient died.

    Conclusions. Serum potassium, creatinine, and creatine kinase levels were important early parameters for assessing CS severity; 43 patients (98%) had a favourable outcome, owing to early recognition and institution of appropriate therapy, which is vital in the absence of dialysis services. S Afr Med J 2012;102:37-3

    Just Keeping Hope: Margaret's Story

    A first-person narrative about life in Dayton, Ohio, composed as part of the Facing Project, a nationwide storytelling initiative.

    Experiences following cataract surgery - patient perspectives

    PURPOSE: Most patients report being highly satisfied with the outcome of cataract surgery, but there are variable reports regarding the impact of cataract surgery on some real-world activities, such as fall rates. We hypothesised that adaptation to a changed refractive correction and visual function may cause difficulties in undertaking everyday activities for some patients, and used a series of focus groups to explore this issue.

    METHOD: Qualitative methods were used to explore patients' experiences of their vision following cataract surgery, including adaptation to vision changes and their post-surgical spectacle prescription. Twenty-six participants took part in five focus groups (mean age = 68.2 ± 11.4 years), and the data were analysed using thematic analysis.

    RESULTS: We identified three themes. 'Changes to Vision' explores participants' adaptation following cataract surgery. While several had problems with tasks relying on binocular vision, few found them bothersome, and they resolved following second-eye surgery. Participants described a trial-and-error approach to solving these problems rather than applying solutions suggested by their eyecare professionals. 'Prescription Restrictions' describes the long-term vision problems that pre-surgery myopic patients experienced as a consequence of becoming emmetropic following surgery and thus needing spectacles for reading and other close work, which they did not need before surgery. Very few reported that they had the information or time to make a decision regarding their post-operative correction. 'Information Needs' describes participants' responses to the post-surgical information they were given, and the unmet information need regarding when they can drive following surgery.

    CONCLUSION: The findings highlight the need for clinicians to provide information on adaptation effects, assist patients to select the refractive outcome that best suits their lifestyle, and provide clear advice about when patients can start driving again. Patients need better guidance from clinicians, and prescribing guidelines for clinicians would be beneficial, particularly for the period between first- and second-eye surgery.

    Visual Predictors of Postural Sway in Older Adults

    Purpose: Accurate perception of body position relative to the environment through visual cues provides sensory input to the control of postural stability. This study explored which vision measures are most important for control of postural sway in older adults with a range of visual characteristics.

    Methods: Participants included 421 older adults (mean age = 72.6 ± 6.1 years), 220 with vision impairment associated with a range of eye diseases and 201 with normal vision. Participants completed a series of vision, cognitive, and physical function tests. Postural sway was measured using an electronic forceplate (HUR Labs) on a foam surface with eyes open. Linear regression analysis identified the strongest visual predictors of postural sway, controlling for potential confounding factors, including cognitive and physical function.

    Results: In univariate regression models, unadjusted and adjusted for age, all of the vision tests were significantly associated with postural sway (P < 0.05), with the strongest predictor being visual motion sensitivity (standardized regression coefficient β = 0.340; age-adjusted β = 0.253). In multiple regression models, motion sensitivity (β = 0.187), integrated binocular visual fields (β = −0.109), and age (β = 0.234) were the only significant predictors of sway, adjusted for confounding factors, together explaining 23% of the variance in postural sway.

    Conclusions: Of the vision tests, visual motion perception and binocular visual fields were most strongly associated with postural stability in older adults with and without vision impairment.

    Translational Relevance: Findings provide insight into the visual contributions to postural stability in older adults and have implications for falls risk assessment.
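The standardized coefficients (β) reported above put predictors measured in different units on a common scale: each variable is z-scored before the slope is estimated. For a single predictor, the standardized slope equals the Pearson correlation between predictor and outcome. A minimal sketch with hypothetical toy data (not the study's), assuming ordinary least squares:

```python
from statistics import mean, pstdev

def standardized_beta(x, y):
    """Standardized slope from regressing y on a single predictor x.

    Both variables are z-scored; the OLS slope of z-scored y on z-scored x
    reduces to the mean of the products (the variance of zx is 1).
    Toy illustration only, not the study's multivariable analysis.
    """
    zx = [(v - mean(x)) / pstdev(x) for v in x]
    zy = [(v - mean(y)) / pstdev(y) for v in y]
    return mean(a * b for a, b in zip(zx, zy))
```

For perfectly linear hypothetical data such as `x = [1, 2, 3, 4]`, `y = [2, 4, 6, 8]`, the standardized beta is 1.0; in a multiple regression like the one reported, each β is additionally adjusted for the other predictors.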

    The use of driver screening tools to predict self-reported crashes and incidents in older drivers

    There is a clear need to identify older drivers at increased crash risk, without additional burden on the individual or the licensing system. Brief off-road screening tools have been used to identify unsafe drivers and drivers at risk of losing their licence. The aim of the current study was to evaluate and compare driver screening tools in predicting prospective self-reported crashes and incidents over 24 months in drivers aged 60 years and older. A total of 525 drivers aged 63–96 years participated in the prospective Driving Aging Safety and Health (DASH) study, completing an on-road driving assessment and seven off-road screening tools (Multi-D battery, Useful Field of View, 14-Item Road Law, Drive Safe, Drive Safe Intersection, Maze Test, Hazard Perception Test (HPT)), along with monthly self-report diaries on crashes and incidents over a 24-month period. Over the 24 months, 22% of older drivers reported at least one crash, while 42% reported at least one significant incident (e.g., near miss). As expected, passing the on-road driving assessment was associated with a 55% [IRR 0.45, 95% CI 0.29–0.71] reduction in the rate of self-reported crashes, adjusting for driving exposure, but was not associated with a reduced rate of significant incidents. For the off-road screening tools, poorer performance on the Multi-D test battery was associated with a 22% [IRR 1.22, 95% CI 1.08–1.37] increase in crash rate over 24 months; none of the other off-road screening tools was predictive of prospectively reported crash or incident rates. The finding that only the Multi-D battery was predictive of increased crash rate highlights the importance of accounting for age-related changes in vision, sensorimotor skills, and cognition, as well as driving exposure, when using off-road screening tools to assess future crash risk in older drivers.
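The effect sizes above are incidence rate ratios (IRRs), and the percent changes quoted in the text follow directly from them: an IRR of 0.45 is a 55% reduction in rate, an IRR of 1.22 a 22% increase. A one-line sketch of that conversion (the IRR values come from the abstract; the function name is ours):

```python
def irr_to_percent_change(irr):
    """Percent change in event rate implied by an incidence rate ratio.

    Negative values are reductions, positive values increases.
    e.g. IRR 0.45 -> -55 (a 55% reduction); IRR 1.22 -> +22.
    """
    return (irr - 1.0) * 100.0
```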

    Exploring perceptions of Advanced Driver Assistance Systems (ADAS) in older drivers with age-related declines

    Perceptions of Advanced Driver Assistance Systems (ADAS) were explored in two semi-structured face-to-face focus group studies of 42 older drivers (aged 65 years and older) with and without age-related declines. Study 1 explored perceptions of ADAS, focusing on visual, auditory, physical, and cognitive factors. Study 2 extended this by additionally exploring perceptions following exposure to videos and stationary-vehicle demonstrations of an ADAS. Participants had a range of visual, hearing, memory, and health characteristics which impacted their daily lives. In both studies, some participants had prior knowledge of various ADAS technologies, but many were unfamiliar with these systems. Nevertheless, overall, participants reported that ADAS would assist them to drive as they age and increase their mobility and independence. Participants commented on the benefits of warning alerts, although the potential for alerts to be distracting was also highlighted. Participants with vision impairment preferred audio alerts, and participants with hearing impairment preferred visual display alerts. Findings highlighted the potential for ADAS to assist those with age-related declines, the need to increase the flexibility of warning system alerts to suit the varying requirements of older drivers, and the need to reduce the complexity of vehicle interfaces. Collectively, these strategies would maximize the benefits of these vehicles in increasing the mobility, independence, and quality of life of older drivers with and without age-related declines.

    A comparison of course-related stressors in undergraduate problem-based learning (PBL) versus non-PBL medical programmes

    Background: Medical students report high levels of stress related to their medical training as well as to other personal and financial factors. The aim of this study was to investigate whether there are differences in course-related stressors reported by medical students on undergraduate problem-based learning (PBL) and non-PBL programmes in the UK.

    Method: A cross-sectional study of second-year medical students in two UK medical schools (one PBL and one non-PBL programme) was conducted. A 16-question self-report questionnaire, derived from the Perceived Medical Student Stress Scale and the Higher Education Stress Inventory, was used to measure course-related stressors. Following univariate analysis of each stressor between groups, multivariate logistic regression was used to determine which stressors best predicted course type, while controlling for socio-demographic differences between the groups.

    Results: A total of 280 students responded. Compared to the non-PBL students (N = 197), the PBL students (N = 83) were significantly more likely to agree that: they did not know what the faculty expected of them (odds ratio (OR) = 0.38, p = 0.03); there were too many small group sessions facilitated only by students, resulting in an unclear curriculum (OR = 0.04, p < 0.0001); and there was a lack of opportunity to explore academic subjects of interest (OR = 0.40, p = 0.02). They were significantly more likely to disagree that: there was a lack of encouragement from teachers (OR = 3.11, p = 0.02); and that the medical course fostered a sense of anonymity and feelings of isolation amongst students (OR = 3.42, p = 0.008).

    Conclusion: There are significant differences in the perceived course-related stressors affecting medical students on PBL and non-PBL programmes. Course designers and student support services should therefore tailor their work to minimise, or help students cope with, the specific stressors on each course type to ensure optimum learning and wellbeing among our future doctors.
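The odds ratios above are the standard 2×2-table kind: the odds of agreeing with a stressor item in one group divided by the odds in the other. As an illustration only, with hypothetical agree/disagree counts rather than the study's data (the study's ORs also came from a multivariate logistic regression, which additionally adjusts for covariates):

```python
def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio from a 2x2 table of counts.

    a = group 1 agree, b = group 1 disagree,
    c = group 2 agree, d = group 2 disagree.
    OR = (a/b) / (c/d) = (a*d) / (b*c). Hypothetical counts only.
    """
    return (a * d) / (b * c)
```

With hypothetical counts of 20 agree / 10 disagree in one group versus 10 agree / 20 disagree in the other, the OR is 4.0; values below 1 indicate lower odds of agreement in the first group.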

    Peripheral tolerance to alloantigen results from altered regulation of the interleukin-2 pathway

    Tolerance to alloantigen may be induced in rats by administration of blood followed by transplantation of a renal allograft. The mechanism of this tolerance was investigated by directly analyzing the functional activity of graft-infiltrating cells. We have previously shown cytotoxic T lymphocyte infiltration of, and major histocompatibility complex induction on, grafts of tolerant animals. We now report that cells isolated from the grafts of tolerant rats show reduced cell-surface expression of the p55 interleukin-2 receptor (IL-2R) chain compared with that seen on the cells of untreated animals. Scatchard analysis further reveals low expression of the high-affinity IL-2R. This is due to reduced transcription of both IL-2R alpha and beta chain mRNAs and results in a reduced ability of cells to proliferate in response to IL-2. Cells isolated from tolerant animals are unable to make biologically active IL-2 in culture, whereas cells from untreated animals make high levels. This is not reflected at the mRNA level, as the IL-2 gene is induced to similar levels in both tolerant and untreated animals. The induction of tolerance is abrogated by administration of recombinant IL-2 at the time of transplantation. Thus, we conclude that altered regulation of the IL-2 pathway results in tolerance in these alloantigen-treated and transplanted animals.