    The Level of Vision Necessary for Competitive Performance in Rifle Shooting: Setting the Standards for Paralympic Shooting with Vision Impairment

    The aim of this study was to investigate the level of vision impairment (VI) that would reduce performance in shooting, in order to guide the development of entry criteria for VI shooting. Nineteen international-level shooters without VI took part in the study. Participants shot an air rifle, while standing, at a regulation target placed at the end of a 10 m shooting range. Cambridge simulation glasses were used to simulate six different levels of VI. Visual acuity (VA) and contrast sensitivity (CS) were assessed along with shooting performance in each condition of simulated impairment and compared with performance under habitual vision. Shooting performance was evaluated by calculating each individual’s average score at every level of simulated VI and normalizing this score by expressing it as a percentage of the baseline performance achieved with habitual vision. Receiver operating characteristic (ROC) curves were constructed to evaluate the ability of different VA and CS cut-off criteria to appropriately classify these athletes as achieving ‘expected’ or ‘below expected’ shooting results based on their performance with different levels of VA and CS. Shooting performance remained relatively unaffected by mild decreases in VA and CS but deteriorated quickly with more moderate losses. The ability of visual function measurements to classify shooting performance was good: 78% of performances were appropriately classified using a cut-off of 0.53 logMAR, and 74% using a cut-off of 0.83 logCS. The current inclusion criterion for VI shooting (1.0 logMAR) is conservative, maximizing the chance of including only those with an impairment that does impact performance, but potentially excluding some who have a genuine impairment in the sport. A less strict criterion would include more athletes who have a genuine impairment, but could also admit some whose impairment does not actually affect performance in the sport. An impairment to CS could also impact performance in the sport and might be considered in determining eligibility to take part in VI competition.
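The two quantitative steps described in this abstract (normalizing each athlete's score against their habitual-vision baseline, then applying an acuity cut-off to predict 'expected' vs. 'below expected' performance) can be sketched as follows. This is an illustrative reconstruction, not the study's code; the athlete data are invented, and only the 0.53 logMAR cut-off comes from the abstract.

```python
# Illustrative sketch of the analysis described above. All athlete numbers
# are invented; only the 0.53 logMAR cut-off is taken from the abstract.

def normalize_score(score: float, baseline: float) -> float:
    """Express a score under simulated impairment as a percentage of the
    baseline score achieved with habitual vision."""
    return 100.0 * score / baseline

def classify_by_va(va_logmar: float, cutoff: float = 0.53) -> str:
    """Predict 'below expected' when acuity is worse (higher logMAR)
    than the chosen cut-off, else 'expected'."""
    return "below expected" if va_logmar > cutoff else "expected"

# Hypothetical athlete: 10.2-ring baseline average, 8.7 under simulated VI
pct = normalize_score(8.7, 10.2)
print(f"{pct:.1f}% of baseline")                 # 85.3% of baseline
print(classify_by_va(0.7))                       # below expected
print(classify_by_va(0.3))                       # expected
```

In the study itself, the cut-off was not chosen by hand but selected from ROC curves over many candidate VA and CS thresholds; the sketch shows only how a single chosen threshold would be applied.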

    Evidence-based Classification in Track Athletics for Athletes with a Vision Impairment: A Delphi Study

    SIGNIFICANCE: The Delphi analysis presented here highlights the need for a sport-specific evidence-based classification system for track athletics for athletes with a vision impairment (VI). This system may differ across race distances. Further research is required to develop a useful battery of vision tests for classification, and the issue of intentional misrepresentation during classification needs particular attention. PURPOSE: At present, athletes with VI are placed into competition classes developed on the basis of legal definitions of VI. The International Paralympic Committee Athlete Classification Code states that all sports should have their own classification system designed to reflect the (visual) demands of that individual sport. This project gathered expert opinion on the specific requirements for an evidence-based, sport-specific classification system for VI track athletics and sought to identify any issues within track athletics that require further research into their impact on sport performance. METHODS: A three-round Delphi review was conducted with a panel of 17 people with expertise in VI track athletics. RESULTS: The panel agreed that the current classification system in VI track athletics does not completely minimize the impact of impairment on competition outcome, highlighting the need for improvements. There was clear agreement that the existing measures of vision may fail to adequately reflect the type of vision loss that would impact running performance, with additional measures required. Intentional misrepresentation, where athletes “cheat” on classification tests, remains a serious concern. CONCLUSIONS: The panel identified measures of vision and performance that will inform the development of an evidence-based classification system through a better understanding of the relationship between VI and performance in track athletics. Issues such as the use of guides, and whether the current class system is equitable, gave rise to differing opinions within the panel, with views varying across the different running distances.

    Cardiovascular Function After Spinal Cord Injury: Prevalence and Progression of Dysfunction During Inpatient Rehabilitation and 5 Years Following Discharge

    Background. Autonomic dysfunction after spinal cord injury (SCI) is an under-researched area compared with motor and sensory dysfunction. Cardiovascular autonomic dysfunction is a particular concern, leading to impaired control of blood pressure and heart rate. Objectives. (1) To determine the prevalence of hypotension in individuals with SCI during and after rehabilitation; (2) to investigate changes in cardiovascular variables during and after rehabilitation; (3) to evaluate the influence of personal and lesion characteristics on cardiovascular variables. Methods. Cardiovascular variables (resting systolic [SAP] and diastolic [DAP] arterial pressures and resting [HR (rest)] and peak heart rates [HR (peak)]) were measured on 5 test occasions: start of inpatient rehabilitation, 3 months later, at discharge, and at 1 and 5 years after discharge. The time course and the effects of personal and lesion characteristics on cardiovascular variables were studied using multilevel regression analyses. Results. The prevalence of hypotension was unchanged during rehabilitation and for 5 years after discharge. Odds for hypotension were highest in those with cervical and high thoracic lesions, younger individuals, and men. DAP increased during the 5 years after discharge. HR (rest) decreased during and after rehabilitation. SAP, DAP, HR (rest), and HR (peak) were lowest in those with cervical and high thoracic lesions. SAP and DAP increased with age; HR (peak) decreased with age. Conclusions. These longitudinal data provide normative values for blood pressure and heart rate changes with time after injury according to lesion and personal characteristics. These results can be used to guide clinical practice and to place changes in cardiovascular function caused by interventions in perspective.

    Table1_Cardiovascular and cerebrovascular responses to urodynamics testing after spinal cord injury: The influence of autonomic injury.DOCX

    Autonomic dysfunction is a prominent concern following spinal cord injury (SCI). In particular, autonomic dysreflexia (AD; paroxysmal hypertension and concurrent bradycardia in response to sensory stimuli below the level of injury) is common in autonomically-complete injuries at or above T6. AD is currently defined as a >20 mmHg increase in systolic arterial pressure (SAP) from baseline, without heart rate (HR) criteria. Urodynamics testing (UDS) is performed routinely after SCI to monitor urological sequelae, often provoking AD. We therefore aimed to assess the cardiovascular and cerebrovascular responses to UDS and their association with autonomic injury in individuals with chronic (>1 year) SCI. Following a blood draw (plasma norepinephrine [NE]), continuous SAP, HR, and middle cerebral artery blood flow velocity (MCAv) were recorded at baseline (10-minute supine), during standard clinical UDS, and during recovery (10-minute supine) (n = 22, age 41.1 ± 2 years, 15 male). Low-frequency variability in systolic arterial pressure (LF SAP; a marker of sympathetic modulation of blood pressure) and cerebral resistance were determined. High-level injury (≄T6) with blunted/absent LF SAP and/or low plasma NE indicated autonomically-complete injury. Known electrocardiographic markers of atrial (p-wave duration variability) and ventricular arrhythmia (T-peak–T-end variability) were evaluated at baseline and during UDS. Nine participants were determined to be autonomically complete, yet 20 participants had an SAP increase >20 mmHg during UDS. Qualitative autonomic assessment did not discriminate autonomic injury. Maximum SAP was higher in autonomically-complete injuries (207.1 ± 2.3 mmHg) than in autonomically-incomplete injuries (165.9 ± 5.3 mmHg) during UDS (p < 0.05). Cerebrovascular resistance index was increased during UDS in autonomically-complete injuries compared with baseline (p < 0.001) and recovery (p < 0.001), reflecting intact cerebral autoregulation. Risk for both atrial and ventricular arrhythmia increased during UDS compared with baseline (p < 0.05), particularly in autonomically-complete injuries (p < 0.05). UDS is recommended yearly in chronic SCI but is associated with profound AD and an increased risk of arrhythmia, highlighting the need for continued monitoring during UDS. Our data also highlight the need for HR criteria in the definition of AD and the need for quantitative consideration of autonomic function after SCI.
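The AD criterion used in this abstract (a >20 mmHg rise in SAP above baseline, with no HR criterion) is simple enough to sketch directly. This is a minimal illustration of the definition only, not the study's analysis pipeline; the baseline value and the SAP samples below are invented.

```python
# Minimal sketch of the autonomic dysreflexia (AD) criterion described above:
# AD is flagged when systolic arterial pressure (SAP) rises more than
# 20 mmHg above baseline. The current definition includes no heart-rate
# criterion. All pressure values below are invented for illustration.

def detect_ad(baseline_sap: float, sap_trace: list[float],
              threshold: float = 20.0) -> bool:
    """Return True if any SAP sample exceeds baseline by more than threshold."""
    return any(sap - baseline_sap > threshold for sap in sap_trace)

baseline = 110.0                       # mmHg, 10-minute supine baseline (invented)
during_uds = [112.0, 128.0, 141.0]     # mmHg samples during urodynamics (invented)
print(detect_ad(baseline, during_uds))  # maximum rise is 31 mmHg -> True
```

The abstract's closing argument is that a pressure-only definition like this one is incomplete, since concurrent bradycardia is characteristic of AD; a revised definition would add HR criteria alongside the SAP threshold.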