
    Does virtual reality simulation have a role in training trauma and orthopaedic surgeons?

    AIMS: The aim of this study was to assess the current evidence relating to the benefits of virtual reality (VR) simulation in orthopaedic surgical training, and to identify areas for future research. MATERIALS AND METHODS: A literature search of the MEDLINE, Embase, and Google Scholar databases was performed. The titles, abstracts, and references of the results were examined for relevance. RESULTS: A total of 31 articles published between 2004 and 2016 and relating to the objective validity and efficacy of specific VR orthopaedic surgical simulators were identified. We found 18 studies demonstrating the construct validity of 16 different orthopaedic VR simulators by comparing expert and novice performance. Eight studies demonstrated skill acquisition on a simulator by showing improvements in performance with repeated use. A further five studies demonstrated measurable improvements in operating theatre performance following a period of VR simulator training. CONCLUSION: The demonstration of 'real-world' benefits from the use of VR simulation in knee and shoulder arthroscopy is promising. However, evidence supporting its utility in other forms of orthopaedic surgery is lacking. Further studies of validity and utility should be combined with robust analyses of the cost efficiency of validated simulators to justify the financial investment required for their use in orthopaedic training. Cite this article: Bone Joint J 2018;100-B:559-65.

    Accent processing in dementia

    Accented speech conveys important nonverbal information about the speaker, as well as presenting the brain with the problem of decoding a non-canonical auditory signal. The processing of non-native accents has seldom been studied in neurodegenerative disease, and its brain basis remains poorly understood. Here we investigated the processing of non-native international and regional accents of English in cohorts of patients with Alzheimer's disease (AD; n=20) and progressive nonfluent aphasia (PNFA; n=6) in relation to healthy older control subjects (n=35). A novel battery was designed to assess accent comprehension and recognition, and all subjects had a general neuropsychological assessment. Neuroanatomical associations of accent processing performance were assessed using voxel-based morphometry on MR brain images within the larger AD group. Compared with healthy controls, both the AD and PNFA groups showed deficits of non-native accent recognition, and the PNFA group showed reduced comprehension of words spoken in international accents compared with a Southern English accent. At the individual subject level, deficits were observed more consistently in the PNFA group, and the disease groups showed different patterns of accent comprehension impairment (generally more marked for sentences in AD and for single words in PNFA). Within the AD group, grey matter associations of accent comprehension and recognition were identified in the anterior superior temporal lobe. The findings suggest that accent processing deficits may constitute signatures of neurodegenerative disease, with potentially broader implications for understanding how these diseases affect vocal communication under challenging listening conditions.

    Fuel for the Work Required: A Theoretical Framework for Carbohydrate Periodization and the Glycogen Threshold Hypothesis.

    Deliberately training with reduced carbohydrate (CHO) availability to enhance endurance-training-induced metabolic adaptations of skeletal muscle (i.e. the 'train low, compete high' paradigm) is a hot topic within sport nutrition. Train-low studies involve periodically training (e.g. 30-50% of training sessions) with reduced CHO availability, where train-low models include twice-per-day training, fasted training, post-exercise CHO restriction and 'sleep low, train low'. When compared with high CHO availability, data suggest that the augmented cell signalling (73% of 11 studies), gene expression (75% of 12 studies) and training-induced increases in oxidative enzyme activity/protein content (78% of 9 studies) associated with 'train low' are especially apparent when training sessions are commenced within a specific range of muscle glycogen concentrations. Nonetheless, such muscle adaptations do not always translate to improved exercise performance (e.g. 37% and 63% of 11 studies show improvements or no change, respectively). Herein, we present our rationale for the glycogen threshold hypothesis: a window of muscle glycogen concentrations that simultaneously permits completion of required training workloads and activation of the molecular machinery regulating training adaptations. We also present the 'fuel for the work required' paradigm (representative of an amalgamation of train-low models), whereby CHO availability is adjusted in accordance with the demands of the upcoming training session(s). In order to strategically implement train-low sessions, our challenge now is to quantify the glycogen cost of habitual training sessions (so as to inform the attainment of any potential threshold) and to ensure that absolute training intensity is not compromised, while also creating a metabolic milieu conducive to facilitating the endurance phenotype.
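
    The 'fuel for the work required' paradigm lends itself to a simple decision rule: estimate the glycogen cost of the upcoming session and adjust CHO intake so that pre-session glycogen falls inside the hypothesised threshold window. The Python sketch below is purely illustrative; the window bounds and the function `plan_session` are invented placeholders, since the review explicitly argues that the true threshold has yet to be quantified.

```python
# Toy sketch of 'fuel for the work required' decision logic: adjust CHO
# availability so estimated muscle glycogen lands inside a hypothesised
# threshold window. All numbers are illustrative placeholders; the review
# argues that the true window still needs to be quantified.

GLYCOGEN_WINDOW = (200, 350)  # mmol/kg dry weight -- hypothetical bounds

def plan_session(current_glycogen: float, session_cost: float) -> str:
    lower, upper = GLYCOGEN_WINDOW
    if current_glycogen - session_cost < lower:
        # Workload completion at risk: feed CHO before/during the session.
        return "increase CHO: required workload would be compromised"
    if current_glycogen > upper:
        # Glycogen too high to engage 'train-low' signalling.
        return "restrict CHO: glycogen above the putative signalling window"
    return "train low: workload achievable within the glycogen threshold"

print(plan_session(current_glycogen=300, session_cost=80))
```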

    Pathological or physiological erosion—is there a relationship to age?

    This conventional literature review discusses whether pathological tooth wear is age dependent. It briefly reviews the components of tooth wear and the prevalence of tooth wear in children, adolescents and adults. The emphasis on terminology relating to tooth wear varies: in some countries the role of erosion is considered the most important, whereas others consider the process to be a combination of erosion, attrition and abrasion, often with one being more dominant. The importance of tooth wear or erosion indices in assessment, and the evidence for progression within subjects and within lesions, is described. The data from the few studies reporting pathological levels of wear in children and adults are discussed, in particular their relationship with age. There is little evidence to support the concept that pathological levels of erosion or wear are age dependent. There is, however, some evidence to suggest that normal levels of erosion or wear are age dependent.

    Passive and post-exercise cold-water immersion augments PGC-1alpha and VEGF expression in human skeletal muscle.

    Purpose: We tested the hypothesis that both post-exercise and passive cold water immersion (CWI) increase PGC-1α and VEGF mRNA expression in human skeletal muscle. Methods: In Study 1, nine males completed an intermittent running protocol (8 × 3-min bouts at 90% V̇O2max, interspersed with 3-min active recovery: 1.5 min at 25% and 1.5 min at 50% V̇O2max) before undergoing CWI (10 min at 8 °C) or seated rest (CONT) in a counterbalanced, randomised manner. In Study 2, ten males underwent an identical CWI protocol under passive conditions. Results: In Study 1, PGC-1α mRNA increased in CONT (~3.4-fold; P < 0.001) and CWI (~5.9-fold; P < 0.001) at 3 h post-exercise, with a greater increase observed in CWI (P < 0.001). VEGFtotal mRNA increased after CWI only (~2.4-fold) compared with CONT (~1.1-fold) at 3 h post-exercise (P < 0.01). In Study 2, following CWI, PGC-1α mRNA expression was significantly increased ~1.3-fold (P = 0.001) and ~1.4-fold (P = 0.0004) at 3 and 6 h, respectively. Similarly, VEGF165 mRNA was significantly increased after CWI ~1.9-fold (P = 0.03) and ~2.2-fold (P = 0.009) at 3 and 6 h post-immersion. Conclusions: These data confirm that post-exercise CWI augments the acute exercise-induced expression of PGC-1α mRNA in human skeletal muscle compared with exercise per se. Additionally, CWI per se mediates the activation of PGC-1α and VEGF mRNA expression in human skeletal muscle. Cold water may therefore enhance the adaptive response to acute exercise.
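
    The fold changes reported above are the conventional way of expressing qPCR results; one common derivation is the 2^-ΔΔCt method, in which the target gene is normalised first to a housekeeping gene and then to the control condition. The abstract does not state the study's quantification pipeline, so the following Python sketch, with invented Ct values, is a generic illustration only.

```python
# Generic 2^(-delta-delta-Ct) calculation often used to express qPCR results
# as fold changes (e.g. "~5.9-fold PGC-1alpha at 3 h"). The Ct values below
# are invented; the abstract reports no raw data.

def fold_change(ct_target_sample, ct_ref_sample, ct_target_ctrl, ct_ref_ctrl):
    delta_sample = ct_target_sample - ct_ref_sample  # normalise to reference gene
    delta_ctrl = ct_target_ctrl - ct_ref_ctrl
    return 2 ** -(delta_sample - delta_ctrl)         # 2^-(delta-delta-Ct)

# Hypothetical Ct values: PGC-1alpha vs a housekeeping gene, post-CWI vs baseline
print(round(fold_change(24.1, 18.0, 26.6, 18.0), 1))  # ~5.7-fold increase
```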

    A prospective cohort study comparing the reactogenicity of trivalent influenza vaccine in pregnant and non-pregnant women

    Background: Influenza vaccination during pregnancy can prevent serious illness in expectant mothers and provide protection to newborns; however, uptake has historically been limited by a number of factors, including safety concerns. Symptomatic complaints are common during pregnancy and may be mistakenly attributed to reactions to trivalent influenza vaccine (TIV). To investigate this, we compared post-vaccination events self-reported by pregnant women to events reported by non-pregnant women receiving TIV. Methods: A prospective cohort of 1,086 pregnant women and 314 non-pregnant female healthcare workers (HCWs) who received TIV between March and May 2014 were followed up seven days post-vaccination to assess local and systemic adverse events following immunisation (AEFIs). Women were surveyed by text message regarding perceived reactions to TIV, and those reporting an AEFI completed a telephone or mobile phone interview to ascertain details. Logistic regression models adjusting for age and residence were used to compare reactions reported by pregnant women and non-pregnant HCWs. Results: Similar proportions of pregnant women and non-pregnant female HCWs reported ≥1 reaction following vaccination with TIV (13.0% and 17.3%, respectively; OR = 1.2 [95% CI: 0.8-1.8]). Non-pregnant female HCWs were more likely to report fever or headache than pregnant women (OR: 4.6 [95% CI: 2.1-10.3] and OR: 2.2 [95% CI: 1.0-4.6], respectively). No other significant differences in reported symptoms were observed. No serious vaccine-associated adverse events were reported, and less than 2% of each group sought medical advice for a reaction. Conclusions: We found no evidence that pregnant women are more likely than non-pregnant female HCWs of similar age to report adverse events following influenza vaccination, and in some cases pregnant women reported significantly fewer adverse events. These results further support the safety of TIV administered to pregnant women.
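
    The adjusted comparison described above can be reproduced in outline with any standard statistics package. The following Python sketch uses statsmodels to fit a logistic regression of any reported reaction on pregnancy status, age and residence; the variable names and synthetic data are illustrative assumptions, not the study's dataset.

```python
# Sketch of the adjusted analysis: logistic regression of any self-reported
# AEFI on pregnancy status, adjusting for age and residence. Synthetic data
# stand in for the study cohort; exponentiated coefficients give odds ratios.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 400
df = pd.DataFrame({
    "pregnant": rng.integers(0, 2, n),           # 1 = pregnant, 0 = non-pregnant HCW
    "age": rng.integers(20, 46, n),              # years
    "residence": rng.choice(["metro", "rural"], n),
})
df["reaction"] = rng.binomial(1, 0.15, n)        # >=1 reported reaction (synthetic)

model = smf.logit("reaction ~ pregnant + age + C(residence)", data=df).fit(disp=0)
print(np.exp(model.params).round(2))             # adjusted odds ratios
```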

    Application of the speed-duration relationship to normalize the intensity of high-intensity interval training

    The tolerable duration of continuous high-intensity exercise is determined by the hyperbolic speed-tolerable duration (S-tLIM) relationship. However, application of the S-tLIM relationship to normalize the intensity of high-intensity interval training (HIIT) has yet to be considered, and this was the aim of the present study. Subjects completed a ramp-incremental test and a series of four constant-speed tests to determine the S-tLIM relationship. A sub-group of subjects (n = 8) then repeated 4-min bouts of exercise at the speeds predicted to induce intolerance at 4 min (WR4), 6 min (WR6) and 8 min (WR8), interspersed with 4-min recovery bouts, to the point of exercise intolerance (fixed WR HIIT) on different days, with the aim of establishing the work rate that could be sustained for 960 s (i.e. 4×4 min). A sub-group of subjects (n = 6) also completed 4 bouts of exercise interspersed with 4-min recovery, with each bout continued to the point of exercise intolerance (maximal HIIT), to determine the appropriate protocol for maximizing the amount of high-intensity work that can be completed during 4×4-min HIIT. For fixed WR HIIT, the tLIM of the HIIT sessions was 399±81 s for WR4, 892±181 s for WR6 and 1517±346 s for WR8, with total exercise durations all significantly different from each other (P<0.050). For maximal HIIT, there was no difference in the tLIM of each of the 4 bouts (bout 1: 229±27 s; bout 2: 262±37 s; bout 3: 235±49 s; bout 4: 235±53 s; P>0.050). However, significantly less high-intensity work was completed during bouts 2 (153.5±40.9 m), 3 (136.9±38.9 m) and 4 (136.7±39.3 m) compared with bout 1 (264.9±58.7 m; P<0.050). These data establish that WR6 provides the appropriate work rate to normalize the intensity of HIIT between subjects. Maximal HIIT provides a protocol which allows the relative contribution of the work rate profile to physiological adaptations to be considered during alternative intensity-matched HIIT protocols.
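
    The speeds WR4, WR6 and WR8 follow directly from the two parameters of the hyperbolic S-tLIM model, critical speed (CS) and the curvature constant (D'), via tLIM = D'/(S - CS). A minimal Python sketch, using illustrative parameter values rather than the study's data:

```python
# Hyperbolic speed-duration model: t_lim = D' / (S - CS), rearranged to give
# the speed predicted to cause intolerance at a target duration. CS and D'
# below are illustrative values, not fitted parameters from the study.

def speed_for_target_tlim(cs: float, d_prime: float, t_lim: float) -> float:
    """Speed (m/s) predicted to induce intolerance at t_lim seconds."""
    return cs + d_prime / t_lim

cs = 3.8         # hypothetical critical speed, m/s
d_prime = 180.0  # hypothetical curvature constant, m

for label, t in [("WR4", 240), ("WR6", 360), ("WR8", 480)]:
    print(f"{label}: {speed_for_target_tlim(cs, d_prime, t):.2f} m/s")
```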

    A Neural Approach to Ordinal Regression for the Preventive Assessment of Developmental Dyslexia

    Developmental Dyslexia (DD) is a learning disability related to the acquisition of reading skills that affects about 5% of the population. DD can have an enormous impact on the intellectual and personal development of affected children, so early detection is key to implementing preventive strategies for teaching language. Research has shown that there may be biological underpinnings to DD that affect phoneme processing, and hence these symptoms may be identifiable before reading ability is acquired, allowing for early intervention. In this paper we propose a new methodology to assess the risk of DD before students learn to read. For this purpose, we developed a mixed neural model that calculates risk levels of dyslexia from tests that can be completed at the age of 5 years. Our method first trains an auto-encoder, and then combines the trained encoder with an optimized ordinal regression neural network devised to ensure consistency of predictions. Our experiments show that the system is able to detect unaffected subjects two years before it can assess the risk of DD based mainly on phonological processing, giving a specificity of 0.969 and a correct rate of more than 0.92. In addition, the trained encoder can be used to transform test results into an interpretable spatial distribution of subjects that facilitates risk assessment and validates the methodology.
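
    The two-stage model described above (an auto-encoder, then ordinal regression on the latent code) can be sketched compactly. The following PyTorch snippet uses a CORAL-style head, a shared weight vector with one threshold bias per risk boundary, which is a common way to obtain rank-consistent ordinal predictions; the paper's exact architecture and dimensions are not given in the abstract, so everything below is an illustrative assumption.

```python
# Sketch: pretrained encoder + ordinal-regression head with shared weights
# (CORAL-style), so predicted risk levels stay rank-consistent. Layer sizes,
# class count and inputs are illustrative, not taken from the paper.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, n_features: int = 32, n_latent: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 16), nn.ReLU(),
            nn.Linear(16, n_latent),
        )

    def forward(self, x):
        return self.net(x)

class OrdinalHead(nn.Module):
    """K ordered risk levels -> K-1 cumulative logits for P(risk > k)."""
    def __init__(self, n_latent: int = 8, n_classes: int = 3):
        super().__init__()
        self.shared = nn.Linear(n_latent, 1, bias=False)    # one shared weight vector
        self.thresholds = nn.Parameter(torch.zeros(n_classes - 1))

    def forward(self, z):
        return self.shared(z) + self.thresholds             # shape (batch, K-1)

encoder, head = Encoder(), OrdinalHead()
x = torch.randn(4, 32)                      # 4 hypothetical 5-year-old test batteries
probs = torch.sigmoid(head(encoder(x)))     # P(risk > k) for each boundary k
risk_level = (probs > 0.5).sum(dim=1)       # 0 = unaffected ... 2 = highest risk
print(risk_level)
```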