
    Measuring, monitoring, and improving sleep variables: its application to professional football players

    After several papers reported that Whole Body Cryotherapy (WBC) can improve objective and subjective markers of sleep, supported by anecdotal reports of post-exposure sleepiness from players at Southampton FC (SFC; PhD sponsor), the original aim of this thesis was to elucidate the effect of WBC on sleep in professional football players. However, after the UK COVID-19 lockdowns, WBC was not considered COVID-safe and, therefore, sleep became the central theme. Sleep plays an important role in the maintenance of both physiological and psychological homeostasis. During sleep, the release of human growth hormone and other anabolic hormones peaks, inflammatory processes are modulated, and memories and skills are consolidated. Therefore, sleep is considered integral to athletic recovery and player well-being. Despite this, professional football players regularly present with sub-optimal sleep duration and/or quality. However, the factors associated with sleep variability are not fully understood, and there is no consensus on the optimal level of sleep for athletes. Therefore, this thesis conceptualised the following research questions: (1) What is known about the quality and duration of sleep amongst professional footballers? (2) What factors affect sleep in professional football players, specifically at SFC? (3) What are suitable and effective ways of improving sleep in professional football players? These questions were addressed across 2 systematic reviews (Chapters 2 & 4), an interventional study (Chapter 3), an observational cohort study (Chapter 5), a method agreement study (Chapter 6), and finally a case study (Chapter 7). Chapter 3 presents a study that aimed to (1) investigate the effect of WBC applied across an in-season microcycle on objective and subjective sleep quality in under-18 (U18) professional footballers, and (2) determine the effect of WBC on game-day inflammation, testosterone, and cortisol. Unfortunately, this study was curtailed by the COVID-19 lockdowns. Nevertheless, novel findings were reported. Specifically, whilst objective sleep data were not significantly different between groups, players who received WBC during the microcycle preceding a competitive fixture reported a greater sense of alertness following wake, as determined by the Leeds Sleep Quality Index. Whilst these results are subjective, they could also be indicative of improved sleep architecture following WBC. However, considering that objective sleep was determined from wrist-worn activity monitors without the capability to detect sleep stages, this cannot be known with certainty. In Chapter 4, a scoping review of observational studies was performed which suggested that professional football players' mean sleep duration, sleep latency, and wake after sleep onset (WASO) were all within recommended guidelines (these same reference limits were also used in Chapter 5). This conclusion was made on the basis that over 63% of the included studies reported means that were above the lower reference boundary for sleep duration. Despite this, several papers reported error bars that exceeded the reference limits, suggesting that suboptimal sleep remains common among individual players. In Chapter 5, an observational study was performed on under-18 professional SFC players, and the results matched what was observed in the scoping review in Chapter 4.
Specifically, whilst sleep duration on matchday+1 (the day following matchday) presented with a beta estimate (derived from linear mixed models) of 400 mins, the remaining day types presented with sleep durations above 420 mins, the lower end of the reference limits. Nevertheless, in this study, confidence intervals breached the reference limits, further suggesting that suboptimal sleep occurs in this population. Taken together, results from Chapter 4 and Chapter 5 potentially indicate that group-level interventions are unnecessary. Rather, practitioners may find it more efficient to target support to players who report sleep disturbances. The scoping review presented in Chapter 4 also suggested that professional football players' sleep was more variable compared to age-matched controls and that several factors (e.g. scheduling variables) were associated with disrupted sleep. Chapter 5 builds on these findings by demonstrating for the first time that scheduled start time (the time players were scheduled to arrive at training or for a fixture) was associated with the amount of sleep that U18 players attained. Specifically, for every hour increase in start time, player sleep duration increased by an estimated 19.1 mins (CI: 9.4–28.79; p<0.001). This occurred in tandem with an 18 mins (CI: 9.3–26.6; p<0.001) later wake time per hour increase in scheduled start time. It is not clear by what magnitude start time would have to be extended to generate increases in player performance secondary to increased sleep duration. However, considering the players' age in this study (17.3 ± 0.7 yrs), a later start time may befit their intrinsic chronotype and, therefore, support the players by reinforcing their natural sleep habits. Whilst data from Chapter 5 support the notion that scheduling variables are associated with sleep in U18 professional footballers, they also suggest that sleep is not meaningfully associated with external workload. Global positioning and accelerometry data were collected and collated across 1-day, 7-day, and 28-day periods. For every 100 m increase in high-speed running (>5.5 m·s−1), sleep onset and wake time were delayed by 4.68 mins (CI: 2.78–6.58) and 3.38 mins (CI: 1.27–5.5), respectively. However, considering that workload had no significant effect on total sleep duration, the changes to wake time and sleep onset time should not concern practitioners. In Chapters 3, 5, and 7, objective sleep monitoring was completed using ReadiBand wrist-worn activity monitors. However, it was acknowledged that these devices cannot readily link objective sleep quality and performance, and that players' data could be missing due to poor band adherence. Therefore, an alternative approach was trialled: rather than measuring sleep directly, the effect of inadequate sleep on cognitive variables known to be sensitive to sleep loss was assessed. Consequently, this thesis also assessed the use of a novel virtual reality eye-tracking device that could rapidly administer an oculomotor task reported to be sensitive to total sleep deprivation. However, to be efficacious in a footballing environment, the device would have to demonstrate sensitivity to daily fluctuations in sleep. Target radial variation (a measure of spatial accuracy) was found to be significantly correlated with perceived daytime sleepiness (r=0.33, p=0.005); however, no further relationships were observed between oculomotor function, psychomotor vigilance, daytime sleepiness, and sleep metrics.
In a retrospective analysis of a second data set from military personnel (included to augment the original analysis), only psychomotor vigilance, and not oculomotor function, was associated with the total amount of sleep achieved. This suggested that the device would not be efficacious in a footballing environment as a replacement for sleep monitoring. Following the research presented in Chapters 4 and 5, it was surmised that a bespoke approach to sleep intervention would be more efficacious than team-based interventions. To this end, a framework was conceptualised in collaboration with a multidisciplinary team from SFC (Chapter 7). Next, a player was referred to the scheme after reporting excessive nighttime awakenings. After consultation, the player completed several subjective questionnaires to assess sleep quality (Pittsburgh Sleep Quality Index), insomnia severity (Insomnia Severity Index), and daytime sleepiness (Epworth Sleepiness Scale), followed by a period of objective sleep monitoring. The sleep monitoring confirmed excessive nighttime awakenings and, based on the responses from the initial consultation, a tailored sleep hygiene intervention was applied. Results revealed improvements in subjective sleep quality, insomnia severity, and nighttime awakenings. Whilst a case study cannot establish causality, it does provide a potential framework for practitioners looking to provide targeted sleep interventions. Conclusions: In general, professional football players' sleep quantity, latency, and WASO are within available population-based reference limits. Scheduling variables, and not workload variables, are associated with activity-monitor-derived objective sleep metrics in professional football players. Scheduled start time is associated with the amount of sleep that professional U18 football players receive. An oculomotor task does not have the requisite sensitivity to detect acute sleep loss in professional football players. A bespoke sleep intervention strategy can be efficacious in an applied footballing environment for players reporting sleep disruption.
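
The scheduling analysis summarised above relies on linear mixed models fitted to repeated nightly observations per player. A minimal sketch of that kind of model (not the thesis code) is shown below using statsmodels; the toy data, column names and day-type coding are assumptions for illustration only.

```python
# Sketch only: linear mixed model of nightly sleep duration on scheduled start time
# with a random intercept per player. Toy data and column names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_players, n_nights = 20, 30
df = pd.DataFrame({
    "player_id": np.repeat(np.arange(n_players), n_nights),
    "scheduled_start_hour": rng.choice([8.0, 9.0, 10.0, 11.0], size=n_players * n_nights),
    "day_type": rng.choice(["training", "matchday", "matchday+1"], size=n_players * n_nights),
})
# simulate roughly 19 extra minutes of sleep per hour of later start, plus player-level noise
player_effect = rng.normal(0, 20, n_players)[df["player_id"]]
df["sleep_duration_min"] = (250 + 19 * df["scheduled_start_hour"]
                            + player_effect + rng.normal(0, 30, len(df)))

model = smf.mixedlm("sleep_duration_min ~ scheduled_start_hour + C(day_type)",
                    data=df, groups=df["player_id"])
result = model.fit()
print(result.summary())  # the scheduled_start_hour coefficient is the minutes-per-hour estimate
```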

    3D deep convolutional neural network-based ventilated lung segmentation using multi-nuclear hyperpolarized gas MRI

    Hyperpolarized gas MRI enables visualization of regional lung ventilation with high spatial resolution. Segmentation of the ventilated lung is required to calculate clinically relevant biomarkers. Recent research in deep learning (DL) has shown promising results for numerous segmentation problems. In this work, we evaluate a 3D V-Net to segment ventilated lung regions on hyperpolarized gas MRI scans. The dataset consists of 743 helium-3 (3He) or xenon-129 (129Xe) volumetric scans and corresponding expert segmentations from 326 healthy subjects and patients with a wide range of pathologies. We evaluated segmentation performance for several DL experimental methods via overlap, distance and error metrics and compared them to conventional segmentation methods, namely, spatial fuzzy c-means (SFCM) and K-means clustering. We observed that training on combined 3He and 129Xe MRI scans outperformed the other DL methods, achieving a mean ± SD Dice of 0.958 ± 0.022, average boundary Hausdorff distance of 2.22 ± 2.16 mm, Hausdorff 95th percentile of 8.53 ± 12.98 mm and relative error of 0.087 ± 0.049. Moreover, no difference in performance was observed between 129Xe and 3He scans in the testing set. Combined training on 129Xe and 3He yielded statistically significant improvements over the conventional methods (p < 0.0001). The DL approach evaluated provides accurate, robust and rapid segmentations of ventilated lung regions, successfully excludes non-lung regions such as the airways and noise artifacts, and is expected to eliminate or significantly reduce the need for subsequent time-consuming manual editing.
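
For reference, the overlap and boundary-distance metrics reported above (Dice, average boundary Hausdorff distance, Hausdorff 95th percentile) can be computed from binary masks along the following lines. This is a minimal sketch using one common definition of surface distances, with toy masks and an assumed 1 mm isotropic voxel spacing, not the authors' evaluation code.

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def dice(pred, truth):
    """Dice overlap between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0

def surface_distances(pred, truth, spacing):
    """Symmetric boundary-to-boundary distances (mm) between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    pred_surf = pred & ~binary_erosion(pred)
    truth_surf = truth & ~binary_erosion(truth)
    # distance from each surface voxel of one mask to the nearest surface voxel of the other
    dist_to_truth = distance_transform_edt(~truth_surf, sampling=spacing)
    dist_to_pred = distance_transform_edt(~pred_surf, sampling=spacing)
    return np.concatenate([dist_to_truth[pred_surf], dist_to_pred[truth_surf]])

# toy 3D masks standing in for a predicted and an expert ventilation segmentation
pred = np.zeros((64, 64, 64), dtype=bool); pred[20:40, 20:40, 20:40] = True
truth = np.zeros((64, 64, 64), dtype=bool); truth[22:42, 20:40, 20:40] = True

d = surface_distances(pred, truth, spacing=(1.0, 1.0, 1.0))  # assumed 1 mm isotropic voxels
print(f"Dice {dice(pred, truth):.3f}, avg boundary dist {d.mean():.2f} mm, "
      f"HD95 {np.percentile(d, 95):.2f} mm")
```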

    Women’s experiences of wearing therapeutic footwear in three European countries

    Background: Therapeutic footwear is recommended for people with severe foot problems associated with rheumatoid arthritis (RA). However, it is known that many do not wear it. Although previous European studies have recommended service and footwear design improvements, it is not known whether services have improved or whether this footwear meets the personal needs of people with RA. As an earlier study found that this footwear has more impact on women than on men, this study explores women’s experiences of the process of being provided with it and of wearing it. No previous work has compared women’s experiences of this footwear in different countries, therefore this study aimed to explore the potential differences between the UK, the Netherlands and Spain. Method: Women with RA and experience of wearing therapeutic footwear were purposively recruited. Ten women with RA were interviewed in each of the three countries. An interpretive phenomenological approach (IPA) was adopted during data collection and analysis. Conversational style interviews were used to collect the data. Results: Six themes were identified: feet being visibly different because of RA; the referring practitioners’ approach to the patient; the dispensing practitioners’ approach to the patient; the footwear being visible as different to others; footwear influencing social participation; and the women’s wishes for improved footwear services. Regardless of their nationality, these women revealed that therapeutic footwear invokes emotions of sadness, shame and anger, and that it is often the final and symbolic marker of the effects of RA on their self-perception and their changed lives. This results in severe restriction of important activities, particularly those involving social participation. However, where a patient-focussed approach was used, particularly by the practitioners in Spain and the Netherlands, acceptance of this footwear was much more evident and there was less wastage as a result of the footwear being prescribed and then not worn. In the UK, the women were more likely to passively accept the footwear, with the only choice being to reject it once it had been provided. All the women were vocal about what would improve their experiences, and this centred on the consultation with both the referring practitioner and the practitioner who provides the footwear. Conclusion: This unique study, carried out in three countries, has revealed emotive and personal accounts of what it is like to have an item of clothing replaced with an ‘intervention’. The participants’ experiences of their consultations with practitioners have revealed the tension between the practitioners’ requirements and the women’s ‘social’ needs. Practitioners need greater understanding of the social and emotional consequences of using therapeutic footwear as an intervention.

    Edible crabs “Go West”: migrations and incubation cycle of Cancer pagurus revealed by electronic tags

    Crustaceans are key components of marine ecosystems which, like other exploited marine taxa, show seasonal patterns of distribution and activity, with consequences for their availability to capture by targeted fisheries. Despite concerns over the sustainability of crab fisheries worldwide, difficulties in observing crabs’ behaviour over their annual cycles mean that the timings and durations of reproduction remain poorly understood. From the release of 128 mature female edible crabs tagged with electronic data storage tags (DSTs), we demonstrate predominantly westward migration in the English Channel. Eastern Channel crabs migrated further than western Channel crabs, while crabs released outside the Channel showed little or no migration. Individual migrations were punctuated by a 7-month hiatus, when crabs remained stationary, coincident with the main period of crab spawning and egg incubation. Incubation commenced earlier in the west, from late October onwards, and brooding locations, determined using tidal geolocation, occurred throughout the species range. With an overall return rate of 34%, our results demonstrate that previous reluctance to tag crabs with relatively high-cost DSTs for fear of loss following moulting is unfounded, and that DSTs can generate precise information with regard to life-history metrics that would be unachievable using other conventional means.

    Overcoming intratumoural heterogeneity for reproducible molecular risk stratification: a case study in advanced kidney cancer.

    BACKGROUND: Metastatic clear cell renal cell cancer (mccRCC) portends a poor prognosis and urgently requires better clinical tools for prognostication as well as for prediction of response to treatment. Considerable investment in molecular risk stratification has sought to overcome the performance ceiling encountered by methods restricted to traditional clinical parameters. However, replication of results has proven challenging, and intratumoural heterogeneity (ITH) may confound attempts at tissue-based stratification. METHODS: We investigated the influence of confounding ITH on the performance of a novel molecular prognostic model, enabled by pathologist-guided multiregion sampling (n = 183) of geographically separated mccRCC cohorts from the SuMR trial (development, n = 22) and the SCOTRRCC study (validation, n = 22). Tumour protein levels quantified by reverse phase protein array (RPPA) were investigated alongside clinical variables. Regularised wrapper selection identified features for Cox multivariate analysis with overall survival as the primary endpoint. RESULTS: The optimal subset of variables in the final stratification model consisted of N-cadherin, EPCAM, Age, mTOR (NEAT). Risk groups from NEAT had a markedly different prognosis in the validation cohort (log-rank p = 7.62 × 10⁻⁷; hazard ratio (HR) 37.9, 95% confidence interval 4.1–353.8) and 2-year survival rates (accuracy = 82%, Matthews correlation coefficient = 0.62). Comparisons with established clinico-pathological scores suggest favourable performance for NEAT (net reclassification improvement 7.1% vs the International Metastatic Database Consortium score, 25.4% vs the Memorial Sloan Kettering Cancer Center score). Limitations include the relatively small cohorts and associated wide confidence intervals on predictive performance. Our multiregion sampling approach enabled investigation of NEAT validation when limiting the number of samples analysed per tumour, which significantly degraded performance. Indeed, sample selection could change risk group assignment for 64% of patients, and prognostication with one sample per patient performed only slightly better than random expectation (median logHR = 0.109). Low grade tissue was associated with 3.5-fold greater variation in predicted risk than high grade (p = 0.044). CONCLUSIONS: This case study in mccRCC quantitatively demonstrates the critical importance of tumour sampling for the success of molecular biomarker research where ITH is a factor. The NEAT model shows promise for mccRCC prognostication and warrants follow-up in larger cohorts. Our work evidences actionable parameters to guide sample collection (tumour coverage, size, grade) to inform the development of reproducible molecular risk stratification methods. We acknowledge financial support from the Royal Society of Edinburgh Scottish Government Fellowship co-funded by Marie Curie Actions (IMO), Carnegie Trust (50115; IMO, DJH, GDS), IGMM DTF (IMO, GDS), Medical Research Council (MC_UU_12018/25; IMO), Chief Scientist Office Scotland (ETM37; GDS, DJH), Cancer Research UK (Experimental Medicine Centre; TP, DJH), Renal Cancer Research Fund (GDS), Kidney Cancer Scotland (GDS), MRC Clinical Training Fellowship (AL), RCSEd Robertson Trust (AL) and Melville Trust (AL).
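
As an illustration of the modelling step described above, a Cox proportional hazards model over the four NEAT covariates could be fitted with the lifelines package roughly as follows. The column names and toy values are assumptions, and this is a sketch of the general technique rather than the authors' pipeline.

```python
# Sketch only: Cox model on the four NEAT covariates named in the abstract.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "n_cadherin": [0.8, 1.2, 0.4, 1.6, 0.9, 1.1, 0.7, 1.3],  # RPPA levels (arbitrary units)
    "epcam":      [1.1, 0.7, 1.4, 0.5, 1.0, 0.9, 1.2, 0.6],
    "age":        [61, 55, 70, 48, 66, 59, 63, 52],
    "mtor":       [0.9, 1.3, 0.6, 1.5, 1.0, 0.8, 1.1, 1.4],
    "os_months":  [24, 10, 36, 6, 18, 30, 28, 8],            # overall survival time
    "event":      [1, 1, 0, 1, 1, 0, 0, 1],                  # 1 = death observed
})

cph = CoxPHFitter(penalizer=0.1)   # small ridge penalty keeps the toy fit stable
cph.fit(df, duration_col="os_months", event_col="event")
cph.print_summary()                            # hazard ratios and confidence intervals
risk_scores = cph.predict_partial_hazard(df)   # relative risk used to form risk groups
```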

    A specific case in the classification of woods by FTIR and chemometric: discrimination of Fagales from Malpighiales

    Fourier transform infrared (FTIR) spectroscopic data were used to classify wood samples from nine species within the Fagales and Malpighiales using a range of multivariate statistical methods. Taxonomic classification of the families Fagaceae and Betulaceae from the Angiosperm Phylogeny Group classification (APG II system) was successfully performed using supervised pattern recognition techniques. A methodology for wood sample discrimination was developed using both sapwood and heartwood samples. Ten and eight biomarkers emerged from the dataset to discriminate order and family, respectively. In the species studied, FTIR in combination with multivariate analysis highlighted significant chemical differences in hemicelluloses, cellulose and guaiacyl (lignin) and shows promise as a suitable approach for wood sample classification.
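
A typical supervised pattern recognition workflow of the kind referred to above (scaling, dimensionality reduction, then a linear classifier with cross-validation) might look like the following sketch; the spectra matrix and labels are placeholder assumptions, not the study's data or its exact chemometric method.

```python
# Sketch only: PCA + LDA classification of FTIR spectra with cross-validation.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(90, 1800))                  # 90 toy wood spectra x 1800 wavenumber points
y = np.repeat(["Fagales", "Malpighiales"], 45)   # order-level labels

clf = make_pipeline(StandardScaler(), PCA(n_components=10), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)        # cross-validated classification accuracy
print(scores.mean())
```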

    Safety and effectiveness of adalimumab in a clinical setting that reflects Canadian standard of care for the treatment of rheumatoid arthritis (RA): Results from the CanACT study

    Background: This multicenter, open-label, prospective, single cohort study evaluated the effectiveness and safety of adalimumab in a clinical setting reflecting the Canadian standard of care for the treatment of patients with rheumatoid arthritis (RA). Methods: Patients ≥ 18 years of age with a history of active RA ≥ 3 months and fulfilling Canadian requirements for biological therapy received adalimumab 40 mg subcutaneously every other week for 12 weeks. Pre-study DMARD treatment regimens, corticosteroids, or NSAIDs were allowed throughout the study. The primary effectiveness outcome measure was the mean change in 28-joint disease activity score (DAS28) from baseline to Week 12. Secondary measures included the proportion of patients achieving joint remission (DAS28 < 2.6) and low disease activity (DAS28 < 3.2) at Week 12, and European League Against Rheumatism (EULAR: moderate and good) and American College of Rheumatology (ACR: ACR20, 50, and 70) responses, as well as responses in ACR core components at Weeks 4, 8, and 12. Subgroup analysis included a comparison of patients naïve to biological DMARD (BDMARD) therapy versus BDMARD-experienced patients. Safety was assessed in terms of adverse and serious adverse events. Results: A total of 879 patients (mean disease duration > 12 years) were enrolled; 772 (87.9%) completed the 12-week period. Adalimumab treatment was associated with rapid and sustained improvements in the signs and symptoms of RA. Significant improvements in mean DAS28 score were observed as early as Week 4. After 12 weeks of adalimumab treatment, 15.3% and 28.9% of patients achieved clinical remission and low disease activity, respectively. Similarly, significant improvements in ACR core components were observed as early as Week 4, with continued improvements occurring through 12 weeks. Patients naïve to BDMARD therapy demonstrated numerically greater clinical responses than patients who had received prior BDMARD therapy, although both subgroups showed significant improvements from baseline. The rates and types of adverse events, as well as the results of laboratory measures, demonstrated that adalimumab was generally safe and well-tolerated. Conclusions: This study demonstrated that, under conditions reflective of normal clinical practice in Canada, adalimumab is an effective and safe treatment for patients with RA. Trial registration: NCT00649545 (http://www.clinicaltrials.gov/ct2/show/NCT00649545).
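
For context, the DAS28 thresholds used above (remission < 2.6, low disease activity < 3.2) are applied to a composite score. Below is a minimal sketch of the commonly published DAS28(ESR) formula, on the assumption that the ESR-based variant was used; the example inputs are illustrative only.

```python
import math

def das28_esr(tjc28: int, sjc28: int, esr: float, global_health: float) -> float:
    """DAS28(ESR) as commonly published: tender/swollen counts out of 28 joints,
    ESR in mm/h, patient global health on a 0-100 mm VAS."""
    return (0.56 * math.sqrt(tjc28)
            + 0.28 * math.sqrt(sjc28)
            + 0.70 * math.log(esr)
            + 0.014 * global_health)

score = das28_esr(tjc28=6, sjc28=4, esr=28, global_health=45)
# thresholds used in the abstract: remission < 2.6, low disease activity < 3.2
print(round(score, 2), score < 2.6, score < 3.2)
```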

    The HLA class II allele DRB1*1501 is over-represented in patients with idiopathic pulmonary fibrosis

    Background: Idiopathic pulmonary fibrosis (IPF) is a progressive and medically refractory lung disease with a grim prognosis. Although the etiology of IPF remains perplexing, abnormal adaptive immune responses are evident in many afflicted patients. We hypothesized that perturbations of human leukocyte antigen (HLA) allele frequencies, which are often seen among patients with immunologic diseases, may also be present in IPF patients. Methods/Principal Findings: HLA alleles were determined in subpopulations of IPF and normal subjects using molecular typing methods. HLA-DRB1*15 was over-represented in a discovery cohort of 79 Caucasian IPF subjects who had lung transplantations at the University of Pittsburgh (36.7%) compared to normal reference populations. These findings were prospectively replicated in a validation cohort of 196 additional IPF subjects from four other U.S. medical centers that included both ambulatory patients and lung transplantation recipients. High-resolution typing was used to further define specific HLA-DRB1*15 alleles. DRB1*1501 prevalence in IPF subjects was similar among the 143 ambulatory patients and 132 transplant recipients (31.5% and 34.8%, respectively, p = 0.55). The aggregate prevalence of DRB1*1501 in IPF patients was significantly greater than among 285 healthy controls (33.1% vs. 20.0%, respectively; OR 2.0; 95% CI 1.3-2.9; p = 0.0004). IPF patients with DRB1*1501 (n = 91) tended to have decreased diffusing capacities for carbon monoxide (DLCO) compared to the 184 disease subjects who lacked this allele (37.8±1.7% vs. 42.8±1.4%, p = 0.036). Conclusions/Significance: DRB1*1501 is more prevalent among IPF patients than normal subjects, and may be associated with greater impairment of gas exchange. These data are novel evidence that immunogenetic processes can play a role in the susceptibility to and/or manifestations of IPF. Findings here of a disease association at the HLA-DR locus have broad pathogenic implications, illustrate a specific chromosomal area for incremental, targeted genomic study, and may identify a distinct clinical phenotype among patients with this enigmatic, morbid lung disease.
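
The aggregate association reported above can be checked from the stated counts: 91 of 275 IPF patients carried DRB1*1501 (33.1%) versus 20.0% of 285 controls (about 57 carriers, a figure derived here from the stated percentage rather than given in the abstract). A minimal sketch of the odds ratio and Wald 95% confidence interval calculation:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a/b = carriers/non-carriers among cases, c/d = among controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# 91 of 275 IPF patients vs ~57 of 285 controls carrying DRB1*1501
print(odds_ratio_ci(a=91, b=275 - 91, c=57, d=285 - 57))   # approx. (2.0, 1.3, 2.9)
```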

    An Expanded Multi-scale Monte Carlo Simulation Method for Personalized Radiobiological Effect Estimation in Radiotherapy: a feasibility study

    A novel and versatile “bottom-up” approach is developed to estimate the radiobiological effect of clinical radiotherapy. The model consists of multi-scale Monte Carlo simulations from the organ to the cell level. At the cellular level, accumulated damages are computed using a spectrum-based accumulation algorithm and a predefined cellular damage database. The damage repair mechanism is modeled by an expanded reaction-rate two-lesion kinetic model, which was calibrated by replicating a radiobiological experiment. Multi-scale modeling is then performed on a lung cancer patient under conventional fractionated irradiation. The cell killing effects of two representative voxels (the isocenter and a peripheral voxel of the tumor) are computed and compared. At the microscopic level, the nucleus dose and damage yields vary among the nuclei within each voxel. A slightly larger percentage of cDSB yield is observed for the peripheral voxel (55.0%) compared to the isocenter voxel (52.5%). For the isocenter voxel, survival fraction increases monotonically as the oxygen level is reduced. Under an extreme anoxic condition (0.001%), the survival fraction is calculated to be 80% and the hypoxia reduction factor reaches a maximum value of 2.24. In conclusion, with biologically related variations, the proposed multi-scale approach is more versatile than existing approaches for evaluating personalized radiobiological effects in radiotherapy.

    Early changes in bone mineral density measured by digital X-ray radiogrammetry predict up to 20 years radiological outcome in rheumatoid arthritis

    ABSTRACT: INTRODUCTION: Change in bone mineral density (BMD) in the hand, as evaluated by digital X-ray radiogrammetry (DXR) of the II-IV metacarpal bones, has been suggested to predict future joint damage in rheumatoid arthritis (RA). This study's objective was to investigate whether DXR-BMD loss early in the disease predicts the development of joint damage in RA patients followed for up to 20 years. METHODS: 183 patients (115 women and 68 men) with early RA (mean disease duration 11 months), included from 1985 to 1989, were followed prospectively (the Lund early RA cohort). Clinical and functional measures were assessed yearly. Joint damage was evaluated according to the Larsen score on radiographs of hands and feet taken in years 0 to 5, 10, 15 and 20. These radiographs were digitized and BMD of the II-IV metacarpal bones was evaluated by DXR (Sectra, Linköping, Sweden). The early DXR-BMD change rate (bone loss) per year, calculated from the first two radiographs taken on average 9 months apart (SD 4.8), was available for 135 patients. Mean values of the right and left hand were used. RESULTS: The calculated mean early DXR-BMD loss during the first year was -0.023 g/cm2 (SD 0.025). Patients with marked bone loss, i.e. early DXR-BMD loss above the median for the group, had significantly worse progression of joint damage at all examinations during the 20-year period. CONCLUSIONS: The early DXR-BMD progression rate predicted the development of joint damage, evaluated according to the Larsen score, from year one onwards up to 20 years in this cohort of early RA patients.
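
The early change rate described above is simply the BMD difference between the first two radiographs annualised over the interval between them; a minimal sketch with assumed toy values (not patient data):

```python
# Sketch only: annualise the DXR-BMD change from two hand radiographs.
def annual_dxr_bmd_change(bmd_first: float, bmd_second: float, months_between: float) -> float:
    """Return the BMD change rate in g/cm^2 per year (negative = bone loss)."""
    return (bmd_second - bmd_first) / (months_between / 12.0)

# e.g. radiographs 9 months apart (the cohort mean interval reported above),
# using the mean of right and left hands as in the study
rate = annual_dxr_bmd_change(bmd_first=0.580, bmd_second=0.563, months_between=9.0)
print(round(rate, 3))   # -0.023 g/cm^2 per year, matching the reported mean loss
```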