73 research outputs found

    Assessment of neuromuscular and haemodynamic activity in individuals with and without chronic low back pain

    BACKGROUND: Biering-Sørensen (1984) found that individuals with less lumbar extensor muscle endurance had an increased occurrence of first-episode low back pain. As a result, back endurance tests have been recommended for inclusion in health assessment protocols. However, different studies have reported markedly different endurance times, leading some researchers to believe that the back is receiving support from the biceps femoris and gluteus maximus. Therefore, this study was designed to examine the haemodynamic and neuromuscular activity of the erector spinae, biceps femoris, and gluteus maximus musculature during the Biering-Sørensen Muscular Endurance Test (BSME). METHODS: Seventeen healthy individuals and 46 individuals with chronic low back pain performed the BSME while surface electromyography was used to quantify neuromuscular activity. Disposable silver-silver chloride electrodes were placed in a bipolar arrangement over the right or left biceps femoris, gluteus maximus, and the lumbosacral paraspinal muscles at the level of L3. Near-infrared spectroscopy was used simultaneously to measure tissue oxygenation and blood volume changes of the erector spinae and biceps femoris. RESULTS: The healthy group displayed a significantly longer time to fatigue (healthy: 168.5 s, LBP: 111.1 s; p ≤ 0.05). Significant differences were shown in the median frequency slope of the erector spinae between the two groups at 90–100% of the time to fatigue, while no significant differences were noted in the haemodynamic data for the two groups. CONCLUSION: Although the BSME has been recognized as a test for back endurance, individuals with chronic LBP appear to incorporate a strategy that may help support the back musculature by utilizing the biceps femoris and gluteus maximus to a greater degree than their healthy counterparts.
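    The median frequency slope reported in the RESULTS is a standard surface-EMG fatigue index: as a muscle fatigues, the median frequency of the signal's power spectrum drifts downward, and the slope of that drift over time is compared between groups. A minimal sketch of computing the median frequency for one EMG window (the toy sinusoid and sampling rate below are illustrative assumptions, not data from the study):

```python
import numpy as np

# Median frequency: the frequency that splits the power spectrum of a
# signal window into two halves of equal cumulative power.
def median_frequency(signal, fs):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)     # frequency bins (Hz)
    cumulative = np.cumsum(spectrum)
    idx = np.searchsorted(cumulative, cumulative[-1] / 2.0)
    return freqs[idx]

# Toy check: a pure 50 Hz sinusoid sampled at 1000 Hz has its median
# frequency at (approximately) 50 Hz.
fs = 1000
t = np.arange(0, 1, 1.0 / fs)
print(median_frequency(np.sin(2 * np.pi * 50 * t), fs))
```

    In practice the recording would be split into successive windows, the median frequency computed for each, and a line fitted through those values to obtain the slope.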

    Muscle oxygenation trends after tapering in trained cyclists

    BACKGROUND: This study examined muscle deoxygenation trends before and after a 7-day taper using non-invasive near-infrared spectroscopy (NIRS). METHODS: Eleven cyclists performed an incremental cycle ergometer test to determine maximal oxygen consumption (VO2max = 4.68 ± 0.57 L·min−1) prior to the study, and then completed one of three high-intensity (85–90% VO2max) taper protocols after being randomly assigned to a taper group: T30 (n = 5), T50 (n = 5), or T80 (n = 5) [30%, 50%, and 80% reductions in training volume, respectively]. Physiological measurements were recorded during simulated 20 km time trials (20TT) performed on a set of wind-loaded rollers. RESULTS AND DISCUSSION: The results showed that the physiological variables of oxygen consumption (VO2), carbon dioxide production (VCO2), and heart rate (HR) were not significantly different after tapering, except for a decreased ventilatory equivalent for oxygen (VE/VO2) in T50 (p ≤ 0.05). However, during the 20TT, muscle deoxygenation measured continuously in the vastus medialis was significantly lower (−749 ± 324 vs. −1140 ± 465 mV) in T50 after tapering, which was concomitant with a 4.53% improvement (p = 0.057) in 20TT performance time and a 0.18 L·min−1 (4.5%) increase in VO2. Furthermore, when changes in performance time and tissue deoxygenation (post- minus pre-taper) were plotted (n = 11), a moderately high correlation was found (r = 0.82). CONCLUSION: It was concluded that changes in simulated 20TT performance appeared to be related, in part, to changes in muscle deoxygenation following tapering, and that NIRS can be used effectively to monitor muscle deoxygenation during a taper period.

    Daily survey participation and positive changes in mental health symptom scores among Royal Canadian Mounted Police Cadets

    Introduction: Royal Canadian Mounted Police (RCMP) officers self-report high levels of mental health disorder symptoms, such as alcohol use disorder, generalized anxiety disorder, major depressive disorder, panic disorder, and posttraumatic stress disorder. Participation in regular mental health monitoring has been associated with improved mental health disorder symptom reporting and may provide an accessible tool to support RCMP mental health. The current study assessed relationships between self-reported mental health disorder symptoms and the completion of daily surveys (i.e., daily mental health disorder symptom monitoring) by RCMP cadets during the Cadet Training Program (CTP). Methods: Participants were RCMP cadets (n = 394; 76.1% men) in the Standard Training Program who completed the 26-week CTP and daily self-monitoring surveys, as well as full mental health assessments at pre-training (i.e., starting the CTP) and pre-deployment (i.e., ~2 weeks prior to deployment to the field). Symptoms of alcohol use disorder, generalized anxiety disorder, major depressive disorder, panic disorder, and posttraumatic stress disorder were assessed. Changes in mental health disorder symptom reporting from pre-training to pre-deployment were calculated. Spearman’s rank correlations were estimated for the number of daily surveys completed and the change in mental health disorder symptom scores between pre-training and pre-deployment. Results: There were statistically significant inverse relationships between the number of daily surveys completed and the number of mental health disorder symptoms reported; specifically, cadets who completed more daily surveys during the CTP reported fewer symptoms of alcohol use disorder, generalized anxiety disorder, major depressive disorder, panic disorder, and posttraumatic stress disorder. Conclusion: An inverse correlation between the number of daily surveys completed and mental health disorder symptom scores indicated that participation in daily mental health monitoring was associated with improvements in self-reported mental health disorder symptoms between pre-training and pre-deployment. Regular self-monitoring of mental health disorder symptoms may help to mitigate mental health challenges among RCMP cadets and officers.
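    Spearman’s rank correlation, the statistic used above, depends only on the rank ordering of the two variables, not their magnitudes. A self-contained sketch using the no-ties formula (the survey counts and symptom changes below are invented for illustration, not the study's data):

```python
# Spearman's rank correlation via the classic formula
# rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), valid when there are no ties.
def rank(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    for r, i in enumerate(order):
        ranks[i] = r + 1.0
    return ranks

def spearman(x, y):
    n = len(x)
    rx, ry = rank(x), rank(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

surveys = [10, 40, 80, 120, 160]          # hypothetical surveys completed per cadet
symptom_change = [5, 2, 1, -1, -3]        # hypothetical change in symptom score
print(spearman(surveys, symptom_change))  # -1.0: perfect inverse rank order
```

    Real analyses (e.g., `scipy.stats.spearmanr`) also handle tied ranks, which this minimal version does not.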

    The genetics and neuropathology of frontotemporal lobar degeneration

    Frontotemporal lobar degeneration (FTLD) is a heterogeneous group of disorders characterized by disturbances of behavior and personality and different types of language impairment, with or without concomitant features of motor neuron disease or parkinsonism. FTLD is characterized by atrophy of the frontal and anterior temporal brain lobes. Detailed neuropathological studies have elicited proteinopathies defined by inclusions of hyperphosphorylated microtubule-associated protein tau, TAR DNA-binding protein TDP-43, fused-in-sarcoma, or as yet unidentified proteins in affected brain regions. Rather than the type of proteinopathy, the site of neurodegeneration correlates relatively well with the clinical presentation of FTLD. Molecular genetic studies identified five disease genes, of which the gene encoding the tau protein (MAPT), the growth factor precursor gene granulin (GRN), and C9orf72, whose function is unknown, are most frequently mutated. Rare mutations were also identified in the genes encoding valosin-containing protein (VCP) and charged multivesicular body protein 2B (CHMP2B). These genes are good markers to distinguish underlying neuropathological phenotypes. Due to the complex landscape of FTLD diseases, combined characterization of clinical, imaging, biological, and genetic biomarkers is essential to establish a detailed diagnosis. Although major progress has been made in FTLD research in recent years, further studies are needed to completely map out and correlate the clinical, pathological, and genetic entities, and to understand the underlying disease mechanisms. In this review, we summarize the current state of the rapidly progressing field of genetic, neuropathological, and clinical research of this intriguing condition.

    Dose prediction for repurposing nitazoxanide in SARS-CoV-2 treatment or chemoprophylaxis

    Background: Severe acute respiratory syndrome coronavirus 2 (SARS‐CoV‐2) has been declared a global pandemic, and urgent treatment and prevention strategies are needed. Nitazoxanide, an anthelmintic drug, has been shown to exhibit in vitro activity against SARS‐CoV‐2. The present study used physiologically‐based pharmacokinetic (PBPK) modelling to inform optimal doses of nitazoxanide capable of maintaining plasma and lung tizoxanide exposures above the reported SARS‐CoV‐2 EC90. Methods: A whole‐body PBPK model was validated against available pharmacokinetic data for healthy individuals receiving single and multiple doses between 500 and 4000 mg with and without food. The validated model was used to predict doses expected to maintain tizoxanide plasma and lung concentrations above the EC90 in >90% of the simulated population. PopDes was used to estimate an optimal sparse sampling strategy for future clinical trials. Results: The PBPK model was successfully validated against the reported human pharmacokinetics. The model predicted optimal doses of 1200 mg QID, 1600 mg TID, and 2900 mg BID in the fasted state, and 700 mg QID, 900 mg TID, and 1400 mg BID when given with food. For BID regimens, an optimal sparse sampling strategy of 0.25, 1, 3, and 12 h post dose was estimated. Conclusion: The PBPK model predicted tizoxanide concentrations within doses of nitazoxanide already given to humans previously. The reported dosing strategies provide a rational basis for the design of clinical trials with nitazoxanide for the treatment or prevention of SARS‐CoV‐2 infection. A concordant higher dose of nitazoxanide is now planned for investigation in the seamless phase I/IIa AGILE trial (www.agiletrial.net).
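    The core question the PBPK model answers — whether a dosing regimen keeps drug concentrations above the EC90 — can be illustrated with a far simpler one-compartment oral model with dose superposition. Every parameter value below (bioavailability F, dose D, rate constants ka and ke, volume V, and the threshold) is a made-up illustration, not a nitazoxanide/tizoxanide parameter from the study:

```python
import math

# Illustrative one-compartment oral PK model. Each dose contributes a
# Bateman-type curve; total concentration is the superposition of all
# doses given so far. F: bioavailability, D: dose (mg), ka/ke:
# absorption/elimination rate constants (1/h), V: distribution volume (L).
def concentration(t, dose_times, F=0.8, D=100.0, ka=1.0, ke=0.1, V=50.0):
    c = 0.0
    for td in dose_times:
        if t >= td:
            dt = t - td
            c += (F * D * ka) / (V * (ka - ke)) * (math.exp(-ke * dt) - math.exp(-ka * dt))
    return c

bid = [0, 12, 24, 36]              # hypothetical BID schedule over two days
trough = concentration(48.0, bid)  # concentration just before the next dose
target = 0.5                       # hypothetical EC90 threshold (mg/L)
print(trough > target)             # does the regimen hold the trough above target?
```

    A PBPK model replaces this single compartment with physiologically realistic organ compartments (including lung) and simulates variability across a population, but the go/no-go logic against the EC90 threshold is the same.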

    Accelerated surgery versus standard care in hip fracture (HIP ATTACK): an international, randomised, controlled trial


    Mortality and pulmonary complications in patients undergoing surgery with perioperative SARS-CoV-2 infection: an international cohort study

    Background: The impact of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) on postoperative recovery needs to be understood to inform clinical decision making during and after the COVID-19 pandemic. This study reports 30-day mortality and pulmonary complication rates in patients with perioperative SARS-CoV-2 infection. Methods: This international, multicentre, cohort study at 235 hospitals in 24 countries included all patients undergoing surgery who had SARS-CoV-2 infection confirmed within 7 days before or 30 days after surgery. The primary outcome measure was 30-day postoperative mortality and was assessed in all enrolled patients. The main secondary outcome measure was pulmonary complications, defined as pneumonia, acute respiratory distress syndrome, or unexpected postoperative ventilation. Findings: This analysis includes 1128 patients who had surgery between Jan 1 and March 31, 2020, of whom 835 (74·0%) had emergency surgery and 280 (24·8%) had elective surgery. SARS-CoV-2 infection was confirmed preoperatively in 294 (26·1%) patients. 30-day mortality was 23·8% (268 of 1128). Pulmonary complications occurred in 577 (51·2%) of 1128 patients; 30-day mortality in these patients was 38·0% (219 of 577), accounting for 81·7% (219 of 268) of all deaths. In adjusted analyses, 30-day mortality was associated with male sex (odds ratio 1·75 [95% CI 1·28–2·40], p<0·0001), age 70 years or older versus younger than 70 years (2·30 [1·65–3·22], p<0·0001), American Society of Anesthesiologists grades 3–5 versus grades 1–2 (2·35 [1·57–3·53], p<0·0001), malignant versus benign or obstetric diagnosis (1·55 [1·01–2·39], p=0·046), emergency versus elective surgery (1·67 [1·06–2·63], p=0·026), and major versus minor surgery (1·52 [1·01–2·31], p=0·047). Interpretation: Postoperative pulmonary complications occur in half of patients with perioperative SARS-CoV-2 infection and are associated with high mortality. Thresholds for surgery during the COVID-19 pandemic should be higher than during normal practice, particularly in men aged 70 years and older. Consideration should be given to postponing non-urgent procedures and promoting non-operative treatment to delay or avoid the need for surgery. Funding: National Institute for Health Research (NIHR), Association of Coloproctology of Great Britain and Ireland, Bowel and Cancer Research, Bowel Disease Research Foundation, Association of Upper Gastrointestinal Surgeons, British Association of Surgical Oncology, British Gynaecological Cancer Society, European Society of Coloproctology, NIHR Academy, Sarcoma UK, Vascular Society for Great Britain and Ireland, and Yorkshire Cancer Research.

    Polygenic risk score in postmortem diagnosed sporadic early-onset Alzheimer’s disease

    Sporadic early onset Alzheimer’s disease (sEOAD) exhibits the symptoms of late onset Alzheimer’s disease (LOAD) but lacks the familial aspect of the early onset familial form. The genetics of Alzheimer’s disease (AD) identifies APOE ε4 as the greatest risk factor; however, AD is a complex disease involving both environmental risk factors and multiple genetic loci. Polygenic risk scores (PRS) accumulate the total risk of a phenotype in an individual based on variants present in their genome. We determined whether sEOAD cases had a higher PRS compared to controls. A cohort of sEOAD cases was genotyped on the NeuroX array and PRS were generated using PRSice. The target dataset consisted of 408 sEOAD cases and 436 controls. The base dataset was collated by the IGAP consortium, with association data from 17,008 LOAD cases and 37,154 controls, which can be used for identifying sEOAD cases because the two forms share a phenotype. PRS were generated using all common SNPs between the base and target datasets; PRS were also generated using only SNPs within a 500 kb region surrounding the APOE gene. Sex and the number of APOE ε2 or ε4 alleles were used as variables for logistic regression and combined with PRS. The results show that PRS is higher on average in sEOAD cases than in controls, although there is still overlap across the whole cohort. The predictive ability of PRSice to distinguish cases from controls was 72.9%, greater than that of the APOE locus alone (65.2%). Predictive ability was further improved with logistic regression, identifying cases and controls with 75.5% accuracy.
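    A polygenic risk score, as described above, is in essence a weighted sum: each variant's allele dosage multiplied by its effect size from the base GWAS. A minimal sketch of that core computation (the SNP dosages and effect sizes are invented for illustration; PRSice additionally performs clumping and p-value thresholding before summing):

```python
import numpy as np

# Hypothetical PRS: dot product of allele dosages (0/1/2 risk alleles per
# SNP) with per-SNP effect sizes (log odds ratios) from a base GWAS.
def polygenic_risk_score(dosages, effect_sizes):
    return float(np.dot(np.asarray(dosages, float), np.asarray(effect_sizes, float)))

# Toy individual with three SNPs.
score = polygenic_risk_score([2, 1, 0], [0.12, -0.05, 0.30])
print(score)  # 2*0.12 + 1*(-0.05) + 0*0.30 ≈ 0.19
```

    The abstract's final step — combining PRS with sex and APOE ε2/ε4 allele counts in a logistic regression — would then use this score as one predictor alongside those covariates.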

    A Comparison of Neuroimaging Abnormalities in Multiple Sclerosis, Major Depression and Chronic Fatigue Syndrome (Myalgic Encephalomyelitis): is There a Common Cause?
