
    Application of the speed-duration relationship to normalize the intensity of high-intensity interval training

    The tolerable duration of continuous high-intensity exercise is determined by the hyperbolic speed-tolerable duration (S-tLIM) relationship. However, application of the S-tLIM relationship to normalize the intensity of high-intensity interval training (HIIT) has yet to be considered, and this was the aim of the present study. Subjects completed a ramp-incremental test and a series of 4 constant-speed tests to determine the S-tLIM relationship. A sub-group of subjects (n = 8) then repeated 4 min bouts of exercise at the speeds predicted to induce intolerance at 4 min (WR4), 6 min (WR6) and 8 min (WR8), interspersed with 4 min recovery bouts, to the point of exercise intolerance (fixed WR HIIT) on different days, with the aim of establishing the work rate that could be sustained for 960 s (i.e. 4×4 min). A sub-group of subjects (n = 6) also completed 4 bouts of exercise interspersed with 4 min recovery, with each bout continued to the point of exercise intolerance (maximal HIIT), to determine the appropriate protocol for maximizing the amount of high-intensity work that can be completed during 4×4 min HIIT. For fixed WR HIIT, the tLIM of the HIIT sessions was 399±81 s for WR4, 892±181 s for WR6 and 1517±346 s for WR8, with total exercise durations all significantly different from each other (P<0.050). For maximal HIIT, there was no difference in the tLIM of the 4 bouts (Bout 1: 229±27 s; Bout 2: 262±37 s; Bout 3: 235±49 s; Bout 4: 235±53 s; P>0.050). However, significantly less high-intensity work was completed during bouts 2 (153.5±40.9 m), 3 (136.9±38.9 m) and 4 (136.7±39.3 m) than during bout 1 (264.9±58.7 m; P<0.050). These data establish that WR6 provides the appropriate work rate to normalize the intensity of HIIT between subjects. Maximal HIIT provides a protocol that allows the relative contribution of the work rate profile to physiological adaptations to be assessed across alternative intensity-matched HIIT protocols.
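
    The analysis above rests on the hyperbolic speed-tolerable duration model, tLIM = D'/(S - CS), which rearranges to S = CS + D'/tLIM to give the speed predicted to induce intolerance at a target duration (the WR4, WR6 and WR8 speeds). The short Python sketch below illustrates that rearrangement; the CS and D' values are assumed for illustration only and are not taken from the study.

```python
# Minimal sketch of the hyperbolic speed-tolerable duration (S-tLIM) model:
#   tLIM = D' / (S - CS)
# where CS is the critical speed (m/s) and D' is the finite distance (m)
# available above CS. Rearranging gives the speed predicted to cause
# intolerance at a target duration: S = CS + D' / tLIM.
# The CS and D' values below are illustrative, not data from the study.

CS = 3.6         # critical speed, m/s (assumed)
D_PRIME = 180.0  # curvature constant, m (assumed)

def speed_for_duration(t_lim_s: float) -> float:
    """Speed predicted to induce intolerance after t_lim_s seconds."""
    return CS + D_PRIME / t_lim_s

for label, minutes in (("WR4", 4), ("WR6", 6), ("WR8", 8)):
    print(f"{label}: {speed_for_duration(minutes * 60):.2f} m/s")
```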

    Impact of adiposity on cardiac structure in adult life: the Childhood Determinants of Adult Health (CDAH) study.

    BACKGROUND: We have examined the association between adiposity and cardiac structure in adulthood, using a life course approach that takes account of the contribution of adiposity in both childhood and adulthood. METHODS: The Childhood Determinants of Adult Health (CDAH) study is a follow-up study of 8,498 children who participated in the 1985 Australian Schools Health and Fitness Survey (ASHFS). The CDAH follow-up study included 2,410 participants who attended a clinic examination; of these, 181 underwent cardiac imaging and provided complete data. Measures were taken once in childhood, when participants were aged 9 to 15 years, and once in adult life, at ages 26 to 36 years. RESULTS: There was a positive association between adult left ventricular mass (LVM) and childhood body mass index (BMI) in males (regression coefficient (β) 0.41; 95% confidence interval (CI): 0.14 to 0.67; p = 0.003) and females (β = 0.53; 95% CI: 0.34 to 0.72; p < 0.001), and with change in BMI from childhood to adulthood (males: β = 0.27; 95% CI: 0.04 to 0.51; p < 0.001; females: β = 0.39; 95% CI: 0.20 to 0.58; p < 0.001), after adjustment for confounding factors (age, fitness, triglyceride levels and total cholesterol in adulthood). After further adjustment for known potential mediating factors (systolic BP and fasting plasma glucose in adulthood), the relationships of LVM with childhood BMI (males: β = 0.45; 95% CI: 0.19 to 0.71; p = 0.001; females: β = 0.49; 95% CI: 0.29 to 0.68; p < 0.001) and with change in BMI (males: β = 0.26; 95% CI: 0.04 to 0.49; p = 0.02; females: β = 0.40; 95% CI: 0.20 to 0.59; p < 0.001) did not change markedly. CONCLUSIONS: Adiposity and increased adiposity from childhood to adulthood appear to have a detrimental effect on cardiac structure.
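
    The adjusted associations reported above come from multivariable linear regression of adult LVM on childhood BMI, with confounders entered as covariates. The sketch below shows that kind of adjusted fit on synthetic data using statsmodels; all column names and values are illustrative assumptions and do not reproduce the CDAH data or coefficients.

```python
# Sketch of a confounder-adjusted linear regression of adult LVM on
# childhood BMI. The synthetic data and column names are illustrative
# assumptions; they do not reproduce the study's results.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "child_bmi": rng.normal(18, 2.5, n),
    "age": rng.uniform(26, 36, n),
    "fitness": rng.normal(0, 1, n),
    "triglycerides": rng.normal(1.2, 0.4, n),
    "cholesterol": rng.normal(5.0, 0.8, n),
})
# Synthetic outcome: LVM loosely driven by childhood BMI plus noise.
df["lvm"] = 100 + 2.0 * df["child_bmi"] + rng.normal(0, 15, n)

model = smf.ols(
    "lvm ~ child_bmi + age + fitness + triglycerides + cholesterol", data=df
).fit()
print(model.params["child_bmi"], model.conf_int().loc["child_bmi"].tolist())
```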

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall’s tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6 to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of multiple grades to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
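
    The abstract describes rank-correlation and discrimination analyses of outcomes against the Nassar grade (Kendall's tau, AUROC). The sketch below reproduces those two calculations on synthetic data with scipy and scikit-learn; the variable names and event rates are assumptions for illustration only, not the CholeS or reference cohorts.

```python
# Kendall's tau between operative difficulty grade and a binary outcome,
# and AUROC of the grade for predicting conversion to open surgery.
# Data are synthetic; names and rates are illustrative assumptions.
import numpy as np
from scipy.stats import kendalltau
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 1000
grade = rng.integers(1, 6, n)              # Nassar grade 1-5
# Synthetic outcome: conversion to open surgery more likely at higher grades.
p_convert = 0.01 + 0.05 * (grade - 1)
converted = rng.random(n) < p_convert

tau, p_value = kendalltau(grade, converted)
auroc = roc_auc_score(converted, grade)
print(f"Kendall's tau = {tau:.2f} (p = {p_value:.3g}), AUROC = {auroc:.2f}")
```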

    Modeling of Human Prokineticin Receptors: Interactions with Novel Small-Molecule Binders and Potential Off-Target Drugs

    The Prokineticin receptor (PKR) 1 and 2 subtypes are novel members of family A GPCRs, which exhibit an unusually high degree of sequence similarity. Prokineticins (PKs), their cognate ligands, are small secreted proteins of ∼80 amino acids; however, non-peptidic low-molecular-weight antagonists have also been identified. PKs and their receptors play important roles under various physiological conditions, such as maintaining circadian rhythm and pain perception, as well as regulating angiogenesis and modulating immunity. Identifying binding sites for known antagonists and for additional potential binders will facilitate studying and regulating these novel receptors. Blocking PKRs may serve as a therapeutic tool for various diseases, including acute pain, inflammation and cancer. Ligand-based pharmacophore models were derived from known antagonists, and virtual screening performed on the DrugBank dataset identified potential human PKR (hPKR) ligands with novel scaffolds. Interestingly, these included several HIV protease inhibitors for which endothelial cell dysfunction is a documented side effect. Our results suggest that the side effects might be due to inhibition of the PKR signaling pathway. Docking of known binders to a 3D homology model of hPKR1 is in agreement with the well-established canonical TM-bundle binding site of family A GPCRs. Furthermore, the docking results highlight residues that may form specific contacts with the ligands. These contacts provide a structural explanation for the importance of several chemical features that were obtained from the structure-activity analysis of known binders. With the exception of a single loop residue, which might be pursued in the future for obtaining subtype-specific regulation, the results suggest an identical TM-bundle binding site for hPKR1 and hPKR2. In addition, analysis of the intracellular regions highlights variable regions that may provide subtype specificity.
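
    The candidate ligands above were identified by ligand-based pharmacophore screening of DrugBank followed by docking. As a much simpler stand-in for that workflow, the sketch below ranks a few molecules by Morgan-fingerprint Tanimoto similarity to a reference compound using RDKit; the SMILES strings are generic placeholders, not the actual PKR antagonists or DrugBank hits.

```python
# Fingerprint-similarity ranking as a simplified stand-in for ligand-based
# virtual screening (RDKit assumed available). SMILES are placeholders.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

reference_smiles = "CC(=O)Nc1ccc(O)cc1"    # placeholder "known binder"
candidates = {
    "candidate_A": "CC(=O)Nc1ccc(OC)cc1",
    "candidate_B": "c1ccccc1",
    "candidate_C": "CCN(CC)C(=O)c1ccc(O)cc1",
}

def fingerprint(smiles):
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

ref_fp = fingerprint(reference_smiles)
scores = {name: DataStructs.TanimotoSimilarity(ref_fp, fingerprint(smi))
          for name, smi in candidates.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: Tanimoto = {score:.2f}")
```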

    A population-specific material model for sagittal craniosynostosis to predict surgical shape outcomes

    Sagittal craniosynostosis consists of premature fusion (ossification) of the sagittal suture during infancy, resulting in head deformity and brain growth restriction. Spring-assisted cranioplasty (SAC) entails skull incisions to free the fused suture and insertion of two springs (metallic distractors) to promote cranial reshaping. Although SAC is safe and effective, its outcomes remain uncertain. Here we aimed to obtain and validate a skull material model for SAC outcome prediction. Computed tomography data from 18 patients were processed to simulate surgical cuts and spring location. A rescaling model for age matching was created using retrospective data and validated. Design of experiments was used to assess the effect of different material property parameters on the model output. Subsequent material optimization, using retrospective clinical spring measurements, was performed for nine patients. A population-derived material model was obtained and applied to the whole population. Results showed that bone Young's modulus and relaxation modulus had the largest effect on the model predictions: use of the population-derived material model had a negligible effect on the prediction of on-table opening, but significantly improved the prediction of spring kinematics at follow-up. The model was validated using on-table 3D scans for nine patients: the predicted head shape was within 2 mm of the 3D scan at 80% of the surface points in 8 out of 9 patients. The accuracy and reliability of the developed computational model of SAC were increased using population data: this tool is now ready for prospective clinical application.
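
    The material optimization step above calibrates skull material parameters (for example the Young's modulus and relaxation modulus that dominated the model output) against retrospective spring measurements. The sketch below illustrates the calibration idea only: it fits a one-term standard linear solid relaxation curve to mock spring-opening data with scipy, and is not the finite-element model used in the study.

```python
# Fit a simple relaxation curve E(t) = E_inf + (E0 - E_inf) * exp(-t / tau)
# to mock "spring opening" measurements, as an illustration of calibrating
# material parameters against clinical data. All values are assumed.
import numpy as np
from scipy.optimize import least_squares

t_days = np.array([0.0, 1.0, 7.0, 30.0, 90.0])         # time after surgery
measured = np.array([1.00, 0.82, 0.61, 0.47, 0.42])    # normalized stiffness

def relaxation(params, t):
    e0, e_inf, tau = params
    return e_inf + (e0 - e_inf) * np.exp(-t / tau)

def residuals(params):
    return relaxation(params, t_days) - measured

fit = least_squares(residuals, x0=[1.0, 0.4, 10.0],
                    bounds=([0.0, 0.0, 0.1], [2.0, 1.0, 365.0]))
print("E0, E_inf, tau =", fit.x)
```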

    The Molecular Diversity of Freshwater Picoeukaryotes Reveals High Occurrence of Putative Parasitoids in the Plankton

    Eukaryotic microorganisms have been undersampled in biodiversity studies of freshwater environments. We present an original 18S rDNA survey of freshwater picoeukaryotes sampled during spring/summer 2005, complementing an earlier study conducted in autumn 2004 in Lake Pavin (France). These studies were designed to detect the small unidentified heterotrophic flagellates (HF, 0.6–5 µm) that are considered the main bacterivores in aquatic systems. Alveolates, Fungi and Stramenopiles represented 65% of the total diversity and differed from the dominant groups known from microscopic studies. Fungi and Telonemia taxa were restricted to the oxic zone, which displayed twofold more operational taxonomic units (OTUs) than the oxycline. Temporal forcing also appeared to drive diversification within the targeted organisms. Several sequences were not similar to those in databases and were considered new or unsampled taxa, some of which may be typical of freshwater environments. Two taxa known from marine systems, the genera Telonema and Amoebophrya, were retrieved for the first time in our freshwater study. The analysis of potential trophic strategies among the targeted HF highlighted the dominance of parasites and saprotrophs, and provided indications that these organisms have probably been wrongly regarded as bacterivores in previous studies. A theoretical exercise based on a new ‘parasite/saprotroph-dominated HF hypothesis’ demonstrates that the inclusion of parasites and saprotrophs may increase the functional role of the microbial loop as a link for carbon flows in pelagic ecosystems. This opens new and interesting perspectives in aquatic microbial ecology.

    Population‐based cohort study of outcomes following cholecystectomy for benign gallbladder diseases

    Background The aim was to describe the management of benign gallbladder disease and identify characteristics associated with all‐cause 30‐day readmissions and complications in a prospective population‐based cohort. Methods Data were collected on consecutive patients undergoing cholecystectomy in acute UK and Irish hospitals between 1 March and 1 May 2014. Potential explanatory variables influencing all‐cause 30‐day readmissions and complications were analysed by means of multilevel, multivariable logistic regression modelling using a two‐level hierarchical structure with patients (level 1) nested within hospitals (level 2). Results Data were collected on 8909 patients undergoing cholecystectomy from 167 hospitals. Some 1451 cholecystectomies (16·3 per cent) were performed as an emergency, 4165 (46·8 per cent) as elective operations, and 3293 patients (37·0 per cent) had had at least one previous emergency admission but had surgery on a delayed basis. The readmission and complication rates at 30 days were 7·1 per cent (633 of 8909) and 10·8 per cent (962 of 8909) respectively. Both readmissions and complications were independently associated with increasing ASA fitness grade, duration of surgery, and increasing numbers of emergency admissions with gallbladder disease before cholecystectomy. No identifiable hospital characteristics were linked to readmissions and complications. Conclusion Readmissions and complications following cholecystectomy are common and associated with patient and disease characteristics.
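
    The risk factors above were identified with multilevel multivariable logistic regression (patients nested within hospitals). The sketch below shows a simplified single-level analogue on synthetic data with statsmodels; the hospital-level random intercept used in the paper is omitted, and all column names and effect sizes are illustrative assumptions.

```python
# Single-level logistic regression for 30-day readmission on synthetic data;
# a simplified analogue of the paper's multilevel model (the hospital random
# intercept is omitted). Column names and effects are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "asa_grade": rng.integers(1, 5, n),
    "op_duration_min": rng.normal(70, 25, n),
    "prior_emergency_admissions": rng.poisson(0.5, n),
})
logit_p = -3.5 + 0.4 * df["asa_grade"] + 0.3 * df["prior_emergency_admissions"]
df["readmitted_30d"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit(
    "readmitted_30d ~ asa_grade + op_duration_min + prior_emergency_admissions",
    data=df,
).fit(disp=False)
print(model.summary().tables[1])
```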