
    A Cross-Sectional Study of the Associations between Chronotype, Social Jetlag and Subjective Sleep Quality in Healthy Adults

    Social jetlag, a mismatch between internal biological time and social schedules, and a later midpoint of sleep on work-free days, an indicator of the circadian phase of entrainment (late chronotype), may both be associated with poor sleep quality. This study examined the association of social jetlag and chronotype with subjective sleep quality ratings in a healthy young adult cohort and interrogated the moderating effects of sex and age on these associations. A total of 1322 participants aged 18 to 40 completed the Pittsburgh Sleep Quality Index (PSQI) and the Munich Chronotype Questionnaire. Later timing of midsleep on “free” days (an indicator of chronotype) had a small-to-medium association with poorer subjective sleep quality, independent of sex and age (rho = 0.212, P < 0.001). Greater social jetlag had a small association with poorer subjective sleep quality ratings (rho = 0.077), and this effect was moderated by sex, with a relationship between social jetlag and sleep quality present only in males. Social jetlag did not mediate the relationship between chronotype and sleep quality. These results indicate differential relationships of chronotype and social jetlag with subjective sleep quality and show that sex is a moderating factor for sleep quality’s relationship with social jetlag, but not for the association between sleep quality and chronotype.

    Prescription patterns and injury risk among children with attention deficit hyperactivity disorder in the UK

    The long-term effectiveness of pharmacotherapy for attention deficit hyperactivity disorder (ADHD) may be compromised by poor adherence to medication. Children with ADHD experience high rates of injuries, and stimulant medication use is hypothesized to decrease injury risk by reducing symptoms. Longitudinal population-based primary health care data were used to 1) describe the initial pharmacological treatment patterns among children with ADHD and independent predictors of persistence with initial ADHD treatment and 2) assess the association between stimulant medication (such as methylphenidate) and risk of injury among children with ADHD. Children diagnosed with ADHD at ages 1-18 between 1994 and 2008 were selected from The Health Improvement Network (THIN) database to form the overall study group (n=4234). Prescription patterns were described among 1314 children treated with medication. The association between child, clinical, and treatment factors and medication persistence (defined as initial treatment length > 6 months) was estimated using binomial regression. A self-controlled case series design was applied among 328 children who experienced an incident injury event and received at least 1 stimulant medication prescription. Incidence rate ratios (IRR) and 95% confidence intervals (CI) for injury, comparing periods of stimulant medication treatment with untreated periods, were estimated using conditional Poisson regression. Only 35.3% (n=464) of children were persistent with treatment. Children initially prescribed long-acting methylphenidate were more likely to persist with treatment than those prescribed standard methylphenidate (Risk Ratio = 1.2; 95% Confidence Interval (CI): 1.1, 1.4). Injury rates were lower during periods of stimulant medication use than during untreated periods (IRR = 0.70; 95% CI: 0.52, 0.93). The association was clearly apparent for males and did not decline with increasing time on treatment.
Conclusions: The majority of children prescribed medication for ADHD do not continue on initial treatment for more than 6 months. Evaluation of the effects of ADHD medication in both clinical and research settings should take into account the observed poor persistence with pharmacological treatment. Periods of stimulant medication use were associated with a decreased risk of injury among children treated for ADHD. Injury risk should be considered in decision-making about stimulant medication use among children with ADHD who have been previously treated.

    Effects of societal-level COVID-19 mitigation measures on the timing and quality of sleep in Ireland

    Objectives: Under usual circumstances, sleep timing is strongly influenced by societal imperatives. The sweeping whole-of-society measures introduced in response to the COVID-19 pandemic may represent a unique opportunity to examine the impact of large-scale changes in work practices on sleep timing. As such, we examined the impact of the travel restrictions and work-from-home orders imposed in Ireland in March 2020 on sleep timing and quality. Methods: We utilized a cross-sectional survey, deployed shortly after the imposition of restrictions, which assessed current and retrospective ratings of sleep timing and quality; the final response set analysed was from 797 adults. Participants completed the ultra-short Munich Chronotype Questionnaire and the Pittsburgh Sleep Quality Index, and answered questions pertaining to work status, such as working from home during the period of restrictions. Results and conclusion: There was a significant shift to later sleep start and end times, as well as a delayed time of midsleep on both work and free days, during the period of restrictions. Sleep duration was longer on work days, while free-day sleep duration was shorter, and there was a reduction in social jetlag during the restrictions. Those who worked from home during restrictions had longer sleep duration on work days and a significantly larger delay in sleep end times on work days than “essential” workers who continued to attend their normal place of work.

    A Data-Informed Perspective on Public Preferences for Retaining or Abolishing Biannual Clock Changes

    Scientific, public, and political discourse around the perennial changing of the clocks during the transitions into and out of daylight saving time (DST) is a touchstone issue for the translation of fundamental chronobiology into societal impacts. The Society for Research on Biological Rhythms, along with other sleep science bodies, has issued a position statement that advocates for the abolition of the biannual clock changes and the adoption of permanent standard time for the optimization of population circadian health. However, there is a paucity of data on preexisting public perceptions and preferences with regard to these issues. In this perspective, we examine 5 issues that we believe are pertinent for chronobiologists to consider to enable effective advocacy on these policies; in particular, we discuss public preference for permanent DST and steps that may need to be taken to understand this preference. We inform our discussion with reference to cross-sectional studies we undertook in Spring 2020 and Fall 2019, around the transitions out of and into DST in Ireland. We conclude that there appears to be a gap between existing public perceptions and preferences around the clock changes and chronobiological and sleep science-informed positions, and that the chronobiology community may benefit from interdisciplinary collaboration with colleagues with specific social sciences expertise to most effectively advocate for these research-informed positions.

    Prevalence and risk factors associated with self-reported carpal tunnel syndrome (CTS) among office workers in Kuwait.

    Background: The prevalence of carpal tunnel syndrome (CTS) is not well understood in many Arabian Peninsula countries. The objective of this study was to investigate the prevalence of, and factors associated with, self-reported CTS in Kuwait. Findings: A cross-sectional, self-administered survey of CTS-related symptoms was used in this study. Multivariate logistic regression was used to estimate adjusted odds ratios for factors of interest. Participants were adult office workers in Kuwait (n = 470, 55.6% males) who worked in companies employing more than 50 people. Self-reported CTS was present in 18.7% of the group (88/470). CTS was significantly associated with the following demographic factors: female gender, obesity, and number of comorbid conditions. Self-identification of CTS was also associated with key symptoms and impairment in daily activities (e.g., wrist pain, numbness, weakness, night pain, difficulty carrying bags, difficulty grasping [chi-square test for association: P < 0.05 for all symptoms/activities]). However, symptoms such as wrist pain, weakness, and functional disabilities were also frequently reported among those who did not self-report CTS (range: 12.1%–38.2%). Conclusions: The prevalence of self-reported CTS among office workers in Kuwait was 18.7%, and the risk factors for CTS in this population included female gender, obesity, and number of related comorbidities. The frequency of symptoms among those who did not self-report CTS suggests that CTS may be under-recognized; however, further research is required to assess the prevalence of clinically diagnosed CTS.

    Molecular mapping across three populations reveals a QTL hotspot region on chromosome 3 for secondary traits associated with drought tolerance in tropical maize

    Identifying quantitative trait loci (QTL) of sizeable effect that are expressed in diverse genetic backgrounds across contrasting water regimes, particularly for secondary traits, can significantly complement conventional drought tolerance breeding efforts. We evaluated three tropical maize biparental populations under water-stressed and well-watered regimes for drought-related morpho-physiological traits, such as anthesis-silking interval (ASI), ears per plant (EPP), stay-green (SG) and plant-to-ear height ratio (PEH). In general, drought stress reduced the genetic variance of grain yield (GY), while that of the morpho-physiological traits remained stable or even increased under drought conditions. We detected consistent genomic regions across different genetic backgrounds that could be target regions for marker-assisted introgression of drought tolerance in maize. A total of 203 QTL for ASI, EPP, SG and PEH were identified under the two water regimes. Meta-QTL analysis across the three populations identified six constitutive genomic regions with a minimum of two overlapping traits. Clusters of QTL were observed in chromosome bins 1.06, 3.06, 4.09, 5.05, 7.03 and 10.04/06. Interestingly, an ~8-Mb region delimited in bin 3.06 harboured QTL for most of the morpho-physiological traits considered in the current study. This region contained two important candidate genes, viz. zmm16 (MADS-domain transcription factor) and psbs1 (photosystem II subunit), which are responsible for reproductive organ development and photosynthate accumulation, respectively. The genomic regions identified in this study partially explain the association of the secondary traits with GY. The flanking single nucleotide polymorphism markers reported herein may be useful in marker-assisted introgression of drought tolerance in tropical maize.

    SOAT1: a suitable target for therapy in high-grade astrocytic glioma?

    Targeting molecular alterations as an effective treatment for isocitrate dehydrogenase-wildtype glioblastoma (GBM) patients has not yet been established. Sterol-O-Acyl Transferase 1 (SOAT1), a key enzyme in the conversion of endoplasmic reticulum cholesterol to esters for storage in lipid droplets (LD), serves as a target for the orphan drug mitotane to treat adrenocortical carcinoma. Inhibition of SOAT1 also suppresses GBM growth. Here, we refined the characterization of SOAT1 expression in GBM and IDH-mutant astrocytoma, CNS WHO grade 4 (HGA), and assessed the distribution of LD in these tumors. Twenty-seven GBM and three HGA specimens were evaluated by multiple GFAP, Iba1, IDH1 R132H, and SOAT1 immunofluorescence labeling as well as Oil Red O staining. SOAT1 was expressed by tumor cells in both tumor entities only to a small extent; in contrast, strong expression was observed in glioma-associated macrophages. Triple immunofluorescence labeling revealed, for the first time, evidence for SOAT1 colocalization with Iba1 and IDH1 R132H, respectively. Furthermore, a notable difference in the amount of LD between GBM and HGA was observed. Therefore, SOAT1 suppression might be a therapeutic option to target GBM and HGA growth and invasiveness. In addition, the high expression in cells related to neuroinflammation could be beneficial for a concomitant suppression of protumoral microglia/macrophages.

    Leveraging electronic health records for clinical research

    Electronic health records (EHRs) can be a major tool in the quest to decrease the costs and timelines of clinical trial research, generate better evidence for clinical decision making, and advance health care. Over the past decade, EHRs have increasingly offered opportunities to speed up, streamline, and enhance clinical research. EHRs offer a wide range of possible uses in clinical trials, including assisting with prestudy feasibility assessment, patient recruitment, and data capture in care delivery. To fully realize these opportunities, health care stakeholders must come together to face critical challenges in leveraging EHR data, including data quality and completeness, information security, stakeholder engagement, and increasing the scale of research infrastructure and related governance. Leaders from academia, government, industry, and professional societies representing patient, provider, researcher, industry, and regulator perspectives convened the Leveraging EHR for Clinical Research Now! Think Tank in Washington, DC (February 18-19, 2016), to identify barriers to using EHRs in clinical research and to generate potential solutions. Think tank members identified a broad range of issues surrounding the use of EHRs in research and proposed a variety of solutions. Recognizing the challenges, the participants identified an urgent need to look more deeply at previous efforts to use these data, share lessons learned, and develop a multidisciplinary agenda for best practices for using EHRs in clinical research. We report the proceedings of this think tank meeting in this paper.