Associations between Sleep Quality and Heart Rate Variability: Implications for a Biological Model of Stress Detection Using Wearable Technology.
INTRODUCTION: The autonomic nervous system plays a vital role in modulating many bodily functions, one of which is sleep and wakefulness. Many studies have investigated the link between autonomic dysfunction and sleep cycles; however, few have examined the links between autonomic functioning in healthy individuals and short-term sleep health as assessed by the Pittsburgh Sleep Quality Index (PSQI), which covers subjective sleep quality, sleep latency, sleep duration, habitual sleep efficiency, sleep disturbances, use of sleeping medication, and daytime dysfunction. AIM: This cross-sectional study aimed to investigate the links between short-term sleep quality and duration and heart rate variability (HRV) in 60 healthy individuals, in order to provide useful information about the effects of stress and sleep on HRV indices, which in turn could be integrated into biological models for wearable devices. METHODS: Sleep parameters were collected from participants at commencement of the study, and HRV was derived from an electrocardiogram (ECG) recorded during a resting phase and a stress task (Trier Stress Test). RESULTS: The low-frequency to high-frequency (LF:HF) ratio was significantly higher during the stress task than during the baseline resting phase, and very-low-frequency and high-frequency HRV were inversely related to impaired sleep during the stress task. CONCLUSION: Given the ubiquitous use of wearable technologies for monitoring health states, in particular HRV, it is important to consider the impact of sleep states when interpreting data from these technologies. Very-low-frequency HRV during the stress task was inversely related to three negative sleep indices: sleep quality, daytime dysfunction, and global sleep score.
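The LF:HF ratio used above is a standard frequency-domain HRV index: spectral power in the low-frequency band (0.04–0.15 Hz) of the RR-interval series divided by power in the high-frequency band (0.15–0.40 Hz). A minimal sketch of the computation, assuming an evenly resampled synthetic tachogram (this is an illustration of the index, not the authors' analysis pipeline):

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Sum periodogram power between f_lo and f_hi (Hz) via a plain DFT."""
    n = len(signal)
    mean = sum(signal) / n
    x = [v - mean for v in signal]  # remove the DC component
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f < f_hi:
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

def lf_hf_ratio(rr_ms, fs):
    """LF:HF ratio from an evenly resampled RR-interval series (ms)."""
    lf = band_power(rr_ms, fs, 0.04, 0.15)  # low-frequency band
    hf = band_power(rr_ms, fs, 0.15, 0.40)  # high-frequency band
    return lf / hf

# Synthetic tachogram resampled at 4 Hz: an 800 ms baseline with a strong
# 0.10 Hz (LF) oscillation and a weaker 0.25 Hz (HF) oscillation.
fs = 4.0
rr = [800
      + 50 * math.sin(2 * math.pi * 0.10 * i / fs)
      + 20 * math.sin(2 * math.pi * 0.25 * i / fs)
      for i in range(512)]
print(round(lf_hf_ratio(rr, fs), 2))  # well above 1: LF dominates
```

In practice the tachogram would come from detected R-peaks and the spectrum from a windowed estimator such as Welch's method; the resampling rate and band limits here follow common convention.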
Stress Watch: The Use of Heart Rate and Heart Rate Variability to Detect Stress: A Pilot Study Using Smart Watch Wearables.
Stress is an inherent part of the normal human experience. Although this stress response is, for the most part, advantageous, chronic, heightened, or inappropriate stress responses can have deleterious effects on the human body. It has been suggested that individuals who experience repeated or prolonged stress exhibit blunted biological stress responses compared to the general population. Thus, when assessing whether a ubiquitous stress response exists, it is important to stratify based on resting levels in the absence of stress. Research has shown that stress causing symptomatic responses requires early intervention in order to mitigate possible associated mental health decline and personal risks. Given this, real-time monitoring of stress may provide immediate biofeedback to the individual and allow for early self-intervention. This study aimed to determine whether the change in heart rate variability (HRV) could predict, in two different cohorts, the quality of the response to an acute stressor and, in turn, contribute to the development of a physiological algorithm for stress that could be utilized in future smartwatch technologies. It also aimed to assess whether baseline stress levels affect the changes seen in HRV at baseline and following stress tasks. A total of 30 student doctor participants and 30 participants from the general population were recruited. The Trier Stress Test was used to induce stress, with resting- and stress-phase ECGs recorded, as well as inter-second heart rate (recorded using a Fitbit). Although the present study failed to identify ubiquitous patterns of HRV and HR changes during stress, it did identify novel changes in these parameters between resting and stress states. This study has shown that HRV-based measures of stress should be calculated with consideration of resting (baseline) anxiety and stress states in order to ensure an accurate measure of the effects of additive acute stress.
Classifying Multi-Level Stress Responses from Brain Cortical EEG in Nurses and Non-Health Professionals Using a Machine Learning Autoencoder
Objective: Mental stress is a major problem in our society and has become an area of interest for many psychiatric researchers. One primary research focus area is the identification of bio-markers that not only identify stress but also predict the conditions (or tasks) that cause stress. Electroencephalograms (EEGs) have long been used to study and identify bio-markers. While these bio-markers have successfully predicted stress in EEG studies for binary conditions, their performance is suboptimal for multiple conditions of stress. Methods: To overcome this challenge, we propose using latent representations of the bio-markers, which have been shown to significantly improve classification performance compared to the traditional bio-markers alone. We evaluated three commonly used EEG-based bio-markers for stress, the brain load index (BLI), the spectral power values of the EEG frequency bands (alpha, beta, and theta), and the relative gamma (RG), against their respective latent representations using four commonly used classifiers. Results: The spectral-power-based bio-markers achieved an accuracy of 83%, while their latent representations achieved an accuracy of 91%.
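The core idea — compress hand-crafted bio-markers through an autoencoder bottleneck and classify in the latent space — can be sketched as follows. This is a minimal illustration on synthetic stand-in features, with a nearest-centroid rule standing in for the paper's four classifiers; the feature dimensions, network sizes, and training settings are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 8-dimensional "band power" feature vectors for three stress
# conditions; purely illustrative stand-ins, not the study's EEG data.
def make_features(n, shift):
    base = rng.normal(0.0, 1.0, size=(n, 8))
    base[:, :3] += shift  # condition-dependent offset in three features
    return base

X = np.vstack([make_features(60, s) for s in (-1.5, 0.0, 1.5)])
y = np.repeat([0, 1, 2], 60)

# One-hidden-layer autoencoder (8 -> 2 -> 8) trained by gradient descent
# on reconstruction error; the bottleneck is the latent representation.
d_in, d_lat = X.shape[1], 2
W1 = rng.normal(0, 0.1, (d_in, d_lat)); b1 = np.zeros(d_lat)
W2 = rng.normal(0, 0.1, (d_lat, d_in)); b2 = np.zeros(d_in)
lr = 0.05
for _ in range(800):
    Z = np.tanh(X @ W1 + b1)        # encode
    Xr = Z @ W2 + b2                # decode
    err = Xr - X
    gW2 = Z.T @ err / len(X); gb2 = err.mean(0)
    dZ = (err @ W2.T) * (1 - Z ** 2)
    gW1 = X.T @ dZ / len(X); gb1 = dZ.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Classify in latent space with a nearest-centroid rule.
Z = np.tanh(X @ W1 + b1)
centroids = np.array([Z[y == c].mean(0) for c in range(3)])
pred = np.argmin(((Z[:, None, :] - centroids) ** 2).sum(-1), axis=1)
print(f"latent-space training accuracy: {(pred == y).mean():.2f}")
```

A real EEG pipeline would of course hold out test data and tune the latent dimensionality; the point here is only the two-stage structure of encoding followed by classification.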
BRAF V600E status may facilitate decision-making on active surveillance of low-risk papillary thyroid microcarcinoma.
Introduction: Conservative active surveillance has been proposed for low-risk papillary thyroid microcarcinoma (PTMC), defined as ≤1.0 cm and lacking clinically aggressive features, but accepting it remains controversial because not all such PTMCs are uniformly destined for a benign prognosis. This study investigated whether BRAF V600E status could further risk-stratify PTMC, particularly low-risk PTMC, and thus help with more accurate case selection for conservative management. Methods: This international multicenter study included 743 patients treated with total thyroidectomy for PTMC (584 women and 159 men), with a median age of 49 years (interquartile range [IQR], 39-59 years) and a median follow-up time of 53 months (IQR, 25-93 months). Results: On overall analyses of all PTMCs, tumour recurrence was 6.4% (32/502) versus 10.8% (26/241) in BRAF mutation-negative versus BRAF mutation-positive patients (P = 0.041), with a hazard ratio (HR) of 2.44 (95% confidence interval [CI], 1.15-5.20) after multivariate adjustment for confounding clinical factors. On the analyses of low-risk PTMC, recurrence was 1.3% (5/383) versus 4.3% (6/139) in BRAF mutation-negative versus BRAF mutation-positive patients, with an HR of 6.65 (95% CI, 1.80-24.65) after adjustment for confounding clinical factors. BRAF mutation was associated with a significant decline in the Kaplan-Meier recurrence-free survival curve in low-risk PTMC. Conclusions: BRAF V600E differentiates the recurrence risk of PTMC, particularly low-risk PTMC. Given the robust negative predictive value, conservative active surveillance of BRAF mutation-negative low-risk PTMC is reasonable, whereas the increased recurrence risk and other well-known adverse effects of BRAF V600E make the feasibility of long-term conservative surveillance uncertain for BRAF mutation-positive PTMC.
Patient Age-Associated Mortality Risk Is Differentiated by BRAF V600E Status in Papillary Thyroid Cancer
Purpose: For the past 65 years, patient age at diagnosis has been widely used as a major mortality risk factor in the risk stratification of papillary thyroid cancer (PTC), but whether this is generally applicable, particularly in patients with different BRAF genetic backgrounds, is unclear. The current study was designed to test whether patient age at diagnosis is a major mortality risk factor. Patients and Methods: We conducted a comparative study of the relationship between patient age at diagnosis and PTC-specific mortality with respect to BRAF status in 2,638 patients (623 men and 2,015 women) with a median age of 46 years (interquartile range, 35 to 58 years) at diagnosis and a median follow-up time of 58 months (interquartile range, 26 to 107 months). Eleven medical centers from six countries participated in this study. Results: There was a linear association between patient age and mortality in patients with BRAF V600E mutation, but not in patients with wild-type BRAF, in whom the mortality rate remained low and flat with increasing age. Kaplan-Meier survival curves rapidly declined with increasing age in patients with BRAF V600E mutation but did not decline in patients with wild-type BRAF, even beyond age 75 years. The association between mortality and age in patients with BRAF V600E was independent of clinicopathologic risk factors. Similar results were observed when only patients with the conventional variant of PTC were analyzed. Conclusion: The long-observed age-associated mortality risk in PTC is dependent on BRAF status; age is a strong, continuous, and independent mortality risk factor in patients with BRAF V600E mutation but not in patients with wild-type BRAF. These results question the conventional general use of patient age as a high-risk factor in PTC and call for differentiation between patients with BRAF V600E and wild-type BRAF when applying age to risk stratification and management of PTC.
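The survival comparisons in the two studies above rest on the Kaplan-Meier estimator, which steps the survival probability down at each event time by the fraction of at-risk patients who had an event. A minimal sketch on toy right-censored data (illustrative only, not the study cohorts):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times:  follow-up times; events: 1 = event (eg, recurrence or death),
    0 = censored. Returns (time, survival) pairs at each event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = leaving = 0
        while i < len(data) and data[i][0] == t:  # group tied times
            leaving += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            s *= 1.0 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= leaving
    return curve

# Toy follow-up data in months (illustrative only).
times = [5, 8, 12, 12, 20, 25, 30, 40]
events = [1, 0, 1, 1, 0, 1, 0, 0]
print(kaplan_meier(times, events))  # survival steps: 7/8, 7/12, 7/18
```

Censored patients leave the risk set without stepping the curve down, which is what lets follow-up of unequal length be used without bias; the group comparisons and hazard ratios in the studies would then come from log-rank tests and Cox models.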
Clinical Consensus Guideline on the Management of Phaeochromocytoma and Paraganglioma in Patients Harbouring Germline SDHD Pathogenic Variants
Patients with germline SDHD pathogenic variants (encoding succinate dehydrogenase subunit D; ie, paraganglioma 1 syndrome) are predominantly affected by head and neck paragangliomas, which, in almost 20% of patients, might coexist with paragangliomas arising from other locations (eg, adrenal medulla, para-aortic, cardiac or thoracic, and pelvic). Given the higher risk of tumour multifocality and bilaterality for phaeochromocytomas and paragangliomas (PPGLs) because of SDHD pathogenic variants than for their sporadic and other genotypic counterparts, the management of patients with SDHD PPGLs is clinically complex in terms of imaging, treatment, and management options. Furthermore, locally aggressive disease can be discovered at a young age or late in the disease course, which presents challenges in balancing surgical intervention with various medical and radiotherapeutic approaches. The axiom of "first, do no harm" should always be considered, and an initial period of observation (ie, watchful waiting) is often appropriate to characterise tumour behaviour in patients with these pathogenic variants. These patients should be referred to specialised high-volume medical centres. This consensus guideline aims to help physicians with the clinical decision-making process when caring for patients with SDHD PPGLs.
Genome-Wide Association Studies in an Isolated Founder Population from the Pacific Island of Kosrae
It has been argued that the limited genetic diversity and reduced allelic heterogeneity observed in isolated founder populations facilitate discovery of loci contributing to both Mendelian and complex disease. A strong founder effect, severe isolation, and substantial inbreeding have dramatically reduced genetic diversity in natives of the island of Kosrae, Federated States of Micronesia, who exhibit a high prevalence of obesity and other metabolic disorders. We hypothesized that genetic drift and possibly natural selection on Kosrae might have increased the frequency of previously rare genetic variants with relatively large effects, making these alleles readily detectable in genome-wide association analysis. However, mapping in large, inbred cohorts introduces analytic challenges, as extensive relatedness between subjects violates the assumptions of independence upon which traditional association test statistics are based. We performed genome-wide association analysis for 15 quantitative traits in 2,906 members of the Kosrae population, using novel approaches to manage the extreme relatedness in the sample. As positive controls, we observe association to known loci for plasma cholesterol, triglycerides, and C-reactive protein and to compelling candidate loci for thyroid-stimulating hormone and fasting plasma glucose. We show that our study is well powered to detect common alleles explaining ≥5% of phenotypic variance. However, no such large effects were observed with genome-wide significance, arguing that even in such a severely inbred population, common alleles typically have modest effects. Finally, we show that a majority of common variants discovered in Caucasians have indistinguishable effect sizes on Kosrae, despite the major differences in population genetics and environment.
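Methods for handling relatedness in association testing typically start from an empirical genomic relationship (kinship) matrix estimated from the genotypes themselves. The abstract does not detail the authors' specific approach, so as an illustrative assumption, here is a VanRaden-style relationship matrix sketched on simulated unrelated genotypes:

```python
import numpy as np

def grm(geno):
    """VanRaden-style genomic relationship matrix from an (n_samples x
    n_snps) genotype matrix coded 0/1/2 (minor-allele counts)."""
    p = geno.mean(axis=0) / 2.0          # estimated allele frequencies
    Z = geno - 2.0 * p                   # center by expected dosage
    denom = 2.0 * np.sum(p * (1.0 - p))  # expected-variance normaliser
    return Z @ Z.T / denom

# Simulated unrelated genotypes under Hardy-Weinberg proportions.
rng = np.random.default_rng(1)
n, m = 50, 1000
freqs = rng.uniform(0.1, 0.5, m)
G = rng.binomial(2, freqs, size=(n, m)).astype(float)
K = grm(G)
print(round(float(K.diagonal().mean()), 2))  # near 1 for unrelated samples
```

In an inbred cohort like Kosrae, the diagonal rises above 1 and off-diagonal entries are substantially positive; mixed-model association tests then use this matrix as the covariance structure of a random polygenic effect so that relatedness does not inflate the test statistics.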
One-year soy protein supplementation has positive effects on bone formation markers but not bone density in postmenopausal women
BACKGROUND: Although soy protein and its isoflavones have been reported to reduce the risk of osteoporosis in peri- and post-menopausal women, most of these studies are of short duration (i.e., six months). The objective of this study was to examine whether one year of consuming soy-containing foods (providing 25 g protein and 60 mg isoflavones daily) exerts beneficial effects on bone in postmenopausal women. METHODS: Eighty-seven eligible postmenopausal women were randomly assigned to consume soy or control foods daily for one year. Bone mineral density (BMD) and bone mineral content (BMC) of the whole body, lumbar spine (L1-L4), and total hip were measured using dual-energy X-ray absorptiometry at baseline and after one year. Blood and urine markers of bone metabolism were also assessed. RESULTS AND DISCUSSION: Sixty-two subjects completed the year-long study. Whole-body and lumbar BMD and BMC were significantly decreased in both the soy and control groups. However, there were no significant changes in total hip BMD and BMC irrespective of treatment. Both treatments positively affected markers of bone formation, as indicated by increased serum bone-specific alkaline phosphatase (BSAP) activity, insulin-like growth factor-I (IGF-I), and osteocalcin (BSAP: 27.8 and 25.8%; IGF-I: 12.8 and 26.3%; osteocalcin: 95.2 and 103.4% for the control and soy groups, respectively). Neither of the protein supplements had any effect on urinary deoxypyridinoline excretion, a marker of bone resorption. CONCLUSION: Our findings suggest that although one year of supplementation with 25 g protein per se positively modulated markers of bone formation, this amount of protein was unable to prevent lumbar and whole-body bone loss in postmenopausal women.
Congenital hypothyroidism
Congenital hypothyroidism (CH) occurs in approximately 1:2,000 to 1:4,000 newborns. The clinical manifestations are often subtle or not present at birth, likely because of transplacental passage of some maternal thyroid hormone and because many infants have some thyroid hormone production of their own. Common symptoms include decreased activity and increased sleep, feeding difficulty, constipation, and prolonged jaundice. On examination, common signs include myxedematous facies, large fontanels, macroglossia, a distended abdomen with umbilical hernia, and hypotonia. CH is classified into permanent and transient forms, which in turn can be divided into primary, secondary, or peripheral etiologies. Thyroid dysgenesis accounts for 85% of permanent primary CH, while inborn errors of thyroid hormone biosynthesis (dyshormonogeneses) account for 10-15% of cases. Secondary or central CH may occur with isolated TSH deficiency, but more commonly it is associated with congenital hypopituitarism. Transient CH most commonly occurs in preterm infants born in areas of endemic iodine deficiency. In countries with newborn screening programs in place, infants with CH are diagnosed after detection by screening tests. The diagnosis should be confirmed by finding an elevated serum TSH and a low T4 or free T4 level. Other diagnostic tests, such as thyroid radionuclide uptake and scan, thyroid sonography, or serum thyroglobulin determination, may help pinpoint the underlying etiology, although treatment may be started without these tests. Levothyroxine is the treatment of choice; the recommended starting dose is 10 to 15 mcg/kg/day. The immediate goals of treatment are to rapidly raise the serum T4 above 130 nmol/L (10 µg/dL) and normalize serum TSH levels. Frequent laboratory monitoring in infancy is essential to ensure optimal neurocognitive outcome. Serum TSH and free T4 should be measured every 1-2 months in the first 6 months of life and every 3-4 months thereafter.
In general, the prognosis of infants detected by screening and started on treatment early is excellent, with IQs similar to those of sibling or classmate controls. Studies show that lower neurocognitive outcomes may occur in infants started on treatment at a later age (>30 days of age), on lower levothyroxine doses than currently recommended, and in those with more severe hypothyroidism.
- …