Effect of a 14-day course of systemic corticosteroids on the hypothalamic-pituitary-adrenal-axis in patients with acute exacerbation of chronic obstructive pulmonary disease
Background: As supra-physiological intake of corticosteroids is a well-known risk factor for the development of adrenal insufficiency, we investigated the function of the hypothalamic-pituitary-adrenal (HPA) axis during a 14-day course of systemic corticosteroids in patients with acute exacerbation of chronic obstructive pulmonary disease, using clinical and laboratory measures. Methods: A systematic clinical and laboratory assessment, including measurement of basal cortisol levels and of the response to low-dose (1 μg) ACTH stimulation, was performed in nine patients before treatment, on the first and last days of treatment, and 2, 7 and 21 days after corticosteroid withdrawal. Results: At baseline, all nine patients had normal responses to 1 μg ACTH. On the first day of steroid treatment, 78% had a blunted peak cortisol response; this percentage increased to 89% after 14 days of treatment. At 2, 7 and 21 days after corticosteroid withdrawal, 78%, 33% and 33% of the patients, respectively, had a blunted cortisol response to ACTH. ROC curve analysis revealed that only basal cortisol concentrations (AUC 0.89), but not ACTH concentrations (AUC 0.49) or clinical signs (AUC 0.47), were predictive of impaired HPA-axis function. Basal cortisol levels of >400 nmol/l and <150 nmol/l were 96% and 100% sensitive for a normal or a pathological response to the ACTH stimulation test, respectively. Conclusion: Immediate and prolonged suppression of the HPA axis is a common finding in otherwise asymptomatic patients undergoing systemic steroid treatment for acute exacerbation of chronic obstructive pulmonary disease and can reliably be assessed with the low-dose ACTH test.
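A minimal sketch of the kind of ROC analysis and cut-off logic described in this abstract, using entirely hypothetical basal cortisol values and ACTH-test outcomes (the study data are not reproduced here):

```python
# Minimal sketch of the ROC analysis and cut-off logic described above, using
# entirely hypothetical basal cortisol values and ACTH-test outcomes.
import numpy as np
from sklearn.metrics import roc_auc_score

# 1 = blunted (pathological) response to 1 ug ACTH, 0 = normal response
blunted = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0, 1, 0])
# basal cortisol in nmol/l; lower values should predict a blunted response
basal_cortisol = np.array([120, 90, 210, 480, 520, 140, 430, 610, 170, 390, 200, 450])

# AUC for "low basal cortisol predicts a blunted ACTH response"
print(f"AUC: {roc_auc_score(blunted, -basal_cortisol):.2f}")

# Decision rule analogous to the reported cut-offs: >400 nmol/l suggests a
# normal response, <150 nmol/l a pathological one, anything in between still
# requires the low-dose ACTH stimulation test.
for cortisol in basal_cortisol:
    if cortisol > 400:
        verdict = "likely normal response"
    elif cortisol < 150:
        verdict = "likely blunted response"
    else:
        verdict = "indeterminate - perform low-dose ACTH test"
    print(f"{cortisol:4d} nmol/l: {verdict}")
```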
Association between lung function and dyspnoea and its variation in the multinational Burden of Obstructive Lung Disease (BOLD) study
Background: Dyspnoea is a common symptom of respiratory disease. However, data on its prevalence in general populations and its association with lung function are limited and come mainly from high-income countries. This study aimed to estimate the prevalence of dyspnoea across several world regions and to investigate the association of dyspnoea with lung function. Methods: Dyspnoea was assessed, and lung function measured, in 25,806 adult participants of the multinational Burden of Obstructive Lung Disease study. Dyspnoea was defined as a score of ≥2 on the modified Medical Research Council (mMRC) dyspnoea scale. The prevalence of dyspnoea was estimated for each of the study sites and compared across countries and world regions. Multivariable logistic regression was used to assess the association of dyspnoea with lung function in each site, and results were pooled using random-effects meta-analysis. Results: The prevalence of dyspnoea varied widely across sites without a clear geographical pattern. The mean prevalence of dyspnoea was 13.7% (SD 8.2%), ranging from 0% in Mysore (India) to 28.8% in Nampicuan-Talugtug (Philippines). Dyspnoea was strongly associated with both spirometric restriction (FVC<LLN: OR 2.07, 95% CI 1.75–2.45) and spirometric airflow obstruction (FEV1/FVC<LLN: OR 3.76, 95% CI 1.04–4.65). These associations did not differ significantly between sexes, age groups, or smoking histories. The association of dyspnoea with airflow obstruction was weaker among obese participants (OR 2.20, 95% CI 1.61–3.01). Conclusion: The prevalence of dyspnoea varies substantially across the world and is strongly associated with lung function impairment. The use of the mMRC scale in epidemiological research warrants further discussion.
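A sketch of the two-stage analysis this abstract describes: a site-level logistic regression of dyspnoea on airflow obstruction, followed by random-effects pooling of the site-specific log-odds ratios. All data are simulated and nothing here reproduces the BOLD dataset; the DerSimonian-Laird estimator is one common choice of random-effects method, used here for illustration.

```python
# Two-stage sketch: per-site logistic regression of dyspnoea (mMRC >= 2) on
# airflow obstruction, then DerSimonian-Laird random-effects pooling.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
log_ors, variances = [], []

for site in range(10):                        # ten simulated study sites
    n = 500
    obstruction = rng.binomial(1, 0.15, n)    # FEV1/FVC < LLN (1 = yes)
    # true log-OR of about 1.3 plus between-site heterogeneity
    lin_pred = -2.0 + (1.3 + rng.normal(0, 0.3)) * obstruction
    dyspnoea = rng.binomial(1, 1 / (1 + np.exp(-lin_pred)))

    fit = sm.Logit(dyspnoea, sm.add_constant(obstruction)).fit(disp=0)
    log_ors.append(fit.params[1])             # site-specific log-odds ratio
    variances.append(fit.cov_params()[1, 1])  # its sampling variance

log_ors, variances = np.array(log_ors), np.array(variances)

# DerSimonian-Laird estimate of the between-site variance tau^2
w = 1.0 / variances
fixed = np.sum(w * log_ors) / np.sum(w)
q = np.sum(w * (log_ors - fixed) ** 2)
tau2 = max(0.0, (q - (len(w) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

# Random-effects pooled odds ratio with a 95% confidence interval
w_re = 1.0 / (variances + tau2)
pooled = np.sum(w_re * log_ors) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled OR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * se):.2f}-{np.exp(pooled + 1.96 * se):.2f})")
```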
Sick leave among home-care personnel: a longitudinal study of risk factors
BACKGROUND: Sick leave due to neck, shoulder and back disorders (NSBD) is higher among health-care workers, especially nursing aides/assistant nurses, than among employees in other occupations. More information is needed about predictors of sick leave among health-care workers. The aim of the study was to assess whether self-reported factors related to health, work and leisure time could predict: 1) future certified sick leave due to any cause in nursing aides/assistant nurses (Study group I) and 2) future self-reported sick leave due to NSBD in nursing aides/assistant nurses (Study group II). METHODS: Study group I comprised 443 female nursing aides/assistant nurses who were not on sick leave at baseline, when a questionnaire was completed. Data on certified sick leave were collected after 18 months. Study group II comprised 274 of these women, who at baseline reported no sick leave due to NSBD during the preceding year and who participated in the 18-month follow-up. Data on sick leave due to NSBD were collected from the questionnaire at 18 months. The associations between future sick leave and factors related to health, work and leisure time were tested by logistic regression analyses. RESULTS: Health-related factors such as previous low back disorders (OR 1.89; 95% CI 1.20–2.97) and previous sick leave (OR 6.40; 95% CI 3.97–10.31) were associated with a higher risk of future sick leave due to any cause. Factors related to health, work and leisure time, i.e. previous low back disorders (OR 4.45; 95% CI 1.27–15.77), previous sick leave not due to NSBD (OR 3.30; 95% CI 1.33–8.17), high-strain work (OR 2.34; 95% CI 1.05–5.23) and high perceived physical exertion in domestic work (OR 2.56; 95% CI 1.12–5.86), were associated with a higher risk of future sick leave due to NSBD. In the final analyses, previous low back disorders and previous sick leave remained significant in both study groups. CONCLUSION: The results suggest a focus on previous low back disorders and previous sick leave in the design of early prevention programmes aiming at reducing future sick leave, due to any cause as well as due to NSBD, among nursing aides/assistant nurses. A multifactorial approach may be of importance in the early prevention of sick leave due to NSBD.
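A sketch of how odds ratios with 95% confidence intervals of the kind reported here (e.g. OR 6.40, 95% CI 3.97–10.31) are obtained from a logistic regression of future sick leave on baseline factors. The data and variable names below are simulated placeholders, not the study's variables.

```python
# Logistic regression producing odds ratios with 95% CIs; simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 443
df = pd.DataFrame({
    "prev_low_back": rng.binomial(1, 0.40, n),
    "prev_sick_leave": rng.binomial(1, 0.30, n),
    "high_strain_work": rng.binomial(1, 0.25, n),
})
lin_pred = (-1.5 + 0.6 * df.prev_low_back + 1.8 * df.prev_sick_leave
            + 0.4 * df.high_strain_work)
df["future_sick_leave"] = rng.binomial(1, 1 / (1 + np.exp(-lin_pred)))

fit = smf.logit(
    "future_sick_leave ~ prev_low_back + prev_sick_leave + high_strain_work",
    data=df,
).fit(disp=0)

# Odds ratios with 95% CIs: exponentiate the coefficients and their CI bounds
ors = pd.concat([np.exp(fit.params), np.exp(fit.conf_int())], axis=1)
ors.columns = ["OR", "2.5%", "97.5%"]
print(ors.drop(index="Intercept").round(2))
```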
A global database on holdover time of lightning-ignited wildfires
Holdover fires are usually associated with lightning-ignited wildfires (LIWs), which can experience a smoldering phase or go undetected for several hours, days or even weeks before being reported. Since the existence and duration of the smoldering combustion in LIWs is usually unknown, holdover time is conventionally defined as the time between the lightning event that ignited the fire and the time the fire is detected. Therefore, all LIWs have an associated holdover time, which may range from a few minutes to several days. However, we lack a comprehensive understanding of holdover times. Here, we introduce a global database on holdover times of LIWs. We collected holdover time data from 29 different studies across the world through a literature review and through datasets assembled by authors of the original studies. The database is composed of three data files (censored data, non-censored data, ancillary data) and three metadata files (description of database variables, list of references, reproducible examples). Censored data are the core of the database and consist of different frequency distributions reporting the number or relative frequency of LIWs per interval of holdover time. In addition, ancillary data provide further information for understanding the methods and contexts in which the data were generated in the original studies. The first version of the database contains 42 frequency distributions of holdover time built from data on more than 152,375 LIWs from 13 countries on five continents, covering a time span from 1921 to 2020. This database is the first freely available, harmonized and ready-to-use global source of holdover time data, which may be used in different ways to investigate LIWs and to model the holdover phenomenon. The complete database can be downloaded at https://doi.org/10.5281/zenodo.7352172 (Moris et al., 2022).
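A minimal sketch of how one of the censored frequency distributions might be loaded and summarised. The file name and column names used below (censored_data.csv, holdover_interval_h, n_fires) are assumptions for illustration only; the actual variable names are documented in the database's metadata files.

```python
# Load one censored holdover-time distribution and turn counts into relative
# frequencies. File and column names are illustrative assumptions.
import pandas as pd

# e.g. a CSV extracted from the archive at https://doi.org/10.5281/zenodo.7352172
censored = pd.read_csv("censored_data.csv")

# Number of lightning-ignited wildfires per holdover-time interval, summed
# over studies, then expressed as a relative frequency
counts = censored.groupby("holdover_interval_h")["n_fires"].sum()
rel_freq = counts / counts.sum()
print(rel_freq)

# Share of fires detected more than 24 h after the igniting lightning stroke,
# assuming interval labels of the form "0-24", "24-48", ...
long_holdover = rel_freq[[lbl for lbl in rel_freq.index
                          if float(lbl.split("-")[0]) >= 24]].sum()
print(f"holdover > 24 h: {long_holdover:.1%}")
```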
Neocortical Axon Arbors Trade-off Material and Conduction Delay Conservation
The brain contains a complex network of axons rapidly communicating information between billions of synaptically connected neurons. The morphology of individual axons therefore defines the course of information flow within the brain. More than a century ago, Ramón y Cajal proposed that conservation laws to save material (wire) length and to limit conduction delay regulate the design of individual axon arbors in cerebral cortex. Yet the spatial and temporal communication costs of single neocortical axons remain undefined. Here, using reconstructions of in vivo labelled excitatory spiny cell and inhibitory basket cell intracortical axons combined with a variety of graph optimization algorithms, we empirically investigated Cajal's conservation laws in cerebral cortex for whole three-dimensional (3D) axon arbors, in what is, to our knowledge, the first study of its kind. We found that intracortical axons were significantly longer than optimal. The temporal cost of cortical axons was also suboptimal, though far superior to that of wire-minimized arbors. We discovered that cortical axon branching appears to promote a low temporal dispersion of axonal latencies and a tight relationship between cortical distance and axonal latency. In addition, inhibitory basket cell axonal latencies may occur within a much narrower temporal window than excitatory spiny cell axons, which may help boost signal detection. Thus, to optimize neuronal network communication, a modest excess of axonal wire is traded off to enhance arbor temporal economy and precision. Our results offer insight into the principles of brain organization, communication and development in grey matter, where temporal precision is a crucial prerequisite for coincidence detection, synchronization and rapid network oscillations.
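A simplified stand-in for the wiring-economy comparison described here: the total wire length of a toy 3D "arbor" versus the minimum spanning tree over the same points. The coordinates are random rather than real axon reconstructions, and an MST is only one possible wire-minimizing benchmark (Steiner trees, for example, would allow additional branch points).

```python
# Compare the wire length of a toy 3D arbor with the minimum spanning tree
# over the same points. Coordinates are random, not real reconstructions.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(2)
points = rng.uniform(0, 500, size=(40, 3))   # soma plus 39 target points, in um

# "Actual" arbor: connect the points in a fixed order, a crude proxy for a
# reconstructed branching structure
actual_length = np.sum(np.linalg.norm(np.diff(points, axis=0), axis=1))

# Wire-minimizing benchmark: MST over the complete pairwise distance graph
mst = minimum_spanning_tree(squareform(pdist(points)))
optimal_length = mst.sum()

print(f"arbor length:  {actual_length:8.1f} um")
print(f"MST length:    {optimal_length:8.1f} um")
print(f"excess wiring: {100 * (actual_length / optimal_length - 1):6.0f} %")
```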
Tracking the Australian plate motion through the Cenozoic: Constraints from 40Ar/39Ar geochronology
Here we use geochronology of Australian intraplate volcanoes to construct a high-resolution plate-velocity record and to explore how tectonic events in the southwest Pacific may have influenced plate motion. Nine samples from five volcanoes yield ages from 33.6 ± 0.5 to 27.3 ± 0.4 Ma and, when combined with published ages from 30 to 16 Ma, show that the rate of volcanic migration was not constant. Instead, the results indicate distinct changes in Australian plate motion. Fast northward velocities (61 ± 8 and 57 ± 4 km/Ma) prevailed from 34 to 30 (±0.5) and from 23 to 16 (±0.5) Ma, respectively, with distinct reductions to 20 ± 10 and 22 ± 5 km/Ma from 30 to 29 (±0.5) Ma and from 26 to 23 (±0.5) Ma. These velocity reductions are concurrent with the tectonic collisions in New Guinea and Ontong Java, respectively. Interspersed between the periods of sluggish motion is a brief 29-26 (±0.5) Ma burst of atypically fast northward plate movement of 100 ± 20 km/Ma. We evaluate potential mechanisms for this atypically fast velocity, including catastrophic slab penetration into the lower mantle, thermomechanical erosion of the lithosphere, and plume-push forces, and find that none of these is appropriate. This period of fast motion was, however, coincident with a major southward-propagating slab tear that developed along the northeastern plate margin, following partial jamming of subduction and ophiolite obduction in New Caledonia. Although it is unclear whether such an event can play a role in driving fast plate motion, numerical or analogue models may help address this question. Key Points: (1) We determine nine 40Ar/39Ar ages from five Cenozoic volcanoes in Australia; (2) slow velocities correlate with the New Guinea and Ontong Java collisions; (3) an anomalously fast velocity of 100 ± 20 km/Ma is identified from 29 to 26 Ma.
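A worked example of how northward migration rates follow from the volcanic age progression: velocity is simply the distance between volcanic centres divided by their age difference. The distances below are hypothetical values chosen to reproduce the velocities quoted in the abstract; they are not measurements from the study.

```python
# velocity = distance between volcanic centres / age difference
def migration_rate(distance_km: float, age_old_ma: float, age_young_ma: float) -> float:
    """Northward migration rate of volcanism in km/Ma."""
    return distance_km / (age_old_ma - age_young_ma)

print(f"{migration_rate(244.0, 34.0, 30.0):.0f} km/Ma")  # ~61 km/Ma, fast interval
print(f"{migration_rate(20.0, 30.0, 29.0):.0f} km/Ma")   # ~20 km/Ma, slowdown
print(f"{migration_rate(300.0, 29.0, 26.0):.0f} km/Ma")  # ~100 km/Ma, fast burst
```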