Xenon lighting adjusted to plant requirements
Xenon lamps are available as low- and high-power lamps with relatively high efficiency and relatively long lifetimes of up to several thousand hours. Different construction types of short-arc and long-arc lamps permit good adaptation to various applications in projection and illumination techniques without substantial changes in spectral quality. Hence, the xenon lamp was the best choice for professional technical purposes where high power and simultaneously good spectral quality of the light were required. However, technical development does not stand still. Between the luminous efficacy of xenon lamps (25-50 lm/W) and the theoretical limit for 'white light' (250 lm/W) there is still much room for improvement. Present development mainly favors other lamp types, such as metal halide and fluorescent lamps, for commercial lighting purposes. The following sections deal with some of the properties of xenon lamps relevant to plant illumination; in particular, the spectral aspects, the temporal characteristics of the emission, and the economics of xenon lamps are addressed. Because their radiation exceeds natural global radiation in both the ultraviolet (UV) and infrared (IR) regions, filter techniques have to be included in the discussion of the requirements of plant illumination. Most of the presented results were obtained from investigations in the GSF phytotron or in the closed Phytocell chambers of the University of Erlangen. As our experience is restricted to area illumination of plants rather than spotlights, the discussion concentrates on low-pressure long-arc xenon lamps, which are commonly used for such applications. As the spectral properties of short-arc lamps do not differ much from those of long-arc lamps, most of our conclusions are valid for high-pressure xenon lamps as well. These lamps often serve as light sources for small sun simulators and for monochromators used in action spectroscopy of plant responses.
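As a rough illustration of the efficacy headroom cited in this abstract, the short Python sketch below converts electrical power to luminous flux for the quoted 25-50 lm/W range and compares it against the 250 lm/W theoretical limit. The 1 kW lamp wattage is a hypothetical value chosen for the example, not a figure from the study.

```python
# Efficacy headroom illustration; only the lm/W figures come from the abstract.
XENON_EFFICACY_LM_PER_W = (25, 50)    # quoted range for xenon lamps
WHITE_LIGHT_LIMIT_LM_PER_W = 250      # quoted theoretical limit for 'white light'

electrical_power_w = 1000             # hypothetical 1 kW long-arc lamp

for efficacy in XENON_EFFICACY_LM_PER_W:
    flux_lm = efficacy * electrical_power_w
    headroom = WHITE_LIGHT_LIMIT_LM_PER_W / efficacy
    print(f"{efficacy} lm/W -> {flux_lm} lm total flux, "
          f"{headroom:.0f}x below the theoretical limit")
```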
Pathophysiology of growth hormone secretion disorders and their impact on bone microstructure as measured by trabecular bone score.
This article focuses on endocrine-mediated osteoporosis caused by growth hormone (GH) disorders: adult GH deficiency and acromegaly. GH and insulin-like growth factor-1 (IGF-1) stimulate linear bone growth through complex hormonal interactions and activate epiphyseal prechondrocytes. GH, via receptor activator of nuclear factor-kappaB (RANK), its ligand (RANK-L), and the osteoprotegerin system, stimulates the production of osteoprotegerin and its accumulation in the bone matrix. Malfunction of this mechanism can lead to specific bone impairment. However, the main clinical problem of bone disease in GH secretion disorders is the primary prevention of osteoporotic fractures, so it is important to determine a measure of bone quality that better reflects the patient's actual predisposition to fracture. One method of estimating bone quality from lumbar spine dual-energy X-ray absorptiometry (DXA) scans is the trabecular bone score (TBS). TBS, in addition to bone mineral density (BMD), is a promising predictor of osteoporotic fracture risk in women with postmenopausal osteopenia. In acromegaly, TBS defines fracture risk better than BMD, which is normal or even increased. TBS also helps to monitor the effect of growth hormone therapy. Despite these findings, TBS should not be used alone; a comprehensive consideration of all fracture risk factors, BMD, and bone turnover markers is necessary.
Less strict intervention thresholds for the FRAX and TBS-adjusted FRAX predict clinical fractures in osteopenic postmenopausal women with no prior fractures.
Little is known about the clinical relevance of treating post-menopausal women with no prior history of fragility fracture and bone mineral density (BMD) within the osteopenic range. In recent years, in addition to BMD and FRAX fracture probability assessments, a surrogate measure of bone micro-architecture quality, the trabecular bone score (TBS), has been shown to predict future fragility fractures independently of both BMD and the FRAX. In this retrospective analysis of a follow-up study, we compared three risk assessment instruments (the FRAX, the TBS, and a TBS-adjusted FRAX score) in their ability to predict future fragility fractures over a minimum of five years of follow-up among post-menopausal osteopenic women with no prior fragility fractures. We also sought to determine whether more- or less-stringent criteria were better at stratifying patients into a higher-risk group warranting osteoporosis-targeted intervention and a lower-risk group in whom intervention would usually be deemed unnecessary. Over a mean 5.2 years of follow-up, 18 clinical fragility fractures were documented among 127 women aged 50 years and older (mean age 66.1). On multivariate analysis using regression models and Kaplan-Meier curve analysis, less-stringent criteria for the FRAX and TBS-adjusted FRAX were capable of predicting future fractures (sensitivity/specificity of 83%/31%, 39%/77%, and 78%/50% for the TBS, FRAX, and TBS-adjusted FRAX, respectively), while more-stringent criteria were not (sensitivity/specificity of 56%/60%, 39%/77%, and 39%/74%, respectively). Neither TBS threshold alone was a significant predictor of future fracture in our study. However, hazard ratio analysis revealed a slight superiority of the TBS-adjusted FRAX over the FRAX alone (HR = 3.09 vs. 2.79). Adjusting the FRAX tool by incorporating the TBS may be useful to optimize the detection of post-menopausal osteopenic women with no prior fractures who warrant osteoporosis-targeted therapy.
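For readers unfamiliar with the sensitivity/specificity figures quoted in this abstract, the following Python sketch shows how such values are derived once a risk threshold splits patients into high- and low-risk groups. The `Patient` class, the threshold, and the four-patient cohort are invented for illustration and do not reproduce the study's data or its actual FRAX cut-offs.

```python
# Sensitivity/specificity of a binary risk threshold.
# Illustrative data only; these are NOT the study's patients.
from dataclasses import dataclass

@dataclass
class Patient:
    risk_score: float   # e.g. a FRAX or TBS-adjusted FRAX probability (%)
    fractured: bool     # fragility fracture observed during follow-up

def sensitivity_specificity(patients, threshold):
    """Classify risk_score >= threshold as 'high risk' and compare
    the classification against observed fractures."""
    tp = sum(p.risk_score >= threshold and p.fractured for p in patients)
    fn = sum(p.risk_score < threshold and p.fractured for p in patients)
    tn = sum(p.risk_score < threshold and not p.fractured for p in patients)
    fp = sum(p.risk_score >= threshold and not p.fractured for p in patients)
    return tp / (tp + fn), tn / (tn + fp)

cohort = [Patient(22.0, True), Patient(8.0, False),
          Patient(15.0, True), Patient(5.0, False)]
sens, spec = sensitivity_specificity(cohort, threshold=10.0)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```

Lowering the threshold flags more patients as high risk, which raises sensitivity at the cost of specificity; that trade-off is exactly the more- versus less-stringent contrast examined in the study.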
Decrease of trabecular bone score reflects severity of Crohn's disease: results of a case-control study.
Osteoporosis and osteopenia are known chronic complications of inflammatory bowel diseases. The trabecular bone score (TBS) provides an indirect measurement of bone microarchitecture, independent of bone mineral density (BMD).
The study was designed as a case-control study aiming to assess and compare bone quantity and quality in patients with Crohn's disease (CD). We purposefully excluded postmenopausal women and patients on long-term corticosteroid therapy.
The cohort consisted of 50 CD patients and 25 healthy controls matched for age, sex, weight, and vitamin D status. There was no significant difference between CD patients and controls in mean lumbar BMD (0.982±0.119 versus 0.989±0.12 g/cm²) or mean TBS (1.37±0.12 versus 1.38±0.12). We observed significantly lower TBS, but not lumbar BMD, in CD patients with stricturing (B2, 1.36±0.08) or penetrating (B3, 1.32±0.11) disease compared with those with luminal disease (B1, 1.42±0.11; P=0.003 and P<0.0001, respectively). We also observed a lower mean±SD TBS in patients on anti-tumour necrosis factor-α therapy than in those not on it (1.341±0.138 versus 1.396±0.099), although the difference did not reach statistical significance (P=0.11). No similar difference was seen for lumbar BMD between these groups.
For the first time, it was observed that TBS, but not BMD, correlates with the severity of CD. Our results therefore suggest that TBS can potentially help to identify CD patients at high fracture risk better than BMD alone.
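Group contrasts like the B1-versus-B3 TBS comparison above are typically tested with a two-sample test. The Python sketch below shows one such comparison using Welch's t-test from scipy; the samples are simulated from the abstract's reported means and SDs (1.42±0.11 vs. 1.32±0.11) and group sizes are assumed, so it illustrates the method rather than reproducing the study's analysis.

```python
# Two-sample comparison of TBS between disease phenotypes (B1 vs. B3).
# Samples are simulated from the reported means/SDs; they are NOT real data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
tbs_b1 = rng.normal(loc=1.42, scale=0.11, size=25)  # luminal disease
tbs_b3 = rng.normal(loc=1.32, scale=0.11, size=25)  # penetrating disease

# Welch's t-test: does not assume equal variances in the two groups
t_stat, p_value = stats.ttest_ind(tbs_b1, tbs_b3, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```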