
    Comparison of established and emerging biodosimetry assays

    Rapid biodosimetry tools are required to assist with triage in the case of a large-scale radiation incident. Here, we aimed to determine the dose-assessment accuracy of the well-established dicentric chromosome assay (DCA) and cytokinesis-block micronucleus (CBMN) assay in comparison to the emerging γ-H2AX foci and gene expression assays for triage-mode biodosimetry and radiation injury assessment. Coded blood samples exposed to 10 X-ray doses (240 kVp, 1 Gy/min) of up to 6.4 Gy were sent to participants for dose estimation. Report times were documented for each laboratory and assay. The mean absolute difference (MAD) of the estimated doses relative to the true doses was calculated. We also merged doses into binary dose categories of clinical relevance and examined the accuracy, sensitivity and specificity of the assays. Dose estimates were reported by the first laboratories within 0.3-0.4 days of receipt of samples for the γ-H2AX and gene expression assays, compared to 2.4 and 4 days for the DCA and CBMN assays, respectively. Irrespective of the assay, we found a 2.5-4-fold variation in interlaboratory accuracy, with the lowest MAD values for the DCA (0.16 Gy), followed by the CBMN (0.34 Gy), gene expression (0.34 Gy) and γ-H2AX foci (0.45 Gy) assays. Binary categories of dose estimates could be discriminated with equal efficiency by all assays, but at doses ≥1.5 Gy a 10% decrease in efficiency was observed for the foci assay, which was still comparable to the CBMN assay. In conclusion, the DCA was confirmed as the gold-standard biodosimetry method, but in situations where speed and throughput are more important than ultimate accuracy, the emerging rapid molecular assays have the potential to become useful triage tools.
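    The MAD and binary-category evaluation described in this abstract can be sketched as follows. All dose values below are invented for illustration, not data from the study; only the MAD formula and the ≥1.5 Gy clinical cut-off follow the abstract.

    ```python
    # Hypothetical true vs. estimated doses (Gy) for one laboratory/assay.
    true_doses = [0.0, 0.5, 1.0, 1.5, 2.5, 4.0, 6.4]
    estimated  = [0.1, 0.4, 1.2, 1.3, 2.8, 3.6, 5.9]

    # Mean absolute difference (MAD) of estimated relative to true doses.
    mad = sum(abs(e - t) for e, t in zip(estimated, true_doses)) / len(true_doses)

    # Merge doses into binary categories of clinical relevance,
    # here: below vs. at-or-above the 1.5 Gy threshold mentioned in the abstract.
    THRESHOLD = 1.5
    truth = [t >= THRESHOLD for t in true_doses]
    pred  = [e >= THRESHOLD for e in estimated]

    # Confusion-matrix counts for the binary categories.
    tp = sum(p and q for p, q in zip(pred, truth))
    tn = sum((not p) and (not q) for p, q in zip(pred, truth))
    fp = sum(p and (not q) for p, q in zip(pred, truth))
    fn = sum((not p) and q for p, q in zip(pred, truth))

    accuracy    = (tp + tn) / len(truth)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    print(round(mad, 3), accuracy, sensitivity, specificity)
    ```

    With these invented numbers the MAD is about 0.26 Gy; one sample near the threshold (estimated 1.3 Gy vs. true 1.5 Gy) falls into the wrong binary category, which is exactly the kind of near-threshold misclassification that drives the efficiency differences the study reports.
    
    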

    Dystropathology increases energy expenditure and protein turnover in the mdx mouse model of Duchenne muscular dystrophy

    The skeletal muscles in Duchenne muscular dystrophy and the mdx mouse model lack functional dystrophin and undergo repeated bouts of necrosis, regeneration, and growth. These processes have a high metabolic cost. However, the consequences for whole-body energy and protein metabolism, and for the dietary requirements for these macronutrients at different stages of the disease, are not well understood. This study used juvenile (4- to 5-wk-old) and adult (12- to 14-wk-old) male dystrophic C57BL/10ScSn-mdx/J and age-matched C57BL/10ScSn/J control male mice to measure total and resting energy expenditure, food intake, spontaneous activity, body composition, whole-body protein turnover, and muscle protein synthesis rates. In juvenile mdx mice, which have extensive muscle damage, energy expenditure, muscle protein synthesis, and whole-body protein turnover rates were higher than in age-matched controls. Adaptations in food intake and decreased activity were insufficient to meet the increased energy and protein needs of juvenile mdx mice and resulted in stunted growth. In (non-growing) adult mdx mice with less severe dystropathology, energy expenditure, muscle protein synthesis, and whole-body protein turnover rates were also higher than in age-matched controls. Food intake was sufficient to meet their protein and energy needs, but insufficient to result in fat deposition. These data show that dystropathology increases the protein and energy needs of mdx mice and that tailored dietary interventions are necessary to redress this imbalance. If these needs are not met, the resulting imbalance blunts growth and may limit the benefits of therapies designed to protect and repair dystrophic muscles.

    Quantitation of brain metabolites by HRMAS-NMR spectroscopy in rats exposed to sublethal irradiation.

    Purpose: In the event of acute total-body irradiation, whether therapeutic or accidental, the pathophysiology underlying the long-term neurological effects is unknown. We developed an adult rat model in which behavioral assays were performed at frequent intervals before and after non-lethal whole-body ionizing irradiation (60Co, 4.5 Gy). Learning and memory are aspects of cognition that involve mainly the hippocampus. We used high-resolution magic angle spinning (HRMAS) 1H NMR spectroscopy to characterize the biochemistry of specific brain regions. The biological data for each animal will be compared with its behavioral performance in order to identify any possible correlations. A better understanding of the pathophysiological processes in the central nervous system (CNS) would help identify means of preventing or mitigating radiation-induced late neurological effects. Experimental procedures: Twenty male Wistar rats were used for each sampling period, of which ten were gamma-irradiated (4.5 Gy). The cerebral structures (cortex, striata, anterior and posterior hippocampus, hypothalamus) were removed at three time points: 48 hours, 8 days and 30 days after irradiation. The HRMAS 1H NMR experiments were performed on a Bruker DRX Avance spectrometer at 9.4 T. Samples were spun at 4 kHz and the temperature was maintained at 4°C. A spin-echo sequence with a 30 ms total echo time was used. Eighteen metabolites were included in the basis set. They were quantitated using the QUEST procedure of the jMRUI software and analyzed statistically. In addition, another group of animals was irradiated and tested under the same conditions, and an immunohistological study of apoptosis and neurogenesis in the CNS was performed at the same time points. Results: The HRMAS NMR results show significant differences (p < 0.05) between the irradiated and non-irradiated groups. GPC decreased at 48 hours post-irradiation, whereas Cho and PC increased, possibly in relation to cerebral oedema. At day 8, a decrease in Gly and Tau and an increase in Gln were observed in the posterior hippocampus. One month after total-body irradiation, an increase in GABA was observed in the cortex and striatum. Perspectives: The behavioral data, which show a significant difference in cognitive capacity between the irradiated and control groups at one month, suggest that relevant correlations with biochemical and morphological modifications of the CNS are possible. For example, the increased GABA levels in the cortex and striatum might explain the poorer performance in the learning test.

    Effect of wheat bran and wheat germ on the intestinal uptake of oleic acid, monoolein, and cholesterol in the rat

    The effects of fiber-rich wheat bran and wheat germ on the intestinal absorption of dietary cholesterol, free fatty acids, and monoglycerides were studied. Rats were given a test meal containing [14C]oleic acid, [14C]monoolein, and [3H]cholesterol. After a 1-hour digestion period, wheat bran or wheat germ (10% of meal solids) did not significantly modify the gastric emptying of lipids. No effect of wheat bran was observed on the amounts of lipids and cholesterol in the intestinal content or the mucosal segments, whereas wheat germ significantly increased the cholesterol in the small-intestinal content, decreasing its intestinal absorption. Both fractions only slightly influenced the levels of absorbed lipids and cholesterol in the plasma and liver. In vitro binding measurements showed that the wheat fractions bind only 7% to 15% of both lipids and cholesterol. The results indicate that wheat bran has no direct effect on the mucosal uptake process, whereas wheat germ might decrease the uptake of dietary cholesterol by an as yet unknown mechanism.

    Introduction to IEEE P1900.4 activities

    The Project Authorization Request (PAR) for the IEEE P1900.4 Working Group (WG), under the IEEE Standards Coordinating Committee 41 (SCC41), was approved in December 2006, and the WG was officially launched in February 2007 [1]. The scope of this standard is to devise a functional architecture comprising building blocks that enable coordinated, distributed decision making between networks and devices, with the goal of aiding the optimization of radio resource usage, including spectrum access control, in heterogeneous wireless access networks. This paper introduces the activities and work in progress in IEEE P1900.4, including its scope and purpose in Sects. 1 and 2, the reference usage scenarios where the standard would be applicable in Sect. 4, and its current system architecture in Sect. 5. © 2008 The Institute of Electronics, Information and Communication Engineers

    Laboratory intercomparison of the dicentric chromosome analysis assay.

    This study presents an intercomparison of laboratories performing dose assessment using dicentric chromosome analysis (DCA) as a diagnostic triage tool for individual radiation dose assessment. Homogeneously X-irradiated (240 kVp, 1 Gy/min) blood samples for establishing calibration data (0.25-5 Gy), as well as blind samples (0.1-6.4 Gy), were sent to the participants. DCA was performed according to established protocols. The time taken to report dose estimates was documented for each laboratory. Additional information concerning laboratory organization/characteristics as well as assay performance was collected. The mean absolute difference (MAD) was calculated, and radiation doses were merged into four triage categories reflecting clinical aspects to calculate accuracy, sensitivity and specificity. The earliest report time was 2.4 days after sample arrival. DCA dose estimates were reported with high and comparable accuracy, with MAD values ranging between 0.16 and 0.5 Gy for both manual and automated scoring. No significant differences were found for dose estimates based on 20, 30, 40 or 50 cells, suggesting that the number of scored cells can be reduced from 50 to 20 without loss of precision of triage dose estimates, at least for homogeneous exposure scenarios. Triage categories of clinical significance could be discriminated efficiently using both scoring procedures.

    Laboratory intercomparison of the cytokinesis-block micronucleus assay.

    The focus of this study is an intercomparison of laboratories' dose-assessment performance using the cytokinesis-block micronucleus (CBMN) assay as a diagnostic triage tool for individual radiation dose assessment. Homogeneously X-irradiated (240 kVp, 1 Gy/min) blood samples for establishing calibration data (0.25-5 Gy), as well as blind samples (0.1-6.4 Gy), were sent to the participants. The CBMN assay was performed according to protocols established individually by, and varying among, the participating laboratories. The time taken to report dose estimates was documented for each laboratory. Additional information concerning laboratory organization/characteristics as well as assay performance was collected. The mean absolute difference (MAD) was calculated, and radiation doses were merged into four triage categories reflecting clinical aspects to calculate accuracy, sensitivity and specificity. The earliest report time was 4 days after sample arrival. CBMN dose estimates were reported with high accuracy (MAD values of 0.20-0.50 Gy at doses below 6.4 Gy for both manual and automated scoring procedures), but the assay showed a limitation at the 6.4 Gy dose point, which resulted in a clear dose underestimation in all cases. The MAD values (excluding 6.4 Gy) differed significantly (P = 0.03) between manual (0.25 Gy, SEM = 0.06, n = 4) and automated scoring procedures (0.37 Gy, SEM = 0.08, n = 5), but the lowest MAD values were equal (0.2 Gy) for both. Likewise, both scoring procedures led to the same allocation of dose estimates to triage categories of clinical significance (about 83% accuracy and up to 100% specificity).