6 research outputs found

    Risk-Based Bridge Inspection Practices

    Improving bridge safety, reliability, and the allocation of bridge inspection resources are the goals of the proposed risk-based bridge inspection practices. Currently, most bridges in the United States are inspected at a fixed calendar interval of 24 months, without regard to the condition of the bridge. Newer bridges with little or no damage are inspected at the same frequency as older, more deteriorated bridges, creating inefficiency in the allocation of inspection resources. Because resources are limited and equal effort is spent on bridges in good condition, inspectors cannot devote the extra time needed to examine bridges in poor condition. In addition, no quantitative evidence exists to suggest that the 24-month interval achieves the desired level of safety. The proposed methodology incorporates reliability theory and expert elicitation from the Indiana Department of Transportation's Risk Assessment Panel, developed during this research, to rationally determine bridge inspection needs. Assessments are made based on the likelihood and consequence of failure for specific bridge components. The likelihood of failure is determined through attributes based on design, loading, and condition characteristics, while the consequence of failure is based on expected structural capacity, public safety, and serviceability. By combining the expressions of likelihood and consequence for each component, an optimum inspection interval for the entire bridge can be determined through the use of risk matrices. The methodology was evaluated through case studies involving Indiana bridges. Over 30 years of historical inspection reports were utilized in the back-casting process to evaluate deterioration levels and assess the adequacy of the risk criteria. Results of the case studies indicated that the risk analysis procedures provided suitable inspection intervals ranging from 24 to 72 months for Indiana bridges.
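
    As a rough illustration of the risk-matrix step described above, the sketch below maps component-level likelihood and consequence categories to an inspection interval and governs the bridge by its most critical component. The category labels, the interval table, and the component names are illustrative assumptions, not the panel's actual criteria; only the 24-72 month range comes from the abstract.

    ```python
    # Minimal sketch of a risk-matrix lookup for inspection intervals.
    # The likelihood/consequence categories and the interval table are
    # illustrative assumptions; the actual criteria were developed by
    # INDOT's Risk Assessment Panel and are not reproduced here.

    LIKELIHOOD = ["remote", "low", "moderate", "high"]     # assumed categories
    CONSEQUENCE = ["low", "moderate", "high", "severe"]    # assumed categories

    # Inspection interval in months for each (likelihood, consequence) pair.
    # Higher risk -> shorter interval; values chosen only to span the
    # 24-72 month range reported in the case studies.
    INTERVAL_MONTHS = [
        # consequence: low  moderate  high  severe
        [72, 72, 48, 48],  # likelihood: remote
        [72, 48, 48, 24],  # likelihood: low
        [48, 48, 24, 24],  # likelihood: moderate
        [48, 24, 24, 24],  # likelihood: high
    ]

    def component_interval(likelihood: str, consequence: str) -> int:
        """Look up the inspection interval for one bridge component."""
        i = LIKELIHOOD.index(likelihood)
        j = CONSEQUENCE.index(consequence)
        return INTERVAL_MONTHS[i][j]

    def bridge_interval(components: dict[str, tuple[str, str]]) -> int:
        """The bridge is governed by its most critical component,
        i.e. the shortest component-level interval."""
        return min(component_interval(l, c) for l, c in components.values())

    # Hypothetical assessment for one bridge:
    bridge = {
        "deck":           ("moderate", "moderate"),
        "superstructure": ("low", "high"),
        "substructure":   ("remote", "moderate"),
    }
    print(bridge_interval(bridge))  # -> 48
    ```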

    Suggested guidelines for validation of real-time PCR assays in veterinary diagnostic laboratories

    This consensus document presents the suggested guidelines developed by the Laboratory Technology Committee (LTC) of the American Association of Veterinary Laboratory Diagnosticians (AAVLD) for development, validation, and modification (methods comparability) of real-time PCR (rtPCR) assays. These suggested guidelines are presented with reference to the World Organisation for Animal Health (OIE) guidelines for validation of nucleic acid detection assays used in veterinary diagnostic laboratories. Additionally, our proposed practices are compared to the guidelines from the Foods Program Regulatory Subdivision of the U.S. Food and Drug Administration (FDA) and from the American Society for Veterinary Clinical Pathology (ASVCP). The LTC suggestions are closely aligned with those from the OIE and comply with version 2021-01 of the AAVLD Requirements for an Accredited Veterinary Medical Diagnostic Laboratory, although some LTC recommendations are more stringent and extend beyond the AAVLD requirements. The LTC suggested guidelines differ substantially from the guidelines recently published by the U.S. FDA for validation and modification of regulated tests used for detection of pathogens in pet food and animal-derived products, such as dairy. Veterinary diagnostic laboratories that perform assays from the FDA Bacteriological Analytical Manual (BAM) must be aware of these different standards.

    Body-composition changes in the Comprehensive Assessment of Long-term Effects of Reducing Intake of Energy (CALERIE)-2 study: A 2-y randomized controlled trial of calorie restriction in nonobese humans

    Background: Calorie restriction (CR) retards aging and increases longevity in many animal models. However, it is unclear whether CR can be implemented in humans without adverse effects on body composition. Objective: We evaluated the effect of a 2-y CR regimen on body composition, including the influence of sex and body mass index (BMI; in kg/m2), among participants enrolled in CALERIE-2 (Comprehensive Assessment of Long-term Effects of Reducing Intake of Energy), a multicenter, randomized controlled trial. Design: Participants were 218 nonobese (BMI: 21.9-28.0) adults aged 21-51 y who were randomly assigned to 25% CR (CR, n = 143) or ad libitum control (AL, n = 75) in a 2:1 ratio. Measures at baseline and at 12 and 24 mo included body weight, waist circumference, fat mass (FM), fat-free mass (FFM), and appendicular mass by dual-energy X-ray absorptiometry; activity-related energy expenditure (AREE) by doubly labeled water; and dietary protein intake by self-report. Values are expressed as means ± SDs. Results: The CR group achieved 11.9% ± 0.7% CR over the 2 y and had significant decreases in weight (-7.6 ± 0.3 compared with 0.4 ± 0.5 kg), waist circumference (-6.2 ± 0.4 compared with 0.9 ± 0.5 cm), FM (-5.4 ± 0.3 compared with 0.5 ± 0.4 kg), and FFM (-2.0 ± 0.2 compared with -0.0 ± 0.2 kg) at 24 mo relative to the AL group (all between-group P < 0.001). Moreover, FFM as a percentage of body weight at 24 mo was higher, and the percentage of FM was lower, in the CR group than in the AL group. AREE, but not protein intake, predicted preservation of FFM during CR (P < 0.01). Men in the CR group lost significantly more trunk fat (P = 0.03) and FFM expressed as a percentage of weight loss (P < 0.001) than women in the CR group. Conclusions: Two years of CR had broadly favorable effects on both whole-body and regional adiposity that could facilitate health span in humans. The decrements in FFM were commensurate with the reduced body mass; although men in the CR group lost more FFM than the women did, the percentage of FFM in the men in the CR group was higher than at baseline. CALERIE was registered at clinicaltrials.gov as NCT00427193.
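
    As a quick worked check of the "commensurate" claim above, the CR-group means reported in the abstract give the fraction of total weight loss attributable to fat-free mass (a back-of-the-envelope sketch; it uses only the point estimates and ignores the control-group changes):

    ```python
    # Back-of-the-envelope check using the CR-group means from the abstract:
    # what fraction of the total weight lost was fat-free mass?
    delta_weight_kg = -7.6  # mean weight change at 24 mo
    delta_fm_kg = -5.4      # mean fat-mass change
    delta_ffm_kg = -2.0     # mean fat-free-mass change

    ffm_fraction = delta_ffm_kg / delta_weight_kg
    print(f"FFM accounted for {ffm_fraction:.0%} of the weight lost")  # ~26%
    ```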