Air-breathing hypersonic vehicle guidance and control studies; An integrated trajectory/control analysis methodology: Phase 1
A tool which generates optimal trajectory/control histories in an integrated manner is generically adapted to the treatment of single-stage-to-orbit air-breathing hypersonic vehicles. The methodology is implemented as a two-point boundary value problem solution technique. Its use permits an assessment of an entire near-minimum-fuel trajectory and desired control strategy from takeoff to orbit, while satisfying physically derived inequality constraints and achieving efficient propulsive mode phasing. A simpler analysis strategy that partitions the trajectory into several boundary-condition-matched segments is also included, to construct preliminary trajectory and control history representations with less computational burden than is required for the overall flight profile assessment. A demonstration was accomplished using a tabulated example (winged-cone accelerator) vehicle model combined with a newly developed multidimensional cubic spline data smoothing routine. A constrained near-fuel-optimal trajectory, imposing a dynamic pressure limit of 1000 psf, was developed from horizontal takeoff to 20,000 ft/sec relative air speed while aiming for a polar orbit. Previously unspecified propulsive discontinuities were located. Flight regimes demanding rapid attitude changes were identified, and the control effector and closed-loop controller authority they dictate was ascertained after evaluating effector use for vehicle trim. Also, inadequacies in vehicle model representations and specific subsystem models with insufficient fidelity were identified based on unusual control characteristics and/or excessive sensitivity to uncertainty.
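A minimal sketch of two building blocks the abstract mentions, assuming Python with NumPy/SciPy: bicubic spline smoothing of a tabulated aero coefficient (standing in for the study's multidimensional cubic spline routine) and a check of the 1000 psf dynamic pressure constraint along candidate trajectory points. The exponential atmosphere and all table values are illustrative assumptions, not the study's models.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

PSF_LIMIT = 1000.0  # dynamic pressure ceiling from the study, lb/ft^2

def rho_slug_ft3(alt_ft):
    # Crude exponential atmosphere (an assumption, not the paper's model):
    # sea-level density 0.0023769 slug/ft^3, ~23,800 ft scale height.
    return 0.0023769 * np.exp(-alt_ft / 23800.0)

def dynamic_pressure(v_fps, alt_ft):
    # q = 1/2 * rho * V^2, in lb/ft^2 (psf)
    return 0.5 * rho_slug_ft3(alt_ft) * v_fps**2

# Smooth a tabulated (Mach, alpha) lift-coefficient table with a bicubic
# spline; the table contents here are random placeholders.
mach = np.linspace(0.5, 20.0, 15)
alpha = np.linspace(-2.0, 10.0, 9)  # deg
cl_table = np.random.default_rng(0).normal(0.3, 0.05, (15, 9))
cl_smooth = RectBivariateSpline(mach, alpha, cl_table, kx=3, ky=3, s=0.1)

# Evaluate the smoothed coefficient at Mach 5, alpha = 3 deg.
print(float(cl_smooth(5.0, 3.0)[0, 0]))

# Flag trajectory points that violate the q <= 1000 psf constraint.
alts = np.array([30e3, 60e3, 90e3])          # ft
vels = np.array([2000.0, 6000.0, 12000.0])   # ft/s
print(dynamic_pressure(vels, alts) > PSF_LIMIT)
```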
Human health risks due to consumption of chemically contaminated fishery products.
A small proportion of fishery products contaminated with appreciable amounts of potentially hazardous inorganic and organic contaminants from natural and environmental sources seems to pose the greatest potential for toxicity to consumers of fishery products in the United States. Health risks due to chemicals (e.g., modest changes in the overall risk of cancer, subtle deficits of neurological development in fetuses and children) are difficult to measure directly in people exposed to low levels. Reduced immunocompetence may increase cancer risk. Inferences about the potential magnitude of these problems must be based on the levels of the specific chemicals present, observations of human populations and experimental animals exposed to relatively high doses, theories about the likely mechanisms of action of specific intoxicants, and the population distribution of sensitivity among exposed humans. Lognormal distributions were found to provide good descriptions of the pattern of variation of contaminant concentrations among different species and geographic areas; this variability offers a route to reduced exposure through restricting the harvest of aquatic animals from certain sites and excluding certain species. Available information suggests that risks are not generally of high magnitude; nevertheless, their control will significantly improve public health. (ABSTRACT TRUNCATED AT 250 WORDS)
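A brief illustration of the lognormal point, assuming Python with SciPy: fit a lognormal to hypothetical site-level contaminant concentrations, then estimate how much the mean exposure falls when harvest is restricted above a high percentile. All concentrations here are synthetic.

```python
import numpy as np
from scipy import stats

# Hypothetical contaminant concentrations (ppm) across sites/species;
# the lognormal shape, not these particular numbers, is the point.
rng = np.random.default_rng(1)
conc = rng.lognormal(mean=np.log(0.2), sigma=1.0, size=500)

# Fit a lognormal with location fixed at 0, as is usual for concentrations.
shape, loc, scale = stats.lognorm.fit(conc, floc=0)
mu, sigma = np.log(scale), shape
print(f"fitted GM={np.exp(mu):.3f} ppm, GSD={np.exp(sigma):.2f}")

# Effect of restricting harvest: exclude sites above the 90th percentile
# and compare the mean concentration in the remaining catch.
cap = stats.lognorm.ppf(0.90, shape, loc=0, scale=scale)
kept = conc[conc <= cap]
print(f"mean before: {conc.mean():.3f} ppm, after: {kept.mean():.3f} ppm")
```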
Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?
Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic but scientifically better-founded approach to mixture risk assessment.
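A small Monte Carlo sketch of the distribution-dependence point, in Python. The nominal sub-factors (4.0, 2.5, 3.16, 3.16, whose product is roughly 100) follow the commonly cited toxicokinetic/toxicodynamic split of the default factor; the lognormal and uniform distributions placed around them are arbitrary assumptions, chosen only to show that the upper percentiles of the combined factor depend strongly on that choice.

```python
import numpy as np

# The combined uncertainty factor is the product of four sub-factors
# (interspecies/intraspecies x toxicokinetic/toxicodynamic).
rng = np.random.default_rng(42)
n = 100_000
nominal = [4.0, 2.5, 3.16, 3.16]  # point values multiplying to ~100

# Choice A: lognormal sub-factors centred on the nominals, GSD ~ 2.
logn = np.prod([rng.lognormal(np.log(m), np.log(2.0), n) for m in nominal],
               axis=0)

# Choice B: uniform sub-factors on [1, 2 * nominal].
unif = np.prod([rng.uniform(1.0, 2.0 * m, n) for m in nominal], axis=0)

# Comparing the printed medians and 95th percentiles shows how much the
# probabilistic result hinges on the assumed distributions.
for name, x in [("lognormal", logn), ("uniform", unif)]:
    print(f"{name:9s} median={np.median(x):7.1f}  "
          f"95th pct={np.percentile(x, 95):8.1f}")
```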
Performance-based building and innovation: Balancing client and industry needs
One reason for the interest in performance-based building (PBB) is that it is commonly advocated as a powerful way of enhancing innovation performance: it articulates building performance outcomes and offers the relevant procurement actors the discretion to innovate in order to meet these performance requirements more effectively and/or efficiently. The paper argues that the current approach to PBB assumes that the relevant actors have the capacity, ability and motivation to innovate from a business perspective. It is proposed that the prevailing conceptualization of PBB is too restrictive and should be broadened explicitly to accommodate the business logic that must be in place before actors will innovate. The relevant performance-based building and innovation literature is synthesized to support this assertion. The paper concludes with an innovation-focused definition of performance-based building.
State-of-the-Science Workshop Report: Issues and Approaches in Low-Dose–Response Extrapolation for Environmental Health Risk Assessment
Low-dose extrapolation model selection for evaluating the health effects of environmental pollutants is a key component of the risk assessment process. At a workshop held in Baltimore, Maryland, on 23–24 April 2007, sponsored by the U.S. Environmental Protection Agency and the Johns Hopkins Risk Sciences and Public Policy Institute, a multidisciplinary group of experts reviewed the state of the science regarding low-dose extrapolation modeling and its application in environmental health risk assessments. Participants identified discussion topics based on a literature review, which included examples for which human responses to ambient exposures have been extensively characterized for cancer and/or noncancer outcomes. Topics included the need for formalized approaches and criteria to assess the evidence for mode of action (MOA), the use of human versus animal data, the use of MOA information in biologically based models, and the implications of interindividual variability, background disease processes, and background exposures in threshold versus nonthreshold model choice. Participants recommended approaches that differ from current practice for extrapolating high-dose animal data to low-dose human exposures, including categorical approaches for integrating information on MOA, statistical approaches such as model averaging, and inference-based models that explicitly consider uncertainty and interindividual variability.
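One of the statistical approaches named above, model averaging, can be sketched in a few lines of Python with SciPy: fit two candidate dose-response models to quantal data, weight them by AIC, and average their predicted risks at a low dose. The data, model forms, and starting values are illustrative assumptions, not taken from the workshop report.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic quantal bioassay data: responders out of n per dose group.
dose = np.array([0.0, 10.0, 30.0, 100.0, 300.0])
n = np.array([50, 50, 50, 50, 50])
resp = np.array([1, 3, 6, 15, 34])

def logistic(d, a, b):
    # Two-parameter logistic dose-response model.
    return 1.0 / (1.0 + np.exp(-(a + b * d)))

def weibull(d, bg, k, lam):
    # Weibull model with background rate bg.
    return bg + (1.0 - bg) * (1.0 - np.exp(-(d / lam) ** k))

def binom_aic(p, k_params):
    # AIC under a binomial likelihood for fitted response probabilities p.
    ll = np.sum(resp * np.log(p) + (n - resp) * np.log(1.0 - p))
    return 2 * k_params - 2 * ll

p_hat = resp / n
popt1, _ = curve_fit(logistic, dose, p_hat, p0=[-3.0, 0.01])
popt2, _ = curve_fit(weibull, dose, p_hat, p0=[0.02, 1.0, 200.0],
                     bounds=([1e-3, 0.1, 1.0], [0.5, 5.0, 1e4]))

# Akaike weights for the two candidate models.
aics = np.array([binom_aic(logistic(dose, *popt1), 2),
                 binom_aic(weibull(dose, *popt2), 3)])
w = np.exp(-0.5 * (aics - aics.min()))
w /= w.sum()

d_low = 1.0  # a low environmental dose
avg = w[0] * logistic(d_low, *popt1) + w[1] * weibull(d_low, *popt2)
print(f"weights={w.round(3)}, model-averaged risk at dose {d_low}: {avg:.4f}")
```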