
    Minimizing Bias in Biomass Allometry: Model Selection and Log‐Transformation of Data

    Nonlinear regression is increasingly used to develop allometric equations for forest biomass estimation (as opposed to the traditional approach of log‐transformation followed by linear regression). Most statistical software packages, however, assume additive errors by default, violating a key assumption of allometric theory and possibly producing spurious models. Here, we show that such models may bias stand‐level biomass estimates by up to 100 percent in young forests, and we present an alternative nonlinear fitting approach that conforms with allometric theory.
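    The additive-versus-multiplicative error distinction can be illustrated with a small simulation (a sketch, not the paper's code; the parameter values and error magnitude are invented). Allometric theory implies multiplicative, log-normal error around y = a·x^b, which a log-scale linear fit respects, while default nonlinear least squares assumes additive error:

```python
# Hypothetical illustration: allometric data with multiplicative log-normal
# error. Default nonlinear least squares assumes additive error; fitting the
# log-transformed model assumes multiplicative error, matching allometric theory.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
a_true, b_true = 0.1, 2.5
x = rng.uniform(1, 50, 200)                                   # e.g. stem diameters
y = a_true * x**b_true * np.exp(rng.normal(0, 0.4, x.size))   # multiplicative error

power = lambda x, a, b: a * x**b

# Additive-error fit (software default): implicitly weights the largest trees.
(a_add, b_add), _ = curve_fit(power, x, y, p0=(0.5, 2.0))

# Multiplicative-error fit: ordinary linear regression on the log scale.
b_mul, log_a_mul = np.polyfit(np.log(x), np.log(y), 1)
a_mul = np.exp(log_a_mul)

print(f"additive-error fit:       a={a_add:.3f}, b={b_add:.3f}")
print(f"multiplicative-error fit: a={a_mul:.3f}, b={b_mul:.3f}")
```

The two fits disagree most for small stems, which is consistent with the paper's finding that the bias is largest in young forests.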

    Design of Launch Abort System Thrust Profile and Concept of Operations

    This paper describes how the Abort Motor thrust profile has been tailored and how optimizing the Concept of Operations on the Launch Abort System (LAS) of the Orion Crew Exploration Vehicle (CEV) aids in getting the crew safely away from a failed Crew Launch Vehicle (CLV). Unlike the passive nature of the Apollo system, the Orion Launch Abort Vehicle will be actively controlled, giving the program a more robust abort system with a higher probability of crew survival for an abort at all points throughout the CLV trajectory. By optimizing the concept of operations and thrust profile, the Orion program will be able to take full advantage of the active Orion LAS. Discussion will involve an overview of the development of the abort motor thrust profile and the current abort concept of operations, as well as their effects on the performance of LAS aborts. Pad Abort (for performance) and Maximum Drag (for separation from the Launch Vehicle) are the two points that dictate the required thrust and shape of the thrust profile. The results in this paper show that 95% success across all performance requirements is not currently met for Pad Abort. Future improvements to the current parachute sequence and other potential changes will mitigate the current problems and meet abort performance requirements.

    Intravenous iron or placebo for anaemia in intensive care: the IRONMAN multicentre randomized blinded trial. A randomized trial of IV iron in critical illness

    PURPOSE: Both anaemia and allogeneic red blood cell transfusion are common and potentially harmful in patients admitted to the intensive care unit. Whilst intravenous iron may decrease anaemia and RBC transfusion requirement, the safety and efficacy of administering iron intravenously to critically ill patients are uncertain. METHODS: The multicentre, randomized, placebo-controlled, blinded Intravenous Iron or Placebo for Anaemia in Intensive Care (IRONMAN) study was designed to test the hypothesis that, in anaemic critically ill patients admitted to the intensive care unit, early administration of intravenous iron, compared with placebo, reduces allogeneic red blood cell transfusion during hospital stay and increases the haemoglobin level at the time of hospital discharge. RESULTS: Of 140 patients enrolled, 70 were assigned to intravenous iron and 70 to placebo. The iron group received 97 red blood cell units versus 136 units in the placebo group, yielding an incidence rate ratio of 0.71 [95% confidence interval 0.43-1.18, P = 0.19]. Overall, median haemoglobin at hospital discharge was significantly higher in the intravenous iron group than in the placebo group [107 g/L (interquartile range, IQR, 97-115) vs. 100 g/L (IQR 89-111), P = 0.02]. There was no significant difference between the groups in any safety outcome. CONCLUSIONS: In patients admitted to the intensive care unit who were anaemic, intravenous iron, compared with placebo, did not result in a significant lowering of red blood cell transfusion requirement during hospital stay. Patients who received intravenous iron had a significantly higher haemoglobin concentration at hospital discharge. The trial was registered at http://www.anzctr.org.au as #ACTRN12612001249842.
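    The reported point estimate can be reproduced with a back-of-the-envelope calculation (not the trial's actual analysis): assuming equal exposure in both arms, the incidence rate ratio is simply the ratio of unit counts, with a crude Poisson-based confidence interval. The trial's reported interval (0.43-1.18) is wider, presumably because it accounts for patient-level exposure and clustering that this naive calculation ignores:

```python
# Crude check of the reported incidence rate ratio under the simplifying
# assumption of equal observation time per arm.
import math

iron_units, placebo_units = 97, 136
irr = iron_units / placebo_units            # ratio of transfusion counts, ~0.71

# Approximate 95% CI on the log scale, treating counts as Poisson.
se_log = math.sqrt(1 / iron_units + 1 / placebo_units)
lo = math.exp(math.log(irr) - 1.96 * se_log)
hi = math.exp(math.log(irr) + 1.96 * se_log)
print(f"IRR = {irr:.2f}, crude 95% CI ({lo:.2f}, {hi:.2f})")
```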

    Subjective evaluation of the environmental quality in China's industrial corridors.

    Based on 270 questionnaire surveys in 8 cities of 5 industrial corridors in China, this study aims to examine the effects of industrial construction on the evaluation of environmental pollution, natural environment, built environment, personal perception, and development and policy. The results show that the evaluations of environmental pollution and landscape design are both below the medium level, whereas the evaluations of living comfort and safety are both above the medium level. Further analysis shows that females usually give lower evaluation scores than males, and that age and health status are negatively related to the evaluation results; respondents express a strong desire to reduce environmental pollution and protect the natural environment. Moreover, the landscape was analysed using colour extraction techniques based on video recordings: there are significant correlations between the industrial pixel ratio and evaluated air quality, between the vegetation pixel ratio and evaluated river water quality, and between the public facilities pixel ratio and evaluated comfort levels.
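    The pixel-ratio idea can be sketched as follows (the colour threshold and the synthetic data are invented for illustration; the study's actual extraction method is not described in the abstract): estimate a "vegetation pixel ratio" for a video frame as the fraction of green-dominant pixels, then correlate site-level ratios with mean questionnaire scores.

```python
# Illustrative sketch of colour extraction plus correlation analysis.
import numpy as np
from scipy.stats import pearsonr

def vegetation_pixel_ratio(frame_rgb):
    """Fraction of pixels whose green channel dominates red and blue."""
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    return ((g > r) & (g > b)).mean()

rng = np.random.default_rng(1)
# Synthetic stand-ins for the 8 surveyed cities: per-site pixel ratios and
# evaluation scores that loosely track the ratio.
ratios = rng.uniform(0.05, 0.6, 8)
scores = 2 + 4 * ratios + rng.normal(0, 0.3, 8)

r_val, p_val = pearsonr(ratios, scores)
print(f"Pearson r = {r_val:.2f} (p = {p_val:.3f})")
```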

    Hybrid Reynolds-Averaged/Large Eddy Simulation of the Flow in a Model SCRamjet Cavity Flameholder

    Steady-state and scale-resolving simulations have been performed for flow in and around a model scramjet combustor flameholder. Experimental data available for this configuration include velocity statistics obtained from particle image velocimetry. Several turbulence models were used for the steady-state Reynolds-averaged simulations, including both linear and non-linear eddy viscosity models. The scale-resolving simulations used a hybrid Reynolds-averaged/large eddy simulation strategy that is designed to be a large eddy simulation everywhere except in the inner portion (log layer and below) of the boundary layer. Hence, this formulation can be regarded as a wall-modeled large eddy simulation. This effort was undertaken not only to assess the performance of the hybrid Reynolds-averaged/large eddy simulation modeling approach in a flowfield of interest to the scramjet research community, but also to begin to understand how this capability can best be used to augment standard Reynolds-averaged simulations. The numerical errors were quantified for the steady-state simulations, and at least qualitatively assessed for the scale-resolving simulations, prior to making any claims of predictive accuracy relative to the measurements. The steady-state Reynolds-averaged results displayed a high degree of variability when comparing the flameholder fuel distributions obtained from each turbulence model. This prompted the consideration of applying the higher-fidelity scale-resolving simulations as a surrogate "truth" model to calibrate the Reynolds-averaged closures in a non-reacting setting prior to their use for the combusting simulations. In general, the Reynolds-averaged velocity profile predictions at the lowest fueling level matched the particle imaging measurements almost as well as was observed for the non-reacting condition. However, the velocity field predictions proved to be more sensitive to the flameholder fueling rate than was indicated in the measurements.
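    The RANS/LES switch described above can be caricatured with a generic detached-eddy-style length-scale limiter (a common construction in hybrid methods, not the specific blending used in this work; the constant and the grid values are illustrative): the model behaves as RANS wherever the RANS length scale is smaller than the grid-based LES scale, i.e. in the inner boundary layer, and as LES elsewhere.

```python
# Generic DES-style hybrid length scale: min(RANS scale, C_DES * grid spacing).
import numpy as np

C_DES = 0.65                                 # a typical DES calibration constant

def hybrid_length_scale(l_rans, delta_grid):
    """RANS branch where l_rans is small (near-wall), LES branch elsewhere."""
    return np.minimum(l_rans, C_DES * delta_grid)

# Near the wall the RANS scale (~ wall distance) is the smaller one -> RANS;
# away from the wall the grid scale limits the length scale -> LES.
d_wall = np.array([1e-4, 1e-3, 1e-2, 0.1, 1.0])   # wall distances (m)
delta = np.full_like(d_wall, 5e-3)                # uniform grid spacing (m)
print(hybrid_length_scale(d_wall, delta))
```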

    Molecular methodologies for improved polymicrobial sepsis diagnosis

    Polymicrobial sepsis is associated with worse patient outcomes than monomicrobial sepsis. Routinely used culture-dependent microbiological diagnostic techniques have low sensitivity, often leading to missed identification of all causative organisms. To overcome these limitations, culture-independent methods incorporating advanced molecular technologies have recently been explored. However, contamination, assay inhibition and interference from host DNA are issues that must be addressed before these methods can be relied on for routine clinical use. While the host component of the complex sepsis host–pathogen interplay is well described, less is known about the pathogen's role, including pathogen–pathogen interactions in polymicrobial sepsis. This review highlights the clinical significance of polymicrobial sepsis and addresses how promising alternative molecular microbiology methods can be improved to detect polymicrobial infections. It also discusses how shotgun metagenomics can be applied to uncover pathogen–pathogen interactions in polymicrobial sepsis cases and their potential role in the clinical course of this condition.

    Deconvolution in Random Effects Models via Normal Mixtures

    This dissertation describes a minimum distance method for density estimation when the variable of interest is not directly observed. It is assumed that the underlying target density can be well approximated by a mixture of normals. The method compares a density estimate of observable data with a density of the observable data induced from assuming the target density can be written as a mixture of normals. The goal is to choose the parameters in the normal mixture that minimize the distance between the density estimate of the observable data and the induced density from the model. The method is applied to the deconvolution problem to estimate the density of X_i when Y_i = X_i + Z_i, i = 1, ..., n, is observed and the density of Z_i is known. Additionally, it is applied to a location random effects model to estimate the density of Z_ij when the observable quantities are p data sets of size n given by X_ij = alpha_i + gamma * Z_ij, i = 1, ..., p, j = 1, ..., n, where the densities of alpha_i and Z_ij are both unknown. The performance of the minimum distance approach in the measurement error model is compared with the deconvoluting kernel density estimator of Stefanski and Carroll (1990). In the location random effects model, the minimum distance estimator is compared with the explicit characteristic function inversion method from Hall and Yao (2003). In both models, the methods are compared using simulated and real data sets. In the simulations, performance is evaluated using an integrated squared error criterion. Results indicate that the minimum distance methodology is comparable to the deconvoluting kernel density estimator and outperforms the explicit characteristic function inversion method.
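    A minimal sketch of the minimum distance idea in the deconvolution setting (simplified to a two-component normal mixture; the component count, the grid-based L2 distance, and the optimizer are assumptions, not details from the dissertation): with Z_i ~ N(0, s^2) known, a normal-mixture model for X induces a mixture for Y = X + Z whose component variances are inflated by s^2, and the mixture parameters are chosen so the induced density matches a kernel estimate of the observed Y density.

```python
# Minimum-distance deconvolution sketch: fit a normal mixture for the latent X
# by matching the induced density of Y = X + Z to a KDE of the observed Y.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(2)
s = 0.5                                         # known noise standard deviation
x = np.concatenate([rng.normal(-2, 0.7, 600),   # true X: two-component mixture
                    rng.normal(2, 0.7, 400)])
y = x + rng.normal(0, s, x.size)                # observed Y = X + Z

grid = np.linspace(-6, 6, 200)
dx = grid[1] - grid[0]
kde_y = gaussian_kde(y)(grid)                   # density estimate of observable data

def induced_density(theta, t):
    w, m1, m2, v1, v2 = theta                   # weight, means, log-sds of X-mixture
    s1, s2 = np.exp(v1), np.exp(v2)
    # Convolving each normal component with N(0, s^2) inflates its variance.
    return (w * norm.pdf(t, m1, np.hypot(s1, s)) +
            (1 - w) * norm.pdf(t, m2, np.hypot(s2, s)))

def distance(theta):
    # Discretized integrated squared error between KDE and induced density.
    return float(((kde_y - induced_density(theta, grid))**2).sum() * dx)

theta0 = np.array([0.5, -1.0, 1.0, 0.0, 0.0])
res = minimize(distance, x0=theta0, method="Nelder-Mead",
               options={"maxiter": 2000})
w, m1, m2, v1, v2 = res.x
print(f"fitted weight {w:.2f}, means {m1:.2f}, {m2:.2f}")
```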

    Harmonising and linking biomedical and clinical data across disparate data archives to enable integrative cross-biobank research

    A wealth of biospecimen samples are stored in modern globally distributed biobanks. Biomedical researchers worldwide need to be able to combine the available resources to improve the power of large-scale studies. A prerequisite for this effort is to be able to search and access phenotypic, clinical and other information about samples that are currently stored at biobanks in an integrated manner. However, privacy issues together with heterogeneous information systems and the lack of agreed-upon vocabularies have made specimen searching across multiple biobanks extremely challenging. We describe three case studies where we have linked samples and sample descriptions in order to facilitate global searching of available samples for research. The use cases include the ENGAGE (European Network for Genetic and Genomic Epidemiology) consortium comprising at least 39 cohorts, the SUMMIT (surrogate markers for micro- and macro-vascular hard endpoints for innovative diabetes tools) consortium, and a pilot for data integration between a Swedish clinical health registry and a biobank. We used the Sample avAILability (SAIL) method for data linking: we first created harmonised variables and then annotated and made searchable information on the number of specimens available in individual biobanks for various phenotypic categories. By operating on this categorised availability data, we sidestep many obstacles related to privacy that arise when handling real values, and we show that harmonised and annotated records about data availability across disparate biomedical archives provide a key methodological advance in pre-analysis exchange of information between biobanks, that is, during the project planning phase.
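    The categorised-availability idea can be illustrated with a toy example (the biobank names, category names, and counts are invented): instead of exchanging record-level values, each biobank publishes per-category counts of available samples, and a query simply intersects those counts.

```python
# Toy availability registry: biobank -> harmonised phenotype category -> count.
availability = {
    "biobank_A": {"type2_diabetes": 1200, "smoking_status": 900, "bmi": 1500},
    "biobank_B": {"type2_diabetes": 300,  "smoking_status": 250},
    "biobank_C": {"type2_diabetes": 2100, "bmi": 2000},
}

def biobanks_with(categories, min_samples):
    """Biobanks reporting at least `min_samples` in every requested category."""
    return sorted(
        name for name, counts in availability.items()
        if all(counts.get(c, 0) >= min_samples for c in categories)
    )

print(biobanks_with({"type2_diabetes", "bmi"}, 1000))   # → ['biobank_A', 'biobank_C']
```

Because only aggregate counts are shared, this kind of query sidesteps the privacy obstacles that record-level linkage raises, which is the point the abstract makes about pre-analysis exchange.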