646 research outputs found

    An Expert System-Driven Method for Parametric Trajectory Optimization During Conceptual Design

    During the early phases of engineering design, the costs committed are high, costs incurred are low, and the design freedom is high. It is well documented that decisions made in these early design phases drive the entire design's life cycle cost. In a traditional paradigm, key design decisions are made when little is known about the design. As the design matures, design changes become more difficult in both cost and schedule to enact. The current capability-based paradigm, which has emerged because of the constrained economic environment, calls for the infusion of knowledge usually acquired during later design phases into earlier design phases, i.e., bringing knowledge acquired during preliminary and detailed design into pre-conceptual and conceptual design. An area of critical importance to launch vehicle design is the optimization of its ascent trajectory, as the optimal trajectory will be able to take full advantage of the launch vehicle's capability to deliver a maximum amount of payload into orbit. Hence, the optimal ascent trajectory plays an important role in the vehicle's affordability posture, yet little of the information required to successfully optimize a trajectory is known early in the design phase. Thus, the current paradigm of optimizing ascent trajectories involves generating point solutions for every change in a vehicle's design parameters. This is often a tedious, manual, and time-consuming task for analysts. Moreover, the trajectory design space is highly non-linear and multi-modal due to the interaction of various constraints. When these obstacles are coupled with the Program to Optimize Simulated Trajectories (POST), an industry-standard but difficult-to-use program for optimizing ascent trajectories, expert trajectory analysts are required to effectively optimize a vehicle's ascent trajectory. In this paper, the authors discuss a methodology developed at NASA Marshall's Advanced Concepts Office to address these issues. The methodology is twofold: first, capture the heuristics developed by human analysts over their many years of experience; second, leverage the power of modern computing to evaluate multiple trajectories simultaneously and thereby enable exploration of the trajectory design space early in the pre-conceptual and conceptual phases of design. This methodology is coupled with design of experiments to train surrogate models, which makes trajectory design space visualization and parametric optimal ascent trajectory information available when early design decisions are being made.
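
    As a rough illustration of the workflow this abstract describes, the sketch below pairs a design-of-experiments sample with a trained surrogate model. It is a minimal sketch under stated assumptions: evaluate_trajectory, its two parameters, and the Gaussian-process surrogate are hypothetical stand-ins, since POST itself is not publicly scriptable.

    ```python
    # Sketch of the DOE-to-surrogate workflow described above. POST is not
    # scriptable here, so evaluate_trajectory is a hypothetical stand-in for
    # a batch of POST runs, and the Gaussian-process surrogate is an assumed
    # choice, not the authors' exact model.
    import numpy as np
    from scipy.stats import qmc
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def evaluate_trajectory(x):
        """Placeholder objective: design parameters (pitch-over angle in
        degrees, thrust-to-weight ratio) -> payload mass delivered to orbit."""
        pitch, t2w = x
        return 1000.0 - 50.0 * (pitch - 8.0) ** 2 + 300.0 * np.log(t2w)

    # Design of experiments: Latin-hypercube samples over the design space.
    sampler = qmc.LatinHypercube(d=2, seed=0)
    X = qmc.scale(sampler.random(n=40), l_bounds=[5.0, 1.1], u_bounds=[12.0, 1.6])
    y = np.array([evaluate_trajectory(x) for x in X])

    # Fit the surrogate; it can then predict payload for unseen designs
    # cheaply enough to map the trajectory design space early in design.
    surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    surrogate.fit(X, y)
    payload, sigma = surrogate.predict([[8.5, 1.3]], return_std=True)
    print(f"predicted payload: {payload[0]:.1f} (+/- {sigma[0]:.1f})")
    ```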

    Assessing the applicability of terrestrial laser scanning for mapping englacial conduits

    The morphology of englacial drainage networks and their temporal evolution are poorly characterised, particularly within cold ice masses. At present, direct observations of englacial channels are restricted in both spatial and temporal resolution. Through novel use of a terrestrial laser scanning (TLS) system, the interior geometry of an englacial channel in Austre Brøggerbreen, Svalbard, was reconstructed and mapped. Twenty-eight laser scan surveys were conducted in March 2016, capturing the glacier surface around a moulin entrance and the uppermost 122 m reach of the adjoining conduit. The resulting point clouds provide detailed 3-D visualisation of the channel with a point accuracy of 6.54 mm, despite low (<60%) overall laser returns as a result of the physical and optical properties of the clean ice, snow, hoar frost and sediment surfaces forming the conduit interior. These point clouds are used to map the conduit morphology, enabling extraction of millimetre-to-centimetre scale geometric measurements. The conduit meanders at a depth of 48 m, with a sinuosity of 2.7, exhibiting a teardrop-shaped cross-section morphology. This improvement upon traditional surveying techniques demonstrates the potential of TLS as an investigative tool to elucidate the nature of glacier hydrological networks, through reconstruction of channel geometry and wall composition.
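
    The sinuosity reported above is the ratio of along-channel path length to the straight-line separation of the channel endpoints. A minimal sketch of that calculation, with invented centerline coordinates standing in for points extracted from the TLS surveys:

    ```python
    # Sinuosity = along-channel path length / straight-line distance between
    # endpoints. The centerline coordinates below are invented stand-ins for
    # points extracted from the TLS point clouds.
    import numpy as np

    def sinuosity(centerline):
        """centerline: (n, 3) array of x, y, z points ordered along the channel."""
        steps = np.diff(centerline, axis=0)
        path_length = np.linalg.norm(steps, axis=1).sum()
        straight = np.linalg.norm(centerline[-1] - centerline[0])
        return path_length / straight

    centerline = np.array(
        [[0, 0, 0], [5, 4, -2], [3, 9, -4], [8, 12, -6], [6, 18, -8]], dtype=float
    )
    print(f"sinuosity: {sinuosity(centerline):.2f}")
    ```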

    Correcting index databases improves metagenomic studies

    Assessing the taxonomic composition of metagenomic samples is an important first step in understanding the biology and ecology of microbial communities in complex environments. Despite a wealth of algorithms and tools for metagenomic classification, relatively little effort has been put into the critical task of improving the quality of the reference indices to which metagenomic reads are assigned. Here, we inferred the taxonomic composition of 404 publicly available metagenomes from human, marine and soil environments, using custom index databases modified according to two factors: the number of reference genomes used to build the databases, and the monophyletic strictness of species definitions. Index databases built following the NCBI taxonomic system were also compared to others using Genome Taxonomy Database (GTDB) taxonomic redefinitions. We observed a considerable increase in the rate of read classification using modified reference index databases as compared to a default NCBI RefSeq database, with up to a 4.4-, 6.4- and 2.2-fold increase in classified reads per sample for human, marine and soil metagenomes, respectively. Importantly, targeted correction for 70 common human pathogens and bacterial genera in the index database increased their specific detection levels in human metagenomes. We also show that the choice of index database can influence downstream diversity and distance estimates for microbiome data. Overall, the study shows that a large amount of accessible information in metagenomes remains unexploited using current methods, and that the same data analysed using different index databases could potentially lead to different conclusions. These results have implications for the power and design of individual microbiome studies, and for comparison and meta-analysis of microbiome datasets.
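
    The fold increases quoted above reduce to per-sample classification rates compared across index databases; a small sketch of that comparison, with placeholder read counts:

    ```python
    # Per-sample classification rate under two index databases and the
    # resulting fold increase. Read counts are invented placeholders chosen
    # only to reproduce the quoted 4.4-, 6.4- and 2.2-fold ratios.
    samples = {
        # sample_id: (total_reads, classified_default, classified_modified)
        "human_01": (1_000_000, 150_000, 660_000),
        "marine_01": (1_000_000, 90_000, 576_000),
        "soil_01": (1_000_000, 200_000, 440_000),
    }

    for sample_id, (total, default, modified) in samples.items():
        fold = (modified / total) / (default / total)
        print(f"{sample_id}: {default / total:.1%} -> {modified / total:.1%} "
              f"({fold:.1f}-fold increase)")
    ```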

    Neurophysiology

    Contains research objectives, summary of research, and reports on six research objectives.
    National Institutes of Health (Grant 5 RO1 NB-04985-05)
    National Institutes of Health (Grant NB-07501-02)
    National Institutes of Health (Grant NB-06251-03)
    National Institutes of Health (Grant NB-07576-02)
    U.S. Air Force (Aerospace Medical Division) under Contract AF33(615)-3885
    Bell Telephone Laboratories Incorporated
    National Institutes of Health (Grant 5 TO1 GM-01555-02)

    Digital PCR provides sensitive and absolute calibration for high throughput sequencing

    Background: Next-generation DNA sequencing on the 454, Solexa, and SOLiD platforms requires absolute calibration of the number of molecules to be sequenced. This requirement has two unfavorable consequences. First, large amounts of sample, typically micrograms, are needed for library preparation, thereby limiting the scope of samples which can be sequenced. For many applications, including metagenomics and the sequencing of ancient, forensic, and clinical samples, the quantity of input DNA can be critically limiting. Second, each library requires a titration sequencing run, thereby increasing the cost and lowering the throughput of sequencing.
    Results: We demonstrate the use of digital PCR to accurately quantify 454 and Solexa sequencing libraries, enabling the preparation of sequencing libraries from nanogram quantities of input material while eliminating costly and time-consuming titration runs of the sequencer. We successfully sequenced low-nanogram scale bacterial and mammalian DNA samples on the 454 FLX and Solexa DNA sequencing platforms. This study is the first to definitively demonstrate the successful sequencing of picogram quantities of input DNA on the 454 platform, reducing the sample requirement more than 1000-fold without pre-amplification and the associated bias and reduction in library depth.
    Conclusion: The digital PCR assay allows absolute quantification of sequencing libraries, eliminates uncertainties associated with the construction and application of standard curves to PCR-based quantification, and, with a coefficient of variation close to 10%, is sufficiently precise to enable direct sequencing without titration runs.
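
    Digital PCR derives absolute counts from Poisson statistics on the fraction of positive partitions; a minimal sketch of that standard calculation, with assumed partition counts and volume (the paper's assay details may differ):

    ```python
    # Digital PCR quantification rests on Poisson occupancy: partitions receive
    # 0, 1, 2, ... molecules at random, so the mean occupancy recoverable from
    # the positive fraction p is lambda = -ln(1 - p). Partition counts and
    # volume below are assumed for illustration, not the paper's data.
    import math

    def molecules_per_partition(positive, total):
        p = positive / total
        return -math.log(1.0 - p)

    lam = molecules_per_partition(positive=312, total=765)
    partition_volume_ul = 0.006  # 6 nL per partition, assumed
    concentration = lam / partition_volume_ul  # molecules per microlitre
    print(f"lambda = {lam:.3f} molecules/partition")
    print(f"template concentration ~ {concentration:.0f} molecules/uL")
    ```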

    An accurate in vitro model of the E. coli envelope

    Gram-negative bacteria are an increasingly serious source of antibiotic-resistant infections, partly owing to their characteristic protective envelope. This complex, 20 nm thick barrier includes a highly impermeable, asymmetric bilayer outer membrane (OM), which plays a pivotal role in resisting antibacterial chemotherapy. Nevertheless, the OM molecular structure and its dynamics are poorly understood because the structure is difficult to recreate or study in vitro. The successful formation and characterization of a fully asymmetric model envelope using Langmuir-Blodgett and Langmuir-Schaefer methods is now reported. Neutron reflectivity and isotopic labeling confirmed the expected structure and asymmetry and showed that experiments with antibacterial proteins reproduced published in vivo behavior. By closely recreating natural OM behavior, this model provides a much-needed robust system for antibiotic development.

    Control of Vancomycin-Resistant Enterococcus in Health Care Facilities in a Region

    Background: In late 1996, vancomycin-resistant enterococci were first detected in the Siouxland region of Iowa, Nebraska, and South Dakota. A task force was created, and in 1997 the assistance of the Centers for Disease Control and Prevention was sought in assessing the prevalence of vancomycin-resistant enterococci in the region's facilities and implementing recommendations for screening, infection control, and education at all 32 health care facilities in the region.
    Methods: The infection-control intervention was evaluated in October 1998 and October 1999. We performed point-prevalence surveys, conducted a case–control study of gastrointestinal colonization with vancomycin-resistant enterococci, and compared infection-control practices and screening policies for vancomycin-resistant enterococci at the acute care and long-term care facilities in the Siouxland region.
    Results: Perianal-swab samples were obtained from 1954 of 2196 eligible patients (89 percent) in 1998 and 1820 of 2049 eligible patients (89 percent) in 1999. The overall prevalence of vancomycin-resistant enterococci at 30 facilities that participated in all three years of the study decreased from 2.2 percent in 1997 to 1.4 percent in 1998 and to 0.5 percent in 1999 (P
    Conclusions: An active infection-control intervention, which includes the obtaining of surveillance cultures and the isolation of infected patients, can reduce or eliminate the transmission of vancomycin-resistant enterococci in the health care facilities of a region. (N Engl J Med 2001;344:1427-33.)
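
    A small sketch reproducing the survey arithmetic quoted above; the figures come directly from the abstract:

    ```python
    # Participation and prevalence figures quoted in the abstract.
    surveys = {1998: (1954, 2196), 1999: (1820, 2049)}  # swabbed, eligible
    for year, (swabbed, eligible) in surveys.items():
        print(f"{year} participation: {swabbed / eligible:.0%}")  # ~89% both years

    prevalence = {1997: 0.022, 1998: 0.014, 1999: 0.005}
    for year, rate in prevalence.items():
        print(f"{year} prevalence: {rate:.1%}")
    ```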

    Ecological equivalence: a realistic assumption for niche theory as a testable alternative to neutral theory

    Get PDF
    Hubbell's 2001 neutral theory unifies biodiversity and biogeography by modelling steady-state distributions of species richness and abundances across spatio-temporal scales. Accurate predictions have issued from its core premise that all species have identical vital rates. Yet no ecologist believes that species are identical in reality. Here I explain this paradox in terms of the ecological equivalence that species must achieve at their coexistence equilibrium, defined by zero net fitness for all regardless of intrinsic differences between them. I show that the distinction of realised from intrinsic vital rates is crucial to evaluating community resilience. An analysis of competitive interactions reveals how zero-sum patterns of abundance emerge for species with contrasting life-history traits as for identical species. I develop a stochastic model to simulate community assembly from a random drift of invasions sustaining the dynamics of recruitment following deaths and extinctions. Species are allocated identical intrinsic vital rates for neutral dynamics, or random intrinsic vital rates and competitive abilities for niche dynamics either on a continuous scale or between dominant-fugitive extremes. Resulting communities have steady-state distributions of the same type for more or less extremely differentiated species as for identical species. All produce negatively skewed log-normal distributions of species abundance, zero-sum relationships of total abundance to area, and Arrhenius relationships of species to area. Intrinsically identical species nevertheless support fewer total individuals, because their densities impact as strongly on each other as on themselves. Truly neutral communities have measurably lower abundance/area and higher species/abundance ratios. Neutral scenarios can be parameterized as null hypotheses for testing competitive release, which is a sure signal of niche dynamics. Ignoring the true strength of interactions between and within species risks a substantial misrepresentation of community resilience to habitat loss.
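
    A minimal sketch in the spirit of the stochastic assembly model described above, under assumed parameters:

    ```python
    # Minimal zero-sum neutral drift: each death is replaced either by an
    # immigrant of a new species (probability m) or by the offspring of a
    # random local individual. J, m, and the single-death update are
    # illustrative assumptions; the paper's niche variants additionally draw
    # differentiated vital rates and competitive abilities.
    import random
    from collections import Counter

    def neutral_drift(J=500, m=0.02, steps=200_000, seed=1):
        rng = random.Random(seed)
        community = [0] * J          # start monodominant; labels are species ids
        next_species = 1
        for _ in range(steps):
            dead = rng.randrange(J)  # one death per step keeps total abundance fixed
            if rng.random() < m:     # replacement by a novel immigrant species
                community[dead] = next_species
                next_species += 1
            else:                    # replacement by local recruitment
                community[dead] = community[rng.randrange(J)]
        return Counter(community)

    abundances = sorted(neutral_drift().values(), reverse=True)
    print(f"{len(abundances)} species; top abundances: {abundances[:10]}")
    ```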