
    A future perspective on technological obsolescence at NASA, Langley Research Center

    The present research effort was the first phase of a study to forecast whether technological obsolescence will be a problem for the engineers, scientists, and technicians at NASA Langley Research Center (LaRC). The research had four goals: to review the literature on technological obsolescence; to determine through interviews of division chiefs and branch heads Langley's perspective on future technological obsolescence; to begin making contacts with outside industries to find out how they view the possibility of technological obsolescence; and to make preliminary recommendations for dealing with the problem. A complete description of the findings of this research can be reviewed in a technical report in preparation. The following are a small subset of the key findings of the study: NASA's centers and divisions vary in their missions and, because of this, in their capability to control obsolescence; research-oriented organizations within NASA are believed by respondents to keep up to date more than the project-oriented organizations; asked what the signs of a professional's technological obsolescence are, respondents gave a variety of responses; top performing scientists were viewed as continuous learners, keeping up to date by a variety of means; when asked what incentives were available to aerospace technologists for keeping up to date, respondents specified a number of ideas; respondents identified many obstacles to professionals' keeping up to date in the future; and most respondents expressed some concern for the future of the professionals at NASA vis-à-vis the issue of professional obsolescence

    Accreditation of practice educators: An expectation too far?

    The successful completion of practice placements is essential to the education of occupational therapists; however, ensuring quality placements is challenging for occupational therapy educators. In 2000, Brunel University introduced a revised system of accreditation of practice educators which involved attendance at a course, the supervision of a student and the submission of an essay to be assessed. An audit revealed that a total of 314 therapists attended 15 courses between 2000 and 2003; of these, 243 (77%) subsequently supervised students and 32 (10%) became accredited. The requirement to accredit practice educators, which is a commendable attempt to ensure quality, may paradoxically have been detrimental in achieving quality. The College of Occupational Therapists’ apparent change of emphasis on this topic is welcome

    Teaching Physics Using Virtual Reality

    We present an investigation of game-like simulations for physics teaching. We report on the effectiveness of the interactive simulation "Real Time Relativity" for learning special relativity. We argue that the simulation not only enhances traditional learning, but also enables new types of learning that challenge the traditional curriculum. The lessons drawn from this work are being applied to the development of a simulation for enhancing the learning of quantum mechanics

    Titania/alumina bilayer gate insulators for InGaAs metal-oxide-semiconductor devices

    We describe the electrical properties of atomic layer deposited TiO2/Al2O3 bilayer gate oxides which simultaneously achieve high gate capacitance density and low gate leakage current density. Crystallization of the initially amorphous TiO2 film contributes to a significant accumulation capacitance increase (∼33%) observed after a forming gas anneal at 400 °C. The bilayer dielectrics reduce gate leakage current density by approximately one order of magnitude at flatband compared to an Al2O3 single layer of comparable capacitance equivalent thickness. The conduction band offset of TiO2 relative to InGaAs is 0.6 eV, contributing to the ability of the stacked dielectric to suppress gate leakage conduction
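
    As a rough illustration of the trade-off described above, the sketch below computes the capacitance equivalent thickness (CET) of a two-layer dielectric from the series combination of the layer capacitances. The thicknesses and relative permittivities are assumed, round-number values for illustration only; they are not taken from the paper.

        # Sketch: capacitance equivalent thickness (CET) of stacked gate dielectrics.
        # Layer thicknesses and permittivities below are illustrative assumptions,
        # not values reported in the paper.

        EPS_SIO2 = 3.9  # relative permittivity of SiO2, the CET reference

        def cet_nm(layers):
            """CET of a dielectric stack: capacitances add in series, so the
            SiO2-equivalent thicknesses of the individual layers simply add."""
            return sum(EPS_SIO2 * t / k for t, k in layers)

        # (thickness_nm, relative_permittivity): thin Al2O3 interlayer + thicker TiO2
        bilayer = [(1.0, 9.0), (4.0, 40.0)]   # assumed values
        al2o3_only = [(1.9, 9.0)]             # Al2O3-only stack of comparable CET

        print(f"bilayer CET    ~ {cet_nm(bilayer):.2f} nm")
        print(f"Al2O3-only CET ~ {cet_nm(al2o3_only):.2f} nm")

    At matched CET, most of the bilayer's physical thickness sits in the high-permittivity TiO2, which is the qualitative reason such a stack can hold capacitance while cutting tunnelling leakage.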

    Intersection tests for single marker QTL analysis can be more powerful than two marker QTL analysis

    BACKGROUND: It has been reported in the quantitative trait locus (QTL) literature that, when testing for QTL location and effect, methodologies based on two markers and their estimated genetic map have higher statistical power than the genetic-map-independent methodologies known as single marker analyses. Close examination of these reports reveals that the two marker approaches are more powerful than single marker analyses only in certain cases. Simulation studies are a commonly used tool to determine the behavior of test statistics under known conditions. We conducted a simulation study to assess the general behavior of an intersection test and a two marker test under a variety of conditions. The study was designed to reveal whether two marker tests are always more powerful than intersection tests, or whether there are cases when an intersection test may outperform the two marker approach. We also present a reanalysis of a data set from a QTL study of ovariole number in Drosophila melanogaster. RESULTS: Our simulation study results show that there are situations where the single marker intersection test equals or outperforms the two marker test. The intersection test and the two marker test identify overlapping regions in the reanalysis of the Drosophila melanogaster data. The region identified is consistent with a regression based interval mapping analysis. CONCLUSION: We find that the intersection test is appropriate for the analysis of QTL data. This approach has the advantage of simplicity and, in certain situations, supplies results equivalent to or more powerful than a comparable two marker test
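
    The decision rule of the single marker intersection test can be sketched in a few lines: a QTL is declared in an interval only when both flanking single-marker tests are individually significant. The simulation below is a minimal, assumed backcross-style setup (0/1 genotypes, a nominal recombination fraction, a normal phenotype) and is not the simulation design used in the study.

        # Sketch: single-marker intersection test for a QTL between two flanking markers.
        # The backcross-style coding, recombination fraction and effect size are
        # illustrative assumptions, not the study's simulation design.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        def simulate(n=200, r=0.1, effect=0.5):
            m1 = rng.integers(0, 2, n)                          # genotype at marker 1
            qtl = np.where(rng.random(n) < r / 2, 1 - m1, m1)   # QTL loosely linked to marker 1
            m2 = np.where(rng.random(n) < r / 2, 1 - qtl, qtl)  # marker 2 linked to the QTL
            y = effect * qtl + rng.normal(size=n)               # phenotype
            return m1, m2, y

        def single_marker_p(marker, y):
            return stats.ttest_ind(y[marker == 1], y[marker == 0]).pvalue

        m1, m2, y = simulate()
        p1, p2 = single_marker_p(m1, y), single_marker_p(m2, y)

        # Intersection test: declare a QTL in the interval only if BOTH flanking
        # single-marker tests are significant at the chosen level.
        alpha = 0.05
        print("intersection test rejects:", p1 < alpha and p2 < alpha)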

    Projections for future radiocarbon content in dissolved inorganic carbon in hardwater lakes: a retrospective approach

    Inland water bodies contain significant amounts of carbon in the form of dissolved inorganic carbon (DIC) derived from a mixture of modern atmospheric and pre-aged sources, which needs to be considered in radiocarbon-based dating and natural isotope tracer studies. While reservoir effects in hardwater lakes are generally considered to be constant through time, a comparison of recent and historical DI14C data from 2013 and 1969 for Lake Constance reveals that this is not a valid assumption. We hypothesize that changes in atmospheric carbon contributions to lake water DIC have taken place due to anthropogenically forced eutrophication in the 20th century. A return to more oligotrophic conditions in the lake led to reoxygenation and enhanced terrigenous organic matter remineralization, contributing to lake water DIC. Such comparisons using DI14C measurements from different points in time enable nonlinear changes in lake water DIC source and signature to be disentangled from concurrent anthropogenically induced changes in atmospheric 14C. In the future, coeval changes in lake dynamics due to climate change are expected to further perturb these balances. Depending on the scenario, Lake Constance DI14C is projected to decrease from the 2013 measured value of 0.856 Fm to 0.54–0.62 Fm by the end of the century
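
    The direction of the projected change can be illustrated with a crude two end-member mixing model in which lake DIC radiocarbon content (fraction modern, Fm) blends a modern atmospheric source with an essentially 14C-free, pre-aged source. The end-member values and the assumed decline of atmospheric Fm by 2100 are illustrative assumptions, not the study's model.

        # Sketch: two end-member mixing for lake DIC radiocarbon content (fraction modern, Fm).
        # End-member values and the 2100 atmospheric scenario are illustrative assumptions.

        def dic_fm(f_atm, fm_atm, fm_aged=0.0):
            """Fm of DIC as a mix of atmospheric CO2 (fm_atm) and 14C-free,
            pre-aged hardwater carbon (fm_aged ~ 0)."""
            return f_atm * fm_atm + (1.0 - f_atm) * fm_aged

        # With atmospheric Fm near 1.0 today, the measured 0.856 Fm implies roughly
        # an 86% atmospheric contribution under this crude model:
        print(dic_fm(f_atm=0.856, fm_atm=1.0))   # ~0.86

        # If fossil-fuel emissions dilute atmospheric Fm toward ~0.7 by 2100 (an assumed
        # scenario) and the atmospheric fraction stays fixed, DIC Fm falls accordingly:
        print(dic_fm(f_atm=0.856, fm_atm=0.7))   # ~0.60, within the 0.54-0.62 range quoted above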

    Potential-vorticity inversion and the wave-turbulence jigsaw: some recent clarifications

    Two key ideas stand out as crucial to understanding atmosphere-ocean dynamics, and the dynamics of other planets including the gas giants. The first key idea is the invertibility principle for potential vorticity (PV). Without it, one can hardly give a coherent account of even so important and elementary a process as Rossby-wave propagation, going beyond the simplest textbook cases. Still less can one fully understand nonlinear processes like the self-sharpening or narrowing of jets – the once-mysterious "negative viscosity" phenomenon. The second key idea, also crucial to understanding jets, might be summarized in the phrase "there is no such thing as turbulence without waves", meaning Rossby waves especially. Without this idea, one cannot begin to make sense of, for instance, momentum budgets and eddy momentum transports in complex large-scale flows. Like the invertibility principle, the idea has long been recognized, or at least adumbrated. However, it is worth articulating explicitly if only because it can be forgotten when, in the usual way, we speak of "turbulence" and "turbulence theory" as if they were autonomous concepts. In many cases of interest, such as the well-studied terrestrial stratosphere, reality is more accurately described as a highly inhomogeneous "wave-turbulence jigsaw puzzle" in which wavelike and turbulent regions fit together and crucially affect each other's evolution. This modifies, for instance, formulae for the Rhines scale interpreted as indicating the comparable importance of wavelike and turbulent dynamics. Also, weakly inhomogeneous turbulence theory is altogether inapplicable. For instance, there is no scale separation. Eddy scales are not much smaller than the sizes of the individual turbulent regions in the jigsaw. Here I review some recent progress in clarifying these ideas and their implications
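
    For readers unfamiliar with the Rhines scale referred to above, a common back-of-envelope form is L_R ~ sqrt(U/beta), the scale at which Rossby-wave and eddy-turnover timescales become comparable; conventions differ by factors such as 2 or pi. The sketch below uses assumed, order-of-magnitude terrestrial mid-latitude values for U and latitude rather than numbers from the paper.

        # Sketch: Rhines scale L_R ~ sqrt(U / beta), one common convention.
        # U and the latitude are assumed, order-of-magnitude values.
        import math

        U = 10.0           # typical eddy velocity, m/s (assumed)
        OMEGA = 7.292e-5   # Earth's rotation rate, rad/s
        A = 6.371e6        # Earth's radius, m
        LAT = math.radians(45.0)

        beta = 2.0 * OMEGA * math.cos(LAT) / A   # df/dy at 45 deg latitude, ~1.6e-11 m^-1 s^-1
        L_R = math.sqrt(U / beta)                # Rhines scale, metres

        print(f"beta ~ {beta:.2e} m^-1 s^-1, L_R ~ {L_R / 1e3:.0f} km")

    The abstract's point is that in a wave-turbulence jigsaw such a single scale estimate needs reinterpretation, since wavelike and turbulent regions coexist and affect one another rather than being cleanly separated by scale.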

    Fluorine gas as a cleaning agent for Apollo bulk-sample containers

    A technique has been developed for cleaning Apollo bulk sample containers using fluorine gas as the cleaning agent

    Multi-environment QTL mixed models for drought stress adaptation in wheat

    Many quantitative trait loci (QTL) detection methods ignore QTL-by-environment interaction (QEI) and are limited in accommodation of error and environment-specific variance. This paper outlines a mixed model approach using a recombinant inbred spring wheat population grown in six drought stress trials. Genotype estimates for yield, anthesis date and height were calculated using the best design and spatial effects model for each trial. Parsimonious factor analytic models best captured the variance-covariance structure, including genetic correlations, among environments. The 1RS.1BL rye chromosome translocation (from one parent), which decreased progeny yield by 13.8 g m^-2, was explicitly included in the QTL model. Simple interval mapping (SIM) was used in a genome-wide scan for significant QTL, where QTL effects were fitted as fixed environment-specific effects. All significant environment-specific QTL were subsequently included in a multi-QTL model and evaluated for main and QEI effects, with non-significant QEI effects being dropped. QTL effects (either consistent or environment-specific) included eight yield, four anthesis, and six height QTL. One yield QTL co-located (or was linked) to an anthesis QTL, while another co-located with a height QTL. In the final multi-QTL model, only one QTL for yield (6 g m^-2) was consistent across environments (no QEI), while the remaining QTL had significant QEI effects (average size per environment of 5.1 g m^-2). Compared to single trial analyses, the described framework allowed explicit modelling and detection of QEI effects and incorporation of additional classification information about genotypes
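
    The QEI testing step can be illustrated with a deliberately simplified fixed-effects version of the scan: fit a marker with a single effect common to all environments, refit it with environment-specific effects, and test the improvement. The simulated data, 0/1 marker coding and plain least squares below are illustrative assumptions standing in for the factor analytic mixed models used in the paper.

        # Sketch: testing QTL-by-environment interaction (QEI) for one marker.
        # Simulated data and ordinary least squares stand in for the paper's
        # factor analytic mixed models; all numbers are assumed.
        import numpy as np

        rng = np.random.default_rng(0)
        n_geno, n_env = 150, 6
        marker = rng.integers(0, 2, n_geno)                 # 0/1 marker score per line
        env_mean = rng.normal(0, 3, n_env)                  # environment means
        qtl_by_env = np.array([6, 2, 0, 8, 1, 4], float)    # environment-specific QTL effects
        y = (env_mean[None, :] + marker[:, None] * qtl_by_env[None, :]
             + rng.normal(0, 2, (n_geno, n_env)))           # yield-like phenotype, lines x trials

        yv = y.ravel()                                      # genotype-major flattening
        env_idx = np.tile(np.arange(n_env), n_geno)
        mk = np.repeat(marker, n_env)

        E = np.eye(n_env)[env_idx]                          # environment dummy columns
        X_main = np.hstack([E, mk[:, None]])                # one QTL effect across environments
        X_qei = np.hstack([E, E * mk[:, None]])             # environment-specific QTL effects

        def rss(X):
            beta, *_ = np.linalg.lstsq(X, yv, rcond=None)
            return np.sum((yv - X @ beta) ** 2)

        df_num = n_env - 1
        df_den = yv.size - X_qei.shape[1]
        F = ((rss(X_main) - rss(X_qei)) / df_num) / (rss(X_qei) / df_den)
        print(f"F statistic for QEI: {F:.1f} on ({df_num}, {df_den}) df")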