
    Remote sensing as an aid to route evaluation for relocated Louisiana Highway 1

    NASA aerial photography, in the form of color infrared and color positive transparencies, is used as an aid in evaluating the route proposed for relocated Louisiana Highway 1, between LaRose and Golden Meadow, in South Louisiana.

    On the Nature of the Cosmological Constant Problem

    General relativity postulates Minkowski space-time as the standard flat geometry against which all curved space-times are compared and as the gravitational ground state in which particles, quantum fields and their vacuum states are primarily conceived. On the other hand, experimental evidence shows that there is a non-zero cosmological constant, which implies a de Sitter space-time, not compatible with the assumed Minkowski structure. This inconsistency is shown to be a consequence of the lack of an application-independent curvature standard in Riemann's geometry, which eventually leads to the cosmological constant problem in general relativity. We show how the curvature standard in Riemann's geometry can be fixed by Nash's theorem on locally embedded Riemannian geometries, which implies the existence of extra dimensions. The resulting gravitational theory is more general than general relativity, similar to brane-world gravity, but here the propagation of the gravitational field along the extra dimensions is a mathematical necessity rather than a postulate. After a brief introduction to Nash's theorem, we show that the vacuum energy density must remain confined to four-dimensional space-times, whereas the cosmological constant resulting from the contracted Bianchi identity is a gravitational contribution that propagates in the extra dimensions. Therefore, the comparison between the vacuum energy and the cosmological constant in general relativity ceases to make sense. Instead, the geometrical fix provided by Nash's theorem suggests that the vacuum energy density contributes to the perturbations of the gravitational field. Comment: LaTeX, 5 pages, no figures. Correction to author list
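
    For orientation, the mismatch the abstract refers to is conventionally stated through Einstein's equations with a cosmological term; the sketch below is a standard textbook formulation in our own notation (units c = 1), not material from the paper:

        G_{\mu\nu} + \Lambda\, g_{\mu\nu} = 8\pi G\, T_{\mu\nu}, \qquad
        \nabla^{\mu} G_{\mu\nu} = 0, \qquad \nabla^{\mu} g_{\mu\nu} = 0

        T^{\mathrm{vac}}_{\mu\nu} = -\rho_{\mathrm{vac}}\, g_{\mu\nu}
        \;\;\Longrightarrow\;\;
        \Lambda_{\mathrm{eff}} = \Lambda + 8\pi G\, \rho_{\mathrm{vac}}

    The contracted Bianchi identity allows only the metric-proportional term to carry a constant coefficient, which is why the vacuum energy and the geometric Λ are conventionally lumped into a single Λ_eff; field-theoretic estimates of ρ_vac exceed the observed Λ_eff/(8πG) by many tens of orders of magnitude, and it is this comparison that the embedding argument above claims is ill-posed.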

    Hyper-velocity impact test and simulation of a double-wall shield concept for the Wide Field Monitor aboard LOFT

    The space mission LOFT (Large Observatory For X-ray Timing) was selected in 2011 by ESA as one of the candidates for the M3 launch opportunity. LOFT is equipped with two instruments, the Large Area Detector (LAD) and the Wide Field Monitor (WFM), both based on Silicon Drift Detectors (SDDs). In orbit, they would be exposed to hyper-velocity impacts by environmental dust particles, which might alter the surface properties of the SDDs. In order to assess the risk posed by these events, we performed simulations in ESABASE2 as well as laboratory tests. Tests on SDD prototypes, aimed at verifying to what extent the structural damage produced by impacts affects the SDD functionality, were performed at the Van de Graaff dust accelerator at the Max Planck Institute for Nuclear Physics (MPIK) in Heidelberg. For the WFM, where we expect a rate of risky impacts notably higher than for the LAD, we designed, simulated and successfully tested, at the plasma accelerator of the Technical University in Munich (TUM), a double-wall shielding configuration based on thin foils of Kapton and polypropylene. In this paper we summarize the full assessment, focusing on the experimental test campaign at TUM. Comment: Proc. SPIE 9144, Space Telescopes and Instrumentation 2014: Ultraviolet to Gamma Ray, 91446

    Measurement of charge and light yields for ¹²⁷Xe L-shell electron captures in liquid xenon

    Dark matter searches using dual-phase xenon time-projection chambers (LXe-TPCs) rely on their ability to reject background electron recoils (ERs) while searching for signal-like nuclear recoils (NRs). The ER response is typically calibrated using β-decay sources, such as tritium, but these calibrations do not characterize events accompanied by an atomic vacancy, as in solar-neutrino scatters off inner-shell electrons. Such events lead to the emission of X-rays and Auger electrons, resulting in higher electron-ion recombination and thus a more NR-like response than inferred from β-decay calibration. We present a cross-calibration of tritium β decays and ¹²⁷Xe electron-capture decays (which produce inner-shell vacancies) in a small-scale LXe-TPC and give the most precise measurements to date of the light and charge yields for the ¹²⁷Xe L-shell electron capture in liquid xenon. We observe a 6.9σ (9.2σ) discrepancy in the L-shell capture response relative to tritium β decays, measured at a drift field of 363±14 V/cm (258±13 V/cm), when compared to simulations tuned to reproduce the correct β-decay response. In dark matter searches, use of a background model that neglects this effect leads to overcoverage (higher limits) for background-only multi-kiloton-year exposures, but at a level much smaller than the 1σ experiment-to-experiment variation of the 90% C.L. upper limit on the interaction rate of a 50 GeV/c² dark matter particle.
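
    To make the recombination argument concrete, the sketch below uses the standard photon/electron anti-correlation picture for liquid xenon; the W-value, exciton-to-ion ratio and recombination fractions are illustrative assumptions on our part, not numbers from the paper:

        # Photon/electron anti-correlation in liquid xenon (illustrative sketch).
        # Assumed figures: W ~ 13.7 eV per quantum, exciton-to-ion ratio alpha ~ 0.2;
        # the recombination fraction r is what an inner-shell vacancy would push up.
        W_EV = 13.7
        ALPHA = 0.2

        def yields(energy_kev, r):
            """Return (photons/keV, electrons/keV) for recombination fraction r."""
            n_quanta = energy_kev * 1000.0 / W_EV   # total excitons + ion pairs
            n_ions = n_quanta / (1.0 + ALPHA)       # ion pairs
            n_ex = n_quanta - n_ions                # excitons
            n_e = (1.0 - r) * n_ions                # electrons escaping recombination
            n_gamma = n_ex + r * n_ions             # photons: excitons + recombined ions
            return n_gamma / energy_kev, n_e / energy_kev

        # Higher recombination -> more light, less charge: the shift toward an
        # NR-like response described in the abstract. ~5 keV is roughly the
        # L-shell capture energy scale; both r values are made up for illustration.
        print(yields(5.0, 0.3))
        print(yields(5.0, 0.5))

    A background model that underestimates the recombination of vacancy-accompanied events would therefore misplace such ERs in the light/charge plane, which is the effect the cross-calibration quantifies.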

    Proof of concept: could snake venoms be a potential source of bioactive compounds for control of mould growth and mycotoxin production?

    © 2020 The Authors. This is an open access article under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited. The objective was to screen 10 snake venoms for their efficacy in controlling growth and mycotoxin production by important mycotoxigenic fungi, including Aspergillus flavus, Aspergillus westerdijkiae, Penicillium verrucosum, Fusarium graminearum and F. langsethiae. The Bioscreen C rapid assay system was used. The venoms from the Viperidae snake family delayed growth of some of the test fungi, especially F. graminearum and F. langsethiae and sometimes A. flavus. Some were also able to reduce mycotoxin production. The two most potent crude snake venoms (Naja nigricollis and N. siamensis; 41 and 43 fractions, respectively) were further fractionated, and 83/84 of these fractions were able to reduce mycotoxin production by >90% in two of the mycotoxigenic fungi examined. This study suggests that such snake venoms may hold significant potential for the identification of novel fungistatic/fungicidal bioactive compounds as post-harvest preservatives of raw and processed food commodities.

    Radiometric calibration of the in-flight blackbody calibration system of the GLORIA interferometer

    GLORIA (Gimballed Limb Observer for Radiance Imaging of the Atmosphere) is an airborne, imaging, infrared Fourier transform spectrometer that applies the limb-imaging technique to perform trace gas and temperature measurements in the Earth's atmosphere with three-dimensional resolution. To ensure the traceability of these measurements to the International Temperature Scale, and thereby to an absolute radiance scale, GLORIA carries an on-board calibration system. It consists essentially of two identical large-area, high-emissivity infrared radiators, which can be continuously and independently operated at two adjustable temperatures in a range from −50 °C to 0 °C during flight. Here we describe the radiometric and thermometric characterization and calibration of the in-flight calibration system at the Reduced Background Calibration Facility of the Physikalisch-Technische Bundesanstalt. This was performed with a standard uncertainty of less than 110 mK. Extensive investigations of the system concerning its absolute radiation temperature and spectral radiance, its temperature homogeneity, and its short- and long-term stability are discussed. The traceability chain of these measurements is presented.
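
    As a rough illustration of the quantity being tied to an absolute scale, the sketch below evaluates the Planck spectral radiance of an ideal blackbody over the quoted −50 °C to 0 °C range, and the radiance change corresponding to the 110 mK standard uncertainty; the wavelength choice and the code are our own back-of-the-envelope example, not material from the paper:

        # Planck spectral radiance of an ideal blackbody, W / (m^2 sr m).
        import math

        H = 6.62607015e-34    # Planck constant, J*s
        C = 2.99792458e8      # speed of light, m/s
        K_B = 1.380649e-23    # Boltzmann constant, J/K

        def spectral_radiance(wavelength_m, temp_k):
            x = H * C / (wavelength_m * K_B * temp_k)
            return (2.0 * H * C**2 / wavelength_m**5) / math.expm1(x)

        WL = 10e-6  # 10 um, an arbitrary thermal-infrared wavelength for illustration
        for t_c in (-50.0, 0.0):
            print(t_c, "degC:", spectral_radiance(WL, t_c + 273.15))

        # Radiance change corresponding to a 110 mK temperature shift at 0 degC
        t_k = 273.15
        print(spectral_radiance(WL, t_k + 0.110) - spectral_radiance(WL, t_k))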

    National Center for Biomedical Ontology: Advancing biomedicine through structured organization of scientific knowledge

    The National Center for Biomedical Ontology is a consortium that comprises leading informaticians, biologists, clinicians, and ontologists, funded by the National Institutes of Health (NIH) Roadmap, to develop innovative technology and methods that allow scientists to record, manage, and disseminate biomedical information and knowledge in machine-processable form. The goals of the Center are (1) to help unify the divergent and isolated efforts in ontology development by promoting high-quality, open-source, standards-based tools to create, manage, and use ontologies; (2) to create new software tools so that scientists can use ontologies to annotate and analyze biomedical data; (3) to provide a national resource for the ongoing evaluation, integration, and evolution of biomedical ontologies and associated tools and theories in the context of driving biomedical projects (DBPs); and (4) to disseminate the tools and resources of the Center and to identify, evaluate, and communicate best practices of ontology development to the biomedical community. Through the research activities within the Center, collaborations with the DBPs, and interactions with the biomedical community, our goal is to help scientists work more effectively in the e-science paradigm, enhancing experiment design, experiment execution, data analysis, information synthesis, hypothesis generation and testing, and the understanding of human disease.