
    Temperature and Kinematics of CIV Absorption Systems

    We use Keck HIRES spectra of three intermediate-redshift QSOs to study the physical state and kinematics of the individual components of CIV-selected heavy element absorption systems. Fewer than 8% of all CIV lines with column densities greater than 10^{12.5} cm^{-2} have Doppler parameters b < 6 km/s. A formal decomposition into thermal and non-thermal motion, using the simultaneous presence of SiIV, gives a mean thermal Doppler parameter b_{therm}(CIV) = 7.2 km/s, corresponding to a temperature of 38,000 K, although temperatures in excess of 300,000 K occasionally occur. We also find tentative evidence for a mild increase of temperature with HI column density. Non-thermal motions within components are typically small (< 10 km/s) for most systems, indicative of a quiescent environment. The two-point correlation function (TPCF) of CIV systems on scales up to 500 km/s suggests that there is more than one source of velocity dispersion. The shape of the TPCF can be understood if the CIV systems are caused by ensembles of objects with the kinematics of dwarf galaxies on small scales, while following the Hubble flow on larger scales. Individual high-redshift CIV components may be the building blocks of future normal galaxies in a hierarchical structure formation scenario. Comment: submitted to ApJ Letters, March 16, 1996 (in press); 13 LaTeX pages, 4 PostScript figures, and psfig.sty included.
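    For reference, the quoted temperature follows directly from the thermal Doppler relation b_{therm} = (2 k_B T / m)^{1/2}; a quick check using the abstract's b_{therm}(CIV) = 7.2 km/s and the mass of a carbon atom reproduces the stated value:

        T = m_C b_{therm}^2 / (2 k_B)
          = (12 × 1.66 × 10^{-27} kg) × (7.2 × 10^3 m/s)^2 / (2 × 1.38 × 10^{-23} J/K)
          ≈ 3.7 × 10^4 K ≈ 38,000 K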

    NDM-504: MULTI-PLATFORM TORNADO DAMAGE SCENE PRESERVATION

    A severe tornado system damaged engineered metal buildings at an industrial facility outside Pampa, TX and toppled several nearby center-pivot irrigation structures. Rapid remote-sensing preservation of the overall damage scene was particularly necessary: access to the industrial facility was prohibited, and the overall size of the center-pivot irrigation system precluded rapid direct measurement of member displacements. Engineers and architects from West Texas A&M University, University of Nebraska-Lincoln, and Texas Tech University collaborated to acquire and preserve the damage scene for future study, using a suite of existing and emerging platforms, including 3D point clouds derived from aerial FoDAR, aerial drone imaging, terrestrial laser scanning, and terrestrial digital photogrammetry, as well as two-dimensional, four-band satellite imaging. Data collection using these various platforms offers guidance for the future remote-sensing preservation of damage scenes, the validation of the estimated wind speeds currently employed in the Enhanced Fujita Scale of tornado intensity, and the further development of techniques for automated remote-sensing-based wind damage assessments.

    NDM-505: DEVELOPMENT OF THE ASCE/SEI STANDARD FOR THE ESTIMATION OF TORNADO WIND SPEEDS

    Development of the new ASCE/SEI consensus standard for wind speed estimation in tornadoes began in 2014 and is currently underway. The intent of the new standard is to standardize the methods used to estimate wind speeds in tornadoes, including improvements to and expansions of the damage-based Enhanced Fujita Scale (EF Scale), with the potential to extend the scope of the standard to other windstorms. The standard will include sections on the EF Scale, radar measurements, tree-fall pattern analysis, data archives, forensic engineering analysis, in-situ measurements (anemometry), and remote-sensing applications. Users of the standard will include wind, structural, and forensic engineers, meteorologists, climatologists, forest biologists, risk analysts, hazards modellers, emergency managers, building and infrastructure designers, the insurance industry, and the media. The standard is intended for adoption by the National Weather Service and for use by storm study teams and researchers as a guide for conducting storm surveys and analyzing storm data. Development of the standard highlights the current state of the art in wind speed estimation and identifies areas where new research is needed. Development of the standard will include a public ballot period, and the standard is scheduled to be completed in 2019.

    NNSA ASC Exascale Environment Planning, Applications Working Group, Report February 2011

    The scope of the Apps WG covers three areas of interest: Physics and Engineering Models (PEM), multi-physics Integrated Codes (IC), and Verification and Validation (V&V). Each places different demands on the exascale environment, and the exascale challenge will be to provide environments that optimize all three. PEM serve as a test bed both for model development and for 'best practices' in IC code development, as well as being used as standalone codes to improve scientific understanding. Rapidly achieving reasonable performance with a small team is the key to maintaining PEM innovation. Thus, the environment must provide the ability to develop portable code at a higher level of abstraction, which can then be tuned as needed. PEM concentrate their computational footprint in one or a few kernels that must perform efficiently. Their comparative simplicity permits extreme optimization, so the environment must allow significant control over the lower software and hardware levels. IC serve as the underlying software tools employed for most ASC problems of interest. Often coupling dozens of physics models into very large, very complex applications, ICs are usually the product of hundreds of staff-years of development, with lifetimes measured in decades. Thus, emphasis is placed on portability, maintainability, and overall performance, with optimization done on the whole rather than on individual parts. The exascale environment must provide a high-level standardized programming model with effective tools and mechanisms for fault detection and remediation. Finally, V&V addresses the infrastructure and methods that facilitate the assessment of code and model suitability for applications, together with uncertainty quantification (UQ) methods for the quantification of margins and uncertainties (QMU). V&V employs both PEM and IC, with somewhat differing goals, i.e., parameter studies and error assessments that both determine the quality of a calculation and estimate expected deviations of simulations from experiments. The exascale environment must provide a performance envelope suitable both for capacity calculations (high throughput) and for full-system capability runs (high performance). Analysis of the results places shared demands on both the I/O and visualization subsystems.
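    The "higher level of abstraction, which can then be tuned" pattern the report calls for can be illustrated with a minimal sketch: one portable kernel definition with interchangeable backends, so retuning never touches application code. The kernel, backend names, and use of NumPy below are illustrative assumptions, not part of the report.

        import numpy as np

        def stencil_reference(u, c):
            """Portable reference backend: a simple 3-point stencil, easy to verify."""
            out = u.copy()
            for i in range(1, len(u) - 1):
                out[i] = u[i] + c * (u[i - 1] - 2 * u[i] + u[i + 1])
            return out

        def stencil_tuned(u, c):
            """Tuned backend: identical result, vectorized for the target machine."""
            out = u.copy()
            out[1:-1] = u[1:-1] + c * (u[:-2] - 2 * u[1:-1] + u[2:])
            return out

        # Registry of backends; a new machine gets a new entry, not a code rewrite.
        BACKENDS = {"reference": stencil_reference, "tuned": stencil_tuned}

        def step(u, c, backend="tuned"):
            """High-level entry point used by application code."""
            return BACKENDS[backend](u, c)

        u = np.linspace(0.0, 1.0, 8)
        # The reference backend doubles as a correctness check for any tuned variant.
        assert np.allclose(step(u, 0.1, "reference"), step(u, 0.1, "tuned"))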

    Quantification and analysis of icebergs in a tidewater glacier fjord using an object-based approach

    Tidewater glaciers are glaciers that terminate in, and calve icebergs into, the ocean. In addition to the influence that tidewater glaciers have on physical and chemical oceanography, floating icebergs serve as habitat for marine animals such as harbor seals (Phoca vitulina richardii). The availability and spatial distribution of glacier ice in the fjords is likely a key environmental variable that influences the abundance and distribution of selected marine mammals; however, the amount of ice and the fine-scale characteristics of ice in fjords have not been systematically quantified. Given the predicted changes in glacier habitat, there is a need for methods that could be broadly applied to quantify changes in available ice habitat in tidewater glacier fjords. We present a case study describing a novel method that uses object-based image analysis (OBIA) to classify floating glacier ice in a tidewater glacier fjord from high-resolution aerial digital imagery. Our objectives were to (i) develop workflows and rule sets to classify high-spatial-resolution airborne imagery of floating glacier ice; (ii) quantify the amount and fine-scale characteristics of floating glacier ice; and (iii) develop processes for automating the object-based analysis of floating glacier ice for a large number of images from a representative survey day during June 2007 in Johns Hopkins Inlet (JHI), a tidewater glacier fjord in Glacier Bay National Park, southeastern Alaska. On 18 June 2007, JHI was composed of brash ice (x̄ = 45.2%, SD = 41.5%), water (x̄ = 52.7%, SD = 42.3%), and icebergs (x̄ = 2.1%, SD = 1.4%). Average iceberg size per scene was 5.7 m² (SD = 2.6 m²). We estimate the total area (± uncertainty) of iceberg habitat in the fjord to be 455,400 ± 123,000 m². The method works well for classifying icebergs across scenes (classification accuracy of 75.6%); the largest classification errors occur in areas with densely packed ice, low contrast between neighboring ice cover, or dark or sediment-covered ice, where icebergs may be misclassified as brash ice about 20% of the time. OBIA is a powerful image classification tool, and the method we present could be adapted and applied to other ice habitats, such as sea ice, to assess changes in ice characteristics and availability.
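    A minimal sketch of such an OBIA workflow, assuming scikit-image is available: segment a scene into image objects, then label each object with simple rules on brightness and size. The segmentation parameters, band choice, and thresholds below are hypothetical placeholders, not the rule set used in the study.

        import numpy as np
        from skimage.segmentation import slic
        from skimage.measure import regionprops

        def classify_scene(image, pixel_area_m2):
            """Segment a multi-band scene and apply rule-based class labels."""
            # Step 1: segmentation groups pixels into candidate image objects.
            segments = slic(image, n_segments=500, compactness=10.0, start_label=1)
            classes, iceberg_area = {}, 0.0
            # Step 2: rule-based classification on per-object statistics
            # (brightness from the first band; thresholds are made up here).
            for region in regionprops(segments, intensity_image=image[..., 0]):
                brightness = region.mean_intensity
                area_m2 = region.area * pixel_area_m2
                if brightness < 0.3:        # dark objects -> open water
                    label = "water"
                elif area_m2 < 1.0:         # small bright fragments -> brash ice
                    label = "brash_ice"
                else:                       # large bright objects -> iceberg
                    label = "iceberg"
                    iceberg_area += area_m2
                classes[region.label] = label
            return classes, iceberg_area

    Running classify_scene over every image from a survey day and summing the per-scene iceberg areas would yield a fjord-wide habitat estimate in the spirit of the 455,400 m² figure above, with confusion between brash ice and icebergs as the dominant error source.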

    Seahawk: moving beyond HTML in Web-based bioinformatics analysis

    Background: Traditional HTML interfaces for input to and output from bioinformatics analysis on the Web are highly variable in style, content, and data formats. Combining multiple analyses can therefore be an onerous task for biologists. Semantic Web Services allow automated discovery of conceptual links between remote data analysis servers. A shared data ontology and service discovery/execution framework is particularly attractive in bioinformatics, where data and services are often both disparate and distributed. Instead of biologists copying, pasting, and reformatting data between various Web sites, Semantic Web Service protocols such as MOBY-S hold out the promise of seamlessly integrating multi-step analyses.
    Results: We have developed a program (Seahawk) that allows biologists to intuitively and seamlessly chain together Web Services using a data-centric, rather than the customary service-centric, approach. The approach is illustrated with a ferredoxin mutation analysis. Seahawk concentrates on lowering entry barriers for biologists: no prior knowledge of the data ontology or relevant services is required. In stark contrast to other MOBY-S clients, Seahawk users simply load Web pages and text files they already work with. Underlying the familiar Web-browser interaction is an XML data engine based on extensible XSLT style sheets, regular expressions, and XPath statements, which import existing user data into the MOBY-S format.
    Conclusion: As an easily accessible applet, Seahawk moves beyond standard Web-browser interaction, providing mechanisms for the biologist to concentrate on the analytical task rather than on the technical details of data formats and Web forms. As the MOBY-S protocol nears a 1.0 specification, we expect more biologists to adopt these new semantic-oriented ways of doing Web-based analysis, which empower them to do more complicated, ad hoc analysis workflow creation without the assistance of a programmer.
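    A toy sketch of that data-import step: a regular expression recognizes identifiers in text the user already has, the matches are wrapped in XML, and XPath locates them for downstream services. The element names and accession pattern here are simplified stand-ins, not the actual MOBY-S schema; lxml is assumed to be installed.

        import re
        from lxml import etree

        # Hypothetical pattern for GenBank-style accession numbers in pasted text.
        ACCESSION_RE = re.compile(r"\b[A-Z]{1,2}\d{5,6}\b")

        def import_user_text(text):
            """Wrap recognized identifiers in a simplified, MOBY-S-like XML envelope."""
            root = etree.Element("DataBundle")
            for accession in ACCESSION_RE.findall(text):
                etree.SubElement(root, "Object", namespace="GenBank", id=accession)
            return root

        doc = import_user_text("See records AF123456 and U49845 for details.")
        # Downstream services could then locate their inputs with an XPath query:
        ids = doc.xpath("//Object[@namespace='GenBank']/@id")
        print(ids)  # ['AF123456', 'U49845']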

    1608+656: A Gravitationally Lensed Post-Starburst Radio Galaxy

    The gravitational lens system 1608+656 displays four flat-spectrum, pointlike components that are the images of the unresolved core of a double-lobed radio source. The lensing mass is a galaxy at z = 0.630. New spectra of this system enable us to determine a conclusive redshift of 1.394 for the lensed object. The spectra show prominent high-order Balmer absorption lines and Mg II absorption. These lines, and the absence of [O II] emission, indicate that this is a post-starburst or E + A galaxy. It is unique among lensed objects in not being a quasar, and among E + A galaxies in having the highest known redshift. Even allowing for lens magnification, the lensed object is a very luminous galaxy, with an absolute magnitude M(r) = -22.8 mag. The deconvolved infrared image indicates that the galaxy may be slightly resolved. The radio luminosity density of the lobes is L_{1.4} = 5.78 × 10^{25} W Hz^{-1}, which places the source on the boundary between FR I and FR II radio galaxies. Together with the redshift of the lens and a satisfactory mass model, the determination of the lensed-object redshift makes this system an excellent candidate for measuring H_0.
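    For reference, a rest-frame luminosity density such as L_{1.4} is conventionally derived from an observed flux density S_ν via the standard k-corrected relation (neither S_ν nor the spectral index α is quoted in this abstract):

        L_ν = 4π D_L^2 S_ν (1 + z)^{α - 1},  assuming S_ν ∝ ν^{-α}

    where D_L is the luminosity distance to the lensed object at z = 1.394.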