
    Vitrification and determination of the crystallization time scales of the bulk-metallic-glass-forming liquid Zr58.5Nb2.8Cu15.6Ni12.8Al10.3

    The crystallization kinetics of Zr58.5Nb2.8Cu15.6Ni12.8Al10.3 were studied in an electrostatic levitation (ESL) apparatus. The measured critical cooling rate is 1.75 K/s. Zr58.5Nb2.8Cu15.6Ni12.8Al10.3 is the first beryllium-free bulk-metallic-glass-forming liquid to be vitrified by purely radiative cooling in the ESL. Furthermore, the sluggish crystallization kinetics enable determination of the time-temperature-transformation (TTT) diagram between the liquidus and glass transition temperatures. The shortest time to reach crystallization in an isothermal experiment, i.e., the nose of the TTT diagram, is 32 s. The nose lies at 900 K, about 200 K below the liquidus temperature.
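The reported figures permit a quick consistency sketch. A minimal back-of-envelope check in Python, assuming a liquidus near 1100 K (inferred from "about 200 K below the liquidus"; not stated directly in the abstract):

```python
# Figures from the abstract: TTT nose at 900 K, 32 s; liquidus inferred ~1100 K.
T_LIQUIDUS = 1100.0   # K (inferred, not stated directly)
T_NOSE = 900.0        # K
T_NOSE_TIME = 32.0    # s

def avoids_nose(rate_K_per_s):
    """Does a constant-rate cooling path pass the nose temperature
    before the isothermal nose time is reached?"""
    return (T_LIQUIDUS - T_NOSE) / rate_K_per_s < T_NOSE_TIME

# Straight-line estimate of the critical cooling rate from the nose:
crude_critical_rate = (T_LIQUIDUS - T_NOSE) / T_NOSE_TIME  # 6.25 K/s
```

The straight-line estimate (6.25 K/s) exceeds the measured 1.75 K/s, which is expected: under continuous cooling, transformation begins later than the isothermal TTT curve suggests, so rates below the naive estimate can still vitrify the sample.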

    Program on Earth Observation Data Management Systems (EODMS)

    An assessment was made of the needs of a group of potential users of satellite remotely sensed data (state, regional, and local agencies) involved in natural resources management in five states, and alternative data management systems to satisfy these needs are outlined. Tasks described include: (1) a comprehensive data needs analysis of state and local users; (2) the design of remote sensing-derivable information products that serve priority state and local data needs; (3) a cost and performance analysis of alternative processing centers for producing these products; (4) an assessment of the impacts of policy, regulation, and government structure on implementing large-scale use of remote sensing technology in this community of users; and (5) the elaboration of alternative institutional arrangements for operational Earth Observation Data Management Systems (EODMS). It is concluded that an operational EODMS will be of most use to state, regional, and local agencies if it provides a full range of information services -- from raw data acquisition to interpretation and dissemination of final information products.

    A Monte Carlo Approach to Modeling the Breakup of the Space Launch System EM-1 Core Stage with an Integrated Blast and Fragment Catalogue

    The Liquid Propellant Fragment Overpressure Acceleration Model (L-FOAM) is a tool developed by Bangham Engineering Incorporated (BEi) that produces a representative debris cloud from an exploding liquid-propellant launch vehicle. Here it is applied to the Core Stage (CS) of the National Aeronautics and Space Administration (NASA) Space Launch System (SLS) launch vehicle. Probability density functions (PDFs) based on empirical data from rocket accidents and applicable tests, together with SLS-specific geometry, are combined in a MATLAB script to create a unique fragment catalogue each time L-FOAM is run, tailored for a Monte Carlo approach to risk analysis. By accelerating the debris catalogue with the BEi blast model for liquid hydrogen / liquid oxygen explosions, the result is a fully integrated code that models the destruction of the CS at a given point in its trajectory and generates hundreds of individual fragment catalogues with initial imparted velocities. The BEi blast model provides the blast size (radius) and strength (overpressure) as probabilities based on empirical data and anchored with analytical work. The coupling of the L-FOAM catalogue with the BEi blast model is validated with a simulation of the Project PYRO S-IV destruct test. When running a Monte Carlo simulation, L-FOAM can accelerate all catalogues with the same blast (mean blast, 2σ blast, etc.), or vary the blast size and strength based on their respective probabilities. L-FOAM then propagates these fragments until impact with the earth. Results from L-FOAM include a description of each fragment (dimensions, weight, ballistic coefficient, type, and initial location on the rocket), imparted velocity from the blast, and impact data depending on the user's desired application. L-FOAM applies to both near-field (fragment impact on an escaping crew capsule) and far-field (fragment ground-impact footprint) safety considerations. The user is thus able to use statistics from a Monte Carlo set of L-FOAM catalogues to quantify risk for a multitude of potential CS destruct scenarios. Examples include the effect of warning time on the survivability of an escaping crew capsule, or the maximum fragment velocities generated by the ignition of leaking propellants in internal cavities.
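The overall Monte Carlo pattern the abstract describes can be sketched as follows. This is not BEi's actual L-FOAM code; all distributions, parameters, and the toy impulse model below are invented placeholders for illustration:

```python
import random

def sample_catalogue(n_fragments, rng):
    """Draw one fragment catalogue from placeholder PDFs (not BEi's)."""
    catalogue = []
    for _ in range(n_fragments):
        mass = rng.lognormvariate(1.0, 0.8)    # kg, placeholder distribution
        area = rng.lognormvariate(-1.0, 0.5)   # m^2, placeholder distribution
        beta = mass / (1.2 * area)             # crude ballistic coefficient
        catalogue.append({"mass": mass, "area": area, "beta": beta})
    return catalogue

def impart_blast(catalogue, overpressure_pa, rng):
    """Toy impulse model: velocity scales with overpressure * area / mass."""
    for frag in catalogue:
        impulse = overpressure_pa * frag["area"] * 0.01  # N*s, placeholder
        frag["velocity"] = impulse / frag["mass"]        # m/s
    return catalogue

# Each Monte Carlo run draws a fresh catalogue and a sampled blast strength.
rng = random.Random(42)
runs = [impart_blast(sample_catalogue(100, rng), rng.gauss(3e5, 5e4), rng)
        for _ in range(10)]
```

Statistics over many such runs (impact footprints, velocity extremes) are then what feed the risk questions the abstract mentions.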

    Solid Rocket Launch Vehicle Explosion Environments

    Empirical explosion data from full-scale solid rocket launch vehicle accidents and tests were collected from all available literature from the 1950s to the present. In general, the data included peak blast overpressure, blast impulse, fragment size, fragment speed, and fragment dispersion. Most propellants were hazard division 1.1 explosives, but a few were 1.3. Oftentimes the data from a single accident were disjointed and/or missing key aspects. Nevertheless, once the data as a whole were digitized, categorized, and plotted, clear trends appeared. Particular emphasis was placed on tests or accidents applicable to scenarios from which a crew might need to escape. Therefore, tests in which a large quantity of high explosive was used to initiate the solid rocket explosion were differentiated, and high-speed ground impacts, or tests used to simulate them, were culled. It was found that the explosions from all accidents and applicable tests could be described using only the pressurized gas energy stored in the chamber at the time of failure. Additionally, fragmentation trends were produced. Only one accident mentioned the elusive "small" propellant fragments, but upon further analysis it was found that these were most likely produced as secondary fragments when larger primary fragments impacted the ground. Finally, a brief discussion of how these data are used in a new launch vehicle explosion model for improving crew/payload survival is presented.
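The abstract does not give its exact formulation of "pressurized gas energy stored in the chamber", but a common first-order estimate for a burst pressure vessel is the Brode energy, E = (p − p0)·V / (γ − 1). A minimal sketch, treating that formula as an illustrative stand-in and using made-up example numbers:

```python
def brode_energy(p_chamber_pa, volume_m3, p_ambient_pa=101_325.0, gamma=1.2):
    """Brode stored-gas blast energy in joules.
    gamma ~1.2 is a typical value for hot combustion gas (assumption)."""
    return (p_chamber_pa - p_ambient_pa) * volume_m3 / (gamma - 1.0)

def tnt_equivalent_kg(energy_j):
    """Convert energy to TNT equivalence (1 kg TNT ~ 4.184 MJ)."""
    return energy_j / 4.184e6

# Hypothetical example: a 10 m^3 motor chamber failing at 5 MPa.
E = brode_energy(5e6, 10.0)       # ~2.45e8 J
W = tnt_equivalent_kg(E)          # ~58.5 kg TNT
```

The appeal of such a formulation, consistent with the abstract's finding, is that it needs only chamber pressure and free volume at the time of failure, both of which are usually recoverable from accident records.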

    Program on Earth Observation Data Management Systems (EODMS), appendixes

    The needs of state, regional, and local agencies involved in natural resources management in Illinois, Iowa, Minnesota, Missouri, and Wisconsin are investigated to guide the design of remote sensing-derivable information products. It is concluded that an operational Earth Observation Data Management System (EODMS) will be most beneficial if it provides a full range of services -- from raw data acquisition to interpretation and dissemination of final information products. Included are a cost and performance analysis of alternative processing centers, and an assessment of the impacts of policy, regulation, and government structure on implementing large-scale use of remote sensing technology in this community of users.

    PlaNet - Photo Geolocation with Convolutional Neural Networks

    Is it possible to build a system to determine the location where a photo was taken using just its pixels? In general, the problem seems exceptionally difficult: it is trivial to construct situations where no location can be inferred. Yet images often contain informative cues such as landmarks, weather patterns, vegetation, road markings, and architectural details, which in combination may allow one to determine an approximate location and occasionally an exact location. Websites such as GeoGuessr and View from your Window suggest that humans are relatively good at integrating these cues to geolocate images, especially en masse. In computer vision, the photo geolocation problem is usually approached using image retrieval methods. In contrast, we pose the problem as one of classification by subdividing the surface of the earth into thousands of multi-scale geographic cells, and train a deep network using millions of geotagged images. While previous approaches only recognize landmarks or perform approximate matching using global image descriptors, our model is able to use and integrate multiple visible cues. We show that the resulting model, called PlaNet, outperforms previous approaches and even attains superhuman levels of accuracy in some cases. Moreover, we extend our model to photo albums by combining it with a long short-term memory (LSTM) architecture. By learning to exploit temporal coherence to geolocate uncertain photos, we demonstrate that this model achieves a 50% performance improvement over the single-image model.
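The core reformulation, geolocation as classification over discrete cells, can be illustrated with a much simpler scheme than PlaNet's adaptive multi-scale cells. This sketch uses a uniform lat/lon grid purely to show the mapping between coordinates and class labels:

```python
CELLS_PER_DEGREE = 1  # grid resolution; PlaNet's cells are adaptive, not uniform

def cell_id(lat, lon):
    """Map coordinates to a fixed-grid cell index (class label)."""
    row = int((lat + 90.0) * CELLS_PER_DEGREE)
    col = int((lon + 180.0) * CELLS_PER_DEGREE)
    return row * (360 * CELLS_PER_DEGREE) + col

def cell_centroid(cid):
    """Invert a cell ID to the center of its cell; a classifier's
    predicted location would be e.g. the centroid of the argmax cell."""
    width = 360 * CELLS_PER_DEGREE
    row, col = divmod(cid, width)
    return (row / CELLS_PER_DEGREE - 90.0 + 0.5 / CELLS_PER_DEGREE,
            col / CELLS_PER_DEGREE - 180.0 + 0.5 / CELLS_PER_DEGREE)
```

A network trained on geotagged images then outputs a probability distribution over these IDs, which is what lets the model express multimodal uncertainty (e.g. "some beach, hemisphere unknown") that a single regressed coordinate cannot.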

    Technical Note: Fast two-dimensional GC-MS with thermal extraction for anhydro-sugars in fine aerosols

    A fast two-dimensional gas chromatography-mass spectrometry (GC-GC-MS) method that uses heart-cutting and thermal extraction (TE) and requires no chemical derivatization was developed for the determination of anhydro-sugars in fine aerosols. Evaluation of the TE-GC-GC-MS method shows high average relative accuracy (≥90%), reproducibility (≤10% relative standard deviation), detection limits of less than 3 ng/μL, and negligible carryover for the levoglucosan, mannosan, and galactosan markers. Levoglucosan concentrations measured by TE-GC-GC-MS and by solvent extraction (SE) GC-MS correlate across several diverse types of biomass burning aerosols. Because the SE-GC-MS measurements were taken 8 years prior to the TE-GC-GC-MS ones, the stability of levoglucosan is established for quartz filter-collected biomass burning aerosol samples stored at ultra-low temperature (−50 °C). Levoglucosan concentrations (w/w) in aerosols collected following atmospheric dilution near open fires of varying intensity are similar to those in biomass burning aerosols produced in a laboratory enclosure. An average levoglucosan-mannosan-galactosan ratio of 15:2:1 is observed for these two aerosol sets. TE-GC-GC-MS analysis of atmospheric aerosols from the US and Africa produced levoglucosan concentrations (0.01–1.6 μg/m<sup>3</sup>) well within those reported for aerosols collected globally and examined using different analytical techniques (0.004–7.6 μg/m<sup>3</sup>). Further comparisons among techniques suggest that fast TE-GC-GC-MS is among the most sensitive, accurate, and precise methods for compound-specific quantification of anhydro-sugars. In addition, an approximately twofold increase in anhydro-sugar determination may be realized when combining TE with fast chromatography.
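The 15:2:1 marker ratio is simply the three concentrations normalized to galactosan. A small sketch with hypothetical sample values (the concentrations below are made up for illustration, not from the paper):

```python
def lmg_ratio(levoglucosan, mannosan, galactosan):
    """Normalize anhydro-sugar concentrations to galactosan = 1,
    giving the levoglucosan:mannosan:galactosan marker ratio."""
    return (levoglucosan / galactosan, mannosan / galactosan, 1.0)

# Hypothetical measured concentrations in μg/m^3:
ratio = lmg_ratio(0.75, 0.10, 0.05)  # ≈ (15, 2, 1)
```

Ratios of this kind are useful because they are insensitive to dilution, which is why the abstract can compare open-fire and laboratory-enclosure aerosols directly.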