The SILCC (SImulating the LifeCycle of molecular Clouds) project: I. Chemical evolution of the supernova-driven ISM
The SILCC project (SImulating the Life-Cycle of molecular Clouds) aims at a
more self-consistent understanding of the interstellar medium (ISM) on small
scales and its link to galaxy evolution. We simulate the evolution of the
multi-phase ISM in a 500 pc x 500 pc x 10 kpc region of a galactic disc, with a
gas surface density of 10 M_sun pc^-2.
The Flash 4.1 simulations include an external potential, self-gravity, magnetic
fields, heating and radiative cooling, time-dependent chemistry of H2 and CO
considering (self-) shielding, and supernova (SN) feedback. We explore SN
explosions at different (fixed) rates in high-density regions (peak), in random
locations (random), in a combination of both (mixed), or clustered in space and
time (clustered). Only random or clustered models with self-gravity (which
evolve similarly) are in agreement with observations. Molecular hydrogen forms
in dense filaments and clumps and contributes 20% - 40% to the total mass,
whereas most of the mass (55% - 75%) is in atomic hydrogen. The ionised gas
contributes <10%. For high SN rates (0.5 dex above Kennicutt-Schmidt) as well
as for peak and mixed driving, the formation of H2 is strongly suppressed.
Without self-gravity the H2 fraction is also significantly lower (~5%).
Most of the volume is filled with hot gas (90% within 2 kpc). Only for random
or clustered driving does a vertically expanding warm component of atomic
hydrogen indicate a fountain flow. Magnetic fields have little impact on the
final disc structure. However, they affect the dense gas and delay H2
formation. We highlight that individual chemical
species, in particular atomic hydrogen, populate different ISM phases and
cannot be accurately accounted for by simple temperature-/density-based phase
cut-offs.
Comment: 30 pages, 23 figures, submitted to MNRAS. Comments welcome! For
movies of the simulations and download of selected Flash data see the SILCC
website: http://www.astro.uni-koeln.de/silc
Dispersion Relations for Thermally Excited Waves in Plasma Crystals
Thermally excited waves in a plasma crystal were numerically simulated using
a Box_Tree code, a Barnes-Hut tree code proven effective in
modeling systems composed of large numbers of particles. Interaction between
individual particles was assumed to conform to a Yukawa potential. Particle
charge, mass, density, Debye length and output data intervals are all
adjustable parameters in the code. Employing a Fourier transform on the output
data, dispersion relations for both longitudinal and transverse wave modes were
determined. These were compared with the dispersion relations obtained from
experiment as well as a theory based on a harmonic approximation to the
potential. They were found to agree over the range 0.9 < κ < 5, where κ is the
shielding parameter, defined as the ratio of the interparticle distance a to
the dust Debye length lD. This is an improvement over experimental data, as
current experiments can only verify the theory up to κ = 1.5.
Comment: 8 pages, Presented at COSPAR '0
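The Fourier-transform step can be sketched as follows. This is a minimal illustration, not the Box_Tree pipeline: it builds a single synthetic travelling wave on a 1D chain (the function name, grid sizes, and wave parameters are all invented for the example) and recovers its (k, omega) location from the 2D power spectrum, the same way peaks in the spectrum of thermal displacements trace out a dispersion relation.

```python
import numpy as np

def dispersion_spectrum(displacements, dt, a):
    """Power spectrum |u(k, omega)|^2 of particle displacements.

    displacements: shape (n_steps, n_particles), e.g. longitudinal offsets
    of a 1D chain from its lattice sites; dt is the output-data interval
    and a the interparticle spacing.
    """
    # 2D FFT: the time axis maps to omega, the particle axis to k
    spec = np.fft.fftshift(np.abs(np.fft.fft2(displacements)) ** 2)
    omega = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(displacements.shape[0], d=dt))
    k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(displacements.shape[1], d=a))
    return k, omega, spec

# Sanity check: a single travelling wave must peak at its own (k0, w0)
n_t, n_p, dt, a = 256, 64, 0.1, 1.0
k0 = 2 * np.pi * 8 / (n_p * a)     # 8th spatial mode
w0 = 2 * np.pi * 16 / (n_t * dt)   # 16th temporal mode
t = np.arange(n_t)[:, None] * dt
x = np.arange(n_p)[None, :] * a
u = np.cos(k0 * x - w0 * t)
k, omega, spec = dispersion_spectrum(u, dt, a)
i, j = np.unravel_index(np.argmax(spec), spec.shape)
print(abs(omega[i]), abs(k[j]))  # peak magnitudes match w0 and k0
```

For thermal (noisy) displacements the same spectrum shows ridges rather than a single peak, and the dispersion relation is read off from the ridge locations.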
Estimation of 3D vegetation structure from waveform and discrete return airborne laser scanning data
This study presents and compares new methods to describe the 3D canopy structure with Airborne Laser Scanning (ALS) waveform data as well as ALS point data. The ALS waveform data were analyzed in three different ways: by summing the intensity of the waveforms in height intervals (a); by first normalizing the waveforms with an algorithm based on the Beer-Lambert law, to compensate for the shielding effect of higher vegetation layers on reflections from lower layers, and then summing the intensity (b); and by deriving points from the waveforms (c). As a comparison, conventional discrete-return ALS point data from the laser scanning system were also analyzed (d). The study area was located in hemi-boreal, spruce-dominated forest in the southwest of Sweden (Lat. 58° N, Long. 13° E). The vegetation volume profile was defined as the volume of all tree crowns and shrubs in 1 dm height intervals in a field plot, and the total vegetation volume as the sum of the vegetation volume profile in the field plot. The total vegetation volume was estimated for 68 field plots with 12 m radius from the proportion between the amount of ALS reflections from the vegetation and the total amount of ALS reflections, based on the Beer-Lambert law. ALS profiles were derived from the distribution of the ALS data above the ground in 1 dm height intervals. The ALS profiles were rescaled using the estimated total vegetation volume to derive the amount of vegetation at different heights above the ground. The root mean square error (RMSE) for cross-validated regression estimates of the total vegetation volume was 31.9% for ALS waveform data (a), 27.6% for normalized waveform data (b), 29.1% for point data derived from the ALS waveforms (c), and 36.5% for ALS point data from the laser scanning system (d).
The correspondence between the estimated vegetation volume profiles was also best for the normalized waveform data and the point data derived from the ALS waveforms, and worst for ALS point data from the laser scanning system, as demonstrated by the Reynolds error index. The results suggest that ALS waveform data describe the volumetric aspects of vertical vegetation structure somewhat more accurately than ALS point data from the laser scanning system, and that compensation for the shielding effect of higher vegetation layers is useful. The new methods for estimation of vegetation volume profiles from ALS data could be used in the future to derive 3D models of the vegetation structure in large areas.
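The normalization idea in method (b) can be sketched in a few lines, under a simple reading of the Beer-Lambert correction: each waveform sample is amplified by the estimated fraction of the beam still unblocked at its height. The function name, the tuning constant k, and the toy waveform are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def normalize_waveform(intensity, k=0.5):
    """Amplify lower returns to compensate for shielding by higher layers.

    intensity: waveform samples ordered canopy top (first) to ground (last).
    k: assumed fraction of the beam consumed per unit of intensity already
       returned from above (an illustrative constant, not the paper's value).
    """
    total = intensity.sum()
    # Fraction of the beam already intercepted above each bin
    blocked = k * np.cumsum(np.concatenate(([0.0], intensity[:-1]))) / total
    transmitted = np.clip(1.0 - blocked, 1e-6, None)  # Beer-Lambert style attenuation
    return intensity / transmitted

wf = np.array([5.0, 3.0, 1.0, 0.5])   # toy waveform, canopy top ... ground
corrected = normalize_waveform(wf)
print(corrected)  # top bin unchanged, lower bins boosted
```

The top bin passes through unchanged, while returns from progressively lower layers are boosted in proportion to how much of the beam the layers above have already consumed.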
Machine Learning technique for isotopic determination of radioisotopes using HPGe γ-ray spectra
Gamma-ray spectroscopy is a quantitative, non-destructive
technique that may be utilized for the identification and quantitative isotopic
estimation of radionuclides. Traditional methods of isotopic determination have
various challenges that contribute to statistical and systematic uncertainties
in the estimated isotopics. Furthermore, these methods typically require
numerous pre-processing steps, and have only been rigorously tested in
laboratory settings with limited shielding. In this work, we examine the
application of a number of machine learning based regression algorithms as
alternatives to conventional approaches for analyzing γ-ray
spectroscopy data in the Emergency Response arena. This approach not only
eliminates many steps in the analysis procedure, thereby offering the potential
to reduce these sources of systematic uncertainty, but is also shown to offer
performance comparable to conventional approaches in the Emergency Response
application.
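One way to picture the regression-on-raw-spectra idea is the toy sketch below: synthetic two-peak "spectra" whose peak ratio encodes an isotopic fraction, fit with closed-form ridge regression directly on channel counts. Everything here (peak positions, count rates, the choice of ridge) is an invented stand-in for the paper's algorithms and data.

```python
import numpy as np

rng = np.random.default_rng(0)
channels = np.arange(128)

def synth_spectrum(frac, n_counts=20000):
    """Toy spectrum: two Gaussian photopeaks whose relative heights
    encode an isotopic fraction `frac` (illustrative, not real data)."""
    shape = frac * np.exp(-((channels - 40.0) ** 2) / 18.0) \
        + (1.0 - frac) * np.exp(-((channels - 90.0) ** 2) / 18.0)
    shape /= shape.sum()
    return rng.poisson(shape * n_counts).astype(float)

# Regress the isotopic fraction directly on raw channel counts,
# with no explicit peak-fitting pre-processing
y = rng.uniform(0.1, 0.9, size=400)
X = np.array([synth_spectrum(f) for f in y])

def ridge_fit(X, y, alpha=1.0):
    # Closed-form ridge regression (no intercept, for brevity)
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

w = ridge_fit(X[:350], y[:350])
rmse = float(np.sqrt(np.mean((X[350:] @ w - y[350:]) ** 2)))
print(f"hold-out RMSE on the isotopic fraction: {rmse:.3f}")
```

Because the expected spectrum is linear in the mixing fraction, even this simple linear model recovers the fraction accurately on held-out spectra, illustrating how regression can replace multi-step peak analysis.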
Rapid gravity filtration operational performance assessment and diagnosis for preventative maintenance from on-line data
Rapid gravity filters, the final particulate barrier in many water treatment systems, are typically monitored using on-line turbidity, flow and head loss instrumentation. Current metrics for assessing filtration performance from on-line turbidity data were critically assessed and found not to effectively and consistently summarise the important properties of a turbidity distribution and the associated water quality risk. In the absence of a consistent risk function for turbidity in treated water, using on-line turbidity as an indicative rather than a quantitative variable appears to be more practical. Best practice suggests that filtered water turbidity should be maintained below 0.1 NTU; at higher turbidity we can be less confident of an effective particle and pathogen barrier. Based on this simple distinction, filtration performance has been described in terms of reliability and resilience by characterising the likelihood, frequency and duration of turbidity spikes greater than 0.1 NTU. This view of filtration performance is then used to frame operational diagnosis of unsatisfactory performance as a machine learning classification problem. Through calculation of operationally relevant predictor variables and application of the Classification and Regression Tree (CART) algorithm, the conditions associated with the greatest risk of poor filtration performance can be effectively modelled and communicated in operational terms. This provides evidence-based decision support which can be used to efficiently manage individual pathogen barriers in a multi-barrier system.
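The CART step can be sketched with a single-level split search: scan candidate thresholds on each operational predictor and keep the one that most reduces Gini impurity for the "spike > 0.1 NTU" label. The predictor names and the toy data-generating rule below are hypothetical, not the study's actual variables.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
# Hypothetical operational predictors (illustrative, not the study's set)
flow_step = rng.uniform(0.0, 0.3, n)              # relative flow increase
hours_since_backwash = rng.uniform(0.0, 72.0, n)  # filter run time
# Toy ground truth: spikes >0.1 NTU likelier after flow surges on mature filters
logit = 10.0 * flow_step + 0.05 * hours_since_backwash - 4.0
spike = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

def gini(labels):
    p = labels.mean()
    return 2.0 * p * (1.0 - p)

def best_split(x, y):
    """CART-style split search on one predictor: pick the threshold
    minimising the weighted Gini impurity of the two child nodes."""
    best_score, best_t = np.inf, None
    for t in np.unique(x)[:-1]:
        left, right = y[x <= t], y[x > t]
        score = (left.size * gini(left) + right.size * gini(right)) / y.size
        if score < best_score:
            best_score, best_t = score, t
    return best_score, best_t

splits = {name: best_split(x, spike)
          for name, x in [("flow_step", flow_step),
                          ("hours_since_backwash", hours_since_backwash)]}
for name, (score, t) in splits.items():
    print(f"{name}: split at {t:.2f}, weighted Gini {score:.3f}")
```

A full CART model recurses this search on each child node; the resulting thresholds ("flow step above X on a filter more than Y hours into its run") are exactly the kind of operational rule the abstract describes communicating to operators.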
Assessment of the probability of contaminating Mars
A new methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, on the lethality of release and transport mechanisms, and of other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
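The Sagan-Coleman structure described above can be written down in a few lines: sum, over the release mechanisms, the expected number of viable microbes released times the corresponding probability of growth. The mechanism names and all numbers below are illustrative placeholders, not mission values.

```python
# Sagan-Coleman style estimate: P(contamination) approximated by the
# expected viable microbial release times the probability of growth,
# summed here over three release mechanisms. Placeholder values only.
release_mechanisms = {
    "nominal_landing_erosion": (200.0, 1e-7),  # (expected viable release, P(growth))
    "hard_impact_breakup":     (5000.0, 1e-7),
    "aeolian_transport":       (50.0, 1e-8),
}
p_contamination = sum(n * pg for n, pg in release_mechanisms.values())
print(f"P(contamination) ~ {p_contamination:.3e}")  # 2e-5 + 5e-4 + 5e-7 = 5.205e-4
```

For probabilities this small, the simple sum is numerically indistinguishable from the exact 1 - prod(1 - N_i * P_g,i).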
Discovery of high-entropy ceramics via machine learning
Although high-entropy materials are attracting considerable interest due to a combination of useful properties and promising applications, predicting their formation remains a hindrance for rational discovery of new systems. Experimental approaches are based on physical intuition and/or expensive trial-and-error strategies. Most computational methods rely on the availability of sufficient experimental data and computational power. Machine learning (ML) applied to materials science can accelerate development and reduce costs. In this study, we propose an ML method, leveraging thermodynamic and compositional attributes of a given material, for predicting the synthesizability (i.e., entropy-forming ability) of disordered metal carbides. The relative importance of the thermodynamic and compositional features for the predictions is then explored. The approach's suitability is demonstrated by comparing values calculated with density functional theory to ML predictions. Finally, the model is employed to predict the entropy-forming ability of 70 new compositions; several predictions are validated by additional density functional theory calculations and experimental synthesis, corroborating the effectiveness of exploring vast compositional spaces in a high-throughput manner. Importantly, seven compositions are selected specifically because they contain all three of the Group VI elements (Cr, Mo, and W), which do not form room-temperature-stable rock-salt monocarbides. Incorporating the Group VI elements into the rock-salt structure provides further opportunity for tuning the electronic structure and potentially the material performance.
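One of the simplest thermodynamic attributes such a model can draw on is the ideal configurational mixing entropy on the metal sublattice, S = -R * sum(x_i * ln x_i); the sketch below computes it for an equimolar five-metal carbide. The specific composition shown is only an example, and treating this descriptor as a model feature is an assumption for illustration, not the paper's exact feature set.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def ideal_mixing_entropy(fractions):
    """Ideal configurational entropy, -R * sum(x_i * ln x_i), for site
    fractions x_i on the metal sublattice of a rock-salt carbide."""
    assert abs(sum(fractions) - 1.0) < 1e-9
    return -R * sum(x * math.log(x) for x in fractions if x > 0)

# Equimolar five-metal carbide, e.g. the metal sites of (MoNbTaVW)C
s5 = ideal_mixing_entropy([0.2] * 5)
print(f"{s5:.2f} J/(mol K)")  # -R*ln(0.2) = R*ln(5), about 13.38
```

Descriptors like this, combined with compositional attributes, are the kind of inexpensive inputs that let an ML model rank candidate compositions before committing to density functional theory calculations or synthesis.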