The supernova-regulated ISM. I. The multi-phase structure
We simulate the multi-phase interstellar medium randomly heated and stirred
by supernovae, with gravity, differential rotation and other parameters of the
solar neighbourhood. Here we describe in detail both numerical and physical
aspects of the model, including injection of thermal and kinetic energy by SN
explosions, radiative cooling, photoelectric heating and various transport
processes. With a 3D domain extending 1 kpc^2 horizontally and 2 kpc vertically,
the model routinely spans gas number densities of 10^-5 - 10^2 cm^-3, temperatures
of 10 - 10^8 K, and local velocities up to 10^3 km s^-1 (with Mach numbers up to 25).
The thermal structure of the modelled ISM is classified by inspection of the
joint probability density of the gas number density and temperature. We confirm
that most of the complexity can be captured in terms of just three phases,
separated by temperature borderlines at about 10^3 K and 5x10^5 K. The
probability distribution of gas density within each phase is approximately
lognormal. We clarify the connection between the fractional volume of a phase
and its various proxies, and derive an exact relation between the fractional
volume and the filling factors defined in terms of the volume and probabilistic
averages. These results are discussed in both observational and computational
contexts. The correlation scale of the random flows is calculated from the
velocity autocorrelation function; it is of order 100 pc and tends to grow with
distance from the mid-plane. We use two distinct parameterizations of radiative
cooling to show that the multi-phase structure of the gas is robust, as it does
not depend significantly on this choice. Comment: 28 pages, 22 figures and 8 tables.
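A minimal sketch, in Python, of the phase bookkeeping described above (my own illustration, not the authors' code): cells from a snapshot are binned into cold, warm and hot phases at the quoted temperature borderlines of about 10^3 K and 5x10^5 K, and each phase's fractional volume and the lognormal parameters of its density distribution are estimated. The arrays T and n, and the synthetic data used to exercise the function, are assumptions standing in for simulation output.

    # Sketch only: classify equal-volume cells by temperature and summarise
    # each phase's fractional volume and ln(n) statistics.
    import numpy as np

    def phase_summary(T, n):
        """T in K, n in cm^-3, one value per equal-volume grid cell."""
        phases = {
            "cold": T < 1e3,
            "warm": (T >= 1e3) & (T < 5e5),
            "hot":  T >= 5e5,
        }
        summary = {}
        for name, mask in phases.items():
            f_V = mask.mean()              # fractional volume (equal cells)
            ln_n = np.log(n[mask])         # lognormal fit = moments of ln(n)
            summary[name] = (f_V, ln_n.mean(), ln_n.std())
        return summary

    # Synthetic stand-in data spanning the quoted ranges:
    rng = np.random.default_rng(0)
    T = 10.0 ** rng.uniform(1.0, 8.0, 100_000)
    n = 10.0 ** rng.uniform(-5.0, 2.0, 100_000)
    for name, (f_V, mu, sigma) in phase_summary(T, n).items():
        print(f"{name}: f_V = {f_V:.3f}, <ln n> = {mu:.2f}, std(ln n) = {sigma:.2f}")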
The supernova-regulated ISM. II. The mean magnetic field
The origin and structure of the magnetic fields in the interstellar medium of
spiral galaxies are investigated with 3D, non-ideal, compressible MHD
simulations, including stratification in the galactic gravity field,
differential rotation and radiative cooling. A rectangular domain, 1x1x2
kpc^{3} in size, spans both sides of the galactic mid-plane. Supernova
explosions drive transonic turbulence. A seed magnetic field grows
exponentially to reach a statistically steady state within 1.6 Gyr. Following
Germano (1992), we use volume averaging with a Gaussian kernel to separate the
magnetic field into a mean field and fluctuations. Such averaging does not
satisfy all Reynolds rules, yet allows a formulation of mean-field theory. The
mean field thus obtained varies in both space and time. Growth rates differ for
the mean and fluctuating fields, and there is a clear scale separation between
the two, with integral scales of about 0.7 kpc and 0.3 kpc, respectively.
Comment: 5 pages, 10 figures, submitted to Monthly Notices Letters.
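The Gaussian-kernel averaging described above can be sketched in Python as follows; this is an illustration under my own assumptions (random stand-in data, an arbitrary smoothing scale in grid cells, fully periodic boundaries), not the papers' code.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def mean_and_fluctuation(B, ell_cells):
        """Split a field B of shape (3, nx, ny, nz) into a Gaussian-smoothed
        mean field and the residual fluctuation (Germano 1992 style)."""
        B_mean = np.stack([gaussian_filter(Bi, sigma=ell_cells, mode="wrap")
                           for Bi in B])
        return B_mean, B - B_mean

    B = np.random.standard_normal((3, 64, 64, 128))   # stand-in for simulation data
    B_mean, b = mean_and_fluctuation(B, ell_cells=8)

    # One Reynolds rule fails for this averaging: the smoothed fluctuation
    # is generally non-zero, which is why Germano's formulation is needed.
    b_smoothed = np.stack([gaussian_filter(bi, sigma=8, mode="wrap") for bi in b])
    print(np.abs(b_smoothed).max())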
Separating the scales in a compressible interstellar medium
We apply Gaussian smoothing to obtain mean density, velocity, magnetic and
energy density fields in simulations of the interstellar medium based on
three-dimensional magnetohydrodynamic equations in a shearing box. Unlike alternative averaging procedures,
such as horizontal averaging, Gaussian smoothing retains the three-dimensional
structure of the mean fields. Although Gaussian smoothing does not obey the
Reynolds rules of averaging, physically meaningful central statistical moments
are defined as suggested by Germano (1992). We discuss methods to identify an
optimal smoothing scale and the effects of this choice on the results.
From spectral analysis of the magnetic, density and velocity fields, we find a
suitable smoothing length for all three fields. We discuss the properties of third-order statistical moments in
fluctuations of kinetic energy density in compressible flows and suggest their
physical interpretation. The mean magnetic field, amplified by a mean-field
dynamo, significantly alters the distribution of kinetic energy in space and
between scales, reducing the magnitude of kinetic energy at intermediate
scales. This intermediate-scale kinetic energy is a useful diagnostic of the
importance of SN-driven outflows.
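For reference, the central statistical moments mentioned above can be written in the form proposed by Germano (1992); the notation below (a kernel G_ell and an overbar for the smoothed field) is mine, not quoted from the paper:

    \overline{f}(\mathbf{x}) = \int G_\ell(\mathbf{x}-\mathbf{x}')\, f(\mathbf{x}')\, \mathrm{d}^3x' ,
    \tau(f,g)   = \overline{fg} - \overline{f}\,\overline{g} ,
    \tau(f,g,h) = \overline{fgh} - \overline{f}\,\tau(g,h) - \overline{g}\,\tau(f,h)
                  - \overline{h}\,\tau(f,g) - \overline{f}\,\overline{g}\,\overline{h} .

These moments remain well defined even though the overbar operator does not obey all Reynolds rules, which is the point of Germano's construction.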
Scalable Parallel Numerical Constraint Solver Using Global Load Balancing
We present a scalable parallel solver for numerical constraint satisfaction
problems (NCSPs). Our parallelization scheme consists of homogeneous worker
solvers, each of which runs on an available core and communicates with others
via the global load balancing (GLB) method. The parallel solver is implemented
in X10, which provides an implementation of GLB as a library. In experiments,
several NCSPs from the literature were solved, attaining up to a 516-fold
speedup on 600 cores of the TSUBAME2.5 supercomputer. Comment: To be presented at the X10'15 Workshop.
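A rough Python sketch of the work a single worker solver performs (an illustration only; the paper's solver is written in X10 and uses its GLB library, which is not reproduced here): an interval branch-and-prune loop over a pool of boxes, where load balancing amounts to idle workers taking boxes from other workers' pools. The example constraint x^2 + y^2 = 1 and the tolerance are arbitrary choices of mine.

    from collections import deque

    def satisfies(box):
        """Interval test: can x^2 + y^2 = 1 hold somewhere inside the box?"""
        (xl, xu), (yl, yu) = box
        lo = (0.0 if xl <= 0.0 <= xu else min(xl*xl, xu*xu)) + \
             (0.0 if yl <= 0.0 <= yu else min(yl*yl, yu*yu))
        hi = max(xl*xl, xu*xu) + max(yl*yl, yu*yu)
        return lo <= 1.0 <= hi

    def branch_and_prune(initial_box, tol=1e-3):
        pool = deque([initial_box])      # a worker's local pool of boxes;
        solutions = []                   # with GLB, idle workers would steal from it
        while pool:
            box = pool.pop()
            if not satisfies(box):
                continue                 # prune: the box cannot contain a solution
            (xl, xu), (yl, yu) = box
            if max(xu - xl, yu - yl) < tol:
                solutions.append(box)    # small enough: report as an enclosure
                continue
            if xu - xl >= yu - yl:       # branch: bisect the widest variable
                xm = 0.5 * (xl + xu)
                pool.extend([((xl, xm), (yl, yu)), ((xm, xu), (yl, yu))])
            else:
                ym = 0.5 * (yl + yu)
                pool.extend([((xl, xu), (yl, ym)), ((xl, xu), (ym, yu))])
        return solutions

    print(len(branch_and_prune(((-2.0, 2.0), (-2.0, 2.0)))), "candidate boxes")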
Symmetry breaking in numeric constraint problems
Symmetry-breaking constraints in the form of inequalities between variables have been proposed for a few kinds of solution symmetries in numeric CSPs. We show that, for the variable symmetries among those, the proposed inequalities are but a specific case of a relaxation of the well-known LEX constraints extensively used for discrete CSPs. We discuss the merits of this relaxation and present experimental evidence of its practical interest. Postprint (author's final draft).
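To illustrate the relationship claimed above (the exact formulation in the paper may differ): for a variable symmetry given by a permutation \sigma of the variables, the LEX (lex-leader) constraint is

    (x_1, x_2, \ldots, x_n) \preceq_{\mathrm{lex}} (x_{\sigma(1)}, x_{\sigma(2)}, \ldots, x_{\sigma(n)}) ,

which unfolds into a conjunction of conditional constraints. A relaxation keeps only plain inequalities between variables, the simplest being the leading one, x_1 \le x_{\sigma(1)}; inequalities of this kind are what the abstract identifies as a specific case of a LEX relaxation.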
Entropy-based analysis of the number partitioning problem
In this paper we apply the multicanonical method of statistical physics to
the number-partitioning problem (NPP). This problem is a basic NP-hard problem
from computer science, and can be formulated as a spin-glass problem. We
compute the spectral degeneracy, which gives us information about the number of
solutions for a given cost and cardinality. We also study an extension of this
problem to partitions into more than two subsets. We show that there is a
fundamental difference in the spectral degeneracy of the generalized NPP, which
could explain why it is so difficult to find good solutions for this case. The
information obtained with the multicanonical method can be very useful in the
construction of new algorithms. Comment: 6 pages, 4 figures.
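As a small illustration of the quantity being computed (not the paper's multicanonical code): for a tiny instance, the spectral degeneracy, i.e. the number of spin configurations with a given cost and cardinality in the spin-glass formulation with cost |sum_i a_i s_i|, s_i = +/-1, can be counted exactly by enumeration; the multicanonical method estimates it when enumeration is out of reach. The toy number set below is mine.

    from collections import Counter
    from itertools import product

    def spectral_degeneracy(a):
        """Count sign assignments per (cost, cardinality) pair."""
        counts = Counter()
        for spins in product((-1, 1), repeat=len(a)):
            cost = abs(sum(ai * si for ai, si in zip(a, spins)))
            cardinality = sum(si == 1 for si in spins)
            counts[(cost, cardinality)] += 1
        return counts

    a = [8, 7, 6, 5, 4]                  # toy set of numbers to partition
    for (cost, m), g in sorted(spectral_degeneracy(a).items()):
        print(f"cost={cost:2d}  cardinality={m}  degeneracy={g}")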
Certainty Closure: Reliable Constraint Reasoning with Incomplete or Erroneous Data
Constraint Programming (CP) has proved an effective paradigm to model and
solve difficult combinatorial satisfaction and optimisation problems from
disparate domains. Many such problems arising from the commercial world are
permeated by data uncertainty. Existing CP approaches that accommodate
uncertainty are less suited to uncertainty arising due to incomplete and
erroneous data, because they do not build reliable models and solutions
guaranteed to address the user's genuine problem as she perceives it. Other
fields such as reliable computation offer combinations of models and associated
methods to handle these types of uncertain data, but lack an expressive
framework characterising the resolution methodology independently of the model.
We present a unifying framework that extends the CP formalism in both model
and solutions, to tackle ill-defined combinatorial problems with incomplete or
erroneous data. The certainty closure framework brings together modelling and
solving methodologies from different fields into the CP paradigm to provide
reliable and efficient approaches for uncertain constraint problems. We
demonstrate the applicability of the framework on a case study in network
diagnosis. We define resolution forms that give generic templates, and their
associated operational semantics, to derive practical solution methods for
reliable solutions. Comment: Revised version.
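A loose, simplified Python illustration of the kind of reliable enclosure the framework aims at (my own reading of the abstract, not the framework's formal definitions; the toy constraint, domain and realization set are invented): when a constraint coefficient is uncertain but known to lie in a finite set of realizations, collecting the solutions of every realization guarantees that the solution of the user's true problem is not lost.

    from itertools import product

    def reliable_solutions(domain, coefficient_realizations):
        """Union of the solution sets of c*x + y == 10 over all realizations of c."""
        closure = set()
        for c in coefficient_realizations:          # each possible value of the data
            for x, y in product(domain, repeat=2):
                if c * x + y == 10:
                    closure.add((x, y))
        return closure

    print(sorted(reliable_solutions(range(0, 11), coefficient_realizations=(1, 2))))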