ELVIS: Exploring the Local Volume in Simulations
We introduce a set of high-resolution dissipationless simulations that model
the Local Group (LG) in a cosmological context: Exploring the Local Volume in
Simulations (ELVIS). The suite contains 48 Galaxy-size halos, each within
high-resolution volumes that span 2-5 Mpc in size, and each resolving thousands
of systems with masses below the atomic cooling limit. Half of the ELVIS galaxy
halos are in paired configurations similar to the Milky Way (MW) and M31; the
other half are isolated, mass-matched analogs. We find no difference in the
abundance or kinematics of substructure within the virial radii of isolated
versus paired hosts. On Mpc scales, however, LG-like pairs average almost twice
as many companions and the velocity field is kinematically hotter and more
complex. We present a refined abundance matching relation between stellar mass
and halo mass that reproduces the observed satellite stellar mass functions of
the MW and M31 down to the regime where incompleteness becomes an issue. Within
a larger region spanning approximately 3 Mpc, the same relation predicts that
there should be 1000 galaxies awaiting discovery. We show that up to 50% of halos
within 1 Mpc of the MW or M31 could be systems that have previously been within
the virial radius of either giant. By associating never-accreted halos with
gas-rich dwarfs, we show that there are plausibly 50 undiscovered, HI-bearing
dwarf galaxies within the Local Volume. The radial
velocity distribution of these predicted gas-rich dwarfs can be used to inform
follow-up searches based on ultra-compact high-velocity clouds found in the
ALFALFA survey.
Comment: 22 pages, 19 figures, 3 tables; v2 -- accepted to MNRAS. Movies,
images, and data are available at http://localgroup.ps.uci.edu/elvi
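The abundance matching relation referred to above rests on rank ordering: the i-th most massive halo is assigned the i-th largest stellar mass, so the cumulative abundances of the two populations agree by construction. A minimal sketch of that idea (the toy catalogues below are made-up power laws, not the ELVIS calibration):

```python
import numpy as np

def abundance_match(halo_masses, stellar_masses):
    """Rank-order abundance matching: assign the i-th most massive
    halo the i-th largest stellar mass, so the cumulative number
    counts of the two populations agree by construction."""
    halo_order = np.argsort(halo_masses)[::-1]      # halo indices, descending mass
    stellar_sorted = np.sort(stellar_masses)[::-1]  # stellar masses, descending
    mstar = np.empty_like(stellar_sorted)
    mstar[halo_order] = stellar_sorted              # map rank -> rank
    return mstar

# toy catalogues (arbitrary units, illustration only)
rng = np.random.default_rng(0)
halos = rng.pareto(1.5, size=1000) + 1.0
stars = rng.pareto(1.0, size=1000) + 0.1
mstar = abundance_match(halos, stars)
```

Real abundance matching works with mass *functions* rather than finite catalogues and often adds scatter, but the rank-ordering step is the core of the technique.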
Combined 3D thinning and greedy algorithm to approximate realistic particles with corrected mechanical properties
The shape of irregular particles has a significant influence on the micro- and
macroscopic behavior of granular systems. This paper presents a combined 3D
thinning and greedy set-covering algorithm to approximate realistic particles
with a clump of overlapping spheres for discrete element method (DEM)
simulations. First, the particle medial surface (or surface skeleton), from
which all candidate (maximal inscribed) spheres can be generated, is computed
by topological 3D thinning. Then, the clump generation procedure is
cast as a set-covering problem (SCP), which is solved greedily.
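The greedy step can be sketched as follows (the voxel and sphere sets below are hypothetical, not the paper's implementation): each candidate sphere covers a set of interior voxels, and the sphere covering the most still-uncovered voxels is picked repeatedly until the target volume coverage is reached.

```python
def greedy_set_cover(universe, candidates, target_coverage=0.85):
    """Greedy set cover: repeatedly pick the candidate covering the
    most still-uncovered elements (voxels of the particle interior)
    until `target_coverage` of the universe is covered."""
    uncovered = set(universe)
    chosen = []
    allowed_uncovered = len(uncovered) * (1.0 - target_coverage)
    while uncovered and len(uncovered) > allowed_uncovered:
        best = max(candidates, key=lambda s: len(candidates[s] & uncovered))
        gain = candidates[best] & uncovered
        if not gain:          # no candidate adds coverage; stop early
            break
        chosen.append(best)
        uncovered -= gain
    return chosen

# toy example: voxels 0..9, three candidate "spheres"
voxels = range(10)
spheres = {"A": {0, 1, 2, 3, 4}, "B": {4, 5, 6, 7}, "C": {7, 8, 9}}
picked = greedy_set_cover(voxels, spheres, target_coverage=0.9)
```

The greedy choice is the standard approximation for set cover (optimal cover is NP-hard); it comes with a logarithmic approximation guarantee.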
To correct the mass distribution due to highly overlapped spheres inside the
clump, linear programming (LP) is used to adjust the density of each component
sphere, such that the aggregate properties (mass, center of mass, and inertia
tensor) match those of the prototypical particle exactly or to within a close
tolerance. In order to
find the optimal approximation accuracy (volume coverage: ratio of clump's
volume to the original particle's volume), simulations of particle flow in a
rotating drum are conducted for three different particle shapes. The dynamic
angle of repose is observed to converge for all particle shapes at 85% volume
coverage (fewer than 30 spheres per clump), which suggests a possible optimal
resolution for capturing the mechanical behavior of the system.
Comment: 34 pages, 13 figures
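The density correction works because total mass and center of mass are linear in the per-sphere densities, which is what makes an LP formulation possible. The sketch below sets up those linear constraints for a toy three-sphere clump; it uses a least-squares solve as a stand-in for the paper's LP, and all volumes, centres, and targets are made-up numbers.

```python
import numpy as np

# Each clump sphere i has volume V_i and centre c_i. Mass and centre
# of mass are linear in the per-sphere densities rho_i:
#   sum_i rho_i * V_i        = M_target
#   sum_i rho_i * V_i * c_i  = M_target * com_target   (3 equations)
# (the inertia tensor adds further linear rows, omitted here).
V = np.array([1.0, 0.8, 0.5])                            # sphere volumes
C = np.array([[0.0, 0, 0], [1.0, 0, 0], [2.0, 0, 0]])    # sphere centres
M_target = 2.0
com_target = np.array([0.8, 0.0, 0.0])

A = np.vstack([V, (V[:, None] * C).T])                   # 4 x 3 linear system
b = np.concatenate([[M_target], M_target * com_target])
rho, *_ = np.linalg.lstsq(A, b, rcond=None)              # min-norm exact solution
```

Because the system is underdetermined and consistent, the least-squares solve satisfies the constraints exactly; an LP additionally lets one impose bounds such as non-negative densities, which is presumably why the paper prefers it.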
The Language of Mathematics: Mathematical Terminology Simplified for Classroom Use.
After recognizing the need for a simpler approach to the teaching of mathematical terminology, I concluded it would be valuable to create a unit of simplified terms and describe methods of teaching these terms. In this thesis I have compared the terminology found in the Virginia Standards of Learning objectives to the materials found at each grade level. The units developed are as follows: The Primary Persistence Unit, for grades K-2; The Elementary Expansion Unit, for grades 3-5; and The Middle School Mastery Unit, for grades 6-8.
A probabilistic multidimensional data model and its applications in business management
This dissertation develops a conceptual data model that can efficiently handle huge volumes of data that contain uncertainty and are subject to frequent change. This model can be used to build Decision Support Systems to improve the decision-making process. Business intelligence and decision-making in today's business world require extensive use of huge volumes of data. Real-world data contain uncertainty and change over time. Business leaders should have access to Decision Support Systems that can efficiently handle voluminous data, uncertainty, and modifications to uncertain data. Database product vendors provide several extensions and features to support these requirements; however, these extensions lack support for standard conceptual models. Standardization generally creates more competition and leads to lower prices and improved standards of living. Results from this study could become a data model standard in the area of applied decision sciences.
The conceptual data model developed in this dissertation uses a mathematical concept based on set theory, probability axioms, and the Bayesian framework. The conceptual data model, an algebra to manipulate the data, and a framework and algorithm for modifying the data are presented. The data modification algorithm is analyzed for time and space efficiency. Formal mathematical proofs are provided to support the identified properties of the model, the algebra, and the modification framework. The decision-making ability of this model was investigated using sample data. Advantages of this model and improvements in inventory management through its application are described. Comparisons and contrasts between this model and Bayesian belief networks are presented. Finally, the scope and topics for further research are described.
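To illustrate the kind of structure such a model implies (this sketch is not the dissertation's formalism; the cell, value, and likelihood names are hypothetical), an uncertain measure in a multidimensional cell can be stored as a discrete probability distribution over candidate values and revised with Bayes' rule when new evidence arrives:

```python
def normalize(dist):
    """Rescale a value -> weight mapping so the weights sum to 1."""
    total = sum(dist.values())
    return {v: p / total for v, p in dist.items()}

def bayes_revise(prior, likelihood):
    """Revise a discrete distribution over cell values:
    posterior(v) is proportional to prior(v) * likelihood(v)."""
    return normalize({v: p * likelihood.get(v, 0.0) for v, p in prior.items()})

# cell: forecast sales for a hypothetical (region, quarter) pair
cell = {100: 0.5, 150: 0.3, 200: 0.2}       # prior over candidate values
evidence = {100: 0.1, 150: 0.6, 200: 0.3}   # likelihood from a new report
cell = bayes_revise(cell, evidence)         # revised distribution
```

The point of the example is that the revision is a pure function of the stored distribution and the evidence, so the same update rule applies uniformly across cells of the multidimensional model.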
Exploring and interrogating astrophysical data in virtual reality
Scientists across all disciplines increasingly rely on machine learning algorithms to analyse and sort datasets of ever increasing volume and complexity. Although trends and outliers are easily extracted, careful and close inspection will still be necessary to explore and disentangle detailed behaviour, as well as to identify systematics and false positives. We must therefore incorporate new technologies to facilitate scientific analysis and exploration. Astrophysical data is inherently multi-parameter, with the spatial-kinematic dimensions at the core of observations and simulations. The arrival of mainstream virtual-reality (VR) headsets and increased GPU power, as well as the availability of versatile development tools for video games, has enabled scientists to deploy such technology to effectively interrogate and interact with complex data. In this paper we present development and results from custom-built interactive VR tools, called the iDaVIE suite, that are informed and driven by research on galaxy evolution, cosmic large-scale structure, galaxy–galaxy interactions, and gas/kinematics of nearby galaxies in survey and targeted observations. In the new era of Big Data ushered in by major facilities such as the SKA and LSST, which render past analysis and refinement methods highly constrained, we believe that a paradigm shift to new software, technology and methods that exploit the power of visual perception will play an increasingly important role in bridging the gap between statistical metrics and new discovery. We have released a beta version of the iDaVIE software system that is free and open to the community.
Piles of piles: An inter-country comparison of nuclear pile development during World War II
Between the time of the discovery of nuclear fission in early 1939 and the
end of 1946, approximately 90 nuclear piles were constructed in six countries.
These devices ranged from simple graphite columns containing neutron sources
but no uranium to others as complex as the water-cooled 250-megawatt plutonium
production reactors built at Hanford, Washington. This paper summarizes and
compares the properties of these piles.
Comment: 45 pages, 9 figures
The large-scale correlations of multi-cell densities and profiles, implications for cosmic variance estimates
In order to quantify the error budget in the measured probability
distribution functions of cell densities, the two-point statistics of cosmic
densities in concentric spheres are investigated. Bias functions are introduced
as the ratio of their two-point correlation function to the two-point
correlation of the underlying dark matter distribution. They describe how cell
densities are spatially correlated. They are computed here via the so-called
large deviation principle in the quasi-linear regime. Their large-separation
limit is presented and successfully compared to simulations for density and
density slopes: this regime is shown to be reached rapidly, allowing
sub-percent precision for a wide range of densities and variances. The
corresponding asymptotic limit provides an estimate of the cosmic variance of
standard concentric cell statistics applied to finite surveys. More generally,
no assumption on the separation is required for some specific moments of the
two-point statistics, for instance when predicting the generating function of
cumulants containing any powers of concentric densities in one location and one
power of density at some arbitrary distance from the rest. This exact "one
external leg" cumulant generating function is used in particular to probe the
rate of convergence of the large-separation approximation.
Comment: 17 pages, 10 figures, replaced to match the MNRAS accepted version
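In practice, a bias function of the kind defined above is estimated as a ratio of correlations: correlate the cell density with the matter density and divide by the matter auto-correlation. The sketch below does this on mock data with a known linear bias; it is an illustrative estimator only (array names are hypothetical, and the separation dependence of the full two-point statistics is suppressed).

```python
import numpy as np

def bias_estimate(delta_cell, delta_matter):
    """Estimate a linear bias as the ratio of the cross-correlation
    of cell densities with the matter density to the matter
    auto-correlation: b = <delta_cell * delta_m> / <delta_m * delta_m>."""
    cross = np.mean(delta_cell * delta_matter)
    auto = np.mean(delta_matter * delta_matter)
    return cross / auto

# mock data: cell density linearly biased w.r.t. matter, plus noise
rng = np.random.default_rng(1)
delta_m = rng.normal(0.0, 0.1, size=100_000)
delta_c = 1.5 * delta_m + rng.normal(0.0, 0.01, size=100_000)
b = bias_estimate(delta_c, delta_m)   # recovers a value near 1.5
```

Because the noise is uncorrelated with the matter field, the ratio converges to the input bias of 1.5 as the sample grows; in the paper's setting the same ratio is taken between fields at finite separation, which is where the large-separation limit enters.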