A Unified Gas-kinetic Scheme for Continuum and Rarefied Flows IV: full Boltzmann and Model Equations
Fluid dynamic equations are valid in their respective modeling scales. As the
modeling scale varies, there should theoretically exist a continuous spectrum
of fluid dynamic equations. To study multiscale flow evolution efficiently,
the dynamics used in a computational fluid simulation has to change with the
scale; a direct modeling of flow physics at a changeable scale may therefore
be an appropriate approach. The unified gas-kinetic scheme (UGKS)
is a direct modeling method in the mesh size scale, and its underlying flow
physics depends on the resolution of the cell size relative to the particle
mean free path. The cell size of UGKS is not limited by the particle mean free
path. With the variation of the ratio between the numerical cell size and local
particle mean free path, the UGKS recovers the flow dynamics from the particle
transport and collision in the kinetic scale to the wave propagation in the
hydrodynamic scale.
The previous UGKS is mostly constructed from the evolution solution of
kinetic model equations. This work is about the further development of the UGKS
with the implementation of the full Boltzmann collision term in the region
where it is needed. The central ingredient of the UGKS is the coupled treatment
of particle transport and collision in the flux evaluation across a cell
interface, where a continuous flow dynamics from kinetic to hydrodynamic scales
is modeled. The newly developed UGKS has the asymptotic preserving (AP)
property of recovering the Navier-Stokes (NS) solutions in the continuum flow
regime and the full Boltzmann solution in the rarefied regime. In the mostly
unexplored transition regime, the UGKS itself provides a valuable tool for
flow study. The mathematical properties of the scheme, such as stability,
accuracy, and the asymptotic preserving property, will be analyzed in this
paper as well.
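The kinetic model equations mentioned above are commonly of BGK relaxation type, in which the particle distribution function relaxes toward a local Maxwellian over a collision time tau. As an illustrative sketch only (not code from the paper; the velocity grid, time step, and initial two-beam distribution are hypothetical), one explicit relaxation step for a space-homogeneous gas could look like:

```python
import numpy as np

def bgk_relaxation_step(f, u, dt, tau):
    """One explicit BGK step for a space-homogeneous gas:
    df/dt = (f_eq - f) / tau, where f_eq is the Maxwellian sharing
    the moments (density, velocity, temperature) of f (kB = m = 1)."""
    du = u[1] - u[0]
    rho = np.sum(f) * du                       # density
    vel = np.sum(f * u) * du / rho             # bulk velocity
    T = np.sum(f * (u - vel) ** 2) * du / rho  # temperature
    f_eq = rho / np.sqrt(2 * np.pi * T) * np.exp(-(u - vel) ** 2 / (2 * T))
    return f + dt / tau * (f_eq - f)

# hypothetical example: relax a two-beam distribution toward equilibrium
u = np.linspace(-10.0, 10.0, 400)
f = np.exp(-(u - 2.0) ** 2) + np.exp(-(u + 2.0) ** 2)
for _ in range(100):
    f = bgk_relaxation_step(f, u, dt=0.05, tau=1.0)
```

Because f_eq is built from the moments of f, each step conserves mass, momentum, and energy up to quadrature error, which is the discrete analogue of the collision invariants of the full Boltzmann collision term.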
Multiscale likelihood analysis and complexity penalized estimation
We describe here a framework for a certain class of multiscale likelihood
factorizations wherein, in analogy to a wavelet decomposition of an L^2
function, a given likelihood function has an alternative representation as a
product of conditional densities reflecting information in both the data and
the parameter vector localized in position and scale. The framework is
developed as a set of sufficient conditions for the existence of such
factorizations, formulated in analogy to those underlying a standard
multiresolution analysis for wavelets, and hence can be viewed as a
multiresolution analysis for likelihoods. We then consider the use of these
factorizations in the task of nonparametric, complexity penalized likelihood
estimation. We study the risk properties of certain thresholding and
partitioning estimators, and demonstrate their adaptivity and near-optimality,
in a minimax sense over a broad range of function spaces, based on squared
Hellinger distance as a loss function. In particular, our results provide an
illustration of how properties of classical wavelet-based estimators can be
obtained in a single, unified framework that includes models for continuous,
count, and categorical data types.
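For comparison with the thresholding estimators studied here, a classical wavelet-based estimator soft-thresholds the detail coefficients of an orthonormal (e.g. Haar) decomposition. The following is a generic sketch of that classical construction only, not the paper's multiscale likelihood estimator; the threshold lam and the input signal are hypothetical:

```python
import numpy as np

def haar_threshold_estimate(y, lam):
    """Soft-threshold the Haar detail coefficients of y (len(y) = 2**J)
    and reconstruct; a generic stand-in for complexity-penalized
    multiscale estimation."""
    s = np.asarray(y, dtype=float).copy()
    details = []
    while len(s) > 1:                       # forward orthonormal Haar
        pairs = s.reshape(-1, 2)
        avg = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)
        det = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)
        # soft threshold: shrink each detail coefficient toward zero
        details.append(np.sign(det) * np.maximum(np.abs(det) - lam, 0.0))
        s = avg
    for det in reversed(details):           # inverse transform
        pairs = np.empty((len(s), 2))
        pairs[:, 0] = (s + det) / np.sqrt(2)
        pairs[:, 1] = (s - det) / np.sqrt(2)
        s = pairs.ravel()
    return s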
The Topology ToolKit
This system paper presents the Topology ToolKit (TTK), a software platform
designed for topological data analysis in scientific visualization. TTK
provides a unified, generic, efficient, and robust implementation of key
algorithms for the topological analysis of scalar data, including: critical
points, integral lines, persistence diagrams, persistence curves, merge trees,
contour trees, Morse-Smale complexes, fiber surfaces, continuous scatterplots,
Jacobi sets, Reeb spaces, and more. TTK is easily accessible to end users due
to a tight integration with ParaView. It is also easily accessible to
developers through a variety of bindings (Python, VTK/C++) for fast prototyping
or through direct, dependency-free C++ to ease integration into pre-existing
complex systems. While developing TTK, we faced several algorithmic and
software engineering challenges, which we document in this paper. In
particular, we present an algorithm for the construction of a discrete gradient
that conforms to the critical points extracted in the piecewise-linear setting.
This algorithm guarantees a combinatorial consistency across the topological
abstractions supported by TTK, and importantly, a unified implementation of
topological data simplification for multi-scale exploration and analysis. We
also present a cached triangulation data structure that supports time-efficient
and generic traversals, self-adjusts its memory usage on demand for input
simplicial meshes, and implicitly emulates a triangulation for regular grids
with no memory overhead. Finally, we describe an original software architecture
that guarantees memory-efficient, direct access to TTK features while still
allowing researchers powerful and easy bindings and extensions. TTK is open
source (BSD license) and its code, online documentation, and video tutorials
are available on TTK's website.
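For intuition on one of the abstractions listed above, the 0-dimensional persistence pairs of a 1-D scalar field can be computed by sweeping vertices in increasing value and merging sublevel-set components with a union-find. This is a toy illustration only, not TTK's API; the function name and input are hypothetical:

```python
def persistence_pairs_1d(values):
    """0-dimensional persistence of sublevel sets of a non-empty 1-D
    scalar field. Returns sorted (birth, death) pairs; the component of
    the global minimum never dies (death = +inf)."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    parent, birth = {}, {}                 # union-find over active vertices

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    pairs = []
    for i in order:                        # sweep by increasing value
        parent[i] = i
        birth[i] = values[i]
        for j in (i - 1, i + 1):           # merge with active neighbors
            if 0 <= j < n and j in parent:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                if birth[ri] > birth[rj]:  # elder rule: younger dies
                    ri, rj = rj, ri
                pairs.append((birth[rj], values[i]))
                parent[rj] = ri
    pairs = [p for p in pairs if p[1] > p[0]]  # drop zero-persistence pairs
    pairs.append((birth[find(order[0])], float("inf")))
    return sorted(pairs)
```

Persistence-based simplification then amounts to discarding pairs whose death minus birth falls below a user threshold, which is the multi-scale exploration mechanism the abstract refers to.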