Tenfold your photons -- a physically-sound approach to filtering-based variance reduction of Monte-Carlo-simulated dose distributions
X-ray dose is of growing interest in the interventional suite. Since dose is generally difficult to monitor reliably, fast computational methods are desirable. A major drawback of the gold standard based on Monte Carlo (MC) methods is its computational complexity. Besides common variance reduction techniques, filtering approaches are often applied to achieve conclusive results within a fraction of the time. Inspired by these methods, we propose a novel approach. We down-sample the target volume based on the fraction of mass, simulate the imaging situation, and then revert the down-sampling. To this end, the dose is weighted by the mass-energy absorption, up-sampled, and redistributed using a guided filter. Eventually, the weighting is inverted, resulting in accurate high-resolution dose distributions. The approach has the potential to considerably speed up MC simulations, since fewer photons and boundary checks are necessary. First experiments substantiate these assumptions. We achieve a median dose-estimation accuracy of 96.7% and 97.4% with down-sampling factors of 8 and 4, respectively. While maintaining high accuracy, the proposed method provides a tenfold speed-up. Overall, the findings suggest that the proposed method has the potential to enable further efficiency gains.

Comment: 6 pages, 3 figures, Bildverarbeitung für die Medizin 202
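The pipeline described in the abstract (mass-based down-sampling, coarse simulation, weighting by mass-energy absorption, up-sampling, local redistribution, and inversion of the weighting) can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: all function names are invented, a simple box blur stands in for the guided filter, and the fine-resolution mass-energy absorption map `mu_en_fine` is assumed to be given.

```python
import numpy as np

def downsample_mass(vol, f):
    """Block-average a volume by factor f along each axis
    (mass-preserving when all voxels have equal size)."""
    m = [s // f for s in vol.shape]
    v = vol[:m[0] * f, :m[1] * f, :m[2] * f]
    return v.reshape(m[0], f, m[1], f, m[2], f).mean(axis=(1, 3, 5))

def upsample(vol, f):
    """Nearest-neighbour up-sampling back to the fine grid."""
    return vol.repeat(f, axis=0).repeat(f, axis=1).repeat(f, axis=2)

def box_blur(vol, r=1):
    """Simple box blur, a crude stand-in for the guided filter."""
    out = np.zeros_like(vol)
    n = 0
    for dx in range(-r, r + 1):
        for dy in range(-r, r + 1):
            for dz in range(-r, r + 1):
                out += np.roll(vol, (dx, dy, dz), axis=(0, 1, 2))
                n += 1
    return out / n

def refine_dose(coarse_dose, mu_en_fine, f):
    """Revert the down-sampling: weight by mass-energy absorption,
    up-sample, redistribute locally, then invert the weighting."""
    mu_en_coarse = downsample_mass(mu_en_fine, f)
    weighted = coarse_dose * mu_en_coarse   # weight by mass-energy absorption
    fine = upsample(weighted, f)            # back to the original resolution
    fine = box_blur(fine, r=1)              # redistribute (guided filter in the paper)
    return fine / mu_en_fine                # invert the weighting
```

For a homogeneous phantom the round trip is exact: a constant coarse dose and constant absorption map come back as the same constant dose on the fine grid.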
String and Membrane Gaussian Processes
In this paper we introduce a novel framework for exact nonparametric Bayesian inference on latent functions that is particularly suitable for Big Data tasks. Firstly, we introduce a class of stochastic processes we refer to
as string Gaussian processes (string GPs), which are not to be mistaken for
Gaussian processes operating on text. We construct string GPs so that their
finite-dimensional marginals exhibit suitable local conditional independence
structures, which allow for scalable, distributed, and flexible nonparametric
Bayesian inference, without resorting to approximations, and while ensuring
some mild global regularity constraints. Furthermore, string GP priors
naturally cope with heterogeneous input data, and the gradient of the learned
latent function is readily available for explanatory analysis. Secondly, we
provide some theoretical results relating our approach to the standard GP
paradigm. In particular, we prove that some string GPs are Gaussian processes,
which provides a complementary global perspective on our framework. Finally, we
derive a scalable and distributed MCMC scheme for supervised learning tasks
under string GP priors. The proposed MCMC scheme has computational time and memory requirements that scale with the data size N and the dimension d of the input space. We illustrate the efficacy of the proposed approach on several synthetic and real-world datasets, including a dataset with millions of input points and attributes.

Comment: To appear in the Journal of Machine Learning Research (JMLR), Volume 1
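The local conditional independence that makes string GPs scalable can be illustrated in one dimension with a Markovian kernel. The sketch below is a toy analogue, not the paper's construction: for the Ornstein-Uhlenbeck kernel k(s, t) = exp(-|s - t|), each value depends on the rest of the path only through its immediate neighbour, so an exact sample at N ordered inputs costs O(N) instead of the O(N^3) of a dense Cholesky factorization.

```python
import numpy as np

def sample_ou_gp(ts, rng):
    """Draw an exact sample of a zero-mean Gaussian process with
    kernel k(s, t) = exp(-|s - t|) at sorted input times ts.

    Because the OU kernel is Markov, the conditional distribution of
    each value given all previous ones depends only on the immediately
    preceding point, so sampling is sequential and O(N). This mirrors,
    in a toy way, the local conditional independence string GPs exploit."""
    x = np.empty(len(ts))
    x[0] = rng.standard_normal()              # marginal is N(0, 1)
    for i in range(1, len(ts)):
        rho = np.exp(-(ts[i] - ts[i - 1]))    # correlation with previous point
        x[i] = rho * x[i - 1] + np.sqrt(1.0 - rho**2) * rng.standard_normal()
    return x
```

Empirically, samples drawn this way reproduce the kernel: values at lag 0.5 have correlation close to exp(-0.5).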
Structural Variability from Noisy Tomographic Projections
In cryo-electron microscopy, the 3D electric potentials of an ensemble of
molecules are projected along arbitrary viewing directions to yield noisy 2D
images. The volume maps representing these potentials typically exhibit a great
deal of structural variability, which is described by their 3D covariance
matrix. Typically, this covariance matrix is approximately low-rank and can be
used to cluster the volumes or estimate the intrinsic geometry of the
conformation space. We formulate the estimation of this covariance matrix as a
linear inverse problem, yielding a consistent least-squares estimator. For
images of size n-by-n pixels, we propose an algorithm for calculating this covariance estimator whose computational cost is governed by the image size n and by a condition number that is empirically moderate. Its efficiency relies on the
observation that the normal equations are equivalent to a deconvolution problem
in 6D. This is then solved by the conjugate gradient method with an appropriate
circulant preconditioner. The result is the first computationally efficient
algorithm for consistent estimation of 3D covariance from noisy projections. It
also compares favorably in runtime with respect to previously proposed
non-consistent estimators. Motivated by the recent success of eigenvalue
shrinkage procedures for high-dimensional covariance matrices, we introduce a
shrinkage procedure that improves accuracy at lower signal-to-noise ratios. We
evaluate our methods on simulated datasets and achieve classification results
comparable to state-of-the-art methods in shorter running time. We also present
results on clustering volumes in an experimental dataset, illustrating the
power of the proposed algorithm for practical determination of structural
variability.

Comment: 52 pages, 11 figures
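The computational core of this abstract, solving the normal equations as a deconvolution with the conjugate gradient method and a circulant preconditioner, can be illustrated in 1D. This is a hypothetical sketch, not the authors' 6D solver: a masked circular convolution plays the role of the imaging operator, Tikhonov regularization with an invented weight `lam` keeps the system positive definite, and the preconditioner inverts the circulant part of the operator in the Fourier domain.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256

# Blurring kernel (centred Gaussian), applied via its Fourier transform.
h = np.exp(-0.5 * (np.arange(n) - n / 2) ** 2 / 4.0)
h /= h.sum()
hhat = np.fft.fft(np.roll(h, -n // 2))

def conv(x):
    """Circular convolution with the kernel."""
    return np.fft.ifft(np.fft.fft(x) * hhat).real

def convT(x):
    """Adjoint of the circular convolution."""
    return np.fft.ifft(np.fft.fft(x) * hhat.conj()).real

mask = (rng.random(n) > 0.1).astype(float)  # roughly 10% missing observations
lam = 1e-3                                  # Tikhonov regularization weight

def normal_op(x):
    """Normal-equations operator A^T A + lam*I with A = mask * conv."""
    return convT(mask * conv(x)) + lam * x

def circulant_prec(r):
    """Circulant preconditioner: invert |hhat|^2 + lam in Fourier space."""
    return np.fft.ifft(np.fft.fft(r) / (np.abs(hhat) ** 2 + lam)).real

def pcg(A, Minv, b, tol=1e-8, maxit=200):
    """Preconditioned conjugate gradients for a symmetric positive
    definite operator A, both A and Minv given as functions."""
    x = np.zeros_like(b)
    r = b - A(x)
    z = Minv(r)
    p = z.copy()
    rz = r @ z
    its = 0
    for its in range(1, maxit + 1):
        Ap = A(p)
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, its

x_true = rng.standard_normal(n)
data = mask * conv(x_true)
rhs = convT(mask * data)
x_hat, iters = pcg(normal_op, circulant_prec, rhs)
```

Without the mask the preconditioned operator would be the identity and CG would converge in one step; the masked entries introduce a low-rank perturbation, so convergence still takes only a few dozen iterations, illustrating why a circulant preconditioner is effective when the normal equations are close to a convolution.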
A Survey on Continuous Time Computations
We provide an overview of theories of continuous time computation. These
theories allow us to understand both the hardness of questions related to
continuous time dynamical systems and the computational power of continuous
time analog models. We survey the existing models, summarize known results, and point to relevant references in the literature.