Optical Flow on Evolving Surfaces with an Application to the Analysis of 4D Microscopy Data
We extend the concept of optical flow to a dynamic non-Euclidean setting.
Optical flow is traditionally computed from a sequence of flat images. It is
the purpose of this paper to introduce variational motion estimation for images
that are defined on an evolving surface. Volumetric microscopy images depicting
a live zebrafish embryo serve as both biological motivation and test data.
Comment: The final publication is available at link.springer.co
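For orientation, the classical flat-image setting that this paper generalizes can be sketched with the Horn-Schunck variational method: minimize a data term enforcing brightness constancy plus a smoothness regularizer, via Jacobi iterations on the Euler-Lagrange equations. This is a minimal flat-image sketch, not the paper's surface formulation; the function name, the synthetic Gaussian blob, and the parameter values are illustrative.

```python
import numpy as np

def horn_schunck(I1, I2, alpha=0.1, n_iter=500):
    """Classical Horn-Schunck optical flow on flat images -- the Euclidean
    baseline that the paper extends to an evolving surface. Minimizes
        E(u, v) = sum (Ix*u + Iy*v + It)^2 + alpha^2 * (|grad u|^2 + |grad v|^2)
    by Jacobi iteration on its Euler-Lagrange equations."""
    I1, I2 = I1.astype(float), I2.astype(float)
    Ix = np.gradient(I1, axis=1)   # spatial derivatives
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1                   # temporal derivative
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    avg = lambda f: 0.25 * (np.roll(f, 1, 0) + np.roll(f, -1, 0)
                            + np.roll(f, 1, 1) + np.roll(f, -1, 1))
    for _ in range(n_iter):
        u_a, v_a = avg(u), avg(v)
        t = (Ix * u_a + Iy * v_a + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_a - Ix * t
        v = v_a - Iy * t
    return u, v

# Synthetic sequence: a Gaussian blob translated one pixel to the right.
x = np.arange(64)
X, Y = np.meshgrid(x, x)
I1 = np.exp(-((X - 31)**2 + (Y - 31)**2) / 50.0)
I2 = np.exp(-((X - 32)**2 + (Y - 31)**2) / 50.0)
u, v = horn_schunck(I1, I2)

# Average the flow where the image actually has gradient information.
w = np.gradient(I1, axis=1) ** 2
u_mean = float((u * w).sum() / w.sum())   # should be positive (rightward motion)
v_mean = float((v * w).sum() / w.sum())   # should be near zero by symmetry
```

The estimated horizontal flow is positive where the blob moved right, and the vertical component stays near zero; the surface-based formulation in the paper replaces the flat gradients and smoothness term with their counterparts on the evolving embryo surface.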
Active skeleton for bacteria modeling
The investigation of spatio-temporal dynamics of bacterial cells and their
molecular components requires automated image analysis tools to track cell
shape properties and molecular component locations inside the cells. In the
study of bacteria aging, the molecular components of interest are protein
aggregates accumulated near bacteria boundaries. This particular location makes
the correspondence between aggregates and cells highly ambiguous, since
accurately computing bacteria boundaries in phase-contrast time-lapse imaging is
a challenging task. This paper proposes an active skeleton formulation for
bacteria modeling which provides several advantages: an easy computation of
shape properties (perimeter, length, thickness, orientation), an improved
boundary accuracy in noisy images, and a natural bacteria-centered coordinate
system that permits the intrinsic location of molecular components inside the
cell. Starting from an initial skeleton estimate, the medial axis of the
bacterium is obtained by minimizing an energy function which incorporates
bacteria shape constraints. Experimental results on biological images and
comparative evaluation of the performances validate the proposed approach for
modeling cigar-shaped bacteria like Escherichia coli. The ImageJ plugin of the
proposed method can be found online at http://fluobactracker.inrialpes.fr.
Comment: Published in Computer Methods in Biomechanics and Biomedical
Engineering: Imaging and Visualization, to appear
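The energy-minimization idea behind an active skeleton can be sketched in a few lines: the skeleton is a polyline of control points that descends an energy combining an external (image) term pulling it toward the medial axis and an internal bending term acting as a shape constraint. This is a toy sketch under strong assumptions -- the quadratic ridge potential, the plain gradient descent, and all parameter values are illustrative stand-ins, not the paper's actual formulation.

```python
import numpy as np

def skeleton_energy(pts, beta=0.5):
    """Toy active-skeleton energy: an external term pulling control points
    toward an intensity ridge (here the line y = 5, standing in for the
    bacterium's medial axis) plus an internal bending penalty that keeps
    the skeleton smooth, mimicking a shape constraint."""
    external = np.sum((pts[:, 1] - 5.0) ** 2)
    bending = pts[:-2] - 2 * pts[1:-1] + pts[2:]   # discrete 2nd derivative
    internal = beta * np.sum(bending ** 2)
    return external + internal

def minimize(pts, step=0.05, n_iter=500, eps=1e-6):
    """Plain gradient descent using a central-difference numerical gradient."""
    pts = pts.astype(float).copy()
    for _ in range(n_iter):
        g = np.zeros_like(pts)
        for idx in np.ndindex(*pts.shape):
            d = np.zeros_like(pts)
            d[idx] = eps
            g[idx] = (skeleton_energy(pts + d)
                      - skeleton_energy(pts - d)) / (2 * eps)
        pts -= step * g
    return pts

# Initial skeleton estimate: a jagged polyline away from the ridge.
init = np.column_stack([np.linspace(0.0, 10.0, 8),
                        np.array([8, 2, 9, 1, 8, 2, 9, 1], dtype=float)])
fitted = minimize(init)
```

After descent the control points settle onto the ridge, and skeleton-relative coordinates (arc length along the polyline, signed offset from it) then give the bacteria-centered coordinate system the abstract describes.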
Dynamic Bayesian Combination of Multiple Imperfect Classifiers
Classifier combination methods need to make best use of the outputs of
multiple, imperfect classifiers to enable higher accuracy classifications. In
many situations, such as when human decisions need to be combined, the base
decisions can vary enormously in reliability. A Bayesian approach to such
uncertain combination allows us to infer the differences in performance between
individuals and to incorporate any available prior knowledge about their
abilities when training data is sparse. In this paper we explore Bayesian
classifier combination, using the computationally efficient framework of
variational Bayesian inference. We apply the approach to real data from a large
citizen science project, Galaxy Zoo Supernovae, and show that our method far
outperforms other established approaches to imperfect decision combination. We
go on to analyse the putative community structure of the decision makers, based
on their inferred decision making strategies, and show that natural groupings
are formed. Finally we present a dynamic Bayesian classifier combination
approach and investigate the changes in base classifier performance over time.
Comment: 35 pages, 12 figures
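The core combination step can be sketched as follows: each base classifier is described by a row-normalized confusion matrix, and the posterior over the true label multiplies the class prior by each classifier's likelihood of its observed vote, assuming conditional independence given the truth. This is a minimal sketch of the combination rule only; the paper's contribution is inferring the confusion matrices themselves with variational Bayes, and the accuracies below are made-up numbers.

```python
import numpy as np

def combine(votes, confusions, prior):
    """Posterior over the true label t given discrete base-classifier
    outputs c_1..c_K, assuming classifiers are conditionally independent:
        p(t = j | c) ~ p(t = j) * prod_k pi_k[j, c_k]
    where pi_k[j, c] is classifier k's probability of outputting c when
    the true label is j (row j of its normalized confusion matrix)."""
    post = np.array(prior, dtype=float)
    for vote, pi in zip(votes, confusions):
        post *= pi[:, vote]
    return post / post.sum()

# One reliable classifier (90% accurate) vs. two near-random ones (55%).
reliable = np.array([[0.9, 0.1],
                     [0.1, 0.9]])
noisy = np.array([[0.55, 0.45],
                  [0.45, 0.55]])

# The reliable classifier votes class 0; both noisy ones vote class 1.
post = combine(votes=[0, 1, 1],
               confusions=[reliable, noisy, noisy],
               prior=[0.5, 0.5])
```

Here the single reliable vote overrides the two-to-one majority: the posterior puts about 0.86 on class 0, which is exactly the behavior that makes confusion-matrix-weighted combination outperform majority voting when base decisions vary enormously in reliability.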
Stochastic gradient descent performs variational inference, converges to limit cycles for deep networks
Stochastic gradient descent (SGD) is widely believed to perform implicit
regularization when used to train deep neural networks, but the precise manner
in which this occurs has thus far been elusive. We prove that SGD minimizes an
average potential over the posterior distribution of weights along with an
entropic regularization term. This potential is however not the original loss
function in general. So SGD does perform variational inference, but for a
different loss than the one used to compute the gradients. Even more
surprisingly, SGD does not even converge in the classical sense: we show that
the most likely trajectories of SGD for deep networks do not behave like
Brownian motion around critical points. Instead, they resemble closed loops
with deterministic components. We prove that such "out-of-equilibrium" behavior
is a consequence of highly non-isotropic gradient noise in SGD; the covariance
matrix of mini-batch gradients for deep networks has a rank as small as 1% of
its dimension. We provide extensive empirical validation of these claims,
which are proven in the appendix.
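The low-rank structure of the gradient noise is easy to see in a toy model. For a linear least-squares loss, each per-sample gradient is a rank-one direction, so the covariance of the gradient noise over n samples has rank at most n - 1, far below the parameter dimension. This toy illustration only shows the mechanism; the 1%-of-dimension figure is the paper's empirical measurement on deep networks, not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 200, 10                        # parameter dimension >> number of samples
X = rng.normal(size=(n, d))
y = rng.normal(size=n)
w = rng.normal(size=d)

# Per-sample gradients of the squared loss 0.5 * (x_i . w - y_i)^2:
#     g_i = (x_i . w - y_i) * x_i    -- each a rank-one direction
G = (X @ w - y)[:, None] * X          # shape (n, d)

# Covariance of the gradient noise across samples (mean-subtracted,
# so its rank is at most n - 1).
C = np.cov(G, rowvar=False)           # shape (d, d)
eigs = np.linalg.eigvalsh(C)
rank = int(np.sum(eigs > 1e-10 * eigs.max()))
frac = rank / d                       # here at most (n - 1) / d = 4.5%
```

The measured rank is bounded by the batch size minus one, so the noise lives in a tiny subspace of weight space; it is this strong anisotropy that drives the out-of-equilibrium, limit-cycle behavior the paper analyzes.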