Competitive Collaboration: Joint Unsupervised Learning of Depth, Camera Motion, Optical Flow and Motion Segmentation
We address the unsupervised learning of several interconnected problems in
low-level vision: single view depth prediction, camera motion estimation,
optical flow, and segmentation of a video into the static scene and moving
regions. Our key insight is that these four fundamental vision problems are
coupled through geometric constraints. Consequently, learning to solve them
together simplifies the problem because the solutions can reinforce each other.
We go beyond previous work by exploiting geometry more explicitly and
segmenting the scene into static and moving regions. To that end, we introduce
Competitive Collaboration, a framework that facilitates the coordinated
training of multiple specialized neural networks to solve complex problems.
Competitive Collaboration works much like expectation-maximization, but with
neural networks that act as both competitors to explain pixels that correspond
to static or moving regions, and as collaborators through a moderator that
assigns pixels to be either static or independently moving. Our novel method
integrates all these problems in a common framework and simultaneously reasons
about the segmentation of the scene into moving objects and the static
background, the camera motion, depth of the static scene structure, and the
optical flow of moving objects. Our model is trained without any supervision
and achieves state-of-the-art performance among joint unsupervised methods on
all sub-problems.

Comment: CVPR 201
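The EM-like alternation described in the abstract — competitors trained on the pixels currently assigned to them, then a moderator reassigning pixels to whichever competitor explains them better — can be illustrated with a toy sketch. The per-pixel "competitors" below are plain parameter vectors rather than the paper's depth/camera-motion and optical-flow networks; all names and the softmax-style reassignment are illustrative assumptions, not the paper's actual losses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in: each "competitor" tunes a per-pixel value to explain the
# observed signal. (Hypothetical simplification of the paper's networks.)
image = rng.normal(size=100)
static_pred = np.zeros(100)   # competitor: static-scene explanation
moving_pred = np.zeros(100)   # competitor: moving-region explanation
mask = np.full(100, 0.5)      # moderator: P(pixel belongs to static scene)

lr = 0.5
for _ in range(50):
    # Competition phase: each competitor is trained only on the pixels the
    # moderator currently assigns to it (mask-weighted residual step).
    static_pred += lr * mask * (image - static_pred)
    moving_pred += lr * (1 - mask) * (image - moving_pred)

    # Collaboration phase: the moderator softly reassigns each pixel to the
    # competitor with the smaller squared residual.
    r_s = (image - static_pred) ** 2
    r_m = (image - moving_pred) ** 2
    mask = np.exp(-r_s) / (np.exp(-r_s) + np.exp(-r_m))

total_err = mask * (image - static_pred) ** 2 + (1 - mask) * (image - moving_pred) ** 2
```

The two phases reinforce each other: better assignments sharpen each competitor's training signal, and better competitors sharpen the assignments, mirroring the E- and M-steps of expectation-maximization.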
An Infinitesimal Probabilistic Model for Principal Component Analysis of Manifold Valued Data
We provide a probabilistic and infinitesimal view of how the principal
component analysis procedure (PCA) can be generalized to analysis of nonlinear
manifold valued data. Starting with the probabilistic PCA interpretation of the
Euclidean PCA procedure, we show how PCA can be generalized to manifolds in an
intrinsic way that does not resort to linearization of the data space. The
underlying probability model is constructed by mapping a Euclidean stochastic
process to the manifold using stochastic development of Euclidean
semimartingales. The construction uses a connection and bundles of covariant
tensors to allow global transport of principal eigenvectors, and the model is
thereby an example of how principal fiber bundles can be used to handle the
lack of global coordinate system and orientations that characterizes manifold
valued statistics. We show how curvature implies non-integrability of the
equivalent of Euclidean principal subspaces, and how the stochastic flows
provide an alternative to explicit construction of such subspaces. We describe
estimation procedures for inference of parameters and prediction of principal
components, and we give examples of properties of the model on embedded
surfaces.
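The Euclidean starting point the abstract refers to — the probabilistic PCA interpretation of Tipping and Bishop, x = Wz + mu + eps with z ~ N(0, I) and isotropic noise — has a closed-form maximum-likelihood fit, which is what the paper then generalizes to manifolds via stochastic development. A minimal sketch of that Euclidean baseline (variable names are illustrative):

```python
import numpy as np

def ppca_mle(X, q):
    """Closed-form ML fit of Euclidean probabilistic PCA:
    x = W z + mu + eps,  z ~ N(0, I_q),  eps ~ N(0, sigma2 * I)."""
    n, d = X.shape
    mu = X.mean(axis=0)
    S = np.cov(X - mu, rowvar=False, bias=True)
    evals, evecs = np.linalg.eigh(S)           # eigenvalues in ascending order
    evals, evecs = evals[::-1], evecs[:, ::-1]  # reorder to descending
    sigma2 = evals[q:].mean()                  # noise = mean discarded variance
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return W, mu, sigma2

# Synthetic check: 2 latent dimensions embedded in 5-D with small noise.
rng = np.random.default_rng(1)
Z = rng.normal(size=(500, 2))
A = rng.normal(size=(2, 5))
X = Z @ A + 0.1 * rng.normal(size=(500, 5))
W, mu, s2 = ppca_mle(X, q=2)
```

In the Euclidean case the columns of W span the principal subspace globally; the abstract's point is that on a curved manifold no such global subspace exists, and the stochastic flows play its role instead.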