Inexact Alternating Optimization for Phase Retrieval in the Presence of Outliers
Phase retrieval has been mainly considered in the presence of Gaussian noise.
However, the performance of the algorithms proposed under the Gaussian noise
model severely degrades when grossly corrupted data, i.e., outliers, exist.
This paper investigates techniques for phase retrieval in the presence of
heavy-tailed noise -- which is considered a better model for situations where
outliers exist. An $\ell_p$-norm ($0 < p < 2$) based estimator is proposed for
fending against such noise, and two-block inexact alternating optimization is
proposed as the algorithmic framework to tackle the resulting optimization
problem. Two specific algorithms are devised by exploring different local
approximations within this framework. Interestingly, the core conditional
minimization steps can be interpreted as iteratively reweighted least squares
and gradient descent. Convergence properties of the algorithms are discussed,
and the Cram\'er-Rao bound (CRB) is derived. Simulations demonstrate that the
proposed algorithms approach the CRB and outperform state-of-the-art algorithms
in heavy-tailed noise.
Comment: 23 pages, 16 figures
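The abstract notes that one of the core conditional minimization steps can be interpreted as gradient descent on the $\ell_p$ fitting loss. A minimal real-valued sketch of that idea follows; it assumes the intensity model $y_m = (a_m^T x)^2 + \text{noise}$ and is a toy illustration, not the paper's two-block inexact alternating algorithm (the function names and the backtracking step rule are mine):

```python
import numpy as np

def lp_loss(x, y, A, p):
    """l_p fitting loss: sum_m |(a_m^T x)^2 - y_m|^p."""
    return np.sum(np.abs((A @ x) ** 2 - y) ** p)

def lp_phase_retrieval(y, A, x0, p=1.3, iters=200, eps=1e-8):
    """Gradient descent with backtracking on the (nonconvex) l_p loss.

    Real-valued toy version; eps guards the p-2 power near zero residuals.
    """
    x = x0.copy()
    for _ in range(iters):
        z = A @ x                  # z_m = a_m^T x
        r = z ** 2 - y             # residuals
        # d/dx |r_m|^p = p |r_m|^(p-2) r_m * 2 z_m a_m
        g = A.T @ (2 * p * (np.abs(r) + eps) ** (p - 2) * r * z)
        f0, step = lp_loss(x, y, A, p), 1.0
        x_new = x - step * g
        # halve the step until the loss does not increase
        while lp_loss(x_new, y, A, p) > f0 and step > 1e-12:
            step *= 0.5
            x_new = x - step * g
        if lp_loss(x_new, y, A, p) <= f0:
            x = x_new
    return x
```

Because $p < 2$ downweights large residuals relative to least squares, iterations of this kind are less sensitive to grossly corrupted measurements, which is the intuition behind the $\ell_p$ estimator.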
Tensor Decomposition for Signal Processing and Machine Learning
Tensors or {\em multi-way arrays} are functions of three or more indices
-- similar to matrices (two-way arrays), which are functions of two indices
(row, column). Tensors have a rich history,
stretching over almost a century, and touching upon numerous disciplines; but
they have only recently become ubiquitous in signal and data analytics at the
confluence of signal processing, statistics, data mining and machine learning.
This overview article aims to provide a good starting point for researchers and
practitioners interested in learning about and working with tensors. As such,
it focuses on fundamentals and motivation (using various application examples),
aiming to strike an appropriate balance of breadth {\em and depth} that will
enable someone who has taken first graduate courses in matrix algebra and
probability to get started doing research and/or developing tensor algorithms
and software. Some background in applied optimization is useful but not
strictly required. The material covered includes tensor rank and rank
decomposition; basic tensor factorization models and their relationships and
properties (including fairly good coverage of identifiability); broad coverage
of algorithms ranging from alternating optimization to stochastic gradient;
statistical performance analysis; and applications ranging from source
separation to collaborative filtering, mixture and topic modeling,
classification, and multilinear subspace learning.
Comment: revised version, overview article
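Among the algorithms the overview covers, alternating optimization for rank decomposition is the classic workhorse. A minimal sketch of alternating least squares (ALS) for a rank-$R$ CP decomposition of a 3-way array is below; the function names are mine, and this bare-bones version omits the normalization, initialization, and stopping-rule refinements a practical implementation would use:

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product of (M x R) and (N x R) -> (M*N x R)."""
    M, R = U.shape
    N, _ = V.shape
    return (U[:, None, :] * V[None, :, :]).reshape(M * N, R)

def cp_als(X, R, iters=100, seed=0):
    """Rank-R CP decomposition of a 3-way array by alternating least squares.

    Each step solves a linear least-squares problem for one factor with the
    other two fixed, using the unfoldings X1 = A (C kr B)^T, etc.
    """
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    # mode-n unfoldings (Fortran order matches the Khatri-Rao row ordering)
    X1 = X.reshape(I, J * K, order='F')                      # columns j + J*k
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K, order='F')   # columns i + I*k
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J, order='F')   # columns i + I*j
    for _ in range(iters):
        # normal equations use (C^T C) * (B^T B), the Hadamard product
        A = X1 @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
        B = X2 @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
        C = X3 @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))
    return A, B, C
```

The tensor is then approximated entrywise as $x_{ijk} \approx \sum_{r=1}^{R} a_{ir} b_{jr} c_{kr}$, e.g. via `np.einsum('ir,jr,kr->ijk', A, B, C)`.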