Missing Spectrum-Data Recovery in Cognitive Radio Networks Using Piecewise Constant Nonnegative Matrix Factorization
In this paper, we propose a missing spectrum data recovery technique for
cognitive radio (CR) networks using Nonnegative Matrix Factorization (NMF). It
is shown that the spectrum measurements collected from secondary users (SUs)
can be factorized as the product of a channel-gain matrix and an activation
matrix. Then, an NMF method with piecewise constant activation coefficients is
introduced to analyze the measurements and estimate the missing spectrum data.
The proposed optimization problem is solved by a Majorization-Minimization
technique. Numerical simulations verify that the proposed method can
accurately estimate the missing spectrum data in the presence of noise and
fading.

Comment: 6 pages, 6 figures. Accepted for presentation at the MILCOM'15 conference.
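The recovery idea can be sketched as NMF restricted to the observed entries: factor the measurement matrix into a gain-like and an activation-like factor, then read the missing entries off the low-rank reconstruction. The sketch below uses plain Frobenius-loss multiplicative updates with a binary mask; the paper's piecewise-constant activation prior and its Majorization-Minimization solver are not reproduced here, and all names are illustrative:

```python
import numpy as np

def masked_nmf(Y, mask, rank, n_iter=500, eps=1e-9):
    """Low-rank NMF fitted only on observed entries of Y.

    Y    : nonnegative data matrix (missing entries may hold anything)
    mask : 1.0 where Y is observed, 0.0 where it is missing
    rank : target factorization rank
    """
    rng = np.random.default_rng(0)
    m, n = Y.shape
    W = rng.random((m, rank)) + eps   # gain-like factor
    H = rng.random((rank, n)) + eps   # activation-like factor
    for _ in range(n_iter):
        # Multiplicative updates, with the loss restricted by the mask
        WH = W @ H
        W *= ((mask * Y) @ H.T) / ((mask * WH) @ H.T + eps)
        WH = W @ H
        H *= (W.T @ (mask * Y)) / (W.T @ (mask * WH) + eps)
    return W, H

# Synthetic rank-3 nonnegative data with ~30% of entries missing
rng = np.random.default_rng(1)
Y = rng.random((20, 3)) @ rng.random((3, 30))
mask = (rng.random(Y.shape) > 0.3).astype(float)

W, H = masked_nmf(Y, mask, rank=3)
# Mean absolute error on the *missing* entries only
err = np.abs((W @ H - Y)[mask == 0]).mean()
```

Because the ground truth is exactly low rank, the reconstruction fills the masked entries accurately; with real fading and noise, as in the paper, the structured activation prior does the corresponding regularization work.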
Bi-Objective Nonnegative Matrix Factorization: Linear Versus Kernel-Based Models
Nonnegative matrix factorization (NMF) is a powerful class of feature
extraction techniques that has been successfully applied in many fields,
notably in signal and image processing. Current NMF techniques have been limited to a
single-objective problem in either its linear or nonlinear kernel-based
formulation. In this paper, we propose to revisit the NMF as a multi-objective
problem, in particular a bi-objective one, where the objective functions
defined in both input and feature spaces are taken into account. By taking
advantage of the weighted-sum method from the multi-objective optimization
literature, the proposed bi-objective NMF determines a set of nondominated,
Pareto optimal, solutions instead of a single optimal decomposition. Moreover,
the corresponding Pareto front is studied and approximated. Experimental
results on unmixing real hyperspectral images confirm the efficiency of the
proposed bi-objective NMF compared with state-of-the-art methods.
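The weighted-sum idea can be sketched in a few lines: a scalar weight blends an input-space objective with a feature-space one, and sweeping the weight traces nondominated trade-off points. The toy feature map phi(x) = x**2 and the projected-gradient solver below are illustrative stand-ins for the paper's kernel-based formulation, not its actual algorithm:

```python
import numpy as np

def weighted_sum_nmf(X, rank, alpha, n_iter=300, lr=1e-3):
    """Minimize alpha*f1 + (1-alpha)*f2 by projected gradient descent.

    f1 : squared error in the input (linear) space, ||X - WH||^2
    f2 : squared error after a toy feature map phi(x) = x**2
         (a stand-in for the kernel-space objective)
    """
    rng = np.random.default_rng(0)
    m, n = X.shape
    W, H = rng.random((m, rank)), rng.random((rank, n))
    for _ in range(n_iter):
        WH = W @ H
        R1 = WH - X                 # input-space residual
        R2 = WH ** 2 - X ** 2       # feature-space residual
        # Gradient of the scalarized objective w.r.t. WH
        G = 2 * alpha * R1 + 4 * (1 - alpha) * R2 * WH
        W = np.maximum(W - lr * (G @ H.T), 0)  # project onto W >= 0
        H = np.maximum(H - lr * (W.T @ G), 0)  # project onto H >= 0
    f1 = np.sum((X - W @ H) ** 2)
    f2 = np.sum((X ** 2 - (W @ H) ** 2) ** 2)
    return f1, f2

rng = np.random.default_rng(1)
X = rng.random((10, 3)) @ rng.random((3, 15))
# Each weight yields one candidate point; the sweep approximates the Pareto front
front = [weighted_sum_nmf(X, rank=3, alpha=a) for a in (0.1, 0.5, 0.9)]
```

Collecting the (f1, f2) pairs over the sweep gives the set of candidate trade-offs from which the nondominated ones approximate the Pareto front studied in the paper.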
Algorithms for Approximate Subtropical Matrix Factorization
Matrix factorization methods are important tools in data mining and analysis.
They can be used for many tasks, ranging from dimensionality reduction to
visualization. In this paper we concentrate on the use of matrix factorizations
for finding patterns from the data. Rather than using the standard algebra --
and the summation of the rank-1 components to build the approximation of the
original matrix -- we use the subtropical algebra, which is an algebra over the
nonnegative real values with the summation replaced by the maximum operator.
Subtropical matrix factorizations allow "winner-takes-it-all" interpretations
of the rank-1 components, revealing different structure than the normal
(nonnegative) factorizations. We study the complexity and sparsity of the
factorizations, and present a framework for finding low-rank subtropical
factorizations. We present two specific algorithms, called Capricorn and
Cancer, that are part of our framework. They can be used with data that has
been corrupted with different types of noise, and with different error metrics,
including the sum-of-absolute differences, Frobenius norm, and Jensen--Shannon
divergence. Our experiments show that the algorithms perform well on data that
has subtropical structure, and that they can find factorizations that are both
sparse and easy to interpret.

Comment: 40 pages, 9 figures. For the associated source code, see
http://people.mpi-inf.mpg.de/~pmiettin/tropical