
    Imposing early and asymptotic constraints on LiGME with application to nonconvex enhancement of fused lasso models

    For the constrained LiGME model, a nonconvexly regularized least-squares estimation model, we present, under its overall convexity condition, a new iterative algorithm with guaranteed convergence to a globally optimal solution. The proposed algorithm can handle two different types of constraints simultaneously: the asymptotic constraint, which must be satisfied by the limit point of the sequence produced by the algorithm, and the early constraint, which must be satisfied by every iterate of that sequence. We also propose a nonconvex and constrained enhancement of fused lasso models for sparse piecewise-constant signal estimation, possibly under nonzero baseline assumptions. With the two types of constraints, the proposed enhancement achieves robustness against possible model mismatch as well as higher estimation accuracy than conventional fused-lasso-type models. Comment: 5 pages, 7 figures
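    For orientation, a minimal sketch of the standard convex fused lasso model that the above work enhances (a textbook formulation, not taken from this paper): given observations $y \in \mathbb{R}^n$, a sparse piecewise-constant signal is estimated by

        $\hat{x} \in \arg\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\|y - x\|_2^2 + \lambda_1 \|x\|_1 + \lambda_2 \|Dx\|_1,$

    where $D$ is the first-order difference operator, $(Dx)_i = x_{i+1} - x_i$. In rough terms, the LiGME (linearly involved generalized-Moreau-enhanced) approach replaces the convex $\ell_1$ penalties with nonconvex enhanced penalties tuned so that the overall cost remains convex, and the constrained variant above additionally imposes the early and asymptotic constraints on the iterates.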

    Adaptive Localized Cayley Parametrization for Optimization over Stiefel Manifold

    We present an adaptive parametrization strategy for optimization problems over the Stiefel manifold that uses generalized Cayley transforms to exploit powerful Euclidean optimization algorithms efficiently. The generalized Cayley transform translates an open dense subset of the Stiefel manifold into a vector space, where the open dense subset is determined by a tunable parameter called a center point. Using the generalized Cayley transform, we recently proposed the naive Cayley parametrization, which reformulates the optimization problem over the Stiefel manifold as one over the vector space. Although this reformulation lets us transplant powerful Euclidean optimization algorithms, their convergence may become slow under a poor choice of center point. To avoid such slow convergence, in this paper we propose to adaptively estimate 'good' center points so that the reformulated problem can be solved faster. We also present a unified convergence analysis, in terms of the gradient, for cases where fairly standard Euclidean optimization algorithms are employed within the proposed adaptive parametrization strategy. Numerical experiments demonstrate that (i) the proposed strategy escapes the slow convergence observed with the naive Cayley parametrization strategy, and (ii) the proposed strategy outperforms the standard strategy that employs a retraction. Comment: 29 pages, 4 figures, 4 tables
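    As a concrete illustration (a minimal sketch of the classical, non-generalized Cayley transform, not the authors' implementation), the mechanism behind such parametrizations is that the Cayley transform maps skew-symmetric matrices, which form a vector space, to orthogonal matrices; the generalized version with a tunable center point shifts which open dense subset of the Stiefel manifold is covered.

        import numpy as np

        def cayley(A):
            # Classical Cayley transform: skew-symmetric A -> orthogonal Q.
            n = A.shape[0]
            I = np.eye(n)
            return np.linalg.solve(I - A, I + A)   # Q = (I - A)^{-1} (I + A)

        # Parametrize orthogonal matrices by the free entries of a skew-symmetric
        # matrix, so Euclidean optimizers can work on an unconstrained vector.
        rng = np.random.default_rng(0)
        B = rng.standard_normal((5, 5))
        A = (B - B.T) / 2                          # skew-symmetric parameter
        Q = cayley(A)
        print(np.allclose(Q.T @ Q, np.eye(5)))     # True: Q is orthogonal
        # A point of the Stiefel manifold St(p, n) is obtained by keeping the
        # first p columns of Q (here p = 3 is an arbitrary illustrative choice).
        U = Q[:, :3]
        print(np.allclose(U.T @ U, np.eye(3)))     # True: orthonormal columns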

    A Unified Framework for Solving a General Class of Nonconvexly Regularized Convex Models

    Recently, several nonconvex sparse regularizers that can preserve the convexity of the cost function have received increasing attention. This paper proposes a general class of such convexity-preserving (CP) regularizers, termed the partially smoothed difference-of-convex (pSDC) regularizer. The pSDC regularizer is formulated as a structured difference-of-convex (DC) function, where the landscape of the subtrahend function can be adjusted by a parameterized smoothing function so as to attain overall convexity. With proper building blocks, the pSDC regularizer reproduces existing CP regularizers and opens the way to a large number of promising new ones. With respect to the resultant nonconvexly regularized convex (NRC) model, we derive a series of overall-convexity conditions which naturally embrace the conditions in previous works. Moreover, we develop a unified framework based on DC programming for solving the NRC model. Compared to previously reported proximal-splitting-type approaches, the proposed framework makes less stringent assumptions. We establish the convergence of the proposed framework to a global minimizer. Numerical experiments demonstrate the power of the pSDC regularizers and the efficiency of the proposed DC algorithm. Comment: 15 pages, 6 figures, submitted to journal
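    For context, a minimal sketch of the classical difference-of-convex algorithm (DCA) that DC-programming frameworks of this kind build on (the standard iteration, not the paper's specific scheme): for a cost $f(x) = g(x) - h(x)$ with $g$ and $h$ convex, each iteration linearizes the concave part at the current point and solves the resulting convex subproblem,

        $x^{k+1} \in \arg\min_{x} \; g(x) - \langle u^k, x \rangle, \qquad u^k \in \partial h(x^k).$

    In the pSDC setting described above, the subtrahend is shaped by the parameterized smoothing function, and the overall-convexity conditions are what underpin the convergence to a global minimizer, rather than merely to a critical point, established in the abstract.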

    Robust Reduced-Rank Adaptive Processing Based on Parallel Subgradient Projection and Krylov Subspace Techniques

    In this paper, we propose a novel reduced-rank adaptive filtering algorithm by blending the idea of Krylov subspace methods with the set-theoretic adaptive filtering framework. Unlike existing Krylov-subspace-based reduced-rank methods, the proposed algorithm tracks the optimal point in the sense of minimizing the 'true' mean square error (MSE) in the Krylov subspace, even when the estimated statistics become erroneous (e.g., due to sudden changes in the environment). Therefore, compared with those existing methods, the proposed algorithm is better suited to adaptive filtering applications. The algorithm is analyzed based on a modified version of the adaptive projected subgradient method (APSM). Numerical examples demonstrate that the proposed algorithm enjoys better tracking performance than existing methods for the interference suppression problem in code-division multiple-access (CDMA) systems as well as for simple system identification problems. Comment: 10 figures. In IEEE Transactions on Signal Processing, 201
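    For orientation (a minimal sketch under standard assumptions, not the proposed set-theoretic algorithm itself), Krylov-subspace reduced-rank filtering constrains the filter to span{p, Rp, ..., R^{D-1}p}, where R is the input autocorrelation matrix and p the cross-correlation vector, and solves a small D-dimensional problem in that subspace:

        import numpy as np

        def krylov_reduced_rank_filter(R, p, D):
            # MMSE filter restricted to the D-dimensional Krylov subspace
            # span{p, Rp, ..., R^{D-1} p} (a standard reduced-rank construction).
            cols = [p]
            for _ in range(D - 1):
                cols.append(R @ cols[-1])
            S, _ = np.linalg.qr(np.column_stack(cols))   # orthonormal basis, N x D
            w_reduced = np.linalg.solve(S.T @ R @ S, S.T @ p)
            return S @ w_reduced                         # filter mapped back to R^N

        # Usage with synthetic statistics (illustrative values only).
        rng = np.random.default_rng(0)
        N, D = 16, 4
        A = rng.standard_normal((N, N))
        R = A @ A.T + np.eye(N)      # positive-definite autocorrelation matrix
        p = rng.standard_normal(N)   # cross-correlation vector
        w = krylov_reduced_rank_filter(R, p, D)

    The adaptive algorithm in the abstract differs in that the filter is updated online from streaming data via projections rather than computed from fixed estimates of R and p.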

    Adaptive Quadratic-Metric Parallel Subgradient Projection Algorithm and its Application to Acoustic Echo Cancellation

    Publication in the conference proceedings of EUSIPCO, Florence, Italy, 200