
    Globally Convergent Accelerated Algorithms for Multilinear Sparse Logistic Regression with ℓ0-constraints

    Tensor data are multidimensional arrays. Regression methods based on low-rank tensor decomposition exploit this structure to reduce the parameter count, and multilinear logistic regression is a powerful tool for analyzing multidimensional data. To improve its efficacy and interpretability, we present a Multilinear Sparse Logistic Regression model with ℓ0-constraints (ℓ0-MLSR). In contrast to the ℓ1-norm and ℓ2-norm, the ℓ0-norm constraint is better suited for feature selection; however, its nonconvexity and nonsmoothness make the problem challenging to solve, and convergence guarantees are lacking. The multilinear operations in ℓ0-MLSR introduce additional nonconvexity. To tackle these challenges, we propose an Accelerated Proximal Alternating Linearized Minimization method with Adaptive Momentum (APALM+) to solve the ℓ0-MLSR model. We prove that APALM+ ensures convergence of the ℓ0-MLSR objective function, show that APALM+ converges globally to a first-order critical point, and establish its convergence rate using the Kurdyka-Łojasiewicz property. Empirical results on synthetic and real-world datasets confirm that our algorithm outperforms other state-of-the-art methods in both accuracy and speed.

    Comment: arXiv admin note: text overlap with arXiv:2308.1212
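    The abstract's key ingredient, an ℓ0 constraint enforced through a proximal (hard-thresholding) step, can be illustrated with a minimal projected-gradient sketch for ordinary (non-multilinear) logistic regression. This is not the authors' APALM+ algorithm, which additionally alternates over tensor factors and uses adaptive momentum; every function name and parameter below is an illustrative assumption.

    ```python
    import numpy as np

    def hard_threshold(w, k):
        """Proximal map of the l0 constraint ||w||_0 <= k: keep the k
        largest-magnitude entries of w and zero out the rest."""
        out = np.zeros_like(w)
        if k > 0:
            idx = np.argpartition(np.abs(w), -k)[-k:]
            out[idx] = w[idx]
        return out

    def l0_logistic_regression(X, y, k, step=0.1, iters=500):
        """Projected gradient descent for l0-constrained logistic
        regression: minimize the logistic loss subject to ||w||_0 <= k.
        Labels y are assumed to lie in {0, 1}."""
        n, d = X.shape
        w = np.zeros(d)
        for _ in range(iters):
            p = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid predictions
            grad = X.T @ (p - y) / n           # gradient of logistic loss
            w = hard_threshold(w - step * grad, k)
        return w

    # Tiny synthetic check: only the first two of ten features carry signal.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    w_true = np.zeros(10)
    w_true[:2] = [3.0, -3.0]
    y = (X @ w_true > 0).astype(float)
    w_hat = l0_logistic_regression(X, y, k=2)
    print(np.nonzero(w_hat)[0])  # support of the recovered weight vector
    ```

    The hard-thresholding step is exactly why ℓ0 problems are nonconvex and nonsmooth: the feasible set is a union of subspaces, so standard convex convergence arguments do not apply, which is what motivates the Kurdyka-Łojasiewicz analysis in the paper.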