Globally Convergent Accelerated Algorithms for Multilinear Sparse Logistic Regression with ℓ₀-constraints
Tensor data are multidimensional arrays. Regression methods based on
low-rank tensor decomposition leverage structural information to reduce the
parameter count. Multilinear logistic regression serves as a powerful tool for
the analysis of multidimensional data. To improve its efficacy and
interpretability, we present a Multilinear Sparse Logistic Regression model
with ℓ₀-constraints (ℓ₀-MLSR). In contrast to the ℓ₁-norm and
ℓ₂-norm, the ℓ₀-norm constraint is better suited for feature
selection. However, due to its nonconvex and nonsmooth properties, the
resulting problem is challenging to solve and convergence guarantees are
lacking. Additionally, the multilinear operation in ℓ₀-MLSR also introduces
nonconvexity. To tackle
these challenges, we propose an Accelerated Proximal Alternating Linearized
Minimization with Adaptive Momentum (APALM) method to solve the
ℓ₀-MLSR model. We prove that APALM guarantees the
convergence of the objective function of ℓ₀-MLSR. We also show
that APALM converges globally to a first-order critical point and
establish its convergence rate using the Kurdyka-Łojasiewicz property.
Empirical results obtained from synthetic and real-world datasets validate the
superior performance of our algorithm in terms of both accuracy and speed
compared to other state-of-the-art methods.

Comment: arXiv admin note: text overlap with arXiv:2308.1212
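The abstract does not give the authors' APALM updates, but the core ingredients it names — an ℓ₀ (sparsity) constraint handled by a proximal/projection step, gradient linearization of the logistic loss, and a momentum term — can be illustrated with a minimal sketch. The code below is an assumption-laden toy: a hard-thresholding projection onto the ℓ₀ ball combined with a heavy-ball momentum step for plain (non-multilinear) sparse logistic regression, not the authors' algorithm or their adaptive momentum rule.

```python
import numpy as np

def hard_threshold(w, s):
    """Projection onto the l0 ball: keep the s largest-magnitude entries."""
    out = np.zeros_like(w)
    idx = np.argsort(np.abs(w))[-s:]
    out[idx] = w[idx]
    return out

def logistic_grad(w, X, y):
    """Gradient of the average logistic loss, labels y in {0, 1}."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

def l0_logistic_regression(X, y, s, step=0.5, beta=0.5, iters=300):
    """Projected gradient with a fixed momentum term (illustrative only;
    the paper's APALM uses alternating updates over tensor factors and an
    adaptive momentum parameter)."""
    w = np.zeros(X.shape[1])
    w_prev = w.copy()
    for _ in range(iters):
        v = w + beta * (w - w_prev)                       # momentum extrapolation
        w_prev = w
        w = hard_threshold(v - step * logistic_grad(v, X, y), s)
    return w

# Usage: recover a 3-sparse classifier from synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
w_true = np.zeros(20)
w_true[:3] = [3.0, -2.0, 2.0]
y = (X @ w_true + 0.1 * rng.normal(size=500) > 0).astype(float)
w_hat = l0_logistic_regression(X, y, s=3)
```

The ℓ₀ projection is what makes the subproblem nonconvex: unlike the ℓ₁ soft-thresholding proximal map, hard thresholding selects a discrete support, which is why the convergence analysis in the paper relies on the Kurdyka-Łojasiewicz property rather than convexity.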