How to deal with hyperparameter optimization in matrix decompositions: new existence theorems and novel stopping criteria

Abstract

Tuning hyperparameters in learning approaches is a pervasive and important problem that, by its nature, can affect real data analysis. At present, these hyperparameters are typically chosen via grid search or cross validation, and automatic tuning procedures are lacking, especially in unsupervised settings such as Dimensionality Reduction (DR). In this paper we address the Hyperparameter Optimization (HPO) problem applied to Nonnegative Matrix Factorization (NMF), a particular linear DR technique. Starting from the results of one of our earlier works, we reinforce the algorithmic strategy with a theoretical foundation. First, motivated by a bilevel formulation of the NMF, we state suitable conditions under which the existence of a minimizer in an infinite-dimensional Hilbert space follows from a more general existence result. Second, we address situations in which computing an exact minimizer is impossible or unnecessary: via Ekeland's Variational Principle, we derive a stopping criterion based on an "approximate" function. These abstract results are applied to our algorithm, and suggestions for other models are proposed as well.
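To make the setting concrete, the following is a minimal sketch of NMF with a simple relative-decrease stopping rule, together with the grid-search baseline over the rank hyperparameter that the paper aims to move beyond. This is an illustrative implementation using standard multiplicative updates; it does not reproduce the paper's Ekeland-based stopping criterion or its bilevel HPO algorithm.

```python
import numpy as np

def nmf(X, r, tol=1e-4, max_iter=500, seed=0):
    """Basic NMF via multiplicative updates, with a simple
    relative-decrease stopping criterion. Illustrative only: the
    paper's stopping rule, derived from Ekeland's Variational
    Principle, is not reproduced here."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)) + 1e-4   # nonnegative init
    H = rng.random((r, n)) + 1e-4
    prev = np.inf
    for _ in range(max_iter):
        # multiplicative updates preserve nonnegativity
        H *= (W.T @ X) / (W.T @ W @ H + 1e-10)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-10)
        err = np.linalg.norm(X - W @ H, "fro")
        # stop when the objective's relative decrease falls below tol
        if abs(prev - err) <= tol * max(prev, 1.0):
            break
        prev = err
    return W, H, err

# Crude hyperparameter search over the rank r: the grid-search
# baseline that motivates automatic tuning procedures.
X = np.abs(np.random.default_rng(1).random((20, 15)))
errors = {r: nmf(X, r)[2] for r in (2, 4, 6)}
```

In practice the stopping tolerance `tol` is itself a hyperparameter; the paper's contribution is precisely to ground such choices theoretically rather than fixing them ad hoc.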
