    PRISMA: PRoximal Iterative SMoothing Algorithm

    Motivated by learning problems including max-norm regularized matrix completion and clustering, robust PCA, and sparse inverse covariance selection, we propose a novel optimization algorithm for minimizing a convex objective which decomposes into three parts: a smooth part, a simple non-smooth Lipschitz part, and a simple non-smooth non-Lipschitz part. We use a time-variant smoothing strategy that yields a convergence guarantee requiring neither advance knowledge of the total number of iterations nor a bound on the domain.
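
    As a rough illustration of this three-part splitting, here is a minimal NumPy sketch of a PRISMA-style iteration: the Lipschitz term g is handled through its Moreau envelope with a decreasing smoothing parameter, and the non-Lipschitz term h through its proximity operator. The function names and the plain (non-accelerated) update are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def prisma(grad_f, L_f, prox_g, prox_h, x0, n_iter=500):
    """Sketch of a PRISMA-style iteration for min f(x) + g(x) + h(x):
    f smooth with L_f-Lipschitz gradient, g non-smooth Lipschitz
    (smoothed via its Moreau envelope), h non-smooth non-Lipschitz
    with an easy prox.  prox_g(x, t) must return the prox of t*g at x,
    and likewise for prox_h."""
    x = x0.copy()
    for k in range(1, n_iter + 1):
        beta = 1.0 / k                     # time-variant smoothing parameter
        step = 1.0 / (L_f + 1.0 / beta)    # step for the smoothed objective
        grad_g_beta = (x - prox_g(x, beta)) / beta  # gradient of Moreau envelope of g
        x = prox_h(x - step * (grad_f(x) + grad_g_beta), step)
    return x

# Toy use: least squares (f) + l1 penalty (g, smoothed) + nonnegativity (h).
A, b = np.random.randn(30, 10), np.random.randn(30)
x_hat = prisma(grad_f=lambda x: A.T @ (A @ x - b),
               L_f=np.linalg.norm(A, 2) ** 2,
               prox_g=lambda x, t: np.sign(x) * np.maximum(np.abs(x) - 0.1 * t, 0),
               prox_h=lambda x, t: np.maximum(x, 0),
               x0=np.zeros(10))
```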

    Efficient First Order Methods for Linear Composite Regularizers

    A wide class of regularization problems in machine learning and statistics employs a regularization term which is obtained by composing a simple convex function ω with a linear transformation. This setting includes Group Lasso methods, the Fused Lasso and other total variation methods, multi-task learning methods, and many more. In this paper, we present a general approach for computing the proximity operator of this class of regularizers, under the assumption that the proximity operator of the function ω is known in advance. Our approach builds on a recent line of research on optimal first-order optimization methods and uses fixed-point iterations for numerically computing the proximity operator. It is more general than current approaches and, as we show with numerical simulations, computationally more efficient than available first-order methods which do not achieve the optimal rate. In particular, our method outperforms state-of-the-art O(1/T) methods for the overlapping Group Lasso and matches optimal O(1/T^2) methods for the Fused Lasso and tree-structured Group Lasso.
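
    To make the fixed-point construction concrete, the sketch below computes the proximity operator of x ↦ λω(Bx) given only the prox of ω, using a dual fixed-point map derived from the Moreau identity. The function name, the step size c, and the fixed iteration count are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def prox_linear_composite(y, B, prox_omega, lam, n_iter=200):
    """Sketch: evaluate the prox of x -> lam * omega(B @ x) at y, assuming
    prox_omega(z, t) returns argmin_u t*omega(u) + ||u - z||**2 / 2.
    Optimality gives x = y - B.T @ w with w a fixed point of
        w -> z - c * prox_omega(z / c, lam / c),  z = w + c * B @ (y - B.T @ w),
    via the Moreau identity; c < 1 / ||B||_2**2 keeps the map averaged,
    so the Picard iterates converge."""
    c = 0.9 / np.linalg.norm(B, 2) ** 2
    w = np.zeros(B.shape[0])
    for _ in range(n_iter):
        z = w + c * B @ (y - B.T @ w)
        w = z - c * prox_omega(z / c, lam / c)
    return y - B.T @ w

# Example: 1-D total-variation (Fused-Lasso-type) prox, omega = l1, B = differences.
n = 50
y = np.concatenate([np.zeros(25), np.ones(25)]) + 0.1 * np.random.randn(n)
B = np.eye(n)[1:] - np.eye(n)[:-1]
soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - t, 0.0)
x = prox_linear_composite(y, B, soft, lam=0.5)
```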

    Sparse Prediction with the k-Support Norm

    We derive a novel norm that corresponds to the tightest convex relaxation of sparsity combined with an ℓ2 penalty. We show that this new k-support norm provides a tighter relaxation than the elastic net and is thus a good replacement for the Lasso or the elastic net in sparse prediction problems. Through the study of the k-support norm, we also bound the looseness of the elastic net, thus shedding new light on it and providing justification for its use.
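
    For reference, the k-support norm admits a closed form based on sorting the absolute values, and the sketch below (our own illustrative implementation, assuming 1 <= k <= dim) evaluates it directly from that formula.

```python
import numpy as np

def k_support_norm(w, k):
    """Sketch of the k-support norm: with z_1 >= ... >= z_d the sorted
    absolute values of w, find the unique r in {0, ..., k-1} such that
    z_{k-r-1} > (sum_{i >= k-r} z_i) / (r+1) >= z_{k-r}  (z_0 = +inf);
    the norm is sqrt(sum_{i < k-r} z_i**2 + (sum_{i >= k-r} z_i)**2 / (r+1))."""
    assert 1 <= k <= len(w)
    z = np.sort(np.abs(np.asarray(w, dtype=float)))[::-1]
    for r in range(k):
        tail = z[k - r - 1:].sum()                        # sum_{i >= k-r} z_i
        head = np.inf if k - r - 1 == 0 else z[k - r - 2]
        if head > tail / (r + 1) >= z[k - r - 1]:
            return np.sqrt((z[:k - r - 1] ** 2).sum() + tail ** 2 / (r + 1))

print(k_support_norm([3.0, 1.0, 0.5], k=1))  # equals the l1 norm: 4.5
print(k_support_norm([3.0, 1.0, 0.5], k=3))  # equals the l2 norm
```

    For k = 1 the norm reduces to the ℓ1 norm and for k = dim to the ℓ2 norm, which makes the interpolation between sparsity and the ℓ2 penalty apparent.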

    Sphingolipid metabolism products: potential new players in the pathogenesis of bortezomib-induced neuropathic pain

    Chemotherapy-induced peripheral neurotoxicity (CIPN) is one of the major dose-limiting adverse events of widely used drugs in both the oncologic and hematologic setting (1). Among its cardinal symptoms, neuropathic pain is frequently present (2). In particular, the incidence of bortezomib-induced peripheral neurotoxicity (BIPN) and of neuropathic pain in multiple myeloma patients ranges from 14–45% and 5–39%, respectively. BIPN develops more frequently in pretreated patients than in chemotherapy-naïve ones (3,4), and this difference largely accounts for the wide variability in the observed incidence rates. Bortezomib is the first proteasome inhibitor introduced into clinical practice. The mechanisms underlying the pathogenesis of peripheral neurotoxicity in bortezomib-treated patients are not yet fully elucidated (3,4).

    Convex relaxations of penalties for sparse correlated variables with bounded total variation

    We study the problem of statistical estimation with a signal known to be sparse, spatially contiguous, and containing many highly correlated variables. We take inspiration from the recently introduced k-support norm, which has been successfully applied to sparse prediction problems with correlated features, but lacks any explicit structural constraints commonly found in machine learning and image processing. We address this problem by incorporating a total variation penalty in the k-support framework. We introduce the (k, s) support total variation norm as the tightest convex relaxation of the intersection of a set of sparsity and total variation constraints. We show that this norm leads to an intractable combinatorial graph optimization problem, which we prove to be NP-hard. We then introduce a tractable relaxation with approximation guarantees that scale well for grid-structured graphs. We devise several first-order optimization strategies for statistical parameter estimation with the described penalty. We demonstrate the effectiveness of this penalty on classification in the low-sample regime, classification with M/EEG neuroimaging data, and background-subtracted image recovery with synthetic and real data. We extensively analyse the application of our penalty to the complex task of identifying predictive regions from low-sample high-dimensional fMRI brain data, and we show that, compared to existing methods, our method is particularly useful in terms of accuracy, interpretability, and stability.
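
    As a sketch of the first-order template such penalized estimators plug into, here is a generic accelerated proximal-gradient (FISTA-style) loop; prox_penalty is a placeholder assumption standing in for a prox (or approximate prox) of the (k, s) support total variation relaxation, which the paper computes with its own solvers.

```python
import numpy as np

def fista(grad_f, L, prox_penalty, x0, n_iter=300):
    """Generic accelerated proximal-gradient loop (a sketch, not the
    paper's exact solver).  prox_penalty(v, t) must return the proximity
    operator of t * penalty at v; L is a Lipschitz constant of grad_f."""
    x, z, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x_new = prox_penalty(z - grad_f(z) / L, 1.0 / L)  # proximal gradient step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2          # momentum schedule
        z = x_new + (t - 1) / t_new * (x_new - x)         # extrapolation
        x, t = x_new, t_new
    return x

# Example with a squared loss and a soft-threshold placeholder for the penalty prox.
X, y = np.random.randn(40, 100), np.random.randn(40)
w = fista(grad_f=lambda v: X.T @ (X @ v - y),
          L=np.linalg.norm(X, 2) ** 2,
          prox_penalty=lambda v, t: np.sign(v) * np.maximum(np.abs(v) - 0.1 * t, 0),
          x0=np.zeros(100))
```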

    Inconclusive evidence to support the use of minimally invasive radiofrequency denervation for chronic low back pain

    Low back pain (LBP), defined as localized pain or discomfort between the costal margins and the superior gluteal line, with or without associated lower limb pain, is one of the most commonly encountered pain syndromes in adults. It is considered chronic LBP (CLBP) when pain persists for more than three months (1). CLBP can be disabling, increasing the hours of productive work and personal activity that are lost, and it can also be associated with a significant excess in healthcare costs (2). CLBP also commonly contributes to the onset or exacerbation of various psychiatric disorders, such as depression and/or anxiety (3).