392 research outputs found

    Quadratic Growth Conditions for Convex Matrix Optimization Problems Associated with Spectral Functions

    Get PDF
    In this paper, we provide two types of sufficient conditions for ensuring the quadratic growth conditions of a class of constrained convex symmetric and non-symmetric matrix optimization problems regularized by nonsmooth spectral functions. These sufficient conditions are derived via the study of the $\mathcal{C}^2$-cone reducibility of spectral functions and the metric subregularity of their subdifferentials, respectively. As an application, we demonstrate how quadratic growth conditions are used to guarantee the desirable fast convergence rates of the augmented Lagrangian method (ALM) for solving convex matrix optimization problems. Numerical experiments on an easy-to-implement ALM applied to the fastest mixing Markov chain problem are also presented to illustrate the significance of the obtained results.
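    To make the application concrete, below is a minimal sketch of the fastest mixing Markov chain problem posed as a convex matrix optimization problem. It uses cvxpy with its bundled solver rather than the paper's ALM, and the 4-cycle graph and all variable names are illustrative assumptions, not taken from the paper.

```python
import cvxpy as cp
import numpy as np

# Fastest mixing Markov chain (Boyd, Diaconis, Xiao): over symmetric
# stochastic matrices P supported on a given graph, minimize the second
# largest eigenvalue modulus (SLEM), i.e. ||P - (1/n) 11^T||_2.
# The 4-cycle graph below is an illustrative assumption.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4

P = cp.Variable((n, n), symmetric=True)
uniform = np.ones((n, n)) / n

constraints = [P >= 0, cp.sum(P, axis=1) == 1]
# Forbid transitions between distinct non-adjacent vertices.
adjacent = {frozenset(e) for e in edges}
for i in range(n):
    for j in range(i + 1, n):
        if frozenset((i, j)) not in adjacent:
            constraints.append(P[i, j] == 0)

# For symmetric stochastic P, the spectral norm of P - (1/n) 11^T
# equals the SLEM of P.
problem = cp.Problem(cp.Minimize(cp.norm(P - uniform, 2)), constraints)
problem.solve()
print("optimal SLEM:", problem.value)
```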

    SOX Genes and Cancer

    Get PDF
    Transcription factors play a critical role in regulating the gene expression programs that establish and maintain specific cell states in humans. Deregulation of these gene expression programs can lead to a broad range of diseases, including cancer. SOX transcription factors are a conserved group of transcriptional regulators that mediate DNA binding through a highly conserved high-mobility group (HMG) domain. A growing body of evidence has recently demonstrated that SOX transcription factors critically control cell fate and differentiation in major developmental processes, and that their upregulation may be important for cancer progression. In this review, we discuss recent advances in our understanding of the role of SOX genes in cancer.

    3DFill: Reference-guided Image Inpainting by Self-supervised 3D Image Alignment

    Full text link
    Most existing image inpainting algorithms are based on a single view and struggle with large holes or holes containing complicated scenes. Some reference-guided algorithms fill the hole by referring to an image from another viewpoint and rely on 2D image alignment. Owing to the camera imaging process, a simple 2D transformation can hardly achieve a satisfactory result. In this paper, we propose 3DFill, a simple and efficient method for reference-guided image inpainting. Given a target image with arbitrary hole regions and a reference image from another viewpoint, 3DFill first aligns the two images by a two-stage method, 3D projection followed by a 2D transformation, which yields better results than 2D image alignment alone. The 3D projection is an overall alignment between the images, and the 2D transformation is a local alignment focused on the hole region. The entire alignment process is self-supervised. We then fill the hole in the target image with the contents of the aligned image. Finally, we use a conditional generation network to refine the filled image and obtain the inpainting result. 3DFill achieves state-of-the-art performance on image inpainting across a variety of wide viewpoint shifts and has a faster inference speed than other inpainting models.
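    For intuition, here is a classical stand-in for the align-then-composite idea, not the paper's learned pipeline: a RANSAC homography with ORB features (via OpenCV) plays the role of the 2D alignment stage, and the warped reference is pasted into the hole. The function name and the naive compositing are assumptions for illustration; the paper's 3D projection stage (depth and camera pose) and its refinement network are omitted.

```python
import cv2
import numpy as np

def align_and_fill(target, reference, hole_mask):
    """Warp `reference` onto `target` with a RANSAC homography and paste
    the warped content into the hole. A classical 2D stand-in for the
    learned 3D-projection + 2D-transformation alignment in 3DFill."""
    orb = cv2.ORB_create(1000)
    k1, d1 = orb.detectAndCompute(target, None)
    k2, d2 = orb.detectAndCompute(reference, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    matches = sorted(matches, key=lambda m: m.distance)[:200]

    dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    h, w = target.shape[:2]
    warped = cv2.warpPerspective(reference, H, (w, h))
    filled = target.copy()
    filled[hole_mask > 0] = warped[hole_mask > 0]  # naive compositing
    return filled
```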

    Strong Variational Sufficiency for Nonlinear Semidefinite Programming and its Implications

    Full text link
    Strong variational sufficiency is a newly proposed property that turns out to be of great use in the convergence analysis of multiplier methods. However, what this property implies for non-polyhedral problems remains a puzzle. In this paper, we prove the equivalence between strong variational sufficiency and the strong second order sufficient condition (SOSC) for nonlinear semidefinite programming (NLSDP), without requiring uniqueness of the multiplier or any other constraint qualification. Based on this characterization, the local convergence property of the augmented Lagrangian method (ALM) for NLSDP can be established under the strong SOSC in the absence of constraint qualifications. Moreover, under the strong SOSC, we can apply the semismooth Newton method to solve the ALM subproblems of NLSDP, since the generalized Hessian of the augmented Lagrangian function is positive definite.
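    The semismooth structure in NLSDP enters through the metric projection onto the PSD cone, the nonsmooth map whose generalized Jacobian the semismooth Newton method differentiates. Below is a minimal numpy sketch of that projection with a quick sanity check; the function name is an assumption for illustration, and this is not the paper's algorithm.

```python
import numpy as np

def proj_psd(X):
    """Metric projection of a symmetric matrix onto the PSD cone S^n_+,
    computed by clipping negative eigenvalues. This map is the nonsmooth
    building block of NLSDP augmented Lagrangians; semismooth Newton
    methods work with its (Clarke) generalized Jacobian."""
    w, Q = np.linalg.eigh((X + X.T) / 2)   # symmetrize for safety
    return (Q * np.maximum(w, 0.0)) @ Q.T

# Quick check: the projection is idempotent and its output is PSD.
A = np.random.randn(5, 5)
P = proj_psd(A)
assert np.allclose(P, proj_psd(P))
assert np.all(np.linalg.eigvalsh(P) >= -1e-10)
```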

    Accelerating preconditioned ADMM via degenerate proximal point mappings

    Full text link
    In this paper, we aim to accelerate a preconditioned alternating direction method of multipliers (pADMM), whose proximal terms are convex quadratic functions, for solving linearly constrained convex optimization problems. To achieve this, we first reformulate the pADMM as a proximal point method (PPM) with a positive semidefinite preconditioner, which can be degenerate due to the lack of strong convexity of the proximal terms in the pADMM. We then accelerate the pADMM by accelerating the reformulated degenerate PPM (dPPM). Specifically, we first propose an accelerated dPPM by integrating the Halpern iteration and the fast Krasnosel'skiĭ-Mann iteration into it, achieving asymptotic $o(1/k)$ and non-asymptotic $O(1/k)$ convergence rates. Subsequently, building upon the accelerated dPPM, we develop an accelerated pADMM algorithm that exhibits both asymptotic $o(1/k)$ and non-asymptotic $O(1/k)$ nonergodic convergence rates with respect to the Karush-Kuhn-Tucker residual and the primal objective function value gap. Preliminary numerical experiments validate the theoretical findings, demonstrating that the accelerated pADMM outperforms the pADMM in solving convex quadratic programming problems.
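    To illustrate the Halpern mechanism in isolation, here is a generic numpy sketch of the Halpern iteration on a toy nonexpansive operator, not the paper's pADMM; the function name and the averaged-rotation example are assumptions for illustration.

```python
import numpy as np

def halpern(T, x0, iters=500):
    """Halpern iteration x_{k+1} = (1/(k+2)) x0 + (1 - 1/(k+2)) T(x_k)
    for a nonexpansive operator T. The anchoring toward x0 is what yields
    an O(1/k) rate on the fixed-point residual ||x_k - T(x_k)||."""
    x = x0
    for k in range(iters):
        lam = 1.0 / (k + 2)
        x = lam * x0 + (1.0 - lam) * T(x)
    return x

# Toy nonexpansive map: averaging a plane rotation with the identity.
# Its unique fixed point is the origin.
theta = 0.5
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
T = lambda x: 0.5 * (x + R @ x)

x = halpern(T, np.array([1.0, 1.0]))
print("fixed-point residual:", np.linalg.norm(x - T(x)))
```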