
    Statistical properties of the method of regularization with periodic Gaussian reproducing kernel

    The method of regularization with the Gaussian reproducing kernel is popular in the machine learning literature and successful in many practical applications. In this paper we consider the periodic version of Gaussian kernel regularization. We show, in the white noise model setting, that in function spaces of very smooth functions, such as the infinite-order Sobolev space and the space of analytic functions, the method under consideration is asymptotically minimax; in finite-order Sobolev spaces, the method is rate optimal, and its efficiency in terms of the constant, when compared with the minimax estimator, is reasonably high. The smoothing parameters in periodic Gaussian regularization can be chosen adaptively without loss of asymptotic efficiency. The results derived in this paper give a partial explanation of the success of the Gaussian reproducing kernel in practice. Simulations are carried out to study the finite-sample properties of periodic Gaussian regularization.
    Comment: Published by the Institute of Mathematical Statistics (http://www.imstat.org) in the Annals of Statistics (http://www.imstat.org/aos/) at http://dx.doi.org/10.1214/00905360400000045
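    The regularization method the abstract describes can be sketched as kernel ridge regression with a periodized Gaussian kernel. This is a minimal illustration, not the paper's construction: the wrapped-sum form of the kernel, the function `periodic_gauss_kernel`, and the values of `sigma` and `lam` are all assumptions made for the example.

```python
import numpy as np

def periodic_gauss_kernel(x, y, sigma=0.1, wraps=3):
    # Periodize the Gaussian kernel on [0, 1) by summing shifted copies
    # (a truncated wrapped-Gaussian approximation, chosen for illustration).
    d = x[:, None] - y[None, :]
    return sum(np.exp(-(d + m) ** 2 / (2 * sigma ** 2))
               for m in range(-wraps, wraps + 1))

rng = np.random.default_rng(0)
n = 100
x = np.sort(rng.uniform(0, 1, n))
f_true = np.sin(2 * np.pi * x) + 0.5 * np.cos(4 * np.pi * x)  # smooth periodic target
y = f_true + 0.2 * rng.normal(size=n)                          # noisy observations

lam = 1e-3                                   # smoothing (regularization) parameter
K = periodic_gauss_kernel(x, x)
alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
f_hat = K @ alpha                            # regularized fit at the design points

mse = np.mean((f_hat - f_true) ** 2)
```

    In practice the smoothing parameter `lam` (and the kernel scale) would be chosen adaptively, e.g. by cross-validation, which the abstract notes can be done without loss of asymptotic efficiency.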

    Optimal estimation of the mean function based on discretely sampled functional data: Phase transition

    The problem of estimating the mean of random functions based on discretely sampled data arises naturally in functional data analysis. In this paper, we study optimal estimation of the mean function under both common and independent designs. Minimax rates of convergence are established and easily implementable rate-optimal estimators are introduced. The analysis reveals interesting and different phase transition phenomena in the two cases. Under the common design, the sampling frequency solely determines the optimal rate of convergence when it is relatively small, and the sampling frequency has no effect on the optimal rate when it is large. On the other hand, under the independent design, the optimal rate of convergence is determined jointly by the sampling frequency and the number of curves when the sampling frequency is relatively small. When it is large, the sampling frequency has no effect on the optimal rate. Another interesting contrast between the two settings is that smoothing is necessary under the independent design, while, somewhat surprisingly, it is not essential under the common design.
    Comment: Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org) at http://dx.doi.org/10.1214/11-AOS898
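    Under the common design, the simplest rate-optimal estimator is the pointwise average of the observed curves, with no smoothing, matching the abstract's remark that smoothing is not essential in that case. A small synthetic sketch (the curve model and all names here are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
N, m = 200, 20                        # number of curves, common sampling frequency
t = np.linspace(0, 1, m)
mu = np.sin(2 * np.pi * t)            # true mean function

# Common design: every curve is observed at the same grid points t.
curves = (mu
          + rng.normal(scale=0.5, size=(N, 1)) * np.cos(2 * np.pi * t)  # random curve-level component
          + 0.1 * rng.normal(size=(N, m)))                              # measurement noise

mu_hat = curves.mean(axis=0)          # pointwise average: no smoothing step
err = np.max(np.abs(mu_hat - mu))
```

    Under the independent design, where each curve is sampled at its own random locations, the analogous estimator would pool all (location, value) pairs and smooth them, e.g. with a local linear or kernel smoother.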

    Minimax optimization of entanglement witness operator for the quantification of three-qubit mixed-state entanglement

    We develop a numerical approach for quantifying entanglement in mixed quantum states by convex-roof entanglement measures, based on the optimal entanglement witness operator and the minimax optimization method. Our approach is applicable to general entanglement measures and states, and is an efficient alternative to the conventional approach based on the optimal pure-state decomposition. Compared with the conventional one, it has two important merits: (i) the global optimality of the solution is quantitatively verifiable, and (ii) the optimization is considerably simplified by exploiting the common symmetry of the target state and measure. To demonstrate these merits, we quantify Greenberger-Horne-Zeilinger (GHZ) entanglement in a class of three-qubit full-rank mixed states composed of the GHZ state, the W state, and white noise, the simplest mixtures of states with different genuine multipartite entanglement, which had not been quantified before this work. We discuss some general properties of the form of the optimal witness operator and of the convex structure of mixed states, which are related to the symmetry and the rank of the states.
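    To make the witness idea concrete: the standard fidelity-based GHZ witness W = I/2 - |GHZ><GHZ| flags entanglement whenever Tr(W rho) < 0. This is only an unoptimized textbook witness, not the optimal witness operator the paper constructs; the GHZ/white-noise mixture below is likewise a simplified special case of the states it studies.

```python
import numpy as np

# Three-qubit GHZ state (|000> + |111>) / sqrt(2) in the computational basis.
ghz = np.zeros(8)
ghz[0] = ghz[7] = 1 / np.sqrt(2)
proj = np.outer(ghz, ghz)

# Fidelity-based witness: negative expectation value certifies entanglement
# for states with GHZ fidelity above 1/2.
W = 0.5 * np.eye(8) - proj

def noisy_ghz(p):
    # GHZ state mixed with white noise (maximally mixed state).
    return p * proj + (1 - p) * np.eye(8) / 8

val_low = np.trace(W @ noisy_ghz(0.2)).real   # Tr(W rho) = 3/8 - 7p/8 = 0.2
val_high = np.trace(W @ noisy_ghz(0.8)).real  # = -0.325, entanglement witnessed
```

    For this mixture, Tr(W rho) = 3/8 - 7p/8, so the witness detects entanglement only for p > 3/7; optimizing the witness operator, as the paper does, tightens such thresholds and yields the entanglement measure itself.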