
    GLOBAL RATES OF CONVERGENCE IN LOG-CONCAVE DENSITY ESTIMATION

    The estimation of a log-concave density on R^d represents a central problem in the area of nonparametric inference under shape constraints. In this paper, we study the performance of log-concave density estimators with respect to global loss functions, and adopt a minimax approach. We first show that no statistical procedure based on a sample of size n can estimate a log-concave density with respect to the squared Hellinger loss function with supremum risk smaller than order n^{-4/5} when d=1, and order n^{-2/(d+1)} when d≥2. In particular, this reveals a sense in which, when d≥3, log-concave density estimation is fundamentally more challenging than the estimation of a density with two bounded derivatives (a problem to which it has been compared). Second, we show that for d≤3, the Hellinger ε-bracketing entropy of a class of log-concave densities with small mean and covariance matrix close to the identity grows like max{ε^{-d/2}, ε^{-(d-1)}} (up to a logarithmic factor when d=2). This enables us to prove that when d≤3 the log-concave maximum likelihood estimator achieves the minimax optimal rate (up to logarithmic factors when d=2,3) with respect to squared Hellinger loss. The research of Richard J. Samworth was supported by an EPSRC Early Career Fellowship and a grant from the Leverhulme Trust.
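The comparison in the abstract can be made concrete by tabulating the rate exponents. A small sketch, assuming the standard squared-Hellinger rate n^{-4/(d+4)} for densities with two bounded derivatives (the classical β=2 Hölder rate, not stated in the abstract itself): the log-concave lower-bound exponent matches it for d=1,2 but falls strictly below it for d≥3.

```python
# Compare rate exponents: the abstract's minimax lower bound for log-concave
# density estimation (n^{-4/5} for d = 1, n^{-2/(d+1)} for d >= 2) against
# the classical rate n^{-4/(d+4)} for densities with two bounded derivatives.
# The latter exponent is an assumption from standard nonparametric theory.
from fractions import Fraction


def log_concave_exponent(d):
    """Exponent r in the minimax lower bound n^{-r} for log-concave densities."""
    return Fraction(4, 5) if d == 1 else Fraction(2, d + 1)


def two_derivatives_exponent(d):
    """Exponent for estimating a density with two bounded derivatives (beta = 2)."""
    return Fraction(4, d + 4)


for d in range(1, 6):
    lc, sm = log_concave_exponent(d), two_derivatives_exponent(d)
    # lc < sm means log-concave estimation has the slower (harder) rate
    print(f"d={d}: log-concave {lc}, two-derivatives {sm}, harder={lc < sm}")
```

For d=1 and d=2 the exponents coincide (4/5 and 2/3 respectively), while for d=3 the log-concave exponent 1/2 is strictly smaller than 4/7, matching the abstract's claim that the problem becomes fundamentally harder from dimension three onwards.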

    ADAPTATION IN MULTIVARIATE LOG-CONCAVE DENSITY ESTIMATION

    We study the adaptation properties of the multivariate log-concave maximum likelihood estimator over three subclasses of log-concave densities. The first consists of densities with polyhedral support whose logarithms are piecewise affine. The complexity of such densities f can be measured in terms of the sum Γ(f) of the numbers of facets of the subdomains in the polyhedral subdivision of the support induced by f. Given n independent observations from a d-dimensional log-concave density with d ∈ {2,3}, we prove a sharp oracle inequality, which in particular implies that the Kullback-Leibler risk of the log-concave maximum likelihood estimator for such densities is bounded above by Γ(f)/n, up to a polylogarithmic factor. Thus, the rate can be essentially parametric, even in this multivariate setting. For the second type of adaptation, we consider densities that are bounded away from zero on a polytopal support; we show that up to polylogarithmic factors, the log-concave maximum likelihood estimator attains the rate n^{-4/7} when d=3, which is faster than the worst-case rate of n^{-1/2}. Finally, our third type of subclass consists of densities whose contours are well-separated; these new classes are constructed to be affine invariant and turn out to contain a wide variety of densities, including those that satisfy Hölder regularity conditions. Here, we prove another sharp oracle inequality, which reveals in particular that the log-concave maximum likelihood estimator attains a risk bound of order n^{-min((β+3)/(β+7), 4/7)} when d=3 over the class of β-Hölder log-concave densities with β ∈ (1,3], again up to a polylogarithmic factor. EPSRC Fellowship EP/P031447/1; Leverhulme Trust grant RG8176.
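The Hölder-class exponent min((β+3)/(β+7), 4/7) stated in the abstract can be evaluated directly. A minimal sketch (the helper name `holder_exponent` is my own, not from the paper): the smoothness-driven term (β+3)/(β+7) is increasing in β, and the cap 4/7 binds once (β+3)/(β+7) ≥ 4/7, i.e. for β ≥ 7/3.

```python
# Evaluate the d = 3 risk-bound exponent from the abstract:
# n^{-min((beta+3)/(beta+7), 4/7)} for beta-Hoelder log-concave densities,
# beta in (1, 3], up to polylogarithmic factors.
from fractions import Fraction


def holder_exponent(beta):
    """min((beta+3)/(beta+7), 4/7); beta may be an int or a Fraction."""
    return min(Fraction(beta + 3, beta + 7), Fraction(4, 7))


for beta in (Fraction(3, 2), 2, Fraction(7, 3), 3):
    # below beta = 7/3 the smoothness term is the minimum; above it, 4/7 caps
    print(f"beta={beta}: exponent {holder_exponent(beta)}")
```

So smoother densities (larger β) are estimated faster only up to β = 7/3; beyond that the rate saturates at the n^{-4/7} rate also attained on the bounded-away-from-zero polytopal class.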