
    Estimation of a $k$-monotone density: limit distribution theory and the spline connection

    We study the asymptotic behavior of the Maximum Likelihood and Least Squares Estimators of a $k$-monotone density $g_0$ at a fixed point $x_0$ when $k > 2$. We find that the $j$th derivative of the estimators at $x_0$ converges at the rate $n^{-(k-j)/(2k+1)}$ for $j = 0, \ldots, k-1$. The limiting distribution depends on an almost surely uniquely defined stochastic process $H_k$ that stays above (below) the $k$-fold integral of Brownian motion plus a deterministic drift when $k$ is even (odd). Both the MLE and LSE are known to be splines of degree $k-1$ with simple knots. Establishing the order of the random gap $\tau_n^+ - \tau_n^-$, where $\tau_n^{\pm}$ denote two successive knots, is a key ingredient of the proof of the main results. We show that this "gap problem" can be solved if a conjecture about the upper bound on the error in a particular Hermite interpolation via odd-degree splines holds. Comment: Published in the Annals of Statistics (http://www.imstat.org/aos/), http://dx.doi.org/10.1214/009053607000000262, by the Institute of Mathematical Statistics (http://www.imstat.org).

    Chernoff's density is log-concave

    We show that the density of $Z = \operatorname{argmax}\{W(t) - t^2\}$, sometimes known as Chernoff's density, is log-concave. We conjecture that Chernoff's density is strongly log-concave or "super-Gaussian", and provide evidence in support of the conjecture. Comment: Published in Bernoulli (http://isi.cbs.nl/bernoulli/), http://dx.doi.org/10.3150/12-BEJ483, by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
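
    As a quick numerical illustration (not from the paper), the sketch below approximates draws of $Z = \operatorname{argmax}\{W(t) - t^2\}$ by simulating a two-sided standard Brownian motion on a truncated grid and locating the maximizer of the drifted path; the truncation $T$, the step size, and the number of replications are arbitrary illustrative choices.

        # Monte Carlo sketch of Chernoff's distribution: Z = argmax_t { W(t) - t^2 },
        # with W a two-sided standard Brownian motion, W(0) = 0.
        import numpy as np

        rng = np.random.default_rng(0)

        def draw_argmax(T=3.0, dt=1e-3):
            t = np.arange(dt, T + dt, dt)
            # Independent Brownian paths on the positive and negative half-lines.
            right = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=t.size))
            left = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=t.size))
            grid = np.concatenate([-t[::-1], [0.0], t])
            W = np.concatenate([left[::-1], [0.0], right])
            return grid[np.argmax(W - grid ** 2)]

        samples = np.array([draw_argmax() for _ in range(2000)])
        print("sample mean (should be near 0):", samples.mean(), " sample sd:", samples.std())

    A histogram of these draws approximates Chernoff's density; the log-concavity established in the paper concerns the exact density, not this Monte Carlo approximation.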

    A Kiefer–Wolfowitz theorem for convex densities

    Kiefer and Wolfowitz [Z. Wahrsch. Verw. Gebiete 34 (1976) 73–85] showed that if $F$ is a strictly curved concave distribution function (corresponding to a strictly monotone density $f$), then the Maximum Likelihood Estimator $\hat{F}_n$, which is, in fact, the least concave majorant of the empirical distribution function $\mathbb{F}_n$, differs from the empirical distribution function in the uniform norm by no more than a constant times $(n^{-1}\log n)^{2/3}$ almost surely. We review their result and give an updated version of their proof. We prove a comparable theorem for the class of distribution functions $F$ with convex decreasing densities $f$, but with the maximum likelihood estimator $\hat{F}_n$ of $F$ replaced by the least squares estimator $\widetilde{F}_n$: if $X_1, \ldots, X_n$ are sampled from a distribution function $F$ with strictly convex density $f$, then the least squares estimator $\widetilde{F}_n$ of $F$ and the empirical distribution function $\mathbb{F}_n$ differ in the uniform norm by no more than a constant times $(n^{-1}\log n)^{3/5}$ almost surely. The proofs rely on bounds on the interpolation error for complete spline interpolation due to Hall [J. Approximation Theory 1 (1968) 209–218] and Hall and Meyer [J. Approximation Theory 16 (1976) 105–122], building on earlier work by Birkhoff and de Boor [J. Math. Mech. 13 (1964) 827–835]. These results, which are crucial for the developments here, are all nicely summarized and exposited in de Boor [A Practical Guide to Splines (2001) Springer, New York]. Comment: Published in the IMS Lecture Notes Monograph Series (http://www.imstat.org/publications/lecnotes.htm), http://dx.doi.org/10.1214/074921707000000256, by the Institute of Mathematical Statistics (http://www.imstat.org).
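
    The uniform-norm bound in the first result is easy to probe numerically. The sketch below (an illustration, not the paper's construction) computes the least concave majorant of the empirical distribution function for a sample from a strictly decreasing density and compares its distance to $\mathbb{F}_n$ at the sample points with the $(n^{-1}\log n)^{2/3}$ benchmark; the Exp(1) example and the sample size are arbitrary choices.

        # Least concave majorant (LCM) of the empirical cdf F_n for a sample from a
        # strictly decreasing density on [0, infinity), and its distance to F_n.
        import numpy as np

        def lcm_at_data(x, y):
            """Least concave majorant of the points (x_i, y_i), x increasing, at the x_i."""
            hull = [0]
            for i in range(1, len(x)):
                # Remove the last hull point while concavity is violated
                # (the slopes of the majorant must be decreasing).
                while len(hull) >= 2:
                    a, b = hull[-2], hull[-1]
                    if (y[i] - y[b]) * (x[b] - x[a]) >= (y[b] - y[a]) * (x[i] - x[b]):
                        hull.pop()
                    else:
                        break
                hull.append(i)
            # Piecewise-linear hull evaluated back at every x_i.
            return np.interp(x, x[hull], y[hull])

        rng = np.random.default_rng(1)
        n = 500
        xs = np.sort(rng.exponential(size=n))      # Exp(1): strictly decreasing density
        grid = np.concatenate([[0.0], xs])         # F_n(0) = 0
        Fn = np.concatenate([[0.0], np.arange(1, n + 1) / n])
        Fhat = lcm_at_data(grid, Fn)               # MLE = LCM of the empirical cdf
        print("max |F_hat - F_n| at data points :", np.max(np.abs(Fhat - Fn)))
        print("(n^{-1} log n)^{2/3} benchmark   :", (np.log(n) / n) ** (2 / 3))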

    Testing monotonicity via local least concave majorants

    We propose a new testing procedure for detecting localized departures from monotonicity of a signal embedded in white noise. In fact, we perform simultaneously several tests that aim at detecting departures from concavity for the integrated signal over various intervals of different sizes and localizations. Each of these local tests relies on estimating the distance between the restriction of the integrated signal to some interval and its least concave majorant. Our test can be easily implemented and is proved to achieve the optimal uniform separation rate simultaneously for a wide range of Hölderian alternatives. Moreover, we show how this test can be extended to a Gaussian regression framework with unknown variance. A simulation study confirms the good performance of our procedure in practice. Comment: Published in Bernoulli (http://isi.cbs.nl/bernoulli/), http://dx.doi.org/10.3150/12-BEJ496, by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
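
    To make one such local test concrete, the sketch below (illustrative assumptions throughout, not the paper's implementation) forms the partial sums of the observations on a candidate interval, computes their least concave majorant with the same kind of helper as in the previous sketch, and takes the sup distance as the local statistic; a localized increase in an otherwise non-increasing signal visibly inflates it relative to a monotone signal. The signal shapes, noise level, grid size, and interval are all made up for illustration.

        # One local statistic: distance between the partial sums of the observations
        # on a candidate interval (a discrete integral of the signal) and their LCM.
        import numpy as np

        def lcm(x, y):
            """Least concave majorant of the points (x_i, y_i), x increasing, at the x_i."""
            hull = [0]
            for i in range(1, len(x)):
                while len(hull) >= 2:
                    a, b = hull[-2], hull[-1]
                    if (y[i] - y[b]) * (x[b] - x[a]) >= (y[b] - y[a]) * (x[i] - x[b]):
                        hull.pop()
                    else:
                        break
                hull.append(i)
            return np.interp(x, x[hull], y[hull])

        def local_stat(obs, lo, hi):
            """Sup distance between the partial sums of obs on [lo, hi) and their LCM."""
            s = np.concatenate([[0.0], np.cumsum(obs[lo:hi])])
            t = np.arange(s.size, dtype=float)
            return np.max(lcm(t, s) - s)

        rng = np.random.default_rng(2)
        n = 400
        t = np.linspace(0.0, 1.0, n)
        noise = 0.3 * rng.normal(size=n)
        monotone = 1.0 - t                                         # non-increasing signal
        bumped = monotone + 0.8 * np.exp(-200.0 * (t - 0.5) ** 2)  # localized departure

        print("local statistic, monotone signal:", local_stat(monotone + noise, 150, 250))
        print("local statistic, bumped signal  :", local_stat(bumped + noise, 150, 250))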

    Least Squares estimation of two ordered monotone regression curves

    In this paper, we consider the problem of finding the Least Squares estimators of two isotonic regression curves $g^\circ_1$ and $g^\circ_2$ under the additional constraint that they are ordered, e.g., $g^\circ_1 \le g^\circ_2$. Given two sets of $n$ data points $y_1, \ldots, y_n$ and $z_1, \ldots, z_n$ observed at (the same) design points, the estimates of the true curves are obtained by minimizing the weighted Least Squares criterion $L_2(a, b) = \sum_{j=1}^n (y_j - a_j)^2 w_{1,j} + \sum_{j=1}^n (z_j - b_j)^2 w_{2,j}$ over the class of pairs of vectors $(a, b) \in \mathbb{R}^n \times \mathbb{R}^n$ such that $a_1 \le a_2 \le \ldots \le a_n$, $b_1 \le b_2 \le \ldots \le b_n$, and $a_i \le b_i$, $i = 1, \ldots, n$. The characterization of the estimators is established. To compute these estimators, we use an iterative projected subgradient algorithm, where the projection is performed with a "generalized" pool-adjacent-violators algorithm (PAVA), a byproduct of this work. Then, we apply the estimation method to real data from mechanical engineering. Comment: 23 pages, 2 figures. Second revised version according to reviewer comments.
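
    For reference, the classical weighted pool-adjacent-violators step, i.e. the weighted least squares projection onto the cone of non-decreasing vectors, can be sketched as follows. This is only the standard building block; the paper's "generalized" PAVA and the projected subgradient iteration that enforce the additional cross constraint $a_i \le b_i$ are not reproduced here.

        # Weighted pool-adjacent-violators algorithm (PAVA): projects y onto
        # {a : a_1 <= ... <= a_n} under the weighted least squares criterion.
        import numpy as np

        def pava(y, w):
            # Each block stores [weighted mean, total weight, number of points pooled].
            blocks = []
            for yi, wi in zip(y, w):
                blocks.append([yi, wi, 1])
                # Pool adjacent blocks while the monotonicity constraint is violated.
                while len(blocks) >= 2 and blocks[-2][0] > blocks[-1][0]:
                    m2, w2, n2 = blocks.pop()
                    m1, w1, n1 = blocks.pop()
                    wt = w1 + w2
                    blocks.append([(w1 * m1 + w2 * m2) / wt, wt, n1 + n2])
            # Expand the pooled block means back to a fitted vector of full length.
            return np.concatenate([np.full(n, m) for m, _, n in blocks])

        y = np.array([1.0, 3.0, 2.0, 5.0, 4.0, 4.5])
        w = np.ones_like(y)
        print(pava(y, w))   # [1.  2.5 2.5 4.5 4.5 4.5]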

    Limit distribution theory for maximum likelihood estimation of a log-concave density

    We find limiting distributions of the nonparametric maximum likelihood estimator (MLE) of a log-concave density, that is, a density of the form $f_0 = \exp\varphi_0$ where $\varphi_0$ is a concave function on $\mathbb{R}$. The pointwise limiting distributions depend on the second and third derivatives at 0 of $H_k$, the "lower invelope" of an integrated Brownian motion process minus a drift term depending on the number of vanishing derivatives of $\varphi_0 = \log f_0$ at the point of interest. We also establish the limiting distribution of the resulting estimator of the mode $M(f_0)$, and establish a new local asymptotic minimax lower bound which shows the optimality of our mode estimator in terms of both rate of convergence and dependence of constants on population values. Comment: Published in the Annals of Statistics (http://www.imstat.org/aos/), http://dx.doi.org/10.1214/08-AOS609, by the Institute of Mathematical Statistics (http://www.imstat.org).