
    On Quadratic g-Evaluations/Expectations and Related Analysis

    In this paper we extend the notion of g-evaluation, in particular g-expectation, to the case where the generator g is allowed to have quadratic growth. We show that some important properties of g-expectations, including a representation theorem between the generator and the corresponding g-expectation, and consequently the reverse comparison theorem of quadratic BSDEs as well as the Jensen inequality, remain true in the quadratic case. Our main results also include a Doob-Meyer type decomposition, the optional sampling theorem, and the up-crossing inequality. The results of this paper are important for the further development of general quadratic nonlinear expectations. Comment: 27 pages.
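
    For context, here is a minimal sketch of the standard definitions behind these terms (textbook background assumed here, not quoted from the paper): the g-expectation of a terminal value ξ is the time-0 value of the backward SDE driven by the generator g, and "quadratic growth" refers to the growth of g in its z-argument.

    % Standard setup assumed for illustration (not reproduced from the paper):
    % the g-expectation of \xi is Y_0, where (Y, Z) solves the BSDE
    \[
      Y_t = \xi + \int_t^T g(s, Y_s, Z_s)\,ds - \int_t^T Z_s\,dB_s,
      \qquad \mathcal{E}_g[\xi] := Y_0 .
    \]
    % Quadratic growth of the generator means, for some constants \alpha, \beta, \gamma \ge 0,
    \[
      |g(t, y, z)| \le \alpha + \beta\,|y| + \tfrac{\gamma}{2}\,|z|^2 .
    \]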

    Optimal classification in sparse Gaussian graphic model

    Consider a two-class classification problem where the number of features is much larger than the sample size. The features are masked by Gaussian noise with mean zero and covariance matrix Σ, where the precision matrix Ω = Σ^{-1} is unknown but presumably sparse. The useful features, also unknown, are sparse and each contributes weakly (i.e., rare and weak) to the classification decision. By obtaining a reasonably good estimate of Ω, we formulate the setting as a linear regression model. We propose a two-stage classification method where we first select features by the method of Innovated Thresholding (IT), and then use the retained features and Fisher's LDA for classification. In this approach, a crucial problem is how to set the threshold of IT. We approach this problem by adapting the recent innovation of Higher Criticism Thresholding (HCT). We find that when useful features are rare and weak, the limiting behavior of HCT is essentially just as good as the limiting behavior of the ideal threshold, the threshold one would choose if the underlying distribution of the signals were known (if only). Somewhat surprisingly, when Ω is sufficiently sparse, its off-diagonal coordinates usually do not have a major influence on the classification decision. Compared to recent work in the case where Ω is the identity matrix [Proc. Natl. Acad. Sci. USA 105 (2008) 14790-14795; Philos. Trans. R. Soc. Lond. Ser. A Math. Phys. Eng. Sci. 367 (2009) 4449-4470], the current setting is much more general, which calls for a new approach and much more sophisticated analysis. One key component of the analysis is the intimate relationship between HCT and Fisher's separation. Another key component is the tight large-deviation bounds for empirical processes for data with unconventional correlation structures, where graph theory on vertex coloring plays an important role. Comment: Published at http://dx.doi.org/10.1214/13-AOS1163 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
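
    A rough Python sketch of the two-stage recipe described above, under simplifying assumptions (balanced classes, standardized features); the graphical-lasso estimate of Ω, its regularization level, and the z-score scaling are illustrative guesses rather than the authors' exact procedure.

    # Illustrative sketch: (1) estimate a sparse precision matrix Omega,
    # (2) Innovated Thresholding with a Higher-Criticism (HC) threshold,
    # (3) Fisher's LDA on the retained features.  Simplified for exposition.
    import numpy as np
    from scipy.stats import norm
    from sklearn.covariance import GraphicalLasso                  # sparse Omega estimate (an assumption)
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def hc_threshold(z, alpha0=0.5):
        """Return the |z|-value at which the HC objective peaks."""
        p = 2.0 * norm.sf(np.abs(z))                               # two-sided p-values
        order = np.argsort(p)
        p_sorted = p[order]
        n = len(p)
        k = np.arange(1, n + 1)
        hc = np.sqrt(n) * (k / n - p_sorted) / np.sqrt(p_sorted * (1.0 - p_sorted) + 1e-12)
        in_range = k <= max(1, int(alpha0 * n))                    # search only the smallest p-values
        j_star = int(np.argmax(np.where(in_range, hc, -np.inf)))
        return np.abs(z[order][j_star])

    def fit_it_hct_lda(X, y, glasso_alpha=0.1):
        """Two-stage fit: innovated z-scores, HC threshold for selection, then LDA."""
        n = X.shape[0]
        omega = GraphicalLasso(alpha=glasso_alpha).fit(X).precision_   # estimate of Omega
        mu_diff = X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0)
        z = np.sqrt(n / 4.0) * (omega @ mu_diff)                   # innovated z-scores (rough scaling)
        keep = np.abs(z) >= hc_threshold(z)                        # Innovated Thresholding at the HC threshold
        lda = LinearDiscriminantAnalysis().fit(X[:, keep], y)
        return lda, keep

    # Toy usage:
    # rng = np.random.default_rng(0)
    # X = rng.normal(size=(200, 60)); y = rng.integers(0, 2, size=200); X[y == 1, :5] += 0.8
    # clf, keep = fit_it_hct_lda(X, y)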

    Representation Theorems for Quadratic F-Consistent Nonlinear Expectations

    In this paper we extend the notion of "filtration-consistent nonlinear expectation" (or "F-consistent nonlinear expectation") to the case where it is allowed to be dominated by a g-expectation that may have quadratic growth. We show that for such a nonlinear expectation many fundamental properties of a martingale still make sense, including the Doob-Meyer type decomposition theorem and the optional sampling theorem. More importantly, we show that any quadratic F-consistent nonlinear expectation with a certain domination property must be a quadratic g-expectation. The main contribution of this paper is finding the domination condition that replaces the one used in all previous works, which is no longer valid in the quadratic case. We also show that the representation generator must be deterministic, continuous, and in fact of a simple form.
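
    For background on the terminology, here is a sketch of the classical axioms of a filtration-consistent nonlinear expectation together with the classical (Lipschitz-case) domination condition that, per the abstract, has to be replaced in the quadratic setting; these are standard definitions assumed for illustration, not the paper's new condition.

    % Classical axioms, assumed here as background (the paper's quadratic-case
    % domination condition is not reproduced):
    \[
      \text{(i) monotonicity: } \xi_1 \ge \xi_2 \ \Rightarrow\ \mathcal{E}[\xi_1] \ge \mathcal{E}[\xi_2];
      \qquad
      \text{(ii) constancy: } \mathcal{E}[c] = c;
    \]
    \[
      \text{(iii) filtration consistency: } \mathcal{E}\bigl[\xi\,\mathbf{1}_A\bigr]
      = \mathcal{E}\bigl[\mathcal{E}[\xi \,|\, \mathcal{F}_t]\,\mathbf{1}_A\bigr]
      \quad \text{for all } t \in [0, T],\ A \in \mathcal{F}_t .
    \]
    % Classical domination by a Lipschitz g-expectation (the condition replaced in the quadratic case):
    \[
      \mathcal{E}[\xi_1 \,|\, \mathcal{F}_t] - \mathcal{E}[\xi_2 \,|\, \mathcal{F}_t]
      \le \mathcal{E}_{g_\mu}[\xi_1 - \xi_2 \,|\, \mathcal{F}_t],
      \qquad g_\mu(t, z) = \mu\,|z| \ \text{ for some } \mu > 0 .
    \]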