172 research outputs found

    Contributions to Nonparametric Predictive Inference for Bernoulli Data with Applications in Finance

    Imprecise probability is a generalization of precise probability theory with many advantages for uncertainty quantification. Many statistical methodologies have been developed within the imprecise probability framework, one of which is nonparametric predictive inference (NPI). NPI has been developed for various data types and has many successful applications in different fields. This thesis first further develops NPI for Bernoulli data to address two current challenges: the computation of the imprecise expectation of a general function of observations at multiple future stages, and the handling of imprecise Bernoulli data. For the former, we introduce the concept of the mass function from Weichselberger's axiomatization of imprecise probability theory and Dempster-Shafer's notion of basic probability assignment. Based on the mass function, an algorithm is proposed to find the imprecise expectation of a general function of a finite random variable. We then construct mass functions for single and multiple future stages of observations in NPI for Bernoulli data via its underlying latent variable representation, which makes the proposed algorithm applicable to NPI for Bernoulli data. For the latter, we extend the original NPI path-counting method in its underlying lattice representation, which leads to the mass function and imprecise probabilities of NPI for imprecise Bernoulli data. The properties of NPI for imprecise Bernoulli data are illustrated with a numerical example. Subsequently, under the binomial tree model, NPI for precise and imprecise Bernoulli data is applied to asset and European option trading, and NPI for Bernoulli data is applied to portfolio assessment. The performance of both applications is evaluated via simulation. The predictive nature and noise-recognition ability of NPI for precise and imprecise Bernoulli data are validated, and the viability of applying NPI to portfolio assessment is confirmed.
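    The abstract derives imprecise expectations from a mass function (basic probability assignment). As a point of reference only, here is a minimal Python sketch of that general idea, using the standard Dempster-Shafer-style bounds (each focal set's mass times the min/max of the function over that set) and the commonly cited NPI bounds s/(n+1) and (s+1)/(n+1) for the next Bernoulli observation; the function names and the toy mass function are illustrative assumptions, not the thesis's algorithm.

```python
def lower_upper_expectation(mass, f):
    """Lower/upper expectation of f from a mass function: each focal set A
    (a tuple of outcomes) carries mass m(A); the bounds sum m(A) times the
    min/max of f over A. Illustrative sketch, not the thesis's algorithm."""
    lower = sum(m * min(f(x) for x in A) for A, m in mass.items())
    upper = sum(m * max(f(x) for x in A) for A, m in mass.items())
    return lower, upper


def npi_bernoulli_next(n, s):
    """Assumed standard NPI bounds for 'the next observation is a success'
    after s successes in n Bernoulli observations."""
    return s / (n + 1), (s + 1) / (n + 1)


# Toy mass function on outcomes {0, 1, 2}; the focal set (1, 2) is imprecise.
mass = {(0,): 0.2, (1,): 0.3, (1, 2): 0.5}
print(lower_upper_expectation(mass, lambda x: x))   # (0.8, 1.3)
print(npi_bernoulli_next(n=10, s=6))                # (0.5454..., 0.6363...)
```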

    Locality Preserving Projections for Grassmann manifold

    Learning on the Grassmann manifold has become popular in many computer vision tasks, thanks to its strong capability to extract discriminative information from image sets and videos. However, such learning algorithms, particularly on high-dimensional Grassmann manifolds, involve significantly high computational cost, which seriously limits their applicability in wider areas. In this research, we propose an unsupervised dimensionality reduction algorithm on the Grassmann manifold based on the Locality Preserving Projections (LPP) criterion. LPP is a commonly used dimensionality reduction algorithm for vector-valued data, aiming to preserve the local structure of the data in the dimension-reduced space. The strategy is to construct a mapping from a higher-dimensional Grassmann manifold to a relatively low-dimensional one with more discriminative capability. The proposed method can be optimized as a basic eigenvalue problem. Its performance is assessed on several classification and clustering tasks, and the experimental results show clear advantages over other Grassmann-based algorithms.
    Comment: Accepted by IJCAI 201
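    The abstract states the method reduces to a basic eigenvalue problem; for context, here is a minimal sketch of the standard vector-valued LPP it builds on (heat-kernel affinity graph, graph Laplacian, generalized eigenproblem X L X^T a = lambda X D X^T a). This does not reproduce the paper's Grassmann extension; the neighbourhood size k, kernel width t, and ridge term are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def lpp(X, dim, k=5, t=1.0):
    """Standard (vector-valued) LPP sketch: build a heat-kernel affinity W
    over k nearest neighbours, form the graph Laplacian L = D - W, and solve
    the generalized eigenproblem X L X^T a = lam X D X^T a, keeping the
    eigenvectors with the smallest eigenvalues as the projection."""
    n = X.shape[1]                                   # X is (features, samples)
    d2 = cdist(X.T, X.T, "sqeuclidean")
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]            # k nearest neighbours (skip self)
        W[i, nbrs] = np.exp(-d2[i, nbrs] / t)
    W = np.maximum(W, W.T)                           # symmetrize the affinity graph
    D = np.diag(W.sum(axis=1))
    L = D - W
    A = X @ L @ X.T
    B = X @ D @ X.T + 1e-8 * np.eye(X.shape[0])      # small ridge for stability
    vals, vecs = eigh(A, B)                          # eigenvalues in ascending order
    return vecs[:, :dim]                             # (features, dim) projection matrix

# Usage: project 50 samples of 20-dimensional data down to 3 dimensions.
X = np.random.randn(20, 50)
P = lpp(X, dim=3)
Y = P.T @ X                                          # (3, 50) low-dimensional embedding
```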