
    Continuity of the martingale optimal transport problem on the real line

    We show continuity of the martingale optimal transport optimisation problem as a functional of its marginals. This is achieved via an estimate, in the nested/causal Wasserstein distance, of the projection of an arbitrary coupling onto the set of martingale couplings with the same marginals. As a corollary we obtain an independent proof of sufficiency of the monotonicity principle established in [Beiglboeck, M., & Juillet, N. (2016). On a problem of optimal transport under marginal martingale constraints. Ann. Probab., 44, no. 1, 42-106] for cost functions of polynomial growth.
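
    For orientation, the optimisation problem in question can be written out as follows (the notation below is ours, not the paper's): with marginals $\mu,\nu$ on $\mathbb{R}$ in convex order and a cost $c$ of polynomial growth,
    \[
    V(\mu,\nu) \;=\; \inf_{\pi \in \mathcal{M}(\mu,\nu)} \int c(x,y)\,\pi(dx,dy),
    \qquad
    \mathcal{M}(\mu,\nu) \;=\; \Bigl\{\pi \in \Pi(\mu,\nu) \,:\, \textstyle\int y\,\pi_x(dy) = x \ \ \mu\text{-a.s.}\Bigr\},
    \]
    where $\Pi(\mu,\nu)$ is the set of couplings of $\mu$ and $\nu$ and $x \mapsto \pi_x$ is the disintegration of $\pi$ over its first coordinate. The continuity claim is that $(\mu,\nu) \mapsto V(\mu,\nu)$ is continuous in the marginals, and the key step is an estimate, in the nested/causal Wasserstein distance, of how far an arbitrary $\pi \in \Pi(\mu,\nu)$ is from the set $\mathcal{M}(\mu,\nu)$.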

    Covariance Estimation in Elliptical Models with Convex Structure

    We address structured covariance estimation in elliptical distributions. We assume it is a priori known that the covariance belongs to a given convex set, e.g., the set of Toeplitz or banded matrices. We consider the Generalized Method of Moments (GMM) optimization subject to these convex constraints. Unfortunately, GMM is still non-convex due to its objective. Instead, we propose COCA - a convex relaxation which can be efficiently solved. We prove that the relaxation is tight in the unconstrained case for a finite number of samples, and in the constrained case asymptotically. We then illustrate the advantages of COCA in synthetic simulations with structured compound Gaussian distributions. In these examples, COCA outperforms competing methods such as Tyler's estimator and its projection onto a convex set.
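
    As a point of reference for the kind of convex structure sets mentioned above, here is a minimal numpy sketch of the simplest baseline the abstract compares against: projecting a sample covariance onto the set of symmetric Toeplitz matrices. The COCA relaxation itself is not reproduced; the function names and toy data are our own assumptions.

        import numpy as np

        def sample_covariance(X):
            """Plain sample covariance of centered data X (n samples x p dimensions)."""
            return X.T @ X / X.shape[0]

        def project_toeplitz(S):
            """Frobenius projection of a symmetric matrix onto symmetric Toeplitz
            matrices: average the entries along each diagonal."""
            p = S.shape[0]
            T = np.zeros_like(S)
            for k in range(p):
                d = np.diagonal(S, offset=k).mean()
                T += d * np.eye(p, k=k)
                if k > 0:
                    T += d * np.eye(p, k=-k)
            return T

        # Toy usage: data drawn from a Toeplitz (AR(1)-like) covariance.
        rng = np.random.default_rng(0)
        p, n = 8, 50
        true_cov = 0.7 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
        X = rng.multivariate_normal(np.zeros(p), true_cov, size=n)
        S = sample_covariance(X)
        S_toep = project_toeplitz(S)
        print(np.linalg.norm(S - true_cov), np.linalg.norm(S_toep - true_cov))

    Diagonal averaging is the exact Frobenius projection onto the Toeplitz subspace but need not preserve positive definiteness; it is shown only to illustrate what "projection onto a convex set" refers to.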

    Joint Covariance Estimation with Mutual Linear Structure

    We consider the problem of joint estimation of structured covariance matrices. Assuming the structure is unknown, estimation is achieved using heterogeneous training sets. Namely, given groups of measurements coming from centered populations with different covariances, our aim is to determine the mutual structure of these covariance matrices and estimate them. Supposing that the covariances span a low dimensional affine subspace in the space of symmetric matrices, we develop a new efficient algorithm that discovers the structure and uses it to improve the estimation. Our technique is based on the application of principal component analysis in the matrix space. We also derive an upper performance bound for the proposed algorithm in the Gaussian scenario and compare it with the Cramér-Rao lower bound. Numerical simulations are presented to illustrate the performance benefits of the proposed method.
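
    A minimal numpy sketch of the idea of principal component analysis in matrix space (the function names, dimensions and toy data below are our assumptions, not the paper's algorithm): vectorize the per-group sample covariances, recover a low-dimensional affine subspace by PCA, and project each estimate onto it.

        import numpy as np

        def vec_sym(S):
            """Stack the upper triangle (including diagonal) of a symmetric matrix."""
            return S[np.triu_indices(S.shape[0])]

        def unvec_sym(v, p):
            """Inverse of vec_sym: rebuild the symmetric matrix."""
            S = np.zeros((p, p))
            S[np.triu_indices(p)] = v
            return S + np.triu(S, 1).T

        def joint_structured_estimates(covs, dim):
            """Project per-group sample covariances onto the affine subspace
            (mean plus `dim` principal directions) found by PCA in matrix space."""
            V = np.array([vec_sym(S) for S in covs])        # K x p(p+1)/2
            mean = V.mean(axis=0)
            _, _, Wt = np.linalg.svd(V - mean, full_matrices=False)
            basis = Wt[:dim]                                # principal directions
            proj = mean + (V - mean) @ basis.T @ basis      # projection onto subspace
            p = covs[0].shape[0]
            return [unvec_sym(v, p) for v in proj]

        # Toy usage: K groups whose covariances share a 2-dimensional structure.
        rng = np.random.default_rng(1)
        p, K, n = 6, 5, 40
        B = [np.eye(p), np.diag(np.arange(1, p + 1) / p)]
        covs = []
        for _ in range(K):
            a, b = rng.uniform(0.5, 1.5, size=2)
            X = rng.multivariate_normal(np.zeros(p), a * B[0] + b * B[1], size=n)
            covs.append(X.T @ X / n)
        improved = joint_structured_estimates(covs, dim=2)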

    Compressed matched filter for non-Gaussian noise

    We consider estimation of a deterministic unknown parameter vector in a linear model with non-Gaussian noise. In the Gaussian case, dimensionality reduction via a linear matched filter provides a simple low dimensional sufficient statistic which can be easily communicated and/or stored for future inference. Such a statistic is usually unknown in the general non-Gaussian case. Instead, we propose a hybrid matched filter coupled with a randomized compressed sensing procedure, which together create a low dimensional statistic. We also derive a complementary algorithm for robust reconstruction given this statistic. Our recovery method is based on the fast iterative shrinkage-thresholding algorithm (FISTA), which is used for outlier rejection given the compressed data. We demonstrate the advantages of the proposed framework using synthetic simulations.
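
    A hedged numpy sketch of the overall pipeline as we read it (the sparse-outlier model, variable names and step size below are our assumptions, not the paper's exact procedure): form a low-dimensional statistic by randomly compressing the measurements, then alternate a least-squares update of the parameter with an ISTA-style soft-thresholding update of the outliers.

        import numpy as np

        def soft_threshold(v, t):
            """Elementwise soft-thresholding, the proximal map of the l1 norm."""
            return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

        def robust_recovery(Ay, A, H, lam=0.05, n_iter=200):
            """Estimate x from the compressed statistic Ay = A(Hx + o + n),
            alternating least squares in x with ISTA steps on the sparse outliers o."""
            m, n = A.shape
            AH = A @ H
            x = np.linalg.lstsq(AH, Ay, rcond=None)[0]
            o = np.zeros(n)
            step = 1.0 / np.linalg.norm(A, 2) ** 2          # <= 1/Lipschitz for the o-subproblem
            for _ in range(n_iter):
                r = Ay - AH @ x - A @ o
                o = soft_threshold(o + step * (A.T @ r), step * lam)   # shrinkage step on o
                x = np.linalg.lstsq(AH, Ay - A @ o, rcond=None)[0]     # exact LS in x
            return x, o

        # Toy usage: a few gross outliers on top of Gaussian noise.
        rng = np.random.default_rng(2)
        n, k, m = 200, 5, 60
        H = rng.standard_normal((n, k))
        x_true = rng.standard_normal(k)
        o_true = np.zeros(n)
        o_true[rng.choice(n, 8, replace=False)] = 10.0
        y = H @ x_true + o_true + 0.1 * rng.standard_normal(n)
        A = rng.standard_normal((m, n)) / np.sqrt(m)        # random compression
        x_hat, _ = robust_recovery(A @ y, A, H)
        print(np.linalg.norm(x_hat - x_true))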

    Robust estimation of superhedging prices

    We consider statistical estimation of superhedging prices using historical stock returns in a frictionless market with d traded assets. We introduce a plugin estimator based on empirical measures and show it is consistent but lacks suitable robustness. To address this we propose novel estimators which use a larger set of martingale measures defined through a tradeoff between the radius of Wasserstein balls around the empirical measure and the allowed norm of martingale densities. We establish consistency and robustness of these estimators and argue that they offer superior performance relative to the plugin estimator. We generalise the results by replacing the superhedging criterion with acceptance relative to a risk measure. We further extend our study, in part, to the case of markets with traded options, to a multiperiod setting and to settings with model uncertainty. We also study convergence rates of estimators and convergence of superhedging strategies.
    Comment: This work will appear in the Annals of Statistics. The above version merges the main paper to appear in print and its online supplement.
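
    Schematically, and in our own notation rather than the paper's, the plugin estimator replaces the unknown return distribution by the empirical measure $\hat\mu_n$ of the observed returns: for a payoff $g$,
    \[
    \hat\pi_n(g) \;=\; \sup_{Q \in \mathcal{M}(\hat\mu_n)} \mathbb{E}_Q[g],
    \]
    i.e. the superhedging price computed as if $\hat\mu_n$ were the true law, with $\mathcal{M}(\hat\mu_n)$ the corresponding set of martingale (pricing) measures. The robustified estimators described above instead take the supremum over a larger set: martingale measures built from laws within a Wasserstein ball of radius $\delta$ around $\hat\mu_n$, with the norm of the martingale density bounded by a level traded off against $\delta$; enlarging the feasible set in this controlled way is what restores statistical robustness.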

    Group Symmetry and non-Gaussian Covariance Estimation

    We consider robust covariance estimation with group symmetry constraints. Non-Gaussian covariance estimation methods, e.g., Tyler's scatter estimator and Multivariate Generalized Gaussian distribution methods, usually involve non-convex minimization problems. Recently, it was shown that the underlying principle behind their success is an extended form of convexity over the geodesics in the manifold of positive definite matrices. A modern approach to improve estimation accuracy is to exploit prior knowledge via additional constraints, e.g., restricting attention to specific classes of covariances which adhere to prior symmetry structures. In this paper, we prove that such group symmetry constraints are also geodesically convex and can therefore be incorporated into various non-Gaussian covariance estimators. Practical examples of such sets include circulant, persymmetric and complex/quaternion proper structures. We provide a simple numerical technique for finding maximum likelihood estimates under such constraints, and demonstrate their performance advantage using synthetic experiments.
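
    For concreteness, a short numpy sketch of two of the symmetry classes named above (illustrative only; the paper's maximum likelihood technique is not reproduced, and the toy example is ours): the Frobenius projections onto persymmetric and symmetric circulant matrices.

        import numpy as np

        def project_persymmetric(R):
            """Project onto persymmetric matrices, i.e. R = J R^T J with J the
            exchange (anti-identity) matrix; the projection averages R and J R^T J."""
            J = np.fliplr(np.eye(R.shape[0]))
            return 0.5 * (R + J @ R.T @ J)

        def project_circulant(R):
            """Project onto symmetric circulant matrices by averaging each
            cyclic diagonal of R and then symmetrizing."""
            p = R.shape[0]
            C = np.zeros_like(R)
            for k in range(p):
                idx = (np.arange(p), (np.arange(p) + k) % p)   # k-th cyclic diagonal
                C[idx] = R[idx].mean()
            return 0.5 * (C + C.T)

        # Toy usage: symmetrize a noisy scatter estimate that is known a priori
        # to be circulant (e.g., covariance of a stationary periodic process).
        rng = np.random.default_rng(3)
        p = 6
        base = np.array([1.0, 0.5, 0.2, 0.1, 0.2, 0.5])
        true_R = np.array([np.roll(base, i) for i in range(p)])
        noisy = true_R + 0.1 * rng.standard_normal((p, p))
        print(np.linalg.norm(noisy - true_R),
              np.linalg.norm(project_circulant(noisy) - true_R))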

    Tyler's Covariance Matrix Estimator in Elliptical Models with Convex Structure

    We address structured covariance estimation in elliptical distributions by assuming that the covariance is a priori known to belong to a given convex set, e.g., the set of Toeplitz or banded matrices. We consider the Generalized Method of Moments (GMM) optimization applied to Tyler's robust scatter M-estimator subject to these convex constraints. Unfortunately, GMM turns out to be non-convex due to its objective. Instead, we propose a new COCA estimator - a convex relaxation which can be efficiently solved. We prove that the relaxation is tight in the unconstrained case for a finite number of samples, and in the constrained case asymptotically. We then illustrate the advantages of COCA in synthetic simulations with structured compound Gaussian distributions. In these examples, COCA outperforms competing methods such as Tyler's estimator and its projection onto the structure set.
    Comment: arXiv admin note: text overlap with arXiv:1311.059
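
    For reference, a compact numpy sketch of Tyler's fixed-point iteration, the robust scatter estimator the abstract starts from (the stopping rule, trace normalization and toy compound Gaussian data are our own choices; the COCA relaxation itself is not reproduced here).

        import numpy as np

        def tyler_estimator(X, n_iter=100, tol=1e-8):
            """Tyler's scatter M-estimator via the fixed-point iteration
            R <- (p/n) * sum_i x_i x_i^T / (x_i^T R^{-1} x_i), trace-normalized."""
            n, p = X.shape
            R = np.eye(p)
            for _ in range(n_iter):
                Rinv = np.linalg.inv(R)
                w = 1.0 / np.einsum('ij,jk,ik->i', X, Rinv, X)   # 1 / (x_i^T R^{-1} x_i)
                R_new = (p / n) * (X * w[:, None]).T @ X
                R_new *= p / np.trace(R_new)                     # fix the arbitrary scale
                if np.linalg.norm(R_new - R, 'fro') < tol:
                    return R_new
                R = R_new
            return R

        # Toy usage on heavy-tailed compound Gaussian data with Toeplitz structure.
        rng = np.random.default_rng(4)
        p, n = 8, 200
        true_R = 0.8 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
        tau = rng.gamma(1.0, 1.0, size=n)                        # random texture
        X = np.sqrt(tau)[:, None] * rng.multivariate_normal(np.zeros(p), true_R, size=n)
        R_tyler = tyler_estimator(X)
        print(np.linalg.norm(R_tyler - true_R))

    The "projection onto the structure set" baseline mentioned in the last sentence would then, e.g., average the diagonals of R_tyler to obtain the nearest Toeplitz matrix, as in the earlier sketch.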