
    Spatial Sign Correlation

    A new robust correlation estimator based on the spatial sign covariance matrix (SSCM) is proposed. We derive its asymptotic distribution and influence function at elliptical distributions. Finite-sample and robustness properties are studied and compared to other robust correlation estimators by means of numerical simulations. Comment: 20 pages, 7 figures, 2 tables
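
    For readers who want to experiment, the following is a minimal Python/NumPy sketch of a bivariate SSCM-based correlation estimate in the spirit of the abstract: compute spatial signs, form the SSCM, and back-transform its eigenstructure (in two dimensions the SSCM eigenvalues are proportional to the square roots of the shape-matrix eigenvalues). The coordinate-wise median in place of the spatial median, the non-robust marginal standardization, and all function names are simplifications for illustration, not the authors' implementation.

        import numpy as np

        def spatial_signs(X, center):
            # Project centered observations onto the unit sphere (spatial signs).
            Z = X - center
            norms = np.linalg.norm(Z, axis=1, keepdims=True)
            norms[norms == 0] = 1.0  # leave exact ties with the center at the origin
            return Z / norms

        def sscm(X, center=None):
            # Spatial sign covariance matrix: average outer product of spatial signs.
            if center is None:
                center = np.median(X, axis=0)  # simple stand-in for the spatial median
            S = spatial_signs(X, center)
            return S.T @ S / len(X)

        def spatial_sign_correlation(x, y):
            # Bivariate correlation estimate backed out of the SSCM eigenstructure.
            X = np.column_stack([x, y]) / np.array([np.std(x), np.std(y)])
            delta, U = np.linalg.eigh(sscm(X))
            lam = delta ** 2              # p = 2: SSCM eigenvalues ~ sqrt(shape eigenvalues)
            V = U @ np.diag(lam) @ U.T    # shape matrix up to scale
            return V[0, 1] / np.sqrt(V[0, 0] * V[1, 1])

        # quick check on correlated Gaussian data
        rng = np.random.default_rng(0)
        x, y = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=2000).T
        print(spatial_sign_correlation(x, y))  # should land near 0.6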

    CLEAR: Covariant LEAst-square Re-fitting with applications to image restoration

    In this paper, we propose a new framework for removing part of the systematic error affecting popular restoration algorithms, with a special focus on image processing tasks. Generalizing ideas that emerged for ℓ1 regularization, we develop an approach that re-fits the results of standard methods towards the input data. Total variation regularization and non-local means are special cases of interest. We identify important covariant information that should be preserved by the re-fitting method, and emphasize the importance of preserving the Jacobian (w.r.t. the observed signal) of the original estimator. We then provide an approach with a "twicing" flavor that re-fits the restored signal by adding back a local affine transformation of the residual term. We illustrate the benefits of our method on numerical simulations for image restoration tasks.
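
    The covariant re-fitting itself requires the Jacobian of the restoration operator, which is beyond a short snippet, but the classical "twicing" idea the abstract alludes to is easy to sketch: denoise the data, denoise the residual, and add the two together. The sketch below uses a Gaussian smoother as a stand-in for the restoration method; it illustrates plain twicing, not the authors' CLEAR estimator.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def twicing(y, denoise):
            # Classical twicing: re-fit by denoising the residual and adding it back.
            first_pass = denoise(y)
            residual = y - first_pass
            return first_pass + denoise(residual)

        # toy 1-D example with a Gaussian smoother standing in for the restoration method
        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 1.0, 512)
        clean = np.sin(2 * np.pi * 3 * t)
        noisy = clean + 0.3 * rng.standard_normal(t.size)

        smooth = lambda v: gaussian_filter(v, sigma=5)
        refit = twicing(noisy, smooth)

        print("MAE of plain smoother:", np.mean(np.abs(smooth(noisy) - clean)))
        print("MAE after twicing:    ", np.mean(np.abs(refit - clean)))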

    Long term memories of developed and emerging markets: using the scaling analysis to characterize their stage of development

    The scaling properties encompass, within a single simple analysis, many of the volatility characteristics of financial markets, which is why we use them to probe the differing degrees of development across markets. We empirically study the scaling properties of daily foreign exchange rates, stock market indices and fixed-income instruments by using the generalized Hurst approach. We show that the scaling exponents are associated with characteristics of the specific markets and can be used to differentiate markets in their stage of development. The robustness of the results is tested by both Monte Carlo studies and a computation of the scaling in the frequency domain. Comment: 46 pages, 7 figures, accepted for publication in the Journal of Banking & Finance
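
    As a concrete illustration of the generalized Hurst approach, the sketch below estimates H(q) from the scaling of the q-th order moments of the increments, E|X(t+τ) - X(t)|^q ~ τ^(q H(q)), by regressing log moments on log lags. The lag range and the q values are arbitrary choices for the example, not the ones used in the paper.

        import numpy as np

        def generalized_hurst(x, q=1, taus=range(1, 20)):
            # Estimate H(q) from E|x(t+tau) - x(t)|^q ~ tau^(q H(q)).
            x = np.asarray(x, dtype=float)
            taus = np.asarray(list(taus))
            kq = np.array([np.mean(np.abs(x[tau:] - x[:-tau]) ** q) for tau in taus])
            slope, _ = np.polyfit(np.log(taus), np.log(kq), 1)
            return slope / q

        # sanity check: an uncorrelated random walk should give H(q) close to 0.5
        rng = np.random.default_rng(2)
        log_price = np.cumsum(rng.standard_normal(10_000))
        print(generalized_hurst(log_price, q=1))
        print(generalized_hurst(log_price, q=2))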

    Estimating the Leverage Parameter of Continuous-time Stochastic Volatility Models Using High Frequency S&P 500 and VIX

    This paper proposes a new method for estimating continuous-time stochastic volatility (SV) models for the S&P 500 stock index process using intraday high-frequency observations of both the S&P 500 index and the Chicago Board Options Exchange (CBOE) implied (or expected) volatility index (VIX). Intraday high-frequency data have become readily available for an increasing number of financial assets and their derivatives in recent years, but it is well known that attempts to apply popular continuous-time models directly to short intraday time intervals, and to estimate the parameters using such data, can lead to nonsensical estimates due to severe intraday seasonality. A primary purpose of the paper is to provide a framework for using intraday high-frequency data on both the index and the VIX, in particular for improving the estimation accuracy of the leverage parameter, ρ, that is, the correlation between the two Brownian motions driving the diffusive components of the price process and its spot variance process, respectively. As a special case, we focus on Heston’s (1993) square-root SV model and propose the realized leverage estimator for ρ, noting that, under this model without measurement errors, the “realized leverage,” that is, the realized covariation of the price and VIX processes divided by the product of the realized volatilities of the two processes, is in-fill consistent for ρ. Finite-sample simulation results show that the proposed estimator delivers more accurate estimates of the leverage parameter than do existing methods. Keywords: continuous time, high-frequency data, stochastic volatility, S&P 500, implied volatility, VIX.
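
    The realized leverage quantity described above is straightforward to compute once intraday observations of both series are in hand. The sketch below treats it as a realized correlation of increments; the use of log increments for both series, the sampling scheme, and the absence of any intraday seasonality adjustment are simplifying assumptions, not the paper's exact procedure.

        import numpy as np

        def realized_leverage(price, vix):
            # Realized covariation of the two series divided by the product of
            # their realized volatilities, i.e. a realized correlation of increments.
            dp = np.diff(np.log(price))       # intraday log-price increments
            dv = np.diff(np.log(vix))         # intraday log-VIX increments
            cov = np.sum(dp * dv)             # realized covariation
            vol_p = np.sqrt(np.sum(dp ** 2))  # realized volatility of the price
            vol_v = np.sqrt(np.sum(dv ** 2))  # realized volatility of the VIX
            return cov / (vol_p * vol_v)

        # toy check with synthetic increments whose correlation is -0.7
        rng = np.random.default_rng(3)
        rho, n = -0.7, 10_000
        z1 = rng.standard_normal(n)
        z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n)
        price = 100 * np.exp(np.cumsum(1e-3 * z1))
        vix = 20 * np.exp(np.cumsum(1e-3 * z2))
        print(realized_leverage(price, vix))  # should land near -0.7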

    Tyler's Covariance Matrix Estimator in Elliptical Models with Convex Structure

    We address structured covariance estimation in elliptical distributions by assuming that the covariance is a priori known to belong to a given convex set, e.g., the set of Toeplitz or banded matrices. We consider the Generalized Method of Moments (GMM) optimization applied to Tyler's robust scatter M-estimator subject to these convex constraints. Unfortunately, the GMM problem turns out to be non-convex due to its objective. Instead, we propose COCA, a new estimator based on a convex relaxation that can be solved efficiently. We prove that the relaxation is tight in the unconstrained case for a finite number of samples, and asymptotically in the constrained case. We then illustrate the advantages of COCA in synthetic simulations with structured compound Gaussian distributions. In these examples, COCA outperforms competing methods such as Tyler's estimator and its projection onto the structure set. Comment: arXiv admin note: text overlap with arXiv:1311.059
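
    The COCA relaxation itself requires a convex solver, but the unconstrained baseline it is compared against, Tyler's estimator, has a simple fixed-point iteration that is worth having in mind. The sketch below implements that standard iteration for zero-mean data; it is the plain estimator, not the paper's constrained or relaxed versions.

        import numpy as np

        def tyler_scatter(X, n_iter=100, tol=1e-8):
            # Tyler's M-estimator of scatter via the usual fixed-point iteration,
            # assuming the observations in X (n x p) are already centered at zero.
            n, p = X.shape
            sigma = np.eye(p)
            for _ in range(n_iter):
                inv = np.linalg.inv(sigma)
                d = np.einsum('ij,jk,ik->i', X, inv, X)   # x_i' Sigma^{-1} x_i
                w = p / np.maximum(d, 1e-12)
                new = (X * w[:, None]).T @ X / n
                new *= p / np.trace(new)                  # fix the scale ambiguity
                if np.linalg.norm(new - sigma, 'fro') < tol:
                    return new
                sigma = new
            return sigma

        # toy check with heavy-tailed (Student-t) samples around a known shape matrix
        rng = np.random.default_rng(4)
        true_shape = np.array([[2.0, 0.8], [0.8, 1.0]])
        L = np.linalg.cholesky(true_shape)
        X = (rng.standard_normal((5000, 2)) @ L.T) * np.sqrt(3 / rng.chisquare(3, size=5000))[:, None]
        est = tyler_scatter(X)
        print(est / est[1, 1])   # compare against true_shape / true_shape[1, 1]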

    Device-independent point estimation from finite data and its application to device-independent property estimation

    The device-independent approach to physics is one where conclusions are drawn directly from the observed correlations between measurement outcomes. In quantum information, this approach allows one to make strong statements about the properties of the underlying systems or devices solely from the observation of Bell-inequality-violating correlations. However, since one can only perform a finite number of experimental trials, statistical fluctuations necessarily accompany any estimation of these correlations. Consequently, an important gap remains between the many theoretical tools developed for the asymptotic scenario and the experimentally obtained raw data. In particular, a physical and at the same time practical way to estimate the underlying quantum distribution has so far remained elusive. Here, we show that the natural analogs of the maximum-likelihood and the least-square-error estimation techniques in the device-independent context yield point estimates of the true distribution that are physical, unique, computationally tractable and consistent. They thus serve as sound algorithmic tools for bridging the aforementioned gap. As an application, we demonstrate how such estimates of the underlying quantum distribution can be used to provide, in certain cases, trustworthy estimates of the amount of entanglement present in the measured system. In stark contrast to existing approaches to device-independent parameter estimation, our estimation does not require prior knowledge of any Bell inequality tailored to the specific property and distribution of interest. Comment: Essentially the published version, but with the typo in Eq. (E5) corrected
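
    The full construction requires a characterization of the quantum set, which is well beyond a snippet, but the basic step behind the least-square-error estimate, projecting raw relative frequencies onto a constraint set, can be illustrated with a toy stand-in: a Euclidean projection onto the probability simplex. This is only a cartoon of the idea; the paper's estimates are projections onto (relaxations of) the set of quantum correlations, not onto the simplex, and the numbers below are hypothetical.

        import numpy as np

        def project_to_simplex(v):
            # Least-squares (Euclidean) projection onto {p : p >= 0, sum(p) = 1},
            # using the standard sort-based algorithm.
            u = np.sort(v)[::-1]
            css = np.cumsum(u)
            idx = np.arange(1, len(v) + 1)
            rho = idx[u - (css - 1) / idx > 0][-1]
            theta = (css[rho - 1] - 1) / rho
            return np.maximum(v - theta, 0.0)

        # hypothetical frequency-derived estimates for one measurement setting, perturbed
        # so that they violate the constraints (as finite-data estimates generically do)
        raw = np.array([0.50, -0.02, 0.04, 0.46])
        print(project_to_simplex(raw))   # the valid distribution closest to the raw point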