
    Kernel Ellipsoidal Trimming

    Ellipsoid estimation is an issue of primary importance in many practical areas such as control, system identification, visual/audio tracking, experimental design, data mining, robust statistics, and novelty/outlier detection. This paper presents a new method of kernel information matrix ellipsoid estimation (KIMEE) that finds an ellipsoid in a kernel-defined feature space based on a centered information matrix. Although the method is very general and can be applied to many of the aforementioned problems, the main focus in this paper is the problem of novelty or outlier detection associated with fault detection. A simple iterative algorithm based on Titterington's minimum volume ellipsoid method is proposed for practical implementation. The KIMEE method demonstrates very good performance on a set of real-life and simulated datasets compared with support vector machine methods.
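
    As a rough illustration of the Titterington-style iteration the paper builds on, the following Python sketch computes a minimum volume enclosing ellipsoid by multiplicative reweighting in plain input space (KIMEE applies the same idea in a kernel-defined feature space; the function and variable names here are our own):

        import numpy as np

        def min_volume_ellipsoid(X, tol=1e-6, max_iter=1000):
            """Titterington-style multiplicative update for the minimum volume
            enclosing ellipsoid of the rows of X (plain-input sketch)."""
            n, d = X.shape
            Z = np.hstack([X, np.ones((n, 1))])   # lift so the center is estimated too
            u = np.full(n, 1.0 / n)               # uniform starting weights
            for _ in range(max_iter):
                M = Z.T @ (u[:, None] * Z)        # weighted (d+1)x(d+1) scatter
                m = np.einsum("ij,jk,ik->i", Z, np.linalg.inv(M), Z)
                u_new = u * m / (d + 1)           # multiplicative reweighting
                if np.abs(u_new - u).sum() < tol:
                    u = u_new
                    break
                u = u_new
            center = u @ X
            Xc = X - center
            A = np.linalg.inv(Xc.T @ (u[:, None] * Xc)) / d  # {x: (x-c)^T A (x-c) <= 1}
            return center, A

    For novelty detection, points with (x - c)^T A (x - c) > 1 fall outside the fitted ellipsoid and can be flagged as candidate outliers.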

    Testing the Capital Asset Pricing Model Efficiently Under Elliptical Symmetry: A Semiparametric Approach

    We develop new tests of the capital asset pricing model (CAPM) that take account of, and are valid under, the assumption that the distribution generating returns is elliptically symmetric; this assumption is necessary and sufficient for the validity of the CAPM. Our test is based on semiparametric efficient estimation procedures for a seemingly unrelated regression model in which the multivariate error density is elliptically symmetric but otherwise unrestricted. The elliptical symmetry assumption allows us to avert the curse of dimensionality that typically arises in multivariate semiparametric estimation procedures, because the multivariate elliptically symmetric density function can be written as a function of a scalar transformation of the observed multivariate data. The elliptically symmetric family includes a number of thick-tailed distributions and so is potentially relevant in financial applications. Our estimated betas are lower than the OLS estimates, and our parameter estimates are much less consistent with the CAPM restrictions than the corresponding OLS estimates.
    Keywords: adaptive estimation, capital asset pricing model, elliptical symmetry, semiparametric efficiency
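
    The dimension-reduction step is easy to make concrete: under elliptical symmetry the error density depends on the data only through the Mahalanobis radius, so only a one-dimensional function has to be estimated nonparametrically. A minimal Python sketch (names are ours; in practice mu and Sigma are estimated from the regression residuals):

        import numpy as np
        from scipy.stats import gaussian_kde

        def mahalanobis_radii(E):
            """Scalar transform carrying all the shape information of an
            elliptically symmetric error density."""
            mu = E.mean(axis=0)
            Sigma = np.cov(E, rowvar=False)
            L = np.linalg.cholesky(np.linalg.inv(Sigma))
            Z = (E - mu) @ L                    # whitened residuals
            return np.sqrt((Z ** 2).sum(axis=1))

        # A 1-D kernel density estimate of the radial law replaces a full
        # d-dimensional nonparametric density estimate:
        #   kde = gaussian_kde(mahalanobis_radii(residuals))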

    Two triangulations methods based on edge refinement

    In this paper, two curvature-adaptive methods of surface triangulation are presented. Both methods are based on edge refinement to obtain a triangulation compatible with the curvature requirements. The first method applies an incremental and constrained Delaunay triangulation and uses curvature bounds to determine whether an edge of the triangulation is admissible. The second method also uses the curvature bound in the edge refinement process, i.e., in computing the location of a refining point, and in the re-triangulation needed after the insertion of this refining point. Results comparing both approaches are presented.
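
    A toy version of the refine-and-retriangulate loop in Python (not the paper's constrained incremental algorithm; scipy's batch Delaunay stands in for incremental insertion, and `edge_bound(p, q)` is an assumed user-supplied callable returning the curvature-admissible length for the edge between points p and q):

        import numpy as np
        from scipy.spatial import Delaunay

        def refine_by_edges(points, edge_bound, max_rounds=20):
            """Split every edge violating a curvature-derived length bound,
            then retriangulate; repeat until all edges are admissible."""
            pts = np.asarray(points, dtype=float)   # (n, 2) parametric points
            for _ in range(max_rounds):
                tri = Delaunay(pts)
                edges = {tuple(sorted(e)) for s in tri.simplices
                         for e in ((s[0], s[1]), (s[1], s[2]), (s[2], s[0]))}
                new_pts = [0.5 * (pts[i] + pts[j]) for i, j in edges
                           if np.linalg.norm(pts[i] - pts[j]) > edge_bound(pts[i], pts[j])]
                if not new_pts:                     # every edge admissible: done
                    break
                pts = np.vstack([pts, new_pts])     # insert refining midpoints
            return pts, Delaunay(pts)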

    Data complexity measured by principal graphs

    How can one measure the complexity of a finite set of vectors embedded in a multidimensional space? This is a non-trivial question which can be approached in many different ways. Here we suggest a set of data complexity measures using universal approximators, principal cubic complexes. Principal cubic complexes generalise the notion of principal manifolds for datasets with non-trivial topologies. The type of a principal cubic complex is determined by its dimension and a grammar of elementary graph transformations; the simplest grammar produces principal trees. We introduce three natural types of data complexity: 1) geometric (deviation of the data's approximator from some "idealized" configuration, such as deviation from harmonicity); 2) structural (how many elements of a principal graph are needed to approximate the data); and 3) construction complexity (how many applications of elementary graph transformations are needed to construct the principal object starting from the simplest one). We compute these measures for several simulated and real-life data distributions and show them in "accuracy-complexity" plots, helping to optimize the accuracy/complexity ratio. We discuss various issues connected with measuring data complexity. Software for computing data complexity measures from principal cubic complexes is provided as well.
    Comment: Computers and Mathematics with Applications, in press.
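
    For the structural complexity measure, one point of an accuracy-complexity plot can be computed as below (a sketch with our own names; the node positions of an already-fitted principal graph are assumed given):

        import numpy as np
        from scipy.spatial import cKDTree

        def accuracy_complexity_point(X, nodes):
            """Structural complexity = node count of the principal graph;
            accuracy = fraction of variance explained by projecting each
            datum to its nearest node."""
            dist, _ = cKDTree(nodes).query(X)   # distance to nearest node
            mse = np.mean(dist ** 2)            # approximation error
            total = np.mean(((X - X.mean(axis=0)) ** 2).sum(axis=1))
            return len(nodes), 1.0 - mse / total

    Fitting graphs of increasing size and plotting these pairs reproduces the accuracy/complexity trade-off the abstract describes.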

    Efficient Estimation of Conditional Asset Pricing Models

    A semiparametric efficient estimation procedure is developed for the parameters of multivariate GARCH-in-mean models when the disturbances have a distribution that is assumed to be elliptically symmetric but is otherwise unrestricted. Under high-level restrictions, the resulting estimator achieves the asymptotic semiparametric efficiency bound. The elliptical symmetry assumption allows us to avert the curse of dimensionality that would otherwise arise in estimating the unknown error distribution. This framework is suitable for the estimation and testing of conditional asset pricing models such as the conditional CAPM; we apply our estimator in an empirical study of stock prices, and Monte Carlo simulation results are also reported.
    Keywords: capital asset pricing model, elliptical symmetry, semiparametric efficiency, GARCH
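
    For intuition on the "in-mean" mechanism, here is a univariate GARCH(1,1)-in-mean filter, a much-reduced sketch of the multivariate model (parameter names are ours; the semiparametric step would then replace the Gaussian score with an estimate of the elliptical density generator, as in the CAPM paper above):

        import numpy as np

        def garch_in_mean_filter(r, mu, delta, omega, alpha, beta):
            """Conditional variance h_t enters the mean equation through the
            risk-premium coefficient delta: r_t = mu + delta*h_t + eps_t."""
            T = len(r)
            h, eps = np.empty(T), np.empty(T)
            h[0] = np.var(r)                    # common startup choice
            eps[0] = r[0] - mu - delta * h[0]
            for t in range(1, T):
                h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
                eps[t] = r[t] - mu - delta * h[t]
            return eps, h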

    Nonparametrically consistent depth-based classifiers

    We introduce a class of depth-based classification procedures that are of a nearest-neighbor nature. Depth, after symmetrization, indeed provides the center-outward ordering that is necessary and sufficient to define nearest neighbors. Like all their depth-based competitors, the resulting classifiers are affine-invariant, hence in particular insensitive to unit changes. Unlike the former, however, they achieve Bayes consistency under virtually any absolutely continuous distribution - a concept we call nonparametric consistency, to stress the difference with the stronger universal consistency of standard kNN classifiers. We investigate the finite-sample performance of the proposed classifiers through simulations and show that they outperform affine-invariant nearest-neighbor classifiers obtained through an obvious standardization construction. We illustrate the practical value of our classifiers on two real data examples. Finally, we briefly discuss possible uses of our depth-based neighbors in other inference problems.
    Comment: Published at http://dx.doi.org/10.3150/13-BEJ561 in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
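
    A minimal sketch of the symmetrize-then-rank idea, with Mahalanobis depth standing in for the halfspace depth used in the paper (all names are ours):

        import numpy as np

        def depth_knn_predict(x, X, y, k=5):
            """Symmetrize the training sample about the query point, rank
            training points by their depth in the symmetrized sample, and
            take a majority vote over the k deepest."""
            S = np.vstack([X, 2 * x - X])       # reflect the sample through x
            mu = S.mean(axis=0)                 # equals x by construction
            Sinv = np.linalg.inv(np.cov(S, rowvar=False))
            d2 = np.einsum("ij,jk,ik->i", X - mu, Sinv, X - mu)
            depth = 1.0 / (1.0 + d2)            # Mahalanobis depth
            neighbors = np.argsort(depth)[-k:]  # k deepest training points
            labels, counts = np.unique(y[neighbors], return_counts=True)
            return labels[np.argmax(counts)]

    Because Mahalanobis depth is affine-invariant, the induced neighborhoods (and hence the classifier) are insensitive to unit changes, the property the abstract emphasizes.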

    CLASH: Precise New Constraints on the Mass Profile of Abell 2261

    We precisely constrain the inner mass profile of Abell 2261 (z = 0.225) for the first time and determine that this cluster is not "over-concentrated" as found previously, implying a formation time in agreement with ΛCDM expectations. These results are based on strong lensing analyses of new 16-band HST imaging obtained as part of the Cluster Lensing and Supernova survey with Hubble (CLASH). Combining this with revised weak lensing analyses of Subaru wide-field imaging with 5-band Subaru + KPNO photometry, we place tight new constraints on the halo virial mass, M_vir = (2.2 ± 0.2) × 10^15 M_⊙/h_70 (within r ≈ 3 Mpc/h_70), and concentration, c = 6.2 ± 0.3, when assuming a spherical halo. This agrees broadly with average c(M, z) predictions from recent ΛCDM simulations, which span 5 ≲ c ≲ 8. Our most significant systematic uncertainty is halo elongation along the line of sight. To estimate this, we also derive a mass profile based on archival Chandra X-ray observations and find it to be ~35% lower than our lensing-derived profile at r_2500 ~ 600 kpc. Agreement can be achieved by a halo elongated with a ~2:1 axis ratio along our line of sight. For this elongated halo model, we find M_vir = (1.7 ± 0.2) × 10^15 M_⊙/h_70 and c_vir = 4.6 ± 0.2, placing rough lower limits on these values. The need for halo elongation can be partially obviated by non-thermal pressure support and, perhaps entirely, by systematic errors in the X-ray mass measurements. We estimate the effect of background structures based on MMT/Hectospec spectroscopic redshifts and find these tend to lower M_vir further by ~7% and increase c_vir by ~5%.
    Comment: Submitted to the Astrophysical Journal. 19 pages, 14 figures.
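
    To turn the quoted (M_vir, c) values into an enclosed-mass curve, the spherical halo can be evaluated under an NFW parameterization (a sketch under that assumption; the paper's actual fitting machinery is more involved):

        import numpy as np

        def nfw_enclosed_mass(r, M_vir, c, r_vir):
            """M(<r) for a spherical NFW halo, normalized so that
            M(<r_vir) = M_vir; r and r_vir share units."""
            m = lambda x: np.log(1.0 + x) - x / (1.0 + x)  # dimensionless mass
            r_s = r_vir / c                                # scale radius
            return M_vir * m(r / r_s) / m(c)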

    Intensity-Based Image Registration Using Robust Correlation Coefficients

    The ordinary sample correlation coefficient is a popular similarity measure for aligning images from the same or similar modalities. However, this measure can be sensitive to the presence of "outlier" objects that appear in one image but not the other, such as surgical instruments, the patient table, etc., which can lead to biased registrations. This paper describes an intensity-based image registration technique that uses a robust correlation coefficient as a similarity measure. Relative to the ordinary sample correlation coefficient, the proposed similarity measure reduces the influence of outliers. We also compare the performance of the proposed method with that of the mutual information-based method. The robust correlation-based method should be useful for image registration in radiotherapy (keV to MeV X-ray images) and image-guided surgery applications. We have investigated the properties of the proposed method by theoretical analysis, computer simulations, a phantom experiment, and with functional magnetic resonance imaging data.
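
    The abstract does not spell out the estimator, but a representative outlier-resistant correlation (our own construction: robust standardization via median/MAD plus clipping to bound each pixel's influence) looks like this:

        import numpy as np

        def robust_correlation(a, b, c=2.0):
            """Correlate two images after clipping robustly standardized
            intensities at +-c, limiting the pull of outlier pixels."""
            def clipped_z(x):
                med = np.median(x)
                mad = 1.4826 * np.median(np.abs(x - med))   # robust scale
                return np.clip((x - med) / mad, -c, c)
            za, zb = clipped_z(a.ravel()), clipped_z(b.ravel())
            return float(np.corrcoef(za, zb)[0, 1])

    Maximizing such a score over candidate transformations (shifts, rotations) yields a registration that downweights structures present in only one image.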