
    Asymptotic equivalence for inhomogeneous jump diffusion processes and white noise

    We prove the global asymptotic equivalence between the experiments generated by the discrete (high-frequency) or continuous observation of a path of a time-inhomogeneous jump-diffusion process and a Gaussian white noise experiment. Here, the parameter of interest is the drift function, and we suppose that the observation time $T$ tends to $\infty$. The approximation is given in the sense of the Le Cam $\Delta$-distance, under smoothness conditions on the unknown drift function. These asymptotic equivalences are established by constructing explicit Markov kernels that can be used to reproduce one experiment from the other.
    Comment: 20 pages; to appear in ESAIM: P&S. In this version there are some improvements in the exposition following the referee report's suggestions.
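    For context, a standard textbook statement of the Le Cam deficiency and $\Delta$-distance is sketched below; the notation ($E$, $F$, $P_\theta$, $Q_\theta$, $K$) is generic and not taken from the paper itself.

    % Le Cam deficiency and \Delta-distance (standard definitions, generic notation).
    % E = (P_\theta) and F = (Q_\theta) are experiments indexed by the same parameter
    % set \Theta; K ranges over Markov kernels and ||.||_TV is total variation.
    \[
      \delta(E,F) \;=\; \inf_{K}\; \sup_{\theta\in\Theta}
        \bigl\| K P_\theta - Q_\theta \bigr\|_{\mathrm{TV}},
      \qquad
      \Delta(E,F) \;=\; \max\bigl\{ \delta(E,F),\, \delta(F,E) \bigr\}.
    \]
    % Global asymptotic equivalence means \Delta(E_T, F_T) \to 0 as T \to \infty;
    % near-optimal kernels K play the role of the explicit Markov kernels the
    % abstract uses to reproduce one experiment from the other.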

    Bayesian Bootstrap Analysis of Systems of Equations

    Research Methods / Statistical Methods

    On the Nonparametric Identification of Nonlinear Simultaneous Equations Models: Comment on B. Brown (1983) and Roehrig (1988)

    This note revisits the identification theorems of B. Brown (1983) and Roehrig (1988). We describe an error in the proofs of the main identification theorems in these papers, and provide an important counterexample to the theorems on the identification of the reduced form. Specifically, contrary to the theorems, the reduced form of a nonseparable simultaneous equations model is not identified even under the assumptions of those papers. We conclude the note with a conjecture that it may be possible to use classical exclusion restrictions to recover some of the key implications of the theorems.
    Keywords: simultaneous equations, non-separable errors
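    As a point of reference, the class of models at issue can be written, in generic notation that is only a sketch of the papers' actual setup, as a system of nonseparable structural equations together with its reduced form:

    % Illustrative nonseparable simultaneous equations system (generic notation;
    % see Brown (1983) and Roehrig (1988) for the precise models and assumptions).
    % Y is a G-vector of endogenous variables, X collects exogenous variables,
    % and U is a G-vector of structural errors, typically assumed independent of X.
    \[
      r_j(Y, X, U_j) \;=\; 0, \qquad j = 1, \dots, G,
    \]
    % with reduced form obtained by solving the system for Y:
    \[
      Y \;=\; \pi(X, U).
    \]
    % The note's counterexample shows that \pi need not be identified from the joint
    % distribution of (Y, X) under the papers' stated assumptions alone.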

    Spatial-temporal data mining procedure: LASR

    This paper is concerned with the statistical development of our spatial-temporal data mining procedure, LASR (pronounced "laser"). LASR is the abbreviation for Longitudinal Analysis with Self-Registration of large-$p$-small-$n$ data. It was motivated by a study of "Neuromuscular Electrical Stimulation" (NMES) experiments, where the data are noisy and heterogeneous, might not align from one session to another, and involve a large number of multiple comparisons. The three main components of LASR are: (1) data segmentation for separating heterogeneous data and for distinguishing outliers, (2) automatic approaches for spatial and temporal data registration, and (3) statistical smoothing mapping for identifying "activated" regions based on false-discovery-rate-controlled $p$-maps and movies. Each of the components is of interest in its own right. As a statistical ensemble, the idea of LASR is applicable to other types of spatial-temporal data sets beyond those from the NMES experiments.
    Comment: Published at http://dx.doi.org/10.1214/074921706000000707 in the IMS Lecture Notes--Monograph Series (http://www.imstat.org/publications/lecnotes.htm) by the Institute of Mathematical Statistics (http://www.imstat.org).
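    To make component (3) concrete, here is a minimal sketch of false-discovery-rate control on a grid of per-location p-values via the Benjamini-Hochberg step-up procedure; the names (bh_threshold, p_map, alpha) are illustrative, and this is not the LASR implementation itself.

    import numpy as np

    def bh_threshold(p_values, alpha=0.05):
        """Benjamini-Hochberg step-up procedure: return a boolean mask of
        'activated' locations whose p-values survive FDR control at level alpha."""
        p = np.asarray(p_values).ravel()
        m = p.size
        order = np.argsort(p)                          # sort p-values ascending
        thresholds = alpha * np.arange(1, m + 1) / m   # BH cutoffs alpha * k / m
        passed = p[order] <= thresholds
        mask = np.zeros(m, dtype=bool)
        if passed.any():
            k = np.max(np.nonzero(passed)[0])          # largest rank meeting its cutoff
            mask[order[: k + 1]] = True                # reject the k+1 smallest p-values
        return mask.reshape(np.shape(p_values))

    # Toy "p-map": a 2-D grid of per-location p-values from some pointwise test,
    # with a small hypothetical activated patch.
    rng = np.random.default_rng(0)
    p_map = rng.uniform(size=(20, 30))
    p_map[5:10, 5:10] = rng.uniform(0.0, 1e-4, size=(5, 5))
    activated = bh_threshold(p_map, alpha=0.05)
    print("activated pixels:", int(activated.sum()))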

    Multi-view Learning as a Nonparametric Nonlinear Inter-Battery Factor Analysis

    Factor analysis aims to determine latent factors, or traits, which summarize a given data set. Inter-battery factor analysis extends this notion to multiple views of the data. In this paper we show how a nonlinear, nonparametric version of these models can be recovered through the Gaussian process latent variable model. This gives us a flexible formalism for multi-view learning where the latent variables can be used both for exploratory purposes and for learning representations that enable efficient inference for ambiguous estimation tasks. Learning is performed in a Bayesian manner through the formulation of a variational compression scheme which gives a rigorous lower bound on the log likelihood. Our Bayesian framework provides strong regularization during training, allowing the structure of the latent space to be determined efficiently and automatically. We demonstrate this by producing the first (to our knowledge) published results of learning from dozens of views, even when data is scarce. We further show experimental results on several different types of multi-view data sets and for different kinds of tasks, including exploratory data analysis, generation, ambiguity modelling through latent priors and classification.
    Comment: 49 pages including appendix
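    For intuition, a minimal linear sketch of classical inter-battery factor analysis is given below: two views generated from a shared latent variable, with the shared directions recovered from an SVD of the sample cross-covariance. All names and dimensions are illustrative, and this linear toy is an analogue of, not a substitute for, the paper's nonparametric Bayesian GP-LVM treatment.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy generative model: one shared latent variable drives two "views".
    n, q, d1, d2 = 500, 2, 10, 15              # samples, latent dim, view dims
    Z = rng.standard_normal((n, q))            # shared latent factors
    W1 = rng.standard_normal((q, d1))          # loadings for view 1
    W2 = rng.standard_normal((q, d2))          # loadings for view 2
    Y1 = Z @ W1 + 0.1 * rng.standard_normal((n, d1))
    Y2 = Z @ W2 + 0.1 * rng.standard_normal((n, d2))

    # Classical (Tucker-style) inter-battery factor analysis: shared directions
    # come from an SVD of the cross-covariance between the centred views.
    Y1c = Y1 - Y1.mean(axis=0)
    Y2c = Y2 - Y2.mean(axis=0)
    C12 = Y1c.T @ Y2c / (n - 1)                # d1 x d2 cross-covariance
    U, s, Vt = np.linalg.svd(C12)
    A1, A2 = U[:, :q], Vt[:q].T                # leading shared directions per view

    # Scores of the shared factor recovered from each view separately should be
    # strongly correlated; the paper replaces these linear maps with GP priors.
    S1, S2 = Y1c @ A1, Y2c @ A2
    corr = np.corrcoef(S1[:, 0], S2[:, 0])[0, 1]
    print(f"correlation of first shared score across views: {corr:.3f}")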