3 research outputs found

    Zylstra v. State Clerk's Record Dckt. 41421


    Renshaw v. Mortgage Electronic Registration Systems Clerk's Record v. 1 Dckt. 40512


    Data Depth Inference for Difficult Data

    We explore various ways in which a robust, nonparametric statistical tool, the data depth function, can be used to conduct inference on data that could be described as difficult. This includes data that are difficult in structure, such as multivariate, functional, or multivariate functional data. It also includes data that are difficult in the sense that published statistics must satisfy privacy constraints.

    We begin with multivariate data. In Chapter 2, we develop two robust, nonparametric methods for multiple change-point detection in the covariance matrix of a multivariate sequence of observations. We demonstrate that changes in ranks generated from data depth functions can be used to detect certain types of changes in the covariance matrix of a sequence of observations. In order to catch more than one change, the first algorithm uses methods similar to those of wild binary segmentation (Fryzlewicz, 2014). The second algorithm estimates change-points by maximizing a penalized version of the classical Kruskal-Wallis ANOVA test statistic. We show that this objective function can be maximized via the well-known pruned exact linear time algorithm. We show, under mild nonparametric assumptions, that both of these algorithms are consistent for the correct number of change-points and the correct location(s) of the change-point(s). We demonstrate the efficacy of these methods with a simulation study and a data analysis. We are able to estimate changes accurately when the data are heavy-tailed or skewed. We are also able to detect second-order change-points in a time series of multivariate financial returns, without first imposing a time series model on the data. In Chapter 3, we extend these methods to the setting of functional data, where we develop a group of hypothesis tests which detect differences between the covariance kernels of several samples.
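The Chapter 2 idea of ranking observations by a depth function and scanning those ranks for a distributional shift can be illustrated with a minimal sketch. This is not the authors' implementation: Mahalanobis depth stands in for whatever depth function the thesis uses, only a single change-point is scanned for, and the wild-binary-segmentation and penalized/PELT machinery is omitted.

```python
import numpy as np
from scipy.stats import rankdata

def mahalanobis_depth(X):
    # Depth of each row of X relative to the full sample:
    # D(x) = 1 / (1 + (x - mu)^T S^{-1} (x - mu)).
    mu = X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diffs = X - mu
    md2 = np.einsum("ij,jk,ik->i", diffs, S_inv, diffs)
    return 1.0 / (1.0 + md2)

def single_changepoint(X, min_seg=10):
    # Scan all admissible split points k, computing the two-sample
    # Kruskal-Wallis statistic on the depth ranks; the argmax of the
    # statistic is the change-point estimate.
    n = len(X)
    ranks = rankdata(mahalanobis_depth(X))
    best_k, best_stat = None, -np.inf
    for k in range(min_seg, n - min_seg):
        r1, r2 = ranks[:k], ranks[k:]
        # Kruskal-Wallis statistic for the two "samples" (no tie correction).
        h = (12.0 / (n * (n + 1))) * (
            k * (r1.mean() - (n + 1) / 2) ** 2
            + (n - k) * (r2.mean() - (n + 1) / 2) ** 2
        )
        if h > best_stat:
            best_k, best_stat = k, h
    return best_k, best_stat
```

A covariance change (e.g. the scale of the observations doubling halfway through the sequence) pushes the later observations toward the outside of the pooled sample, lowering their depth ranks, which is what the scan picks up.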
These tests, called functional Kruskal-Wallis for covariance tests, are based on functional data depth ranks, which are combined using the classical Kruskal-Wallis test statistic. These tests are very robust; we demonstrate, both in simulation and theoretically, that they work well when the data are very heavy-tailed. Specifically, in order for the test to be consistent there is no need to assume that the fourth moment of the observations is finite, which is a typical assumption of existing methods. These tests offer several other benefits: they have a simple distribution under the null hypothesis, they are computationally cheap, and they possess linear invariance properties. We show via simulation that these tests have higher power than their competitors in some situations, while still maintaining a reasonable size. We characterize the behavior of these tests under the null hypothesis and show consistency of the several versions of the tests under general alternative hypotheses. We also provide a method for computing sample size, and provide some analysis under local alternatives when the ranks are based on a new depth function.

In Chapter 4, we present methods for detecting change-points in the variability of a sequence of functional data, thus combining the methods of Chapter 2 and Chapter 3. Our methods allow the user to test for one change-point, to test for an epidemic period, or to detect an unknown number of change-points in the data. Since our methodology is based on depth ranks, we have no need to estimate the covariance operator, which makes our methods computationally cheap. For example, our procedure can identify multiple change-points in O(n log n) time. Our procedure is fully nonparametric and is robust to outliers through the use of data depth ranks. We show that when n is large, our methods have simple behavior under the null hypothesis.
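The shape of a depth-rank-based multi-sample test can be sketched as follows. This is a hedged illustration only: a simple L2-distance-to-center "depth" stands in for the functional data depths the thesis studies, curves are assumed already discretized on a common grid, and the combination step is the off-the-shelf Kruskal-Wallis test from SciPy.

```python
import numpy as np
from scipy.stats import kruskal

def norm_depth(curves, center):
    # Stand-in functional depth: curves far from the pooled center
    # (in discretized L2 norm) receive low depth. A sample with a
    # larger covariance kernel produces curves with lower depth.
    dists = np.sqrt(((curves - center) ** 2).mean(axis=1))
    return 1.0 / (1.0 + dists)

def depth_rank_test(samples):
    # samples: list of (n_i x grid) arrays of discretized curves.
    # Compute depths relative to the pooled center, then apply the
    # classical Kruskal-Wallis test to the per-sample depth values
    # (scipy handles the pooled ranking internally).
    pooled = np.vstack(samples)
    center = pooled.mean(axis=0)
    depths = [norm_depth(s, center) for s in samples]
    return kruskal(*depths)
```

Under the null of equal covariance structure the depth ranks of the groups are exchangeable, so the usual chi-squared calibration of the Kruskal-Wallis statistic applies; a scale difference between groups shifts the rank distributions apart.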
We also show that the functional Kruskal-Wallis for covariance change-point procedures are n^{-1/2}-consistent. In addition to asymptotic results, we provide a finite-sample accuracy result for our at-most-one change-point estimator. In simulation, we compare our methods against several other methods from the literature. We also present an application of our methods to intraday asset returns and fMRI scans.

In Chapter 5, we investigate differentially private estimation of depth functions and their associated medians. We present a private method for estimating depth-based medians, which is based on the exponential mechanism (McSherry, 2007). We compute the sample complexity of these private medians as a function of the dimension, the prior parameters, and the privacy parameter. As a by-product of our work, we present a smooth depth function, which we show has the same depth-like properties as its non-smooth counterpart. Another by-product of our work is uniform concentration for several depth functions. We also present methods and algorithms for estimating private depth values at in-sample and out-of-sample points. In addition, we extend the propose-test-release methodology of Brunel (2020) to be used with depth functions and the exponential mechanism. We show that when propose-test-release is applied to projection depth values, the probability of no reply is small, and the private depth values concentrate around their population counterparts. We also give an algorithm to approximate the "test" step in propose-test-release, since it is computationally difficult. We show that this approximation maintains the low probability of no reply, as in the original propose-test-release.

Chapter 6 presents some possible directions for future research related to network data and shape data.
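A depth-based median privatized through the exponential mechanism can be sketched in one dimension, where halfspace (Tukey) depth reduces to counting points on either side of a candidate. This is an illustrative toy, not the thesis's method: the candidate grid, the privacy budget, and the use of the raw (unnormalized) depth as the utility are all assumptions made here. Changing one data point changes the count-based utility by at most 1, so the sensitivity is 1 and the standard exp(ε·u/2) weighting applies.

```python
import numpy as np

def private_depth_median(data, candidates, epsilon, rng):
    # One-dimensional halfspace (Tukey) depth as the utility:
    # u(c) = min(#{x_i <= c}, #{x_i >= c}). Swapping one data point
    # changes u(c) by at most 1, so the sensitivity is 1.
    u = np.array(
        [min(np.sum(data <= c), np.sum(data >= c)) for c in candidates],
        dtype=float,
    )
    # Exponential mechanism: P(c) proportional to exp(epsilon * u(c) / 2).
    # Subtract u.max() before exponentiating for numerical stability;
    # this cancels in the normalization.
    w = np.exp(epsilon * (u - u.max()) / 2.0)
    probs = w / w.sum()
    return rng.choice(candidates, p=probs)
```

Deep candidates (near the sample median) receive exponentially more probability mass than shallow ones, which is what drives the sample-complexity analysis described above: the larger n or ε is, the more sharply the output concentrates around the population median.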