
    Learning Mixtures of Plackett-Luce Models with Features from Top-l Orders

    Plackett-Luce model (PL) is one of the most popular models for preference learning. In this paper, we consider PL with features and its mixture models, where each alternative has a vector of features, possibly different across agents. Such models significantly generalize the standard PL, but are not as well investigated in the literature. We extend mixtures of PLs with features to models that generate top-l orders and characterize their identifiability. We further prove that when PL with features is identifiable, its MLE is consistent with a strictly concave objective function under mild assumptions. Our experiments on synthetic data demonstrate the effectiveness of MLE on PL with features, with tradeoffs between statistical efficiency and computational efficiency as l takes different values. For mixtures of PL with features, we show that an EM algorithm outperforms MLE in MSE and runtime.
    Comment: 16 pages, 2 figures
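
    Since the abstract centers on the MLE for PL with features from top-l orders, a minimal sketch may help make the sequential-choice likelihood concrete. The code below is an illustrative reconstruction, not the authors' implementation: the data are synthetic, the feature layout X[agent, alternative, feature] and all parameter values are hypothetical, and utilities are taken to be linear in the features as in the standard featurized PL.

```python
# Hypothetical synthetic setup; not data or code from the paper.
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

rng = np.random.default_rng(0)
n_agents, n_alts, n_feats, l = 200, 5, 3, 3

# Agent-specific feature vectors: alternative i as seen by agent a is X[a, i].
X = rng.normal(size=(n_agents, n_alts, n_feats))
theta_true = np.array([1.0, -0.5, 0.7])  # assumed ground-truth parameter

def sample_top_l(x, theta, l):
    """Draw a top-l order: repeatedly pick among the remaining alternatives
    with probability softmax(x[remaining] @ theta), without replacement."""
    remaining = list(range(x.shape[0]))
    order = []
    for _ in range(l):
        u = x[remaining] @ theta
        p = np.exp(u - u.max())
        p /= p.sum()
        pick = rng.choice(len(remaining), p=p)
        order.append(remaining.pop(pick))
    return order

orders = [sample_top_l(X[a], theta_true, l) for a in range(n_agents)]

def neg_log_lik(theta):
    """Negative top-l PL log-likelihood: each chosen alternative contributes
    its utility minus the log-sum-exp over the alternatives still available.
    This is convex in theta, consistent with the strict concavity of the
    log-likelihood noted in the abstract."""
    nll = 0.0
    for a, order in enumerate(orders):
        remaining = list(range(n_alts))
        for i in order:
            u = X[a, remaining] @ theta
            nll -= u[remaining.index(i)] - logsumexp(u)
            remaining.remove(i)
    return nll

theta_hat = minimize(neg_log_lik, np.zeros(n_feats), method="BFGS").x
print("MLE:", np.round(theta_hat, 2), "truth:", theta_true)
```

    Because the objective is concave in theta, a generic quasi-Newton solver started from zero recovers theta_true up to sampling noise; the mixture case discussed in the abstract loses this property, which is why an EM algorithm is used there instead.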

    Multiple Testing of Local Extrema for Detection of Structural Breaks in Piecewise Linear Models

    In this paper, we propose a new generic method for detecting the number and locations of structural breaks, or change points, in piecewise linear models under stationary Gaussian noise. Our method transforms change point detection into the identification of local extrema (local maxima and local minima) through kernel smoothing and differentiation of the data sequence. By computing p-values for all local extrema based on the peak height distributions of smooth Gaussian processes, we apply the Benjamini-Hochberg procedure to identify significant local extrema as the detected change points. Our method distinguishes two types of change points: continuous breaks (Type I) and jumps (Type II). We study three scenarios of piecewise linear signals: pure Type I, pure Type II, and a mixture of Type I and Type II change points. The results demonstrate that our proposed method ensures asymptotic control of the False Discovery Rate (FDR) and power consistency as the sequence length, slope changes, and jump sizes increase. Furthermore, compared to traditional change point detection methods based on recursive segmentation, our approach requires only a single test over all candidate local extrema, achieving computational complexity proportional to the length of the data sequence. Numerical studies also illustrate that our method maintains FDR control and power consistency in non-asymptotic cases where the slope changes or jumps are not large. We have implemented our method in the R package "dSTEM" (available from https://cran.r-project.org/web/packages/dSTEM).
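
    The pipeline described above (kernel smoothing, differentiation, one round of p-values at all local extrema, then Benjamini-Hochberg) can be sketched in a few lines. The Python code below is a hypothetical stand-in, not the dSTEM package: it targets Type II jumps via the first derivative of the smoothed sequence, and it substitutes crude normal-tail p-values for the exact peak-height distributions of smooth Gaussian processes that the paper derives.

```python
# Illustrative sketch of smooth-differentiate-test-BH; not the dSTEM code.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import argrelextrema
from scipy.stats import norm

rng = np.random.default_rng(1)
n, bw, q = 1000, 8.0, 0.05  # assumed length, kernel bandwidth, FDR level
true_cps = [300, 700]

# Piecewise linear signal with two Type II jumps, plus Gaussian noise.
t = np.arange(n)
signal = 0.01 * t + 3.0 * (t >= true_cps[0]) - 4.0 * (t >= true_cps[1])
y = signal + rng.normal(scale=1.0, size=n)

# Differentiate the kernel-smoothed sequence (order=1 convolves with the
# derivative of a Gaussian); jumps become sharp local extrema of dy.
dy = gaussian_filter1d(y, sigma=bw, order=1)

# Candidate change points: every local maximum and minimum of dy.
maxima = argrelextrema(dy, np.greater)[0]
minima = argrelextrema(dy, np.less)[0]
cands = np.concatenate([maxima, minima])

# Standardize peak heights by a robust (MAD) noise estimate and compute
# two-sided normal p-values. NOTE: a crude stand-in for the exact
# peak-height distributions used in the paper.
z = dy[cands] / (1.4826 * np.median(np.abs(dy - np.median(dy))))
pvals = 2 * norm.sf(np.abs(z))

# Benjamini-Hochberg over all candidates in a single pass.
order = np.argsort(pvals)
thresh = q * np.arange(1, len(pvals) + 1) / len(pvals)
passed = pvals[order] <= thresh
k = passed.nonzero()[0].max() + 1 if passed.any() else 0
detected = np.sort(cands[order[:k]])
print("detected change points:", detected)
```

    Only one pass of testing over all candidate extrema is needed, which is where the complexity proportional to the sequence length comes from; detecting Type I (continuous) breaks would use the second derivative of the smoothed sequence in the same way.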