4 research outputs found

    Lower bounds for the trade-off between bias and mean absolute deviation

    In nonparametric statistics, rate-optimal estimators typically balance bias and stochastic error. Recent work on overparametrization raises the question of whether rate-optimal estimators exist that do not obey this trade-off. In this work we consider pointwise estimation in the Gaussian white noise model with regression function f in a class of β-Hölder smooth functions. Let ‘worst-case’ refer to the supremum over all functions f in the Hölder class. It is shown that any estimator with worst-case bias ≲ n^(−β/(2β+1)) ≕ ψ_n must necessarily also have a worst-case mean absolute deviation that is lower bounded by ψ_n, up to a constant. To derive the result, we establish abstract inequalities relating the change of expectation for two probability measures to the mean absolute deviation.
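
    Read as a formal statement, the claim above can be written out as follows. The Hölder ball H(β, L), the evaluation point x_0, and the constants C, c > 0 are our notation, not the abstract's, and we read 'mean absolute deviation' as the expected absolute deviation of the estimator from its own expectation, a convention the abstract does not spell out:

        % Hedged formalization; the notation H(beta, L), x_0, C, c is ours.
        \[
          \psi_n := n^{-\beta/(2\beta+1)}
        \]
        \[
          \sup_{f \in \mathcal{H}(\beta,L)} \bigl| \mathbb{E}_f[\hat f(x_0)] - f(x_0) \bigr| \le C \psi_n
          \;\Longrightarrow\;
          \sup_{f \in \mathcal{H}(\beta,L)} \mathbb{E}_f \bigl| \hat f(x_0) - \mathbb{E}_f[\hat f(x_0)] \bigr| \ge c \psi_n .
        \]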

    Bias/Variance Analysis for Relational Domains

    No full text

    Abstract. Bias/variance analysis is a useful tool for investigating the performance of machine learning algorithms. Conventional analysis decomposes loss into errors due to aspects of the learning process, but in relational domains, the inference process introduces an additional source of error. Collective inference techniques introduce additional error both through the use of approximate inference algorithms and through variation in the availability of test set information. To date, the impact of inference error on model performance has not been investigated. In this paper, we propose a new bias/variance framework that decomposes loss into errors due to both the learning and inference processes. We evaluate the performance of three relational models and show that (1) inference can be a significant source of error, and (2) the models exhibit different types of errors as data characteristics are varied. Key words: Statistical relational learning, collective inference, evaluation
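
    The decomposition described above could be estimated empirically along the following lines. This is a minimal sketch, assuming squared loss and a stochastic stand-in for approximate collective inference; it is not the paper's framework, and the functions true_fn, train, infer and the synthetic data are hypothetical:

        # Hedged sketch: Monte Carlo bias/variance decomposition under squared
        # loss, with an extra variance term for a stochastic inference step.
        # NOT the paper's exact framework; train/infer and the data are
        # hypothetical stand-ins.
        import numpy as np

        rng = np.random.default_rng(0)

        def true_fn(x):
            return np.sin(x)

        def train(x, y):
            # Hypothetical learner: fit a cubic polynomial to noisy samples.
            return np.polynomial.Polynomial.fit(x, y, deg=3)

        def infer(model, x_test, noise_scale=0.05):
            # Hypothetical stochastic inference step (stand-in for approximate
            # collective inference): prediction plus inference noise.
            return model(x_test) + rng.normal(0.0, noise_scale, size=x_test.shape)

        x_test = np.linspace(-3, 3, 50)
        n_train_sets, n_inference_runs = 30, 20

        preds = np.empty((n_train_sets, n_inference_runs, x_test.size))
        for i in range(n_train_sets):
            x_tr = rng.uniform(-3, 3, 40)
            y_tr = true_fn(x_tr) + rng.normal(0.0, 0.3, size=x_tr.shape)
            model = train(x_tr, y_tr)
            for j in range(n_inference_runs):
                preds[i, j] = infer(model, x_test)

        mean_over_inference = preds.mean(axis=1)  # averages out inference noise
        grand_mean = preds.mean(axis=(0, 1))      # averages over everything

        bias_sq = np.mean((grand_mean - true_fn(x_test)) ** 2)
        learning_var = np.mean((mean_over_inference - grand_mean) ** 2)
        inference_var = np.mean((preds - mean_over_inference[:, None, :]) ** 2)

        print(f"bias^2        ~ {bias_sq:.4f}")
        print(f"learning var  ~ {learning_var:.4f}")
        print(f"inference var ~ {inference_var:.4f}")

    Averaging predictions over repeated inference runs before averaging over training sets is what separates the inference-variance term from the learning-variance term.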