15 research outputs found
Reliable Prediction Intervals with Regression Neural Networks
This paper proposes an extension to conventional regression Neural Networks
(NNs) for replacing the point predictions they produce with prediction
intervals that satisfy a required level of confidence. Our approach follows a
novel machine learning framework, called Conformal Prediction (CP), for
assigning reliable confidence measures to predictions without assuming anything
more than that the data are independent and identically distributed (i.i.d.).
We evaluate the proposed method on four benchmark datasets and on the problem
of predicting Total Electron Content (TEC), which is an important parameter in
trans-ionospheric links; for the latter we use a dataset of more than 60000 TEC
measurements collected over a period of 11 years. Our experimental results show
that the prediction intervals produced by our method are both well-calibrated
and tight enough to be useful in practice.
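The core recipe behind such interval predictors can be illustrated with a minimal split (inductive) conformal sketch. This is not the paper's NN method: a least-squares line stands in for the network, and the data, seed, and 90% confidence level are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: y = 2x + noise (hypothetical example).
x = rng.uniform(0, 10, size=500)
y = 2 * x + rng.normal(0, 1, size=500)

# Split into a proper training set and a calibration set.
x_train, y_train = x[:300], y[:300]
x_cal, y_cal = x[300:], y[300:]

# Fit a simple point predictor (least squares stands in for the NN).
slope, intercept = np.polyfit(x_train, y_train, 1)
predict = lambda v: slope * v + intercept

# Nonconformity scores on the calibration set: absolute residuals.
scores = np.abs(y_cal - predict(x_cal))

# For 90% confidence, take the adjusted empirical quantile of the scores.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n)

# Prediction interval for a new point: point prediction +/- threshold.
x_new = 5.0
lo, hi = predict(x_new) - q, predict(x_new) + q
```

Under the i.i.d. assumption, intervals built this way cover the true value with probability at least 1 - alpha, which is the calibration property the paper evaluates empirically.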
Reliable Probabilistic Classification with Neural Networks
Venn Prediction (VP) is a new machine learning framework for producing
well-calibrated probabilistic predictions. In particular it provides
well-calibrated lower and upper bounds for the conditional probability of an
example belonging to each possible class of the problem at hand. This paper
proposes five VP methods based on Neural Networks (NNs), which is one of the
most widely used machine learning techniques. The proposed methods are
evaluated experimentally on four benchmark datasets and the obtained results
demonstrate the empirical well-calibratedness of their outputs and their
superiority over the outputs of the traditional NN classifier.
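The lower/upper probability bounds of Venn Prediction can be sketched with a deliberately trivial inductive example. The taxonomy here (an example's category is the underlying classifier's prediction for it) and the toy data are assumptions, not one of the paper's five NN-based methods.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-class toy data: label 1 when the feature exceeds 0 (hypothetical).
x = rng.normal(0, 1, size=400)
y = (x + rng.normal(0, 0.5, size=400) > 0).astype(int)

x_cal, y_cal = x[:300], y[:300]  # calibration set
x_test = 0.8                     # a new object to classify

# A trivial underlying classifier: predict 1 if the feature is positive.
classify = lambda v: int(v > 0)

# Venn taxonomy: an example's category is the classifier's prediction.
cats = np.array([classify(v) for v in x_cal])

# For each hypothesized label of the test object, compute the empirical
# label frequency within the test object's category, counting the test
# object itself with that hypothesized label.
probs = []
for hypothetical in (0, 1):
    cat = classify(x_test)  # this taxonomy ignores the hypothesized label
    in_cat = y_cal[cats == cat]
    labels = np.append(in_cat, hypothetical)
    probs.append(labels.mean())  # empirical P(label = 1) in the category

# Lower and upper bounds on P(class 1) for the test object.
lower, upper = min(probs), max(probs)
```

The gap between the bounds shrinks as the category fills up (here it is 1/(m + 1) for a category of size m), which is why VP outputs become sharp with enough calibration data.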
Conformal Prediction: a Unified Review of Theory and New Challenges
In this work we provide a review of basic ideas and novel developments about
Conformal Prediction -- an innovative distribution-free, non-parametric
forecasting method, based on minimal assumptions -- that is able to yield, in a
very straightforward way, prediction sets that are valid in a statistical sense
even in the finite-sample case. The in-depth discussion provided in the
paper covers the theoretical underpinnings of Conformal Prediction, and then
proceeds to list the more advanced developments and adaptations of the original
idea.
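For classification, the finite-sample-valid prediction sets the review describes can be sketched in a few lines of split conformal prediction. The simulated softmax scores and the "one minus true-class probability" nonconformity score are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 3-class data with scores from a hypothetical probabilistic classifier.
n_cal = 300
y_cal = rng.integers(0, 3, size=n_cal)
# Simulated softmax outputs biased towards the true class.
p_cal = rng.dirichlet([1, 1, 1], size=n_cal)
p_cal[np.arange(n_cal), y_cal] += 1.0
p_cal /= p_cal.sum(axis=1, keepdims=True)

# Nonconformity score: one minus the probability of the true class.
scores = 1 - p_cal[np.arange(n_cal), y_cal]

# Threshold for 90% coverage, from the adjusted calibration quantile.
alpha = 0.1
q = np.quantile(scores, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal)

# The prediction set for a new example keeps every label whose
# nonconformity score does not exceed the threshold.
p_new = np.array([0.7, 0.2, 0.1])
pred_set = [k for k in range(3) if 1 - p_new[k] <= q]
```

The set may contain one label, several, or all of them; its coverage guarantee holds for any fixed nonconformity score, which is the distribution-free validity the review emphasizes.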
Conformal Prediction with Orange
Conformal predictors estimate the reliability of outcomes made by supervised machine learning models. Instead of a point value, conformal prediction defines an outcome region that meets a user-specified reliability threshold. Provided that the data are independently and identically distributed, the user can control the level of the prediction errors and adjust it following the requirements of a given application. The quality of conformal predictions often depends on the choice of nonconformity estimate for a given machine learning method. To promote the selection of a successful approach, we have developed Orange3-Conformal, a Python library that provides a range of conformal prediction methods for classification and regression. The library also implements several nonconformity scores. It has a modular design and can be extended to add new conformal prediction methods and nonconformities.
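The "pluggable nonconformity" design the abstract describes can be illustrated with two classic classification scores. This is a generic sketch of the abstraction, not the actual Orange3-Conformal API; all names here are hypothetical.

```python
from typing import Callable

import numpy as np

# A nonconformity measure maps (predicted class probabilities, true label)
# to a score; larger means the example looks "stranger".
Nonconformity = Callable[[np.ndarray, int], float]

def inverse_probability(probs: np.ndarray, label: int) -> float:
    """One minus the predicted probability of the true label."""
    return 1.0 - probs[label]

def probability_margin(probs: np.ndarray, label: int) -> float:
    """Highest competing probability minus the true label's probability."""
    rival = max(p for k, p in enumerate(probs) if k != label)
    return rival - probs[label]

# Either score can be dropped into the same conformal procedure; the
# choice changes the shape of the resulting prediction regions.
p = np.array([0.6, 0.3, 0.1])
score_if_class_0 = inverse_probability(p, 0)
score_if_class_1 = inverse_probability(p, 1)
```

Swapping scores leaves the validity guarantee intact but affects efficiency (how tight the regions are), which is why a library offering several interchangeable nonconformity estimates is useful in practice.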