Reliable Probabilistic Classification with Neural Networks
Venn Prediction (VP) is a new machine learning framework for producing
well-calibrated probabilistic predictions. In particular it provides
well-calibrated lower and upper bounds for the conditional probability of an
example belonging to each possible class of the problem at hand. This paper
proposes five VP methods based on Neural Networks (NNs), which is one of the
most widely used machine learning techniques. The proposed methods are
evaluated experimentally on four benchmark datasets and the obtained results
demonstrate the empirical well-calibratedness of their outputs and their
superiority over the outputs of the traditional NN classifier.
Reliable Prediction Intervals with Regression Neural Networks
This paper proposes an extension to conventional regression Neural Networks
(NNs) for replacing the point predictions they produce with prediction
intervals that satisfy a required level of confidence. Our approach follows a
novel machine learning framework, called Conformal Prediction (CP), for
assigning reliable confidence measures to predictions without assuming anything
more than that the data are independent and identically distributed (i.i.d.).
We evaluate the proposed method on four benchmark datasets and on the problem
of predicting Total Electron Content (TEC), which is an important parameter in
trans-ionospheric links; for the latter we use a dataset of more than 60000 TEC
measurements collected over a period of 11 years. Our experimental results show
that the prediction intervals produced by our method are both well-calibrated
and tight enough to be useful in practice.
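The interval-construction idea described in this abstract can be sketched with the split (inductive) variant of Conformal Prediction. Everything below is an illustrative assumption, not the paper's actual setup: synthetic data stands in for the benchmark/TEC datasets, and an ordinary least-squares fit stands in for the regression neural network. Only the i.i.d. assumption from the abstract is relied upon.

```python
# Minimal sketch of split conformal prediction intervals for regression.
# Assumes only that the data are i.i.d.; the point predictor is arbitrary.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2x + Gaussian noise (illustrative stand-in)
X = rng.uniform(0, 10, size=200)
y = 2 * X + rng.normal(0, 1, size=200)

# Split into a proper training set and a calibration set
X_train, y_train = X[:100], y[:100]
X_cal, y_cal = X[100:], y[100:]

# Fit any point predictor on the training split (here: a degree-1 fit
# standing in for a trained neural network)
coef = np.polyfit(X_train, y_train, deg=1)
predict = lambda x: np.polyval(coef, x)

# Nonconformity scores on the calibration set: absolute residuals
scores = np.abs(y_cal - predict(X_cal))

# For confidence level 1 - alpha, take the ceil((n+1)(1-alpha))-th
# smallest calibration score as the interval half-width
alpha = 0.1
n = len(scores)
q = np.sort(scores)[int(np.ceil((n + 1) * (1 - alpha))) - 1]

# Prediction interval for a new point: point prediction +/- q
x_new = 5.0
lo, hi = predict(x_new) - q, predict(x_new) + q
```

Under the i.i.d. assumption this construction covers the true label with probability at least 1 - alpha, regardless of how good the underlying point predictor is; the predictor's quality only affects how tight the intervals are.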
Guaranteed Coverage Prediction Intervals with Gaussian Process Regression
Gaussian Process Regression (GPR) is a popular regression method, which
unlike most Machine Learning techniques, provides estimates of uncertainty for
its predictions. These uncertainty estimates, however, are based on the
assumption that the model is well-specified, an assumption that is violated in
most practical applications, since the required knowledge is rarely available.
As a result, the produced uncertainty estimates can become very misleading; for
example, the prediction intervals (PIs) produced for the 95% confidence level
may cover much less than 95% of the true labels. To address this issue, this
paper introduces an extension of GPR based on a Machine Learning framework
called Conformal Prediction (CP). This extension guarantees the production of
PIs with the required coverage even when the model is completely misspecified.
The proposed approach combines the advantages of GPR with the valid coverage
guarantee of CP, and the experimental results demonstrate its superiority over
existing methods.
Comment: 12 pages. This work has been submitted to IEEE Transactions on
Pattern Analysis and Machine Intelligence for possible publication. Copyright
may be transferred without notice, after which this version may no longer be
accessible.
Reliable confidence intervals for software effort estimation
CEUR Workshop Proceedings
Volume 475, 2009, Pages 211-220
This paper deals with the problem of software effort estimation through the use of a new machine learning technique for producing reliable confidence measures in predictions. More specifically, we propose the use of Conformal Predictors (CPs), a novel type of prediction algorithm, as a means of providing effort estimates for software projects in the form of predictive intervals according to a specified confidence level. Our approach is based on the well-known Ridge Regression technique, but instead of the simple effort estimates produced by the original method, it produces predictive intervals that satisfy a given confidence level. The results obtained using the proposed algorithm on the COCOMO, Desharnais and ISBSG datasets suggest quite successful performance, obtaining reliable predictive intervals that are narrow enough to be useful in practice.
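The combination this abstract describes, ridge regression point estimates widened into conformal predictive intervals, can be sketched as follows. This is a simplified split-conformal variant under assumed settings: the synthetic features, the regularization strength, and the train/calibration split are all illustrative choices, not the paper's COCOMO/Desharnais/ISBSG configuration.

```python
# Hedged sketch: split conformal prediction on top of ridge regression.
import numpy as np

rng = np.random.default_rng(1)
n, d = 120, 3
# Synthetic effort-like data (illustrative only)
X = rng.normal(size=(n, d))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(0, 0.3, size=n)

# Split into training and calibration halves
X_tr, y_tr = X[:60], y[:60]
X_cal, y_cal = X[60:], y[60:]

# Ridge fit on the training half: w = (X^T X + lam I)^{-1} X^T y
lam = 1.0  # assumed regularization strength
w = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(d), X_tr.T @ y_tr)

# Calibration residuals -> empirical quantile -> interval half-width
scores = np.sort(np.abs(y_cal - X_cal @ w))
alpha = 0.1  # 90% confidence level
q = scores[int(np.ceil((len(scores) + 1) * (1 - alpha))) - 1]

# Predictive interval for a new project feature vector
x_new = np.zeros(d)
interval = (x_new @ w - q, x_new @ w + q)
```

The design choice here is that the ridge estimator supplies the point prediction while the calibration residuals alone determine the interval width, so the confidence guarantee does not depend on the ridge model being correct.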