In many real-world prediction tasks, class labels include information about
the relative ordering between labels, which is not captured by commonly-used
loss functions such as multi-category cross-entropy. Recently, ordinal
regression frameworks have been adopted by the deep learning community to take
such ordering information into account. One such framework transforms ordinal
targets into a series of binary classification subtasks, equipping neural
networks with ordinal regression capabilities. However, this method suffers
from inconsistencies among the different binary classifiers. We hypothesize
that addressing the inconsistency issue in these binary classification
task-based neural networks improves predictive performance. To test this
hypothesis, we propose the COnsistent RAnk Logits (CORAL) framework with strong
theoretical guarantees for rank-monotonicity and consistent confidence scores.
Moreover, the proposed method is architecture-agnostic and can extend arbitrary
state-of-the-art deep neural network classifiers for ordinal regression tasks.
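The core idea can be illustrated with a small sketch. The function and variable names below (`levels_from_label`, `coral_probas`, `predicted_rank`, `g`, `biases`) are illustrative, not from the paper's code; the sketch assumes the standard extended-binary encoding of an ordinal label and a single shared logit combined with K-1 bias terms, whose ordering yields rank-monotone probabilities:

```python
import numpy as np

def levels_from_label(label, num_classes):
    """Extended binary labels for an ordinal target:
    entry k is 1 iff the true rank exceeds rank k."""
    return (np.arange(num_classes - 1) < label).astype(int)

def coral_probas(g, biases):
    """Binary-task probabilities from one shared logit g
    plus K-1 bias units (sigmoid of g + b_k).
    If the biases are non-increasing, the probabilities
    are non-increasing too, i.e., rank-consistent."""
    return 1.0 / (1.0 + np.exp(-(g + biases)))

def predicted_rank(probas):
    """Predicted rank = number of binary tasks answered 'yes'."""
    return int((probas > 0.5).sum())

# Example with 4 ordinal classes (ranks 0..3), hypothetical values:
labels = levels_from_label(2, num_classes=4)   # -> [1, 1, 0]
probas = coral_probas(g=0.5, biases=np.array([2.0, 0.0, -2.0]))
rank = predicted_rank(probas)                  # -> 2
```

The shared logit is what distinguishes this from training K-1 independent binary classifiers: because every subtask differs only by its bias, the probabilities cannot cross, which is the source of the rank-monotonicity guarantee.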
The empirical evaluation of the proposed rank-consistent method on a range of
face-image datasets for age prediction shows a substantial reduction of the
prediction error compared to the reference ordinal regression network.

Comment: In the previous manuscript version, an issue with the figures caused
certain versions of Adobe Acrobat Reader to crash. This version fixes this
issue.