From an empirical perspective, a subset chosen through active learning cannot be guaranteed to outperform random sampling when transferred to another model. Although this underscores the importance of verifying transferability, the experimental designs of previous works often neglected that the informativeness of a data subset can change across model configurations.
To tackle this issue, we introduce a new experimental design, coined Candidate Proposal, which finds transferable data candidates from which active learning algorithms choose an informative subset. Correspondingly, we propose a data selection algorithm, Transferable candidate proposal with Bounded Uncertainty (TBU), which constrains the pool of transferable data candidates by filtering out presumably redundant data points based on uncertainty estimation.
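For illustration only, the following is a minimal sketch of how such uncertainty-bounded filtering of an unlabeled pool could look; the `predictive_entropy` helper, the quantile thresholds, and the use of averaged stochastic forward passes are hypothetical choices for this sketch and are not taken from the paper.

```python
import numpy as np

def predictive_entropy(probs):
    """Entropy of the mean predicted class distribution.

    probs: array of shape (n_passes, n_samples, n_classes), e.g. softmax
    outputs from several stochastic forward passes or ensemble members.
    """
    mean_probs = probs.mean(axis=0)  # (n_samples, n_classes)
    return -(mean_probs * np.log(mean_probs + 1e-12)).sum(axis=1)

def bounded_uncertainty_candidates(probs, low_q=0.2, high_q=0.9):
    """Return indices of unlabeled samples kept as transferable candidates.

    Samples with very low uncertainty (presumably redundant) and very high
    uncertainty (potentially unreliable) are filtered out; the quantile
    bounds here are placeholders, not values from the paper.
    """
    u = predictive_entropy(probs)
    low, high = np.quantile(u, [low_q, high_q])
    keep = (u > low) & (u < high)
    return np.flatnonzero(keep)

if __name__ == "__main__":
    # Any existing active learning acquisition would then be restricted
    # to this candidate pool instead of the full unlabeled set.
    rng = np.random.default_rng(0)
    logits = rng.normal(size=(5, 1000, 10))  # 5 stochastic passes
    probs = np.exp(logits) / np.exp(logits).sum(-1, keepdims=True)
    candidates = bounded_uncertainty_candidates(probs)
    print(f"{candidates.size} of {probs.shape[1]} samples kept as candidates")
```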
We verified the validity of TBU on image classification benchmarks, including CIFAR-10/100 and SVHN. When transferred to different model configurations, TBU consistently improves the performance of existing active learning algorithms. Our code is available at
https://github.com/gokyeongryeol/TBU.

Comment: Accepted at the NeurIPS 2023 Workshop on Adaptive Experimental Design and Active Learning in the Real World.