Drift Detection using Uncertainty Distribution Divergence

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 License.

By Patrick Lindstrom, Brian Mac Namee and Sarah Jane Delany


Abstract — Concept drift is believed to be prevalent in most data gathered from naturally occurring processes and thus warrants research by the machine learning community. There is a myriad of approaches to handling concept drift, which succeed to varying degrees. However, most approaches make the key assumption that labelled data will be available at no labelling cost shortly after classification, an assumption that is often violated. The high labelling cost in many domains provides a strong motivation to reduce the number of labelled instances required to handle concept drift. Explicit detection approaches that do not require labelled instances to detect concept drift show great promise for achieving this. Our approach, Confidence Distribution Batch Detection (CDBD), provides a signal correlated to changes in concept without using labelled data. We also show how this signal, combined with a trigger and a rebuild policy, can maintain classifier accuracy while using a limited amount of labelled data.

Keywords — concept drift; explicit drift detection; labelling cost; classifier confidence
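The general idea behind such confidence-based detection can be sketched as follows. This is an illustrative reconstruction, not the authors' exact CDBD algorithm: it bins a classifier's confidence scores for a reference batch and for each incoming batch, measures the divergence between the two histograms (Kullback-Leibler divergence is used here as one plausible choice), and triggers a rebuild when the divergence crosses a threshold. The beta-distributed confidences and the threshold value are made-up stand-ins for real classifier outputs.

```python
import numpy as np

def confidence_histogram(confidences, bins=10):
    # Bin confidence scores in [0, 1] into a normalised histogram.
    hist, _ = np.histogram(confidences, bins=bins, range=(0.0, 1.0))
    # Laplace smoothing avoids empty bins, which would make KL undefined.
    hist = hist + 1.0
    return hist / hist.sum()

def kl_divergence(p, q):
    # KL(p || q) between two discrete distributions over the same bins.
    return float(np.sum(p * np.log(p / q)))

def drift_signal(reference_conf, batch_conf, bins=10):
    # Divergence between the new batch's confidence distribution and the
    # reference batch's; larger values suggest the concept has changed.
    p = confidence_histogram(batch_conf, bins)
    q = confidence_histogram(reference_conf, bins)
    return kl_divergence(p, q)

# Synthetic demo: a classifier that is confident on the reference concept,
# then loses confidence after a simulated drift.
rng = np.random.default_rng(0)
reference = rng.beta(8, 2, size=500)  # confident predictions
stable = rng.beta(8, 2, size=500)     # same regime, only sampling noise
drifted = rng.beta(2, 2, size=500)    # confidence collapses after drift

threshold = 0.1  # illustrative trigger level; in practice this is tuned
print("stable batch signal:", drift_signal(reference, stable))
print("drifted batch signal:", drift_signal(reference, drifted))
print("rebuild triggered:", drift_signal(reference, drifted) > threshold)
```

No labels are consulted anywhere in the signal computation; labelled data would only be requested once the trigger fires and the classifier is rebuilt, which is what keeps the labelling cost low.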

Year: 2013
OAI identifier: oai:CiteSeerX.psu:
Provided by: CiteSeerX