
Call for Papers
Journal of Automated Software Engineering
Special Issue: Learning to Organize Testing
EDITORS:


Abstract

At the start of the decade, two publications [1,2] described the state of the art in defect reduction. Since then, there has been considerable research into data mining of defect data; e.g. [3]. That work has become less about defect reduction and more about how to organize a project's test resources in order to improve product quality, by (say) defining a procedure such that the modules most likely to contain defects are inspected first [4].

After a decade of intensive work on data mining to make best use of testing resources, it is time to ask: what have we learned from all that research? Some of that research offers success stories, e.g.:
  • reducing the costs to find defects [4];
  • generalizing defect predictors to other projects [5];
  • tuning those predictors to different business goals [6].
But other research offers the cautions that:
  • defect predictors may not generalize to other projects [7];
  • despite much effort on data mining and defects, most of that work reaches similar conclusions [8];
  • mining defect data is fundamentally less important than discussing the results with the users [9].

The above references sample just a small subset of the research performed this decade on data mining and software defects. We seek papers that document, review, and extend this work. Do the insights from the start of the decade still hold? Has anything extra really been learned in the meanwhile? If we wrote an article "What We Have Learned About Organizing Testing Resources" in 2010, what would we write in such an article that has been verified using publicly available data sets?

For this special issue, we seek papers about progress (or lack of progress) in using data mining to organize test resources in order to (say) fight defects. Papers are required to offer verifiable results; i.e., they must be based on public-domain data sets. Submissions should come with an attached note offering the URL of the data used to reach the paper's conclusions.
A condition of publication for accepted papers is that their data must be transferred to the PROMISE repository.
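The inspection-ordering idea described above (check the most defect-prone modules first) can be sketched as follows. This is a minimal illustration only: the metric names (`loc`, `cyclomatic`, `churn`) and the weights are assumptions for the sake of example, not taken from any of the cited work, which typically learns such scores from historical defect data.

```python
# Hypothetical sketch: rank modules for inspection by an estimated defect-risk
# score. Metrics and weights are illustrative assumptions, not from the paper.

def defect_score(module, weights):
    """Weighted sum of static-code metrics as a crude defect-risk proxy."""
    return sum(w * module.get(metric, 0.0) for metric, w in weights.items())

def inspection_order(modules, weights):
    """Return module names, most defect-prone first."""
    scored = ((m["name"], defect_score(m, weights)) for m in modules)
    return [name for name, _ in sorted(scored, key=lambda p: p[1], reverse=True)]

# Assumed weights; in practice these would be learned from historical data.
weights = {"loc": 0.01, "cyclomatic": 0.3, "churn": 0.5}
modules = [
    {"name": "parser.c", "loc": 1200, "cyclomatic": 45, "churn": 30},
    {"name": "ui.c",     "loc": 300,  "cyclomatic": 8,  "churn": 2},
    {"name": "net.c",    "loc": 800,  "cyclomatic": 20, "churn": 15},
]
print(inspection_order(modules, weights))  # → ['parser.c', 'net.c', 'ui.c']
```

A real defect predictor would replace the fixed weights with a model (e.g. Naive Bayes or logistic regression) trained on past modules labeled with defect counts; the ranking step stays the same.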

Year: 2011
OAI identifier: oai:CiteSeerX.psu:10.1.1.353.531
Provided by: CiteSeerX

