Towards Better Summarizing Bug Reports with Crowdsourcing Elicited Attributes
Recent years have witnessed growing demand for resolving the large volume of
bug reports generated in software maintenance. Aiming to reduce the time
testers and developers spend perusing bug reports, the task of bug report
summarization has attracted substantial research effort. However, no systematic
analysis has been conducted on attribute construction which heavily impacts the
performance of supervised algorithms for bug report summarization. In this
study, we first conduct a survey to reveal the existing methods for attribute
construction in mining software repositories. Then, we propose a new method
named Crowd-Attribute to infer new effective attributes from the crowd-generated
data in crowdsourcing and develop a new tool named Crowdsourcing Software
Engineering Platform to facilitate this method. With Crowd-Attribute, we
successfully construct 11 new attributes and propose a new supervised algorithm
named Logistic Regression with Crowdsourced Attributes (LRCA). To evaluate the
effectiveness of LRCA, we build a series of large-scale data sets with 105,177
bug reports. Experiments over both the public data set SDS with 36 manually
annotated bug reports and new large-scale data sets demonstrate that LRCA can
consistently outperform the state-of-the-art algorithms for bug report
summarization.

Comment: Accepted by IEEE Transactions on Reliability
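The abstract describes LRCA only at a high level: a logistic regression model scores each sentence of a bug report using a vector of crowd-elicited attributes, and the top-scoring sentences form the summary. The following is a minimal sketch of that general idea, not the authors' actual implementation; the attribute names, training data, and helper functions are all invented for illustration.

```python
# Illustrative sketch of the LRCA idea: logistic regression over
# crowd-elicited sentence attributes, used to rank sentences for an
# extractive bug report summary. All data and attributes are hypothetical.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.5, epochs=500):
    """Plain stochastic-gradient logistic regression (no regularization)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi                      # gradient of cross-entropy loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def summarize(sentences, X, w, b, k=2):
    """Return the k sentences with the highest predicted relevance."""
    scored = sorted(
        zip(sentences, X),
        key=lambda sx: sigmoid(sum(wj * xj for wj, xj in zip(w, sx[1])) + b),
        reverse=True,
    )
    return [s for s, _ in scored[:k]]

# Toy training data: each row is one sentence's attribute vector, e.g.
# (contains a stack trace, mentions reproduction steps, normalized length).
X_train = [[1, 1, 0.8], [0, 0, 0.2], [1, 0, 0.5], [0, 1, 0.9], [0, 0, 0.1]]
y_train = [1, 0, 1, 1, 0]  # 1 = annotators kept the sentence in the summary

w, b = train_logistic(X_train, y_train)

report = ["Crash in module X", "Thanks!", "Steps: click Save twice"]
X_report = [[1, 0, 0.6], [0, 0, 0.1], [0, 1, 0.7]]
print(summarize(report, X_report, w, b, k=2))
```

The point of the sketch is only the pipeline shape: attributes in, per-sentence relevance scores out, summary by ranking. The paper's contribution lies in how the 11 attributes are constructed from crowdsourced data, which this toy example does not attempt to reproduce.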