3 research outputs found

    Natural language processing techniques for researching and improving peer feedback

    Peer review has been viewed as a promising solution for improving students' writing, which still remains a great challenge for educators. However, one core problem with peer review of writing is that potentially useful feedback from peers is not always presented in ways that lead to revision. Our prior investigations found that whether students implement feedback is significantly correlated with two feedback features: localization information and concrete solutions. But coding feedback for these features by hand is time-intensive for researchers and instructors. We apply data mining and Natural Language Processing techniques to automatically code reviews for these feedback features. Our results show that it is feasible to provide intelligent support to peer review systems to automatically assess students' reviewing performance with respect to problem localization and solution. We also show that similar research conclusions about helpfulness perceptions of feedback across students and different expert types can be drawn from automatically coded data and from hand-coded data. © Earli
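As a sketch of what "automatically coding reviews for feedback features" could look like, here is a minimal rule-based coder in Python. The cue patterns and the `code_feedback` helper are invented for illustration; the work described above trained data-mining/NLP models rather than using hand-written rules.

```python
import re

# Hypothetical cue patterns for the two feedback features named in the
# abstract: problem localization (where the issue is) and a concrete
# solution (what the writer should do about it).
LOCALIZATION = re.compile(
    r"\b(page|paragraph|section|line|sentence|intro|conclusion)\b", re.I
)
SOLUTION = re.compile(
    r"\b(should|could|try|add|replace|rewrite|consider)\b", re.I
)

def code_feedback(comment: str) -> dict:
    """Return binary codes for the two feedback features."""
    return {
        "localized": bool(LOCALIZATION.search(comment)),
        "solution": bool(SOLUTION.search(comment)),
    }

print(code_feedback("The second paragraph is vague; consider adding an example."))
# → {'localized': True, 'solution': True}
```

A real system would replace the regexes with learned classifiers, but the input/output contract (a comment in, binary feature codes out) is the same.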

    Understanding Differences in Perceived Peer-Review Helpfulness using Natural Language Processing

    Identifying peer-review helpfulness is an important task for improving the quality of feedback received by students, as well as for helping students write better reviews. As we tailor standard product review analysis techniques to our peer-review domain, we notice that peer-review helpfulness differs not only between students and experts but also between types of experts. In this paper, we investigate how different types of perceived helpfulness might influence the utility of features for automatic prediction. Our feature selection results show that certain low-level linguistic features are more useful for predicting student-perceived helpfulness, while high-level cognitive constructs are more effective in modeling experts' perceived helpfulness.

    Helpfulness Guided Review Summarization

    User-generated online reviews are an important information resource in people's everyday life. As review volume grows explosively, the ability to automatically identify and summarize useful information from reviews becomes essential for providing analytic services in many review-based applications. While prior work on review summarization focused on different review perspectives (e.g. topics, opinions, sentiment, etc.), the helpfulness of reviews is an important informativeness indicator that has been less frequently explored. In this thesis, we investigate automatic review helpfulness prediction and exploit review helpfulness for review summarization in distinct review domains. We explore two paths for predicting review helpfulness in a general setting: one is tailoring existing helpfulness prediction techniques to a new review domain; the other is using a general representation of review content that reflects review helpfulness across domains. For the first, we explore educational peer reviews and show how peer-review domain knowledge can be introduced to a helpfulness model developed for product reviews to improve prediction performance. For the second, we characterize review language usage, content diversity and helpfulness-related topics with respect to different content sources using computational linguistic features. For review summarization, we propose to leverage user-provided helpfulness assessment during content selection in two ways: 1) using review-level helpfulness ratings directly to filter out unhelpful reviews, and 2) developing sentence-level helpfulness features via supervised topic modeling for sentence selection. As a demonstration, we implement our methods in an extractive multi-document summarization framework and evaluate them in three user studies. Results show that our helpfulness-guided summarizers outperform the baseline in both human and automated evaluation for camera reviews and movie reviews. For educational peer reviews, by contrast, the preference for helpfulness-guided summaries depends on student writing performance and prior teaching experience.
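The two content-selection ideas above (review-level filtering, then sentence-level selection) can be sketched as follows. The `summarize` function and its length-based sentence scorer are hypothetical stand-ins; the thesis uses supervised topic modeling to score sentences, not sentence length.

```python
def summarize(reviews, helpfulness, min_rating=4, max_sentences=3):
    """Toy helpfulness-guided extractive summarizer.

    reviews: list of review texts; helpfulness: parallel list of ratings.
    Step 1 keeps only reviews rated >= min_rating (review-level filtering).
    Step 2 ranks the surviving sentences with a crude scorer (length)
    standing in for learned sentence-level helpfulness features.
    """
    sentences = []
    for text, rating in zip(reviews, helpfulness):
        if rating >= min_rating:  # review-level helpfulness filter
            sentences += [s.strip() for s in text.split(".") if s.strip()]
    # Stand-in sentence scorer: prefer longer sentences.
    sentences.sort(key=len, reverse=True)
    return ". ".join(sentences[:max_sentences]) + "."

reviews = [
    "Great camera. Battery life is excellent in daily use",
    "Bad",
    "The autofocus struggles in low light conditions",
]
helpfulness = [5, 1, 4]
print(summarize(reviews, helpfulness, min_rating=4, max_sentences=2))
```

The low-rated review ("Bad") never contributes a sentence, so the summary is built only from reviews that readers flagged as helpful.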