
    Profiling users' behavior, and identifying important features of review 'helpfulness'

    The increasing volume of online reviews and the use of review platforms leave traces that can be used to explore interesting patterns. It is in the primary interest of businesses to retain and improve their reputation. Reviewers, on the other hand, tend to write reviews that can influence and attract people's attention, which often leads to deliberate deviations from past rating behavior. To date, few studies have explored the impact of user rating behavior on review helpfulness, and further perspectives of user behavior in selecting and rating businesses still need to be investigated. Moreover, previous studies gave more attention to review features and reported inconsistent findings on their importance. To fill this gap, we introduce new business and reviewer features, modify existing ones, and propose a user-focused mechanism for review selection. This study investigates and reports changes in business reputation, user choice, and rating behavior through descriptive and comparative analysis. Furthermore, the relevance of various features to review helpfulness is identified by correlation, linear regression, and negative binomial regression. The analysis, performed on the Yelp dataset, shows that the reputation of businesses has changed slightly over time. Moreover, 46% of users chose a business with a minimum of 4 stars. The majority of users give 4-star ratings, and 60% of reviewers adopt irregular rating behavior. Our results show a slight improvement from the user rating behavior and choice features, while a significant increase in R² indicates the importance of reviewer popularity and experience features. Overall, the most significant features of review helpfulness are average user helpfulness, number of user reviews, average business helpfulness, and review length. The outcomes of this study provide important theoretical and practical implications for researchers, businesses, and reviewers.
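    Because helpfulness votes are counts, the abstract's use of negative binomial regression (rather than OLS alone) fits the data type. The sketch below illustrates that step with statsmodels; the feature names and the synthetic data are hypothetical stand-ins for the study's features (average user helpfulness, number of user reviews, review length), not the Yelp data itself.

    ```python
    # Hedged sketch: negative binomial regression of helpfulness votes
    # on reviewer/business features, using synthetic stand-in data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    X = np.column_stack([
        rng.uniform(0, 5, n),      # avg_user_helpfulness (hypothetical)
        rng.integers(1, 200, n),   # num_user_reviews (hypothetical)
        rng.integers(20, 800, n),  # review_length in words (hypothetical)
    ])
    # Simulate count-valued helpfulness votes from a log-linear mean.
    mu = np.exp(0.3 * X[:, 0] + 0.002 * X[:, 1] + 0.001 * X[:, 2])
    y = rng.poisson(mu)

    # Count outcome -> GLM with a negative binomial family, not OLS.
    model = sm.GLM(y, sm.add_constant(X),
                   family=sm.families.NegativeBinomial(alpha=1.0))
    result = model.fit()
    print(result.params)  # intercept plus one coefficient per feature
    ```

    On real review data, the sign and significance of each coefficient would indicate whether a feature increases or decreases expected helpfulness votes.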

    Identifying Features and Predicting Consumer Helpfulness of Product Reviews

    Major corporations utilize data from online platforms to make user product or service recommendations. Companies like Netflix, Amazon, Yelp, and Spotify rely on purchasing trends, user reviews, and helpfulness votes to make content recommendations. This strategy can increase user engagement on a company's platform. However, misleading and/or spam reviews significantly hinder the success of these recommendation strategies. The rise of social media has made it increasingly difficult to distinguish between authentic content and advertising, leading to a burst of deceptive reviews across the marketplace. The helpfulness of a review is determined by a subjective voting system. As such, this study aims to predict which product reviews are helpful and to enable strategies for moderating user review posts to improve review quality. The prediction of review helpfulness utilizes NLP methods applied to Amazon product review data. Machine learning models of varying complexity (e.g., Naïve Bayes and BERT) are implemented to compare predictive performance and ease of implementation. This study concludes that review helpfulness can be effectively predicted through the deployment of engineered model features. The removal of duplicate reviews, the imputation of review helpfulness based on word count, and the inclusion of lexical elements are recommended for review analysis. The results indicate that deploying these features yields a high F1-score of 0.83 for predicting helpful Amazon product reviews.
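    The Naïve Bayes baseline mentioned above can be sketched in a few lines with scikit-learn. The toy reviews and labels below are invented for illustration; they are not the study's Amazon data, and the in-sample score is unrelated to its reported 0.83 F1.

    ```python
    # Hedged sketch: TF-IDF features + Multinomial Naive Bayes to
    # classify reviews as helpful/unhelpful, scored with F1.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.metrics import f1_score

    reviews = [
        "Detailed review with pros, cons, and usage notes",
        "Explains battery life and build quality clearly",
        "bad", "meh", "Great!!!", "do not buy",
        "Compares the product against two alternatives in depth",
        "ok I guess",
    ]
    helpful = [1, 1, 0, 0, 0, 0, 1, 0]  # toy labels: 1 = voted helpful

    vec = TfidfVectorizer()
    X = vec.fit_transform(reviews)          # lexical features
    clf = MultinomialNB().fit(X, helpful)

    pred = clf.predict(X)
    print(f1_score(helpful, pred))          # in-sample score on toy data
    ```

    A BERT-based model, as compared in the study, would replace the TF-IDF vectorizer with contextual embeddings at considerably higher implementation and compute cost.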