A Machine Learning and Deep Learning Framework for Binary, Ternary, and Multiclass Emotion Classification of Covid-19 Vaccine-Related Tweets
My research mines public emotion toward the Covid-19 vaccine from Twitter data collected over the past 6-12 months. The project centers on building and developing machine learning and deep learning models to perform natural language processing on short-form text, which in our case is tweets. All tweets are vaccine-related, and the goal of the classification task is for our models to accurately assign each tweet to one of four emotion groups: Apprehension/Anticipation, Sadness/Anger/Frustration, Joy/Humor/Sarcasm, and Gratitude/Relief. Given this data and the goal of the paper, we aim to answer the following questions: (1) Can a framework be developed for machine learning and deep learning multiclass classification models to accurately infer which of the four listed emotion groups a vaccine-related tweet expresses? A follow-up to this question is: can overall model performance be improved by clustering the emotions into a ternary classification problem? (2) Is there a significant binary distinction between tweets that express “negative” emotions (Apprehension, Anticipation, Sadness, Anger, and Frustration) and those that express “positive” emotions (Joy, Humor, Sarcasm, Gratitude, and Relief)? This research presents a framework that takes raw tweet data through a pipeline of data preprocessing, feature extraction, data splitting & sampling, and ultimately emotion classification. Through these questions, the aim is not only to determine the overall acceptance of and sentiment toward the vaccines among the public, but also to understand the steps public health officials can take to further educate hesitant and/or fearful citizens while also incentivizing vaccination.
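The pipeline described above (preprocessing, feature extraction, splitting, classification) can be sketched roughly as follows. This is a minimal illustration, not the paper's actual models or data: the example tweets and labels are invented, and TF-IDF with logistic regression stands in for whichever machine learning and deep learning models the study evaluates.

```python
# Illustrative sketch of a four-class tweet emotion pipeline.
# All tweets and labels below are hypothetical placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

# One invented example per emotion group, duplicated to give the
# classifier something to fit.
tweets = [
    "Nervous about side effects but booked my shot anyway",
    "So angry the rollout keeps getting delayed",
    "Got jabbed today, feeling like a superhero lol",
    "So grateful to the nurses at the vaccination site",
] * 10
labels = [
    "Apprehension/Anticipation",
    "Sadness/Anger/Frustration",
    "Joy/Humor/Sarcasm",
    "Gratitude/Relief",
] * 10

# Data splitting & sampling step: stratified train/test split.
X_train, X_test, y_train, y_test = train_test_split(
    tweets, labels, test_size=0.25, stratify=labels, random_state=0
)

# Feature extraction (TF-IDF) followed by multiclass classification.
clf = Pipeline([
    ("tfidf", TfidfVectorizer(lowercase=True, stop_words="english")),
    ("model", LogisticRegression(max_iter=1000)),
])
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

Collapsing the four groups into the binary "negative" vs. "positive" task from question (2) would only change the label mapping; the rest of the pipeline is unchanged.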
An efficient local binary pattern based plantar pressure optical sensor image classification using convolutional neural networks
The objective of this study was to design and produce highly comfortable shoe products guided by a plantar pressure imaging data-set. Previous studies have focused on geometric measurement of plantar size, whereas this research develops a classification technology based on a plantar pressure optical imaging data-set. In this paper, an improved local binary pattern (LBP) algorithm is used to extract texture-based features and recognize patterns in the data-set. A model for calculating the plantar pressure imaging feature area is then established. The data-set is classified by a neural network to guide the generation of various shoe-last surfaces. First, the local binary pattern is adapted to the pressure imaging data-set, and the texture-based feature calculation is used to accurately generate the feature point set; the plantar pressure imaging feature point set is then used to guide the design of the last's free-surface forming. In the presented plantar imaging experiments, multi-dimensional texture-based features and improved LBP features are extracted by a convolutional neural network (CNN) and compared with a 21-input-3-output two-layer perceptron neural network. Three foot types are investigated in the experiment: flatfoot (F), the lack of a normal arch or arch collapse; Talipes Equinovarus (TE), in which the front part of the foot is adducted, with calcaneus varus, plantar flexion, or Achilles tendon contracture; and Normal (N). This research achieved an 82% accuracy rate with a 10-hidden-layer CNN using the rotation-invariant LBP (RI-LBP) algorithm and 21 texture-based features, compared against other deep learning methods presented in the literature.
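The basic local binary pattern the study builds on can be sketched as below. This is the standard 8-neighbour 3x3 LBP, not the paper's improved rotation-invariant variant, and the random test image and histogram featurization are illustrative assumptions.

```python
# Minimal sketch of the standard 3x3 local binary pattern transform.
import numpy as np

def lbp_3x3(image: np.ndarray) -> np.ndarray:
    """Compute the 8-neighbour LBP code for each interior pixel.

    Each neighbour contributes one bit, set when that neighbour's
    intensity is >= the centre pixel's intensity.
    """
    h, w = image.shape
    center = image[1:-1, 1:-1]
    # Neighbour offsets, clockwise from the top-left corner.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = image[1 + dy : h - 1 + dy, 1 + dx : w - 1 + dx]
        codes |= (neighbour >= center).astype(np.uint8) << bit
    return codes

# A histogram of LBP codes over an image (or feature region) could then
# serve as the texture feature vector fed to a classifier; the paper
# itself uses 21 texture-based features.
img = np.random.default_rng(0).integers(0, 256, (8, 8), dtype=np.uint8)
hist, _ = np.histogram(lbp_3x3(img), bins=np.arange(257))
```

The rotation-invariant variant (RI-LBP) additionally maps each 8-bit code to the minimum value over its circular bit rotations, so that rotating the texture does not change the descriptor.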