13 research outputs found

    Interpretable aesthetic features for affective image classification

    Images can not only display content but also convey emotions, e.g., excitement or sadness. Affective image classification is useful and an active topic in fields such as computer vision and multimedia. Current research usually treats the relationship between images and emotions as a black box: it extracts traditional discursive visual features such as SIFT and wavelet textures and feeds them directly into various classification algorithms. However, these visual features are not interpretable, and people cannot know why a given set of features induces a particular emotion. Moreover, due to the highly subjective nature of images, classification accuracy on these visual features has long been unsatisfactory. We propose interpretable aesthetic features, inspired by art theories, to describe images; they are intuitive, discriminative, and easily understandable. Affective image classification based on these features achieves higher accuracy than the state of the art. Specifically, the features can also intuitively explain why an image tends to convey a certain emotion. We also develop an emotion-guided image gallery to demonstrate the proposed feature collection.
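The paper above does not publish its feature set in this abstract, but a minimal sketch of what "interpretable aesthetic features" can look like in practice is easy to give. The three features below (brightness, HSV-style saturation, warm-color ratio) are illustrative assumptions, not the authors' actual feature collection:

```python
import numpy as np

def aesthetic_features(rgb):
    """A few simple, human-interpretable aesthetic features for an
    H x W x 3 RGB image with values in [0, 1]. Illustrative only."""
    r, b = rgb[..., 0], rgb[..., 2]
    brightness = float(rgb.mean())                     # overall lightness
    # Saturation as in HSV: (max - min) / max, guarding against black pixels.
    mx, mn = rgb.max(axis=-1), rgb.min(axis=-1)
    saturation = float(np.where(mx > 0,
                                (mx - mn) / np.maximum(mx, 1e-8),
                                0.0).mean())
    warm_ratio = float((r > b).mean())                 # share of warm-toned pixels
    return {"brightness": brightness,
            "saturation": saturation,
            "warm_ratio": warm_ratio}

# A uniform mid-grey image: medium brightness, no saturation, no warm bias.
grey = np.full((8, 8, 3), 0.5)
feats = aesthetic_features(grey)
```

Each value has an obvious visual meaning, which is the property the paper argues matters for explaining why an image evokes a given emotion.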

    Modeling Emotion Influence from Images in Social Networks

    Images have become an important and prevalent way for users to express their activities, opinions, and emotions. In a social network, individual emotions may be influenced by others, in particular by close friends. We focus on understanding how users embed emotions into the images they upload to social websites and how social influence plays a role in changing users' emotions. We first verify the existence of emotion influence in image networks, and then propose a probabilistic factor-graph-based emotion influence model to answer the question of "who influences whom". Employing a real network from Flickr as experimental data, we study the effectiveness of the factors in the proposed model with in-depth data analysis. Our experiments also show that our model, by incorporating emotion influence, can significantly improve the accuracy (+5%) of predicting emotions from images. Finally, a case study provides anecdotal evidence that further demonstrates the effectiveness of the proposed model.
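The full factor-graph model is beyond an abstract, but the core "who influences whom" question can be sketched with a much simpler stand-in: regress a user's image emotion on their own previous emotion and on their friends' recent emotions, and read the friend coefficient as the influence signal. All variable names and the data-generating rule below are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy stand-in for emotion influence: a user's next image emotion depends on
# their own previous emotion and on the share of friends posting positively.
rng = np.random.default_rng(3)
n = 500
own_prev = rng.integers(2, size=n)        # 1 = positive, 0 = negative
friend_mean = rng.uniform(size=n)         # fraction of friends posting positive
logits = 1.5 * own_prev + 2.0 * friend_mean - 1.5
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(np.column_stack([own_prev, friend_mean]), y)
friend_influence = float(model.coef_[0, 1])  # positive => friends matter
```

A factor graph generalizes this by tying many such pairwise factors together over the whole network rather than fitting each user independently.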

    High-Level Concepts for Affective Understanding of Images

    This paper aims to bridge the affective gap between image content and the emotional response it elicits in the viewer by using High-Level Concepts (HLCs). In contrast to previous work that relied solely on low-level features or used convolutional neural networks (CNNs) as a black box, we use HLCs generated by pretrained CNNs in an explicit way to investigate the associations between these HLCs and a (small) set of Ekman's emotion classes. As a proof of concept, we first propose a linear admixture model for these relations, and the resulting computational framework allows us to determine the associations between each emotion class and certain HLCs (objects and places). This linear model is further extended to a nonlinear model using support vector regression (SVR) that predicts the viewer's emotional response from both low-level image features and HLCs extracted from images. These class-specific regressors are then assembled into a regressor ensemble that provides a flexible and effective predictor of viewers' emotional responses to images. Experimental results demonstrate that our results are comparable to existing methods, with a clear view of the association between HLCs and emotion classes that is largely missing from existing work.
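The class-specific SVR ensemble described above can be sketched in a few lines: one regressor per emotion class, each trained on the concatenation of low-level features and HLC scores. The feature dimensions, emotion labels, and toy targets below are assumptions for illustration, not the paper's setup:

```python
import numpy as np
from sklearn.svm import SVR

# One SVR per emotion class; each sees low-level features + HLC scores.
rng = np.random.default_rng(0)
n, n_lowlevel, n_hlc = 200, 10, 5
X = np.hstack([rng.normal(size=(n, n_lowlevel)),   # low-level image features
               rng.uniform(size=(n, n_hlc))])      # HLC detector scores in [0, 1]
emotions = ["happiness", "sadness", "fear"]
# Toy targets: each emotion responds mainly to a different feature column.
Y = {e: X[:, i] + 0.1 * rng.normal(size=n) for i, e in enumerate(emotions)}

ensemble = {e: SVR(kernel="rbf").fit(X, Y[e]) for e in emotions}

def predict_responses(x):
    """Return the per-class predicted response strengths for one image."""
    return {e: float(m.predict(x[None, :])[0]) for e, m in ensemble.items()}

pred = predict_responses(X[0])
```

Because each regressor is class-specific, the learned weights can be inspected per emotion, which is what gives the HLC approach its interpretability over a black-box CNN.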

    Can Title Images Predict the Emotions and the Performance of Crowdfunding Projects?

    Crowdfunding is a novel way to raise funds from individuals. However, on Kickstarter, for example, more than 60% of projects fail to reach their funding targets, so it is imperative to study how to improve project success. From a design perspective, we investigate whether the characteristics of project title images on the search page of a crowdfunding website can predict the performance of crowdfunding projects. We use objective standards to measure the aesthetic features of the title images, and we introduce emotions as important antecedents of project performance, using deep learning to extract emotion metrics from the title images. The analysis provides significant evidence that aesthetic attributes of images can predict the emotions in them, and that emotions such as sadness and contentment can predict the performance of crowdfunding projects. Our results offer both theoretical and practical value.
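The abstract's two claims (aesthetics predict emotion; emotion predicts funding performance) suggest a two-stage analysis. The sketch below uses synthetic data and arbitrary coefficients purely to show the shape of such a pipeline; it is not the study's actual model:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

# Stage 1: aesthetic attributes -> emotion score (e.g. contentment).
# Stage 2: emotion score -> funded / not funded.
rng = np.random.default_rng(4)
n = 400
aesthetics = rng.normal(size=(n, 3))  # e.g. brightness, contrast, warmth
contentment = aesthetics @ np.array([0.8, 0.3, -0.2]) + 0.1 * rng.normal(size=n)
funded = (contentment + 0.5 * rng.normal(size=n) > 0).astype(int)

stage1 = LinearRegression().fit(aesthetics, contentment)
stage2 = LogisticRegression().fit(contentment[:, None], funded)
r2 = stage1.score(aesthetics, contentment)     # fit of aesthetics -> emotion
acc = stage2.score(contentment[:, None], funded)  # fit of emotion -> success
```

Fitting the two stages separately, rather than regressing funding on aesthetics directly, is what lets emotion be framed as a mediating antecedent of project performance.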

    Evaluation and Prediction of Evoked Emotions Induced by Image Manipulations

    Various image editing tools make our pictures more attractive and, at the same time, evoke different emotional responses. With powerful and easy-to-use imaging applications, capturing, editing, and then sharing pictures has become daily life for many. This paper investigates the influence of several image manipulations on the emotions evoked by different types of images. To do so, images clustered into different categories were collected from Instagram, and subjective evaluations were conducted via crowdsourcing to gather the emotional responses to different manipulations as perceived by subjects. Evaluation results show that certain image manipulations can induce different evoked emotions in transformed pictures compared to the originals; however, such changes in image emotions are highly content dependent. We then conducted a machine-learning experiment in an attempt to predict the emotions of a manipulated image given its original version and the desired manipulation method. Experimental results show promising performance for such a prediction model, which could pave the road to automatic selection or recommendation of image editing tools that efficiently transform or emphasize desired emotions in pictures.
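The prediction task above takes two inputs: features of the original image and an encoding of the intended manipulation. A minimal sketch of that input layout, with invented manipulation names, feature dimensions, and a toy labeling rule, might look like:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Input = original-image features concatenated with a one-hot manipulation id;
# output = evoked-emotion label of the manipulated result.
rng = np.random.default_rng(1)
manipulations = ["saturate", "desaturate", "vignette"]  # illustrative names
n, n_feat = 300, 8
orig_feats = rng.normal(size=(n, n_feat))
manip_idx = rng.integers(len(manipulations), size=n)
one_hot = np.eye(len(manipulations))[manip_idx]
X = np.hstack([orig_feats, one_hot])
# Toy rule: saturating a bright image reads as "positive", otherwise "neutral".
y = np.where((manip_idx == 0) & (orig_feats[:, 0] > 0), "positive", "neutral")

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
sample = np.hstack([np.full(n_feat, 1.0), np.eye(len(manipulations))[0]])
label = clf.predict(sample[None, :])[0]
```

Conditioning on both the source image and the manipulation is what captures the paper's finding that the emotional effect of an edit is content dependent.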

    The impact of social media visual features on acceptance of meat substitute

    There is a growing demand for meat substitutes among consumers, given that excessive meat consumption is associated with negative consequences for personal health and the environment. However, the market shares of such meat substitutes remain low, highlighting the need to further investigate how to increase consumer acceptance of meat substitutes. The present research investigates social media data of plant-based meat brands and explores how visual features could lead to a high number of likes, a numerical representation of social acceptance. The findings show that social media posts with warm color, vertical symmetry, and horizontal symmetry receive a higher number of likes. Further, there is a joint effect between warm color and vertical symmetry, such that vertical symmetry strengthens the positive effect of warm color on the number of likes. These findings offer a more nuanced understanding of how to increase consumer acceptance of meat substitutes and how to promote plant-based meat brands on social media.
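The three visual cues the study relies on (warm color, vertical symmetry, horizontal symmetry) are all simple to operationalize. The definitions below are one plausible operationalization, not necessarily the one the authors used:

```python
import numpy as np

def visual_cues(rgb):
    """Warm-color ratio plus vertical/horizontal symmetry scores for an
    H x W x 3 RGB image in [0, 1]; higher symmetry score = more symmetric."""
    warm = float((rgb[..., 0] > rgb[..., 2]).mean())        # red dominates blue
    v_sym = 1.0 - float(np.abs(rgb - rgb[:, ::-1]).mean())  # mirror left-right
    h_sym = 1.0 - float(np.abs(rgb - rgb[::-1]).mean())     # mirror top-bottom
    return warm, v_sym, h_sym

# An image built by mirroring its left half is perfectly vertically symmetric.
half = np.random.default_rng(2).uniform(size=(4, 3, 3))
img = np.concatenate([half, half[:, ::-1]], axis=1)
warm, v_sym, h_sym = visual_cues(img)
```

Scores like these can then enter a likes-count regression directly, including the warm-color x vertical-symmetry interaction term the study reports.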

    Affective Image Content Analysis: Two Decades Review and New Perspectives

    Images can convey rich semantics and induce various emotions in viewers. Recently, with the rapid advancement of emotional intelligence and the explosive growth of visual data, extensive research efforts have been dedicated to affective image content analysis (AICA). In this survey, we comprehensively review the development of AICA over the past two decades, focusing especially on state-of-the-art methods with respect to three main challenges: the affective gap, perception subjectivity, and label noise and absence. We begin with an introduction to the key emotion representation models widely employed in AICA and a description of the available evaluation datasets, with a quantitative comparison of label noise and dataset bias. We then summarize and compare the representative approaches to (1) emotion feature extraction, including both handcrafted and deep features; (2) learning methods for dominant emotion recognition, personalized emotion prediction, emotion distribution learning, and learning from noisy data or few labels; and (3) AICA-based applications. Finally, we discuss remaining challenges and promising future research directions, such as image content and context understanding, group emotion clustering, and viewer-image interaction. (Accepted by IEEE TPAMI.)
