15,085 research outputs found
Multimodal Content Analysis for Effective Advertisements on YouTube
The rapid advances in e-commerce and Web 2.0 technologies have greatly
increased the impact of commercial advertisements on the general public. As a
key enabling technology, a multitude of recommender systems exist that
analyze user features and browsing patterns to recommend appealing
advertisements to users. In this work, we study the attributes that
characterize an effective advertisement and recommend a useful
set of features to aid the designing and production processes of commercial
advertisements. We analyze the temporal patterns from multimedia content of
advertisement videos including auditory, visual and textual components, and
study their individual roles and synergies in the success of an advertisement.
The objective of this work is then to measure the effectiveness of an
advertisement, and to recommend a useful set of features to advertisement
designers to make it more successful and approachable to users. Our proposed
framework employs the signal-processing technique of cross-modality feature
learning where data streams from different components are employed to train
separate neural network models and are then fused together to learn a shared
representation. Subsequently, a neural network model trained on this joint
feature embedding representation is utilized as a classifier to predict
advertisement effectiveness. We validate our approach using subjective ratings
from a dedicated user study, the sentiment strength of online viewer comments,
and a viewer-opinion metric, the ratio of Likes to Views received by
each advertisement on an online platform. Comment: 11 pages, 5 figures, ICDM 201
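The fusion pipeline this abstract describes — separate per-modality networks whose outputs are combined into a shared embedding that feeds an effectiveness classifier — can be sketched roughly as follows. This is a toy numpy illustration, not the authors' implementation: the encoder architecture, feature dimensions, and synthetic labels are all invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_encoder(dim_in, dim_out):
    """A one-layer tanh map standing in for a per-modality neural network."""
    W = rng.normal(scale=0.1, size=(dim_in, dim_out))
    return lambda X: np.tanh(X @ W)

# Separate "encoders" for the three modalities in the abstract (dimensions invented).
enc_audio = make_encoder(40, 8)    # e.g. spectral features
enc_visual = make_encoder(128, 8)  # e.g. frame-level visual features
enc_text = make_encoder(300, 8)    # e.g. word-embedding features

def joint_embedding(Xa, Xv, Xt):
    """Fuse the modality-specific representations into one shared embedding."""
    return np.concatenate([enc_audio(Xa), enc_visual(Xv), enc_text(Xt)], axis=1)

# Toy dataset: 200 ads with a binary "effective" label (synthetic for the sketch).
n = 200
Xa = rng.normal(size=(n, 40))
Xv = rng.normal(size=(n, 128))
Xt = rng.normal(size=(n, 300))
Z = joint_embedding(Xa, Xv, Xt)          # shape (200, 24)
y = (Z.sum(axis=1) > 0).astype(float)    # synthetic effectiveness labels

# Logistic-regression classifier on the joint embedding, fit by gradient descent.
w, b = np.zeros(Z.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(Z @ w + b)))
    w -= 1.0 * (Z.T @ (p - y)) / n
    b -= 1.0 * (p - y).mean()

p = 1.0 / (1.0 + np.exp(-(Z @ w + b)))
acc = ((p > 0.5) == y).mean()
print(f"training accuracy on the joint embedding: {acc:.2f}")
```

In the paper the classifier is itself a neural network trained on the joint representation; the logistic layer here simply keeps the sketch minimal.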
Opinion modeling on social media and marketing aspects
We introduce and discuss kinetic models of opinion formation on social
networks in which the distribution function depends on both the opinion and the
connectivity of the agents. The opinion formation model is subsequently coupled
with a kinetic model describing the spreading of popularity of a product on the
web through a social network. Numerical experiments on the underlying kinetic
models show a good qualitative agreement with some measured trends of hashtags
on social media websites, and illustrate how companies can exploit the
network structure to best advertise their products.
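A minimal Monte Carlo sketch of the kind of connectivity-dependent kinetic dynamics described above: agents carry an opinion and a connectivity, pairs are sampled with probability proportional to connectivity, and each encounter applies a symmetric compromise rule. The specific interaction rule, the Pareto connectivity distribution, and all parameters here are illustrative assumptions, not the model from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# N agents: an opinion in [-1, 1] and a connectivity c (e.g. follower count),
# drawn from a heavy-tailed distribution as is typical of social networks.
N = 500
w = rng.uniform(-1.0, 1.0, size=N)
c = rng.pareto(2.0, size=N) + 1.0

gamma = 0.25  # compromise strength (assumed)

# Binary-interaction Monte Carlo: well-connected agents interact more often,
# and each encounter moves both opinions toward each other, driving the
# population toward consensus while conserving the mean opinion.
prob = c / c.sum()
for _ in range(10 * N):
    i, j = rng.choice(N, size=2, replace=False, p=prob)
    wi, wj = w[i], w[j]
    w[i] = wi + gamma * (wj - wi)
    w[j] = wj + gamma * (wi - wj)

print(f"opinion variance after interactions: {w.var():.4f}")
```

The compromise update keeps every opinion a convex combination of earlier opinions, so the distribution stays in [-1, 1] while its variance contracts — the qualitative consensus trend the kinetic models capture.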
Is adaptation of e-advertising the way forward?
E-advertising is a multi-billion dollar industry that has shown exponential growth in the last few years. However, although the number of users accessing the Internet keeps increasing, users do not respond positively to adverts. Adaptive e-advertising may be the key to ensuring that advertisements effectively reach their targets. Moreover, social networks are good sources of user information and can be used to extract user behaviour and characteristics for the presentation of personalized advertising. Here we present a two-sided study based on two questionnaires, one directed to Internet users and the other to businesses. Our study shows that businesses agree that personalized advertising is the best way forward to maximize effectiveness and profit. In addition, our results indicate that most Internet users would prefer adaptive advertisements. From this study, we propose a new design for a system that meets both Internet users' and businesses' requirements.
Evaluating Content-centric vs User-centric Ad Affect Recognition
Despite the fact that advertisements (ads) often include strongly emotional
content, very little work has been devoted to affect recognition (AR) from ads.
This work explicitly compares content-centric and user-centric ad AR
methodologies, and evaluates the impact of enhanced AR on computational
advertising via a user study. Specifically, we (1) compile an affective ad
dataset capable of evoking coherent emotions across users; (2) explore the
efficacy of content-centric convolutional neural network (CNN) features for
encoding emotions, and show that CNN features outperform low-level emotion
descriptors; (3) examine user-centric ad AR by analyzing electroencephalogram
(EEG) responses acquired from eleven viewers, and find that EEG signals encode
emotional information better than content descriptors; (4) investigate the
relationship between objective AR and subjective viewer experience while
watching an ad-embedded online video stream based on a study involving 12
users. To our knowledge, this is the first work to (a) expressly compare
user-centric vs content-centric AR for ads, and (b) study the relationship
between the modeling of ad emotions and its impact on a real-life advertising
application. Comment: Accepted at the ACM International Conference on Multimodal Interaction
(ICMI) 201
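As a rough illustration of the user-centric side, EEG responses are commonly summarized by per-channel band-power features before any emotion classifier is applied. The sketch below is an assumption-laden illustration, not the study's actual pipeline: the sampling rate, channel count, band definitions, and synthetic signal are all invented.

```python
import numpy as np

rng = np.random.default_rng(2)

FS = 128  # sampling rate in Hz (assumed)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # typical EEG bands

def bandpower_features(eeg, fs=FS):
    """Average spectral power per channel in each frequency band.

    eeg: array of shape (channels, samples).
    Returns a flat vector of length channels * len(BANDS).
    """
    freqs = np.fft.rfftfreq(eeg.shape[1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg, axis=1)) ** 2 / eeg.shape[1]
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))
    return np.concatenate(feats)

# Synthetic 14-channel, 4-second "response" with a strong 10 Hz (alpha) component.
t = np.arange(4 * FS) / FS
eeg = 0.2 * rng.normal(size=(14, t.size)) + np.sin(2 * np.pi * 10 * t)

x = bandpower_features(eeg)
print(f"{x.size} band-power features per ad response")
```

Feature vectors of this kind, one per viewer per ad, would then feed whatever AR classifier is being compared against the content-centric CNN features.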
- …