VirtualIdentity : privacy preserving user profiling
User profiling from user-generated content (UGC) is a common practice that supports the business models of many social media companies. Existing systems require that the UGC is fully exposed to the module that constructs the user profiles. In this paper we show that it is possible to build user profiles without ever accessing the user's original data, and without exposing the trained machine learning models for user profiling - which are the intellectual property of the company - to the users of the social media site. We present VirtualIdentity, an application that uses secure multi-party cryptographic protocols to detect the age, gender and personality traits of users by classifying their user-generated text and personal pictures with trained support vector machine models in a privacy-preserving manner.
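The core idea of scoring a model without either party revealing its inputs can be illustrated with additive secret sharing. The following is a minimal sketch, not the paper's actual protocol: the weight vector `w`, the feature vector `x`, and the two-party split are illustrative assumptions, and a real protocol would also keep the model weights hidden.

```python
# Toy additive secret sharing of a linear SVM score. Hedged sketch of the
# general idea behind privacy-preserving classification; `w`, `x`, and the
# two-party setup are hypothetical, and real protocols also hide the weights.
import random

PRIME = 2**31 - 1  # all arithmetic is done modulo a public prime

def share(value):
    """Split an integer into two additive shares mod PRIME."""
    r = random.randrange(PRIME)
    return r, (value - r) % PRIME

def reconstruct(s1, s2):
    """Recombine two additive shares into the original value."""
    return (s1 + s2) % PRIME

# User's (integer-scaled) feature vector, never revealed in the clear.
x = [3, 1, 4]
# Company's linear SVM weight vector.
w = [2, 5, 1]

# Each party holds one share of every feature value.
shares = [share(v) for v in x]
party1 = [s[0] for s in shares]
party2 = [s[1] for s in shares]

# Each party computes a partial dot product on its shares alone...
d1 = sum(wi * si for wi, si in zip(w, party1)) % PRIME
d2 = sum(wi * si for wi, si in zip(w, party2)) % PRIME

# ...and only combining the two partial results reveals the decision value w.x.
score = reconstruct(d1, d2)
print(score)  # 3*2 + 1*5 + 4*1 = 15
```

Because the shares of each feature are uniformly random on their own, neither party learns `x` from its half, yet the reconstructed score equals the plain dot product.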
Visual BFI: an Exploratory Study for Image-based Personality Test
This paper positions and explores the topic of image-based personality tests.
Instead of responding to text-based questions, the subjects will be provided a
set of "choose-your-favorite-image" visual questions. With the image options of
each question belonging to the same concept, the subjects' personality traits
are estimated by observing their preferences of images under several unique
concepts. The solution to design such an image-based personality test consists
of concept-question identification and image-option selection. We have
presented a preliminary framework to regularize these two steps in this
exploratory study. A demo version of the designed image-based personality test
is available at http://www.visualbfi.org/. Subjective as well as objective
evaluations have demonstrated the feasibility of an image-based personality
test with a limited number of questions.
Personality in Computational Advertising: A Benchmark
In the last decade, new ways of shopping online have increased the
possibility of buying products and services more easily and faster
than ever. In this new context, personality is a key determinant
in the decision making of the consumer when shopping. A personâs
buying choices are influenced by psychological factors like
impulsiveness; indeed some consumers may be more susceptible
to making impulse purchases than others. Since affective metadata
are more closely related to the userâs experience than generic
parameters, accurate predictions reveal important aspects of userâs
attitudes, social life, including attitude of others and social identity.
This work proposes a highly innovative research that uses a personality
perspective to determine the unique associations among the
consumerâs buying tendency and advert recommendations. In fact,
the lack of a publicly available benchmark for computational advertising
do not allow both the exploration of this intriguing research
direction and the evaluation of recent algorithms. We present the
ADS Dataset, a publicly available benchmark consisting of 300 real
advertisements (i.e., Rich Media Ads, Image Ads, Text Ads) rated
by 120 unacquainted individuals, enriched with Big-Five usersâ
personality factors and 1,200 personal usersâ pictures
SALSA: A Novel Dataset for Multimodal Group Behavior Analysis
Studying free-standing conversational groups (FCGs) in unstructured social
settings (e.g., cocktail party ) is gratifying due to the wealth of information
available at the group (mining social networks) and individual (recognizing
native behavioral and personality traits) levels. However, analyzing social
scenes involving FCGs is also highly challenging due to the difficulty in
extracting behavioral cues such as target locations, their speaking activity
and head/body pose due to crowdedness and presence of extreme occlusions. To
this end, we propose SALSA, a novel dataset facilitating multimodal and
Synergetic sociAL Scene Analysis, and make two main contributions to research
on automated social interaction analysis: (1) SALSA records social interactions
among 18 participants in a natural, indoor environment for over 60 minutes,
under the poster presentation and cocktail party contexts presenting
difficulties in the form of low-resolution images, lighting variations,
numerous occlusions, reverberations and interfering sound sources; (2) To
alleviate these problems we facilitate multimodal analysis by recording the
social interplay using four static surveillance cameras and sociometric badges
worn by each participant, comprising the microphone, accelerometer, bluetooth
and infrared sensors. In addition to raw data, we also provide annotations
concerning individuals' personality as well as their position, head, body
orientation and F-formation information over the entire event duration. Through
extensive experiments with state-of-the-art approaches, we show (a) the
limitations of current methods and (b) how the recorded multiple cues
synergetically aid automatic analysis of social interactions. SALSA is
available at http://tev.fbk.eu/salsa. (14 pages, 11 figures)
Development and Validation of an Attitudinal-Profiling Tool for Patients With Asthma
This study was supported and funded by Mundipharma Pte Ltd. Online survey and statistical analysis were performed by Pei-Li Teh, Rachel Howard, Tsin-Li Chua and Jie Sun of Research Partnership Pte Ltd. Medical writing support was provided by Sen-Kwan Tay of Research2Trials Clinical Solutions Pte Ltd. The authors received honoraria from Mundipharma for their participation in the REALISE Asia Working Group meetings and discussions. Prof Price has Board membership with Mundipharma; and had received consultancy and speaker fees, grants and unrestricted funding support from Mundipharma; and payment for manuscript preparation and travel/accommodations/meeting expenses from Mundipharma. Profs Liam and David-Wang are members of the Asia-Pacific Advisory Board of Mundipharma. Profs Cho and David-Wang had received speaker fees from Mundipharma in the past. Dr Neira was an employee of Mundipharma Pte Ltd, Singapore. Ms Teh is an employee of Research Partnership Pte Ltd which conducted the REALISE Asia survey for Mundipharma. Prof Cho is a member of the Editorial Board of Allergy, Asthma & Immunology.
Effects of Team-Based Computer Interaction: The Media Equation and Game Design Considerations
The current paper applies media equation research to video game design. The paper presents a review of the existing media equation research, describes a specific study conducted by the authors, discusses how the findings of the study can be used to inform future game design, and explores how other media equation findings might be incorporated into game design. The specific study, discussed in detail in the paper, explores the notion of team formation between humans and computer teammates. The results show that while highly experienced users will accept a computer as a teammate, they tend to react more negatively towards the computer than towards human teammates (a "Black Sheep" effect).
First impressions: A survey on vision-based apparent personality trait analysis
© 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. Personality analysis has been widely studied in psychology, neuropsychology, and signal processing fields, among others. Over the past few years, it has also become an attractive research area in visual computing. From the computational point of view, by far speech and text have been the most considered cues of information for analyzing personality. However, recently there has been an increasing interest from the computer vision community in analyzing personality from visual data. Recent computer vision approaches are able to accurately analyze human faces, body postures and behaviors, and use this information to infer apparent personality traits. Because of the overwhelming research interest in this topic, and of the potential impact that this sort of method could have on society, we present in this paper an up-to-date review of existing vision-based approaches for apparent personality trait recognition. We describe seminal and cutting-edge works on the subject, discussing and comparing their distinctive features and limitations. Future avenues of research in the field are identified and discussed. Furthermore, aspects of subjectivity in data labeling/evaluation, as well as current datasets and challenges organized to push the research on the field, are reviewed.
A Mixed-Effects Location Scale Model for Dyadic Interactions.
We present a mixed-effects location scale model (MELSM) for examining the daily dynamics of affect in dyads. The MELSM includes person and time-varying variables to predict the location, or individual means, and the scale, or within-person variances. It also incorporates a submodel to account for between-person variances. The dyadic specification can accommodate individual and partner effects in both the location and the scale components, and allows random effects for all location and scale parameters. All covariances among the random effects, within and across the location and the scale, are also estimated. These covariances offer new insights into the interplay of individual mean structures, intra-individual variability, and the influence of partner effects on such factors. To illustrate the model, we use data from 274 couples who provided daily ratings on their positive and negative emotions toward their relationship for up to 90 consecutive days. The model is fit using Hamiltonian Monte Carlo methods and includes subsets of predictors in order to demonstrate the flexibility of this approach. We conclude with a discussion of the usefulness and the limitations of the MELSM for dyadic research.
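The defining feature of a MELSM, random effects on both the person-specific mean (location) and the person-specific within-person variance (scale), can be illustrated with a small simulation. This is a hedged sketch of the model's generative structure, not the fitted model from the study: all parameter values below are illustrative assumptions.

```python
# Minimal simulation of one partner's side of a mixed-effects location scale
# model (MELSM): each person gets a random mean (location effect) and a
# random log within-person SD (scale effect). Parameter values are
# illustrative assumptions, not estimates from the 274-couple study.
import numpy as np

rng = np.random.default_rng(0)
n_persons, n_days = 274, 90

# Between-person random effects.
u0 = rng.normal(0.0, 0.5, n_persons)   # location: shifts the person's mean
v0 = rng.normal(0.0, 0.3, n_persons)   # scale: shifts the person's log-SD

mu = 3.0 + u0                          # person-specific daily mean affect
sigma = np.exp(0.2 + v0)               # person-specific daily SD (always > 0)

# Daily affect ratings: each person's days use their own mean and SD.
y = rng.normal(mu[:, None], sigma[:, None], (n_persons, n_days))

# Persons with a larger scale effect v0 should show larger empirical
# within-person variability in their simulated diary data.
emp_sd = y.std(axis=1)
corr = np.corrcoef(v0, emp_sd)[0, 1]
print(round(corr, 2))
```

The log link on the scale side is what keeps each person's variance positive while still allowing normally distributed random effects; covariates and partner effects would enter the `mu` and `log(sigma)` equations as additional terms.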