4 research outputs found

    Towards the design of a platform for abuse detection in OSNs using multimedial data analysis

    Online social networks (OSNs) are becoming increasingly popular every day. The vast amount of data created by users and their actions yields interesting opportunities, both socially and economically. Unfortunately, these online communities are prone to abuse and inappropriate behaviour such as cyberbullying. For victims, this kind of behaviour can lead to depression and other severe problems. However, due to the huge number of users and the volume of data, it is impossible to manually check all content posted on the social network. We propose a pluggable architecture with reusable components that is able to quickly detect harmful content. The platform uses text-, image-, audio- and video-based analysis modules to detect inappropriate content or high-risk behaviour. Domain services aggregate this data and flag user profiles if necessary. Social network moderators then only need to check the validity of the flagged profiles. This paper reports on the key requirements of the platform, the architectural components, and the important challenges.
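    The abstract describes a pluggable design in which per-media analysis modules score content and a domain service aggregates the scores to flag profiles. The sketch below illustrates that structure only; all class names, the scoring heuristic, and the flagging threshold are hypothetical and not taken from the paper.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class ContentItem:
    profile_id: str
    kind: str      # "text", "image", "audio" or "video"
    payload: bytes

class AnalysisModule(ABC):
    """A reusable, pluggable analyser that scores one kind of content."""
    kind: str

    @abstractmethod
    def score(self, item: ContentItem) -> float:
        """Return a risk score in [0, 1] for a single content item."""

class TextAbuseModule(AnalysisModule):
    kind = "text"

    def score(self, item: ContentItem) -> float:
        text = item.payload.decode("utf-8", errors="ignore").lower()
        # Placeholder heuristic; a real module would use a trained classifier.
        return 1.0 if "abusive phrase" in text else 0.0

class DomainService:
    """Aggregates module scores per profile and flags profiles for moderators."""

    def __init__(self, modules: list[AnalysisModule], threshold: float = 0.8):
        self.modules = {m.kind: m for m in modules}
        self.threshold = threshold

    def flag_profiles(self, items: list[ContentItem]) -> set[str]:
        flagged = set()
        for item in items:
            module = self.modules.get(item.kind)
            if module and module.score(item) >= self.threshold:
                flagged.add(item.profile_id)
        return flagged
```

    New media types would plug in by adding another AnalysisModule subclass, which is the reusability property the abstract emphasises.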

    Characterizing and Detecting State-Sponsored Troll Activity on Social Media

    The detection of state-sponsored trolls acting in information operations is an unsolved and critical challenge for the research community, with repercussions that go beyond the online realm. In this paper, we propose a novel AI-based solution for the detection of state-sponsored troll accounts, which consists of two steps. The first step classifies trajectories of accounts' online activities as belonging either to a state-sponsored troll or to an organic user account. In the second step, we exploit the classified trajectories to compute a metric, namely the "troll score", which quantifies the extent to which an account behaves like a state-sponsored troll. As a case study, we consider the troll accounts involved in the Russian interference campaign during the 2016 US Presidential election, identified as Russian trolls by the US Congress. Experimental results show that our approach identifies accounts' trajectories with an AUC close to 99% and, accordingly, classifies Russian trolls and organic users with an AUC of 97%. Finally, we evaluate whether the proposed solution can be generalized to different contexts (e.g., discussions about Covid-19) and to generic misbehaving users, showing promising results that we will expand upon in future work.
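    The two-step pipeline treats the trajectory classifier as a building block and derives an account-level "troll score" from its outputs. The sketch below assumes, purely for illustration, that the score is the fraction of an account's trajectories labelled troll-like; the paper's exact definition and classifier may differ.

```python
from typing import Callable, Sequence

Trajectory = Sequence[str]  # e.g. an ordered sequence of activity types

def troll_score(
    trajectories: list[Trajectory],
    classify: Callable[[Trajectory], float],
    decision_threshold: float = 0.5,
) -> float:
    """Return the share of an account's trajectories the classifier deems troll-like."""
    if not trajectories:
        return 0.0
    troll_like = sum(1 for t in trajectories if classify(t) >= decision_threshold)
    return troll_like / len(trajectories)

if __name__ == "__main__":
    # Stand-in classifier; a real one would be a trained sequence model.
    dummy_classifier = lambda traj: 0.9 if "retweet_political" in traj else 0.1
    account_trajectories = [
        ["login", "retweet_political", "reply"],
        ["login", "post_photo"],
    ]
    print(troll_score(account_trajectories, dummy_classifier))  # 0.5
```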

    Designing Light Filters to Detect Skin Using a Low-powered Sensor

    Detection of nudity in photos and videos, especially prior to uploading to the internet, is vital to solving many problems related to adolescent sexting, the distribution of child pornography, and cyberbullying. The problem with using nudity detection algorithms as a means to combat these problems is that: 1) it implies that a digitized nude photo of a minor already exists (i.e., child pornography), and 2) there are real ethical and legal concerns around the distribution and processing of child pornography. Once a camera captures an image, that image is no longer secure. Therefore, we need to develop new privacy-preserving solutions that prevent the digital capture of nude imagery of minors. My research takes a first step toward this long-term goal: in this thesis, I examine the feasibility of using a low-powered sensor to detect skin dominance (defined as an image comprising 50% or more human skin tone) in a visual scene. By designing four custom light filters to enhance the digital information extracted from 300 scenes captured with the sensor (without digitizing high-fidelity visual features), I was able to detect a skin-dominant scene with 83.7% accuracy, 83% precision, and 85% recall. The long-term goal is to design a low-powered vision sensor that can be mounted on a digital camera lens on a teen's mobile device to detect and/or prevent the capture of nude imagery. Thus, I discuss the limitations of this work toward this larger goal, as well as future research directions.
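    The skin-dominance decision described here reduces to checking whether at least 50% of the sensed scene looks like skin, given low-fidelity readings from the custom light filters. The sketch below only illustrates that thresholding step; the per-cell skin rule, the response threshold, and the sensor grid size are assumptions, not the thesis's actual filter design.

```python
import numpy as np

def is_skin_dominant(filter_responses: np.ndarray, dominance: float = 0.5) -> bool:
    """
    filter_responses: array of shape (n_cells, n_filters) with the low-fidelity
    readings from the light filters for each sensed cell of the scene.
    """
    # Hypothetical per-cell rule: call a cell "skin" when its mean filter
    # response exceeds a fixed threshold calibrated offline.
    skin_mask = filter_responses.mean(axis=1) > 0.6
    # Skin dominant when the skin fraction reaches the dominance threshold (50%).
    return skin_mask.mean() >= dominance

# Example with random readings for a 16x16 sensor grid and 4 filters:
rng = np.random.default_rng(0)
readings = rng.random((16 * 16, 4))
print(is_skin_dominant(readings))
```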