Master of Puppets: Analyzing And Attacking A Botnet For Fun And Profit
A botnet is a network of compromised machines (bots), under the control of an
attacker. Many of these machines are infected without their owners' knowledge,
and botnets are the driving force behind several misuses and criminal
activities on the Internet (for example, spam emails). Depending on its
topology, a botnet can have zero or more command and control (C&C) servers,
which are centralized machines controlled by the cybercriminal that issue
commands and receive reports back from the co-opted bots.
In this paper, we present a comprehensive analysis of the command and control
infrastructure of one of the world's largest proprietary spamming botnets
between 2007 and 2012: Cutwail/Pushdo. We identify the key functionalities
needed by a spamming botnet to operate effectively. We then develop a number of
attacks against the command and control logic of Cutwail that target those
functionalities, and make the spamming operations of the botnet less effective.
This analysis was made possible by having access to the source code of the C&C
software, as well as setting up our own Cutwail C&C server, and by implementing
a clone of the Cutwail bot. With the help of this tool, we were able to
enumerate the number of bots currently registered with the C&C server,
impersonate an existing bot to report false information to the C&C server, and
manipulate spamming statistics of an arbitrary bot stored in the C&C database.
Furthermore, we were able to make the control server inaccessible by conducting
a distributed denial of service (DDoS) attack. Our results may be used by law
enforcement and practitioners to develop better techniques to mitigate and
cripple other botnets, since many of our findings are generic and stem from the
workflow of C&C communication in general.
Analyzing the Social Structure and Dynamics of E-mail and Spam in Massive Backbone Internet Traffic
E-mail is probably the most popular application on the Internet, with
everyday business and personal communications dependent on it. Spam or
unsolicited e-mail has been estimated to cost businesses significant amounts of
money. However, our understanding of the network-level behavior of legitimate
e-mail traffic and how it differs from spam traffic is limited. In this study,
we have passively captured SMTP packets from a 10 Gbit/s Internet backbone link
to construct a social network of e-mail users based on their exchanged e-mails.
The focus of this paper is on the graph metrics indicating various structural
properties of e-mail networks and how they evolve over time. This study also
looks into the differences in the structural and temporal characteristics of
spam and non-spam networks. Our analysis on the collected data allows us to
show several differences between the behavior of spam and legitimate e-mail
traffic, which can help us to understand the behavior of spammers and give us
the knowledge to statistically model spam traffic on the network-level in order
to complement current spam detection techniques.

Comment: 15 pages, 20 figures, technical report.
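The graph construction described above can be sketched in a few lines. This is a minimal illustration, assuming anonymized (sender, receiver) pairs have already been extracted from the captured SMTP traffic; the function names and the single degree metric are simplified stand-ins for the paper's fuller structural analysis.

```python
from collections import defaultdict

def build_email_graph(exchanges):
    """Build a directed e-mail graph from (sender, receiver) pairs.

    Each distinct sender->receiver pair becomes one directed edge.
    """
    out_edges = defaultdict(set)
    in_edges = defaultdict(set)
    for sender, receiver in exchanges:
        out_edges[sender].add(receiver)
        in_edges[receiver].add(sender)
    return out_edges, in_edges

def degree_stats(out_edges, in_edges):
    """Basic structural metrics: node count, edge count, average out-degree."""
    nodes = set(out_edges) | set(in_edges)
    total_out = sum(len(v) for v in out_edges.values())
    return {
        "nodes": len(nodes),
        "edges": total_out,
        "avg_out_degree": total_out / len(nodes) if nodes else 0.0,
    }

# Hypothetical anonymized exchanges; real input would come from SMTP capture.
exchanges = [("a", "b"), ("a", "c"), ("b", "c"), ("spam1", "c")]
out_e, in_e = build_email_graph(exchanges)
stats = degree_stats(out_e, in_e)
```

Richer metrics of the kind the study compares across spam and non-spam subgraphs (clustering, degree distributions, component sizes) would be computed over the same edge sets.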
Tracking the History and Evolution of Entities: Entity-centric Temporal Analysis of Large Social Media Archives
How did the popularity of the Greek Prime Minister evolve in 2015? How did
the predominant sentiment about him vary during that period? Were there any
controversial sub-periods? What other entities were related to him during these
periods? To answer these questions, one needs to analyze archived documents and
data about the query entities, such as old news articles or social media
archives. In particular, user-generated content posted in social networks, like
Twitter and Facebook, can be seen as a comprehensive documentation of our
society, and thus meaningful analysis methods over such archived data are of
immense value for sociologists, historians and other interested parties who
want to study the history and evolution of entities and events. To this end, in
this paper we propose an entity-centric approach to analyze social media
archives and we define measures that allow studying how entities were reflected
in social media in different time periods and under different aspects, like
popularity, attitude, controversiality, and connectedness with other entities.
A case study using a large Twitter archive of four years illustrates the
insights that can be gained by such an entity-centric and multi-aspect
analysis.

Comment: This is a preprint of an article accepted for publication in the
International Journal on Digital Libraries (2018).
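Two of the measures named above, popularity (mention volume) and attitude (aggregate sentiment) per time period, can be sketched as follows. This is a toy illustration under stated assumptions: the posts, the entity name, and the hand-assigned sentiment scores are hypothetical; a real pipeline would run entity linking and a sentiment classifier over the archived tweets.

```python
from collections import defaultdict

def entity_timeline(posts, entity):
    """Per-period popularity (mention count) and attitude (mean sentiment)
    for one entity; posts are (period, text, sentiment) triples."""
    counts = defaultdict(int)
    sentiments = defaultdict(list)
    for period, text, sentiment in posts:
        if entity.lower() in text.lower():
            counts[period] += 1
            sentiments[period].append(sentiment)
    return {
        period: {"popularity": counts[period],
                 "attitude": sum(scores) / len(scores)}
        for period, scores in sentiments.items()
    }

# Hypothetical, hand-scored posts standing in for an archived collection.
posts = [
    ("2015-01", "Tsipras wins the vote", 0.8),
    ("2015-01", "Protests against Tsipras", -0.5),
    ("2015-02", "Weather report", 0.0),
]
timeline = entity_timeline(posts, "Tsipras")
```

Controversiality, another of the paper's aspects, could be approximated in the same frame as the dispersion of per-period sentiment scores rather than their mean.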
Hiding in Plain Sight: A Longitudinal Study of Combosquatting Abuse
Domain squatting is a common adversarial practice where attackers register
domain names that are purposefully similar to popular domains. In this work, we
study a specific type of domain squatting called "combosquatting," in which
attackers register domains that combine a popular trademark with one or more
phrases (e.g., betterfacebook[.]com, youtube-live[.]com). We perform the first
large-scale, empirical study of combosquatting by analyzing more than 468
billion DNS records---collected from passive and active DNS data sources over
almost six years. We find that almost 60% of abusive combosquatting domains
live for more than 1,000 days, and even worse, we observe increased activity
associated with combosquatting year over year. Moreover, we show that
combosquatting is used to perform a spectrum of different types of abuse
including phishing, social engineering, affiliate abuse, trademark abuse, and
even advanced persistent threats. Our results suggest that combosquatting is a
real problem that requires increased scrutiny by the security community.

Comment: ACM CCS 17.
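The core pattern, a trademark embedded in a longer registered name, can be sketched with a simple predicate. This is a naive illustration, not the study's methodology: it assumes a one-label public suffix and an explicit allowlist of legitimate domains, whereas a real system would use the Public Suffix List for eTLD+1 extraction and far more careful matching.

```python
def is_combosquatting(domain, trademark, legit_domains):
    """Flag domains that embed a trademark alongside extra tokens,
    excluding the trademark's own legitimate domains."""
    base = domain.lower().rstrip(".")
    # Naively strip the suffix; assumes a one-label TLD for this sketch.
    name = base.rsplit(".", 1)[0]
    if base in legit_domains:
        return False
    # Combosquatting: trademark present, but the name is not just the trademark.
    return trademark in name and name != trademark

# Hypothetical allowlist; examples mirror the abstract's betterfacebook[.]com
# and youtube-live[.]com patterns.
legit = {"facebook.com"}
```

In practice, such a predicate would be run over passive DNS feeds and newly observed domains, with the matches then clustered and vetted for abuse.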
Understanding the Detection of View Fraud in Video Content Portals
While substantial effort has been devoted to understand fraudulent activity
in traditional online advertising (search and banner), more recent forms such
as video ads have received little attention. The understanding and
identification of fraudulent activity (i.e., fake views) in video ads for
advertisers, is complicated as they rely exclusively on the detection
mechanisms deployed by video hosting portals. In this context, independent
tools able to monitor and audit the fidelity of these systems are missing
today, yet needed by both industry and regulators.
In this paper we present a first set of tools to serve this purpose. Using
our tools, we evaluate the performance of the audit systems of five major
online video portals. Our results reveal that YouTube's detection system
significantly outperforms all the others. Despite this, a systematic evaluation
indicates that it may still be susceptible to simple attacks. Furthermore, we
find that YouTube penalizes its videos' public and monetized view counters
differently, the former being more aggressive. This means that views identified
as fake and discounted from the public view counter are still monetized. We
speculate that even though YouTube puts substantial effort into compensating
users after an attack is discovered, this practice places the burden
of the risk on the advertisers, who pay to get their ads displayed.

Comment: To appear in WWW 2016, Montréal, Québec, Canada. Please cite the
conference version of this paper.
Philippine Elections 2022: The Dictator's Son and the Discourse around Disinformation
Social media was central to Ferdinand "Bongbong" Marcos Jr.'s electoral success, but not in the sense that his campaign had somehow unlocked their hidden features for technological brainwashing. Unfortunately, some pundits looking for quick rationalizations for his landslide victory in the May 2022 polls recycled many of the same explanatory devices from 2016. Many pundits had then attributed the wave of "surprise" populist victories of Rodrigo Duterte in the Philippines, Brexit in the United Kingdom and Donald Trump in the United States to what were hyped to be election-determining factors: social media-fuelled disinformation, troll and bot armies, and Russian influence operations.
Spartan Daily, March 16, 2016
Volume 146, Issue 20
Volume 112 Issue 2
What IS a Tragedy of the Commons? Overfishing and the Campaign Spending Problem
Over the thirty-seven years since its publication, Garrett Hardin's Tragedy of the Commons has clearly become one of the most influential writings of all time. The tragedy of the commons is one of those rare scholarly ideas that has had an enormous impact in academia and is also commonly used outside of academia. In legal scholarship, the tragedy of the commons has been used to characterize a wide variety of resource problems, including intellectual property rights, overcrowding of telecommunications spectra, air and water pollution, and of course, the classic environmental commons problem, overfishing. But I suggest this embarrassment of citation riches highlights the fact that although we invoke it often, we do not know exactly what constitutes a tragedy of the commons.
In an ideological policy battle between interventionists and libertarians – those who argue for and against governmental intervention – a true tragedy of the commons situation presents a potentially decisive argument in favor of intervention. In a true tragedy of the commons, resource users impose mutual externalities upon each other, creating a paternalistic justification for intervention. Of course, in over-exploiting a resource, resource users may also impose externalities upon a larger group that has some stake in the resource, such as the general public might have in clean air or water. This externality alone may be sufficient justification for intervening. But as I define it in this Article, a tragedy of the commons specifically involves a situation in which the resource users are detracting from their own ability to continue to exploit the resource. The need to save the resource users from themselves provides, independent of the need to internalize other large-group externalities, a particularly compelling case for governmental intervention. I use the definition set forth in this Article to analyze a problem that has not been previously recognized as a tragedy of the commons – the problem of ever-increasing political campaign expenditures.