Automatically Generating a Large, Culture-Specific Blocklist for China
Internet censorship measurements rely on lists of websites to be tested, or
"block lists" that are curated by third parties. Unfortunately, many of these
lists are not public, and those that are tend to focus on a small group of
topics, leaving other types of sites and services untested. To increase and
diversify the set of sites on existing block lists, we use natural language
processing and search engines to automatically discover a much wider range of
websites that are censored in China. Using these techniques, we create a list
of 1,125 websites outside the Alexa Top 1,000 that cover Chinese politics,
minority human rights organizations, oppressed religions, and more.
The list that we develop not only vastly expands the set
of sites that current Internet measurement tools can test, but it also deepens
our understanding of the nature of content that is censored in China. We have
released both this new block list and the code for generating it.
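The core measurement behind such a block list is deciding whether a domain is censored. One common signal is tampered DNS: answers from inside the censored network differ from a control resolver's, or point at a small reused set of injected addresses. The sketch below is an illustrative assumption, not the authors' implementation; the function name and the sentinel-IP heuristic are hypothetical.

```python
# Hypothetical sketch of a DNS-tampering check: compare the answers a
# control (uncensored) resolver returns for a domain with the answers
# observed from a vantage point inside the censored network. Injected
# responses often reuse a small set of bogus "sentinel" IPs.

# Illustrative sentinel addresses; a real list would be measured, not assumed.
KNOWN_INJECTED_IPS = {"203.98.7.65", "8.7.198.45"}

def looks_censored(control_answers, vantage_answers):
    """Return True if the vantage-point DNS answers suggest injection."""
    if not vantage_answers:            # NXDOMAIN/timeout where control resolves
        return bool(control_answers)
    if set(vantage_answers) & KNOWN_INJECTED_IPS:
        return True                    # answer matches a known injected IP
    # Disjoint answer sets are only a weak signal (CDNs rotate IPs), so a
    # real tool would confirm with HTTP/TLS probes before concluding.
    return not (set(control_answers) & set(vantage_answers))
```

A production measurement pipeline would combine this with application-layer checks, since DNS agreement does not rule out IP- or keyword-based blocking.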
A physics-based approach to flow control using system identification
Control of amplifier flows poses a great challenge, since the influence of environmental noise sources and measurement contamination is a crucial component in the design of models and the subsequent performance of the controller. A model-based approach that makes a priori assumptions on the noise characteristics often yields unsatisfactory results when the true noise environment is different from the assumed one. An alternative approach is proposed that consists of a data-based system-identification technique for modelling the flow; it avoids the model-based shortcomings by directly incorporating noise influences into an auto-regressive (ARMAX) design. This technique is applied to flow over a backward-facing step, a typical example of a noise-amplifier flow. Physical insight into the specifics of the flow is used to interpret and tailor the various terms of the auto-regressive model. The designed compensator shows an impressive performance as well as a remarkable robustness to increased noise levels and to off-design operating conditions. Owing to its reliance on only time-sequences of observable data, the proposed technique should be attractive in the design of control strategies directly from experimental data and should result in effective compensators that maintain performance in a realistic disturbance environment.
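To make the data-driven idea concrete, the toy below identifies a discrete-time ARX model y[t] = a·y[t-1] + b·u[t-1] + e[t] from input/output sequences alone, via ordinary least squares on synthetic data. This is a simplification under stated assumptions: the paper uses the richer ARMAX family, which additionally models the noise term with a moving-average part.

```python
import numpy as np

# Simulate a "true" first-order plant driven by a known input u and
# mild measurement noise, then recover its coefficients from data only.
rng = np.random.default_rng(0)
a_true, b_true = 0.8, 0.5
u = rng.standard_normal(500)                 # known actuation sequence
y = np.zeros(500)
for t in range(1, 500):
    y[t] = a_true * y[t - 1] + b_true * u[t - 1] + 0.01 * rng.standard_normal()

# ARX least squares: one regressor row [y[t-1], u[t-1]] per sample y[t].
Phi = np.column_stack([y[:-1], u[:-1]])
a_hat, b_hat = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]
print(a_hat, b_hat)   # estimates close to the true (0.8, 0.5)
```

An ARMAX fit would add lagged noise terms e[t-1], e[t-2], … as regressors, which is exactly what lets the identified model absorb an unknown disturbance environment instead of assuming one a priori.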
Automated Discovery of Internet Censorship by Web Crawling
Censorship of the Internet is widespread around the world. As access to the
web becomes increasingly ubiquitous, filtering of this resource becomes more
pervasive. Transparency about specific content that citizens are denied access
to is atypical. To counter this, numerous techniques for maintaining URL filter
lists have been proposed by various individuals and organisations, with the aim
of providing empirical data on censorship for the benefit of the public and the
wider censorship research community.
We present a new approach for discovering filtered domains in different
countries. This method is fully automated and requires no human interaction.
The system uses web crawling techniques to traverse between filtered sites and
implements a robust method for determining if a domain is filtered. We
demonstrate the effectiveness of the approach by running experiments to search
for filtered content in four different censorship regimes. Our results show
that we perform better than the current state of the art and have built domain
filter lists an order of magnitude larger than the most widely available public
lists as of Jan 2018. Further, we build a dataset mapping the interlinking
nature of blocked content between domains and exhibit the tightly networked
nature of censored web resources.
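The traversal idea, exploiting the interlinked nature of blocked content, can be sketched as a breadth-first crawl that expands only through domains the filter test also flags. Everything below is an illustrative stand-in: the in-memory link graph and blocked set replace a real crawler and a real reachability test.

```python
from collections import deque

LINKS = {  # hypothetical outbound-link graph discovered by crawling
    "seed.example": ["a.example", "b.example"],
    "a.example":    ["c.example"],
    "b.example":    ["news.example"],
    "c.example":    ["d.example"],
    "news.example": [],
    "d.example":    [],
}
BLOCKED = {"seed.example", "a.example", "c.example", "d.example"}

def discover_filtered(seeds):
    """BFS over the link graph, expanding only through filtered domains."""
    found, queue = set(), deque(seeds)
    while queue:
        domain = queue.popleft()
        if domain in found or domain not in BLOCKED:
            continue                 # unfiltered pages are treated as dead ends
        found.add(domain)
        queue.extend(LINKS.get(domain, []))
    return found

print(discover_filtered(["seed.example"]))
```

Here "b.example" is reachable from the seed but unfiltered, so the crawl does not expand through it; "d.example" is found only via the chain of blocked sites, which is the clustering effect the abstract describes.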
Uncovering Vulnerable Industrial Control Systems from the Internet Core
Industrial control systems (ICS) are managed remotely with the help of
dedicated protocols that were originally designed to work in walled gardens.
Many of these protocols have been adapted to Internet transport and support
wide-area communication. ICS now exchange insecure traffic on an inter-domain
level, putting at risk not only common critical infrastructure but also the
Internet ecosystem (e.g., DRDoS attacks).
In this paper, we uncover unprotected inter-domain ICS traffic at two central
Internet vantage points, an IXP and an ISP. This traffic analysis is correlated
with data from honeypots and Internet-wide scans to separate industrial from
non-industrial ICS traffic. We provide an in-depth view on Internet-wide ICS
communication. Our results can be used i) to create precise filters for
potentially harmful non-industrial ICS traffic, and ii) to detect ICS sending
unprotected inter-domain ICS traffic, being vulnerable to eavesdropping and
traffic manipulation attacks.
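A minimal first-pass filter of the kind hinted at in point i) could flag flows on the well-known ports of common ICS protocols. This sketch is an assumption about one plausible building block, not the paper's method: real separation of industrial from non-industrial traffic requires payload validation and correlation with honeypot and scan data, since port numbers alone are noisy.

```python
ICS_PORTS = {           # well-known default ports of common ICS protocols
    502:   "Modbus/TCP",
    20000: "DNP3",
    102:   "Siemens S7 (ISO-TSAP)",
    44818: "EtherNet/IP",
}

def classify_flow(dst_port):
    """Return the ICS protocol usually associated with a port, or None."""
    return ICS_PORTS.get(dst_port)
```

In practice scanners and honeypot clients also speak these protocols, which is precisely why the paper correlates vantage-point traffic with honeypot and Internet-wide scan data before labelling a flow as industrial.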
An Internet Heartbeat
Obtaining sound inferences over remote networks via active or passive
measurements is difficult. Active measurement campaigns face challenges of
load, coverage, and visibility. Passive measurements require a privileged
vantage point. Even networks under our own control too often remain poorly
understood and hard to diagnose. As a step toward the democratization of
Internet measurement, we consider the inferential power possible were the
network to include a constant and predictable stream of dedicated lightweight
measurement traffic. We posit an Internet "heartbeat," which nodes periodically
send to random destinations, and show how aggregating heartbeats facilitates
introspection into parts of the network that are today largely opaque. We
explore the design space of an Internet heartbeat, potential use cases,
incentives, and paths to deployment.
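The inferential power of a constant, predictable probe stream can be illustrated with a toy aggregator: if every node emits one heartbeat per fixed period, a monitor can estimate loss or outage toward a region purely from gaps in the arrival sequence, with no privileged vantage point. The function name and gap heuristic below are hypothetical.

```python
def missing_beats(arrival_times, period, start, end):
    """Estimate how many expected heartbeats never arrived in [start, end).

    Assumes the sender emits exactly one heartbeat per `period` seconds;
    a shortfall between expected and observed counts indicates loss or
    an outage on the path during the window.
    """
    expected = int((end - start) / period)
    observed = sum(start <= t < end for t in arrival_times)
    return max(expected - observed, 0)
```

For example, with a 10-second period, four arrivals in a 50-second window imply one missing beat; aggregating such shortfalls across many senders toward the same destination network is what would localize the problem.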