FilteredWeb: A Framework for the Automated Search-Based Discovery of Blocked URLs
Various methods have been proposed for creating and maintaining lists of
potentially filtered URLs to allow for measurement of ongoing internet
censorship around the world. Whilst testing a known resource for evidence of
filtering can be relatively simple, given appropriate vantage points,
discovering previously unknown filtered web resources remains an open
challenge.
We present a new framework for automating the process of discovering filtered
resources through the use of adaptive queries to well-known search engines. Our
system applies information retrieval algorithms to isolate characteristic
linguistic patterns in known filtered web pages; these are then used as the
basis for web search queries. The results of these queries are then checked for
evidence of filtering, and newly discovered filtered resources are fed back
into the system to detect further filtered content.
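The adaptive feedback loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the use of TF-IDF to isolate characteristic terms, and the `search` and `is_filtered` helpers (a search-engine query and a censorship probe from a vantage point), are assumptions introduced for demonstration.

```python
# Hypothetical sketch of the search-based discovery loop: extract
# characteristic terms from known-filtered pages, query a search
# engine with them, probe the results for filtering, and feed newly
# discovered filtered pages back into the loop.
from collections import Counter
import math

def tfidf_terms(page_tokens, corpus, top_k=5):
    """Rank one page's terms by TF-IDF against a background corpus."""
    df = Counter()
    for doc in corpus:
        df.update(set(doc))
    n = len(corpus)
    tf = Counter(page_tokens)
    scores = {
        t: (tf[t] / len(page_tokens)) * math.log((n + 1) / (df[t] + 1))
        for t in tf
    }
    return [t for t, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]]

def discover(seed_pages, search, is_filtered, rounds=3):
    """Return the set of filtered URLs found by iterating the loop.

    `search(query)` yields (url, tokens) results; `is_filtered(url)`
    probes a URL for evidence of filtering. Both are assumed helpers.
    """
    corpus = [p["tokens"] for p in seed_pages]
    seen = {p["url"] for p in seed_pages}
    filtered = set(seen)
    frontier = list(seed_pages)
    for _ in range(rounds):
        next_frontier = []
        for page in frontier:
            query = " ".join(tfidf_terms(page["tokens"], corpus))
            for url, tokens in search(query):
                if url in seen:
                    continue
                seen.add(url)
                if is_filtered(url):  # probe from a vantage point
                    filtered.add(url)
                    next_frontier.append({"url": url, "tokens": tokens})
                    corpus.append(tokens)  # new hit refines future queries
        frontier = next_frontier
    return filtered
```

The key design point the abstract describes is the feedback edge: each newly confirmed filtered page enlarges the corpus from which the next round's queries are drawn.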
Our implementation of this framework, applied to China as a case study, shows
that this approach is demonstrably effective at detecting significant numbers
of previously unknown filtered web pages, making a significant contribution to
the ongoing detection of internet filtering as it develops.
Our tool is currently deployed and has been used to discover 1,355 domains
that are poisoned within China as of Feb 2017, 30 times more than are
contained in the most widely used public filter list. Of these, 759 are outside
of the Alexa Top 1000 domains list, demonstrating the capability of this
framework to find more obscure filtered content. Further, our initial analysis
of filtered URLs, and the search terms that were used to discover them, gives
further insight into the nature of the content currently being blocked in
China.
Comment: To appear in "Network Traffic Measurement and Analysis Conference 2017" (TMA2017)
How Do Tor Users Interact With Onion Services?
Onion services are anonymous network services that are exposed over the Tor
network. In contrast to conventional Internet services, onion services are
private, generally not indexed by search engines, and use self-certifying
domain names that are long and difficult for humans to read. In this paper, we
study how people perceive, understand, and use onion services based on data
from 17 semi-structured interviews and an online survey of 517 users. We find
that users have an incomplete mental model of onion services, use these
services for anonymity and have varying trust in onion services in general.
Users also have difficulty discovering and tracking onion sites and
authenticating them. Finally, users want technical improvements to onion
services and better information on how to use them. Our findings suggest
various improvements for the security and usability of Tor onion services,
including ways to automatically detect phishing of onion services, clearer
security indicators, and ways to manage onion domain names that are difficult
to remember.
Comment: Appeared in USENIX Security Symposium 2018
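The self-certifying, hard-to-read names the abstract refers to can be illustrated with a short sketch. In the v2 address scheme, an onion name is derived directly from the service's public key (first 80 bits of the SHA-1 of the key's DER encoding, base32-encoded), so the name itself authenticates the key. The mock key bytes below are an assumption for demonstration only.

```python
# Illustrative derivation of a v2-style .onion address from a public
# key. Real services hash the DER encoding of their RSA public key;
# here we use placeholder bytes to show the mechanism.
import base64
import hashlib

def onion_v2_address(public_key_der: bytes) -> str:
    """First 80 bits of SHA-1 of the public key, base32-encoded."""
    digest = hashlib.sha1(public_key_der).digest()[:10]
    return base64.b32encode(digest).decode("ascii").lower() + ".onion"
```

The resulting 16-character label is effectively random from a human's point of view, which is the usability problem the study's participants reported when discovering, tracking, and authenticating onion sites.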