
    Internet Filters: A Public Policy Report (Second edition; fully revised and updated)

    No sooner was the Internet upon us than anxiety arose over the ease of accessing pornography and other controversial content. In response, entrepreneurs soon developed filtering products. By the end of the decade, a new industry had emerged to create and market Internet filters.... Yet filters were highly imprecise from the beginning. The sheer size of the Internet meant that identifying potentially offensive content had to be done mechanically, by matching "key" words and phrases; hence the blocking of Web sites for "Middlesex County" or for phrases such as "magna cum laude". Internet filters are crude and error-prone because they categorize expression without regard to its context, meaning, and value. Yet these sweeping censorship tools are now widely used in companies, homes, schools, and libraries. Internet filters remain a pressing public policy issue for all those concerned about free expression, education, culture, and democracy. This fully revised and updated report surveys tests and studies of Internet filtering products from the mid-1990s through 2006. It provides an essential resource for the ongoing debate.
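
    The mechanism the report criticizes is easy to illustrate. The following minimal sketch (the keyword list, function name, and example inputs are invented for illustration, not drawn from any actual filtering product) shows how context-free substring matching produces exactly the kind of false positives mentioned above:

    ```python
    # A naive, context-free keyword filter of the kind the report describes.
    # The keyword list and examples are illustrative assumptions only.
    BLOCKED_KEYWORDS = ["sex", "cum", "porn"]

    def is_blocked(text: str) -> bool:
        """Block any text that contains a blocked keyword as a substring."""
        lowered = text.lower()
        return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

    # Substring matching ignores context, meaning, and value:
    print(is_blocked("Middlesex County public library"))  # True -- "sex" inside "Middlesex"
    print(is_blocked("She graduated magna cum laude"))    # True -- "cum" inside "cum laude"
    print(is_blocked("Weather forecast for tomorrow"))    # False
    ```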

    Lex Informatica: The Formulation of Information Policy Rules through Technology

    Historically, law and government regulation have established default rules for information policy, including constitutional rules on freedom of expression and statutory rights of ownership of information. This Article will show that for network environments and the Information Society, however, law and government regulation are not the only sources of rule-making. Technological capabilities and system design choices impose rules on participants. The creation and implementation of information policy are embedded in network designs and standards as well as in system configurations. Even user preferences and technical choices create overarching, local default rules. This Article argues, in essence, that the set of rules for information flows imposed by technology and communication networks forms a “Lex Informatica” that policymakers must understand, consciously recognize, and encourage.
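
    As one way to picture the Article's claim that system configurations operate as de facto policy rules, consider the hypothetical configuration sketch below; the platform, field names, and default values are all invented for illustration:

    ```python
    from dataclasses import dataclass

    # Hypothetical illustration of configuration as rule-making: each field is a
    # technical default chosen by a system designer, yet for participants who
    # never change it, the default operates as the effective policy rule.
    @dataclass
    class PlatformDefaults:
        retain_access_logs_days: int = 365    # a data-retention "rule" set by design
        filter_unrated_content: bool = True   # a speech-reachability "rule"
        require_real_name: bool = False       # an anonymity "rule"

    shipped_policy = PlatformDefaults()
    print(shipped_policy)  # the rules most participants actually live under
    ```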

    Social Information Processing in Social News Aggregation

    The rise of social media sites such as blogs, wikis, Digg, and Flickr underscores the transformation of the Web into a participatory medium in which users are collaboratively creating, evaluating, and distributing information. The innovations introduced by social media have led to a new paradigm for interacting with information, which we call 'social information processing'. In this paper, we study how the social news aggregator Digg exploits social information processing to solve the problems of document recommendation and rating. First, we show, by tracking stories over time, that social networks play an important role in document recommendation. The second contribution of this paper consists of two mathematical models. The first model describes how collaborative rating and promotion of stories emerges from the independent decisions made by many users. The second model describes how a user's influence, the number of promoted stories, and the user's social network change over time. We find qualitative agreement between predictions of the model and user data gathered from Digg.
    Comment: Extended version of the paper submitted to IEEE Internet Computing's special issue on Social Search.
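
    A toy simulation, written in the spirit of the first model described above, can illustrate why the submitter's social network matters for promotion; every parameter below (population size, visibility probabilities, promotion threshold) is an assumption for illustration and is not taken from the paper:

    ```python
    import random

    # Toy simulation: votes accumulate from independent user decisions, but the
    # submitter's friends are far more likely to see a new story than strangers
    # browsing the upcoming-stories queue. All parameters are illustrative.
    random.seed(42)

    N_USERS = 10_000
    P_SEE_IF_FRIEND = 0.30     # friends see the story via their friends' activity
    P_SEE_IF_STRANGER = 0.01   # strangers must stumble on it in the upcoming queue
    P_VOTE_IF_SEEN = 0.20
    PROMOTION_THRESHOLD = 40   # votes needed to reach the front page

    def simulate_story(n_friends: int) -> int:
        """Give every user one independent chance to see and vote on a new story."""
        votes = 0
        for user in range(N_USERS):
            p_see = P_SEE_IF_FRIEND if user < n_friends else P_SEE_IF_STRANGER
            if random.random() < p_see and random.random() < P_VOTE_IF_SEEN:
                votes += 1
        return votes

    # Larger social networks make promotion more likely, all else being equal.
    for n_friends in (0, 100, 500):
        votes = simulate_story(n_friends)
        print(f"friends={n_friends:4d}  votes={votes:3d}  promoted={votes >= PROMOTION_THRESHOLD}")
    ```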

    Rating the Net

    Rating systems provide an impressive solution to the problem of sexually explicit speech on the Internet. Members of the Internet community are rightly enthusiastic about the benefits filtering software promises. Those benefits, though, come at a cost. Sites may be stripped out of the filtered universe because of deliberate political choices on the part of ratings service administrators, and because of inaccuracies inherent in the ratings process. If a ratings service is to categorize a large number of sites, it cannot simultaneously achieve consistency and nuance; the techniques it must rely on to achieve consistency make it more difficult to capture nuance, and make it less likely that users will find the ratings useful. The necessity of excluding unrated sites may disproportionately bar speech that was not created by commercial providers for a mass audience. These concerns are especially troubling because it seems likely that many adults will reach the Net through approaches monitored by filtering software.
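
    The exclusion mechanism described above can be sketched in a few lines; the site names, rating labels, and block-unrated default below are hypothetical, chosen only to show how unrated, non-commercial sites fall out of the filtered universe:

    ```python
    # Hypothetical ratings database and filter policy: large commercial sites
    # have been rated, a small personal site has not. The block-unrated default
    # does the excluding.
    RATINGS = {
        "bignews.example.com": "general",
        "megastore.example.com": "general",
        # "small-zine.example.org" never sought a rating, so it has no entry
    }

    BLOCK_UNRATED = True  # the default that strips unrated sites from view

    def allowed(site: str) -> bool:
        rating = RATINGS.get(site)
        if rating is None:
            return not BLOCK_UNRATED
        return rating == "general"

    for site in ("bignews.example.com", "small-zine.example.org"):
        print(f"{site}: {'allowed' if allowed(site) else 'blocked'}")
    ```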

    Australian Governments and dilemmas in filtering the Internet: juggling freedoms against potential for harm

    This paper examines proposed internet filtering policies in Australia from the 1990s to 2014 and discusses some of their ideological underpinnings. Executive summary: The Internet is a revolutionary source of information and a means of disseminating it, and a medium for collaboration and interaction between individuals without regard for geographic location. Since its inception, however, concerns have been raised about the potential for unsavoury characters to use the Internet as a vehicle for distributing pornography and material of a violent nature to young or otherwise vulnerable individuals. Governments across the world have attempted to deal with such activities by various means and to varying degrees. These have included imposing mandatory filtering at an Internet Service Provider (ISP) level and optional filtering at the computer level. In Australia there has been considerable debate about what degree of filtering (if any) should be mandated. The Howard Government favoured an approach that emphasised self-regulation by ISPs, combined with a legislative component, education, and the freedom for families to choose either computer-level or ISP-level filtering based on a list of unacceptable content. The Rudd and Gillard Governments preferred the option of a mandatory ISP-level filter, although this too was to be based on a ‘blacklist’ of prohibited content. Both options have been criticised as expensive and inefficient. In addition, it has been argued that the Rudd/Gillard option would have had a detrimental impact on Internet speeds and that it would have set a precedent for future governments to widen filtering to other forms of expression. The Howard Government’s programs were largely discarded by Labor after it was elected in 2007. However, Labor’s own filtering option was abandoned prior to its defeat in the 2013 election. In conjunction with their filtering options, both Coalition and Labor Governments have supported education and information campaigns to assist people, particularly children, to deal with online predators, and both have introduced successful programs. The current Coalition Government’s policy on Internet filtering appears to favour light-handed legislation combined with education and information programs. This paper examines the iterations of internet filtering policies from the 1990s to 2014 and discusses some of their ideological underpinnings.

    From Social Data Mining to Forecasting Socio-Economic Crisis

    Socio-economic data mining has a great potential in terms of gaining a better understanding of problems that our economy and society are facing, such as financial instability, shortages of resources, or conflicts. Without large-scale data mining, progress in these areas seems hard or impossible. Therefore, a suitable, distributed data-mining infrastructure and research centers should be built in Europe. It also appears appropriate to build a network of Crisis Observatories. These can be imagined as laboratories devoted to gathering and processing enormous volumes of data on natural systems such as the Earth and its ecosystem, as well as on human techno-socio-economic systems, so as to gain early warnings of impending events. Reality mining provides the chance to adapt more quickly and more accurately to changing situations. Further opportunities arise from individually customized services, which, however, should be provided in a privacy-respecting way. This requires the development of novel ICT (such as a self-organizing Web), but most likely new legal regulations and suitable institutions as well. As long as such regulations are lacking on a world-wide scale, it is in the public interest that scientists explore what can be done with the huge data available. Big data do have the potential to change or even threaten democratic societies. The same applies to sudden and large-scale failures of ICT systems. Therefore, dealing with data must be done with a large degree of responsibility and care. The self-interests of individuals, companies, or institutions have limits where the public interest is affected, and public interest is not a sufficient justification to violate the human rights of individuals. Privacy is a high good, as confidentiality is, and damaging it would have serious side effects for society.
    Comment: 65 pages, 1 figure, Visioneer White Paper, see http://www.visioneer.ethz.c
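
    To make the notion of 'early warnings' concrete, one conventional approach an observatory might use is a rolling z-score alarm over a monitored time series; the window size, threshold, and synthetic data below are illustrative assumptions, not taken from the white paper:

    ```python
    import random
    from collections import deque
    from statistics import mean, stdev

    # Generic early-warning sketch: flag points whose rolling z-score exceeds a
    # threshold relative to their recent history. Parameters are illustrative.
    WINDOW = 30
    THRESHOLD = 3.0

    def warning_indices(series):
        """Return indices where a value deviates sharply from its recent history."""
        history = deque(maxlen=WINDOW)
        alerts = []
        for i, value in enumerate(series):
            if len(history) == WINDOW:
                mu, sigma = mean(history), stdev(history)
                if sigma > 0 and abs(value - mu) / sigma > THRESHOLD:
                    alerts.append(i)
            history.append(value)
        return alerts

    random.seed(1)
    data = [random.gauss(0.0, 1.0) for _ in range(60)]
    data[45] += 12.0  # inject an abrupt shock
    print(warning_indices(data))  # expected to flag the injected shock near index 45
    ```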