
    On the Privacy Practices of Just Plain Sites

    In addition to visiting high-profile sites such as Facebook and Google, web users often visit more modest sites, such as those operated by bloggers or by local organizations such as schools. Such sites, which we call "Just Plain Sites" (JPSs), are likely to inadvertently represent greater privacy risks than high-profile sites by virtue of being unable to afford privacy expertise. To assess the prevalence of the privacy risks to which JPSs may inadvertently be exposing their visitors, we analyzed a number of easily observed privacy practices of such sites. We found that many JPSs collect a great deal of information from their visitors, share a great deal of information about their visitors with third parties, permit a great deal of tracking of their visitors, and use deprecated or unsafe security practices. Our goal in this work is not to scold JPS operators, but to raise awareness of these facts among both JPS operators and visitors, possibly encouraging the operators of such sites to take greater care in their implementations, and visitors to take greater care in how, when, and what they share.
    Comment: 10 pages, 7 figures, 6 tables, 5 authors, and a partridge in a pear tree

    Distilling Privacy Requirements for Mobile Applications

    As mobile computing applications have become commonplace, it is increasingly important for them to address end-users’ privacy requirements. Privacy requirements depend on a number of contextual socio-cultural factors, to which mobility adds another level of contextual variation. However, traditional requirements elicitation methods do not sufficiently account for contextual factors and therefore cannot be used effectively to represent and analyse the privacy requirements of mobile end users. On the other hand, methods that do investigate contextual factors tend to produce data that does not lend itself to the process of requirements extraction. To address this problem we have developed a Privacy Requirements Distillation approach that employs a problem analysis framework to extract and refine privacy requirements for mobile applications from raw data gathered through empirical studies involving end users. Our approach introduces privacy facets that capture patterns of privacy concerns which are matched against the raw data. We demonstrate and evaluate our approach using qualitative data from an empirical study of a mobile social networking application.

    Privacy in crowdsourcing: a systematic review

    The advent of crowdsourcing has brought with it multiple privacy challenges. For example, monitoring activities, while necessary and unavoidable, also potentially compromise contributor privacy. We conducted an extensive literature review of the research related to the privacy aspects of crowdsourcing. Our investigation revealed interesting gender differences and also differences in terms of individual perceptions. We conclude by suggesting a number of future research directions.

    “We’re being tracked at all times”: Student perspectives of their privacy in relation to learning analytics in higher education

    Higher education institutions are continuing to develop their capacity for learning analytics (LA), which is a sociotechnical data mining and analytic practice. Institutions rarely inform their students about LA practices, and significant privacy concerns exist. Without a clear student voice in the design of LA, institutions put themselves in an ethical grey area. To help fill this gap in practice and add to the growing literature on students’ privacy perspectives, this study reports findings from over 100 interviews with undergraduate students at eight United States higher education institutions. Findings demonstrate that students lacked awareness of educational data mining and analytic practices, as well as the data on which they rely. Students saw potential in LA, but they presented nuanced arguments about when and with whom data should be shared; they also expressed why informed consent was valuable and necessary. The study uncovered perspectives on institutional trust that were heretofore unknown, as well as what actions might violate that trust. Institutions must balance their desire to implement LA with their obligation to educate students about their analytic practices and to treat them as partners in the design of analytic strategies reliant on student data, in order to protect their intellectual privacy.

    Media, Capabilities, and Justification

    In this paper, I evaluate the ‘capability approach’ developed by Amartya Sen and Martha Nussbaum as a normative perspective for critical media research. The concept of capabilities provides a valuable way of assessing media and captures important aspects of the relationship between media and equality. However, following Rainer Forst’s critique of outcome-oriented approaches to justice, I argue the capability approach needs to pay more attention to questions of power and process. In particular, when it comes to deciding which capabilities media should promote and what media structures and practices should promote them, the capability approach must accept the priority of deliberative and democratic processes of justification. Once we do this, we are led to situate the concept of capabilities within a more process-oriented view of justice, focused not on capabilities as such, but on outlining the conditions required for justificatory equality. After discussing the capability approach, I will outline the process-oriented theory of justice Forst has developed around the idea of the ‘right to justification’. While Forst does not discuss media in depth, I argue his theory of justice can provide a valuable alternative normative standpoint for critical media research.

    The Psychology of Privacy in the Digital Age

    Privacy is a psychological topic suffering from historical neglect – a neglect that is increasingly consequential in an era of social media connectedness, mass surveillance and the permanence of our electronic footprint. Despite fundamental changes in the privacy landscape, social and personality psychology journals remain largely unrepresented in debates on the future of privacy. By contrast, in disciplines like computer science and media and communication studies, which engage directly with socio-technical developments, interest in privacy has grown considerably. In our review of this interdisciplinary literature we suggest four domains of interest to psychologists. These are: sensitivity to individual differences in privacy disposition; a claim that privacy is fundamentally based in social interactions; a claim that privacy is inherently contextual; and a suggestion that privacy is as much about psychological groups as it is about individuals. Moreover, we propose a framework to enable progression to more integrative models of the psychology of privacy in the digital age, and in particular suggest that a group and social relations based approach to privacy is needed.

    The affective atmospheres of surveillance

    The spaces that surveillance produces can be thought of as ambiguous, entailing elements that are ethereal yet material, geographical yet trans-geographical. Contemporary surveillance systems form numerous connections that involve multiple times, spaces, and bodies. Owing to their ubiquitous, normalized, and yet clandestine character, they seem to produce an almost unnoticed aspect of everyday life. The impacts of contemporary surveillance systems therefore appear to be experienced particularly at the margins of consciousness. Thus we find that an empirical analysis of this realm of experience is possible, but requires one to look for such things as disruption, disfluency, and hesitation in the text of speech acts rather than clear representation. Through empirical analysis of narratives concerning everyday experiences of living with contemporary surveillance systems, this paper focuses on their possible affective impacts. In turn, we find it more fitting to think about the so-called “surveillance society” in terms of producing “atmospheres” rather than “cultures or assemblages,” and “affects” rather than “emotions.” © 2013, SAGE Publications. All rights reserved.

    Competing jurisdictions: data privacy across the borders

    Borderless cloud computing technologies are exacerbating tensions between European and other existing approaches to data privacy. On the one hand, in the European Union (EU), a series of data localisation initiatives are emerging with the objective of preserving Europe’s digital sovereignty, guaranteeing respect for EU fundamental rights, and preventing foreign law enforcement and intelligence agencies from accessing personal data. On the other hand, foreign countries are unilaterally adopting legislation requiring national corporations to disclose data stored in Europe, thereby bypassing jurisdictional boundaries grounded in physical data location. The chapter investigates this twofold dynamic, focusing particularly on the current friction between the EU data protection approach and the data privacy model of the United States (US) in the field of cloud computing.