
    1984 Is Still Fiction: Electronic Monitoring in the Workplace and U.S. Privacy Law

    Electronic monitoring in the workplace has been the subject of relentless public criticism. Privacy advocates argue that technological advancements have given overbearing employers powerful tools to abuse employee dignity in the name of productivity, and that new legislation should bolster workplace privacy rights. This iBrief contends that current U.S. legal doctrine governing electronic monitoring in the workplace is fair given the nature and purpose of the workplace and the potential employer liability for employee misconduct.

    User's Privacy in Recommendation Systems Applying Online Social Network Data, A Survey and Taxonomy

    Recommender systems have become an integral part of many social networks, extracting knowledge from a user's personal and sensitive data both explicitly, with the user's knowledge, and implicitly. This trend has created major privacy concerns, as users are mostly unaware of what data are being used, how much, and how securely. In this context, several works have addressed privacy concerns around online social network data and its use by recommender systems. This paper surveys the main privacy concerns, measurements, and privacy-preserving techniques used in large-scale online social networks and recommender systems. It draws on prior work on security, privacy preservation, statistical modeling, and datasets to provide an overview of the technical difficulties and problems associated with preserving privacy in online social networks. (26 pages; IET book chapter on big data recommender systems.)

    Personal data broker instead of blockchain for students’ data privacy assurance

    Data logs about learning activities are being recorded at a growing pace due to the adoption and evolution of educational technologies (EdTech). Data analytics has entered the field of education under the name of learning analytics. It can provide insights that enhance learning activities for educational stakeholders and help online learning providers improve their services. However, despite the goodwill behind EdTech, some service providers use it as a means to collect private data about students for their own interests and benefit, as shown by recent media reports of misuse of students' personal information. These cases have come to light largely because of the recent tightening of data privacy regulations, especially in the EU. Students, or their parents, should be the owners of the information about them and their online learning activities, and should therefore have the right tools to control how that information is accessed and for what purposes. Currently, there is no technological solution to prevent leaks or the misuse of data about students or their activity, so it seems appropriate to address the problem from a technological perspective. In this paper, we consider Blockchain technologies as a possible basis for a solution to this problem. Our analysis indicates that the Blockchain is not a suitable solution. Instead, we propose a cloud-based solution with a central personal point of management that we call a Personal Data Broker.

    Cyber-Democracy or Cyber-Hegemony? Exploring the Political and Economic Structures of the Internet as an Alternative Source of Information

    Although government regulation of the Internet has been decried as undercutting free speech, the control of Internet content through capitalist gateways (namely, profit-driven software companies) has gone largely uncriticized. The author argues that this discursive trend manufactures consent through a hegemonic force that neglects to confront the invasion of online advertising and marketing strategies directed at children. This study suggests that "inappropriate content" (that is, nudity, pornography, obscenities) constitutes a cultural currency through which concerns about and responses to the Internet have been articulated within the mainstream. By examining the rhetorical and financial investments of the telecommunications business sector, the author contends that the rhetorical elements creating "cyber-safety" concerns within the mainstream attempt to win the consent of parents and educators by asking them to see some Internet content as value laden (sexuality, trigger words, or adult content) while disguising the interests and authority of profitable computer software and hardware industries (advertising and marketing). Although most online "safety measures" neglect to confront the emerging invasion of advertising and marketing directed at children and youth, the author argues that media literacy in cyberspace demands such scrutiny. Unlike measures that block or filter online information, students need an empowerment approach that will enable them to analyze, evaluate, and judge the information they receive.

    Yes, I know this IoT Device Might Invade my Privacy, but I Love it Anyway! A Study of Saudi Arabian Perceptions

    The ability of the Internet of Things (IoT) to monitor our every move raises many privacy concerns. This paper reports on a study assessing current awareness of the privacy implications of IoT devices amongst Saudi Arabians. We found that even when users are aware of the potential for privacy invasion, their need for the convenience these devices afford leads them to discount that potential and to ignore any concerns they might initially have had. We conclude by making some predictions about the direction the IoT field will take in the next 5-7 years in terms of privacy invasion, protection, and awareness.

    Surveillance, big data and democracy: lessons for Australia from the US and UK

    This article argues that current laws are ill-equipped to deal with the multifaceted threats to individual privacy posed by governments, corporations, and our own need to participate in the information society.
In the era of big data, where people find themselves surveilled in ever more finely granulated aspects of their lives, and where the data profiles built from an accumulation of data gathered about themselves and others are used to predict as well as shape their behaviours, the question of privacy protection arises constantly. In this article we interrogate whether the discourse of privacy is sufficient to address this new paradigm of information flow and control. What we confront in this area is a set of practices concerning the collection, aggregation, sharing, interrogation, and use of data on a scale that crosses private and public boundaries, jurisdictional boundaries, and, importantly, the boundaries between reality and simulation. The consequences of these practices are emerging as sometimes useful and sometimes damaging to governments, citizens, and commercial organisations. Understanding how to regulate this sphere of activity so as to address the harms, create an infrastructure of accountability, and bring more transparency to these practices is a challenge of some complexity, and privacy frameworks may not provide the solutions or protections that are ultimately being sought. This article is concerned with data gathering and surveillance practices by business and government, and the implications for individual privacy in the face of widespread collection and use of big data. We first outline the practices around data and the issues that arise from them. We then consider how courts in the United Kingdom ('UK') and the United States ('US') are attempting to frame these issues using current legal frameworks, and finish by considering the Australian context.
Notably, the discourse around privacy protection differs significantly across these jurisdictions, encompassing elements of constitutional rights and freedoms, specific legislative schemes, data protection, anti-terrorist and criminal laws, tort, and equity. This lack of a common understanding of what is, or should be, encompassed within privacy makes it a very fragile creature indeed. On the basis of this exploration, we conclude that current laws are ill-equipped to deal with the multifaceted threats to individual privacy posed by governments, corporations, and our own need to participate in the information society.

    Privacy, Ideology, and Technology: A Response to Jeffrey Rosen

    This essay reviews Jeffrey Rosen’s The Unwanted Gaze: The Destruction of Privacy in America (2000). Rosen offers a compelling (and often hair-raising) account of the pervasive dissolution of the boundary between public and private information. This dissolution is both legal and social; neither the law nor any other social institution seems to recognize many limits on the sorts of information that can be subjected to public scrutiny. The book also provides a rich, evocative characterization of the dignitary harms caused by privacy invasion. Rosen’s description of the sheer unfairness of being “judged out of context” rings instantly true. Privacy, Rosen concludes, is indispensable to human well-being and is at risk of being destroyed unless we act fast. The book is far less convincing, however, when it moves beyond description and attempts to identify the causes of the destruction of privacy and propose solutions. Why is privacy under siege today? The incidents that Rosen chooses as illustrations both reveal and obscure. From Monica Lewinsky’s unsent, deleted e-mails to the private online activities of corporate employees and the Dean of the Harvard Divinity School, the examples offer a rich stew of technology, corporate mind control, public scapegoating, and political intrigue. But for the most part, Rosen seems to think that it is sex that is primarily to blame for these developments—though how, exactly, Rosen cannot seem to decide. He suggests, variously, that we seek private information out of prurient fascination with other people’s intimate behavior, or to enforce upon others authoritarian notions of “correct” interpersonal behavior, or to inform moral judgments about others based on a hasty and ill-conceived equivalence between the personal and the political. Or perhaps Rosen is simply upset about the loss of privacy for a specific sort of (sexual or intimate) behavior, whatever the origin of society’s impulse to pry. 
Yet there are puzzling anomalies in Rosen’s account. Most notably, appended to Rosen’s excavation of recent sex-related privacy invasions is a chapter on privacy in cyberspace. This chapter sits uneasily in relation to the rest of the book. Its focus is not confined to sex-related privacy, and Rosen does not explain how the more varied information-gathering activities chronicled there bear on his earlier analysis. Rosen acknowledges as much and offers, instead, the explanation that intimate privacy and cyberspace privacy are simply two examples of the same problem: the risk of being judged out of context in a world of short attention spans, and the harms to dignity that follow. This explanation seems far too simple, and more than a bit circular. Why this rush to judge others out of context? Necessity is one answer—if attention spans are limited, we cannot avoid making decisions based on incomplete information—but where does the necessity to judge come from? And what do computers and digital networking technologies—factors that recur not only in the chapter on cyberspace privacy, but also in most of Rosen’s other examples—have to do with it? This Review Essay argues, first, that the use of personal information to sort and classify individuals is inextricably bound up with the fabric of our political economy. As Part II explains, the unfettered use of “true” information to predict risk and minimize uncertainty is a hallmark of the liberal state and its constituent economic and political markets. Not sex, but money, and more broadly an ideology about the predictive power of isolated facts, generate the perceived necessity to judge individuals based on incomplete profiles. The harms of this rush to judgment—harms not only to dignity, but also to economic welfare and more fundamentally to individual autonomy—may undermine liberal individualism (as Rosen argues), but they are products of it as well. 
Part III argues, further, that the problem of vanishing informational privacy in digital networked environments is not sui generis, but rather is central to understanding the destruction of privacy more generally. This is not simply because new technologies reduce the costs of collecting, exchanging, and processing the traditional sorts of consumer information. The profit-driven search for personal information via digital networks is also catalyzing an erosion of the privacy that individuals have customarily enjoyed in their homes, their private papers, and even their thoughts. This process is transforming not only the way we experience privacy, but also the way we understand it. Privacy is becoming not only harder to protect, but also harder to justify protecting. Part IV concludes that shifting these mutually reinforcing ideological and technological vectors will require more drastic intervention than Rosen suggests.

    Democracy, Ideology and Process Re-Engineering: Realising the Benefits of e-Government in Singapore

    The re-engineering of governmental processes is a necessary condition for realising the benefits of e-government. Several obstacles to such re-engineering exist, including: (1) information processing thrives on transparency and the amalgamation of data, whilst governments are constrained by principles of privacy and data separation; and (2) top-down re-engineering may be resisted effectively from the bottom up. This paper analyses these obstacles to re-engineering in Singapore, a democratic one-party state where legislative and executive power lies with the People's Action Party, and considers how that hegemony has aided the development of e-government.
