
    Empirical Study of Privacy Issues Among Social Networking Sites.

    Social networking sites are expanding the types of services they offer, and their numbers of users are growing rapidly. However, online consumers have expressed concerns about the protection of their personal privacy, and recent news articles have reported many privacy breaches and unannounced changes to privacy policies. These events could adversely affect data protection and compromise user trust, so it is vital that social sites contain explicit privacy policies stating a comprehensive list of protection methods. This study analyzes 60 worldwide social sites and finds that even when a site contains a privacy policy, its pages may still possess technical elements that could be used to serendipitously collect personal information. The results show the specific technical collection methods most common within several social network categories. Methods for improving online privacy practices are suggested.
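
    The abstract does not list the technical elements it counts, but a minimal illustration of the kind of page scan it implies, assuming the signals of interest are third-party scripts, embedded iframes, and script-readable cookies, could look like the browser-side TypeScript sketch below; the names and selection of signals are illustrative and are not the study's actual instrument.

```typescript
// Hypothetical sketch (not the study's instrument): count technical elements on a
// page that could collect personal data, using standard browser DOM APIs.
interface CollectionSignals {
  thirdPartyScripts: string[];  // external script hosts differing from the page host
  iframes: string[];            // embedded third-party frames
  cookieCount: number;          // cookies readable from the page context
}

function scanPageForCollectionSignals(): CollectionSignals {
  const pageHost = window.location.hostname;

  const thirdPartyScripts = Array.from(
    document.querySelectorAll<HTMLScriptElement>("script[src]")
  )
    .map(s => new URL(s.src, window.location.href).hostname)
    .filter(host => host !== pageHost);

  const iframes = Array.from(
    document.querySelectorAll<HTMLIFrameElement>("iframe[src]")
  )
    .map(f => new URL(f.src, window.location.href).hostname)
    .filter(host => host !== pageHost);

  // document.cookie only exposes cookies visible to script; HttpOnly cookies are not counted.
  const cookieCount = document.cookie ? document.cookie.split(";").length : 0;

  return { thirdPartyScripts, iframes, cookieCount };
}
```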

    Exploring personalized life cycle policies

    Ambient Intelligence imposes many challenges for protecting people's privacy. Storing privacy-sensitive data permanently will inevitably result in privacy violations. Limited retention techniques might prove useful for limiting the risks of unwanted and irreversible disclosure of privacy-sensitive data. To overcome the rigidness of simple limited retention policies, life-cycle policies describe more precisely when and how data should first be degraded and finally destroyed. This allows users themselves to determine an adequate compromise between privacy and data retention. However, implementing and enforcing these policies is a difficult problem, as traditional databases are not designed or optimized for deleting data. In this report, we recall the previously introduced life-cycle policy model and the techniques already developed for handling a single collective policy for all data in a relational database management system. We identify the problems raised by loosening this single-policy constraint and propose preliminary techniques for concurrently handling multiple policies in one data store. The main technical consequence for the storage structure is that, when multiple policies are allowed, the degradation order of tuples no longer always equals the insertion order. Apart from the technical aspects, we show that personalizing the policies introduces some inference breaches that have to be investigated further. To make such an investigation possible, we introduce a privacy metric, which makes it possible to compare the amount of privacy provided with the amount of privacy required by the policy.
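
    As a rough illustration of why personalized policies break the insertion-order assumption, the following TypeScript sketch (the types and field names are assumptions, not the paper's model) attaches per-tuple degradation steps and orders due tuples by their next degradation time rather than by insertion time.

```typescript
// Illustrative sketch only; the paper's actual policy model and storage techniques differ.
type DegradationStep = { afterMs: number; action: "generalize" | "anonymize" | "delete" };
type LifeCyclePolicy = { name: string; steps: DegradationStep[] };

interface StoredTuple {
  id: number;
  insertedAt: number;          // ms since epoch
  policy: LifeCyclePolicy;
  nextStepIndex: number;
}

// Time at which the tuple's next degradation step becomes due.
function nextDegradationAt(t: StoredTuple): number {
  const step = t.policy.steps[t.nextStepIndex];
  return step ? t.insertedAt + step.afterMs : Number.POSITIVE_INFINITY;
}

// With a single collective policy, processing tuples in insertion order suffices.
// With personalized policies, the store must instead order tuples by next degradation time.
function dueTuples(store: StoredTuple[], now: number): StoredTuple[] {
  return store
    .filter(t => nextDegradationAt(t) <= now)
    .sort((a, b) => nextDegradationAt(a) - nextDegradationAt(b));
}
```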

    A privacy awareness system for ubiquitous computing environments

    Protecting personal privacy is going to be a prime concern for the deployment of ubiquitous computing systems in the real world. With daunting Orwellian visions looming, it is easy to conclude that tamper-proof technical protection mechanisms such as strong anonymization and encryption are the only solutions to such privacy threats. However, we argue that such perfect protection for personal information will hardly be achievable, and propose instead to build systems that help others respect our personal privacy, enable us to be aware of our own privacy, and rely on social and legal norms to protect us from the few wrongdoers. We introduce a privacy awareness system targeted at ubiquitous computing environments that allows data collectors to both announce and implement data usage policies, and provides data subjects with technical means to keep track of their personal information as it is stored, used, and possibly removed from the system. Even though such a system cannot guarantee our privacy, we believe that it can create a sense of accountability in a world of invisible services that we will be comfortable living in and interacting with.
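
    A minimal conceptual sketch of the two halves the abstract describes, an announced usage policy on the collector side and a trail of data events on the subject side, might look as follows in TypeScript; the interfaces are assumptions for illustration and are not the authors' implementation.

```typescript
// Conceptual sketch only, not the authors' system: a collector announces a
// machine-readable usage policy, and each data subject keeps a trail of how
// their data was stored, used, and removed.
interface UsagePolicy {
  collector: string;            // identifier of the announcing data collector (illustrative)
  purposes: string[];           // declared purposes of collection
  retentionDays: number;        // announced retention period
  contact: string;              // where subjects can direct inquiries
}

type DataEvent = { timestamp: Date; kind: "stored" | "used" | "removed"; detail: string };

class PrivacyTrail {
  private events: DataEvent[] = [];

  record(kind: DataEvent["kind"], detail: string): void {
    this.events.push({ timestamp: new Date(), kind, detail });
  }

  // Accountability rather than prevention: the subject can later review what happened.
  report(): DataEvent[] {
    return [...this.events];
  }
}
```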

    Privacy Issues of the W3C Geolocation API

    The W3C's Geolocation API may rapidly standardize the transmission of location information on the Web, but, in dealing with such sensitive information, it also raises serious privacy concerns. We analyze the manner and extent to which the current W3C Geolocation API provides mechanisms to support privacy. We propose a privacy framework for the consideration of location information and use it to evaluate the W3C Geolocation API, both the specification and its use in the wild, and recommend some modifications to the API as a result of our analysis.
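
    The API under analysis is the browser's navigator.geolocation interface. The TypeScript sketch below shows a standard getCurrentPosition call together with one illustrative mitigation, rounding coordinates before any further transmission; the mitigation is an assumption for illustration and not necessarily among the modifications the paper recommends.

```typescript
// Requesting a position via the W3C Geolocation API. The rounding step is an
// illustrative privacy mitigation (reduce precision before sending onward),
// not necessarily one of the paper's recommended API changes.
function requestCoarseLocation(decimals: number = 2): void {
  navigator.geolocation.getCurrentPosition(
    (position: GeolocationPosition) => {
      const factor = 10 ** decimals;
      const lat = Math.round(position.coords.latitude * factor) / factor;
      const lon = Math.round(position.coords.longitude * factor) / factor;
      console.log(`Coarsened position: ${lat}, ${lon} (accuracy ${position.coords.accuracy} m reported)`);
    },
    (error: GeolocationPositionError) => {
      // The user may decline the permission prompt, or the user agent may deny the request.
      console.warn(`Geolocation request failed: ${error.message}`);
    },
    { enableHighAccuracy: false, timeout: 10_000, maximumAge: 600_000 }
  );
}
```

    Setting maximumAge allows a cached position to be reused, which also limits how often the user is re-prompted and how often a fresh fix is taken.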

    An information privacy taxonomy for collaborative environments

    Purpose: Information privacy is becoming an increasingly important field of research, with many new definitions and terminologies. The use, uptake, and expansion of collaborative environments are increasing at a similar rate, so there is a need for a better understanding and classification of information privacy concepts and terms. The purpose of this paper is to provide a taxonomy of information privacy in collaborative environments. The knowledge provided by an information privacy taxonomy can be used to formulate better information privacy policies, practices, and privacy-enhancing technologies (PETs). Approach: Through the hierarchical classification and categorization of information privacy concepts and principles, an organized representation of these components has been produced. Each area was surveyed and researched and then classified into a number of sub-categories according to its nature and relevance. Findings: A taxonomy was successfully developed, with three high-level dimensions of information privacy identified. Within each dimensional view, a further three sub-classifications were proposed, each with its own unique nature. Originality: This paper provides an information privacy taxonomy for collaborative environments, the first of its kind to be proposed. A number of new information privacy terms are defined that make up the categorization and classification of information privacy concepts and components.
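
    Since the abstract reports only the shape of the taxonomy (three high-level dimensions, each with three sub-classifications) without naming its parts, the TypeScript types below encode just that structure; all identifiers are placeholders of ours, not the paper's terms.

```typescript
// The abstract does not name the taxonomy's dimensions or sub-categories, so no
// concrete names are given here; the types only mirror the reported structure of
// three high-level dimensions, each with three sub-classifications.
interface SubClassification {
  name: string;
  description: string;
}

interface Dimension {
  name: string;
  subClassifications: [SubClassification, SubClassification, SubClassification];
}

// The taxonomy as a whole: exactly three dimensional views.
type InformationPrivacyTaxonomy = [Dimension, Dimension, Dimension];
```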

    Integration of situational and reward elements for fair privacy principles and preferences (F3P)

    It is widely acknowledged that information privacy is subjective in nature and contextually influenced. Individuals value their personal privacy differently, and many are willing to trade off privacy for some form of reward or personal gain. Many of the proposed privacy protection schemes do not give due consideration to the contextual, and more importantly situational, influences on privacy. Rather, privacy preferences for personal data are configurable for only a limited set of notions, including purpose, recipient, category, and condition. Current solutions offer no, or very limited, support for individual situational privacy preferences. This paper proposes a conceptual framework that allows entities to assign privacy preferences incorporating situation and reward elements to their personal data items. The solution allows entities to assign trade-off values to their personal data based on the situation and context of the data request. In this manner, data owners set what they perceive as fair privacy practices and preferences for evaluating the worth of their personal data.
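
    A minimal sketch of the evaluation such a framework implies, assuming a preference simply attaches a minimum acceptable reward to a data item in a given situation, could look like the following; the interfaces and the default-deny rule are our assumptions, not the F3P specification.

```typescript
// Illustrative sketch of the idea (not the paper's framework): a preference
// attaches a minimum acceptable reward to a data item for a given situation,
// and a request is granted only when it meets or exceeds that trade-off value.
interface SituationalPreference {
  dataItem: string;        // e.g. "location" (illustrative)
  situation: string;       // e.g. "emergency", "marketing" (illustrative)
  minimumReward: number;   // owner's trade-off value for this situation
}

interface DataRequest {
  dataItem: string;
  situation: string;
  offeredReward: number;
}

function isFairTradeOff(prefs: SituationalPreference[], req: DataRequest): boolean {
  const pref = prefs.find(p => p.dataItem === req.dataItem && p.situation === req.situation);
  // No matching preference: deny by default rather than assume consent.
  if (!pref) return false;
  return req.offeredReward >= pref.minimumReward;
}
```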

    Privacy self-regulation and the changing role of the state: from public law to social and technical mechanisms of governance

    This paper provides a structured overview of different self-governance mechanisms for privacy and data protection in the corporate world, with a special focus on Internet privacy. It also looks at the role of the state and how it has related to privacy self-governance over time. While early data protection started out as law-based regulation by nation-states, transnational self-governance mechanisms have become more important due to the rise of global telecommunications and the Internet. The reach, scope, precision, and enforcement of these industry codes of conduct vary considerably: the more binding they are, the more limited their reach, though they, like state-based instruments for privacy protection, are becoming more harmonised and global in reach. These social codes of conduct are developed by the private sector with limited participation from official data protection commissioners, public interest groups, or international organisations. Software tools, or technical codes, for online privacy protection can give individual users and customers back some control over their data, but they have only limited reach and applications. The privacy-enhancing design of network infrastructures and database architectures is still developed mainly autonomously by the computer and software industry. Here, a stronger but new role of the state has recently emerged. Instead of regulating data processors directly, governments and oversight agencies now focus more on intermediaries such as standards developers, large software companies, and industry associations. And instead of prescribing and penalising, they now rely more on incentive structures such as certification or public funding for social and technical self-governance instruments of privacy protection. The use of technology as an instrument and object of regulation is thereby becoming more popular, but the success of this approach still depends on the social codes and the underlying norms that technology is supposed to embed.

    A Blockchain-based Approach for Data Accountability and Provenance Tracking

    The recent approval of the General Data Protection Regulation (GDPR) imposes new data protection requirements on data controllers and processors with respect to the processing of European Union (EU) residents' data. These requirements consist of a single set of rules that have binding legal status and should be enforced in all EU member states. In light of these requirements, we propose in this paper a blockchain-based approach to support data accountability and provenance tracking. Our approach relies on publicly auditable contracts deployed on a blockchain, which increase transparency with respect to the access and usage of data. We identify and discuss three models for our approach, with different granularity and scalability requirements, in which contracts can be used to encode data usage policies and provenance tracking information in a privacy-friendly way. Of these three models, we designed, implemented, and evaluated one in which contracts are deployed by data subjects for each data controller, and one in which data subjects join contracts deployed by data controllers once they accept the data handling conditions. Our implementations show in practice the feasibility and limitations of contracts for the purposes identified in this paper.
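
    As a conceptual sketch only, written in plain TypeScript rather than as actual smart-contract code and not the authors' implementation, a per-controller usage-policy contract with a publicly auditable access log might be modelled as follows.

```typescript
// Conceptual model only: a per-controller usage policy with an auditable access log.
// On a real blockchain the log's integrity would come from the ledger itself;
// here it is an in-memory analogue for illustration.
interface UsagePolicyTerms {
  controller: string;          // data controller bound by the contract
  allowedPurposes: string[];   // purposes the data subject consented to
}

interface AccessRecord {
  timestamp: Date;
  purpose: string;
  granted: boolean;
}

class DataUsageContract {
  private readonly log: AccessRecord[] = [];

  constructor(public readonly subject: string, public readonly terms: UsagePolicyTerms) {}

  // Every access attempt is recorded, whether or not it is granted, so the
  // subject can later audit how their data was requested and used.
  requestAccess(purpose: string): boolean {
    const granted = this.terms.allowedPurposes.includes(purpose);
    this.log.push({ timestamp: new Date(), purpose, granted });
    return granted;
  }

  auditLog(): readonly AccessRecord[] {
    return this.log;
  }
}
```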