    Privacy in an Ambient World

    Privacy is a prime concern in today's information society. To protect the privacy of individuals, enterprises must follow certain privacy practices while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website, processes it inside the enterprise, and shares it with partner enterprises. In particular, we analyse three different privacy systems that can be used in the different stages of this lifecycle. One of them is the recently introduced Audit Logic, which can be used to keep data private when it travels across enterprise boundaries. We conclude with an analysis of the features and shortcomings of these systems.

    Obligations of trust for privacy and confidentiality in distributed transactions

    Purpose – This paper aims to describe a bilateral, symmetric approach to authorization, privacy protection and obligation enforcement in distributed transactions. The authors introduce the obligation of trust (OoT) protocol as a privacy assurance and authorization mechanism built upon the XACML standard. The OoT allows two communicating parties to dynamically exchange their privacy and authorization requirements and capabilities, which the authors term a notification of obligation (NoB), as well as their commitments to fulfilling each other's requirements, which the authors term a signed acceptance of obligations (SAO). The authors describe the applicability of these concepts and show how they can be integrated into distributed authorization systems for stricter privacy and confidentiality control.
    Design/methodology/approach – Existing access control and privacy protection systems are typically unilateral and provider-centric: the enterprise service provider assigns the access rights, makes the access control decisions, and determines the privacy policy, with no negotiation between the client and the service provider about which access control or privacy policy to use. The authors adopt a symmetric, more user-centric approach to privacy protection and authorization, which treats the client and service provider as peers, so that both can stipulate their requirements and capabilities and hence negotiate terms that are equally acceptable to both parties.
    Findings – The authors demonstrate how the obligation of trust protocol can be used in a number of different scenarios to improve upon currently available mechanisms.
    Practical implications – This approach will serve to increase trust in distributed transactions, since each communicating party receives a difficult-to-repudiate, digitally signed acceptance of obligations in a standard language (XACML), which can be automatically enforced by their respective computing machinery.
    Originality/value – The paper adds to current research in trust negotiation, privacy protection and authorization by combining all three into one set of standardized protocols. Furthermore, by providing hard-to-repudiate signed acceptance of obligations messages, it strengthens the legal case of the injured party should a dispute arise.
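
    As a minimal illustration of the bilateral NoB/SAO exchange described in this abstract, the sketch below models each party stating its obligations and the other returning a signed acceptance. All class and function names are invented for the example, and an HMAC stands in for the XML digital signatures over XACML that a real deployment of the protocol would use.

```python
# Illustrative sketch only: bilateral exchange of a notification of obligation
# (NoB) and a signed acceptance of obligations (SAO). Names are assumptions;
# the HMAC below is a stand-in for real digital signatures over XACML.
import hashlib
import hmac
import json
from dataclasses import dataclass

@dataclass
class NotificationOfObligation:      # NoB: a party's requirements/capabilities
    sender: str
    obligations: list[str]

@dataclass
class SignedAcceptanceOfObligations:  # SAO: a signed commitment to fulfil a NoB
    acceptor: str
    nob_digest: str
    signature: str

def sign(key: bytes, payload: str) -> str:
    return hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()

def accept(nob: NotificationOfObligation, acceptor: str, key: bytes) -> SignedAcceptanceOfObligations:
    payload = json.dumps({"sender": nob.sender, "obligations": nob.obligations}, sort_keys=True)
    digest = hashlib.sha256(payload.encode()).hexdigest()
    return SignedAcceptanceOfObligations(acceptor, digest, sign(key, payload))

# Symmetric exchange: each peer states obligations, the other signs acceptance.
client_nob = NotificationOfObligation("client", ["delete my data after 30 days"])
provider_nob = NotificationOfObligation("provider", ["client must not share credentials"])
sao_from_provider = accept(client_nob, "provider", b"provider-key")
sao_from_client = accept(provider_nob, "client", b"client-key")
```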

    An Audit Logic for Accountability

    We describe and implement a policy language. In our system, agents can distribute data along with usage policies in a decentralized architecture. Our language supports the specification of conditions and obligations, as well as the possibility to refine policies. In our framework, compliance with usage policies is not actively enforced. However, agents are accountable for their actions and may be audited by an authority requiring justifications. Comment: To appear in Proceedings of IEEE Policy 200
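
    As a rough illustration of the accountability model sketched in this abstract, the snippet below attaches a usage policy to a data item and has an agent log its actions so it can justify them in a later audit. The class and field names are assumptions made for the example, not the paper's actual policy language.

```python
# Illustrative sketch: data travels with a usage policy; actions are not
# enforced up front but logged so they can be justified to an auditor.
from dataclasses import dataclass, field

@dataclass
class UsagePolicy:
    owner: str
    permitted_actions: set[str]                   # e.g. {"read", "aggregate"}
    obligations: list[str] = field(default_factory=list)

@dataclass
class ProtectedData:
    payload: str
    policy: UsagePolicy

class Agent:
    def __init__(self, name: str):
        self.name = name
        self.audit_log: list[tuple[str, str]] = []  # (action, data payload)

    def use(self, data: ProtectedData, action: str) -> None:
        # Compliance is not enforced here; the action is merely recorded
        # so the agent can justify it if an authority audits the log.
        self.audit_log.append((action, data.payload))

    def justify(self) -> list[tuple[str, str]]:
        return list(self.audit_log)

alice_data = ProtectedData("alice@example.org", UsagePolicy("alice", {"read"}))
bob = Agent("bob")
bob.use(alice_data, "read")
print(bob.justify())
```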

    In Defense of the Long Privacy Statement


    A privacy awareness system for ubiquitous computing environments

    Protecting personal privacy is going to be a prime concern for the deployment of ubiquitous computing systems in the real world. With daunting Orwellian visions looming, it is easy to conclude that tamper-proof technical protection mechanisms such as strong anonymization and encryption are the only solutions to such privacy threats. However, we argue that such perfect protection for personal information will hardly be achievable, and propose instead to build systems that help others respect our personal privacy, enable us to be aware of our own privacy, and rely on social and legal norms to protect us from the few wrongdoers. We introduce a privacy awareness system targeted at ubiquitous computing environments that allows data collectors to announce and implement data usage policies, and provides data subjects with technical means to keep track of their personal information as it is stored, used, and possibly removed from the system. Even though such a system cannot guarantee our privacy, we believe that it can create a sense of accountability in a world of invisible services that we will be comfortable living in and interacting with.
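
    The sketch below illustrates, under assumed names and interfaces, the two roles this abstract describes: data collectors announcing usage policies, and data subjects tracking what is held about them. It is not the system's actual protocol, only a toy rendering of the idea.

```python
# Illustrative sketch: collectors announce policies, subjects query what
# personal information is currently held about them. Names are invented.
from collections import defaultdict

class PrivacyAwarenessService:
    def __init__(self):
        self.policies: dict[str, str] = {}                      # collector -> announced policy
        self.records: dict[str, list[str]] = defaultdict(list)  # subject -> stored data items

    def announce_policy(self, collector: str, policy: str) -> None:
        self.policies[collector] = policy

    def record_collection(self, collector: str, subject: str, item: str) -> None:
        self.records[subject].append(f"{item} (held by {collector})")

    def track(self, subject: str) -> list[str]:
        # Lets a data subject see what personal information is currently stored.
        return list(self.records[subject])

service = PrivacyAwarenessService()
service.announce_policy("cafe-sensor-net", "location data deleted after 24h")
service.record_collection("cafe-sensor-net", "alice", "table-occupancy reading")
print(service.track("alice"))
```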

    Privacy self-regulation and the changing role of the state: from public law to social and technical mechanisms of governance

    This paper provides a structured overview of different self-governance mechanisms for privacy and data protection in the corporate world, with a special focus on Internet privacy. It also looks at the role of the state and how it has related to privacy self-governance over time. While early data protection started out as law-based regulation by nation-states, transnational self-governance mechanisms have become more important due to the rise of global telecommunications and the Internet. The reach, scope, precision and enforcement of these industry codes of conduct vary considerably. The more binding they are, the more limited their reach, though they, like state-based instruments for privacy protection, are becoming more harmonised and global in reach. These social codes of conduct are developed by the private sector with limited participation of official data protection commissioners, public interest groups, or international organisations. Software tools (technical codes) for online privacy protection can give individual users and customers back some control over their data, but have only limited reach and applications. The privacy-enhancing design of network infrastructures and database architectures is still mainly developed autonomously by the computer and software industry. Here we find a stronger but new role of the state. Instead of regulating data processors directly, governments and oversight agencies now focus more on the intermediaries: standards developers, large software companies, or industry associations. And instead of prescribing and penalising, they now rely more on incentive structures such as certifications or public funding for social and technical self-governance instruments of privacy protection. The use of technology as an instrument and object of regulation is thereby becoming more popular, but the success of this approach still depends on the social codes and the underlying norms which technology is supposed to embed.

    The Audit Logic: Policy Compliance in Distributed Systems

    We present a distributed framework where agents can share data along with usage policies. We use an expressive policy language including conditions, obligations and delegation. Our framework also supports the possibility to refine policies. Policies are not enforced a priori. Instead, policy compliance is checked using an a posteriori auditing approach. Policy compliance is shown by a (logical) proof that the authority can systematically check for validity. Tools for automatically checking and generating proofs are also part of the framework.
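
    To make the a posteriori auditing idea concrete, here is a toy compliance check under assumed data structures: an authority replays a logged justification against the policy's permissions and delegations. The real framework uses a logical proof system and proof-checking tools rather than this simple set membership test.

```python
# Hedged sketch of an a-posteriori compliance check: the authority verifies
# that every logged action was permitted or covered by a delegation.
# Data structures are assumptions made for this example.
from dataclasses import dataclass

@dataclass
class Policy:
    permitted: set[str]
    delegations: dict[str, set[str]]     # delegate agent -> extra permitted actions

@dataclass
class LogEntry:
    agent: str
    action: str

def check_compliance(policy: Policy, log: list[LogEntry]) -> bool:
    for entry in log:
        allowed = policy.permitted | policy.delegations.get(entry.agent, set())
        if entry.action not in allowed:
            return False                  # the justification fails the audit
    return True

policy = Policy(permitted={"read"}, delegations={"carol": {"forward"}})
log = [LogEntry("bob", "read"), LogEntry("carol", "forward")]
print(check_compliance(policy, log))      # True: every logged action is justified
```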