
    Trust-based model for privacy control in context aware systems

    In context-aware systems, there is a high demand for privacy solutions that protect users when they interact and exchange personal information. Privacy in this context encompasses reasoning about the trust and risk involved in interactions between users. Trust therefore controls the amount of information that can be revealed, and risk analysis allows us to evaluate the expected benefit that would motivate users to participate in these interactions. In this paper, we propose a trust-based model for privacy control in context-aware systems that incorporates both trust and risk. This approach makes explicit how to reason about trust and risk when designing and implementing context-aware systems that provide mechanisms to protect users' privacy. Our approach also includes experiential learning from past observations, so that better decisions can be reached in future interactions. The model outlined in this paper is an attempt to address the concerns of privacy control in context-aware systems. To validate the model, we are currently applying it to a context-aware system that tracks users' locations. We hope to report on the performance evaluation and implementation experience in the near future.
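    As a rough illustration of the kind of decision this model describes, the sketch below weighs the expected benefit of a disclosure against a trust-discounted risk and updates trust from past observations. The sensitivity scale, the risk formula and the learning-rate update are assumptions made for demonstration, not the model defined in the paper.

    # Minimal sketch of a trust/risk-weighted disclosure decision with
    # experiential learning. The [0, 1] trust scale, the risk formula and the
    # update rule are illustrative assumptions, not the authors' model.
    class TrustPrivacyController:
        def __init__(self, initial_trust=0.5, learning_rate=0.5):
            self.trust = initial_trust          # trust in the requesting party, in [0, 1]
            self.learning_rate = learning_rate  # weight given to each new observation

        def expected_benefit(self, benefit, sensitivity):
            """Expected benefit of revealing an item, minus the trust-discounted risk."""
            risk = sensitivity * (1.0 - self.trust)
            return benefit - risk

        def decide(self, benefit, sensitivity):
            """Reveal the item only if the expected benefit is positive."""
            return self.expected_benefit(benefit, sensitivity) > 0.0

        def observe(self, outcome_good):
            """Experiential learning: move trust toward 1 after a good interaction,
            toward 0 after a privacy-violating one."""
            target = 1.0 if outcome_good else 0.0
            self.trust += self.learning_rate * (target - self.trust)

    controller = TrustPrivacyController()
    print(controller.decide(benefit=0.4, sensitivity=0.6))  # True: benefit outweighs risk
    controller.observe(outcome_good=False)                  # the counterpart misused the data
    print(controller.decide(benefit=0.4, sensitivity=0.6))  # False: lower trust blocks disclosure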

    Obligations of trust for privacy and confidentiality in distributed transactions

    Purpose – This paper aims to describe a bilateral, symmetric approach to authorization, privacy protection and obligation enforcement in distributed transactions. The authors introduce the concept of the obligation of trust (OoT) protocol as a privacy assurance and authorization mechanism built upon the XACML standard. The OoT allows two communicating parties to dynamically exchange their privacy and authorization requirements and capabilities, which the authors term a notification of obligation (NoB), as well as their commitments to fulfilling each other's requirements, which the authors term a signed acceptance of obligations (SAO). The authors describe some applications of these concepts and show how they can be integrated into distributed authorization systems for stricter privacy and confidentiality control.

    Design/methodology/approach – Existing access control and privacy protection systems are typically unilateral and provider-centric: the enterprise service provider assigns the access rights, makes the access control decisions, and determines the privacy policy. There is no negotiation between the client and the service provider about which access control or privacy policy to use. The authors adopt a symmetric, more user-centric approach to privacy protection and authorization, which treats the client and service provider as peers, in which both can stipulate their requirements and capabilities and hence negotiate terms that are equally acceptable to both parties.

    Findings – The authors demonstrate how the obligation of trust protocol can be used in a number of different scenarios to improve upon the mechanisms currently available.

    Practical implications – This approach will serve to increase trust in distributed transactions, since each communicating party receives a difficult-to-repudiate, digitally signed acceptance of obligations in a standard language (XACML), which can be automatically enforced by their respective computing machinery.

    Originality/value – The paper adds to current research in trust negotiation, privacy protection and authorization by combining all three into one set of standardized protocols. Furthermore, by providing hard-to-repudiate signed acceptance of obligations messages, it strengthens the legal case of the injured party should a dispute arise.
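    The NoB/SAO round trip described above can be pictured as two symmetric steps per party. The sketch below models that exchange with plain data classes and an HMAC stand-in for the digital signature; the field names and the signing scheme are illustrative assumptions, not the XACML encoding used by the OoT protocol.

    # Illustrative sketch of the bilateral NoB/SAO exchange. The message fields
    # and the HMAC-based "signature" stand in for the XACML obligation syntax
    # and real digital signatures; they are assumptions for demonstration only.
    import hashlib
    import hmac
    import json
    from dataclasses import dataclass

    @dataclass
    class NotificationOfObligation:       # NoB: what a party requires of its peer
        sender: str
        obligations: list

    @dataclass
    class SignedAcceptanceOfObligations:  # SAO: a signed commitment to the peer's NoB
        acceptor: str
        accepted: list
        signature: str

    class Party:
        def __init__(self, name, secret):
            self.name = name
            self.secret = secret          # signing key (an HMAC key here, for brevity)

        def notify(self, obligations):
            return NotificationOfObligation(self.name, obligations)

        def accept(self, nob):
            payload = json.dumps({"acceptor": self.name, "accepted": nob.obligations}).encode()
            sig = hmac.new(self.secret, payload, hashlib.sha256).hexdigest()
            return SignedAcceptanceOfObligations(self.name, nob.obligations, sig)

    # Symmetric exchange: each peer states its requirements and signs the other's.
    client = Party("client", b"client-key")
    provider = Party("provider", b"provider-key")
    provider_sao = provider.accept(client.notify(["delete-after-30-days"]))
    client_sao = client.accept(provider.notify(["audit-log-access"]))
    print(provider_sao, client_sao, sep="\n")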

    A flexible architecture for privacy-aware trust management

    In service-oriented systems, a constellation of services cooperates, sharing potentially sensitive information and responsibilities. Cooperation is only possible if the different participants trust each other. As trust may depend on many different factors, a flexible framework for Trust Management (TM) must compute trust by combining different types of information. In this paper we describe the TAS3 TM framework, which integrates independent TM systems into a single trust decision point. The TM framework supports intricate combinations whilst still remaining easily extensible. It also provides a unified trust evaluation interface to the (authorization framework of the) services. We demonstrate the flexibility of the approach by integrating three distinct TM paradigms: reputation-based TM, credential-based TM, and Key Performance Indicator TM. Finally, we discuss privacy concerns in TM systems and the directions to be taken for the definition of a privacy-friendly TM architecture.
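    As a rough sketch of the "single trust decision point" idea, the code below hides several independent trust engines behind one evaluation interface. The three engines and the conservative combination rule (the weakest score decides) are assumptions for illustration, not the TAS3 combination mechanism.

    # Sketch of a unified trust decision point that combines independent
    # trust-management engines. Engine internals and the min-based combination
    # rule are illustrative assumptions, not the TAS3 framework itself.
    from abc import ABC, abstractmethod

    class TrustEngine(ABC):
        @abstractmethod
        def score(self, subject, context):
            """Return a trust score in [0, 1] for the subject in this context."""

    class ReputationEngine(TrustEngine):
        def __init__(self, ratings):             # ratings: subject -> list of feedback in [0, 1]
            self.ratings = ratings
        def score(self, subject, context):
            r = self.ratings.get(subject, [])
            return sum(r) / len(r) if r else 0.5

    class CredentialEngine(TrustEngine):
        def __init__(self, trusted_issuers):
            self.trusted_issuers = trusted_issuers
        def score(self, subject, context):
            return 1.0 if context.get("credential_issuer") in self.trusted_issuers else 0.0

    class KpiEngine(TrustEngine):
        def __init__(self, kpis):                # kpis: subject -> e.g. on-time delivery rate
            self.kpis = kpis
        def score(self, subject, context):
            return self.kpis.get(subject, 0.0)

    class TrustDecisionPoint:
        """Unified trust evaluation interface exposed to the authorization framework."""
        def __init__(self, engines, threshold=0.6):
            self.engines = engines
            self.threshold = threshold
        def permit(self, subject, context):
            scores = [engine.score(subject, context) for engine in self.engines]
            return min(scores) >= self.threshold  # conservative: the weakest engine decides

    tdp = TrustDecisionPoint([
        ReputationEngine({"svc-A": [0.9, 0.8]}),
        CredentialEngine({"eID-issuer"}),
        KpiEngine({"svc-A": 0.7}),
    ])
    print(tdp.permit("svc-A", {"credential_issuer": "eID-issuer"}))  # True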

    An Examination of Privacy Policies of US Government Senate Websites.

    US Government websites are rapidly increasing the services they offer, but users express concerns about the protection of their personal privacy. To earn users' trust, these sites must show that personal data are protected and must contain explicit privacy policies. This research studied the privacy policy protection of 50 US Senate websites and found that few had comprehensive privacy policies, and that personal data that could be obtained through the websites were generally not well protected. The study identifies which specific privacy policy elements are most often mishandled and offers suggestions for improving overall online privacy practice.

    Preserving Link Privacy in Social Network Based Systems

    A growing body of research leverages social-network-based trust relationships to improve system functionality. However, these systems expose users' trust relationships, which are considered sensitive information in today's society, to an adversary. In this work, we make the following contributions. First, we propose an algorithm that perturbs the structure of a social graph in order to provide link privacy, at the cost of a slight reduction in the utility of the social graph. Second, we define general metrics for characterizing the utility and privacy of perturbed graphs. Third, we evaluate the utility and privacy of our proposed algorithm using real-world social graphs. Finally, we demonstrate the applicability of our perturbation algorithm to a broad range of secure systems, including Sybil defenses and secure routing.
    Comment: 16 pages, 15 figures
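    To make the perturbation idea concrete, the toy sketch below keeps each link with a fixed probability, rewires the rest to random endpoints, and measures a crude utility proxy. It assumes a simple edge-list representation and is illustrative only; it is not the perturbation algorithm or the utility/privacy metrics defined in the paper.

    # Toy link-perturbation sketch: keep each edge with probability keep_prob,
    # otherwise rewire it to a random endpoint. Self-loops and duplicate edges
    # are ignored for brevity. Not the paper's algorithm or metrics.
    import random

    def perturb(edges, nodes, keep_prob=0.8, seed=0):
        """Return a perturbed edge set: higher keep_prob means more utility, less link privacy."""
        rng = random.Random(seed)
        perturbed = set()
        for u, v in edges:
            if rng.random() < keep_prob:
                perturbed.add((u, v))                  # keep the real link
            else:
                perturbed.add((u, rng.choice(nodes)))  # replace it with a random link
        return perturbed

    def edge_overlap(original, perturbed):
        """Crude utility proxy: fraction of original links that survive perturbation."""
        return len(set(original) & perturbed) / len(original)

    nodes = list(range(6))
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]
    noisy = perturb(edges, nodes, keep_prob=0.7)
    print(edge_overlap(edges, noisy))  # e.g. most of the original links remain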

    Privacy, Trust and Identity Permissions for Ambient Intelligence

