An identity- and trust-based computational model for privacy

Abstract

The seemingly contradictory needs of online users for both information sharing and privacy have inspired this thesis work. The crux of the problem lies in the fact that a user has inadequate control over the flow (with whom information is shared), boundary (acceptable usage), and persistence (duration of use) of their personal information. This thesis builds a privacy-preserving information-sharing model that uses context, identity, and trust to manage the flow, boundary, and persistence of disclosed information. In this vein, privacy is viewed as context-dependent selective disclosure of information. This thesis presents the design, implementation, and analysis of a five-layer Identity and Trust based Model for Privacy (ITMP). Context, trust, and identity are the main building blocks of this model. The application layer identifies the counterparts, the purpose of communication, and the information being sought. The context layer determines the context of a communication episode by identifying the role of a partner and assessing the relationship with the partner. The trust layer combines partner and purpose information with the respective context information to determine the trustworthiness of a purpose and a partner. Given that the purpose and the partner have a known level of trustworthiness, the identity layer constructs a contextual partial identity from the user's complete identity. The presentation layer facilitates the disclosure of a set of information that is a subset of the respective partial identity; it also attaches expiration (time-to-live) and usage (purpose-to-live) tags to each piece of information before disclosure.

In this model, roles and relationships are used to capture the notion of context adequately for addressing privacy. A role is a set of activities that an actor is assigned or expected to perform. For example, an actor in a learner role is expected to be involved in various learning activities, such as attending lectures, participating in course discussions, and taking exams. A relationship involves related entities performing activities that involve one another. Interactions between actors can be heavily influenced by roles. For example, in a learning-teaching relationship, both the learner and the teacher are expected to perform their respective roles. The nuances of the activities warranted by each role are dictated by individual relationships. For example, two learners seeking help from an instructor will present themselves differently.

In this model, trust is realized in two forms: trust in partners and trust of purposes. The first form of trust assesses the trustworthiness of a partner in a given context. For example, a stranger may be considered untrustworthy to be given a home phone number. The second form of trust determines the relevance or justification of a purpose for seeking data in a given context. For example, seeking or providing a social insurance number for the purpose of membership in a student organization is inappropriate. A known and tested trustee can understandably be re-trusted or re-evaluated based on the personal experience of a trustor. In online settings, however, a software manifestation of a trusted, persistent public actor, namely a guarantor, is required to help find a trustee, because we interact with a myriad of actors in a large number of contexts, often with no prior relationships.
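To make the layered flow concrete, the following minimal Python sketch traces a single disclosure decision through the model as described above: an application-layer request names the partner, the purpose, and the information sought; the context layer resolves a role and relationship; the two forms of trust gate the request; the identity layer carves a contextual partial identity out of the complete identity; and the presentation layer attaches time-to-live and purpose-to-live tags to each disclosed item. All class and function names, thresholds, and stub values here are hypothetical illustrations, not artifacts of the thesis or of iHelp.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Request:                 # application layer: who is asking, why, and for what
    partner: str
    purpose: str
    attributes: list[str]


@dataclass
class Context:                 # context layer: role of the partner and the relationship
    role: str
    relationship: str


@dataclass
class TaggedDatum:             # presentation layer output: one disclosed item
    name: str
    value: str
    expires_at: datetime       # expiration (time-to-live) tag
    allowed_purpose: str       # usage (purpose-to-live) tag


FULL_IDENTITY = {"name": "Alice", "email": "alice@example.org",
                 "home_phone": "555-0100", "sin": "000-000-000"}


def resolve_context(partner: str) -> Context:
    # Stub lookup; a real system would consult role and relationship records.
    return Context(role="learner", relationship="learning-teaching")


def trust_in_partner(partner: str, ctx: Context) -> float:
    # Trustworthiness of the partner in this context, on a 0..1 scale (stub value).
    return 0.8


def trust_of_purpose(purpose: str, ctx: Context) -> float:
    # Relevance or justification of the purpose in this context (stub values).
    return 0.9 if purpose == "course discussion" else 0.2


def partial_identity(ctx: Context, partner_trust: float) -> dict[str, str]:
    # Identity layer: carve a contextual partial identity out of the full identity.
    allowed = {"name"}
    if partner_trust >= 0.7:
        allowed.add("email")
    return {k: v for k, v in FULL_IDENTITY.items() if k in allowed}


def disclose(req: Request, ttl: timedelta = timedelta(days=30)) -> list[TaggedDatum]:
    ctx = resolve_context(req.partner)
    if trust_of_purpose(req.purpose, ctx) < 0.5:
        return []              # purpose not justified in this context: disclose nothing
    pid = partial_identity(ctx, trust_in_partner(req.partner, ctx))
    return [TaggedDatum(k, v, datetime.now() + ttl, req.purpose)
            for k, v in pid.items() if k in req.attributes]


# Only name and email are released; the social insurance number is withheld.
print(disclose(Request("instructor_bob", "course discussion", ["name", "email", "sin"])))
```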
The ITMP model is instantiated as a suite of Role- and Relationship-based Identity and Reputation Management (RRIRM) features in iHelp, an e-learning environment in use at the University of Saskatchewan. This thesis presents the results of a two-phase (pilot and larger-scale) user study that illustrates the effectiveness of the RRIRM features, and thus the ITMP model, in enhancing privacy through identity and trust management in the iHelp Discussion Forum. This research contributes to the understanding of privacy problems, along with other competing interests in the online world, as well as to the development of privacy-enhanced communications through understanding context, negotiating identity, and using trust.