9 research outputs found

    Multiparty Privacy in Social Media


    Systematic Review on Privacy Categorization

    In the modern digital world, users must make privacy and security choices that have far-reaching consequences. Researchers are increasingly studying people's decisions when faced with privacy and security trade-offs, the pressing and time-consuming disincentives that influence those decisions, and methods to mitigate them. This work presents a systematic review of the literature on privacy categorization, which has been defined in terms of profiles, profiling, segmentation, clustering and personae. Privacy categorization refers to the possibility of classifying users according to specific prerequisites, such as their ability to manage privacy issues, or in terms of which types of personal information, and how much of it, they decide to disclose or withhold. Privacy categorization has been defined and used for different purposes. The systematic review focuses on three main research questions, investigating: the study contexts, i.e. the motivations and research questions, that propose privacy categorizations; the methodologies and results of privacy categorizations; and the evolution of privacy categorizations over time. Ultimately, it tries to answer whether privacy categorization is still a meaningful research endeavour and whether it has a future.

    Modelling perceived risks to personal privacy from location disclosure on online social networks

    As users increasingly rely on online social networks for their communication activities, personal location data processing through such networks poses significant risks to users’ privacy. Location tracks can be mined together with other shared information to extract rich personal profiles. To protect users’ privacy, online social networks face the challenge of communicating transparently to users how their data are processed, and of explicitly obtaining users’ informed consent for the use of these data. In this paper, we explore the complex nature of the location disclosure problem and its risks to personal privacy. With an experiment involving 715 participants, we evaluate the factors contributing to the perception of such risks using scenarios that (a) mimic realistic modes of interaction, where users are not fully aware of the extent to which their location-related data are processed, and (b) deliberately inform users of the data they are sharing and its visibility to others. The results are used to represent users’ perception of privacy risks when sharing their location information online and to derive a possible model of the privacy risks associated with this sharing behaviour. Such a model can inform the design of privacy-aware online social networks, improve users’ trust, and help ensure compliance with legal frameworks for personal privacy.
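    The paper derives a possible model of perceived privacy risk from participants’ responses to the scenario factors. As a purely illustrative sketch, and not the study’s actual model, the snippet below fits a simple logistic regression of a binary “high perceived risk” response on hypothetical scenario factors (whether the user was informed, the visibility of the shared location, and its sensitivity) using synthetic data; every factor, weight, and data point here is an assumption.

```python
# Illustrative sketch only: fitting a simple model of perceived privacy risk
# from scenario factors, using synthetic data. The study's actual model form,
# factors, and coefficients are not reported here.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 715  # same number of participants as the experiment, but the data are synthetic

# Hypothetical scenario factors: informed (0/1), location visibility, location sensitivity.
X = np.column_stack([
    rng.integers(0, 2, n),   # explicitly informed about the data being shared?
    rng.uniform(0, 1, n),    # visibility of the shared location to others
    rng.uniform(0, 1, n),    # sensitivity of the location
])

# Synthetic "high perceived risk" responses generated from assumed weights.
logits = -1.0 + 1.2 * X[:, 0] + 1.5 * X[:, 1] + 0.8 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logits))

model = LogisticRegression().fit(X, y)
print(model.coef_)  # estimated contribution of each factor to perceived risk
```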

    Mapping user preference to privacy default settings

    Copyright © 2015 ACM. Managing the privacy of online information can be a complex task, often involving the configuration of a variety of settings. For example, Facebook users determine which audiences have access to their profile information and posts, how friends can interact with them through tagging, how others can search for them, and many more privacy tasks. In most cases, the default privacy settings are permissive and appear to be designed to promote information sharing rather than privacy, and users often do not change these defaults or use granular privacy settings. In this article, we investigate whether default privacy settings on social network sites could be better customized to the preferences of users. We survey users' privacy attitudes and sharing preferences for common SNS profile items. From these data, we explore using audience characterizations of profile items to quantify fit scores that indicate how well default privacy settings represent user privacy preferences. We then explore the fit of various schemes, including examining whether privacy attitude segmentation can be used to improve default settings. Our results suggest that using audience characterizations from community data to create default privacy settings can better match users' desired privacy settings.
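    To illustrate the idea of a fit score, the sketch below computes, for one hypothetical user, the fraction of profile items whose default audience is at least as restrictive as the user's preferred audience. The profile items, audience ordering, and scoring rule are illustrative assumptions, not the scheme used in the article.

```python
# Hypothetical sketch: scoring how well a default privacy scheme matches
# one user's preferred audiences for common SNS profile items.
# Audience levels are ordered from most public (0) to most restrictive (3).
AUDIENCES = {"public": 0, "friends_of_friends": 1, "friends": 2, "only_me": 3}

def fit_score(defaults: dict, preferences: dict) -> float:
    """Fraction of profile items whose default audience is at least as
    restrictive as the user's preferred audience (assumed scoring rule)."""
    matched = sum(
        AUDIENCES[defaults[item]] >= AUDIENCES[pref]
        for item, pref in preferences.items()
    )
    return matched / len(preferences)

# Example: a permissive default scheme vs. one user's stated preferences.
default_scheme = {"birthday": "public", "photos": "friends", "email": "friends"}
user_prefs = {"birthday": "friends", "photos": "friends", "email": "only_me"}
print(fit_score(default_scheme, user_prefs))  # ~0.33: defaults are restrictive enough for 1 of 3 items
```

    A community-derived default scheme could then be compared against many users' stated preferences and judged by its average fit.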

    Facebook: Where privacy concerns and social needs collide

    Facebook is an integral part of today’s social landscape, but Facebook use involves compromising one’s privacy in relation both to other users and to the Facebook corporation and its affiliated businesses. This analysis explores respondents’ reasons for using Facebook together with their Facebook-related privacy concerns, and how these factors influence self-disclosures and privacy management strategies on the site. Also explored are respondents’ perceptions both of what the Facebook corporation ‘knows’ about them and of with whom it shares their data. The research is based on the concepts of user-user and user-corporate privacy concerns versus the social needs of self-portrayal and belonging. Self-portrayal (inspired by Friedlander, 2011) is explored in the contexts of both strategic self-presentation and expression of the true self, and belonging is explored in the contexts of both intimacy and affiliation. These concepts have been drawn from a combination of psychological theories together with existing research on privacy concerns and social needs on social networking sites. Respondents completed an online questionnaire over a six-week period from late August to early October 2014, and a focus group was held in November 2014. The questionnaire was largely quantitative but allowed for qualitative input via text boxes. There were 404 completed and valid responses; of the demographic factors tested, gender was most strongly associated with Facebook-related privacy concerns and age was most strongly associated with reasons for using Facebook. Respondents indicated a clash between fulfilling their social needs on Facebook and their privacy concerns on the site. However, these concerns did not, for the most part, stop them using Facebook, although in certain instances respondents employed tactics to minimise their privacy concerns. This thesis argues that, when using Facebook, respondents resolved the privacy paradox to the best of their ability. It is anticipated that the findings of this thesis will contribute to the ongoing dialogue surrounding the benefits and drawbacks of social media use.

    A Privacy-Enhancing Framework for Mobile Devices

    The use of mobile devices in daily life has increased exponentially, and they now occupy many essential aspects of people’s lives, such as replacing credit cards to make payments and supporting various forms of entertainment and social activity. Users have therefore installed an enormous number of apps. These apps can collect and share a large amount of data, such as location data, images, videos, health data, and call logs, which are highly valuable and sensitive for users. Consequently, the use of apps raises a variety of privacy concerns: which apps are allowed to access and share this data, at what level of granularity, and how its disclosure can be managed and limited. Accordingly, it is imperative to design a holistic solution for enhancing privacy in mobile apps that meets users’ privacy preferences. The research design in this study addresses the problem in a coherent and logical way. The research involved different phases, starting with identifying potential user requirements based on the literature, then designing a participatory study to explore whether the initial requirements and design meet users’ preferences, which in turn led to the design of a final artefact. Design science requires the creation of a viable artefact for the current problem in the field. Thus, this study reviews the current use of privacy technologies and critically analyses the available solutions in order to investigate whether they can meet personal privacy preferences and maximise users’ satisfaction. Most prior studies assume that privacy preferences are homogeneous across users, yet preferences differ from one user to another in how users control and manage their data, how they prioritise information, the notifications they want, and their levels of knowledge. Moreover, solutions with a user interface designed according to users’ perceptions and based on HCI principles are not readily available. It is therefore paramount to adopt users’ needs and requirements in order to enhance privacy technology for mobile apps. A survey of 407 mobile users was undertaken to discover users’ privacy preferences. The outcome of the survey shows that users can be grouped into 10 unique profiles based on how they prioritise information. Each profile effectively represents a cluster of like-minded users and captures their privacy-related information preferences. The analysis also revealed that users differ not only in how they prioritise their information, but also regarding design, protection settings, responses, and level of knowledge. This, in turn, emphasises the need to develop and design a holistic solution for users that considers all these dimensions. As such, the thesis proposes a novel framework for enhancing privacy technology in a modular and robust manner that would support such a system in practice. This system provides a comprehensive solution developed by considering different dimensions: a personalised response, prioritisation of privacy-related information, multilevel privacy controls, and users’ varying levels of knowledge. As a result, this approach should enhance users’ privacy awareness and meet their needs to protect their privacy. Additionally, the proposed system includes user interfaces designed according to users’ perceptions and based on HCI principles, to overcome usability issues without compromising users’ convenience. Ultimately, the evaluation of the proposed approach shows that it is feasible and would enhance privacy technology as well as user convenience. This, in turn, would increase trust in the system and reduce privacy concerns.
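    The abstract does not state which technique produced the 10 profiles. As a minimal sketch of how such profiles could be derived, the snippet below clusters synthetic survey responses with k-means; the numbers of respondents and profiles match the abstract, but the features, data, and choice of algorithm are assumptions.

```python
# Hypothetical sketch: deriving privacy profiles by clustering survey responses.
# The thesis reports 10 profiles from 407 respondents; the features, data, and
# clustering algorithm here are illustrative assumptions, not the actual method.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n_users, n_items = 407, 12  # e.g. 12 Likert-scale items on data-sharing comfort
responses = rng.integers(1, 6, size=(n_users, n_items))  # synthetic 1-5 ratings

X = StandardScaler().fit_transform(responses)
profiles = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X)

# Each user is assigned to one of 10 profiles; the centroids summarise each
# profile's typical information-sharing preferences.
print(np.bincount(profiles.labels_))    # number of users per profile
print(profiles.cluster_centers_.shape)  # (10, 12)
```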