16 research outputs found

    Understanding Organizational Approach towards End User Privacy

    This research employed a longitudinal social network analysis (SNA) method, stochastic actor-oriented modelling (SAOM), to analyse the inter-relationship between employees’ socialisation and their information security (IS) climate perceptions, i.e., employees’ perceptions of their colleagues’ and supervisors’ IS practices. Unlike prior studies, we conceptualised socialisation as six networks: the exchange of work advice, the exchange of organisational updates, the provision of personal advice, interpersonal trust in expertise, the provision of IS advice, and support for IS troubleshooting. The SAOM method enabled analysis not only of why an employee chooses to interact with, or send a network tie to, another employee, but also of how an employee’s perception of the IS climate is affected by the ties they possess in the network. This research suggests new directions for IS behavioural research based on the adoption of SNA methods to study IS-related perceptions and behaviours, while the findings on selection and influence mechanisms offer theoretical insights and practical methods for enhancing IS in the workplace.

    Designing Privacy for You: A User Centric Approach For Privacy

    Privacy directly concerns the user as the data owner (data-subject), and hence privacy in systems should be implemented in a manner that centres on the user (user-centered). There are many concepts and guidelines that support the development of privacy and its embedding into systems; however, none of them approaches privacy in a user-centered manner. Through this research we propose a framework that enables developers and designers to grasp privacy in a user-centered manner and implement it throughout the software development life cycle. Comment: 14 pages, HCI International 2017, Vancouver, Canada.

    Designing a Serious Game: Teaching Developers to Embed Privacy into Software Systems

    Software applications continue to challenge user privacy when users interact with them. Privacy practices (e.g. Data Minimisation (DM), Privacy by Design (PbD), and the General Data Protection Regulation (GDPR)) and related "privacy engineering" methodologies exist and provide clear instructions for developers to embed privacy into the software systems they develop. However, these practices and methodologies are not yet common practice in the software development community, and no previous research has focused on "educational" interventions, such as serious games, to improve software developers' coding behaviour. This research therefore proposes a game design framework as an educational tool for software developers to improve their (secure) coding behaviour, so that they can develop privacy-preserving software applications. The elements of the proposed framework were incorporated into a gaming application scenario that enhances developers' coding behaviour through motivation. The proposed work not only enables the development of privacy-preserving software systems but also helps the software development community put privacy guidelines and engineering methodologies into practice.

    A Model for System Developers to Measure the Privacy Risk of Data

    In this paper, we propose a model that system developers can use to measure users' perceived privacy risk when they disclose data into software systems. We first derive a model of perceived privacy risk based on existing knowledge and then test it through a survey with 151 participants. Our findings reveal that users' perceived privacy risk monotonically increases with data sensitivity and visibility, and monotonically decreases with the data's relevance to the application. Furthermore, how visible the data is in an application by default when the user discloses it had the highest impact on perceived privacy risk. This model enables developers to measure the perceived privacy risk associated with individual data items, helping them understand how to treat different data within a system design.
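    The abstract does not give the model's formula; the following is a purely illustrative sketch of a risk score with the reported monotone properties (increasing in sensitivity and visibility, decreasing in relevance). The linear form, the `perceived_privacy_risk` function, and all weight values are hypothetical assumptions, not taken from the paper.

```python
def perceived_privacy_risk(sensitivity, visibility, relevance,
                           w_s=1.0, w_v=1.5, w_r=1.0):
    """Illustrative only: a score that rises with data sensitivity and
    visibility and falls with relevance to the application, mirroring the
    direction of the survey findings above. All inputs are on a 0..1 scale.
    w_v is set largest to echo the finding that default visibility had the
    highest impact, but the weights themselves are guesses."""
    for x in (sensitivity, visibility, relevance):
        if not 0.0 <= x <= 1.0:
            raise ValueError("inputs must be in [0, 1]")
    return w_s * sensitivity + w_v * visibility - w_r * relevance

# A highly sensitive, publicly visible, irrelevant data item scores higher
# than a low-sensitivity, private, clearly relevant one.
high = perceived_privacy_risk(sensitivity=1.0, visibility=1.0, relevance=0.0)
low = perceived_privacy_risk(sensitivity=0.1, visibility=0.0, relevance=1.0)
```

    Any real deployment of such a model would calibrate the functional form and weights against survey data, as the paper does with its 151 participants.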

    Will They Use It or Not? Investigating Software Developers’ Intention to Follow Privacy Engineering Methodologies

    With increasing concerns over privacy in software systems, there is growing enthusiasm for methods that support the development of privacy-aware software systems. Inadequate privacy in software system designs can result in users losing sensitive data, such as health and financial information, which may cause financial and reputational loss. Privacy Engineering Methodologies (PEMs) are introduced into software development processes to guide developers in embedding privacy into the systems they design. However, for PEMs to succeed, it is imperative that software developers have a positive intention to use them; otherwise, developers may bypass the methodologies or apply them only partially, and hence develop software systems that do not protect user privacy appropriately. To investigate the factors that affect software developers' behavioural intention to follow PEMs, we conducted a study with 149 software developers. The findings show that the usefulness of a PEM to developers' existing work is the strongest determinant of their intention to follow it. The compatibility of the PEM with their way of working and how well the PEM demonstrates its results when used were also significant. These findings provide important insights into how software developers behave and how they perceive PEMs, and could assist organisations and researchers in deploying and designing PEMs that developers accept positively.