11 research outputs found

    Understanding Organizational Approach towards End User Privacy

    Get PDF
    This research employed a longitudinal social network analysis (SNA) method, stochastic actor-oriented modelling (SAOM), to analyse the inter-relationship between employees' socialisation and their information security (IS) climate perceptions, i.e. their perceptions of their colleagues' and supervisors' IS practices. Unlike prior studies, we conceptualised socialisation in the form of six networks: the exchange of work advice, the exchange of organisational updates, the provision of personal advice, interpersonal trust in expertise, the provision of IS advice, and support for IS troubleshooting. The SAOM method enabled analysis not only of why an employee chooses to interact with, or send a network tie to, another employee, but also of how an employee's perception of IS climate is affected by the ties they possess in the network. This research suggests new directions for behavioural IS research based on the adoption of SNA methods to study IS-related perceptions and behaviours, while the findings about selection and influence mechanisms offer theoretical insights and practical methods to enhance IS in the workplace.
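
    As an illustrative aside (not the authors' code), the sketch below shows the kind of longitudinal network data a SAOM analysis consumes: one adjacency matrix per network per observation wave, plus a changing IS climate score per employee. SAOM estimation itself is usually carried out with dedicated tools such as the RSiena package in R; every identifier, count and probability below is a hypothetical assumption, with only the six network names taken from the abstract.

    # Minimal sketch (not the study's code) of the data layout that a
    # stochastic actor-oriented model is fitted to. All sizes and tie
    # probabilities here are illustrative assumptions.
    import numpy as np

    N_EMPLOYEES = 30          # hypothetical panel size
    WAVES = 3                 # hypothetical number of observation waves
    NETWORKS = [
        "work_advice",        # exchange of work advice
        "org_updates",        # exchange of organisational updates
        "personal_advice",    # provision of personal advice
        "trust_in_expertise", # interpersonal trust in expertise
        "is_advice",          # provision of IS advice
        "is_troubleshooting", # support for IS troubleshooting
    ]

    rng = np.random.default_rng(0)

    # One binary adjacency matrix per network per wave: ties[i][j] == 1
    # means employee i sends a tie (e.g. asks advice) to employee j.
    ties = {
        name: [(rng.random((N_EMPLOYEES, N_EMPLOYEES)) < 0.1).astype(int)
               for _ in range(WAVES)]
        for name in NETWORKS
    }

    # A changing actor covariate: each employee's IS climate perception
    # score per wave (e.g. a 1-5 survey scale). SAOM can model both
    # selection (scores shaping tie choices) and influence (ties shaping
    # scores) from exactly this kind of data.
    is_climate = rng.integers(1, 6, size=(WAVES, N_EMPLOYEES))

    print(ties["is_advice"][0].sum(), "IS-advice ties at wave 1")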

    Designing Privacy for You: A User-Centric Approach for Privacy

    Full text link
    Privacy directly concerns the user as the data owner (data subject), and hence privacy in systems should be implemented in a manner that centres on the user (user-centered). There are many concepts and guidelines that support the development of privacy and the embedding of privacy into systems; however, none of them approaches privacy in a user-centered manner. Through this research we propose a framework that would enable developers and designers to grasp privacy in a user-centered manner and implement it throughout the software development life cycle. (14 pages, HCI International 2017, Vancouver, Canada)

    Designing a Serious Game: Teaching Developers to Embed Privacy into Software Systems

    Full text link
    Software applications continue to challenge user privacy when users interact with them. Privacy practices such as Data Minimisation (DM) and Privacy by Design (PbD), regulations such as the General Data Protection Regulation (GDPR), and related "privacy engineering" methodologies exist and provide clear instructions for developers on implementing privacy into the software systems they develop. However, these practices and methodologies are not yet common practice in the software development community, and no previous research has focused on developing "educational" interventions such as serious games to enhance software developers' coding behaviour. Therefore, this research proposes a game design framework as an educational tool for software developers to improve their (secure) coding behaviour, so that they can develop privacy-preserving software applications. The elements of the proposed framework were incorporated into a gaming application scenario that enhances software developers' coding behaviour through their motivation. The proposed work not only enables the development of privacy-preserving software systems but also helps the software development community put privacy guidelines and engineering methodologies into practice.

    A Model for System Developers to Measure the Privacy Risk of Data

    Get PDF
    In this paper, we propose a model that system developers could use to measure the perceived privacy risk of users when they disclose data into software systems. We first derive a model to measure perceived privacy risk based on existing knowledge, and then test the model through a survey with 151 participants. Our findings revealed that users' perceived privacy risk monotonically increases with data sensitivity and visibility, and monotonically decreases with the data's relevance to the application. Furthermore, how visible data is in an application by default at the point of disclosure had the highest impact on perceived privacy risk. This model would enable developers to measure the perceived privacy risk associated with individual data items, helping them to understand how to treat different data within a system design.
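
    The paper's fitted model is not reproduced here, but a minimal sketch can illustrate the reported monotonic relationships: risk rising with sensitivity and visibility and falling with relevance. The linear form, the weights, and the heavier weight on visibility (the factor the survey found most influential) are illustrative assumptions, not the paper's actual coefficients.

    # Hedged sketch, not the paper's fitted model: one simple function
    # with the monotonic properties the study reports. The weights are
    # hypothetical, with visibility weighted highest to reflect the
    # finding that default visibility had the largest impact.
    def perceived_privacy_risk(sensitivity: float,
                               visibility: float,
                               relevance: float) -> float:
        """Perceived risk in [0, 1]; all inputs assumed normalised to [0, 1]."""
        w_sens, w_vis, w_rel = 0.3, 0.5, 0.2   # hypothetical weights
        risk = w_sens * sensitivity + w_vis * visibility + w_rel * (1.0 - relevance)
        return max(0.0, min(1.0, risk))

    # Example: highly sensitive, highly visible, barely relevant data
    # scores near the top of the scale.
    print(perceived_privacy_risk(sensitivity=0.9, visibility=0.9, relevance=0.1))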

    Embedding Privacy into Software Systems: A Privacy Engineering Methodology for Data Minimisation

    Full text link
    Ubiquitous software systems (e.g. online shopping and social networking apps) today collect, store and process user data such as a user's name, age, credit card number and location. If these systems collect unnecessary data, or store and share data without implementing privacy, the data could be hacked and used to steal a user's identity, or to cause reputational and/or financial loss to users. Therefore, systems should be designed with privacy in mind. Data Minimisation (DM) is a privacy concept recognised in the European General Data Protection Regulation (GDPR), which requires that systems minimise the collection and use of data by design. However, the developers who design systems are not privacy experts, and they are unable to implement DM in systems without guidance. Therefore, the research reported in this thesis focuses on developing a Privacy Engineering Methodology (PEM) that enables developers to implement DM in software systems through understanding data privacy risks. Three experiments were conducted in this endeavour. The first investigated the difficulties developers face when incorporating privacy concepts such as DM into their development practices; the findings showed that developers lacked knowledge of privacy concepts and that most concepts are not compatible with the way developers usually work. The second investigated the privacy risks associated with data in software systems; the results indicated that the sensitivity and visibility of data in a system were directly proportional to data privacy risk, while the relevance of data to the system was inversely proportional to it. Knowledge from the first two experiments was used to develop a PEM that enables developers to practice DM through understanding the data privacy risks associated with data. The final experiment investigated software developers' intention to use the PEM, using a modified version of the Technology Acceptance Model (TAM). Results indicated that developers had a positive intention to use the PEM and that understanding data privacy risks enables developers to decide how to ensure user privacy in systems. This thesis therefore determines that data privacy risks could be used as an effective tool to enable software developers to practice DM, and recommends that common privacy theories be presented as PEMs so that developers can use them within their development practices.
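
    As a hedged illustration of the thesis's core idea (not the PEM itself), the sketch below shows one way per-field privacy-risk estimates could drive a Data Minimisation decision at design time. The field metadata, the stand-in risk formula and the threshold are all assumptions introduced for this example.

    # Illustrative sketch only: applying per-field privacy-risk estimates
    # to practice Data Minimisation. The risk formula is a simple stand-in
    # consistent with the reported trends (risk rises with sensitivity and
    # visibility, falls with relevance); it is not the thesis's model.
    from dataclasses import dataclass

    @dataclass
    class DataField:
        name: str
        sensitivity: float   # 0..1, how sensitive the data is
        visibility: float    # 0..1, how visible it is in the system by default
        relevance: float     # 0..1, how relevant it is to the system's purpose

        @property
        def risk(self) -> float:
            # Hypothetical equal-weight estimate of data privacy risk.
            return (self.sensitivity + self.visibility + (1.0 - self.relevance)) / 3.0

    def minimise(fields: list[DataField], threshold: float = 0.5) -> list[DataField]:
        """Keep only fields whose estimated privacy risk is acceptable."""
        return [f for f in fields if f.risk <= threshold]

    form = [
        DataField("email", sensitivity=0.4, visibility=0.2, relevance=0.9),
        DataField("home_address", sensitivity=0.8, visibility=0.6, relevance=0.2),
    ]
    print([f.name for f in minimise(form)])  # -> ['email']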