
    A Tool for Improving Privacy in Software Development

    Privacy is considered a necessary requirement for software development, and it is necessary to understand how certain software vulnerabilities can create problems for organizations and individuals. In this context, privacy-oriented software development plays a primary role in reducing problems that can arise simply from individuals’ interactions with software applications, even when the data being processed is not directly linked to identifiable individuals. The loss of confidentiality, integrity, or availability at some point in data processing, such as data theft by external attackers or the unauthorized access or use of data by employees, represents one class of cybersecurity-related privacy events. Therefore, this research work discusses the formalization of five key privacy elements in software development (Privacy by Design Principles, Privacy Design Strategies, Privacy Patterns, Vulnerabilities, and Context) and presents a privacy tool that supports developers’ decisions to integrate privacy and security requirements into all software development phases.

    Developers' Privacy Education: A game framework to stimulate secure coding behaviour

    Software privacy provides the ability to limit data access to unauthorized parties. Privacy is achieved through different means, such as implementing GDPR requirements in software applications. However, previous research revealed that poor coding behaviour leads to privacy violations such as personal data breaches. Therefore, this research proposes a novel game framework as a training intervention enabling developers to implement privacy-preserving software systems. The proposed framework was empirically investigated through a survey study (with some exit questions), which revealed the elements that should be addressed in the game design framework. The developed game framework enhances developers' secure coding behaviour and thereby improves the privacy of the software they develop.

    SoK: Demystifying Privacy Enhancing Technologies Through the Lens of Software Developers

    In the absence of data protection measures, software applications lead to privacy breaches, posing threats to end-users and software organisations. Privacy Enhancing Technologies (PETs) are technical measures that protect personal data, thus minimising such privacy breaches. However, for software applications to deliver data protection using PETs, software developers should actively and correctly incorporate PETs into the software they develop. Therefore, to uncover ways to encourage and support developers to embed PETs into software, this Systematic Literature Review (SLR) analyses 39 empirical studies on developers' privacy practices. It reports the usage of six PETs in software application scenarios. Then, it discusses the challenges developers face when integrating PETs into software, ranging from intrinsic challenges, such as unawareness of PETs, to extrinsic challenges, such as increased development cost. Next, the SLR presents existing solutions to these challenges, along with their limitations. Further, it outlines future research avenues to better understand PETs from a developer perspective and to minimise the challenges developers face when incorporating PETs into software.

    European Privacy by Design [pre-defense version]

    Three competing forces are shaping the concept of European Privacy by Design (PbD): laws and regulations, business goals, and architecture designs. These forces carry their own influence in terms of ethics, economics, and technology. In this research we undertook the journey of understanding the concept of European PbD, examining its nature, application, and enforcement. We concluded that European PbD is under-researched in two respects: at the organizational level (compared to the individual level), and above all in the way it is enforced by authorities. We had high hopes especially with regard to the latter, and were eager to make a significant scientific contribution in this field. We were interested to learn whether data protection authorities, through their enforcement of European PbD, can pioneer new approaches to privacy preservation. This is why we elaborated on possible ways to measure their activity, in a manner that both legal and non-legal experts can understand. We promised a response to the research question: can the enforcement of European PbD be measured and, if yes, what are possible ways to do so? We conducted analytics on quantitative and qualitative data to answer this question as well as possible.

    Our answer is a moderate yes: the enforcement of PbD can be measured, although at this point we must settle for good-enough measures rather than seek the most optimal ones. One reason is that PbD enforcement cases are highly customized and specific to their own circumstances; we showed this while creating models to predict the amount of administrative fines for infringement of the GDPR, and clustering these cases proved a daunting task. A second reason is the lack of data availability in Europe, which has its roots in the philosophical stance the European legislator takes on data collection within the EU: lawmakers in Europe are firmly opposed to programs that collect vast amounts of personal data from EU citizens. A third reason is the inconsistency between data protection authorities' practices, which stems from their differing levels of competence, reporting structures, staff numbers, and experience.

    Looking beyond these limitations, there are certainly ways to measure the enforcement of European PbD. Our measurements helped us formulate the following statements:
    a. The European PbD operates in 'data saver' mode: analogous to the data-saving mode on mobile phones, where most applications and services get background data only via Wi-Fi, data collection and processing in Europe is kept to a minimum. We therefore argue that European PbD is, in essence, about data minimization; our conviction that the concept is more oriented towards data security was partially refuted.
    b. The European PbD is platform independent: the thesis elaborates on various infrastructures and convergent technologies that are compatible with the PbD principles. We consider the concept to be evolutionary and technology-neutral.
    c. The European PbD is a tool obligation: we argue that the authorities treat PbD as a tool-utilization obligation. In simple terms, companies should first perform a privacy impact assessment to find out which tools support their data processing activities, and then implement those tools, as PbD mandates.
    d. The European PbD is highly territorial: we reached the conclusion that enforcement of PbD depends strongly on geographical indicators (i.e. countries and counties), as different privacy protection cultures are still present in Europe. What is commonly true across all countries is that European PbD mandates strong EU data sovereignty.
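    The abstract mentions building models to predict GDPR administrative fines but does not specify the method. As a rough illustration of that kind of analysis only, the sketch below fits an ordinary least-squares line from company turnover to fine size; all numbers are hypothetical and not drawn from the thesis or from real GDPR cases.

    ```python
    from statistics import linear_regression  # Python 3.10+

    # Hypothetical training data: annual turnover (EUR millions) of fined
    # companies and the administrative fine imposed (EUR thousands).
    # Illustrative values only, not real enforcement data.
    turnover = [10, 50, 120, 300, 800]
    fine = [20, 90, 260, 610, 1650]

    # Ordinary least-squares fit: fine ≈ slope * turnover + intercept
    slope, intercept = linear_regression(turnover, fine)

    def predict_fine(t: float) -> float:
        """Predict the fine (EUR thousands) for a company with turnover t."""
        return slope * t + intercept

    print(f"predicted fine for EUR 200M turnover: {predict_fine(200):.0f}k EUR")
    ```

    A real model of this kind would use many more case features (sector, violation type, country, cooperation with the authority), which is precisely why the abstract notes that the highly case-specific nature of enforcement makes such modelling and clustering difficult.
    
    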