16 research outputs found

    The RFID PIA – developed by industry, agreed by regulators

    This chapter discusses the privacy impact assessment (PIA) framework endorsed by the European Commission on February 11, 2011. This PIA, the first to receive the Commission's endorsement, was developed to address the privacy challenges associated with the deployment of radio frequency identification (RFID) technology, a key building block of the Internet of Things. The goal of this chapter is to present the methodology and key constructs of the RFID PIA Framework in more detail than was possible in the official text. RFID operators can use this article as a support document when they conduct PIAs and need to interpret the PIA Framework. The chapter begins with a history of why and how the PIA Framework for RFID came about. It then describes the endorsed PIA process for RFID applications and explains in detail how this process is supposed to function. It provides examples discussed during the development of the PIA Framework; these examples reflect the rationale behind and evolution of the text's methods and definitions. The chapter also provides insight into the stakeholder debates and compromises that have important implications for PIAs in general.
    Series: Working Papers on Information Systems, Information Business and Operation

    Human Values as the Basis for Sustainable Information System Design

    Information systems (IS) play an increasing role in individual well-being [3], in the environment [4], and in society at large [5]. Considering sustainability in IS development is therefore becoming paramount. However, companies today associate sustainability with extra cost and a burden on their operations. As a result, many view sustainability more as a demand and a challenge than as an opportunity. In this article, we argue that companies should rethink this attitude, as both sustainability and a business model can be understood as deeply rooted in human values.

    The Challenges of Privacy by Design

    Heralded by regulators, Privacy by Design holds the promise of solving the digital world's privacy problems. But there are immense challenges, including securing management commitment and devising step-by-step methods to integrate privacy into systems.

    About the Importance of Interface Complexity and Entropy for Online Information Sharing

    In this paper, we describe two experiments that show the powerful influence of interface complexity and entropy on online information sharing behaviour. 134 participants were asked to complete a creativity test and answer six open questions against three different screen backgrounds of increasing complexity. Our data show that, as an interface becomes more complex and entropic, users refer less to themselves and show less information sharing breadth. However, their verbal creativity and information sharing depth do not suffer in the same way. Instead, an inverse U-shaped relationship between interface complexity and creativity as well as information sharing depth can be observed: users become more creative and thoughtful until a certain tipping point of interface complexity is reached. At that point, creativity and thinking suffer, leading to significantly less disclosure. This result challenges the general HCI assumption that simplicity is always best for computer interface design, as users' creativity and information sharing depth initially increase with more interface complexity. Our results suggest that the Yerkes-Dodson Law may be a key theory underlying online creativity and the depth of online disclosures.

    A vision for global privacy bridges: Technical and legal measures for international data markets

    From the early days of the information economy, personal data has been its most valuable asset. Despite data protection laws and an acknowledged right to privacy, trading personal information has become a business equated with "trading oil". Most of this business is done without the knowledge and active informed consent of the people concerned. But as data breaches and abuses are made public through the media, consumers react: they become irritated about companies' data handling practices, lose trust, exercise political pressure and start to protect their privacy with the help of technical tools. As a result, companies' Internet business models that are based on personal data are unsettled. An open conflict is arising between business demands for data and a desire for privacy. As of 2015, no true answer to this conflict is in sight. Technologists, economists and regulators are struggling to develop technical solutions and policies that meet businesses' demand for more data while still maintaining privacy. Yet most of the proposed solutions fail to account for market complexity and provide no pathway to technological and legal implementation; they lack a bigger vision for data use and privacy. To break this vicious cycle, we propose and test such a vision of a personal information market with privacy. We accumulate technical and legal measures that have been proposed by technical and legal scholars over the past two decades, and out of this existing knowledge we compose something new: a four-space market model for personal data.

    Twenty years of value sensitive design: a review of methodological practices in VSD projects

    This article reviews the academic literature (1996-2016) that emerged under value sensitive design (VSD). It investigates those VSD projects that employed the tripartite methodology, examining the use of VSD methodological elements, illustrating common practices, and identifying shortcomings. The article provides advice for VSD researchers on how to complete and enhance their methodological approach as the research community moves forward.

    A systematic methodology for privacy impact assessments: a design science approach

    For companies that develop and operate IT applications that process the personal data of customers and employees, a major problem is protecting these data and preventing privacy breaches. Failure to adequately address this problem can result in considerable damage to the company's reputation and finances, as well as negative effects for customers or employees (data subjects). To address this problem, we propose a methodology that systematically considers privacy issues by using a step-by-step privacy impact assessment (PIA). Existing PIA approaches cannot be applied easily because they are improperly structured, or are imprecise and lengthy. We argue that companies that employ our PIA can achieve "privacy by design", which is widely heralded by data protection authorities. In fact, the German Federal Office for Information Security (BSI) ratified the approach we present in this article for the technical field of RFID and published it as a guideline in November 2011. The contribution of the artefacts we created is twofold: first, we provide a formal problem representation structure for the analysis of privacy requirements; second, we reduce the complexity of the privacy regulation landscape for practitioners who need to make privacy management decisions for their IT applications.

    Engineering Privacy by Design: Are engineers ready to live up to the challenge?

    Organizations struggle to comply with legal requirements as well as customers' calls for better data protection. Yet information privacy depends on system engineers putting effort into the matter. We interviewed six senior system engineers who work for globally leading IT corporations and research institutions in order to investigate their motivation and ability to comply with privacy expectations. The results of our in-depth interview study point to a lack of perceived responsibility, control and autonomy, and to a struggle with the legal world. The information society may be facing the dilemma of asking engineers to live up to a challenge they are currently not ready to embrace.

    Understanding Engineers' Drivers and Impediments for Ethical System Development: The Case of Privacy and Security Engineering

    Machine ethics is a key challenge in times when digital systems play an increasing role in people's lives. At the core of machine ethics is the handling of personal data and the security of machine operations. Yet privacy and security engineering are a challenge in today's business world, where personal data markets, corporate deadlines and a lack of perfectionism frame the context in which engineers need to work. Besides these organizational and market challenges, each engineer has his or her specific view on the importance of these values, which can foster or inhibit taking them into consideration. We present the results of an empirical study of 124 engineers, based on the Theory of Planned Behavior and Jonas' Principle of Responsibility, to understand the drivers and impediments of ethical system development as far as privacy and security engineering are concerned. We find that many engineers consider the two values important, but do not enjoy working on them. We also find that many struggle with the organizational environment: they face a lack of the time and autonomy that is necessary for building ethical systems, even at this basic level. Organizations' privacy and security norms are often too weak or even oppose value-based design, putting engineers in conflict with their organizations. Our data indicate that it is largely engineers' individually perceived responsibility, as well as a few character traits, that make a positive difference.

    Towards a value theory for personal data

    Analysts, investors and entrepreneurs have recognized the value of personal data for Internet economics. Personal data is viewed as the "oil" of the digital economy. Yet ordinary people are barely aware of this. Marketers collect personal data at minimal cost in exchange for free services. But will this be possible in the long term, especially in the face of privacy concerns? Little is known about how users really value their personal data. In this paper, we build a user-centered value theory for personal data. On the basis of a survey experiment with 1269 Facebook users, we identify core constructs that drive the value of volunteered personal data. We find that privacy concerns are less influential than expected and influence data value mainly when people become aware of data markets. In fact, the consciousness of data being a tradable asset is the single most influential factor driving willingness-to-pay for data. Furthermore, we find that people build a sense of psychological ownership for their data and hence value it more. Finally, our value theory helps to unveil market design mechanisms that will influence how personal data markets thrive. First, we observe that a majority of users become reactant if they are consciously deprived of control over their personal data; many drop out of the market. We therefore advise companies to consider user-centered data control tools to encourage users to participate in personal data markets. Second, we find that in order to create scarcity in the market, centralized IT architectures (reducing multiple data copies) may be beneficial.