8 research outputs found
Designing for GDPR - Investigating Children's Understanding of Privacy: A Survey Approach
The General Data Protection Regulation (GDPR) places new obligations on businesses that collect and process data from children. It goes so far as to say that privacy notices should be presented in child-friendly and age-appropriate formats. Fulfilling GDPR obligations will require designers to have a better understanding of how children understand privacy issues. This research aims to investigate children's understanding of privacy online. Thirty-two children from a UK primary school, aged between 8 and 10 years old, completed a survey to gauge their understanding of privacy. Eight different scenarios were presented to the children, and they had to decide whether the information should be kept private or not and state the reason why. This work identifies that children do have an understanding of privacy, especially when related to online safety. However, children do not yet understand that their data has an inherent value, and they hold misconceptions about data and about what data should be protected. This highlights the challenges that designers of technology used by children face in meeting the GDPR obligations.
Towards a Multidimensional Model to Represent Human Behaviour on Online Social Networks
Online social networks have been growing exponentially. Every day, large numbers of new users are immersed in this environment, sharing and interacting using many different methods, tools and devices. However, this ever-growing environment leads to a variety of security and privacy concerns. Addressing this challenge, this paper proposes a discussion of risks and issues that arise from user behaviours on OSNs. To this end, a multidimensional model is presented to support the identification and analysis of such behaviours. This model comprises three dimensions, namely (depth of) involvement, (width of) perception and (height of) action. Furthermore, a list of ten possible disclosure behaviours, divided across the three dimensions, is presented and discussed. This paper analyses how these behaviours can be transformed into Personal Information Disclosure (PID).
Evaluating the End-User Experience of Private Browsing Mode
Nowadays, all major web browsers have a private browsing mode. However, the
mode's benefits and limitations are not well understood. Through the
use of survey studies, prior work has found that most users are either unaware
of private browsing or do not use it. Further, those who do use private
browsing generally have misconceptions about what protection it provides.
However, prior work has not investigated why users misunderstand the
benefits and limitations of private browsing. In this work, we do so by
designing and conducting a three-part study: (1) an analytical approach
combining cognitive walkthrough and heuristic evaluation to inspect the user
interface of private mode in different browsers; (2) a qualitative,
interview-based study to explore users' mental models of private browsing and
its security goals; (3) a participatory design study to investigate why
existing browser disclosures, the in-browser explanations of private browsing
mode, do not communicate the security goals of private browsing to users.
Participants critiqued the browser disclosures of three web browsers: Brave,
Firefox, and Google Chrome, and then designed new ones. We find that the user
interface of private mode in different web browsers violates several
well-established design guidelines and heuristics. Further, most participants
had incorrect mental models of private browsing, influencing their
understanding and usage of private mode. Additionally, we find that existing
browser disclosures are not only vague, but also misleading. None of the three
studied browser disclosures communicates or explains the primary security goal
of private browsing. Drawing from the results of our user study, we extract a
set of design recommendations that we encourage browser designers to validate,
in order to design more effective and informative browser disclosures related
to private mode.
Elicitation and Empathy with AI-enhanced Adaptive Assistive Technologies (AATs)
Efforts to include people with disabilities in design education are difficult to scale, and dynamics of participation need to be carefully planned to avoid putting unnecessary burdens on users. However, given the scale of emerging AI-enhanced technologies and their potential for creating new vulnerabilities for marginalized populations, new methods for generating empathy and self-reflection in technology design students (as the future creators of such technologies) are needed. We report on a study with Information Systems graduate students where they used a participatory elicitation toolkit to reflect on two cases of end-user privacy perspectives towards AI-enhanced tools in the age of surveillance capitalism: their own when using tools to support learning, and those of older adults using AI-enhanced adaptive assistive technologies (AATs) that help with pointing and typing difficulties. In drawing on the experiences of students with intersectional identities, our exploratory study aimed to incorporate intersectional thinking in privacy elicitation and further understand its role in enabling sustainable, inclusive design practice and education. While aware of the risks to their own privacy and the role of identity and power in shaping experiences of bias, students who used the toolkit were more sanguine about risks faced by AAT users, assuming more data equates to better technology. Our tool proved valuable for eliciting reflection but not empathy.
"WTH..!?!" Experiences, reactions, and expectations related to online privacy panic situations
There are moments in which users might find themselves experiencing feelings of panic with the realization that their privacy or personal information on the Internet might be at risk. We present an exploratory study on common experiences of online privacy-related panic and on users' reactions to frequently occurring privacy incidents. By using the metaphor of a privacy panic button, we also gather users' expectations on the type of help that they would like to obtain in such situations. Through user interviews (n = 16) and a survey (n = 549), we identify 18 scenarios of privacy panic situations. We ranked these scenarios according to their frequency of occurrence and to users' concerns about becoming victims of these incidents. We explore users' underlying worries of falling prey to these incidents and other contextual factors common to privacy panic experiences. Based on our findings we present implications for the design of a help system for users experiencing privacy panic situations. (This paper was published as a manuscript in Angelos' doctoral thesis Designing for Usable Privacy and Transparency in Digital Transactions, 2015.)
Examining older users' online privacy-enhancing experience from a human-computer interaction perspective
The advancement of Internet technologies, including instant and unlimited access to information and services, has been an excellent source of support for older adults. However, pervasive and continuous online tracking can pose severe threats to older adults' information privacy. Surprisingly, very few empirical studies have focused on older users' online privacy-enhancing experience from a Human-Computer Interaction perspective. Therefore, it remains unclear how older users protect their online information privacy and what factors influence their online behaviors. Thus, my thesis aims to study older users' online privacy-enhancing experience by examining the following questions: 1) what older users know and do to protect their online information privacy, 2) how their emotional state influences their adoption of privacy-enhancing technologies (PETs), and 3) what usability challenges they encounter while using one of the most popular PETs currently available to the public. To examine these questions, a diverse set of empirical approaches was adopted, including a survey, a quasi-experiment, and a usability study.
My research findings suggest that there are three elements that play a crucial role in older users' online privacy-enhancing practices. First, older users' knowledge of online privacy has a significant influence on their daily online privacy protection behaviors. In addition, there seems to be a privacy knowledge gap among older users that reveals the phenomenon of a 'Privacy Divide.' Second, the design of privacy-enhancing features affects older users' emotional state and their attitudes regarding their future adoption of the tool. Third, the findings of the usability study indicate that the current design of a privacy-enhancing browsing tool, Tor Browser, poses particular challenges for older users. For instance, the technical terminology and recurring warning messages have made Tor Browser more difficult for older users to use. These usability challenges not only decrease older users' satisfaction with the tool but also deter their future adoption of it. Therefore, it is crucial that the current design of PETs consider older users' needs.
My thesis research contributes to the privacy literature in several ways. First of all, this is the first empirical research examining older users' actual online privacy protection behaviors. In addition, this thesis includes the very first empirical study that illustrates the importance of the role of emotion in users' adoption of a privacy-enhancing tool. Furthermore, this thesis provides usability recommendations that can improve the current design of Tor Browser for older user audiences.
As the world's aging population continues to grow and advances in Internet technologies progress rapidly, the design of future technologies, from smart homes to self-driving cars, has to adopt a user-centered approach that considers the needs of end-users of all age groups. Also, information privacy has become a significant aspect of our digital world, which makes the design of user-friendly privacy-enhancing tools an essential mission ahead of us. Moreover, knowledge and awareness are key factors in older users' online privacy-enhancing practices. Hence, creating educational programs for older adults is extremely important in protecting their online privacy.