5 research outputs found
Designing a Serious Game: Teaching Developers to Embed Privacy into Software Systems
Software applications continue to challenge user privacy. Privacy practices (e.g. Data Minimisation (DM), Privacy by Design (PbD), and the General Data Protection Regulation (GDPR)) and related "privacy engineering" methodologies provide clear instructions for developers to embed privacy into the software systems they build. However, these practices and methodologies are not yet common practice in the software development community, and no previous research has focused on "educational" interventions, such as serious games, to improve software developers' coding behaviour. This research therefore proposes a game design framework as an educational tool that helps software developers improve their (secure) coding behaviour so they can develop privacy-preserving software applications. The elements of the proposed framework were incorporated into a gaming application scenario that enhances developers' coding behaviour through motivation. The proposed work not only enables the development of privacy-preserving software systems but also helps the software development community put privacy guidelines and engineering methodologies into practice.
On a Scale from 1 to 10, How Private Are You? Scoring Facebook Privacy Settings
As social interactions increasingly move to Facebook, the privacy options it offers have come under scrutiny. Users find the interface confusing, and the impact of individual settings on a user's overall privacy is difficult to determine. This creates difficulties for both users and researchers: users cannot gauge the privacy of their own configurations, and researchers cannot easily compare the degree of privacy encapsulated in different users' choices. In this work, we suggest a novel and holistic measure for Facebook privacy settings. Based on a survey of 189 Facebook users, we derive weights that combine the different options into one numerical measure of privacy. This serves as a building block for measuring and comparing Facebook users' privacy choices, enabling new inferences and insights.
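The core idea, combining weighted per-setting privacy levels into one 1-10 score, can be sketched as follows. This is only an illustrative sketch: the setting names, the weights, and the mapping of each setting onto [0, 1] are all hypothetical assumptions here, whereas the paper derives its actual weights from the 189-user survey.

```python
# Hypothetical weights: each setting's assumed contribution to overall privacy.
# (The paper derives real weights from survey data; these are placeholders.)
WEIGHTS = {
    "posts_visibility": 0.40,
    "friend_list_visibility": 0.20,
    "photo_tag_review": 0.25,
    "search_engine_indexing": 0.15,
}

def privacy_score(settings: dict) -> float:
    """Combine per-setting privacy levels (each in [0, 1], 1 = most private)
    into a single score on a 1-10 scale via a weighted sum."""
    weighted = sum(WEIGHTS[name] * settings[name] for name in WEIGHTS)
    return round(1 + 9 * weighted, 2)  # map [0, 1] onto [1, 10]

# Example profile: private posts, friends-only friend list, tag review on,
# but profile indexable by search engines.
score = privacy_score({
    "posts_visibility": 1.0,        # "Only me"
    "friend_list_visibility": 0.5,  # "Friends"
    "photo_tag_review": 1.0,        # enabled
    "search_engine_indexing": 0.0,  # indexing allowed
})
print(score)  # → 7.75
```

Because the weights sum to 1, the score is bounded by the most and least private configurations (10 and 1 respectively), which is what makes scores comparable across users.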
Children Seen But Not Heard: When Parents Compromise Children's Online Privacy
Children's online privacy has garnered much attention in media, legislation, and industry. Adults are concerned that children may not adequately protect themselves online. However, relatively little discussion has focused on the privacy breaches that may occur to children at the hands of others, namely their parents and relatives. When adults post information online, they may reveal personal information about their children to other people, online services, data brokers, or surveillant authorities. This information can be gathered in an automated fashion and then linked with other online and offline sources, creating detailed profiles which can be continually enhanced throughout the children's lives. In this paper, we conduct a study to see how widespread these behaviors are among adults on Facebook and Instagram, using several methods. Firstly, we automate a process to examine 2,383 adult users on Facebook for evidence of children in their public photo albums. Using the associated comments in combination with publicly available voter registration records, we are able to infer children's names, faces, birth dates, and addresses. Secondly, in order to understand what additional information is available to Facebook and the users' friends, we survey 357 adult Facebook users about their behaviors and attitudes with regard to posting their children's information online. Thirdly, we analyze 1,089 users on Instagram to infer facts about their children. Finally, we make recommendations for privacy-conscious parents and suggest an interface change through which Facebook can nudge parents towards better stewardship of their children's privacy.