Implication of Personalized Advertising on Personal Data: A Legal Analysis of the EU General Data Protection Regulation
The accelerating emergence of personalized advertising is mostly driven by data. Accordingly, algorithmic profiling has become a constant experience for every online user. However, there has been limited exploration of how personalized advertising invades the privacy and personal data of online users. Therefore, this study adopts the doctrinal legal method, analysing the European Union General Data Protection Regulation to address the protection of personal data profiling and the legal implications arising from the commercialization and abuse of digital users' data in personalized advertising. The findings of this paper discuss the main principles to be observed by the data controller in ensuring the legality of processing personal data profiling for personalized advertising.
Keywords: Personalized Advertising; Algorithmic Targeting; Personal Data Profiling; EU General Data Protection Regulation
eISSN: 2398-4287 © 2022. The Authors. Published for AMER ABRA cE-Bs by e-International Publishing House, Ltd., UK. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Peer-review under responsibility of AMER (Association of Malaysian Environment-Behaviour Researchers), ABRA (Association of Behavioural Researchers on Asians/Africans/Arabians) and cE-Bs (Centre for Environment-Behaviour Studies), Faculty of Architecture, Planning & Surveying, Universiti Teknologi MARA, Malaysia.
DOI: https://doi.org/10.21834/ebpj.v7i22.416
"It wouldn't happen to me": Privacy concerns and perspectives following the Cambridge Analytica scandal
In March 2018, news of the Facebook-Cambridge Analytica scandal made headlines around the world. By inappropriately collecting data from approximately 87 million users' Facebook profiles, the data analytics company, Cambridge Analytica, created psychographically tailored advertisements that allegedly aimed to influence people's voting preferences in the 2016 US presidential election. In the aftermath of this incident, we conducted a series of semi-structured interviews with 30 participants based at a UK university, discussing their understanding of online privacy and how they manage it in the wake of the scandal. We analysed this data using an inductive (i.e. "bottom-up") thematic analysis approach. Contrary to many opinions reported in the news, the respondents in our sample did not delete their accounts, frantically change their privacy settings, or even express that much concern. Rather, individuals often consider themselves immune to psychographically tailored advertisements, and lack understanding of how automated approaches and algorithms work in relation to their (and their networks') personal data. We discuss our findings in relation to wider related research (e.g. crisis fatigue, networked privacy, Protection Motivation Theory) and discuss directions for future research.
Algorithmic Transparency in Action: How and Why Do Companies Disclose Information on Algorithms?
With algorithm-based technology becoming increasingly omnipresent, concerns about the often-opaque nature of algorithms along with calls for greater algorithmic transparency (AT) have intensified. The study at hand responds to these calls by addressing the question of how and why companies disclose information on algorithms ("AT in action"). Drawing on a multiple-case design involving two companies with algorithm-enabled software offerings, the study finds 14 specific disclosure actions of companies and ten motives for them. The findings confirm the dimensions of input, transformation, and output AT suggested by extant theory and add three additional dimensions of disclosure (to whom, when, and by which means). Also, the findings suggest that norm-based motives (e.g., ethical beliefs to avoid discrimination) play a role for input AT (only), while benefit-based motives (e.g., increasing user adoption) prevail for other dimensions of AT.
How to measure musical preference on Facebook? Evidence from a mixed-method data collection
More and more digital data is available for social science analysis, which provides new ways of measuring several concepts. But when we start using a new data source, we have to understand how it can be processed and analysed effectively. This is especially true for Facebook data, since there is no established gold-standard analysis framework. Researchers do, however, have in-depth knowledge of how to measure different concepts using survey data. Thus, cross-referencing Facebook data with survey data is a reasonable way to support Facebook data analysis at different decision points. In this paper, we present how musical preference can be measured with Facebook data and how survey data can support the selection of the main indicators. Based on our results, we provide some general suggestions for Facebook data processing and indicator operationalization.
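The cross-referencing idea in this abstract can be sketched in a few lines: derive several candidate indicators from Facebook data, then keep the one that agrees best with the established survey measure. The following is a minimal illustrative sketch; the indicator names, toy data, and selection rule (maximum absolute Pearson correlation) are assumptions for illustration, not the paper's actual method.

```python
# Illustrative sketch: validate candidate Facebook-derived indicators of
# musical preference against a survey measure, then keep the indicator
# that correlates best with the survey scores.
from statistics import mean


def pearson(x, y):
    """Plain Pearson correlation for two equal-length numeric lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    varx = sum((a - mx) ** 2 for a in x)
    vary = sum((b - my) ** 2 for b in y)
    return cov / (varx * vary) ** 0.5


def select_indicator(candidates, survey_scores):
    """Return the candidate indicator most correlated with the survey measure.

    candidates: dict mapping indicator name -> per-respondent values derived
                from Facebook data (e.g. counts of liked pages per genre)
    survey_scores: per-respondent survey preference scores (same order)
    """
    return max(candidates,
               key=lambda name: abs(pearson(candidates[name], survey_scores)))


# Toy data for five respondents whose Facebook profiles are linked to
# their survey answers (hypothetical indicator names and values).
facebook_indicators = {
    "liked_artist_pages": [3, 0, 5, 2, 4],
    "music_event_rsvps":  [1, 1, 0, 1, 0],
}
survey_rock_preference = [4, 1, 5, 3, 4]  # 1-5 Likert scores

best = select_indicator(facebook_indicators, survey_rock_preference)
print(best)  # liked_artist_pages
```

In this toy example the liked-pages count tracks the survey scores far more closely than event RSVPs, so it would be retained as the main indicator; in practice the paper's selection would rest on its own survey instrument and decision points.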