131 research outputs found
Exploring Consumers' Attitudes of Smart TV Related Privacy Risks
A number of privacy risks are inherent in the Smart TV ecosystem. It is likely that many consumers are unaware of these privacy risks. Alternatively, they might be aware but consider the privacy risks acceptable. In order to explore this, we carried out an online survey with 200 participants to determine whether consumers were aware of Smart TV related privacy risks. The responses revealed a meagre level of awareness. We also explored consumers' attitudes towards specific Smart TV related privacy risks.
We isolated a number of factors that influenced rankings and used these to develop awareness-raising messages. We tested these messages in an online survey with 155 participants. The main finding was that participants were generally unwilling to disconnect their Smart TVs from the Internet because they valued the Smart TV's Internet functionality more than their privacy. We subsequently evaluated the awareness-raising messages in a second survey with 169 participants, framing the question differently. We asked participants to choose between five different Smart TV Internet connection options, two of which retained functionality but entailed expending time and/or effort to preserve privacy.
"It's Shocking!": Analysing the Impact and Reactions to the A3: Android Apps Behaviour Analyser
The lack of privacy awareness in smartphone ecosystems prevents users from being able to compare apps in terms of privacy and from making informed privacy decisions. In this paper we analysed smartphone users' privacy perceptions and concerns based on a novel privacy-enhancing tool called Android Apps Behaviour Analyser (A3). The A3 tool enables users to behaviourally analyse the privacy aspects of their installed apps and notifies them about potentially privacy-invasive activities. To examine the capabilities of A3 we designed a user study. We captured and contrasted the privacy concerns and perceptions of 52 participants, before and after using our tool. The results showed that A3 enables users to easily detect their smartphone apps' privacy-violating activities. Further, we found a significant difference between users' privacy concerns and expectations before and after using A3, and the majority of them were surprised to learn how often their installed apps access personal resources. Overall, we observed that the A3 tool was capable of influencing the participants' attitudes towards protecting their privacy.
Privacy in crowdsourcing: a systematic review
The advent of crowdsourcing has brought with it multiple privacy challenges. For example, essential monitoring activities, while necessary and unavoidable, also potentially compromise contributor privacy. We conducted an extensive literature review of the research related to the privacy aspects of crowdsourcing. Our investigation revealed interesting gender differences and also differences in terms of individual perceptions. We conclude by suggesting a number of future research directions.
Big data: Finders keepers, losers weepers?
This article argues that big data's entrepreneurial potential is based not only on new technological developments that allow for the extraction of non-trivial, new insights out of existing data, but also on an ethical judgment that often remains implicit: namely the ethical judgment that those companies that generate these new insights can legitimately appropriate (the fruits of) these insights. As a result, the business model of big data companies is essentially founded on a libertarian-inspired "finders, keepers" ethic. The article argues, next, that this presupposed "finders, keepers" ethic is far from unproblematic and itself relies on multiple unconvincing assumptions. This leads to the conclusion that the conduct of companies working with big data might lack ethical justification.
Competing jurisdictions: data privacy across the borders
Borderless cloud computing technologies are exacerbating tensions between European and other existing approaches to data privacy. On the one hand, in the European Union (EU), a series of data localisation initiatives are emerging with the objective of preserving Europe's digital sovereignty, guaranteeing respect for EU fundamental rights and preventing foreign law enforcement and intelligence agencies from accessing personal data. On the other hand, foreign countries are unilaterally adopting legislation requiring national corporations to disclose data stored in Europe, in this way bypassing jurisdictional boundaries grounded on physical data location. The chapter investigates this twofold dynamic by focusing particularly on the current friction between the EU data protection approach and the data privacy model of the United States (US) in the field of cloud computing.
Media, Capabilities, and Justification
In this paper, I evaluate the "capability approach" developed by Amartya Sen and Martha Nussbaum as a normative perspective for critical media research. The concept of capabilities provides a valuable way of assessing media and captures important aspects of the relationship between media and equality. However, following Rainer Forst's critique of outcome-oriented approaches to justice, I argue the capability approach needs to pay more attention to questions of power and process. In particular, when it comes to deciding which capabilities media should promote and what media structures and practices should promote them, the capability approach must accept the priority of deliberative and democratic processes of justification. Once we do this, we are urged to situate the concept of capabilities within a more process-oriented view of justice, focused not on capabilities as such, but on outlining the conditions required for justificatory equality. After discussing the capability approach, I will outline the process-oriented theory of justice Forst has developed around the idea of the "right to justification". While Forst does not discuss media in depth, I argue his theory of justice can provide a valuable alternative normative standpoint for critical media research.
Is Privacy Controllable?
One of the major views of privacy associates privacy with control over information. This gives rise to the question of how controllable privacy actually is. In this paper, we adapt certain formal methods of control theory and investigate the implications of a control-theoretic analysis of privacy. We look at how control and feedback mechanisms have been studied in the privacy literature. Relying on the control-theoretic framework, we develop a simplistic conceptual control model of privacy, formulate privacy controllability issues and suggest directions for possible research.
Comment: The final publication will be available at Springer via http://dx.doi.org/ [in press]
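To make the control-theoretic framing concrete, the following is a minimal sketch of the kind of feedback loop such an analysis implies: a user repeatedly compares their perceived privacy against a desired level and adjusts their disclosures accordingly. All names, values and the proportional-control choice here are illustrative assumptions, not the authors' actual model.

```python
# Hypothetical illustration of a control-theoretic view of privacy:
# the user acts as a controller, nudging their disclosure behaviour
# so that perceived privacy converges on a desired target level.

def privacy_feedback(target: float, initial: float,
                     gain: float = 0.5, steps: int = 20) -> float:
    """Drive perceived privacy toward a target via proportional control.

    target  -- desired privacy level (0 = fully public, 1 = fully private)
    initial -- current perceived privacy level
    gain    -- how strongly the user corrects disclosures each step
    """
    state = initial
    for _ in range(steps):
        error = target - state   # feedback: desired minus actual privacy
        state += gain * error    # control action: adjust disclosure
    return state

final = privacy_feedback(target=0.8, initial=0.2)
assert abs(final - 0.8) < 1e-3  # the loop converges on the target
```

With a gain between 0 and 1 the error shrinks geometrically each step, which is one way of reading the paper's question of "controllability": convergence depends on the user receiving accurate feedback about their actual privacy state, which in practice is rarely available.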
Data privacy compliance benefits for organisations - a cyber-physical systems and Internet of Things study
The protection of people's privacy is both a legal requirement and a key factor for doing business in many jurisdictions. Organisations thus have a legal obligation to get their privacy compliance in order as a matter of business importance. This applies not only to organisations' day-to-day business operations, but also to the information technology systems they use, develop or deploy. However, privacy compliance, like any other legal compliance requirement, is often seen as an extra burden that is both unnecessary and costly. Such a view of compliance can result in negative consequences and lost opportunities for organisations. This paper seeks to position data privacy compliance as a value proposition for organisations by focusing on the benefits that can be derived from data privacy compliance as it applies to a particular subset of information technology systems, namely cyber-physical systems and Internet of Things technologies. A baseline list of data privacy compliance benefits, contextualised for CPSs and IoT within the South African legal landscape, is proposed.
Implementing GDPR in the Charity Sector: A Case Study
Due to their organisational characteristics, many charities are poorly prepared for the General Data Protection Regulation (GDPR). We present an exemplar process for implementing GDPR and the DPIA Data Wheel, a DPIA framework devised as part of the case study that accounts for these characteristics. We validate this process and framework by conducting a GDPR implementation with a charity that works with vulnerable adults. This charity processes both special category (sensitive) and personally identifiable data. This GDPR implementation was conducted and devised for the charity sector, but can equally be applied in any organisation that needs to implement GDPR or conduct DPIAs.