Android Permissions Remystified: A Field Study on Contextual Integrity
Due to the amount of data that smartphone applications can potentially
access, platforms enforce permission systems that allow users to regulate how
applications access protected resources. If users are asked to make security
decisions too frequently and in benign situations, they may become habituated
and approve all future requests without regard for the consequences. If they
are asked to make too few security decisions, they may become concerned that
the platform is revealing too much sensitive information. To explore this
tradeoff, we instrumented the Android platform to collect data regarding how
often and under what circumstances smartphone applications are accessing
protected resources regulated by permissions. We performed a 36-person field
study to explore the notion of "contextual integrity," that is, how often are
applications accessing protected resources when users are not expecting it?
Based on our collection of 27 million data points and exit interviews with
participants, we examine the situations in which users would like the ability
to deny applications access to protected resources. We found that at least
80% of our participants would have preferred to prevent at least one permission
request; overall, they considered over a third of requests invasive and
desired a mechanism to block them.
Infusion of Smartphone Technologies in Hospitality Service Experience
This study explored the infusion of smartphone technologies into hospitality service experience from a phenomenological approach. In-depth interviews were conducted with smartphone users who had firsthand hospitality service experience. Data analysis revealed four unique patterns within smartphone-equipped customers’ service experiences: the extended control facilitated by smartphone technologies, the functional gap between mediated and interpersonal services, the infusion of solitude into a communal experience, and the interactions among different service processes. Based on these results, this study developed a conceptual framework of the infusion of smartphone technologies in hospitality service experience. Theoretical and managerial implications of the findings were also discussed.
The Mediating Role of Awareness in Bridging the Expectancy-Capability Gap in Mobile Identity Protection
AlHelaly, Y., Dhillon, G., & Oliveira, T. (2023). When Expectation Fails and Motivation Prevails: The Mediating Role of Awareness in Bridging the Expectancy-Capability Gap in Mobile Identity Protection. Computers & Security, 134, 103470. https://doi.org/10.1016/j.cose.2023.103470

Identity theft poses a significant threat to mobile users, yet mobile identity protection is often overlooked in cybersecurity literature. Despite various technical solutions proposed, little attention has been given to the motivational aspects of protection. Moreover, the disparity between individuals' expectations and their ability to safeguard their mobile identities exacerbates the problem. This study adopts a mixed-methods approach and draws on expectancy-value theory to address these gaps and explore the impact of expectations, capabilities, motivational values, technical measures, and awareness on individuals' intentions to achieve mobile identity protection. Our research reveals that protection awareness acts as a crucial mediator between individuals' expectations and capabilities. Additionally, motivational values not only enhance technical protection measures but also significantly influence identity protection intentions. Furthermore, we identify the moderating effect of protection experience on individuals' expectations and perceived value of identity protection. This study contributes to mobile security literature by highlighting the pivotal role of protection awareness in bridging the divide between individual expectations and actual capabilities in mobile identity protection.
Disposition toward privacy and information disclosure in the context of emerging health technologies.
Objective: We sought to present a model of privacy disposition and its development based on qualitative research on privacy considerations in the context of emerging health technologies.
Materials and methods: We spoke to 108 participants across 44 interviews and 9 focus groups to understand the range of ways in which individuals value (or do not value) control over their health information. Transcripts of interviews and focus groups were systematically coded and analyzed in ATLAS.ti for privacy considerations expressed by respondents.
Results: Three key findings from the qualitative data suggest a model of privacy disposition. First, participants described privacy-related behavior as both contextual and habitual. Second, there are motivations for and deterrents to sharing personal information that do not fit into the analytical categories of risks and benefits. Third, philosophies of privacy, often described as attitudes toward privacy, should be classified as a subtype of motivation or deterrent.
Discussion: This qualitative analysis suggests a simple but potentially powerful conceptual model of privacy disposition, or what makes a person more or less private. Components of privacy disposition are identifiable and measurable through self-report and therefore amenable to operationalization and further quantitative inquiry.
Conclusions: We propose this model as the basis for a psychometric instrument that can be used to identify types of privacy dispositions, with potential applications in research, clinical practice, system design, and policy.
Better the devil you know: using lost-smartphone scenarios to explore user perceptions of unauthorised access
Smartphones are a central part of modern life and contain vast amounts of personal and professional data as well as access to sensitive features such as banking and financial apps. As such, protecting our smartphones from unauthorised access is of great importance, and users prioritise this over protecting their devices against digital security threats. Previous research has explored user experiences of unauthorised access to their smartphones, though the vast majority of these cases involve an attacker who is known to the user and knows an unlock code for the device. We presented 374 participants with a scenario concerning the loss of their smartphone in a public place. Participants were allocated to one of three scenario groups where a different unknown individual with malicious intentions finds the device and attempts to gain access to its contents. After exposure, we asked participants to envision a case where someone they know has a similar opportunity to attempt to gain access to their smartphone. We compare these instances with respect to differences in the motivations of the attacker, their skills, and their knowledge of the user. We find that participants underestimate how commonly people who know them may be able to guess their PIN and overestimate the extent to which smartphones can be 'hacked into'. We discuss how concerns over the severity of an attack may cloud perceptions of its likelihood of success, potentially leading users to underestimate the likelihood of unauthorised access occurring from known attackers who can utilize personal knowledge to guess unlock codes.
Intention To Disclose Personal Information Via Mobile Applications: A Privacy Calculus Perspective
This study aimed to investigate the issue of consumer intention to disclose personal information via mobile applications (apps). Drawing on the literature of privacy calculus theory, this research examined the factors that influence the trade-off decision of receiving perceived benefits and being penalized with perceived risks through the calculus lens. In particular, two paths of the direct effects on perceived benefits and risks that induce the ultimate intention to disclose personal information via mobile apps were proposed and empirically tested. The analysis showed that self-presentation and personalized services positively influence consumers’ perceived benefits, which in turn positively affect the intention to disclose personal information. Perceived severity and perceived control serve as the direct antecedents of perceived risks that negatively affect the intention of consumers to disclose personal information. Compared with the perceived risks, the perceived benefits more strongly influence the intention to disclose personal information. This study extends the literature on privacy concerns to consumer intention to disclose personal information by theoretically developing and empirically testing four hypotheses in a research model. Results were validated in the mobile context, and implications and discussions were presented.
Perceived privacy risk in the Internet of Things: determinants, consequences, and contingencies in the case of connected cars
The Internet of Things (IoT) is permeating all areas of life. However, connected devices are associated with substantial risks to users’ privacy, as they rely on the collection and exploitation of personal data. The case of connected cars demonstrates that these risks may be more profound in the IoT than in extant contexts, as both a user's informational and physical space are intruded upon. We leverage this unique setting to collect rich context-immersive interview (n = 33) and large-scale survey data (n = 791). Our work extends prior theory by providing a better understanding of the formation of users’ privacy risk perceptions, the effect such perceptions have on users’ willingness to share data, and how these relationships in turn are affected by inter-individual differences in individuals’ regulatory focus, thinking style, and institutional trust.