2 research outputs found
Negative Consequences of Anthropomorphized Technology: A Bias-Threat-Illusion Model
Attributing human-like traits to information technology (IT), leading to what is called anthropomorphized technology (AT), is increasingly common among users of IT. Previous IS research has offered varying perspectives on AT, although it primarily focuses on the positive consequences. This paper aims to clarify the construct of AT and proposes a "bias-threat-illusion" model to classify the negative consequences of AT. Drawing on the "three-factor theory of anthropomorphism" from social psychology and integrating self-regulation theory, we propose that failing to regulate the use of elicited agent knowledge and to control the intensified psychological needs (i.e., sociality and effectance) when interacting with AT leads to negative consequences: "transferring human bias," "inducing threat to human agency," and "creating illusionary relationships." Based on this bias-threat-illusion model, we propose theory-driven remedies to attenuate negative consequences. We conclude with implications for IS theory and practice.
Thinking Technology as Human: Affordances, Technology Features, and Egocentric Biases in Technology Anthropomorphism
Advanced information technologies (ITs) are increasingly assuming tasks that have previously required human capabilities, such as learning and judgment. What drives this technology anthropomorphism (TA), or the attribution of humanlike characteristics to IT? What is it about users, IT, and their interactions that influences the extent to which people think of technology as humanlike? While TA can have positive effects, such as increasing user trust in technology, what are the negative consequences of TA? To provide a framework for addressing these questions, we advance a theory of TA that integrates the general three-factor anthropomorphism theory in social and cognitive psychology with the needs-affordances-features perspective from the information systems (IS) literature. The theory we construct helps to explain and predict which technological features and affordances are likely: (1) to satisfy users' psychological needs, and (2) to lead to TA. More importantly, we problematize some negative consequences of TA. Technology features and affordances contributing to TA can intensify users' anchoring with their elicited agent knowledge and psychological needs and also can weaken the adjustment process in TA under cognitive load. The intensified anchoring and weakened adjustment processes increase egocentric biases that lead to negative consequences. Finally, we propose a research agenda for TA and egocentric biases.