Moral conformity in a digital world: Human and nonhuman agents as a source of social pressure for judgments of moral character

Abstract

Can judgments of others’ moral character be changed under group pressure produced by human and virtual agents? In Study 1 (N = 103), participants first judged targets’ moral character privately and, two weeks later, in the presence of real humans. Analysis of how often participants changed their private moral judgments under group pressure showed that moral conformity occurred, on average, 43% of the time. In Study 2 (N = 138), we extended this paradigm to Virtual Reality, where group pressure was produced by avatars allegedly controlled either by humans or by AI. While replicating the moral conformity effect (at 28% of the time), we found that moral conformity did not differ between human- and AI-controlled avatars. Our results suggest that human and nonhuman groups shape moral character judgments in both the physical and virtual worlds, shedding new light on the potential social consequences of moral conformity in the modern digital world.